Modes and Context
Understanding Query Modes
Orion's two modes adjust the system message sent to the AI, effectively giving the AI a different set of "instructions" for how to respond.
```mermaid
flowchart LR
    User["Your Question"]
    ModeSelect{{"Selected Mode"}}
    Orbital["Orbital Context:<br/>Broad UE5, accessible language"]
    DeepScan["Deep Scan Context:<br/>Technical, BP nodes, optimization"]
    AI["AI Response"]
    User --> ModeSelect
    ModeSelect -->|"Orbital (Ask)"| Orbital
    ModeSelect -->|"Deep Scan"| DeepScan
    Orbital --> AI
    DeepScan --> AI
```
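Conceptually, mode selection just decides which system prompt accompanies your question. A minimal sketch of that idea (the prompt wording and function name below are illustrative assumptions, not Orion's actual internals):

```python
# Hypothetical sketch of mode -> system prompt selection.
# Prompt text is illustrative, not Orion's real wording.
SYSTEM_PROMPTS = {
    "orbital": (
        "You are a helpful Unreal Engine 5 assistant. Explain concepts "
        "in accessible language and provide context and reasoning."
    ),
    "deep_scan": (
        "You are a technical Blueprint analyst. Give node-level detail "
        "and focus on optimization and performance."
    ),
}

def build_request(mode: str, question: str) -> list[dict]:
    """Assemble the message list sent to the AI for one query."""
    return [
        {"role": "system", "content": SYSTEM_PROMPTS[mode]},
        {"role": "user", "content": question},
    ]
```

Switching modes only swaps the first (system) message; your question is sent unchanged either way.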
Orbital (Ask Mode)
Use Orbital when:
- You're learning a new Unreal Engine concept
- You have a general question ("How does X work?")
- You're exploring options ("What's the best way to do Y?")
- You want beginner-friendly explanations
AI behavior:
- Broad UE5 knowledge base
- Explains concepts in accessible language
- Provides context and reasoning, not just node names
- Links to official documentation when relevant
Example conversations:
Q: "How do I make a pickup item that the player can collect?"
A: "To create a collectible pickup item in Unreal Engine, you typically:
- Create an Actor Blueprint for the pickup (e.g., BP_Pickup)
- Add a Static Mesh Component for the visual representation (coin, health pack, etc.)
- Add a Box Collision or Sphere Collision component
- Enable 'Generate Overlap Events' on the collision
- Create an Event 'OnComponentBeginOverlap' for the collision
- Check if the overlapping actor is the player (Cast to your player character class)
- If yes, apply the pickup effect (add health, add points, etc.)
- Destroy the pickup actor (or disable it and play a collection effect)
For a more polished experience, consider adding a rotating movement component, a particle effect, and a sound cue when collected."
Deep Scan (Blueprint Mode)
Use Deep Scan when:
- You're debugging a Blueprint
- You want performance optimization suggestions
- You need technical, node-level analysis
- You're using the Blueprint toolbar or right-click analysis
AI behavior:
- Highly technical responses
- Node-level detail ("The Branch node on line 34...")
- Blueprint-specific knowledge
- Optimization and performance focus
Note: If the "Graph Analysis Uses Deep Scan Mode" setting is enabled, Deep Scan is selected automatically when you:
- Use the Blueprint toolbar Send button
- Use the Content Browser right-click "Blueprint: Graph Flow" or "Full Graph" options
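That auto-selection amounts to a small override rule layered on top of your dropdown choice. A sketch of the logic (the source names and setting parameter are assumptions for illustration, not the plugin's actual code):

```python
# Hypothetical sketch: resolve which mode a query actually uses.
# "source" values and the setting name are illustrative.
def resolve_mode(selected_mode: str, source: str,
                 graph_analysis_uses_deep_scan: bool = True) -> str:
    """Graph-analysis entry points force Deep Scan when the setting is on."""
    graph_sources = {"toolbar_send", "graph_flow", "full_graph"}
    if source in graph_sources and graph_analysis_uses_deep_scan:
        return "deep_scan"
    return selected_mode  # otherwise, honor the dropdown selection
```

With the setting disabled, graph analysis simply uses whatever mode is selected in the chat window.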
Example conversation:
Q: "Analyze this Blueprint for performance issues" (after sending BP_EnemyAI)
A: "I've analyzed BP_EnemyAI. Here are performance concerns:
- Tick event is enabled and active: The Tick event has 12 nodes that execute every frame, including GetAllActorsOfClass (very expensive) and distance calculations. Recommendation: Disable Tick and use a Timer with a 0.1-0.5 second interval. Enemies don't need frame-perfect updates.
- GetAllActorsOfClass called frequently: In the LookForTarget function, GetAllActorsOfClass(PlayerCharacter) is called each time the function runs. Recommendation: Cache the player reference in BeginPlay and reuse it.
- Heavy operations in BeginPlay: BeginPlay has 50+ nodes, including level load operations. If you have many enemies, this will cause hitching. Recommendation: Split initialization into BeginPlay (essentials only) and a delayed Initialize function (called 0.1s later via Timer).
- Unused variables: Variables 'DebugMode' and 'TestCounter' are defined but never used. Consider removing them."
Switching Modes
How to switch:
- Look at the top-left of the Orion chat window
- Click the mode dropdown (shows current mode: "Orbital" or "Deep Scan")
- Select the mode you want
- The next query you send will use the new mode
Important:
- A mode change affects future queries only (it doesn't re-process old messages)
- You can switch modes mid-conversation
- No need to clear chat or restart
Screenshot Placeholder: [Chat window with mode dropdown in top-left corner expanded, showing two radio button options: "Orbital (Ask)" (selected), "Deep Scan (Blueprint)"]
Conversation Context
By default, Orion treats each query as independent. The AI doesn't remember what you asked previously.
Enable memory:
Settings > Interface > Use Conversation Context (Experimental)
When enabled:
- Orion sends your last N messages (default 10) with each new query
- The AI can reference prior conversation
- Follow-up questions work naturally without repeating yourself
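The "last N messages" behavior can be modeled as a simple history trim applied before each request. A simplified sketch of that idea (not Orion's actual implementation):

```python
def with_context(history: list[dict], new_query: str,
                 max_context: int = 10) -> list[dict]:
    """Attach up to max_context prior messages to the new query.

    With max_context = 0 (context disabled), only the new query is sent,
    which is why follow-ups like "Show that with Event Graph instead"
    arrive with no memory of the earlier exchange.
    """
    recent = history[-max_context:] if max_context > 0 else []
    return recent + [{"role": "user", "content": new_query}]
```

Older messages beyond the window are silently dropped, which is the trade-off the "Max Context Messages" setting controls.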
Example with context disabled:
- You: "How do I make a health bar?"
- AI: [explains]
- You: "Show that with Event Graph instead"
- AI: "Show what with Event Graph?" (doesn't remember the health bar)
Example with context enabled:
- You: "How do I make a health bar?"
- AI: [explains Blueprint approach]
- You: "Show that with Event Graph instead"
- AI: [provides Event Graph approach] (remembers you were asking about health bars)
Adjusting context size:
- Settings > Interface > Max Context Messages (0-35)
- Lower = cheaper (fewer tokens sent), but AI forgets older messages
- Higher = better memory, but costs more tokens
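To see how the window size drives cost, a rough back-of-the-envelope estimate helps. The token counts below are illustrative averages, not measurements; input tokens grow linearly with the number of context messages:

```python
def estimate_input_tokens(num_context_messages: int,
                          avg_tokens_per_message: int = 150,
                          query_tokens: int = 50) -> int:
    """Rough input-token estimate for one query with N prior messages.

    Assumed averages (150 tokens/message, 50-token query) are
    illustrative; real costs depend on actual message lengths.
    """
    return query_tokens + num_context_messages * avg_tokens_per_message

print(estimate_input_tokens(0))   # query alone: 50
print(estimate_input_tokens(10))  # query + 10 messages: 1550
```

The exact multiplier depends on how long your messages are, but the linear growth is why lowering Max Context Messages directly lowers per-query cost.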
Cost consideration:
Sending 10 prior messages with each query can double or triple your token usage. Use conversation context when:
- You're having a multi-turn discussion
- You're refining an idea iteratively
- Follow-up questions are essential
Disable it when:
- Asking one-off questions
- Token cost is a concern
- Each query is independent
Screenshot Placeholder: [Settings window showing Interface category with "Use Conversation Context" checkbox checked and "Max Context Messages" slider set to 10, with tooltip visible explaining the feature]