Retrieve AI context including relevant memories and recent conversations.
Generate context for LLM prompts by retrieving the most relevant memories and recent messages for a user. Use this to manually inject memory context into your AI applications.
Path Parameters:

- custom_user_id (str): Unique identifier for the user
- session_id (UUID): Session UUID to get context from

Query Parameters:

- recall_strategy (RecallStrategy): Memory retrieval strategy. Options: low_latency, balanced, agentic, auto. Default: balanced
- min_top_k (int): Minimum number of memories to return. Range: 5-50. Default: 15
- max_top_k (int): Maximum number of memories to return. Range: 10-100. Default: 50
- memories_threshold (float): Similarity threshold for memories. Range: 0.2-0.8. Default: 0.6
- summaries_threshold (float): Similarity threshold for session summaries. Range: 0.2-0.8. Default: 0.5
- last_n_messages (int): Number of last messages to include in context. Range: 1-100. Optional
- last_n_summaries (int): Number of last summaries to include in context. Range: 1-20. Optional
- timezone (str): Timezone for formatting timestamps (e.g., 'America/New_York'). Default: UTC
- include_system_prompt (bool): Whether to include the default system prompt. Default: true
- stream (bool): Whether to stream status updates via Server-Sent Events. Default: false

Returns:
- stream=false: JSON GetContextResponse with is_final=true and a context field
- stream=true: SSE stream of GetContextResponse events with progress updates, ending with a final event containing the context
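The query parameters above can be assembled and range-checked client-side before the request is sent. The sketch below is illustrative, not part of any official SDK: the function name is our own, but the defaults, ranges, and allowed strategy values mirror the reference above.

```python
def build_context_params(**overrides):
    """Merge caller overrides with documented defaults, rejecting
    out-of-range values before the request is sent."""
    ranges = {
        "min_top_k": (5, 50),
        "max_top_k": (10, 100),
        "memories_threshold": (0.2, 0.8),
        "summaries_threshold": (0.2, 0.8),
        "last_n_messages": (1, 100),
        "last_n_summaries": (1, 20),
    }
    strategies = {"low_latency", "balanced", "agentic", "auto"}
    # Documented defaults; last_n_messages / last_n_summaries are
    # optional and therefore omitted unless the caller supplies them.
    params = {
        "recall_strategy": "balanced",
        "min_top_k": 15,
        "max_top_k": 50,
        "memories_threshold": 0.6,
        "summaries_threshold": 0.5,
        "timezone": "UTC",
        "include_system_prompt": True,
        "stream": False,
    }
    params.update(overrides)
    if params["recall_strategy"] not in strategies:
        raise ValueError(f"unknown recall_strategy: {params['recall_strategy']}")
    for key, (lo, hi) in ranges.items():
        if key in params and not lo <= params[key] <= hi:
            raise ValueError(f"{key}={params[key]} outside [{lo}, {hi}]")
    return params
```

The resulting dict can be passed as the query string of the GET request (for example, `requests.get(url, params=build_context_params(min_top_k=10))`).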
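With stream=true, each SSE event carries a GetContextResponse, and the final event (is_final=true) contains the context. A minimal parser for such a stream might look like this; the exact wire format of intermediate progress events is an assumption here, while is_final and context are documented above.

```python
import json

def parse_sse_events(raw_stream):
    """Yield GetContextResponse payloads from an SSE response body,
    stopping at the final event that carries the context."""
    for line in raw_stream.splitlines():
        line = line.strip()
        if line.startswith("data:"):
            event = json.loads(line[len("data:"):].strip())
            yield event
            if event.get("is_final"):
                break
```

In practice you would feed this the streamed response body line by line; the snippet uses a complete string only to keep the sketch self-contained.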