March 16, 2026

The Cognitive Contract and Situation Modeling

Implemented the Stage 8 cognitive contract for wrapper communication and introduced deterministic conversation phase tracking.

Standardizing the Cognitive Contract

This update marks the rollout of the Stage 8 cognitive contract, a major infrastructure push to standardize how Chalie interacts with external wrappers and internal services. We’ve introduced a more robust API surface including bearer token authentication for wrappers, signal ingestion, and a dedicated Intent system. This allows the system to route actions and updates through a formal act_dispatcher, making the interaction between the reasoning loop and the UI more predictable.
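To make the routing idea concrete, here is a minimal sketch of what intent dispatch plus bearer-token wrapper auth could look like. All names here (Intent, ActDispatcher, authorize) are illustrative, not the actual Chalie API.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict


@dataclass
class Intent:
    """A named action with an arbitrary payload (illustrative shape)."""
    name: str
    payload: Dict[str, Any] = field(default_factory=dict)


class ActDispatcher:
    """Routes intents to registered handlers, one handler per intent name."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[Intent], Any]] = {}

    def register(self, name: str, handler: Callable[[Intent], Any]) -> None:
        self._handlers[name] = handler

    def dispatch(self, intent: Intent) -> Any:
        handler = self._handlers.get(intent.name)
        if handler is None:
            raise KeyError(f"no handler registered for intent {intent.name!r}")
        return handler(intent)


def authorize(headers: Dict[str, str], expected_token: str) -> bool:
    """Check a standard Bearer token in the Authorization header."""
    return headers.get("Authorization", "") == f"Bearer {expected_token}"
```

The point of the registry is exactly the predictability the contract aims for: every action the reasoning loop can take is an explicitly named intent with a single known handler.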

As part of this, we’ve improved trait extraction safety. To prevent the system from accidentally learning traits from pasted articles or transcripts, we added rules to ignore quoted content and implemented a length-based skip for messages over 300 characters.
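A guard like this can be sketched in a few lines. The 300-character threshold is from the post; the quote-detection heuristics are illustrative stand-ins for the actual rules.

```python
MAX_TRAIT_EXTRACTION_LEN = 300  # length-based skip described in the post


def should_extract_traits(message: str) -> bool:
    """Decide whether a message is safe input for trait extraction."""
    text = message.strip()
    # Long messages are likely pasted articles or transcripts, not the user's own words.
    if len(text) > MAX_TRAIT_EXTRACTION_LEN:
        return False
    # Skip quoted content (markers here are illustrative assumptions).
    for line in text.splitlines():
        stripped = line.lstrip()
        if stripped.startswith(">") or (stripped.startswith('"') and stripped.endswith('"')):
            return False
    return True
```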

Situation Awareness and Phase Tracking

We are moving beyond simple message history toward a more nuanced ‘Situation Model.’ A new SituationModelService now aggregates ambient context into a coherent narrative that is injected into both ACT and RESPOND prompts via a `` placeholder.
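The aggregation step could look roughly like the following sketch, which folds ambient signals into one narrative string and substitutes it into a prompt template. The AmbientSignal shape and the placeholder name are assumptions; the post does not specify either.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class AmbientSignal:
    """One piece of ambient context (source and a short summary); illustrative."""
    source: str
    summary: str


def build_situation_narrative(signals: List[AmbientSignal]) -> str:
    """Fold ambient signals into a single coherent narrative sentence."""
    if not signals:
        return "No ambient context is available."
    parts = [f"{s.source}: {s.summary}" for s in signals]
    return "Current situation: " + "; ".join(parts) + "."


def inject(prompt_template: str, narrative: str, placeholder: str = "{situation}") -> str:
    """Substitute the narrative into a prompt template (placeholder name assumed)."""
    return prompt_template.replace(placeholder, narrative)
```

Because the same narrative is injected into both the ACT and RESPOND prompts, both stages reason against an identical picture of the situation.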

Alongside this, the ConversationPhaseService provides a deterministic state machine to track where a conversation stands (e.g., opening, exploring, deepening, or closing). Crucially, this is handled via rolling window calculations in the MemoryStore rather than LLM calls, keeping the overhead low and the logic predictable.
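Since the phases are computed deterministically rather than by an LLM, the core of the service reduces to a pure function over rolling-window statistics. The thresholds below are invented for illustration; only the phase names come from the post.

```python
from typing import List


def classify_phase(turn_count: int, recent_message_lengths: List[int]) -> str:
    """Deterministic phase heuristic over a rolling window of recent messages.

    Thresholds are illustrative assumptions, not the service's actual rules.
    """
    if turn_count <= 2:
        return "opening"
    avg_len = sum(recent_message_lengths) / max(len(recent_message_lengths), 1)
    if avg_len < 20:  # short replies suggest the conversation is winding down
        return "closing"
    if turn_count <= 8:
        return "exploring"
    return "deepening"
```

A function like this is trivially testable and costs microseconds per turn, which is the point of keeping phase tracking out of the LLM loop.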

Protocol Evolution and WebSockets

The WebSocket architecture is transitioning to an intent-based model. The OutputService and Digest workers now emit intents like present_response and show_narration. The WebSocket drift_sender has been updated to subscribe to these intent channels and translate them back into the existing protocol, ensuring the UI remains in sync without breaking legacy clients.
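The translation layer in the drift_sender amounts to a small mapping from intent names to legacy frame types. The legacy frame shape below is an assumption for illustration; the intent names are the ones named in the post.

```python
from typing import Any, Dict

# Intent names from the post, mapped to assumed legacy frame types.
INTENT_TO_LEGACY = {
    "present_response": "response",
    "show_narration": "narration",
}


def translate_intent(intent: Dict[str, Any]) -> Dict[str, Any]:
    """Convert an intent message into a legacy WebSocket frame."""
    frame_type = INTENT_TO_LEGACY.get(intent["name"])
    if frame_type is None:
        raise ValueError(f"unknown intent: {intent['name']!r}")
    return {"type": frame_type, "payload": intent.get("payload", {})}
```

Keeping this shim at the WebSocket edge means the rest of the pipeline can go fully intent-based while legacy clients see no change at all.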

Resilience and Parser Refinement

We addressed a critical failure in the ACT loop where GPT-5.4 would occasionally return multiple concatenated JSON objects in a single response, causing the standard parser to fail and drop tool invocations. We’ve implemented a 6-layer parse cascade that uses json.JSONDecoder().raw_decode() to safely extract the first valid object from noisy or multi-object streams.
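The raw_decode layer of that cascade can be sketched as follows: scan for a candidate opening brace, then let the decoder consume exactly one object and ignore whatever trails it. This shows one layer only, not the full six-layer cascade.

```python
import json
from typing import Any


def extract_first_json(text: str) -> Any:
    """Return the first valid JSON object in a noisy or multi-object string.

    raw_decode() parses one value starting at a given index and reports where
    it ended, so trailing garbage or concatenated objects no longer cause a
    hard parse failure.
    """
    decoder = json.JSONDecoder()
    for i, ch in enumerate(text):
        if ch == "{":
            try:
                obj, _end = decoder.raw_decode(text, i)
                return obj
            except json.JSONDecodeError:
                continue  # false start; keep scanning
    raise ValueError("no JSON object found in input")
```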

Finally, we fixed a bug where the chat history would appear empty or show yesterday’s thread after a server restart. Because the MemoryStore is wiped on restart, the system now falls back to SQLite to resolve the most recent active thread ID, rather than defaulting to the history of an expired thread.
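The fallback query is simple once the persistence layer is in the picture. The table and column names below are assumptions for illustration; only the SQLite-fallback behavior comes from the post.

```python
import sqlite3
from typing import Optional


def resolve_active_thread(conn: sqlite3.Connection) -> Optional[str]:
    """After a restart wipes the in-memory store, recover the most recent
    active (non-expired) thread ID from SQLite. Schema is hypothetical."""
    row = conn.execute(
        "SELECT id FROM threads WHERE expired = 0 "
        "ORDER BY updated_at DESC LIMIT 1"
    ).fetchone()
    return row[0] if row else None
```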