The Chat Interface (AGISystem2.sh) provides a natural language front-end to AGISystem2. It uses an LLM (Large Language Model) to translate your conversational input into the structured Sys2DSL commands that the engine understands. This makes it ideal for exploration, demonstration, and working with domain experts who don't need to learn the formal syntax.

[Diagram: natural language input ("Dogs are animals") → LLM agent (intent detection, fact extraction, response generation) → Sys2DSL commands (ASSERT Dog IS_A ...) → AGISystem2 engine (reasoning, storage) → natural language response]

The Chat Interface translates natural language through an LLM into Sys2DSL commands, executes them in the AGISystem2 engine, and converts results back to natural language.

Requirements

The Chat Interface requires an API key for at least one supported LLM provider (for example, OpenAI).

API keys can be set in your shell environment or in a .env file in your working directory or any parent directory.

Starting the Chat

# Set API key (or use .env file)
export OPENAI_API_KEY="sk-your-key-here"

# Start chat
./bin/AGISystem2.sh

# With debug output
./bin/AGISystem2.sh --debug

# Without colors
./bin/AGISystem2.sh --no-color

On startup, the interface shows which LLM providers are configured:

╔════════════════════════════════════════════════════════════╗
║              AGISystem2 Chat Interface                     ║
║            Natural Language Knowledge Reasoning            ║
╚════════════════════════════════════════════════════════════╝

Initializing...
✓ AGISystem2 Chat initialized. LLM providers: OpenAI

Type /help for commands, or just start chatting!

You: 

Basic Usage

Teaching Facts

Simply state facts in natural language. The LLM extracts Subject-Relation-Object triples:

You: Dogs are mammals
AI: Got it! I've learned 1 fact(s):
    - Dog IS_A Mammal

You: Fire causes smoke and heat
AI: Got it! I've learned 2 fact(s):
    - Fire CAUSES Smoke
    - Fire CAUSES Heat

You: Paris is located in France, which is in Europe
AI: Got it! I've learned 2 fact(s):
    - Paris LOCATED_IN France
    - France LOCATED_IN Europe
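Once the LLM has extracted Subject-Relation-Object triples, turning them into engine commands is mechanical. A minimal sketch (the triple field names are assumptions; the Sys2DSL syntax beyond `ASSERT <Subject> <RELATION> <Object>` is not shown here):

```javascript
// Map LLM-extracted S-R-O triples to ASSERT commands for the engine.
// A sentence like "Fire causes smoke and heat" yields two triples,
// which is why the examples above report two learned facts.
function triplesToAsserts(triples) {
  return triples.map(({ s, r, o }) => `ASSERT ${s} ${r} ${o}`);
}
```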

Asking Questions

Ask questions naturally. The system determines the query type and responds conversationally:

You: Is a dog a mammal?
AI: Yes, that appears to be true based on what I know.

You: What causes smoke?
AI: Based on the facts I have, Fire causes Smoke.

You: Tell me about Paris
AI: Here's what I know about Paris:
    - Paris LOCATED_IN France
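Conceptually, once a question has been reduced to a canonical pattern, answering it is a match over stored triples, with the unknown part left open. The following sketch is an assumption about the shape of that step, not the engine's actual query mechanism:

```javascript
// Match stored S-R-O triples against a pattern where null marks the unknown.
// "What causes smoke?"  -> { r: "CAUSES", o: "Smoke" }  (subject unknown)
// "Is a dog a mammal?"  -> fully bound pattern; any match means "yes"
function query(facts, { s = null, r = null, o = null }) {
  return facts.filter(f =>
    (s === null || f.s === s) &&
    (r === null || f.r === r) &&
    (o === null || f.o === o));
}

const facts = [
  { s: "Fire", r: "CAUSES", o: "Smoke" },
  { s: "Dog", r: "IS_A", o: "Mammal" },
];
```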

REPL Commands

Special commands starting with / provide direct control:

Command      Description
/help        Show help information
/facts       List all known facts
/concepts    List all concepts
/theories    Show theory stack
/history     Show conversation history
/clear       Clear conversation history
/exit        Exit the chat

Advanced Features

Theory Management

Create, switch, and save theory branches through conversation:

You: Create a new theory called medical_facts
AI: Created new theory branch: "medical_facts". All new facts will be added to this context.

You: Aspirin reduces inflammation
AI: Got it! I've learned 1 fact(s):
    - Aspirin REDUCES Inflammation

You: List theories
AI: Current theory stack:
    { "depth": 1, "name": "medical_facts" }

You: Save theory as medical_v1
AI: Saved theory "medical_v1" to .AGISystem2/theories/medical_v1.sys2dsl

You: Pop theory
AI: Discarded current theory layer. Returned to previous context.
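The conversation above exercises stack semantics: creating a theory pushes a new layer, new facts land on the top layer, and popping discards that layer. A toy model of this behaviour (the class and field names are illustrative, not the engine's API):

```javascript
// Toy model of the theory stack: layer 0 is the base context, each
// created theory pushes a layer, and pop() discards the top layer.
class TheoryStack {
  constructor() { this.layers = [{ name: "base", facts: [] }]; }
  push(name) { this.layers.push({ name, facts: [] }); }
  pop() { if (this.layers.length > 1) this.layers.pop(); }
  assert(fact) { this.layers[this.layers.length - 1].facts.push(fact); }
  top() { return this.layers[this.layers.length - 1]; }
  depth() { return this.layers.length - 1; } // matches the "depth" shown above
}
```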

Importing Files

Import facts from text files:

You: Import file my_knowledge.txt
AI: Imported 15 facts from my_knowledge.txt

The file should contain facts in natural language, one per line. Comments start with #.
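Reading that format amounts to keeping non-blank, non-comment lines; each surviving line is then fed through the same fact-extraction path as a chat message. A minimal sketch of the line filtering (the function name is an assumption):

```javascript
// Parse the import format described above: one natural-language fact per
// line, blank lines ignored, lines starting with # treated as comments.
function parseImportLines(text) {
  return text
    .split("\n")
    .map(line => line.trim())
    .filter(line => line.length > 0 && !line.startsWith("#"));
}
```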

Contradiction Detection

The system automatically detects when new facts might contradict existing knowledge:

You: Dogs are reptiles
AI: I noticed potential contradictions with existing knowledge:
    - Dog was previously classified as Mammal, which is different from Reptile

    Would you like me to create a new theory branch?
    Suggested name: "alternative_classification"
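One plausible heuristic behind a warning like this (the engine's actual contradiction logic may be more sophisticated): a new IS_A fact is flagged when the subject already has an IS_A classification with a different object.

```javascript
// Flag existing IS_A facts for the same subject with a different object,
// e.g. "Dog IS_A Reptile" conflicts with a stored "Dog IS_A Mammal".
// Note this simple check would also flag compatible classifications
// (e.g. Mammal vs Pet), which is why it only *suggests* a branch.
function findIsAConflicts(facts, newFact) {
  if (newFact.r !== "IS_A") return [];
  return facts.filter(f =>
    f.r === "IS_A" && f.s === newFact.s && f.o !== newFact.o);
}
```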

How It Works Internally

The Chat Interface follows this processing pipeline:

  1. Intent Detection – The LLM analyzes your message to determine intent:
    • teach – You're stating facts to learn
    • ask – You're asking a question
    • import – You want to import a file
    • manage_theory – You want to create/switch theories
    • list – You want to see facts/concepts/theories
  2. Processing – Based on intent, specialized handlers process the request:
    • For teaching: the LLM extracts facts as S-R-O triples, the system checks them for contradictions, then issues ASSERT commands
    • For questions: the LLM parses the question into canonical form, an ASK command is executed, and the result is converted back to natural language
  3. Response Generation – Results are converted back to natural language for the user
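The three-stage pipeline above can be reduced to a dispatch skeleton. Handler names here are illustrative; in the real interface each stage involves a call to the LLM.

```javascript
// Dispatch skeleton for the pipeline: detect intent, run the matching
// intent handler, then turn the result into a natural-language reply.
function handleMessage(message, handlers) {
  const intent = handlers.detectIntent(message);  // 1. intent detection
  const result = handlers[intent](message);       // 2. intent-specific processing
  return handlers.respond(intent, result);        // 3. response generation
}
```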

See DS(/theory/dsl_engine) for details on command execution, and DS(/chat/chat_engine) for the chat-specific processing logic.

Relation Mapping

The LLM maps natural language to these core relations:

Natural Language                      Relation        Example
"is a", "are", "is a type of"         IS_A            Dog IS_A Mammal
"has", "contains", "includes"         HAS_PROPERTY    Water HAS_PROPERTY liquid
"causes", "leads to", "results in"    CAUSES          Fire CAUSES Smoke
"is caused by", "comes from"          CAUSED_BY       Smoke CAUSED_BY Fire
"is in", "is located in"              LOCATED_IN      Paris LOCATED_IN France
"requires", "needs"                   REQUIRES        Driving REQUIRES License
"allows", "permits"                   PERMITS         License PERMITS Driving
"prohibits", "forbids"                PROHIBITED_BY   Speeding PROHIBITED_BY Law

For the full list of relations, see the Relations Reference.
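The mapping above, written out as a lookup table. Note this is only a summary of the table for reference: in the actual interface the mapping is performed by the LLM guided by its prompt, not by literal string matching.

```javascript
// Phrase-to-relation lookup corresponding to the table above.
const RELATION_MAP = {
  "is a": "IS_A", "are": "IS_A", "is a type of": "IS_A",
  "has": "HAS_PROPERTY", "contains": "HAS_PROPERTY", "includes": "HAS_PROPERTY",
  "causes": "CAUSES", "leads to": "CAUSES", "results in": "CAUSES",
  "is caused by": "CAUSED_BY", "comes from": "CAUSED_BY",
  "is in": "LOCATED_IN", "is located in": "LOCATED_IN",
  "requires": "REQUIRES", "needs": "REQUIRES",
  "allows": "PERMITS", "permits": "PERMITS",
  "prohibits": "PROHIBITED_BY", "forbids": "PROHIBITED_BY",
};
```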

Limitations

Environment Configuration

The Chat Interface auto-discovers the AchillesAgentLib library in these locations:

If the library isn't found, set NODE_PATH explicitly:

export NODE_PATH="/path/to/AchillesAgentLib/parent:$NODE_PATH"
./bin/AGISystem2.sh

Use --debug to see which paths are being searched and which library is loaded.