```
┌─────────────────────────────────────────────┐
│                 User Query                  │
│             (Natural Language)              │
└─────────────────────────────────────────────┘
                       │
                       ▼
┌─────────────────────────────────────────────┐
│           LLM (Claude, GPT, etc.)           │
│                                             │
│  1. Parse intent                            │
│  2. Extract entities and relationships      │
│  3. Generate structured query               │
└─────────────────────────────────────────────┘
                       │
                       ▼
┌─────────────────────────────────────────────┐
│                 AGISystem2                  │
│                                             │
│  1. Formal reasoning over knowledge base    │
│  2. Generate proof chain                    │
│  3. Return structured result + explanation  │
└─────────────────────────────────────────────┘
                       │
                       ▼
┌─────────────────────────────────────────────┐
│          LLM (Response Generation)          │
│                                             │
│  1. Format proof as natural language        │
│  2. Add context and caveats                 │
│  3. Generate user-friendly response         │
└─────────────────────────────────────────────┘
```
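In code, the loop above can be as small as three calls. The sketch below is only an illustration: it assumes a generic `llmClient.complete()` wrapper (not part of this repo) and uses the `Session` and `translateContextWithGrammar` APIs shown later in this guide for the translation step, though the LLM itself can handle that step with the translator prompt further down.

```javascript
import { Session } from './src/session.mjs';
import { translateContextWithGrammar } from './src/nlp/nl2dsl/grammar.mjs';

// Minimal pipeline sketch: NL question -> DSL -> formal proof -> NL answer.
async function answerQuery(userQuery, session, llmClient) {
  // Stage 1: structured query (here via the built-in grammar translator;
  // an LLM with the translator prompt below works too)
  const { dsl, errors } = translateContextWithGrammar(userQuery);
  if (errors.length > 0) {
    return { answer: `Could not translate the question: ${errors.join(', ')}`, verified: false };
  }

  // Stage 2: formal reasoning with a proof chain
  const result = session.query(dsl);

  // Stage 3: LLM renders the proof as a user-friendly response
  const response = await llmClient.complete({
    prompt: `Explain this reasoning result to the user:\n${JSON.stringify(result)}\nQuestion: "${userQuery}"`
  });

  return { answer: response.text, verified: result.success, proof: result.proof };
}
```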
Use the LLM to translate complex natural language into AGISystem2 DSL:
```javascript
// Prompt for LLM
const prompt = `
Convert this natural language to AGISystem2 DSL:

User: "Is a penguin a bird that can fly?"

Rules:
- Use isA for type relationships
- Use hasProperty for attributes
- Use And for conjunctions
- Variables start with ?

Output only valid DSL.
`;

// LLM returns:
// @c1 isA Penguin Bird
// @c2 hasProperty Penguin can_fly
// And $c1 $c2

// Execute in AGISystem2
const result = session.query(llmOutput);
// Returns: { success: false, reason: "Penguins cannot fly" }
```
Use AGISystem2 to verify LLM claims:
```javascript
// LLM generates a claim
const llmClaim = "Marie Curie won the Nobel Prize in Physics";

// Translate to DSL
const { dsl } = translateContextWithGrammar(llmClaim);

// Verify against knowledge base
const verification = session.query(dsl);

if (verification.success) {
  console.log("Verified with proof:", verification.proof);
} else {
  console.log("Could not verify. Checking for contradictions...");
  const contradiction = session.query(`Not (${dsl})`);
  if (contradiction.success) {
    console.log("CONTRADICTION DETECTED");
  }
}
```
Use AGISystem2 for reasoning, LLM for explanation:
```javascript
// AGISystem2 provides proof
const result = session.query('isA Penguin Animal');

// Proof chain:
// 1. isA Penguin Bird   (fact)
// 2. isA Bird Animal    (rule)
// 3. Therefore: isA Penguin Animal (transitive)

// Ask LLM to explain
const explanationPrompt = `
Explain this reasoning chain in simple terms:
${JSON.stringify(result.proof)}

User question: "Why is a penguin an animal?"
`;

// LLM generates:
// "A penguin is an animal because penguins are birds,
//  and all birds are animals. This follows from the
//  biological classification hierarchy."
```
```javascript
import { Session } from './src/session.mjs';
import { translateContextWithGrammar } from './src/nlp/nl2dsl/grammar.mjs';

async function verifiedRAG(userQuery, llmClient) {
  // Step 1: LLM extracts intent and entities
  const extraction = await llmClient.complete({
    prompt: `Extract the logical query from: "${userQuery}"`
  });

  // Step 2: Retrieve relevant documents
  const docs = await retrieveDocuments(extraction.entities);

  // Step 3: Load documents into AGISystem2
  const session = new Session();
  for (const doc of docs) {
    const { dsl, errors } = translateContextWithGrammar(doc.text);
    if (errors.length === 0) {
      for (const line of dsl.split('\n')) {
        session.learn(line);
      }
    }
  }

  // Step 4: Reason over the knowledge
  const { dsl: queryDsl } = translateContextWithGrammar(extraction.query);
  const result = session.query(queryDsl);

  // Step 5: Generate response with verified facts
  const response = await llmClient.complete({
    prompt: `
User question: ${userQuery}

Verified facts (with proof):
${JSON.stringify(result)}

Generate a response using only verified information.
If something couldn't be verified, say so explicitly.
`
  });

  return {
    answer: response.text,
    verified: result.success,
    proof: result.proof
  };
}
```
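A call site might look like this; `llmClient` (and the `retrieveDocuments` helper used inside `verifiedRAG`) are assumed to come from your own LLM and retrieval layers, so treat it as a sketch:

```javascript
// Example call (llmClient is an assumed wrapper around your LLM provider)
const { answer, verified, proof } = await verifiedRAG(
  "Did Marie Curie win the Nobel Prize in Physics?",
  llmClient
);

if (verified) {
  console.log(answer);  // response grounded in proven facts
  console.log(proof);   // proof chain returned by AGISystem2
} else {
  console.log("Could not formally verify:", answer);
}
```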
```javascript
import express from 'express';
import { Session } from './src/session.mjs';

const app = express();
app.use(express.json()); // parse JSON bodies so req.body is populated

const sessions = new Map();

app.post('/session/:id/learn', (req, res) => {
  const session = sessions.get(req.params.id) || new Session();
  sessions.set(req.params.id, session);
  const { dsl } = req.body;
  session.learn(dsl);
  res.json({ success: true });
});

app.post('/session/:id/query', (req, res) => {
  const session = sessions.get(req.params.id);
  if (!session) return res.status(404).json({ error: 'Session not found' });
  const result = session.query(req.body.query);
  res.json(result);
});

app.listen(3000);
```
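Once the server is running, any HTTP client can drive it. A rough example with `fetch`, using the endpoints defined above and a made-up session id `demo`:

```javascript
// Teach the session a fact, then query it over HTTP.
await fetch('http://localhost:3000/session/demo/learn', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ dsl: 'isA Tom Cat' })
});

const res = await fetch('http://localhost:3000/session/demo/query', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: 'isA Tom Cat' })
});
console.log(await res.json()); // e.g. { success: true, ... }
```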
```javascript
// Define AGISystem2 as a tool for Claude
const tools = [{
  name: "reason",
  description: "Perform formal logical reasoning over a knowledge base",
  input_schema: {
    type: "object",
    properties: {
      context: {
        type: "string",
        description: "Facts and rules in natural language"
      },
      query: {
        type: "string",
        description: "Question to answer"
      }
    }
  }
}];

// Claude can now call:
//   reason({ context: "All cats are mammals. Tom is a cat.", query: "Is Tom a mammal?" })
// Returns: { success: true, proof: [...] }
```
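Claude only emits the tool call; your application still has to execute it. One possible handler, sketched with the `Session` and `translateContextWithGrammar` APIs from this repo (the surrounding tool-use plumbing depends on your Anthropic SDK setup and is omitted here):

```javascript
import { Session } from './src/session.mjs';
import { translateContextWithGrammar } from './src/nlp/nl2dsl/grammar.mjs';

// Handler for the "reason" tool call: translate the NL context and query
// to DSL, learn the facts in a fresh session, then run the query.
function handleReasonTool({ context, query }) {
  const session = new Session();

  const { dsl: contextDsl, errors } = translateContextWithGrammar(context);
  if (errors.length > 0) {
    return { success: false, error: `Untranslatable context: ${errors.join(', ')}` };
  }
  for (const line of contextDsl.split('\n')) {
    session.learn(line);
  }

  const { dsl: queryDsl } = translateContextWithGrammar(query);
  return session.query(queryDsl);
}
// The returned object is sent back to Claude as the tool_result content.
```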
| Do | Don't |
|---|---|
| Use LLM for NL understanding | Use LLM for logical reasoning |
| Verify LLM claims with AGISystem2 | Trust LLM outputs blindly |
| Return proofs with answers | Hide reasoning steps |
| Handle translation errors gracefully | Assume all NL translates perfectly |
| Use sessions for isolated contexts | Share state across unrelated queries |
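To make the "handle translation errors gracefully" row concrete, here is a minimal sketch; `learnSafely` is a hypothetical helper, and the exact shape of the `errors` array is whatever your version of `translateContextWithGrammar` returns:

```javascript
// Hypothetical helper: only learn statements that translated cleanly,
// and surface failures to the caller instead of dropping them silently.
function learnSafely(session, text) {
  const { dsl, errors } = translateContextWithGrammar(text);
  if (errors.length > 0) {
    return { learned: 0, skipped: errors };
  }
  const lines = dsl.split('\n').filter(Boolean);
  for (const line of lines) {
    session.learn(line);
  }
  return { learned: lines.length, skipped: [] };
}
```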
`You are a translator from natural language to AGISystem2 DSL.
DSL Syntax:
- isA Subject Type (e.g., isA Tom Cat)
- hasProperty Subject Property (e.g., hasProperty Ball red)
- operator Subject Object (e.g., loves John Mary)
- @ref statement (named reference)
- Implies $antecedent $consequent (rules)
- And $a $b (conjunction)
- Not (statement) (negation)
- Variables: ?x, ?y (for rules)
Input: "${userInput}"
Output only valid DSL, one statement per line.`
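One way to use this template is to wrap it in a helper that calls the LLM and sanity-checks the output before executing it. `buildTranslatorPrompt` and `llmClient` are assumed names, and the filter below is a rough heuristic rather than a real DSL validator:

```javascript
// Hypothetical helper: NL -> DSL via the LLM, with a crude output check.
async function nlToDsl(userInput, llmClient) {
  const prompt = buildTranslatorPrompt(userInput); // fills the template above
  const { text } = await llmClient.complete({ prompt });

  const lines = text.split('\n').map(l => l.trim()).filter(Boolean);
  // Reject anything that reads like prose rather than a short DSL statement.
  const suspicious = lines.filter(l => /[.!?,;:]$/.test(l) || l.split(/\s+/).length > 6);
  if (suspicious.length > 0) {
    throw new Error(`LLM output does not look like pure DSL: ${suspicious.join(' | ')}`);
  }
  return lines;
}
```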
`Explain this formal proof in simple, accessible language:
Proof:
${JSON.stringify(proof, null, 2)}
User's original question: "${question}"
Requirements:
- Use everyday language
- Explain each step
- Highlight key inferences
- Note any assumptions`
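A matching helper for the explanation side might look like this, again with `buildExplanationPrompt` and `llmClient` as assumed names:

```javascript
// Hypothetical helper: render a proof chain as a plain-language explanation.
async function explainProof(proof, question, llmClient) {
  const prompt = buildExplanationPrompt(proof, question); // fills the template above
  const { text } = await llmClient.complete({ prompt });
  return text;
}

// e.g. const explanation = await explainProof(result.proof, "Why is a penguin an animal?", llmClient);
```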