Metacognition is the system's ability to reason about its own knowledge and reasoning processes. AGISystem2 exposes this internal structure through Sys2DSL commands, allowing explicit introspection, self-monitoring, and adaptive behavior.

Overview: How It All Connects

┌─────────────────────────────────────────────────────────────────┐
│                      Sys2DSL Script                             │
│  @var COMMAND subject VERB complement                           │
└───────────────────────────┬─────────────────────────────────────┘
                            │
                            ▼
┌─────────────────────────────────────────────────────────────────┐
│                    DSL Engine (dsl_engine.js)                   │
│  ┌──────────────┐  ┌──────────────┐  ┌──────────────┐          │
│  │   Parser     │─▶│  Topological │─▶│   Command    │          │
│  │              │  │    Sorter    │  │   Executor   │          │
│  └──────────────┘  └──────────────┘  └──────────────┘          │
└───────────────────────────┬─────────────────────────────────────┘
                            │
        ┌───────────────────┼───────────────────┐
        ▼                   ▼                   ▼
┌───────────────┐   ┌───────────────┐   ┌───────────────┐
│ Concept Store │   │ Theory Stack  │   │   Reasoner    │
│ (knowledge)   │   │ (layers)      │   │ (inference)   │
└───────┬───────┘   └───────┬───────┘   └───────┬───────┘
        │                   │                   │
        ▼                   ▼                   ▼
┌─────────────────────────────────────────────────────────────────┐
│                    Vector Space (1024 dims)                     │
│  ┌─────────────────────┐  ┌─────────────────────────────────┐  │
│  │  Ontology (0-255)   │  │  Axiology (256-383) + Extended  │  │
│  │  Facts, categories  │  │  Values, norms, context         │  │
│  └─────────────────────┘  └─────────────────────────────────┘  │
└─────────────────────────────────────────────────────────────────┘
      

1. Knowledge Layer: Concept Store

The Concept Store maintains all known concepts, facts, and their relationships.

Introspection Commands

Command         Purpose                         Example
FACTS_MATCHING  Query facts by pattern          @facts FACTS_MATCHING Dog ? ?
GET_USAGE       Usage statistics for a concept  @stats GET_USAGE Water
LIST_CONCEPTS   All known concepts              @all LIST_CONCEPTS
COUNT           Count items in a list           @n COUNT $facts

Example: Self-Assessment of Knowledge

# How much do we know about animals?
@animalFacts FACTS_MATCHING ? IS_A animal
@animalCount COUNT $animalFacts

# What concepts are most used?
@dogUsage GET_USAGE Dog
@catUsage GET_USAGE Cat

# Check knowledge coverage
@hasTemperature NONEMPTY FACTS_MATCHING ? BOILS_AT ?
@text TO_NATURAL $hasTemperature
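The wildcard matching behind FACTS_MATCHING and the counting behind COUNT can be sketched in a few lines of Python. This is an illustrative model only; the class and method names (FactStore, facts_matching) are assumptions, not the actual Concept Store API.

```python
# Hypothetical sketch of pattern-based fact queries; "?" acts as a wildcard.

class FactStore:
    def __init__(self):
        self.facts = []  # list of (subject, relation, object) triples

    def assert_fact(self, subject, relation, obj):
        self.facts.append((subject, relation, obj))

    def facts_matching(self, subject, relation, obj):
        """Return every triple matching the pattern; '?' matches anything."""
        def match(value, pattern):
            return pattern == "?" or value == pattern
        return [f for f in self.facts
                if match(f[0], subject) and match(f[1], relation) and match(f[2], obj)]

store = FactStore()
store.assert_fact("Dog", "IS_A", "animal")
store.assert_fact("Cat", "IS_A", "animal")
store.assert_fact("Water", "BOILS_AT", "Celsius100")

# Analogous to: @animalFacts FACTS_MATCHING ? IS_A animal ; @animalCount COUNT $animalFacts
animal_facts = store.facts_matching("?", "IS_A", "animal")
print(len(animal_facts))  # 2
```

The same query shape supports the coverage check above: a nonempty result for `("?", "BOILS_AT", "?")` plays the role of NONEMPTY.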

2. Theory Layer: Context Management

The Theory Stack enables hypothetical reasoning and context switching.

Theory Commands

Command        Purpose                        Example
THEORY_PUSH    Create new hypothetical layer  @_ THEORY_PUSH name="exploration"
THEORY_POP     Discard hypothetical layer     @_ THEORY_POP
LIST_THEORIES  Show theory stack              @stack LIST_THEORIES
LOAD_THEORY    Load saved theory              @_ LOAD_THEORY physics
SAVE_THEORY    Save current layer             @_ SAVE_THEORY myDomain

Example: What-If Exploration

# Base knowledge
@f1 ASSERT Ice IS_A solid
@f2 ASSERT Water IS_A liquid

# Explore hypothetical: what if ice was a liquid?
@_ THEORY_PUSH name="hypothetical_ice"
@h1 ASSERT Ice IS_A liquid

# Query in this context
@q1 ASK Ice IS_A solid  # FALSE in this layer

# Check theory stack
@stack LIST_THEORIES
# Returns: ["base", "hypothetical_ice"]

# Discard hypothetical
@_ THEORY_POP

# Back to base
@q2 ASK Ice IS_A solid  # TRUE again
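One minimal way to model the layered semantics above: each layer records its own assertions and retractions, and queries walk from the top layer down, so the most recent layer wins. This is a sketch under those assumptions; the real engine may shadow conflicting facts differently.

```python
# Hypothetical model of THEORY_PUSH / THEORY_POP layering.

class TheoryStack:
    def __init__(self):
        self.layers = [{"name": "base", "asserted": set(), "retracted": set()}]

    def push(self, name):
        self.layers.append({"name": name, "asserted": set(), "retracted": set()})

    def pop(self):
        if len(self.layers) > 1:  # the base layer is never popped
            self.layers.pop()

    def assert_fact(self, fact):
        top = self.layers[-1]
        top["asserted"].add(fact)
        top["retracted"].discard(fact)

    def retract_fact(self, fact):
        top = self.layers[-1]
        top["retracted"].add(fact)
        top["asserted"].discard(fact)

    def ask(self, fact):
        # Walk top-down: a retraction in a higher layer shadows a lower assertion.
        for layer in reversed(self.layers):
            if fact in layer["asserted"]:
                return True
            if fact in layer["retracted"]:
                return False
        return False

    def list_theories(self):
        return [layer["name"] for layer in self.layers]

stack = TheoryStack()
stack.assert_fact(("Ice", "IS_A", "solid"))
stack.push("hypothetical_ice")
stack.retract_fact(("Ice", "IS_A", "solid"))
stack.assert_fact(("Ice", "IS_A", "liquid"))
print(stack.ask(("Ice", "IS_A", "solid")))   # False inside the hypothesis
stack.pop()
print(stack.ask(("Ice", "IS_A", "solid")))   # True again in base
```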

3. Reasoning Layer: Inference Introspection

Understand how conclusions are reached, not just what they are.

Reasoning Commands

Command        Purpose                         Example
PROVE          Find proof chain                @proof PROVE Dog IS_A animal
EXPLAIN        Human-readable explanation      @exp EXPLAIN $proof
ABDUCT         Find possible causes            @causes ABDUCT Smoke CAUSED_BY
HYPOTHESIZE    Generate hypotheses             @hyp HYPOTHESIZE Symptom
VALIDATE       Check consistency               @valid VALIDATE
INFER          Query with full inference       @result INFER Dog IS_A animal
FORWARD_CHAIN  Derive all conclusions          @derived FORWARD_CHAIN
WHY            Explain how a fact was derived  @exp WHY Dog IS_A animal

Example: Explanation Chain

# Establish knowledge
@f1 ASSERT Dog IS_A mammal
@f2 ASSERT mammal IS_A animal

# Prove a transitive conclusion
@proof PROVE Dog IS_A animal

# Get human-readable explanation
@explanation EXPLAIN $proof
# Returns: "Dog IS_A animal because:
#   1. Dog IS_A mammal (direct fact)
#   2. mammal IS_A animal (direct fact)
#   3. IS_A is transitive, so Dog IS_A animal"
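The transitive search that produces such a proof chain can be sketched as a breadth-first walk up the IS_A hierarchy. This is a simplification for illustration; the actual prover handles more relation properties than transitivity.

```python
# Hypothetical sketch of a PROVE-style search for a transitive IS_A chain.
from collections import deque

def prove_is_a(facts, subject, target):
    """Return the IS_A chain from subject to target, or None if unprovable."""
    parents = {}
    for s, rel, o in facts:
        if rel == "IS_A":
            parents.setdefault(s, []).append(o)
    queue = deque([(subject, [subject])])
    seen = {subject}
    while queue:
        node, path = queue.popleft()
        if node == target:
            return path  # each adjacent pair in the path is a direct fact
        for parent in parents.get(node, []):
            if parent not in seen:
                seen.add(parent)
                queue.append((parent, path + [parent]))
    return None

facts = [("Dog", "IS_A", "mammal"), ("mammal", "IS_A", "animal")]
print(prove_is_a(facts, "Dog", "animal"))  # ['Dog', 'mammal', 'animal']
```

Rendering each hop in the returned path as "X IS_A Y (direct fact)" yields exactly the numbered explanation shown above.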

Example: Full Inference with Forward Chaining

# Establish knowledge with inference rules
@f1 ASSERT Dog IS_A mammal
@f2 ASSERT mammal IS_A animal
@f3 ASSERT animal HAS_PROPERTY needs_food

# Define a custom inference rule
@r1 DEFINE_RULE "inheritance" | ?X IS_A ?Y ; ?Y HAS_PROPERTY ?P => ?X HAS_PROPERTY ?P

# Forward chain to derive all conclusions
@derived FORWARD_CHAIN

# Query derived facts
@result INFER Dog HAS_PROPERTY needs_food
# Returns: { truth: "TRUE_INFERRED", method: "inheritance", chain: [...] }

# Understand why
@why WHY Dog HAS_PROPERTY needs_food
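The fixpoint loop behind FORWARD_CHAIN, specialized to the inheritance rule above, might look like the following sketch. The real engine generalizes over arbitrary DEFINE_RULE patterns; this version hard-codes the one rule for clarity.

```python
# Naive forward chaining for: ?X IS_A ?Y ; ?Y HAS_PROPERTY ?P => ?X HAS_PROPERTY ?P

def forward_chain(facts):
    """Apply the inheritance rule repeatedly until no new facts appear."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        derived = set()
        for (x, r1, y) in known:
            if r1 != "IS_A":
                continue
            for (y2, r2, p) in known:
                if r2 == "HAS_PROPERTY" and y2 == y:
                    new_fact = (x, "HAS_PROPERTY", p)
                    if new_fact not in known:
                        derived.add(new_fact)
        if derived:
            known |= derived
            changed = True  # new facts may enable further derivations
    return known

facts = [
    ("Dog", "IS_A", "mammal"),
    ("mammal", "IS_A", "animal"),
    ("animal", "HAS_PROPERTY", "needs_food"),
]
derived = forward_chain(facts)
print(("Dog", "HAS_PROPERTY", "needs_food") in derived)  # True, via two rule firings
```

Note that Dog's property arrives in the second pass, after mammal inherits it in the first; that intermediate firing is what a WHY chain would report.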

3.1 Contradiction Detection: Knowledge Consistency

The system actively monitors for logical contradictions and can prevent inconsistent facts from being added.

Contradiction Types

Type                   Description                                 Example
DISJOINT_VIOLATION     Entity belongs to mutually exclusive types  Fluffy IS_A Cat AND Fluffy IS_A Dog (when Cat DISJOINT_WITH Dog)
FUNCTIONAL_VIOLATION   Multiple values for a functional relation   Alice BORN_IN Paris AND Alice BORN_IN London
TAXONOMIC_CYCLE        Circular inheritance                        A IS_A B, B IS_A C, C IS_A A
CARDINALITY_VIOLATION  Too many/few relations                      Person with 3 biological parents (max is 2)

Contradiction Commands

Command                 Purpose                                           Example
CHECK_CONTRADICTION     Check all facts for contradictions                @report CHECK_CONTRADICTION
CHECK_WOULD_CONTRADICT  Test whether adding a fact would contradict       @test CHECK_WOULD_CONTRADICT Fluffy IS_A Dog
REGISTER_FUNCTIONAL     Declare a relation as functional (one value only) @_ REGISTER_FUNCTIONAL BORN_IN
REGISTER_CARDINALITY    Set min/max constraints on a relation             @_ REGISTER_CARDINALITY Person HAS_PARENT 0 2

Example: Consistency Checking

# Establish type disjointness
@d1 ASSERT Cat DISJOINT_WITH Dog
@d2 ASSERT mammal DISJOINT_WITH reptile

# Add some facts
@f1 ASSERT Fluffy IS_A Cat
@f2 ASSERT Rex IS_A Dog

# Check consistency
@report CHECK_CONTRADICTION
# Returns: { consistent: true, contradictions: [] }

# Test hypothetical addition
@test CHECK_WOULD_CONTRADICT Fluffy IS_A Dog
# Returns: { wouldContradict: true, reason: "DISJOINT_VIOLATION",
#            details: "Fluffy cannot be both Cat and Dog" }

# Set up functional constraints
@_ REGISTER_FUNCTIONAL BORN_IN

# This would be flagged:
@f3 ASSERT Alice BORN_IN Paris
@f4 ASSERT Alice BORN_IN London  # WARNING: functional violation

# Check again
@report2 CHECK_CONTRADICTION
# Returns: { consistent: false, contradictions: [
#   { type: "FUNCTIONAL_VIOLATION", subject: "Alice", relation: "BORN_IN", values: ["Paris", "London"] }
# ]}
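Both checks from the example reduce to scans over the fact list. The sketch below models the disjointness and functional cases; the return values are simplified tuples rather than the JSON-like reports shown above, and all names are illustrative.

```python
# Hypothetical CHECK_WOULD_CONTRADICT-style test for two contradiction types.

def would_contradict(facts, disjoint, functional, new_fact):
    """Return (type, reason) if adding new_fact would contradict, else None."""
    s, rel, o = new_fact
    if rel == "IS_A":
        # DISJOINT_VIOLATION: s already belongs to a type disjoint with o
        for (s2, r2, o2) in facts:
            if s2 == s and r2 == "IS_A":
                if (o, o2) in disjoint or (o2, o) in disjoint:
                    return ("DISJOINT_VIOLATION",
                            f"{s} cannot be both {o2} and {o}")
    if rel in functional:
        # FUNCTIONAL_VIOLATION: the relation already holds a different value
        for (s2, r2, o2) in facts:
            if s2 == s and r2 == rel and o2 != o:
                return ("FUNCTIONAL_VIOLATION",
                        f"{rel} of {s} is already {o2}")
    return None

facts = [("Fluffy", "IS_A", "Cat"), ("Alice", "BORN_IN", "Paris")]
disjoint = {("Cat", "Dog")}          # from: Cat DISJOINT_WITH Dog
functional = {"BORN_IN"}             # from: REGISTER_FUNCTIONAL BORN_IN

print(would_contradict(facts, disjoint, functional, ("Fluffy", "IS_A", "Dog")))
print(would_contradict(facts, disjoint, functional, ("Alice", "BORN_IN", "London")))
```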

4. Memory Management: Self-Optimization

The system tracks usage and can selectively forget low-value knowledge.

Memory Commands

Command    Purpose                    Example
GET_USAGE  Usage statistics           @stats GET_USAGE concept
PROTECT    Mark as unforgettable      @_ PROTECT CoreConcept
UNPROTECT  Allow forgetting           @_ UNPROTECT TempConcept
FORGET     Remove low-usage concepts  @_ FORGET threshold=5
BOOST      Increase priority          @_ BOOST ImportantConcept 50

Example: Memory Maintenance

# Check what's rarely used
@lowUsage FACTS_MATCHING ? usage_count LessThan 3

# Protect critical concepts
@_ PROTECT Water
@_ PROTECT fundamental_axiom

# Boost important domain concepts
@_ BOOST CustomerData 100

# Clean up (dry run first)
@preview FORGET threshold=5 dryRun=true

# Actually forget
@result FORGET threshold=5
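The interaction between the usage threshold, the PROTECT set, and dry-run mode can be modeled in a few lines. The usage table and function signature below are assumptions for illustration only.

```python
# Hypothetical sketch of FORGET with protection and a dry-run preview.

def forget(usage, protected, threshold, dry_run=False):
    """Return concepts below the threshold; remove them unless dry_run is set."""
    candidates = [c for c, count in usage.items()
                  if count < threshold and c not in protected]
    if not dry_run:
        for c in candidates:
            del usage[c]  # protected concepts are never touched
    return candidates

usage = {"Water": 42, "Dog": 17, "TempConcept": 1, "ScratchNote": 2}
protected = {"Water"}  # from: @_ PROTECT Water

preview = forget(usage, protected, threshold=5, dry_run=True)
print(preview)        # candidates only; nothing removed yet
forget(usage, protected, threshold=5)
print(sorted(usage))  # low-usage, unprotected concepts are gone
```

Previewing with dry_run before the destructive call mirrors the dryRun=true pattern in the script above.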

5. Dimension Masks: Attention Focus

Masks control which dimensions participate in reasoning, a form of selective attention.

Mask Commands

Command          Purpose                    Example
MASK_PARTITIONS  Mask by partition          @m MASK_PARTITIONS ontology
MASK_DIMS        Mask specific dimensions   @m MASK_DIMS temperature color
MASK_CONCEPT     Mask by concept relevance  @m MASK_CONCEPT $waterRef
ASK_MASKED       Query with mask            @q ASK_MASKED $m Water IS_A liquid

Example: Bias Control

# Create mask that ignores value judgments
@fairMask MASK_PARTITIONS ontology

# Query without axiological bias
@result ASK_MASKED $fairMask Candidate QUALIFIES_FOR Position

# Compare with full query
@fullResult ASK Candidate QUALIFIES_FOR Position

# Explain the difference
@diff EXPLAIN_DIFFERENCE $result $fullResult
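Under the hood, a masked query amounts to comparing vectors only on the unmasked dimensions. A sketch, reusing the partition ranges from the overview diagram (ontology 0-255, axiology 256-383); everything else here is an illustrative assumption, not the actual vector-space API.

```python
# Hypothetical masked similarity over the 1024-dimensional concept space.
import math

DIMS = 1024

def masked_cosine(a, b, mask):
    """Cosine similarity restricted to dimensions where mask is 1."""
    dot = na = nb = 0.0
    for x, y, m in zip(a, b, mask):
        if m:
            dot += x * y
            na += x * x
            nb += y * y
    if na == 0 or nb == 0:
        return 0.0
    return dot / math.sqrt(na * nb)

def mask_partitions(*parts):
    """Build a 0/1 mask keeping only the named partitions."""
    ranges = {"ontology": range(0, 256), "axiology": range(256, 384)}
    mask = [0] * DIMS
    for part in parts:
        for i in ranges[part]:
            mask[i] = 1
    return mask

fair_mask = mask_partitions("ontology")
print(sum(fair_mask))  # 256 ontological dimensions participate in the query
```

Because axiological dimensions contribute nothing to the masked score, a query through fair_mask is blind to value judgments, which is exactly the bias-control effect the example relies on.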

6. Relation Introspection

Query and understand the relation system itself.

Relation Introspection Commands

# List all known relations
@rels LIST_RELATIONS

# Get properties of a relation
@info GET_RELATION_INFO IS_A
# Returns: { symmetric: false, transitive: true, inverse: null }

# Find relations between concepts
@links GET_RELATIONS_BETWEEN Dog mammal
# Returns: ["IS_A"]

7. Script Evaluation: Understanding Execution

The DSL engine itself can be introspected.

Evaluation Model

Scripts are evaluated in topological order based on variable dependencies:

# These can appear in any order in the file
@b BOOL_AND $a $a    # Depends on @a
@a NONEMPTY $list    # Depends on @list
@list FACTS_MATCHING dog IS_A ?  # No dependencies

# Execution order:
# 1. @list (no deps)
# 2. @a (depends on @list)
# 3. @b (depends on @a)
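The ordering above can be reproduced by extracting $var references from each statement and running Kahn's algorithm over the resulting dependency graph. This is a sketch of the idea, not the actual dsl_engine.js implementation.

```python
# Hypothetical sketch of the DSL engine's topological sorter.
import re
from collections import deque

def execution_order(script):
    """Return statement names ordered so that dependencies run first."""
    # "@b BOOL_AND $a $a" defines variable "b" and reads variable "a".
    defines = {line.split()[0].lstrip("@"): line for line in script}
    deps = {name: {d for d in re.findall(r"\$(\w+)", line) if d in defines}
            for name, line in defines.items()}
    indegree = {name: len(d) for name, d in deps.items()}
    users = {name: [] for name in defines}
    for name, d in deps.items():
        for dep in d:
            users[dep].append(name)
    ready = deque(sorted(n for n, k in indegree.items() if k == 0))
    order = []
    while ready:
        name = ready.popleft()
        order.append(name)
        for user in users[name]:
            indegree[user] -= 1
            if indegree[user] == 0:
                ready.append(user)
    return order  # shorter than len(script) iff the graph has a cycle

script = [
    "@b BOOL_AND $a $a",
    "@a NONEMPTY $list",
    "@list FACTS_MATCHING dog IS_A ?",
]
print(execution_order(script))  # ['list', 'a', 'b']
```

A returned order shorter than the script signals a circular dependency, which the engine would report as an error.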

Variable Composition

Variables enable metacognitive composition:

# Store a concept reference
@subject BIND_CONCEPT Water
@relation BIND_RELATION BOILS_AT
@value BIND_CONCEPT Celsius100

# Compose dynamically
@fact ASSERT $subject $relation $value

# The system "knows" it asserted Water BOILS_AT Celsius100
@reflection GET_LAST_ASSERTION
# Returns: { subject: "Water", relation: "BOILS_AT", object: "Celsius100" }

8. The Four Use Cases of Sys2DSL

Understanding the system's metacognitive purpose:

Use Case       Primary Commands                      Metacognitive Purpose
Define Theory  ASSERT, DEFINE_RELATION               Building the knowledge base
Validate       VALIDATE, ALL_REQUIREMENTS_SATISFIED  Self-checking consistency
Hypothesize    CF, THEORY_PUSH/POP, HYPOTHESIZE      Exploring possibilities
Prove          PROVE, ASK, EXPLAIN                   Deriving and explaining conclusions

9. Full Metacognitive Example

# === Domain Setup ===
@r1 DEFINE_RELATION SYMPTOM_OF inverse=HAS_SYMPTOM
@r2 DEFINE_RELATION TREATS inverse=TREATED_BY

@f1 ASSERT Fever SYMPTOM_OF Infection
@f2 ASSERT Headache SYMPTOM_OF Stress
@f3 ASSERT Aspirin TREATS Headache

# === Self-Assessment ===
@symptoms FACTS_MATCHING ? SYMPTOM_OF ?
@treatments FACTS_MATCHING ? TREATS ?
@symptomCount COUNT $symptoms
@treatmentCount COUNT $treatments

# === Hypothetical Reasoning ===
@_ THEORY_PUSH name="new_symptom_hypothesis"
@h1 ASSERT Fatigue SYMPTOM_OF Infection
@newSymptoms FACTS_MATCHING ? SYMPTOM_OF Infection
@_ THEORY_POP

# === Consistency Check ===
@valid VALIDATE

# === Explanation ===
@report TO_NATURAL "We know $symptomCount symptoms and $treatmentCount treatments"

Technical References