What are Vector Symbolic Architectures (VSA)?
Vector Symbolic Architectures (VSA), also known as Hyperdimensional Computing (HDC), represent symbols and data structures as very wide vectors (typically on the order of 10,000 dimensions). Instead of relying on learned dense weight matrices, the paradigm manipulates these hypervectors with a small, fixed set of structured algebraic operations.
Core Algebraic Operations
- Superposition (+): Element-wise addition (usually followed by thresholding) that aggregates vectors to represent sets or collections; the result remains similar to each of its members.
- Binding (⊗): Element-wise combination that associates independent vectors (e.g., a key and a value) into a composite representation that is dissimilar to both inputs yet invertible.
- Permutation (ρ): A fixed reordering of vector components (e.g., a cyclic shift) used to encode sequences and structural hierarchies; see the sketch after this list.
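A minimal sketch of these three operations, assuming random bipolar (+1/−1) hypervectors and NumPy; the dimension, the sign-based thresholding, and the helper names are illustrative choices rather than a fixed standard:

```python
import numpy as np

D = 10_000
rng = np.random.default_rng(0)

def hv():
    # Random bipolar hypervector; random vectors this wide are nearly orthogonal.
    return rng.choice([-1, 1], size=D)

def sim(a, b):
    # Normalized dot product in [-1, 1]; ~0 for unrelated vectors.
    return float(a @ b) / D

a, b, c = hv(), hv(), hv()

# Superposition: the thresholded sum stays similar to each member.
s = np.sign(a + b + c)
print(sim(s, a))                      # clearly positive (~0.5)

# Binding: the element-wise product is dissimilar to both inputs,
# but self-inverse, so the pair can be unbound exactly.
p = a * b
print(sim(p, a), sim(p * b, a))       # ~0.0, then 1.0

# Permutation: a cyclic shift marks position, so (a, b) != (b, a).
print(sim(np.sign(a + np.roll(b, 1)), np.sign(b + np.roll(a, 1))))  # ~0.0
```

The near-orthogonality of random hypervectors is what makes superpositions readable and bindings recoverable.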
Theoretical and Practical Utility
Current LLMs function primarily as statistical next-token predictors. VSA adds capabilities for structured and auditable reasoning:
- Transparency: Operations are algebraic, allowing for the decomposition ("unbinding") of representations to audit reasoning chains.
- Efficiency: With dense binary hypervectors, binding reduces to bitwise XOR, superposition to a bitwise majority vote, and similarity to a Hamming distance computed with popcount, all of which run fast on commodity CPUs (see the binary sketch after this list).
- Structural Manipulation: VSA supports symbolic tasks such as variable binding and tree traversal using fixed-width vector operations, so the cost of each step depends only on the vector width, not on the size of the encoded structure.
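A sketch of that binary formulation, using plain Python integers as packed bit vectors; the helper names and the dimension are assumptions for illustration, not an API from any particular library:

```python
import random

D = 10_000
random.seed(0)

def hv():
    # Random D-bit hypervector packed into a Python int.
    return random.getrandbits(D)

def bind(a, b):
    # Binding is bitwise XOR: self-inverse, output dissimilar to both inputs.
    return a ^ b

def superpose(*vs):
    # Superposition is a bitwise majority vote (no ties for an odd count).
    out = 0
    for i in range(D):
        if sum((v >> i) & 1 for v in vs) * 2 > len(vs):
            out |= 1 << i
    return out

def hamming(a, b):
    # Similarity via popcount of the XOR (Hamming distance).
    return bin(a ^ b).count("1")

key, value, noise = hv(), hv(), hv()
record = bind(key, value)

# Unbinding with the correct key recovers the value exactly;
# a wrong key yields noise at distance ~D/2.
print(hamming(bind(record, key), value))    # 0
print(hamming(bind(record, noise), value))  # ~5000
```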
Objective
VSA makes it practical to build semantic DSLs and knowledge graphs that run entirely on CPUs. Because symbols and relations live in the same hyperdimensional space, semantic search and relational reasoning can be performed without continuous reliance on GPU-bound embedding models.
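As a hypothetical end-to-end example, the sketch below encodes two facts as bound triples, superposes them into a single memory vector, and answers a query by unbinding and cleaning up against a symbol codebook; the symbol names and the codebook design are illustrative assumptions:

```python
import numpy as np

D = 10_000
rng = np.random.default_rng(1)
codebook = {}  # symbol name -> bipolar hypervector

def sym(name):
    # Allocate (or look up) a random hypervector for a discrete symbol.
    if name not in codebook:
        codebook[name] = rng.choice([-1, 1], size=D)
    return codebook[name]

def cleanup(v):
    # "Cleanup memory": return the stored symbol most similar to v.
    return max(codebook, key=lambda name: codebook[name] @ v)

# Encode two facts as bound triples and superpose them into one vector.
# Thresholding a two-term sum leaves some zero components; they act as noise.
facts = np.sign(
    sym("Paris") * sym("capital_of") * sym("France")
    + sym("Berlin") * sym("capital_of") * sym("Germany")
)

# Query "what is capital_of France?": unbind relation and object, then clean up.
answer = cleanup(facts * sym("capital_of") * sym("France"))
print(answer)  # Paris
```

The same unbinding step is what makes the representation auditable: any stored vector can be decomposed back into the symbols it was built from.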