Metric-Affine HDC represents concepts as D-byte vectors over Z_256 (integers 0-255). In AGISystem2, geometry = D (the number of byte channels per vector). The default is D=32, and this page illustrates that default configuration.
1. The Core Idea: Continuous Values in Compact Space
Fundamental Insight
Instead of thousands of binary bits, Metric-Affine uses just D byte channels (default D=32). Each channel is a byte (0-255), so the storage is D bytes = 8D bits. The continuous values enable fuzzy superposition where bundled concepts smoothly blend rather than voting discretely.
Definition: Metric-Affine Vector
A metric-affine vector V is a fixed-length sequence of bytes:
V = [b_0, b_1, ..., b_{D-1}] where b_i ∈ {0, 1, ..., 255}
Storage: D bytes (default 32 bytes = 256 bits).
2. The Shifted Baseline: Why 0.67, Not 0.5
Unlike binary HDC where random vectors share ~50% of bits, metric-affine vectors have a different baseline.
Theorem: Metric-Affine Baseline Similarity
For random vectors X, Y with each byte uniformly distributed over [0, 255]:
Expected absolute difference per byte: E[|X_i - Y_i|] = (256² - 1) / (3 × 256) ≈ 85.33
Hence the expected similarity is 1 - 85.33/255 ≈ 0.665, i.e. the random baseline sits near 0.67 rather than 0.5.
Important: Because the baseline is ~0.67 instead of 0.5, all similarity thresholds must be adjusted. A "good match" in metric-affine is ~0.75+, whereas in dense-binary it would be ~0.55+.
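The shifted baseline is easy to verify empirically. The sketch below (illustrative NumPy code, not AGISystem2's actual API) samples many random byte-vector pairs and measures their average normalized L1 similarity:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 32  # default geometry: 32 byte channels

# Draw many random byte vectors and measure the average pairwise similarity.
X = rng.integers(0, 256, size=(10_000, D), dtype=np.int64)
Y = rng.integers(0, 256, size=(10_000, D), dtype=np.int64)

l1 = np.abs(X - Y).sum(axis=1)   # Manhattan distance per pair
sim = 1.0 - l1 / (D * 255)       # normalized L1 similarity
print(round(float(sim.mean()), 3))  # close to the ~0.665 shifted baseline
```

The mean lands near 0.665 rather than 0.5, which is why the match thresholds above are shifted upward.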
3. Binding: BIND (XOR on Bytes)
Metric-Affine implements BIND as byte-wise XOR.
Definition: BIND (byte-wise XOR)
Given vectors A and B, their binding BIND(A, B) is computed byte-by-byte:
BIND(A, B)_i = A_i XOR B_i
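A minimal sketch of byte-wise XOR binding (illustrative NumPy code; the `bind` helper name is hypothetical, not AGISystem2's API):

```python
import numpy as np

def bind(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Byte-wise XOR binding of two metric-affine vectors (dtype uint8)."""
    return np.bitwise_xor(a, b)

rng = np.random.default_rng(1)
A = rng.integers(0, 256, size=32, dtype=np.uint8)
B = rng.integers(0, 256, size=32, dtype=np.uint8)

AB = bind(A, B)
# XOR is self-inverse, so binding with B again recovers A exactly.
print(np.array_equal(bind(AB, B), A))  # True
```

Because XOR is its own inverse, the same operation serves as both bind and unbind.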
4. Bundling: Arithmetic Mean
Unlike binary majority vote, metric-affine uses arithmetic mean for bundling.
Intuition: The bundle is a "smooth blend" of all inputs. Unlike binary majority vote (all-or-nothing per bit), arithmetic mean creates a continuous interpolation. The result lies geometrically between all inputs in the space.
Gray Convergence Warning: Unlike binary majority vote, arithmetic mean causes values to drift toward the middle (128) over repeated bundling operations. Deep nested bundling can lose concept distinctiveness.
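Mean-based bundling and the gray-convergence effect can be sketched as follows (illustrative NumPy code; `bundle` and `sim` are hypothetical helper names, not AGISystem2's API):

```python
import numpy as np

def bundle(vectors):
    """Arithmetic-mean bundling, rounded back to the byte range."""
    return np.rint(np.mean(vectors, axis=0)).astype(np.uint8)

def sim(a, b):
    """Normalized L1 similarity (see Section 5)."""
    return 1.0 - np.abs(a.astype(int) - b.astype(int)).sum() / (a.size * 255)

rng = np.random.default_rng(2)
vs = [rng.integers(0, 256, size=32, dtype=np.uint8) for _ in range(4)]
b = bundle(vs)

# The bundle lies between its inputs: each input is noticeably more
# similar to the bundle than the ~0.67 random baseline.
print([round(sim(b, v), 2) for v in vs])

# Gray convergence: bundling many random vectors pushes every channel
# toward the midpoint 128, erasing distinctiveness.
deep = bundle([rng.integers(0, 256, size=32, dtype=np.uint8) for _ in range(256)])
print(float(np.abs(deep.astype(int) - 128).mean()))  # small, vs ~64 for a random vector
```

The second print shows how far the channels of a wide bundle sit from 128: much closer than a fresh random vector, which is the drift the warning above describes.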
5. Similarity: Manhattan (L1) Distance
Definition: Normalized L1 Similarity
sim(A, B) = 1 - (Σ_i |A_i - B_i|) / (D × 255)
The sum of absolute byte differences, normalized to [0, 1]. With the default D=32, the denominator is 32 × 255 = 8160.
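A direct sketch of the similarity formula (illustrative NumPy code; `sim` is a hypothetical helper name):

```python
import numpy as np

def sim(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized L1 similarity over D byte channels, in [0, 1]."""
    D = a.shape[0]
    dist = np.abs(a.astype(np.int64) - b.astype(np.int64)).sum()
    return 1.0 - dist / (D * 255)

rng = np.random.default_rng(3)
A = rng.integers(0, 256, size=32, dtype=np.uint8)
B = rng.integers(0, 256, size=32, dtype=np.uint8)

print(sim(A, A))            # 1.0: identical vectors
print(round(sim(A, B), 2))  # near the ~0.67 random baseline
```

Note the cast to a signed integer type before subtracting: subtracting uint8 arrays directly would wrap around modulo 256.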
6. Strategy Comparison
| Property | Dense-Binary | SPHDC | Metric-Affine |
|---|---|---|---|
| Memory | 4 KB | k-sparse (varies) | D bytes (default 32) |
| Values | Binary {0, 1} | BigInt exponents | Bytes {0..255} |
| Bind | XOR bitwise | Cartesian XOR | XOR byte-wise |
| Bundle | Majority vote | Set union | Arithmetic mean |
| Similarity | Hamming | Jaccard | L1 Manhattan |
| Baseline | 0.5 | ~0.01 | ~0.67 |
| Holographic | Yes (classic) | Limited | Yes (fuzzy) |
| Best For | General HDC | Symbolic only | Fuzzy reasoning |
7. Use Cases
Ideal Applications
Embedded Systems: at D bytes per vector (default 32 bytes), vectors are small enough for memory-constrained devices
8. Extension: Metric-Affine Elastic (EMA)
Metric-Affine Elastic (EMA) extends this strategy with chunked bundling (bounded depth) to reduce gray drift under large superpositions. Geometry (D bytes) remains a manual tuning knob in both strategies. Read the EMA theory page →
9. Mathematical Properties
Key Properties of Metric-Affine HDC
Perfect XOR cancellation: BIND(BIND(A, B), B) = A exactly
Commutativity: BIND(A, B) = BIND(B, A)
Associativity: BIND(BIND(A, B), C) = BIND(A, BIND(B, C))
Continuous Bundling: Smooth interpolation between concepts
Compact Storage: D bytes regardless of complexity (default 32)
Shifted Baseline: ~0.67 for random (requires threshold adjustment)
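The XOR identities above can be checked directly (illustrative NumPy sketch; `bind` here is just `np.bitwise_xor`, not AGISystem2's API):

```python
import numpy as np

rng = np.random.default_rng(4)
A, B, C = (rng.integers(0, 256, size=32, dtype=np.uint8) for _ in range(3))

bind = np.bitwise_xor

# Perfect cancellation: unbinding with B recovers A byte-for-byte.
assert np.array_equal(bind(bind(A, B), B), A)
# Commutativity and associativity of byte-wise XOR.
assert np.array_equal(bind(A, B), bind(B, A))
assert np.array_equal(bind(bind(A, B), C), bind(A, bind(B, C)))
print("all XOR properties hold")
```

These identities hold exactly (not approximately), which is what makes unbinding lossless in this strategy.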