`arrowspace`: Capabilities, speed and accuracy
Testing all aspects of Graph Wiring on semantic data.
- How fast is `arrowspace`
- How accurate is `arrowspace`
- How `arrowspace` can meaningfully improve RAG systems
I am Lorenzo, an AI Research Engineer. I produce novel research and code leveraging Large Language Models (GPTs and LLMs), with a focus on workflow automation with AI Agents and code generation.
Also check out my research on a new generation of data engineering tools and the list of my publications on this page.
Domain memory injected directly inside self-attention via a persistent Graph Laplacian (distilled knowledge graphs with arrowspace).
AI safety through topology‑aware, energy‑informed retrieval that separates stable facts from risky intuitions.
arrowspace is game-changing for data operations at scale: a test‑bed milestone for a unified vector, graph, and key‑value engine built on spectral indexing and energy‑informed search.
A deep dive into a Rust implementation of a decoder-only transformer inspired by Karpathy's nanochat.
A Rust implementation of DeepSeek-OCR compression achieves a 10× token reduction, while ArrowSpace v0.18.0 introduces energy-informed retrieval that replaces cosine similarity with spectral graph properties.
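To make "spectral graph properties" a little more concrete, here is a minimal, self-contained sketch in plain Rust. It is not the arrowspace API and all names in it are illustrative: it builds a graph Laplacian L = D - A over a toy set of embeddings and computes a Rayleigh-quotient energy, the kind of smoothness measure that spectral approaches can rank on instead of relying only on cosine similarity.

```rust
// Hedged sketch: one way to compute a spectral "energy" over an embedding graph.
// This is NOT the arrowspace API; every name here is hypothetical.

/// Cosine similarity between two dense vectors.
fn cosine(a: &[f64], b: &[f64]) -> f64 {
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f64 = a.iter().map(|x| x * x).sum::<f64>().sqrt();
    let nb: f64 = b.iter().map(|x| x * x).sum::<f64>().sqrt();
    dot / (na * nb + 1e-12)
}

/// Build a dense graph Laplacian L = D - A from pairwise cosine similarities,
/// keeping only edges above `threshold` so the graph stays reasonably sparse.
fn laplacian(items: &[Vec<f64>], threshold: f64) -> Vec<Vec<f64>> {
    let n = items.len();
    let mut l = vec![vec![0.0; n]; n];
    for i in 0..n {
        for j in (i + 1)..n {
            let w = cosine(&items[i], &items[j]);
            if w > threshold {
                l[i][j] = -w;
                l[j][i] = -w;
                l[i][i] += w;
                l[j][j] += w;
            }
        }
    }
    l
}

/// Rayleigh quotient xᵀLx / xᵀx: low values mean the signal `x` varies smoothly
/// over the graph, high values mean it is "rough" (high energy).
fn rayleigh(l: &[Vec<f64>], x: &[f64]) -> f64 {
    let n = x.len();
    let mut num = 0.0;
    for i in 0..n {
        for j in 0..n {
            num += x[i] * l[i][j] * x[j];
        }
    }
    let den: f64 = x.iter().map(|v| v * v).sum();
    num / (den + 1e-12)
}

fn main() {
    // Toy corpus of 3-dimensional "embeddings".
    let items = vec![
        vec![1.0, 0.1, 0.0],
        vec![0.9, 0.2, 0.1],
        vec![0.0, 1.0, 0.9],
        vec![0.1, 0.9, 1.0],
    ];
    let l = laplacian(&items, 0.5);
    // One embedding dimension, viewed as a signal over the item graph.
    let signal: Vec<f64> = items.iter().map(|v| v[0]).collect();
    println!("energy of dimension 0 over the graph: {:.4}", rayleigh(&l, &signal));
}
```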
Version 0.16.0 is out, with notable news and encouraging results that place `arrowspace` among the fastest approximate nearest neighbour algorithms available in the open.
Vector databases have become the backbone of modern AI workflows, particularly in RAG systems, but traditional approaches are fundamentally limited: they miss the deeper structural patterns that define how information relates within a domain. Discover how ArrowSpace introduces energy-informed indexing through taumode, giving AI systems a memory that understands domain context through spectral signatures and graph Laplacian energy.
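As a rough illustration only, the sketch below shows one way an energy-informed score could blend geometric similarity with proximity in spectral energy at query time. The `Item` struct, the `energy_informed_score` function, and the `alpha` parameter are assumptions invented for this example; they are not the taumode formula or the arrowspace API.

```rust
// Hedged sketch of energy-informed re-ranking: each indexed item carries a
// precomputed spectral energy alongside its embedding, and search blends
// cosine similarity with energy proximity. The blending rule is illustrative.

struct Item {
    id: usize,
    embedding: Vec<f64>,
    energy: f64, // e.g. a Rayleigh-quotient-style score computed at index time
}

fn cosine(a: &[f64], b: &[f64]) -> f64 {
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f64 = a.iter().map(|x| x * x).sum::<f64>().sqrt();
    let nb: f64 = b.iter().map(|x| x * x).sum::<f64>().sqrt();
    dot / (na * nb + 1e-12)
}

/// Blend geometric similarity with closeness in spectral energy.
/// `alpha` = 1.0 recovers plain cosine ranking.
fn energy_informed_score(query: &[f64], query_energy: f64, item: &Item, alpha: f64) -> f64 {
    let geometric = cosine(query, &item.embedding);
    let energy_affinity = 1.0 / (1.0 + (query_energy - item.energy).abs());
    alpha * geometric + (1.0 - alpha) * energy_affinity
}

fn main() {
    let index = vec![
        Item { id: 0, embedding: vec![1.0, 0.1, 0.0], energy: 0.12 },
        Item { id: 1, embedding: vec![0.9, 0.2, 0.1], energy: 0.95 },
    ];
    let query = vec![1.0, 0.15, 0.05];
    let query_energy = 0.10; // assumed to come from the same indexing pipeline

    let mut ranked: Vec<(usize, f64)> = index
        .iter()
        .map(|it| (it.id, energy_informed_score(&query, query_energy, it, 0.7)))
        .collect();
    ranked.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    println!("ranking (id, score): {:?}", ranked);
}
```

In this toy setup the two items are nearly identical by cosine similarity, so the energy term is what separates them; that is the intuition behind letting spectral structure, not geometry alone, drive retrieval.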