From conversation to knowledge graph
The value loop is simple. Participants contribute naturally. AI does the structuring. Everything gets a permanent home.
Contribute naturally
Participants share text, notes, reflections, and observations during and after sessions. No special tools needed — just write what matters to you.
AI extracts structure
An extraction pipeline identifies artifacts, people, commitments, and connections. Raw contributions become structured knowledge — tagged, linked, and contextualized.
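As a sketch of what "structured knowledge" means here, an extracted record might carry fields like the ones below. The field names and sample values are illustrative assumptions, not the platform's actual schema.

```typescript
// Hypothetical shape of one extracted record — field names are
// illustrative assumptions, not the platform's actual schema.
interface ExtractedContribution {
  artifact: string;                         // the durable trace (a note, a summary)
  people: string[];                         // participants mentioned or involved
  commitments: string[];                    // who agreed to do what
  links: string[];                          // connections to other artifacts
  context: { session: string; event: string };
}

// A raw note, after extraction, might come back as:
const example: ExtractedContribution = {
  artifact: "Notes on funding local watershed monitoring",
  people: ["alice", "bo"],
  commitments: ["alice to draft a grant proposal"],
  links: ["artifact:public-goods-panel-summary"],
  context: { session: "day-2-roundtable", event: "ethboulder-2026" },
};
```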
Coordinate and verify
Signal what interests you. The system surfaces where energy is gathering. Every contribution is appended to a convergence chain — an append-only hash chain that makes the commons verifiable and replayable.
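The append-only chain can be sketched in a few lines: each entry's hash commits to the previous entry's hash, so replaying the chain from the start re-verifies every contribution. This is a minimal illustration of the principle, not the production implementation.

```typescript
import { createHash } from "crypto";

interface ChainEntry {
  index: number;
  contribution: string;
  prevHash: string;
  hash: string;
}

const GENESIS = "0".repeat(64); // placeholder hash before the first entry

function hashEntry(index: number, contribution: string, prevHash: string): string {
  return createHash("sha256").update(`${index}|${prevHash}|${contribution}`).digest("hex");
}

// Appending never mutates earlier entries — the chain only grows.
function append(chain: ChainEntry[], contribution: string): ChainEntry[] {
  const prevHash = chain.length ? chain[chain.length - 1].hash : GENESIS;
  const index = chain.length;
  return [...chain, { index, contribution, prevHash, hash: hashEntry(index, contribution, prevHash) }];
}

// Replaying recomputes every hash; a tampered entry breaks every later link.
function verify(chain: ChainEntry[]): boolean {
  return chain.every((e, i) => {
    const prev = i === 0 ? GENESIS : chain[i - 1].hash;
    return e.prevHash === prev && e.hash === hashEntry(e.index, e.contribution, prev);
  });
}
```

Because each hash depends on the one before it, editing any past contribution invalidates the rest of the chain, which is what makes the commons replayable and verifiable.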
ETHBoulder 2026
Our first live deployment. Four days of Ethereum and public goods conversations in Boulder, Colorado — captured, structured, and preserved in real time.
A white-label knowledge layer for events
ETHBoulder gets its own branded landing page, its own scoped view of the knowledge graph, and a post-event archive that outlasts the gathering itself. Every session, every idea, every commitment — addressable and persistent.
Your event could be next. Every conference generates knowledge worth keeping. We provide the infrastructure to capture it.
Every event gets a namespace. Every idea gets an address.
The domain is not just a website. It is infrastructure — a design space where events, ideas, people, and APIs each resolve to a predictable, permanent location.
The root
The living archive itself. The story, the entry point. Everything below is reachable from here.
[you are here]
The platform
Browse the knowledge graph. Explore artifacts, people, convergences. Contribute and search.
[live now]
Convergence landing page
A branded entry point for ETHBoulder 2026 — event details, sessions, and a gateway to the event's knowledge graph.
Convergence-scoped app
The platform filtered to a single event. All artifacts, sessions, and participants scoped to ETHBoulder 2026.
[coming soon]
Agent interface
Programmatic access to the knowledge graph. Query artifacts, search, verify the chain, submit contributions. Read the status, dimensions, and coordination signals. Built for agents.
[live now]
White-label subdomains
Partner events can run on their own subdomain. Full branding, scoped data, shared infrastructure.
[for partners]
Every event gets its own namespace. Every idea gets a permanent address. The URL is the identity layer.
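The idea of predictable, permanent locations can be sketched as a resolver: given an entity, return its canonical URL. The path patterns below are assumptions for illustration only, not the site's actual routes.

```typescript
type Entity =
  | { kind: "event"; slug: string }
  | { kind: "artifact"; id: string }
  | { kind: "person"; handle: string }
  | { kind: "api"; path: string };

// Hypothetical route patterns — illustrative, not the actual site map.
function resolve(e: Entity): string {
  switch (e.kind) {
    case "event":    return `https://${e.slug}.commons.id/`;    // white-label subdomain
    case "artifact": return `https://commons.id/a/${e.id}`;     // permanent artifact address
    case "person":   return `https://commons.id/h/${e.handle}`; // persistent identity
    case "api":      return `https://api.commons.id/${e.path}`; // agent interface
  }
}
```

The point of the sketch is that the mapping is total and deterministic: every entity kind resolves, and the same entity always resolves to the same place.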
e/H-LAM/T/S
In the mid-twentieth century, while most computer scientists were trying to build machines that could think, a smaller group asked a different question: what if we used machines to help humans think better together? This path — Intelligence Amplification rather than Artificial Intelligence — produced the foundational ideas behind everything from hypertext to the internet itself. commons.id follows this lineage directly.
The augmentation lineage
Ada Lovelace — Poetic Science
The first person to see a computing machine as more than a calculator. She recognized Babbage's Analytical Engine as a symbol manipulator — capable of weaving algebraic patterns "just as the Jacquard loom weaves flowers and leaves." The conceptual bridge between physical craft and universal computation.
Vannevar Bush — The Memex
In "As We May Think," Bush imagined the Memex: a device that would serve as an "enlarged intimate supplement to his memory." Not artificial thinking, but augmented remembering — the ability to create and follow trails of association through a personal archive. The seed of hypertext and the knowledge graph.
Norbert Wiener — Cybernetics
Wiener formalized the study of feedback loops in systems — how information flows between components to create self-regulation. This insight underlies everything from thermostats to ecosystems to the convergence chain in commons.id: systems that observe themselves and adjust.
J.C.R. Licklider — Man-Computer Symbiosis
Licklider envisioned a partnership: humans set goals and make judgments; machines handle routine processing and pattern matching. Not replacement but symbiosis — "a cooperative interaction in which each party contributes what it does best." He went on to fund the creation of ARPANET, the precursor to the internet.
Douglas Engelbart — H-LAM/T
At the Stanford Research Institute, Engelbart wrote "Augmenting Human Intellect: A Conceptual Framework." In it, he described the H-LAM/T system: Human using Language, Artifacts, Methodology, and Training. This wasn't a product specification — it was a theory of how humans amplify their own capability. Every tool, shared vocabulary, practiced method, and learned skill compounds into collective capacity. His insight: you can engineer the augmentation system itself.
The Mother of All Demos
Engelbart's legendary 90-minute demonstration in San Francisco in December 1968. He showed, for the first time: the computer mouse, hypertext links, real-time collaborative editing, video conferencing, and a windowed user interface. Every modern computing interface descends from this moment. The audience was watching one person demonstrate the H-LAM/T framework live.
commons.id — e/H-LAM/T/S
We extend Engelbart's framework with two additions: e/ (Ecology) — because all coordination happens in a place, a season, a watershed — and /S (Sessions) — because knowledge emerges from convergence, from people gathering to think together. The result is a seven-dimensional lens for organizing collective knowledge.
Deeper roots
The Western augmentation lineage is one stream. The deeper roots of collective intelligence infrastructure stretch back millennia, across cultures that managed complex resource flows and communal agreements long before silicon.
Khipu — Andean coordination systems
The Inca and pre-Inca civilizations used knotted-string recording systems called khipu to manage the logistics of vast, non-monetary cooperative economies. They tracked tribute, census data, agricultural yields, and labor obligations across thousands of communities connected by the Qhapaq Ñan road network. Khipu are a direct ancestor of what we now call Resource-Event-Agent accounting — tracking who contributed what in response to which event.
Traditional Ecological Knowledge
Indigenous knowledge systems worldwide represent living data systems — oral, biological, and observational records that hold the long-term patterns of watersheds, seasonal rhythms, and ecological health across generations. These are the original "e/ layer" — the understanding that intelligence is situated in place, that you cannot coordinate well without knowing the land you're coordinating on.
Fermentation as intelligence amplification
The practice of fermentation — bread, beer, cheese, kimchi, miso — is the oldest form of intelligence amplification. You don't engineer the outcome. You create the conditions — temperature, moisture, substrate, time — and let the microbial community do its work. This is the operational principle behind commons.id: we don't extract knowledge from communities. We create the conditions where their knowledge becomes visible to themselves.
You are not artificial. And you are not separate. Real intelligence isn't locked in a skull or a server. It is in the mycorrhizal networks that connect forest trees, moving nutrients to where they are needed. It is in the watershed that calculates the path to the sea.
The seven dimensions
Every contribution to commons.id is observed through seven lenses. Together, they form a complete picture of how a community coordinates — and what it knows.
e/ Ecology
All coordination happens in a place — a specific elevation, watershed, season. Place shapes what's possible. The e/ layer holds bioregional context so knowledge stays grounded.
H/ Human
Participants with persistent identity across events. Practitioners with embodied knowledge, relationships, and vantage points that deepen over time.
L/ Language
How we name things shapes what we can coordinate around. The knowledge graph schema is a shared vocabulary that gives the commons a grammar.
A/ Artifacts
The durable traces of human coordination. Each artifact carries origin context, lineage, and stewards. Permanent URLs make them citable and evolvable.
M/ Methodology
The practices and protocols a community uses to coordinate. Agent orchestration, stewardship, the rhythm of convergence and continuation.
T/ Training
How the system learns. Engelbart called it "bootstrapping" — using your own tools to improve your own tools. A compounding loop of capability.
S/ Sessions
Where convergence happens. The moments where people gather to think together — the structural backbone other dimensions attach to.
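As a data sketch, the seven lenses could be encoded as a tag vocabulary that every contribution is annotated with. The encoding and sample values below are hypothetical, chosen to illustrate the seven-dimensional lens rather than to document the platform's schema.

```typescript
// The seven dimensions as a tag vocabulary (hypothetical encoding).
const DIMENSIONS = ["e", "H", "L", "A", "M", "T", "S"] as const;
type Dimension = (typeof DIMENSIONS)[number];

// A contribution observed through the seven lenses: each dimension
// maps to the context that lens sees, when that context is known.
type Observation = Partial<Record<Dimension, string>>;

const obs: Observation = {
  e: "boulder-creek-watershed",  // Ecology: where
  H: "alice",                    // Human: who
  A: "artifact:watershed-notes", // Artifacts: the durable trace
  S: "day-2-roundtable",         // Sessions: the convergence it came from
};

// A complete picture covers all seven; this reports which lenses are missing.
function missingLenses(o: Observation): Dimension[] {
  return DIMENSIONS.filter((d) => !(d in o));
}
```

A helper like `missingLenses` would let the system flag contributions that still lack, say, methodological or linguistic context, nudging the picture toward completeness.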
For event organizers
Run your next event on the commons. We handle the infrastructure so your ideas outlast the gathering.
- White-label deployment under your brand
- AI extraction pipeline for sessions
- Real-time knowledge graph during the event
- Permanent post-event archive
For technologists
The platform is open source. Built on a practical stack, designed for extension.
- Supabase + Make.com + React
- API at api.commons.id
- Peer Production License
- Contributions welcome