What is — captured.
Photos, sensor feeds, Gaussian splats, drone footage.
The physical world signed at the moment of capture and folded into the same content-addressed graph as everything else you produce.
Most capture pipelines treat the physical world as a problem to be uploaded. Photos sit on phones until someone gets round to it. Drone surveys land in inboxes as 200 GB zip files nobody opens twice. IoT readings stream into time-series stores with no link to the asset they describe. The data exists; the provenance does not.
wot.is signs at source. The moment a photo, scan, sensor reading, drone frame, or 3D splat is captured, it becomes a signed thought — content-addressed, identity-bound, timestamped, and because-linked to the device that made it and the operator who triggered it. By the time it reaches storage, the chain of custody is already complete.
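A minimal sketch of what "signed thought" could mean in practice, using only the Python standard library. The function name, field names, and key handling here are illustrative assumptions, not the wot.is API: BLAKE2b stands in for the BLAKE3 the substrate uses, and an HMAC over the canonical record stands in for a real device-held signature (e.g. Ed25519).

```python
import hashlib
import hmac
import json
import time

def capture_thought(payload: bytes, device_key: bytes,
                    operator: str, because: list[str]) -> dict:
    """Illustrative sketch: turn raw capture bytes into a signed,
    content-addressed, because-linked record at the moment of capture."""
    # Content address: hash of the raw capture bytes.
    # (BLAKE2b stands in for BLAKE3 here; both are keyed, tree-capable hashes.)
    content_id = hashlib.blake2b(payload, digest_size=32).hexdigest()
    record = {
        "content_id": content_id,
        "operator": operator,
        "captured_at": time.time(),
        "because": because,  # content ids of e.g. the flight plan, the device identity
    }
    canonical = json.dumps(record, sort_keys=True).encode()
    # HMAC with a device key stands in for a real public-key signature.
    record["signature"] = hmac.new(device_key, canonical,
                                   hashlib.blake2b).hexdigest()
    return record

frame = capture_thought(b"...jpeg bytes...", b"device-secret",
                        operator="surveyor-7", because=["flightplan-cid"])
```

The point of the shape, not the crypto: the identity binding and the because-links are part of the record itself, so provenance travels with the asset rather than being reconstructed later.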
Behind the scenes: zero-copy ingestor ravens (mmap → SIMD → BLAKE3 → durable WAL) move data from sensor to signed thought without round-tripping through middleware. The capture format is whatever the device produces; the substrate handles framing, signing, and provenance.
You see a unified asset graph: every photo from every site visit, every drone splat from every survey, every IoT reading from every line — all queryable as one signed dataset, all walkable back to the moment and operator that captured them.
If you're a site surveyor flying drones over construction works, an asset manager tracking machinery condition through embedded sensors, an insurance assessor capturing damage with photos that need to stand up in dispute, a heritage organisation 3D-scanning artefacts that can't be re-scanned, or a safety lead managing body-cam footage that needs chain-of-custody — this is built for you. Capture is signed at source; provenance is structural; the receipts come with the asset.
Drone splat survey, signed at the rotor, walkable months later.
A site surveyor flies a drone across a partially-built block. The drone is paired with a wot.is identity. As frames stream in, each is signed at the rotor — content-addressed, timestamped, GPS-stamped, because-linked to the flight plan and operator. The on-device pipeline computes a Gaussian splat reconstruction; the splat itself is signed and linked to the underlying frames it was derived from.
By the time the drone lands, the survey is already in the graph. The site dashboard updates. The structural engineer can walk the splat in their browser and click any region to see the source frames. The QS can check the splat against the BIM model and see exactly where divergence happened, each discrepancy traceable to a timestamped frame and the operator who captured it.
Six months later a dispute lands. "What did the corner of the eastern block look like on 12 June?" The archive is queryable: the splat, the frames it derived from, the operator's signature, the drone's identity. No reconstruction. No "we'd need to find the original SD card." A walkable signed record from the moment the rotor turned.
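The dispute query itself is a plain filter over signed records. A hedged sketch, assuming the archive is queryable as records carrying `captured_at` timestamps and a site `region` tag (both field names are assumptions for illustration, not the wot.is query interface):

```python
from datetime import datetime, timezone

def captures_on(archive: list[dict], day: str, region: str) -> list[dict]:
    """Return the signed capture records for a given day and site region.
    Each record already carries its operator, device, and because-links,
    so the result set *is* the evidence, not a pointer to it."""
    start = datetime.fromisoformat(day).replace(tzinfo=timezone.utc)
    end = start.replace(hour=23, minute=59, second=59)
    return [
        r for r in archive
        if r["region"] == region
        and start <= datetime.fromtimestamp(r["captured_at"],
                                            tz=timezone.utc) <= end
    ]
```

"What did the eastern corner look like on 12 June?" becomes `captures_on(archive, "2024-06-12", "east-corner")`, and each returned record walks back to the frame, the splat, the operator's signature, and the drone's identity.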
Capture once. Sign at source. Walkable forever.