Elevating Structured Data
to Native Speed.

An open development toolkit seamlessly integrating the metadata expressiveness of o2c with the ultra-fast, lightweight binary serialization of LBIN.

Why olbin?

⚠️

The JSON Dilemma

Human-readable, but slow to parse and prone to bloated file sizes.

⚠️

The Protobuf Rigidness

Fast, but its rigid schema definitions struggle with dynamic data structures.

The Breakthrough

The Best of Both Worlds.

o2c Portability

Retains o2c's free-form, structured expressiveness without compromise.

LBIN Serialization

Ultra-lightweight binary output that keeps network bandwidth and CPU overhead to a minimum.

01

Ultra-Fast
Binary Serialization

Built on the LBIN format: near-zero parse time and serialization speed that outperforms mainstream formats.
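To see why fixed-layout binary beats text parsing, here is a generic comparison. This is a sketch and does not benchmark LBIN itself (the page publishes no benchmark harness); it contrasts Python's `json` tokenizer with a `struct` fixed-offset read, and the absolute numbers vary by machine.

```python
import json
import struct
import timeit

# Same logical record in two representations.
doc = '{"x": 120.5, "y": -45.2, "active": true}'      # text: must be tokenized
buf = struct.pack("<ff?", 120.5, -45.2, True)          # binary: fixed offsets

# Text parsing walks every character; binary decoding copies fixed-size fields.
t_json = timeit.timeit(lambda: json.loads(doc), number=100_000)
t_bin = timeit.timeit(lambda: struct.unpack("<ff?", buf), number=100_000)

print(f"json: {t_json:.4f}s  binary: {t_bin:.4f}s")
```

The binary read needs no scanning, quoting, or number parsing, which is the property the LBIN format is built around.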

02
🧩

Dynamic o2c
Integration

Type-safe yet flexible packaging for structured AI and canvas data, enabling schema design without rigid up-front definitions.

03
🛡️

Local-First,
Zero Lock-In

Frees developers from cloud vendor lock-in. Delivers extreme performance on private hardware and in local environments.

Interactive Playground

Feel the compression. Measure the nanoseconds.

input.o2c Size: 452 B
{
  "type": "entity",
  "metadata": {
    "id": "obj_001",
    "timestamp": 1715430000,
    "tags": ["dynamic", "fast"]
  },
  "payload": {
    "x": 120.5,
    "y": -45.2,
    "active": true
  }
}
output.lbin Size: 48 B
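The size drop shown in the playground can be approximated with plain length-prefixed binary packing. The layout below is only a sketch, not the actual LBIN wire format (which this page does not document); the field order, prefixes, and type widths are assumptions chosen for illustration.

```python
import struct

def pack_str(s: str) -> bytes:
    """Encode a string with a 1-byte length prefix (assumed layout)."""
    raw = s.encode("utf-8")
    return struct.pack("B", len(raw)) + raw

# Pack the playground's sample record field by field.
record = (
    pack_str("entity")                  # type
    + pack_str("obj_001")               # metadata.id
    + struct.pack("<I", 1715430000)     # metadata.timestamp as uint32
    + struct.pack("B", 2)               # tag count
    + pack_str("dynamic")
    + pack_str("fast")
    + struct.pack("<ff", 120.5, -45.2)  # payload.x, payload.y as float32
    + struct.pack("?", True)            # payload.active
)

print(len(record))  # a few dozen bytes, versus ~450 for the pretty-printed JSON
```

Dropping the key names, quotes, and whitespace and using fixed-width numbers is where most of the savings come from; a real format adds a small header and type tags on top of this.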

Quick Start

bash
# Install olbin toolchain
$ curl -fsSL https://olbin.dev/install.sh | sh
Downloading olbin-cli-darwin-arm64...
Successfully installed olbin to /usr/local/bin/olbin

# Ultra-fast compile from o2c to LBIN binary
$ olbin compile input.o2c -o output.lbin
Compiled in 0.04ms. Output size: 48 bytes.