Top Level System
A universal modular C microkernel that connects hot-loadable modules through path-based message routing. Everything is a path. Every interaction is a message.
What is TLS?
Top Level System is a new universe for building solutions and applications. The main program is named portal because it is the entry point — the door through which you access this universe. You don't interact with modules directly; you don't call languages directly; you don't touch devices directly. You go through the portal, and the portal routes you to wherever you need to be.
TLS is a core that does almost nothing by itself. It loads modules, routes messages between them via paths, and manages their lifecycle. Web servers, database connectors, serial port readers, IoT controllers, AI agents, node federation — everything is a module.
The core provides:
- Path-based routing — O(1) hash table lookup with wildcard fallback
- Universal message system — one structure for all communication
- Label-based ACL — groups on users, labels on paths, intersection = access
- Hot-loadable modules — load, unload, reload at runtime with reference counting
- Cross-platform event loop — embedded libev (epoll/kqueue/select)
- Module crash isolation — the core survives module segfaults
- Message tracing — trace_id, timestamp, hop count on every message
- Pub/Sub events — ACL-controlled event subscriptions with pattern matching
Quick Start
Build
```sh
# Detect libraries and generate config
./configure

# Build core + 50 modules
make clean && make

# Run 57 unit tests
make tests

# Install to /usr/local
make install
```
Create an Instance
```sh
# Create instance (auto: ports, certs, users, 50 module configs)
portal -C myapp

# Start foreground with debug
portal -n myapp -f -d

# Or via systemd
systemctl start portal-myapp

# Connect CLI
portal -n myapp -r
```
Use
```sh
# CLI — navigate like a filesystem
portal:/> ls
core/  auth/  users/  groups/  events/  web/  node/  iot/  ...
portal:/> get /core/status
Portal v1.0.0 — running, 50 modules, 216 paths

# HTTP — every path is a REST endpoint
curl http://host:8080/api/core/status
curl http://host:8080/api/iot/resources/devices
curl http://host:8080/api/node/resources/peers

# Remote nodes — transparent federation
portal:/> get /remote-node/core/status
portal:/> get /remote-node/iot/resources/devices
```
Six Interfaces
TLS exposes its path system through six simultaneous interfaces. All share the same paths, the same ACL, and the same data:
| Interface | Protocol | Description |
|---|---|---|
| CLI | UNIX socket | Interactive shell with arrow keys, history, tab completion |
| HTTP | HTTP/1.1 | REST API — every path becomes an endpoint |
| HTTPS | TLS | Encrypted REST with self-signed or custom certs |
| Core TCP | Wire protocol | Binary protocol for direct integration |
| Core UDP | Wire protocol | Stateless binary protocol |
| SSH | SSH | Full CLI accessible via any SSH client |
Three Storage Backends
Every change writes to all active backends simultaneously:
| Backend | Type | Description |
|---|---|---|
| file | Always active | INI-style config files in the instance directory |
| sqlite | Optional | Local SQLite database with WAL mode |
| psql | Optional | Remote PostgreSQL with auto-created tables |
Four Scripting Languages
Write application logic inside the TLS universe using any of these languages. All share the same path system, the same events, the same resources:
| Language | Engine | How it works |
|---|---|---|
| Lua 5.4 | Embedded interpreter | In-process, zero-copy. portal.get(), portal.call(), portal.route() |
| Python 3 | Forked subprocess | JSON pipe bridge. import portal, same API |
| C | gcc compile + dlopen | Native speed. Uses portal.h directly |
| Pascal | fpc compile + dlopen | Free Pascal. Exports app_load, app_handle |
Key Concepts
Everything Is a Path
Every resource has a universal address. A serial port, a database row, a remote IoT device, a user account, a config value — they all live in the same namespace:
```
/core/status                 # the core itself
/iot/resources/devices       # IoT devices on this node
/serial/com1/read            # physical RS232 port
/cache/keys                  # in-memory cache
/remote-dc1/db/query         # database on a remote node
/warehouse/serial/com1/read  # serial port on a remote machine
```
Everything Is a Message
One single structure carries all communication. A request to read a serial port looks identical to a request to query a remote database or toggle an IoT plug:
```
portal_msg_t {
    id       // unique message id
    path     // destination: "/module/resource"
    method   // GET, SET, CALL, EVENT, SUB, UNSUB, META
    headers  // key-value metadata
    body     // payload (any format)
    ctx {    // travels with every message
        auth   // user + labels
        trace  // trace_id, timestamp, hops
    }
}
```
Modules Compose Through Messages
Modules never call each other directly. They send messages through the core, and complete systems emerge from composition.
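One hypothetical chain (module names, paths, and events below are invented for illustration): a serial reading crosses a threshold, a rule fires, a plug toggles, and the state change gets logged.

```
serial  → EVENT /events/serial/com1/data      raw sensor line arrives
rules   → CALL  /iot/resources/plug1/toggle   subscribed rule fires on threshold
iot     → EVENT /events/iot/plug1/state       plug reports its new state
logger  → SET   /db/log                       state change is persisted
```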
No module orchestrates such a chain. Each one does its one thing; the events connect them.