Scriptum · Orchestration

LLM workflows that actually compile.

Scriptum is a purpose-built DSL and runtime for distributed, LLM-driven workloads. Branching, parallelism, streaming, iteration, ReAct loops, and human approval are all first-class keywords — and every script is type-checked before a single tool runs.

15 Primitives · 29 gRPC RPCs · 100% Type-checked · Resumable
YAML / JSON orchestrators

Stringly-typed configs. Errors surface at 3 AM in production.

Scriptum

A real language that engineers can read, LLMs can generate, and compilers can verify — before you spend a single API call.

Three things Scriptum does differently.

Built around a purpose-built language, a compiler that catches real errors, and a runtime that treats crashes and human approval as ordinary execution states.

Compile · Live

Full type checking before execution

The compiler loads every tool manifest, validates binding uniqueness and field access against real schemas, and rejects decide branches that bind different names. If it compiles, it runs.
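As a quick sketch of what that catches (both tool names come from the public registry below, and the `url` field and binding syntax follow the fan-out example later on this page), a script that reuses a binding name fails to compile:

do "Extract text" with ingest_extract_pdf_text
  url = report_url
-> pages

do "Extract images" with ingest_extract_pdf_images
  url = report_url
-> pages

The second `-> pages` breaks binding uniqueness, so `scriptum compile` rejects the script before a single API call is made; rename either binding and it passes. (The exact error text is not specified here.)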

DSL · Live

15 composable primitives

do · each · decide · check · together · plan · run · wait · ask · emit · yield · send · pipe · agent · sections. Every one maps 1:1 to a visual block. Streams, LLM reasoning, and human approval are keywords, not libraries.

Threads · Live

Resumable. Persistent. Inspectable.

Every step persists its input and output to thread state. Crashes, timeouts, and 48-hour human reviews are normal — inspect the binding state, fix the issue, and resume from the exact step that paused.

The lifecycle

Write. Compile. Execute. Resume.

Step 01

Write

Author a .scriptum file. Top-to-bottom, indentation-structured. Every step starts with a verb. Inputs, outputs, and tool contracts declared up-front.

Step 02

Compile

`scriptum compile` lexes, parses, loads tool manifests, type-checks field access, validates decide/check branches, and emits a deterministic YAML IR.

Step 03

Execute

The same IR runs locally (native tools, filesystem state) or on Dodil Cloud (Ignite tools, S3 state). Steps stream results via `->>` and `pipe`; parallel lanes fan out through `together`.

Step 04

Resume

Threads pause on errors, timeouts, or `ask` steps — no silent crashes. Inspect binding state, fix the failing step, and `scriptum thread resume` picks up exactly where it stopped.
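In CLI terms the loop is roughly this sketch; the file name is illustrative, `<thread-id>` is a placeholder, and only the two subcommands named above are assumed, not their exact flags:

$ scriptum compile deploy.scriptum
$ scriptum thread resume <thread-id>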

Reads like a recipe. Runs like a compiler.

Every sample below is real .scriptum from the dodil-scriptum examples/ directory. The last tab is what the compiler actually prints when your script has two typos.

script "Ask Basic"
version 0.1.0

input
  deploy_version : text = "1.2.3"

output
  approval : text

ask "Approve deployment"
  prompt = "Deploy {deploy_version} to production?"
  options = ["approve", "reject", "defer"]
  timeout = "24h"
  max_retries = 3
-> approval

emit "Done"
  approval = approval.value
Powered by Ignite

Call one function. Or ten thousand.

Every Scriptum step is an Ignite function call. You start with a public registry of 23 pre-built tools — PDF extraction, semantic chunking, multimodal embeddings, PII redaction, small-LLM inference — and you can publish your own Python, Rust, Go, or Deno functions alongside them. Fan out with together, each, or pipe, and Ignite's executor pool scales the underlying concurrency for you — up to 1,000 in-flight invocations per function, warm-started in ~200ms.

fan_out_embed.scriptum
do "Extract PDF text" with ingest_extract_pdf_text
  url = report_url
-> pdf

do "Chunk text" with ingest_chunk_text
  text          = pdf.text
  chunk_size    = 512
  chunk_overlap = 50
->> chunks

pipe chunk from chunks parallel 64
  do "Embed" with ingest_embed
    inputs = [{ type = "text",
                content = chunk.text }]
    task   = "index"
->> embeddings

`parallel 64` tells Scriptum to hand 64 chunks to Ignite at once. Ignite's executor pool takes it from there.
Ignite executor pool
~200ms warm start · 1,000 concurrent per function · KEDA auto-scale
One `together` or `pipe ... parallel N` expands into N simultaneous function invocations. The pool keeps them warm; the scheduler fans them across clusters.
Public registry
Free to call

23 functions. Zero setup.

Drop any name below into a `do "..." with <name>` and the compiler validates the schema against the real tool manifest.

Ingest · extract
ingest_extract_pdf_text · ingest_extract_pdf_images · ingest_extract_docx_text · ingest_extract_html_text · transcribe_audio_to_text
Ingest · chunk
ingest_chunk_text · ingest_chunk_file · ingest_chunk_image · ingest_chunk_audio · ingest_chunk_video
Ingest · index
ingest_embed · ingest_rerank
Ingest · transform
ingest_normalize_text · ingest_redact_pii · ingest_resize_image
Data source
data_source_read · data_source_transfer
Analysis
analysis_small_llm
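For instance, redacting PII is a single `do` step. In this sketch the input field `text` and the output field `redacted.text` are assumptions; the compiler checks them against the tool's real manifest and flags any mismatch at compile time:

do "Redact PII" with ingest_redact_pii
  text = draft_text
-> redacted

emit "Clean copy"
  clean = redacted.text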
Bring your own
Your code

Python · Rust · Go · Deno

Write a function in any supported runtime, publish it to Ignite, and Scriptum discovers it via gRPC the next time you compile. Same catalog, same type checking, same fan-out behaviour.

$ ignite function create my_score --runtime python   # scaffold
$ ignite draft save org:my_score --code handler.py   # upload
$ ignite draft publish org:my_score                  # ship
Now any .scriptum file in your org can say do "..." with my_score — and the compiler validates the new tool's schema immediately, right alongside the 23 public ones.
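A sketch of what that call might look like; the `input` field and the `result.score` access are hypothetical, defined by whatever schema your function's manifest declares:

do "Score the draft" with my_score
  input = draft_text
-> result

emit "Scored"
  score = result.score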
Try it

Three demos. One language.

Browse the Primitives tab to see every keyword with a live syntax snippet. Flip to Compile and toggle the typo to watch the compiler refuse to ship a broken script. Then hit Execute to watch a thread pause at a human-in-the-loop step and resume itself.

action
do
Call a tool with typed inputs. The unit of work.
do "Search the web" with web_search
  query       = topic
  max_results = 20
-> raw_results
Demo runs entirely in your browser — the keywords, examples, and compile errors are lifted from the real dodil-scriptum docs.


Write the script. Compile the future.

Join Early Access for $500 in credits, priority onboarding, and direct access to the language team. Your first .scriptum file is fifteen keywords away.

DODIL · From data to intelligence.
© 2026 Circle Technologies Pte Ltd. All rights reserved. Built for the AI era.