MOTHER, Metaprogramming, and the Meta Language
I've written about MOTHER's philosophy and its mechanics. This post is about what I realized after I had some time to step back and reflect on the system once it was in place.
MOTHER is a metaprogramming system. Literally. It synthesizes programs and executes them immediately within the same runtime.
The LLM receives a natural language objective and produces an execution graph composed of typed steps expressed as F# discriminated unions. These unions form a small domain-specific language (DSL) describing the program. The runtime interprets that DSL directly, so what actually runs is a deterministic F# program. The model never participates in execution; the runtime never interprets intent. Natural language enters the system and a typed program emerges.
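The shape of such a DSL can be sketched in a few lines of F#. To be clear, the step names and interpreter below are illustrative placeholders, not MOTHER's actual types; they only show the pattern of a discriminated-union plan walked by a small interpreter.

```fsharp
// Illustrative only: an execution plan as a discriminated union.
type Step =
    | Fetch of url: string
    | Transform of name: string
    | Store of destination: string
    | Sequence of Step list

// The "runtime" is a recursive interpreter over the typed plan.
// Pattern matching is exhaustive: add a case to Step and the
// compiler flags every interpreter that fails to handle it.
let rec execute step =
    match step with
    | Fetch url -> printfn "fetching %s" url
    | Transform name -> printfn "applying transform %s" name
    | Store dest -> printfn "storing result in %s" dest
    | Sequence steps -> steps |> List.iter execute

// A plan of the kind an LLM might emit, executed deterministically.
let plan =
    Sequence [ Fetch "https://example.com/orders"
               Transform "normalize"
               Store "warehouse" ]

execute plan
```

The key property is that the model only ever constructs values of `Step`; it never touches `execute`, which is fixed, deterministic code.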
The realization that I had built such a system (while solving a practical production problem rather than pursuing an academic exercise) sent me down a historical thread I had not anticipated.
The Meta Language
F# descends from ML. The name does not refer to machine learning; it means Meta Language.
Robin Milner created ML at the University of Edinburgh in 1973 as the meta-language for the LCF theorem prover. The name was literal. ML existed so programs could reason about other programs.
The language design reflects that purpose. The type system, pattern matching, and algebraic data types all support formal reasoning about program structure with strong correctness guarantees. ML later gave rise to Standard ML and to the Caml family; Caml produced OCaml, and OCaml in turn inspired F#.
Today F# serves as the execution layer for a system where natural language generates typed programs that run deterministically. Milner designed ML so software could reason about programs. MOTHER uses ML's descendant so natural language can generate programs expressed in a typed DSL and executed by the runtime. The same conceptual thread runs through both systems, separated by fifty years.
The Deeper Connection
The idea reaches even further back. John McCarthy created LISP in 1958 for AI research because he needed a language where code could be treated as data. Programs could manipulate other programs as ordinary data structures. Lisp macros are the canonical example of metaprogramming: code that produces code, expressed in the same structures the language executes.
MOTHER operates on a similar principle. The LLM generates execution plans expressed as F# discriminated unions. The resulting graph is both data and program. It exists as an immutable structure that can be stored, traced, and audited; it is also something the runtime executes directly as a program. This follows the classic code-as-data principle, applied to a modern orchestration problem where the synthesizer happens to be a large language model.
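The dual nature of the graph can be made concrete. In the hypothetical sketch below (again using placeholder types, not MOTHER's real ones), the same immutable value is rendered as data for an audit trail and then walked as a program:

```fsharp
// Illustrative only: one typed value, treated as both data and program.
type Step =
    | Fetch of string
    | Store of string

let plan = [ Fetch "https://example.com/orders"; Store "warehouse" ]

// As data: the immutable structure can be rendered, stored, and audited.
let auditRecord = sprintf "%A" plan

// As program: the runtime walks the same structure and executes it.
let run step =
    match step with
    | Fetch url -> printfn "fetch %s" url
    | Store dest -> printfn "store %s" dest

plan |> List.iter run
```

Nothing is duplicated: the audit log and the execution both read the one value, which is the code-as-data principle in miniature.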
The early vision for AI leaned heavily toward declarative reasoning. Systems were meant to express goals while the machine determined the procedure. Languages such as LISP, ML, and Prolog all grew from this idea.
Over time the ecosystem shifted toward more imperative patterns. Many modern agent frameworks follow loops where the model is called repeatedly, the result is inspected, and the next step is decided dynamically. The approach works and often produces useful systems, but it moves away from the original design ambition of declarative reasoning.
MOTHER returns to that earlier direction. The model synthesizes a program. The typed runtime executes that program.
Why This Matters to Me
The design decisions that produced MOTHER were practical ones: F# for exhaustive pattern matching, serverless infrastructure for reactive alignment, and a strict separation between planning and execution for reliability. None of these choices were motivated by language history.
Only later did the deeper alignment become visible. The F# type system, refined across decades of language design rooted in AI research, already contained the grammar needed for LLM-generated execution plans. Discriminated unions were not a workaround. They were the native idiom for describing the program structure the system required.
There is something meaningful in that convergence. MOTHER is not historically significant in the broader arc of computer science; claiming that would be absurd. Yet it is notable that the architecture arrived naturally when guided by a language lineage built for reasoning about programs.
Natural language programming has been an aspiration since the early days of computing. The distance between human intention and machine execution remained substantial for decades.
Large language models change that landscape. They can translate natural language objectives into structured programs that deterministic systems can execute.
MOTHER is a working example of this shift. A user describes an objective in ordinary language; the system synthesizes a typed execution plan and immediately executes that plan across isolated serverless reactors, producing an auditable result.
It still strikes me that practical engineering decisions can lead toward the same conceptual thread explored by Milner and McCarthy.