Beamtalk Architecture

How the compiler, tooling, and runtime fit together to deliver live programming.

For persistent workspaces (actors survive REPL disconnect, multiple sessions), see ADR 0004: Persistent Workspace Management.


Overview

The Beamtalk compiler is written in Rust and runs as a daemon on the developer's machine. It compiles .bt source files to BEAM bytecode, which is then hot-loaded into a running BEAM node. The Rust compiler is not part of the runtime — it's build infrastructure.

┌─────────────────────────────────────────────────────────────┐
│                    Developer Machine                         │
│                                                              │
│  ┌──────────────┐      ┌──────────────┐      ┌────────────┐ │
│  │ Editor/REPL  │ ──── │ Rust Compiler│ ──── │ .beam files│ │
│  │ (VS Code)    │ IPC  │ (beamtalk)   │      │ (bytecode) │ │
│  └──────────────┘      └──────────────┘      └─────┬──────┘ │
│                                                     │        │
└─────────────────────────────────────────────────────┼────────┘
                                                      │ hot load
                                                      ▼
                              ┌────────────────────────────────┐
                              │       Running BEAM Node        │
                              │                                │
                              │  ┌────────┐  ┌────────┐       │
                              │  │Counter │  │ Agent  │  ...  │
                              │  └────────┘  └────────┘       │
                              └────────────────────────────────┘

Why Rust for the Compiler?

BEAM is optimized for concurrency and fault tolerance, not compiler workloads. Compilers need fast, CPU-bound work: lexing, parsing, type checking, and graph traversal over large in-memory data structures. Rust typically delivers 10-100x better performance for these tasks than BEAM languages.

Additional Benefits

| Benefit | Explanation |
| --- | --- |
| No bootstrap problem | Don't need Beamtalk to build Beamtalk |
| Single binary distribution | No Erlang/OTP dependency for the compiler |
| Cross-compilation | Build for any platform from any platform |
| Memory safety | Compiler bugs don't crash production systems |

What About Self-Hosting?

Self-hosted compilers (a compiler written in its own language) are elegant but costly: you must bootstrap through another implementation, maintain two compilers during the transition, and debug the compiler with the language's own still-immature tooling.

For Beamtalk, self-hosting would delay shipping by a year or more with no user-facing benefit. The liveness advantage shows up in running Beamtalk code, not in compiling it.


Component Responsibilities

| Component | Runs Where | Written In | Purpose |
| --- | --- | --- | --- |
| Compiler | Dev machine (daemon) | Rust | Parse, type-check, generate Core Erlang |
| LSP Server | Dev machine | Rust | IDE features (completions, errors, hover) |
| REPL CLI | Dev machine | Rust | Thin shell, sends input to BEAM node |
| REPL Backend | BEAM node | Erlang | Receives code, coordinates with compiler, evaluates |
| Runtime | BEAM node | Erlang | Supervision, distribution, standard library |
| Your Actors | BEAM node | Compiled Beamtalk | Your application code |

Compilation Pipeline

  .bt source
      │
      ▼
┌─────────────┐
│   Lexer     │  Tokens with source spans
└─────────────┘
      │
      ▼
┌─────────────┐
│   Parser    │  AST with error recovery
└─────────────┘
      │
      ▼
┌─────────────┐
│  Analyzer   │  Type checking, name resolution
└─────────────┘
      │
      ▼
┌─────────────┐
│  Codegen    │  Core Erlang output
└─────────────┘
      │
      ▼
┌─────────────┐
│   erlc      │  BEAM bytecode (.beam)
└─────────────┘
      │
      ▼
  Running BEAM node (hot load)

Incremental Compilation

The compiler daemon maintains state between compilations: a parsed-file cache, the module dependency graph, Salsa-style incremental query results, and connections to running BEAM nodes (see Compiler State below). Only work invalidated by a change is redone.

Target: <50ms for a single-file change to loaded code.


Live Development Flow

1. Editing in VS Code

┌─────────────────────────────────────────────────────────────┐
│  VS Code                                                    │
│  ┌─────────────────────────────────────────────────────┐   │
│  │  counter.bt                                          │   │
│  │  ─────────────────────────────────────────────────   │   │
│  │  Actor subclass: Counter                             │   │
│  │    state: value = 0                                  │   │
│  │                                                      │   │
│  │    increment => self.value := self.value + 1        │   │  ← you edit here
│  │    getValue => ^self.value                           │   │
│  └─────────────────────────────────────────────────────┘   │
│                           │                                 │
│                           │ LSP (JSON-RPC over stdio)       │
│                           ▼                                 │
│  ┌─────────────────────────────────────────────────────┐   │
│  │  Beamtalk LSP Server (Rust)                         │   │
│  │  - Receives textDocument/didChange                   │   │
│  │  - Incremental recompile                             │   │
│  │  - Returns diagnostics                               │   │
│  │  - Provides completions, hover, go-to-def            │   │
│  └─────────────────────────────────────────────────────┘   │
└─────────────────────────────────────────────────────────────┘

2. Hot Reload on Save

  Ctrl+S in VS Code
        │
        ▼
  LSP Server compiles counter.bt
        │
        ▼
  Produces bt@counter.beam
        │
        ▼
  Sends to BEAM node via:           ┌────────────────────────┐
  - TCP connection, or              │    Running BEAM Node   │
  - Unix socket, or         ────────►                        │
  - File watch + signal             │  code:load_file(...)   │
        │                           │         │              │
        │                           │         ▼              │
        │                           │  ┌─────────────────┐   │
        │                           │  │ Counter actors  │   │
        │                           │  │ now use new     │   │
        │                           │  │ increment code  │   │
        │                           │  └─────────────────┘   │
        │                           └────────────────────────┘
        │
  Total time: <100ms

3. REPL Interaction

┌──────────────────────────────┐          ┌─────────────────────────────┐
│  REPL CLI                    │   TCP    │     Running BEAM Node       │
│  (Rust)                      │ ◄──────► │                             │
│                              │          │  ┌─────────────────────┐    │
│  > counter := Counter spawn  │          │  │ REPL Server Process │    │
│  > counter increment         │          │  │ (Erlang)            │    │
│  > counter getValue await    │          │  │                     │    │
│  => 1                        │          │  │ 1. Receive input    │    │
│                              │          │  │ 2. Call compiler    │    │
│                              │          │  │ 3. Load bytecode    │    │
│                              │          │  │ 4. Evaluate         │    │
│                              │          │  │ 5. Return result    │    │
│                              │          │  └─────────────────────┘    │
└──────────────────────────────┘          └─────────────────────────────┘

The REPL compiles each expression on demand:

  1. Input: counter increment
  2. REPL server sends to compiler daemon
  3. Compiler returns bytecode for the expression
  4. REPL server loads and evaluates
  5. Result sent back to CLI for display

Compiler Daemon

The compiler runs as a long-lived daemon process for performance:

# Started automatically by VS Code extension or CLI
beamtalk daemon start

# Or run in foreground for debugging
beamtalk daemon --foreground

IPC Protocol

Communication via Unix socket (or TCP on Windows):

~/.beamtalk/daemon.sock

Protocol: JSON-RPC 2.0 (same as LSP)

// Request: Compile file
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "compile",
  "params": {
    "path": "/project/src/counter.bt"
  }
}

// Response
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "beam_path": "/project/_build/counter.beam",
    "diagnostics": []
  }
}

Compiler State

The daemon maintains:

struct CompilerState {
    // Parsed files (invalidated on change)
    file_cache: HashMap<PathBuf, ParsedFile>,

    // Module dependency graph
    deps: DependencyGraph,

    // Salsa-style incremental queries
    db: CompilerDatabase,

    // Connected BEAM nodes for hot reload
    nodes: Vec<NodeConnection>,
}

BEAM Node Integration

Loader Process

A small Erlang process runs in the BEAM node to receive hot reloads:

-module(beamtalk_loader).
-behaviour(gen_server).

%% Receives compiled .beam files from the compiler daemon
handle_cast({load_module, Module, Binary}, State) ->
    {module, Module} = code:load_binary(Module, "", Binary),
    {noreply, State}.

%% Receives expressions to evaluate (REPL)
handle_call({eval, Binary}, _From, State) ->
    {module, _} = code:load_binary('beamtalk_repl_temp', "", Binary),
    Result = 'beamtalk_repl_temp':eval(),
    {reply, Result, State}.

Connection to Compiler

Options for compiler-to-node communication:

| Method | Pros | Cons |
| --- | --- | --- |
| TCP socket | Works across machines | Need to manage port |
| Unix socket | Fast, local | Unix-only |
| Distributed Erlang | Native, supports remote | Requires cookie setup |
| File watch | Simple | Slower, polling |

Recommended: TCP socket for simplicity, with Distributed Erlang for remote nodes.


Directory Structure

my_project/
├── beamtalk.toml           # Project config
├── src/
│   ├── my_app.bt           # Main module
│   └── actors/
│       ├── counter.bt
│       └── agent.bt
├── test/
│   └── counter_test.bt
├── _build/                  # Compiler output
│   ├── dev/
│   │   ├── counter.beam
│   │   └── agent.beam
│   └── test/
└── deps/                    # Hex dependencies

Performance Targets

| Operation | Target | Notes |
| --- | --- | --- |
| Keystroke to diagnostics | <50ms | LSP responsiveness |
| Save to hot reload | <100ms | Edit-run cycle |
| Cold compile (100 files) | <5s | Initial build |
| Incremental compile (1 file) | <50ms | Typical edit |
| REPL expression | <100ms | Interactive feel |

Actor Runtime Model

Every Beamtalk actor is a BEAM process running a gen_server. This section describes the runtime representation.

Process State Structure

Each actor maintains state in a map:

%% Runtime state for a Counter actor
#{
  '__class__' => 'Counter',
  '__methods__' => #{
    increment => fun handle_increment/2,
    decrement => fun handle_decrement/2,
    getValue => fun handle_getValue/2,
    'incrementBy:' => fun 'handle_incrementBy:'/2
  },

  %% User-defined state fields
  value => 0
}

gen_server Callbacks

Generated actors implement gen_server:

-module(beamtalk_counter).
-behaviour(gen_server).

%% Start with initial state
init(Args) ->
    InitialState = #{
        '__class__' => 'Counter',
        '__methods__' => method_table(),
        value => proplists:get_value(initial, Args, 0)
    },
    {ok, InitialState}.

%% Async messages (cast) - returns future
handle_cast({Selector, Args, FuturePid}, State) ->
    case dispatch(Selector, Args, State) of
        {reply, Result, NewState} ->
            FuturePid ! {resolve, Result},
            {noreply, NewState};
        {noreply, NewState} ->
            {noreply, NewState}
    end.

%% Sync messages (call) - blocks caller
handle_call({Selector, Args}, _From, State) ->
    case dispatch(Selector, Args, State) of
        {reply, Result, NewState} ->
            {reply, Result, NewState}
    end.

Message Dispatch

Message sends compile to gen_server:cast (async) or gen_server:call (sync):

// Beamtalk
counter increment

%% Compiles to (async, returns future)
FuturePid = beamtalk_future:new(),
gen_server:cast(CounterPid, {increment, [], FuturePid}),
FuturePid

The dispatch function looks up the method:

dispatch(Selector, Args, State) ->
    Methods = maps:get('__methods__', State),
    case maps:find(Selector, Methods) of
        {ok, Fun} ->
            Fun(Args, State);
        error ->
            %% doesNotUnderstand: handler
            handle_dnu(Selector, Args, State)
    end.

doesNotUnderstand: Metaprogramming

Unknown messages trigger doesNotUnderstand: if defined:

Actor subclass: Proxy
  state: target = nil

  doesNotUnderstand: selector args: args =>
    // Forward to target
    self.target perform: selector withArgs: args

The runtime fallback in Erlang:

handle_dnu(Selector, Args, State) ->
    case maps:find('doesNotUnderstand:args:', maps:get('__methods__', State)) of
        {ok, Fun} ->
            Fun([Selector, Args], State);
        error ->
            %% No handler - crash (let supervisor handle)
            error({unknown_message, Selector})
    end.

Code Generation Details

Beamtalk compiles to Core Erlang, which erlc then compiles to BEAM bytecode. This section shows concrete examples.

Simple Actor

Beamtalk source:

Actor subclass: Counter
  state: value = 0

  increment => self.value := self.value + 1
  getValue => ^self.value

Generated Core Erlang:

module 'beamtalk_counter' ['start_link'/1, 'init'/1,
                           'handle_cast'/2, 'handle_call'/3]
  attributes ['behaviour'='gen_server']

'start_link'/1 = fun (Args) ->
    call 'gen_server':'start_link'('beamtalk_counter', Args, [])

'init'/1 = fun (Args) ->
    let State = #{
        '__class__' => 'Counter',
        '__methods__' => #{
            'increment' => fun 'handle_increment'/2,
            'getValue' => fun 'handle_getValue'/2
        },
        'value' => 0
    }
    in {'ok', State}

'handle_increment'/2 = fun (Args, State) ->
    let Value = call 'maps':'get'('value', State)
    in let NewValue = call 'erlang':'+'(Value, 1)
    in let NewState = call 'maps':'put'('value', NewValue, State)
    in {'noreply', NewState}

'handle_getValue'/2 = fun (Args, State) ->
    let Value = call 'maps':'get'('value', State)
    in {'reply', Value, State}

Block Compilation

Blocks compile to Erlang funs:

Beamtalk:

doubled := #(1, 2, 3) collect: [:x | x * 2]

Core Erlang:

let Fun = fun (X) -> call 'erlang':'*'(X, 2)
in let Doubled = call 'lists':'map'(Fun, [1, 2, 3])

Keyword Message Compilation

Keyword messages flatten to function calls:

Beamtalk:

#{#x => 1} at: #x put: "hello"

Core Erlang:

%% Selector becomes 'at:put:'
call 'dispatch'('at:put:', [#x, <<"hello">>], DictPid)

Binary Operations with Math Precedence

Beamtalk:

result := 2 + 3 * 4  // => 14 (standard precedence)

Core Erlang:

%% Parser handles precedence, generates correct tree
let Temp = call 'erlang':'*'(3, 4)
in let Result = call 'erlang':'+'(2, Temp)

State Migration During Hot Reload

The "live programming" promise requires preserving actor state across code changes. This is one of the trickier parts of the system.

BEAM's Code Upgrade Mechanism

BEAM supports two versions of a module simultaneously: the current version, which new fully-qualified calls enter, and the old version, which already-running processes may still be executing.

When hot-loading:

  1. New code becomes "current"
  2. Old code becomes "old"
  3. Processes running old code continue until they make a fully-qualified call
  4. code_change/3 callback allows state transformation (delegated to beamtalk_hot_reload)

Generated code_change Callback

%% Called when module is hot-reloaded
%% Delegates to beamtalk_hot_reload domain service
code_change(OldVsn, State, Extra) ->
    beamtalk_hot_reload:code_change(OldVsn, State, Extra).

Current behavior: The beamtalk_hot_reload domain service preserves state unchanged. Future enhancements will support automatic field migration.

Explicit State Migration in Beamtalk

State migration during hot reload is handled by the beamtalk_hot_reload module. When the compiler/loader supplies new instance variable info via code_change/3, the module can add fields with defaults and remove obsolete fields. Without this info, state is preserved unchanged.

Automatic Field Migration

When the compiler detects state schema changes:

| Change | Automatic Behavior |
| --- | --- |
| New field with default | Add field with default value |
| New field without default | Compilation error (must specify) |
| Removed field | Keep in state (warn) unless explicit removal |
| Type change | Compilation error (must specify migration) |

Triggering Hot Reload

%% In beamtalk_loader
handle_cast({hot_reload, Module, Binary}, State) ->
    %% Load new code
    code:load_binary(Module, "", Binary),

    %% Trigger code_change in all running instances
    %% (BEAM does this automatically when process makes qualified call)

    %% Optionally: force immediate migration. The process must be
    %% suspended around sys:change_code/4.
    [begin
         sys:suspend(Pid),
         sys:change_code(Pid, Module, undefined, []),
         sys:resume(Pid)
     end || Pid <- find_actors_of_class(Module)],

    {noreply, State}.

Limitations and Safety

What works:

- Changing method bodies (the common live-coding case)
- Adding new methods, and new state fields that declare defaults
- Reloading while actors keep their pids, mailboxes, and state

What's risky:

- Changing the meaning or type of an existing field (requires explicit migration)
- Processes that never make a fully-qualified call and so never pick up new code
- Loading a third version: BEAM purges the old one and kills processes still running it

Safety mechanism: The compiler warns about potentially unsafe migrations and requires explicit opt-in.


Future/Promise Implementation

Beamtalk is async-first: message sends return futures by default. This section describes the implementation.

Design Choice: Lightweight Processes

Each future is a lightweight BEAM process. Why?

| Alternative | Pros | Cons |
| --- | --- | --- |
| Process per future | Simple, isolated, GC'd naturally | Memory overhead (~2KB/process) |
| Ref + registry | Less memory | Complex tracking, no isolation |
| ETS-based | Fast lookup | Manual cleanup, no mailbox |

BEAM processes are cheap enough that process-per-future is the right default. Optimization can come later if needed.

Future Process Implementation

-module(beamtalk_future).

%% Spawn a new future
new() ->
    spawn(fun() -> pending([]) end).

%% Future states. Waiters may be pids (await) or {callback, Kind, Fun}.
pending(Waiters) ->
    receive
        {resolve, Value} ->
            %% Notify all waiters
            lists:foreach(
                fun({callback, resolved, Cb}) -> Cb(Value);
                   ({callback, rejected, _}) -> ok;
                   (Pid) -> Pid ! {future_resolved, self(), Value}
                end, Waiters),
            resolved(Value);
        {reject, Reason} ->
            lists:foreach(
                fun({callback, rejected, Cb}) -> Cb(Reason);
                   ({callback, resolved, _}) -> ok;
                   (Pid) -> Pid ! {future_rejected, self(), Reason}
                end, Waiters),
            rejected(Reason);
        {await, Pid} ->
            %% Add to waiters list
            pending([Pid | Waiters]);
        {add_callback, resolved, Callback} ->
            pending([{callback, resolved, Callback} | Waiters]);
        {add_callback, rejected, Callback} ->
            pending([{callback, rejected, Callback} | Waiters])
    end.

resolved(Value) ->
    receive
        {await, Pid} ->
            Pid ! {future_resolved, self(), Value},
            resolved(Value);
        {add_callback, resolved, Callback} ->
            Callback(Value),
            resolved(Value);
        {add_callback, rejected, _} ->
            resolved(Value)  % Ignore reject callback
    after 300000 ->
        ok  % Exit after 5 minutes idle; processes are not GC'd by reference
    end.

rejected(Reason) ->
    receive
        {await, Pid} ->
            Pid ! {future_rejected, self(), Reason},
            rejected(Reason);
        {add_callback, rejected, Callback} ->
            Callback(Reason),
            rejected(Reason);
        {add_callback, resolved, _} ->
            rejected(Reason)  % Ignore resolve callback
    after 300000 ->
        ok  % Exit after 5 minutes idle
    end.

Async Send Compilation

Beamtalk:

result := agent analyze: data

Compiles to:

%% Create future
FuturePid = beamtalk_future:new(),

%% Send async message with future reference
gen_server:cast(AgentPid, {'analyze:', [Data], FuturePid}),

%% Bind future to variable
Result = FuturePid

await Implementation

Beamtalk:

value := result await

Compiles to:

%% Register as waiter
FuturePid ! {await, self()},

%% Block until resolved
Value = receive
    {future_resolved, FuturePid, V} -> V;
    {future_rejected, FuturePid, Reason} -> error(Reason)
after 30000 ->
    error(future_timeout)
end

Future Cleanup

A BEAM process is not garbage collected merely because no references to its pid remain; it must exit. A future process can therefore terminate once:

  1. Its resolved/rejected value has been delivered to all waiters
  2. It has been idle long enough that no further await is expected

On exit, BEAM reclaims the process heap as usual, so the value itself needs no manual cleanup.


Architecture: Next Steps

The following areas need detailed specification in future iterations:

Supervision Tree Generation

How declarative supervision compiles to OTP supervisor specs.

Error-Recovering Parser

Tooling-first parsing architecture.

Distribution Model

Actors across BEAM nodes.

Type System Architecture

If/when optional types are implemented.
