Recording APNGs

Every APNG in this tutorial set was produced the same way: a tiny subject app hosted under daslang-live, then a driver script running in a second shell that toggled visual aids, called record_start, narrated each interaction, performed the action, and called record_stop. This is the meta tutorial — the driver script is the artifact, and the embedded recording demonstrates itself.

Source files:

  • examples/tutorial/recording.das — the subject: a one-slider one-button window. Tiny on purpose so every frame in the APNG corresponds to one specific driver call.

  • tests/integration/record_recording.das — the driver. Walked through below.

The subject

options gen2

require imgui
require imgui_app
require glfw/glfw_boost
require opengl/opengl_boost
require live/glfw_live
require live/live_api
require live/live_commands
require live/live_vars
require live/opengl_live
require live_host
require imgui/imgui_live
require imgui/imgui_boost_runtime
require imgui/imgui_boost_v2
require imgui/imgui_widgets_builtin
require imgui/imgui_containers_builtin
require imgui/imgui_visual_aids

// =============================================================================
// TUTORIAL: recording — the meta tutorial.
//
// Every previous tutorial's APNG was produced by the same workflow:
//
//   shell 1: daslang-live  hosts a SUBJECT app          (this .das)
//   shell 2: daslang       runs a DRIVER script that:
//              - turns on imgui_mouse_trail + imgui_cursor_sprite
//              - calls record_start
//              - posts imgui_narrate before each interaction
//              - moves the cursor / clicks / drags / sets values
//              - calls record_stop
//
// The DRIVER is the artifact that teaches recording — the subject is
// just whatever app you want to record. This file is intentionally tiny
// (one slider, one button) so the recording recipe is obvious: every
// frame of the APNG corresponds to one specific driver call.
//
// The driver script for this tutorial is:
//   tests/integration/record_recording.das
//
// Read the driver alongside the walkthrough in the RST companion — the
// .das + driver + APNG triple is the tutorial.
//
// STANDALONE: daslang.exe modules/dasImgui/examples/tutorial/recording.das
// LIVE:       daslang-live modules/dasImgui/examples/tutorial/recording.das
//
// DRIVE: see tests/integration/record_recording.das
// =============================================================================

[export]
def init() {
    live_create_window("dasImgui recording tutorial", 600, 380)
    live_imgui_init(live_window)
    var io & = unsafe(GetIO())
    io.FontGlobalScale = 1.5
}

[export]
def update() {
    if (!live_begin_frame()) return
    begin_frame()

    ImGui_ImplOpenGL3_NewFrame()
    ImGui_ImplGlfw_NewFrame()
    apply_synth_io_override()
    NewFrame()

    SetNextWindowPos(ImVec2(30.0f, 30.0f), ImGuiCond.FirstUseEver)
    SetNextWindowSize(ImVec2(520.0f, 280.0f), ImGuiCond.FirstUseEver)
    window(REC_WIN, (text = "subject", closable = false,
                     flags = ImGuiWindowFlags.None)) {
        text("Tiny subject — driver narrates what's happening.")
        slider_float(VOLUME, (text = "Volume"))
        if (button(SAVE_BTN, (text = "Save"))) {}
        text("SAVE_BTN.click_count = {SAVE_BTN.click_count}")
    }

    end_of_frame()
    Render()
    var w, h : int
    live_get_framebuffer_size(w, h)
    glViewport(0, 0, w, h)
    glClearColor(0.10f, 0.10f, 0.12f, 1.0f)
    glClear(GL_COLOR_BUFFER_BIT)
    ImGui_ImplOpenGL3_RenderDrawData(GetDrawData())

    live_end_frame()
}

[export]
def shutdown() {
    live_imgui_shutdown()
    live_destroy_window()
}

[export]
def main() {
    init()
    while (!exit_requested()) {
        update()
    }
    shutdown()
}

The subject is a normal dasImgui live-reload app — same shape as every other tutorial’s subject. The recorder doesn’t touch the subject at all; it operates entirely through the live-command HTTP surface.

The driver

options gen2
options indenting = 4
options no_unused_block_arguments = false
options no_unused_function_arguments = false

require imgui public
require imgui/imgui_playwright public
require daslib/json public
require daslib/json_boost public

//! Driver script: record ``recording.apng`` against an
//! **already-running** daslang-live. This driver is also the tutorial
//! topic — the RST walks through every line below.
//!
//! Workflow (two shells):
//!
//!     # window 1 — host with cwd = the asset dir so the APNG lands there:
//!     cd modules/dasImgui/doc/source/_static/tutorials
//!     ../../../../../../bin/Release/daslang-live.exe \
//!         ../../../../examples/tutorial/recording.das
//!
//!     # window 2 — fire the driver, exits when done:
//!     bin/Release/daslang.exe modules/dasImgui/tests/integration/record_recording.das

let BASE_URL = "http://127.0.0.1:9090"
let OUTPUT   = "recording.apng"

def widget_bbox(var snap : JsonValue?; ident : string) : float4 {
    //! Pull (x_min, y_min, x_max, y_max) out of the snapshot for one
    //! widget. The `?[...]` chain returns null on a missing path; the
    //! `?? 0.0f` coalesces missing axes to zero so callers don't have
    //! to null-check each component.
    var b = snap?["globals"]?[ident]?["bbox"]
    return float4(
        b?["x"] ?? 0.0f,
        b?["y"] ?? 0.0f,
        b?["z"] ?? 0.0f,
        b?["w"] ?? 0.0f
    )
}

def widget_center(var snap : JsonValue?; ident : string) : tuple<float; float> {
    //! Cursor-target helper — the driver moves the synth cursor to the
    //! center of the widget's bbox so the trail visibly travels to it.
    let b = widget_bbox(snap, ident)
    return ((b.x + b.z) * 0.5f, (b.y + b.w) * 0.5f)
}

[export]
def main {
    // ---- 1. Connect to the live host ----
    // ImguiApp bundles the HTTP transport + base URL into a value the
    // playwright helpers (post_command, move_to, click_at, ...) all
    // take as their first arg.
    let app = ImguiApp(
        base_url = BASE_URL,
        feature_path = "",
        transport <- live_api_transport(BASE_URL)
    )
    print("[record] connecting to {BASE_URL}\n")
    if (!wait_until_ready(app, 5.0f)) {
        panic("daslang-live not responding on {BASE_URL} — is it running with recording.das?")
    }

    // ---- 2. Resolve widget centers from a snapshot ----
    // wait_for_render polls imgui_snapshot until the named widget
    // appears (covers the cold-start gap where the subject is up but
    // first frame hasn't rendered). 10s timeout is generous.
    let T_VOLUME = "REC_WIN/VOLUME"
    let T_SAVE   = "REC_WIN/SAVE_BTN"
    var snap = wait_for_render(app, T_SAVE, 10.0f)
    if (snap == null) {
        panic("{T_SAVE} never rendered — wrong app running on {BASE_URL}?")
    }
    let p_volume = widget_center(snap, T_VOLUME)
    let p_save   = widget_center(snap, T_SAVE)

    // Volume drag: 15% to 80% along the slider's bbox width.
    let vol_bb = widget_bbox(snap, T_VOLUME)
    let vol_y  = (vol_bb.y + vol_bb.w) * 0.5f
    let vol_x0 = vol_bb.x + (vol_bb.z - vol_bb.x) * 0.15f
    let vol_x1 = vol_bb.x + (vol_bb.z - vol_bb.x) * 0.80f

    // ---- 3. Enable visual aids BEFORE record_start ----
    // mouse_trail draws the fading line behind the synth cursor;
    // cursor_sprite draws a visible pointer at io.MousePos so the
    // recording shows what's clicking what. Enable both first so the
    // very first frame of the APNG already has the cursor visible.
    post_command(app, "imgui_mouse_trail",   JV((enabled = true)))
    post_command(app, "imgui_cursor_sprite", JV((enabled = true)))

    // ---- 4. record_start: opens the APNG writer ----
    // file lands next to daslang-live's cwd (we set that to
    // doc/source/_static/tutorials/ in shell 1). max_seconds caps the
    // recording so a runaway driver doesn't churn disk forever.
    post_command(app, "record_start", JV((file = OUTPUT, fps = 30, max_seconds = 30)))

    //! Pacing rule (LOCKED): `frames` is the app's frame counter (60 fps
    //! under vsync), NOT the recorder's fps. For ~3s readable narrate:
    //!     frames = 180   →  3.0s visible
    //!     sleep  = 3500u →  narrate disappears with ~500ms gap before next
    //! Then ~1500ms result dwell after the action.

    // ---- 5. Narrate then act — repeat ----

    // Stage 1: slider drag
    post_command(app, "imgui_narrate", JV((
        text = "imgui_narrate posts a sticky-note over a target.",
        target = T_VOLUME,
        frames = 200
    )))
    move_to(app, p_volume)
    sleep(3500u)
    var slider_events : array<JsonValue?>
    slider_events |> drag_along(0, (vol_x0, vol_y), (vol_x1, vol_y), 1400)
    post_command(app, "imgui_mouse_play", JV((events = slider_events)))
    sleep(2500u)

    // Stage 2: click the button
    post_command(app, "imgui_narrate", JV((
        text = "imgui_mouse_play feeds a scripted timeline of events.",
        target = T_SAVE,
        frames = 200
    )))
    move_to(app, p_save)
    sleep(3500u)
    var save_events : array<JsonValue?>
    save_events |> click_at(0, p_save)
    post_command(app, "imgui_mouse_play", JV((events = save_events)))
    sleep(1500u)

    // ---- 6. record_stop: flushes the APNG writer, prints stats ----
    // The response carries (frames, duration_s, dropped, saved) — a quick
    // sanity check for the driver author.
    let stopped = post_command(app, "record_stop", null)
    let stop_dump = stopped != null ? write_json(stopped) : "<null>"
    print("[record] record_stop -> {stop_dump}\n")
    print("[record] APNG saved to {OUTPUT} (next to daslang-live's CWD)\n")

    // ---- 7. Disable visual aids ----
    // The host stays alive after the driver exits; clean up so the next
    // driver (or a developer poking at the live host manually) starts
    // from a clean state.
    post_command(app, "imgui_cursor_sprite", JV((enabled = false)))
    post_command(app, "imgui_mouse_trail",   JV((enabled = false)))
}

Anatomy of a driver

  1. Connect to the live host. ImguiApp bundles the base URL + transport into a value every playwright helper takes as its first argument. wait_until_ready polls until the HTTP server answers, so the driver can be launched concurrently with daslang-live and recover from the cold-start gap.

  2. Resolve widget centers. wait_for_render polls imgui_snapshot until the named widget shows up in the registry (covering the first-frame gap where the subject's window exists but hasn't rendered yet). The helpers widget_bbox / widget_center extract the geometry from the snapshot JSON.

  3. Enable visual aids — BEFORE ``record_start``.

    post_command(app, "imgui_mouse_trail",   JV((enabled = true)))
    post_command(app, "imgui_cursor_sprite", JV((enabled = true)))
    

    imgui_mouse_trail draws a fading line behind the synthetic cursor. imgui_cursor_sprite draws a visible pointer at io.MousePos. Without these, the recording shows widget state changing but no cursor — confusing for viewers.

  4. ``record_start``. Opens an APNG writer at file (relative to daslang-live's cwd, which is why every tutorial's shell-1 command cd's into the asset directory first). fps controls the timestamps written into the APNG frame headers — not how often frames are captured. max_seconds caps the recording length.

    The writer pulls from a PBO ring on the GL side (4 buffers by default): glReadPixels returns immediately, the actual readback completes three frames later, and the encoder runs on a worker thread. Frames drop only if the worker can't keep up with the GL output rate, and even then dropping is graceful — the APNG just gets slightly choppier; it never breaks.

  5. Narrate, then act — repeat. The locked pacing rule:

    frames = 180   →  3.0 s of visible narrate
    sleep  = 3500u →  narrate disappears with ~500 ms gap
    sleep  = 1500u →  result-dwell after the action
    

    frames is measured in the APP's frames (60 per second under vsync), NOT the recorder's fps. So frames = 180 is 3 seconds of real time, regardless of whether the recorder is at 30 fps or 60 fps.

  6. ``record_stop``. Flushes the APNG writer, joins the encoder thread, returns (saved, frames, duration_s, dropped, ok) so the driver can spot-check stats.

  7. Disable visual aids. The host stays alive after the driver exits — clean up the cursor sprite + trail so the next driver (or a developer poking at the live host manually) starts from a clean slate.
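The frames-vs-fps arithmetic in step 5 is the part that trips people up, so here it is as plain math. A minimal Python sketch of the rule, assuming the 60 Hz vsync rate the rule states (note the driver itself rounds frames up to 200 for extra margin):

```python
APP_FPS = 60  # the subject's vsync rate; `frames` counts these, NOT recorder fps


def narrate_frames(visible_seconds):
    # How many app frames a narrate must live to stay visible that long.
    return int(visible_seconds * APP_FPS)


def sleep_ms(visible_seconds, gap_seconds=0.5):
    # Driver-side sleep: narrate duration plus a small gap before the next stage.
    return int((visible_seconds + gap_seconds) * 1000)


print(narrate_frames(3.0), sleep_ms(3.0))  # 180 3500 — matches the locked rule
```

The same arithmetic holds at any recorder fps, because both numbers are derived from wall-clock seconds, not from the APNG's frame rate.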

The visual-aids stack

Four [live_command] toggles in imgui_visual_aids.das:

  • imgui_mouse_trail — fading line behind io.MousePos. Args: enabled, fade (seconds), color (RGBA uint).

  • imgui_cursor_sprite — visible pointer drawn at io.MousePos. Without it the cursor only exists in the OS layer, which screen recordings don’t capture.

  • imgui_narrate — a sticky-note callout with optional connector line to a target widget. Auto-fits to avoid sibling overlap; frames controls visibility duration.

  • imgui_highlight — a colored rectangle around a widget’s bbox for N frames. Useful for “look here” without text. imgui_auto_highlight flips a global flag that fires highlight on every accepted live command — a one-shot debug aid.

Two more for keyboard work:

  • imgui_key_hud — bottom-center keycap overlay + modifier strip. Pops a keycap for every synth key event; lights modifier pills while held. Useful for input-heavy demos.

  • imgui_focus_rect — a rectangle around io.active_widget so the viewer can tell which widget keystrokes will land in.
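All of these toggles take the same JSON args shape the driver posts. A Python sketch of building those args; the 0xRRGGBBAA byte order for color is an assumption of this sketch, not confirmed by the source, so check imgui_visual_aids.das for the actual packing:

```python
def rgba(r, g, b, a):
    # Pack four 0-255 channels into one uint. The 0xRRGGBBAA order is an
    # assumption; verify against imgui_visual_aids.das before relying on it.
    return (r << 24) | (g << 16) | (b << 8) | a


# Args for the two aids every recording enables, plus a tinted, slower trail:
sprite_args = {"enabled": True}
trail_args = {"enabled": True, "fade": 0.75, "color": rgba(255, 200, 0, 255)}
print(hex(trail_args["color"]))  # 0xffc800ff
```

These dicts are exactly what lands in the "args" field of a /command POST, whether it comes from the driver, curl, or any other client.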

The driver workflow in shell

The two-shell pattern every tutorial uses:

# shell 1 — host with cwd = the asset dir so the APNG lands there
cd modules/dasImgui/doc/source/_static/tutorials
../../../../../../bin/Release/daslang-live.exe \
    ../../../../examples/tutorial/recording.das

# shell 2 — fire the driver, exits when done
bin/Release/daslang.exe modules/dasImgui/tests/integration/record_recording.das

The host stays alive after the driver exits; if the recording came out badly, just re-run shell 2 — no need to restart daslang-live. The driver script removes the previous file (rm -f <name>.apng) at the top; if a prior session leaked without a record_stop, record_start returns {"error":"already recording"}.

Stop conditions

record_stop is the clean exit. Three other ways the writer finalizes:

  • max_seconds expires — same APNG, frame count + duration match the cap.

  • stbi_apng_frame returns an error (rare; usually disk full or permission denied) — writer auto-stops, returns the partial APNG.

  • daslang-live shuts down — the [on_app_exit] hook on the recorder finalizes any in-flight ring before tearing down GL state.

In practice record_stop is the only exit you’ll see; the others are safety nets.

Replay stability

The same driver run against the same subject produces APNGs that look the same to a viewer — but they’re not byte-identical. ImGui’s frame times jitter, vsync alignment shifts, the PBO ring may stall once or twice on encoder backpressure. dropped in the response is the useful metric: anything under capture_pbo_count + a few is fine. Higher means the encoder couldn’t keep up — try lowering the subject’s render rate, simplifying the subject, or raising capture_pbo_count.
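That dropped threshold is easy to check mechanically at the end of a run. A hypothetical Python sketch, assuming the stat names described for the record_stop response and the default ring of 4 PBOs; the slack value is this sketch's reading of "a few":

```python
def recording_ok(stats, capture_pbo_count=4, slack=3):
    # "anything under capture_pbo_count + a few is fine" — slack is the "few".
    if not stats.get("ok", False):
        return False
    return stats.get("dropped", 0) <= capture_pbo_count + slack


stats = {"saved": "recording.apng", "frames": 240, "duration_s": 8.0,
         "dropped": 2, "ok": True}
print(recording_ok(stats))  # True
```

A driver could panic on a False here, turning a visibly choppy APNG into a loud failure instead of a silently committed artifact.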

Standalone vs live

The whole recording surface — visual aids, record_start / record_stop, the playwright helpers — runs only under daslang-live. Standalone daslang.exe runs the subject, but the HTTP-bound live commands aren’t registered. The driver script itself is a normal daslang.exe script — it just talks HTTP to a host running on another process.

Driving from outside

The same JSON commands the driver issues are reachable from curl:

curl -X POST -d '{"name":"imgui_mouse_trail","args":{"enabled":true}}'      localhost:9090/command
curl -X POST -d '{"name":"imgui_cursor_sprite","args":{"enabled":true}}'    localhost:9090/command
curl -X POST -d '{"name":"record_start","args":{"file":"manual.apng","fps":30,"max_seconds":15}}' \
     localhost:9090/command

# perform whatever interactions you like in the live window ...

curl -X POST -d '{"name":"record_stop"}' localhost:9090/command

Use this for ad-hoc captures when the scripted driver would be overkill — bug repros, “show me what this looks like” pings.
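The same calls are one step away from a scripted client in any language with an HTTP stack. A Python sketch using only the standard library; the payload shape mirrors the curl examples above, and the POST itself naturally requires a daslang-live host listening on port 9090:

```python
import json
from urllib import request

BASE_URL = "http://127.0.0.1:9090"


def command_payload(name, args=None):
    # The same JSON body the curl examples send to /command.
    body = {"name": name}
    if args is not None:
        body["args"] = args
    return json.dumps(body).encode()


def post_command(name, args=None):
    req = request.Request(BASE_URL + "/command", data=command_payload(name, args))
    with request.urlopen(req) as resp:  # needs a running daslang-live
        return json.loads(resp.read())


# e.g. post_command("record_start", {"file": "manual.apng", "fps": 30, "max_seconds": 15})
print(command_payload("record_stop").decode())  # {"name": "record_stop"}
```

Whether the response body is JSON for every command is an assumption here; the driver's record_stop handling above suggests it is at least for the recorder commands.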

The end of the curriculum

The 12 tutorials in this set walk the dasImgui surface end-to-end: basics, widgets, layout, docking, style, identity, state, containers, live reload, the JSON-driven view, the visual aids tour, and this recording recipe. Beyond the curriculum, every examples/features/ demo is a richer reference for one specific surface — those are the go-to once the tutorials are familiar.

See also

Subject source: examples/tutorial/recording.das

Driver source: tests/integration/record_recording.das

Recorder implementation: modules/dasOpenGL/opengl/opengl_live.das (record_start / record_stop plus the PBO ring).

Visual aids: modules/dasImgui/widgets/imgui_visual_aids.das.

Playwright helpers: modules/dasImgui/widgets/imgui_playwright.das.

Previous tutorial: Visual aids tour

Curriculum top: dasImgui tutorials