Driving from outside

Every boost widget written in the previous tutorials registers a path in the telemetry tree (DRIVE_WIN/USER, DRIVE_WIN/SPEED, DRIVE_WIN/RUN_BTN …). That path is also an HTTP endpoint: the boost layer ships [live_command] handlers (imgui_snapshot, imgui_set, imgui_click, imgui_open, imgui_close, imgui_focus) that look up the target in the registry and queue the matching pending field on the widget’s state struct. Next frame, the render function consumes the pending field and ImGui sees the effect as if a real input device had driven it.

This tutorial flips the point of view: instead of writing the daslang side, you write the driver — a curl / Python / Bash script that issues JSON commands to a running daslang-live app. Every interaction the user could perform via mouse/keyboard has a curl equivalent, and the two surfaces drive one in-memory model.

Source: examples/tutorial/driving_outside.das — a small target app exposing five widget kinds. The recording is driven entirely by JSON commands — no synthesized mouse moves into widget centers.

Walkthrough

driving_outside recording
options gen2

require imgui
require imgui_app
require glfw/glfw_boost
require opengl/opengl_boost
require live/glfw_live
require live/live_api
require live/live_commands
require live/live_vars
require live/opengl_live
require live_host
require imgui/imgui_live
require imgui/imgui_boost_runtime
require imgui/imgui_boost_v2
require imgui/imgui_widgets_builtin
require imgui/imgui_containers_builtin
require imgui/imgui_visual_aids

let PRESETS : array<string> <- ["Calm", "Mellow", "Active", "Frantic"]
var PRESET_ROW : table<int; ClickState>

// =============================================================================
// TUTORIAL: driving_outside — every boost widget is also a JSON endpoint.
//
// Previous tutorials wrote the daslang side. This one is the inverse view:
// what the JSON command surface looks like as a programming model in its
// own right. Every widget the boost layer ships registers a path in the
// telemetry tree; that path is also addressable from outside via the
// [live_command] HTTP endpoints `imgui_snapshot` / `imgui_set` /
// `imgui_click` / `imgui_open` / `imgui_close` / `imgui_focus`.
//
// The target app is small — slider + button + input + popup + combo — so
// the driving recipes are easy to spot. Every interaction the user could
// perform via mouse/keyboard has a curl equivalent.
//
// STANDALONE: daslang.exe modules/dasImgui/examples/tutorial/driving_outside.das
// LIVE:       daslang-live modules/dasImgui/examples/tutorial/driving_outside.das
//
// DRIVE (curl recipes — pair these with the RST walkthrough):
//
//   # Snapshot the world (always the first read in any driver)
//   curl -X POST -d '{"name":"imgui_snapshot"}' localhost:9090/command
//
//   # imgui_set — drive a slider value
//   curl -X POST -d '{"name":"imgui_set","args":{"target":"DRIVE_WIN/SPEED","value":7}}' \
//        localhost:9090/command
//
//   # imgui_click — fire a button
//   curl -X POST -d '{"name":"imgui_click","args":{"target":"DRIVE_WIN/RUN_BTN"}}' \
//        localhost:9090/command
//
//   # imgui_set — string into a text input
//   curl -X POST -d '{"name":"imgui_set","args":{"target":"DRIVE_WIN/USER","value":"Alice"}}' \
//        localhost:9090/command
//
//   # imgui_set — combo by selected-index
//   curl -X POST -d '{"name":"imgui_set","args":{"target":"DRIVE_WIN/PRESET","value":2}}' \
//        localhost:9090/command
//
//   # imgui_open / imgui_close — popups, closable windows, tabs
//   curl -X POST -d '{"name":"imgui_open","args":{"target":"DRIVE_WIN/STATUS_POPUP"}}' \
//        localhost:9090/command
//   curl -X POST -d '{"name":"imgui_close","args":{"target":"DRIVE_WIN/STATUS_POPUP"}}' \
//        localhost:9090/command
// =============================================================================

[export]
def init() {
    live_create_window("dasImgui driving_outside tutorial", 720, 540)
    live_imgui_init(live_window)
    var io & = unsafe(GetIO())
    io.FontGlobalScale = 1.5
}

[export]
def update() {
    if (!live_begin_frame()) return
    begin_frame()

    ImGui_ImplOpenGL3_NewFrame()
    ImGui_ImplGlfw_NewFrame()
    apply_synth_io_override()
    NewFrame()

    SetNextWindowPos(ImVec2(30.0f, 30.0f), ImGuiCond.FirstUseEver)
    SetNextWindowSize(ImVec2(640.0f, 460.0f), ImGuiCond.FirstUseEver)
    window(DRIVE_WIN, (text = "driving_outside", closable = false,
                       flags = ImGuiWindowFlags.None)) {

        // ---- A text input — driven by `imgui_set` with a string value ----
        input_text(USER, (text = "User"))
        text("USER.value = \"{USER.value}\"")

        separator(DR_SEP_1)

        // ---- A slider — driven by `imgui_set` with a number value ----
        slider_int(SPEED, (text = "Speed"))
        text("SPEED.value = {SPEED.value}")

        separator(DR_SEP_2)

        // ---- A combo — driven by `imgui_set` with the selected index ----
        var preset_label = "(none)"
        if (PRESET.value >= 0 && PRESET.value < length(PRESETS)) {
            preset_label = PRESETS[PRESET.value]
        }
        combo_select(PRESET, (text = "Preset",
                              preview_value = preset_label,
                              flags = ImGuiComboFlags.None)) {
            for (i in range(length(PRESETS))) {
                let is_sel = (i == PRESET.value)
                if (selectable_label(PRESET_ROW[i], PRESETS[i], is_sel)) {
                    PRESET.value = i
                }
            }
        }
        text("PRESET = {preset_label} (idx {PRESET.value})")

        separator(DR_SEP_3)

        // ---- A button — fired by `imgui_click` ----
        if (button(RUN_BTN, (text = "Run"))) {}
        text("RUN_BTN.click_count = {RUN_BTN.click_count}")

        separator(DR_SEP_4)

        // ---- A popup — opened/closed via `imgui_open` / `imgui_close` ----
        text("STATUS_POPUP — driven by imgui_open / imgui_close.")
        popup(STATUS_POPUP, (text = "StatusPopup",
                             flags = ImGuiWindowFlags.None)) {
            text("Driven from outside via imgui_open.")
            separator(DR_SEP_5)
            text("RUN_BTN.click_count = {RUN_BTN.click_count}")
        }
    }

    end_of_frame()
    Render()
    var w, h : int
    live_get_framebuffer_size(w, h)
    glViewport(0, 0, w, h)
    glClearColor(0.10f, 0.10f, 0.12f, 1.0f)
    glClear(GL_COLOR_BUFFER_BIT)
    ImGui_ImplOpenGL3_RenderDrawData(GetDrawData())

    live_end_frame()
}

[export]
def shutdown() {
    live_imgui_shutdown()
    live_destroy_window()
}

[export]
def main() {
    init()
    while (!exit_requested()) {
        update()
    }
    shutdown()
}

The command surface

Three dispatch levels, from lowest to highest abstraction:

  • L1 — synth IO: imgui_mouse_pos, imgui_mouse_button, imgui_mouse_play, imgui_key_press, imgui_key_type. The driver pretends to be a mouse or keyboard, feeding events into the ImGui input queue. Used by imgui_playwright for cursor-visible recordings.

  • L2 — semantic actions on registered widgets: imgui_click, imgui_focus. The framework looks up the widget by path and queues state.pending_click = true (or the equivalent). Whatever the widget’s render function would do on a real click happens next frame. No bbox lookup, no cursor motion.

  • L3 — value writes: imgui_set. Same lookup, queues state.has_pending = true + state.pending_value = .... The render function consumes the pending value and submits it to ImGui on the next frame.

Plus the read side and the container channel:

  • imgui_snapshot — full registry as JSON, the first call in any driver.

  • imgui_open / imgui_close — set state.pending_open / state.pending_close on container widgets (popups, closable windows, tabs, tree nodes).

Always L2/L3 first — they’re race-free under the framework’s per-frame consumption rule. Drop to L1 only when there’s no L2 counterpart (drag along a custom trajectory, paste a long string into a focused input, sustain a chord, …).
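
All three levels share one wire format: a POST body with a name and optional args. A minimal Python driver helper, matching the curl recipes above; the helper names (make_command, command) are this sketch's own, and it assumes the daslang-live default endpoint localhost:9090/command:

```python
import json
import urllib.request

ENDPOINT = "http://localhost:9090/command"

def make_command(name, **args):
    """Build the JSON body that every [live_command] handler expects:
    {"name": ...} plus an optional {"args": {...}} object."""
    body = {"name": name}
    if args:
        body["args"] = args
    return json.dumps(body).encode("utf-8")

def command(name, **args):
    """POST one command and decode the JSON response."""
    req = urllib.request.Request(ENDPOINT, data=make_command(name, **args),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Usage against a running daslang-live app:
#   command("imgui_set", target="DRIVE_WIN/SPEED", value=7)   # L3 value write
#   command("imgui_click", target="DRIVE_WIN/RUN_BTN")        # L2 semantic click
```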

imgui_snapshot — read the world

The first call every driver makes:

curl -X POST -d '{"name":"imgui_snapshot"}' localhost:9090/command

Response shape:

{
  "frame": 412,
  "globals": {
    "DRIVE_WIN": {
      "kind": "window",
      "bbox": [30, 30, 670, 490],
      "hex_id": "0x2c1a8f4b",
      "payload": { "open": true, "size": [640, 460], ... }
    },
    "DRIVE_WIN/SPEED": {
      "kind": "slider_int",
      "bbox": [...],
      "hex_id": "0x...",
      "payload": { "value": 5, "bounds": [0, 10], ... }
    },
    "DRIVE_WIN/RUN_BTN": {
      "kind": "button",
      "bbox": [...],
      "payload": { "click_count": 0 }
    },
    ...
  },
  "io": {
    "mouse_pos": [320, 180],
    "active_widget": "..."
  }
}

Use it to:

  • discover what’s on screen and what kind each widget is;

  • read bbox for L1 mouse synthesis (when needed);

  • check payload for current state (test assertions);

  • read hex_id for fallback dispatch when the path isn’t stable.
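
Drivers typically wrap those reads in small helpers. A Python sketch against the response shape shown above (the helper names and the sample bbox are illustrative; the transport can be any HTTP client):

```python
def widget(snapshot, path):
    """Look up one registered widget by its telemetry path."""
    return snapshot["globals"][path]

def value(snapshot, path):
    """Read a widget's current value out of its payload."""
    return widget(snapshot, path)["payload"]["value"]

# A trimmed snapshot in the shape documented above:
snap = {
    "frame": 412,
    "globals": {
        "DRIVE_WIN/SPEED": {
            "kind": "slider_int",
            "bbox": [40, 120, 620, 144],          # illustrative coordinates
            "payload": {"value": 5, "bounds": [0, 10]},
        },
    },
}

assert widget(snap, "DRIVE_WIN/SPEED")["kind"] == "slider_int"
assert value(snap, "DRIVE_WIN/SPEED") == 5       # test assertion on payload
```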

imgui_set — value writes

imgui_set is the universal value-write endpoint — slider, checkbox, combo, color, text input, dock-window position. The handler type-dispatches on the value’s JSON shape:

# string into a text input
curl -X POST -d '{"name":"imgui_set","args":{"target":"DRIVE_WIN/USER","value":"Alice"}}' \
     localhost:9090/command

# int into a slider
curl -X POST -d '{"name":"imgui_set","args":{"target":"DRIVE_WIN/SPEED","value":7}}' \
     localhost:9090/command

# int into a combo (selected index)
curl -X POST -d '{"name":"imgui_set","args":{"target":"DRIVE_WIN/PRESET","value":2}}' \
     localhost:9090/command

# array-of-floats into a color picker
curl -X POST -d '{"name":"imgui_set","args":{"target":"DRIVE_WIN/TINT","value":[0.2,0.7,0.4]}}' \
     localhost:9090/command

Under the hood: the registered dispatcher for the widget’s state struct unpacks the JSON, type-checks it against the state’s value field, flips has_pending = true, stores pending_value. The render function picks it up next frame; ImGui submits the new value through its own UpdateValue path.
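
The shape-based routing can be pictured in a few lines. This is an illustration of the idea, not the framework's actual dispatcher; the labels are invented for the sketch:

```python
def classify(value):
    """Route an imgui_set value by its JSON shape, the way the per-kind
    dispatchers type-check it before storing pending_value."""
    if isinstance(value, bool):
        return "checkbox"                # bool before int: bool is an int subtype
    if isinstance(value, int):
        return "slider / combo index"
    if isinstance(value, str):
        return "text input"
    if isinstance(value, list) and all(isinstance(v, (int, float)) for v in value):
        return "color components"
    raise TypeError("unsupported value shape")

assert classify(7) == "slider / combo index"
assert classify("Alice") == "text input"
assert classify([0.2, 0.7, 0.4]) == "color components"
```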

imgui_click — fire a click

imgui_click doesn’t synthesize a mouse event — it sets state.pending_click = true on the registered ClickState (or its equivalent on selectable / menu_item):

curl -X POST -d '{"name":"imgui_click","args":{"target":"DRIVE_WIN/RUN_BTN"}}' \
     localhost:9090/command

The button’s render function returns true on the next frame just as it would for a real click: click_count increments, and the daslang side sees both the inline if (button(...)) and RUN_BTN.clicked / RUN_BTN.click_count as expected. No mouse position has to land on the button.
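
The pending-channel model behind that is small enough to sketch. Field names follow the prose (pending_click, clicked, click_count); the code is an illustration of the mechanism, not the framework source:

```python
class ClickState:
    """Mirror of the ClickState fields named in the prose."""
    def __init__(self):
        self.pending_click = False   # queued by the imgui_click handler
        self.clicked = False         # what the script reads after the frame
        self.click_count = 0

def handle_imgui_click(state):
    """What the [live_command] handler does at dispatch time: queue, don't act."""
    state.pending_click = True

def render_button(state):
    """Next frame: the render function consumes the flag exactly once."""
    pressed = state.pending_click    # a real mouse press would set the same result
    state.pending_click = False
    state.clicked = pressed
    if pressed:
        state.click_count += 1
    return pressed

s = ClickState()
handle_imgui_click(s)
assert render_button(s) is True      # the frame after the command: one click
assert s.click_count == 1
assert render_button(s) is False     # flag consumed; no repeat on later frames
```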

imgui_open / imgui_close — containers

Containers expose an open-state channel through state.pending_open and state.pending_close. imgui_open flips pending_open; imgui_close flips pending_close. The next frame’s render function applies the change:

curl -X POST -d '{"name":"imgui_open","args":{"target":"DRIVE_WIN/STATUS_POPUP"}}' \
     localhost:9090/command
curl -X POST -d '{"name":"imgui_close","args":{"target":"DRIVE_WIN/STATUS_POPUP"}}' \
     localhost:9090/command

The same channel handles popups, closable windows, tabs (closable-tab visibility specifically — see tutorial_containers for the tab-item caveat), tree nodes, and collapsing headers.

The flow on a single command

Every command runs the same path:

  1. daslang-live HTTP server receives POST /command.

  2. Routes by name to the registered [live_command] handler.

  3. Handler looks up target in the registry’s path map (or hex_id reverse map).

  4. Mutates the matching state struct’s pending field; returns {"ok": true, ...} in the HTTP response.

  5. Next frame the script’s update() runs; the widget’s render function sees the pending flag, applies it, ImGui responds, the updated state is observable from the next imgui_snapshot.

So commands are one-frame-delayed by design — there’s no ambiguity about which frame’s state corresponds to a given response. For test harnesses that need to read the result, the canonical pattern is: command, then await_quiescent (waits a frame), then imgui_snapshot.
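
That command / await_quiescent / snapshot pattern can be sketched with the transport injected, so the sequencing logic is visible on its own. The helper name set_and_check and the FakeApp stand-in are this sketch's inventions; it assumes await_quiescent is issued like any other command:

```python
def set_and_check(send, target, value):
    """Write, let one frame consume the pending value, then read it back.
    `send` posts one JSON command dict and returns the decoded response."""
    send({"name": "imgui_set", "args": {"target": target, "value": value}})
    send({"name": "await_quiescent"})        # waits a frame, per the pattern above
    snap = send({"name": "imgui_snapshot"})
    return snap["globals"][target]["payload"]["value"]

class FakeApp:
    """Stand-in transport that mimics the one-frame-delay rule."""
    def __init__(self):
        self.pending = None
        self.value = 5
    def __call__(self, cmd):
        if cmd["name"] == "imgui_set":
            self.pending = cmd["args"]["value"]   # queued, not yet applied
        elif cmd["name"] == "await_quiescent":
            if self.pending is not None:          # the frame runs: consume pending
                self.value, self.pending = self.pending, None
        elif cmd["name"] == "imgui_snapshot":
            return {"globals": {"DRIVE_WIN/SPEED": {"payload": {"value": self.value}}}}
        return {"ok": True}

assert set_and_check(FakeApp(), "DRIVE_WIN/SPEED", 7) == 7
```

Against a live app, `send` would be the HTTP POST to localhost:9090/command; the sequencing is identical.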

Standalone vs live

The HTTP server only exists under daslang-live. Standalone daslang.exe runs the same script but the live-command endpoints aren’t bound — drive-from-outside scenarios require the live host.

Driving from outside (recap)

A complete drive sequence for this tutorial’s app:

# Read the world
curl -X POST -d '{"name":"imgui_snapshot"}' localhost:9090/command

# Write each widget kind
curl -X POST -d '{"name":"imgui_set","args":{"target":"DRIVE_WIN/USER","value":"Alice"}}' \
     localhost:9090/command
curl -X POST -d '{"name":"imgui_set","args":{"target":"DRIVE_WIN/SPEED","value":7}}' \
     localhost:9090/command
curl -X POST -d '{"name":"imgui_set","args":{"target":"DRIVE_WIN/PRESET","value":2}}' \
     localhost:9090/command
curl -X POST -d '{"name":"imgui_click","args":{"target":"DRIVE_WIN/RUN_BTN"}}' \
     localhost:9090/command
curl -X POST -d '{"name":"imgui_open","args":{"target":"DRIVE_WIN/STATUS_POPUP"}}' \
     localhost:9090/command

The recording at the top of this page runs this exact sequence — no mouse moves, just JSON commands. Every effect you see in the APNG is the result of an imgui_set / imgui_click / imgui_open flowing through the state-struct pending channel.

Next steps

Now that the JSON-driven view is explicit, the visual aids tour walks through every overlay the recordings used: highlight, mouse trail, cursor sprite, narrate, key HUD, focus rect — all [live_command]-wrapped so the same curl pattern reaches them.

See also

Full source: examples/tutorial/driving_outside.das

Richer reference: examples/features/io_synth_text.das — imgui_key_type chains a coroutine that pushes one AddInputCharactersUTF8 per frame; the L1 keyboard layer in action.

Snapshot contract: imgui_boost_runtime.das’s g_serializers per-kind payload definitions.

Previous tutorial: Live reload

Boost macros — the macro layer.