AnnoClaw Workflow

Continue AnnoClaw tasks in one console.

Create workflows, check status, jump into human review, and continue through training, export, and delivery here.


OpenClaw hosted entrypoints and resources

Current site base URL
https://www.tjmakebot.com
Hosted endpoint
https://www.tjmakebot.com/api/openclaw-gateway/workflows
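As an illustrative sketch only: the hosted gateway's request schema is not documented on this page, so every field name below is an assumption. This shows roughly what a create-workflow request body for the endpoint above could carry.

```python
import json

# Hosted endpoint from this console; the payload schema is a guess,
# not the documented AnnoClaw gateway contract.
ENDPOINT = "https://www.tjmakebot.com/api/openclaw-gateway/workflows"

payload = {
    "projectId": "proj-demo-001",  # hypothetical main-site project ID
    "datasetUri": "https://cdn.example.com/datasets/street-scenes.zip",
    "labels": ["person", "car", "traffic-light"],
}

body = json.dumps(payload)
print(body)
# To actually create the workflow you would POST `body` to ENDPOINT
# with your signed-in credentials (a paid workflow plan is required).
```

The response would carry the workflowId that the rest of this console (review, training, export, handoff) targets.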
Compatibility templates and debug entry (expand only for engineering)
These templates remain for node-level debugging, migration, and minimal smoke-test validation. They are no longer the default entry point.
Recommended path: OpenClaw starts the flow, TjMakeBot Workbench hosts the agent workflow, and the main-site editor handles human refinement.
Plan boundary for workflow actions
Current plan
Free Studio
Workflow execution access
Creating workflows, agent resume, training, export, and compatibility tasks require Pro Workflow or higher.
Collab binding access
Binding collab projects and direct review completion require Team Review Cloud or Enterprise.
Sign in first, then use a paid workflow plan to run hosted workflow actions.

Hosted workflow setup

The left side only configures this hosted workflow. The real running state, current stage, and next action all come from the workflow lane on the right.

If you are unsure what to do, change only three fields first: Project ID, dataset or image URI, and annotation labels. After creation, stop guessing and follow the workflow lane on the right.
On-screen hint
You are on step 1: Create the hosted workflow first

Save the left-side setup as one workflowId first. Without it, review, training, export, and handoff cannot target the right session.

Area to focus on: Only fill in the first left-side section right now. Training and export settings can stay at their defaults.
Step 1 / Start here

Basic setup: fill these first

Editable

For a first run, filling out only this section is enough. Leave training and export settings at their defaults.

Images to annotate

Tell the system which images to label first

Two input modes are supported here: 1) provide a single dataset zip URI, or 2) submit an image URI list, one URI per line. This automation workbench does not yet support uploading local files directly from the browser.

If your images only exist on your local computer, first import and organize them in the main-site editor, then return here with cloud-accessible URIs.
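The two input modes above can be sketched as one normalization step. The output field names ("datasetUri" / "imageUris") are assumptions for illustration, not the documented schema.

```python
def build_image_input(raw: str) -> dict:
    """Treat a single .zip URI as a dataset archive; otherwise treat
    each non-empty line as one image URI (hypothetical field names)."""
    lines = [ln.strip() for ln in raw.splitlines() if ln.strip()]
    if len(lines) == 1 and lines[0].endswith(".zip"):
        return {"datasetUri": lines[0]}
    return {"imageUris": lines}

# Mode 1: one dataset zip URI
print(build_image_input("https://cdn.example.com/batch1.zip"))
# Mode 2: image URIs, one per line
print(build_image_input(
    "https://cdn.example.com/a.jpg\nhttps://cdn.example.com/b.jpg"))
```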
Annotation label design

Then tell the system which objects to annotate

Labels are the object names you want the AI to annotate, such as person, car, or traffic-light. Use the quick-add buttons below or add your own labels manually.

Quick add common labels
These labels will be annotated
person · car · traffic-light
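Combining the quick-add buttons with manually entered labels amounts to an ordered, deduplicated merge. A minimal sketch (the quick-add set is taken from this page; the merge logic is an assumption about how the form behaves):

```python
QUICK_ADD = ["person", "car", "traffic-light"]

def merge_labels(quick: list[str], manual: list[str]) -> list[str]:
    """Merge quick-add and manual labels, preserving order and
    dropping blanks and duplicates."""
    seen, out = set(), []
    for name in [*quick, *manual]:
        name = name.strip()
        if name and name not in seen:
            seen.add(name)
            out.append(name)
    return out

print(merge_labels(QUICK_ADD, ["car", "bicycle"]))
# ['person', 'car', 'traffic-light', 'bicycle']
```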
Part 2: Training and export settings (defaults usually work)
Most users do not need to open this right away. Review the relevant group only when you reach training or export.
Step 3

Training settings

Only review this section when you are ready to start training. The defaults are good for a first demo.

Step 5

Export settings

You only need this section after training finishes. Ignore it during earlier steps.

Advanced mode: manual takeover / compatibility (most users can ignore this)
This area contains only recovery and engineering-debug actions. Buttons are strictly filtered by the current workflow's allowedActions, so you only see manual steps that are actually allowed right now.

Manual takeover actions available now

If an action does not appear here, the current workflow does not allow it. In normal use, stay on the main flow.

The current workflow does not need manual takeover right now. Waiting for live sync or following the current stage lane is the safer choice.

Compatibility validation and low-level task debugging

Reserved for OpenClaw engineering validation, minimal smoke tests, and node-level API debugging. Most users do not need this area.

Workflow and task status messages appear here.
Current workflow lane

Step 1: Create the hosted workflow first

Save the left-side setup as one workflowId first. Without it, review, training, export, and handoff cannot target the right session.

Not started
Automation workbench
Next action
-
Live sync
-
Workflow ID
-
Overall progress
0%
Done when
After creation, the lane switches to "Open human review" and shows the workflowId, next action, and allowed actions.
Avoid this
Please complete the following: 1. Provide dataset URI in "Images to annotate"

Workflow map: who owns each stage

This lane exists for one reason: to make the current stage, owner, and next action obvious. The main site owns the frontend pages and human handoff, while AnnoClaw owns the workflow state machine.

1
Create hosted workflow
Hosted frontend · Do this now
The left-side configuration is saved as one workflowId. Human review, training, export, and handoff all continue from that workflow.
2
Hosted human review
Human review · Waiting for workflow
The actual human review happens in the main-site editor. Open the review page, finish the review there, and return control to the agent.
3
OpenClaw agent resumes training
OpenClaw agent · Waiting for review
After review, the OpenClaw workflow advances training using allowedActions and resumeAction. Manual takeover is only needed for recovery.
4
Agent exports the model
Export stage · Waiting for training
After training completes, the OpenClaw agent triggers export and writes the artifact URI and version back to this workflow.
5
Review artifacts and handoff links
Hosted handoff · Waiting for export
When the loop completes, results stay on the hosted frontend, including the artifact URI, training handoff page, and editor deep-links.

Live snapshot

Idle
Workflow ID
-
Project ID
-
Collab project ID
-
Submitted assets
-
Training epoch
-
Dataset version ID
-
Release ID
-
Artifact URI
-
Artifact file
-
Next action: -
Agent resume action: -
Human task: No pending human task
Project: -
Collab project: -
Dataset version: -
Release ID: -
Annotation labels: -
Recommended plan: -
Download entry: -
Overall progress: 0%
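Reading the live snapshot amounts to parsing one status object and surfacing the next action. The JSON shape below is an assumption; the real response schema is not documented in this console.

```python
import json

# Hypothetical snapshot payload -- field names are assumptions.
sample = json.loads("""{
  "workflowId": "wf-123",
  "status": "awaiting_review",
  "nextAction": "openHumanReview",
  "resumeAction": "resumeTraining",
  "progress": 0.4
}""")

print(f"{sample['workflowId']}: {sample['status']} "
      f"(next: {sample['nextAction']}, {sample['progress']:.0%})")
# wf-123: awaiting_review (next: openHumanReview, 40%)
```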

Hosted pages and result handoff

All human-facing pages stay on the TjMakeBot main site, including the review page, training recovery page, and result handoff page; AnnoClaw provides the workflow state and action permissions.

Collab project linked: Pending
-
This workflow is not mapped to a collab project yet, so downstream review and delivery deep-links stay unavailable.
Dataset version synced: Pending
-
This is usually created after review approval as delivery continues.
Release created: Pending
-
This is typically created after training/export so delivery and training lineage can be traced.
Review URL: -
Training URL: -

Annotation task

Shared annotation entry for the agent workflow and low-level API tests

Idle
Task ID: -
Created: -

Training task

Can be restored into the training view or polled directly

Idle
Task ID: -
Created: -

Export task

Artifact URI, format, and callback are surfaced here

Idle
Task ID: -
Created: -