···
 ## Configuration
 
+Core configuration controls how tack talks to Tangled. Provider-specific
+configuration (e.g. Buildkite) lives in its own section below.
+
 ### Required
 
 | Env var | Description |
···
 | `TACK_JETSTREAM_URL` | Tangled Jetstream WebSocket URL |
 | `TACK_DEV` | Use `ws://` for knot event-streams (any non-empty value) |
 
-### Buildkite
+When no provider is configured, tack runs an in-process fake provider
+that's useful for exercising the jetstream → knot → `/events` flow
+locally without a real CI account.
+
+## Buildkite
+
+[Buildkite](https://buildkite.com) is the primary provider tack
+supports today. In Buildkite mode, every Tangled pipeline trigger
+fans out into one Buildkite build per workflow on the pipeline that
+workflow names; build state flows back to Tangled via Buildkite's
+notification webhooks.
+
+### How it fits together
+
+```
+ sh.tangled.pipeline                           Buildkite
+ trigger record ──▶ tack ──▶ Create Build ────────┐
+                                                  │
+ /webhooks/buildkite ◀──── notification ◀─────────┘
+         │
+         ▼
+ sh.tangled.pipeline.status  (broadcast on /events)
+```
+
+* **Spawn:** for each workflow on a pipeline trigger, tack POSTs to
+  `/v2/organizations/<org>/pipelines/<slug>/builds`. Both `<org>` and
+  `<slug>` come from the workflow's YAML body (see
+  [Configuring your workflows](#configuring-your-workflows)).
+* **Track:** tack persists the resulting `(build_uuid → knot, rkey,
+  workflow)` mapping in its local SQLite store so it can later
+  resolve incoming webhooks back to the originating Tangled
+  pipeline.
+* **Report:** Buildkite delivers `build.*` events to
+  `POST /webhooks/buildkite`. tack authenticates each request,
+  translates the Buildkite state into a Tangled status, and
+  broadcasts a `sh.tangled.pipeline.status` record on `/events`.
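
The Track step's mapping can be pictured as a tiny keyed store. This is an in-memory sketch only — tack's real store is SQLite-backed, and the type and method names here are illustrative, not tack's API:

```go
package main

import "fmt"

// buildRef mirrors the (build_uuid → knot, rkey, workflow) mapping
// described above; the real thing lives in a SQLite table.
type buildRef struct {
	Knot, PipelineRkey, Workflow string
}

// refStore is an in-memory stand-in for the SQLite store: webhooks
// arrive keyed by Buildkite build UUID, and the store answers which
// Tangled pipeline that build belongs to.
type refStore struct {
	byUUID map[string]buildRef
}

func (s *refStore) insert(uuid string, ref buildRef) { s.byUUID[uuid] = ref }

// resolve answers the Report-side question: which Tangled pipeline
// does this incoming Buildkite webhook refer to?
func (s *refStore) resolve(uuid string) (buildRef, bool) {
	ref, ok := s.byUUID[uuid]
	return ref, ok
}

func main() {
	st := &refStore{byUUID: map[string]buildRef{}}
	st.insert("uuid-1", buildRef{Knot: "knot.example.com", PipelineRkey: "3kabc", Workflow: "ci.yml"})
	ref, ok := st.resolve("uuid-1")
	fmt.Println(ok, ref.Workflow) // true ci.yml
}
```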
+
+### Setting up Buildkite
+
+These steps happen once on the Buildkite side, before tack can talk
+to it.
+
+#### 1. Create one or more pipelines
+
+Each Tangled workflow targets exactly one Buildkite pipeline by
+slug. There's no requirement that pipelines map 1:1 to workflows —
+many users point every workflow at a single pipeline whose
+`pipeline.yml` does `pipeline upload some-file-${TACK_WORKFLOW}.yml`,
+keeping all the per-workflow logic in the repo rather than in
+Buildkite's UI.
+
+In your Buildkite org, **Pipelines → New pipeline**:
+
+* Repository: any URL (the agent only needs to be able to clone it).
+* Steps: a minimal `pipeline upload` is usually enough — tack passes
+  the workflow name through `$TACK_WORKFLOW` so you can branch on
+  it.
+
+Note the pipeline slug from the URL
+(`https://buildkite.com/<org>/<pipeline-slug>`); your workflow YAML
+will reference it.
+
+#### 2. Create an API access token
+
+Tack uses a single API token to create builds, list jobs, and fetch
+logs. Generate one at
+<https://buildkite.com/user/api-access-tokens> with these scopes:
+
+| Scope | Used for |
+| -------------------- | ------------------------------------------------ |
+| `read_organizations` | Sanity-checking the configured org slug |
+| `write_builds` | `POST .../builds` when a Tangled trigger arrives |
+| `read_builds` | Resolving build → jobs for the `/logs` endpoint |
+| `read_build_logs` | Streaming job logs back to the Tangled appview |
+
+Restrict the token to the specific organization(s) tack will spawn
+into.
+
+#### 3. Configure a notification webhook
+
+Builds report their state back to tack through Buildkite's
+notification service.
+
+In your Buildkite org, **Settings → Notification Services → Add →
+Webhook**:
+
+* **Webhook URL:** `https://<your-tack-host>/webhooks/buildkite`
+* **Token / Secret:** any high-entropy string. You'll set the same
+  value in `TACK_BUILDKITE_WEBHOOK_SECRET`.
+* **Events:** `build.scheduled`, `build.running`, `build.finished`
+  (job-level events are ignored).
+* **Pipelines:** the pipelines tack will fire builds on.
+
+Buildkite supports two header schemes for authenticating webhooks;
+tack supports both:
 
-Setting `TACK_BUILDKITE_TOKEN` enables Buildkite mode; when unset, tack
-runs the in-process fake provider for local development. When
-Buildkite mode is enabled, every other variable in this section is
+| Header scheme | `TACK_BUILDKITE_WEBHOOK_MODE` | Notes |
+| ----------------------- | ----------------------------- | ------------------------------------------ |
+| `X-Buildkite-Token` | `token` (default) | Secret is sent verbatim in the header |
+| `X-Buildkite-Signature` | `signature` | HMAC-SHA256 of `<timestamp>.<body>`; safer |
+
+Pick `signature` if the notification setting offers it — it doesn't
+expose the secret on the wire.
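
For illustration, verifying the `signature` scheme is a single HMAC computation. This is a sketch assuming the `<timestamp>.<body>` payload and hex-encoded digest described above — not tack's actual handler, and the surrounding header parsing is omitted (check Buildkite's webhook docs for the exact header layout):

```go
package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// sign computes HMAC-SHA256 over "<timestamp>.<body>" keyed with the
// shared webhook secret, hex-encoded — the signature scheme from the
// table above.
func sign(secret, timestamp, body string) string {
	mac := hmac.New(sha256.New, []byte(secret))
	fmt.Fprintf(mac, "%s.%s", timestamp, body)
	return hex.EncodeToString(mac.Sum(nil))
}

// verify recomputes the signature and compares in constant time, so
// the comparison itself doesn't leak secret material.
func verify(secret, timestamp, body, gotHex string) bool {
	want, err := hex.DecodeString(gotHex)
	if err != nil {
		return false
	}
	mac := hmac.New(sha256.New, []byte(secret))
	fmt.Fprintf(mac, "%s.%s", timestamp, body)
	return hmac.Equal(want, mac.Sum(nil))
}

func main() {
	secret := "high-entropy-secret"
	ts, body := "1619071700", `{"event":"build.finished"}`
	sig := sign(secret, ts, body)
	fmt.Println(verify(secret, ts, body, sig))     // true
	fmt.Println(verify(secret, ts, body+"x", sig)) // false: tampered body
}
```

Because the timestamp is part of the signed payload, a receiver can also reject stale timestamps to blunt replay of captured deliveries.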
+
+### Configuring tack
+
+Setting `TACK_BUILDKITE_TOKEN` is the master switch that puts tack
+into Buildkite mode. The other variables in this section are then
 required.
 
 | Env var | Description |
 | ------------------------------- | ------------------------------------------------------------------------------ |
 | `TACK_BUILDKITE_TOKEN` | Buildkite API token (enables Buildkite mode) |
-| `TACK_BUILDKITE_ORG` | Buildkite organization slug |
-| `TACK_BUILDKITE_PIPELINE` | Buildkite pipeline slug to fire builds on |
+| `TACK_BUILDKITE_ORG` | Default Buildkite organization slug (workflows may override via YAML) |
 | `TACK_BUILDKITE_WEBHOOK_SECRET` | Shared secret for `/webhooks/buildkite` auth |
-| `TACK_BUILDKITE_WEBHOOK_MODE` | `token` (default) or `signature` — must match Buildkite's notification setting |
+| `TACK_BUILDKITE_WEBHOOK_MODE` | `token` (default) or `signature` — must match the notification service |
+
+The pipeline a workflow runs against is **not** an environment
+variable. It lives inside the workflow YAML so each repo can target
+its own pipeline without an operator round-trip.
+
+### Configuring your workflows
+
+A Tangled workflow's `raw` body is parsed by tack as YAML. The
+Buildkite settings nest under the `tack.buildkite` key; only
+`tack.buildkite.pipeline` is required — every other field is an
+optional override or extension of what the trigger metadata already
+provides:
+
+```yaml
+tack:
+  buildkite:
+    # Required: which Buildkite pipeline this workflow fires.
+    pipeline: my-pipeline-slug
+
+    # Optional: org override. Defaults to TACK_BUILDKITE_ORG. The API
+    # token must have access to whichever org you target.
+    org: another-org
+
+    # Optional: human-readable build message (default: "tangled: <name>").
+    message: "Custom build message"
+
+    # Optional: pin the commit/branch tack would otherwise derive from
+    # the trigger. Useful for manual triggers (which carry no commit).
+    commit: abcdef0123
+    branch: main
+
+    # Optional: extra env + meta_data merged on top of tack's defaults
+    # (see "What tack injects into every build" below).
+    env:
+      CUSTOM_VAR: value
+    meta_data:
+      custom-key: value
+
+    # Optional: forwarded verbatim to the Buildkite create-build API.
+    clean_checkout: true
+    author:
+      name: "Author Name"
+      email: "author@example.com"
+```
+
+`ignore_pipeline_branch_filters` is not configurable here: tack
+always sends it as true, so a Buildkite branch filter can't silently
+drop a build at create time.
+
+When the trigger is a pull request, tack auto-populates Buildkite's
+`pull_request_base_branch` from the PR target so step-level branch
+filters work without extra config.
+
+#### What tack injects into every build
+
+Regardless of what the workflow YAML adds on top, tack always
+provides the following so your Buildkite pipeline can recover the
+Tangled identity of the build:
+
+| Channel | Key | Value |
+| ----------- | -------------------- | ----------------------------------------- |
+| `env` | `TACK_KNOT` | knot hostname the pipeline came from |
+| `env` | `TACK_PIPELINE_RKEY` | rkey of the originating pipeline record |
+| `env` | `TACK_WORKFLOW` | workflow name (typically a YAML filename) |
+| `env` | `TACK_WORKFLOW_RAW` | the workflow's raw YAML body |
+| `meta_data` | `tack:knot` | same as `TACK_KNOT` |
+| `meta_data` | `tack:pipeline_rkey` | same as `TACK_PIPELINE_RKEY` |
+| `meta_data` | `tack:workflow` | same as `TACK_WORKFLOW` |
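
Read as code, the injection table above amounts to two small map builders. A sketch only — tack's real helper is `envFromTuple`, and the actual values come from the trigger:

```go
package main

import "fmt"

// identityEnv mirrors the env rows of the table: enough for a
// Buildkite step to recover the originating Tangled pipeline.
func identityEnv(knot, rkey, workflow, raw string) map[string]string {
	return map[string]string{
		"TACK_KNOT":          knot,
		"TACK_PIPELINE_RKEY": rkey,
		"TACK_WORKFLOW":      workflow,
		"TACK_WORKFLOW_RAW":  raw,
	}
}

// identityMeta mirrors the meta_data rows, under the tack:* key
// namespace so user-supplied meta_data can't collide by accident.
func identityMeta(knot, rkey, workflow string) map[string]string {
	return map[string]string{
		"tack:knot":          knot,
		"tack:pipeline_rkey": rkey,
		"tack:workflow":      workflow,
	}
}

func main() {
	env := identityEnv("knot.example.com", "3kabc", "ci.yml", "tack: ...")
	fmt.Println(env["TACK_WORKFLOW"]) // ci.yml
	fmt.Println(identityMeta("knot.example.com", "3kabc", "ci.yml")["tack:knot"]) // knot.example.com
}
```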
+
+A common pattern is for the Buildkite pipeline's root step to do a
+`pipeline upload` against a workflow-specific YAML file based on
+`$TACK_WORKFLOW`, e.g.:
+
+```yaml
+# Buildkite pipeline.yml
+steps:
+  - label: ":pipeline: dispatch ${TACK_WORKFLOW}"
+    command: "buildkite-agent pipeline upload .buildkite/${TACK_WORKFLOW}"
+```
···
 	// (useful for local development against a real Tangled
 	// jetstream); when set, the other Buildkite fields are
 	// required and tack will refuse to start without them.
+	//
+	// BuildkiteOrg is the *default* org used when a workflow YAML
+	// doesn't specify one of its own. The pipeline a workflow runs
+	// against is no longer global — it's pulled from the workflow
+	// body itself (see workflowConfig in provider_buildkite.go).
 	BuildkiteToken         string
 	BuildkiteOrg           string
-	BuildkitePipeline      string
 	BuildkiteWebhookSecret string
 	BuildkiteWebhookMode   buildkite.WebhookMode
 }
···
 	Dev:                    os.Getenv("TACK_DEV") != "",
 	BuildkiteToken:         os.Getenv("TACK_BUILDKITE_TOKEN"),
 	BuildkiteOrg:           os.Getenv("TACK_BUILDKITE_ORG"),
-	BuildkitePipeline:      os.Getenv("TACK_BUILDKITE_PIPELINE"),
 	BuildkiteWebhookSecret: os.Getenv("TACK_BUILDKITE_WEBHOOK_SECRET"),
 	BuildkiteWebhookMode: buildkite.WebhookMode(
 		envOr("TACK_BUILDKITE_WEBHOOK_MODE", string(buildkite.WebhookModeToken)),
···
 	if cfg.BuildkiteToken != "" {
 		if cfg.BuildkiteOrg == "" {
 			return cfg, errors.New("TACK_BUILDKITE_ORG is required when TACK_BUILDKITE_TOKEN is set")
-		}
-		if cfg.BuildkitePipeline == "" {
-			return cfg, errors.New("TACK_BUILDKITE_PIPELINE is required when TACK_BUILDKITE_TOKEN is set")
 		}
 		if cfg.BuildkiteWebhookSecret == "" {
 			return cfg, errors.New("TACK_BUILDKITE_WEBHOOK_SECRET is required when TACK_BUILDKITE_TOKEN is set")
···
 	if cfg.BuildkiteToken != "" {
 		bkProvider = newBuildkiteProvider(
 			br, st,
-			buildkite.NewClient(cfg.BuildkiteToken, cfg.BuildkiteOrg),
-			cfg.BuildkitePipeline,
+			buildkite.NewClient(cfg.BuildkiteToken),
+			cfg.BuildkiteOrg,
 			cfg.BuildkiteWebhookSecret,
 			cfg.BuildkiteWebhookMode,
 			logger,
 		)
 		provider = bkProvider
 		logger.Info("buildkite provider enabled",
-			"org", cfg.BuildkiteOrg,
-			"pipeline", cfg.BuildkitePipeline,
+			"default_org", cfg.BuildkiteOrg,
 			"webhook_mode", cfg.BuildkiteWebhookMode,
 		)
 	} else {
provider_buildkite.go  (+185 −71)
···
 // Spawn time and publishes a sh.tangled.pipeline.status record on
 // the in-process broker.
 //
-// Only one Buildkite pipeline is used per spindle (TACK_BUILDKITE_PIPELINE).
-// Every Tangled workflow runs as a build on that single pipeline, with
-// the workflow identity plumbed through env + meta_data. The operator
-// configures their Buildkite pipeline to read those env vars and
-// dispatch accordingly (e.g. via `pipeline upload`). Mapping every
-// Tangled workflow to its own Buildkite pipeline would force operators
-// to provision Buildkite resources for each workflow file in every
-// repo that points at the spindle — friction we don't want to impose.
+// The Buildkite *pipeline slug* a workflow targets is carried inside
+// the workflow's YAML body (Pipeline_Workflow.Raw), not configured
+// globally on the spindle. That keeps tack a thin translator: the
+// repo author decides which Buildkite pipeline runs each Tangled
+// workflow without an operator round-trip. See workflowConfig below
+// for the supported YAML schema.
 
 import (
 	"context"
···
 	"strings"
 	"time"
 
+	"go.yaml.in/yaml/v2"
 	"tangled.org/core/api/tangled"
 
 	"github.com/mitchellh/tack/internal/buildkite"
···
 	bkMetaWorkflow = "tack:workflow"
 )
 
+// workflowConfig is the tack-flavoured schema we expect inside each
+// Tangled workflow's Raw YAML body. Only the Buildkite `pipeline`
+// slug is required; everything else is optional. Fields nest under
+// `tack: { buildkite: ... }` so the workflow YAML can grow other
+// top-level keys (Tangled's own scheduling fields, future provider
+// blocks) without colliding with our namespace.
+//
+// Fields map onto the Buildkite REST "Create a build" request
+// properties documented at
+// https://buildkite.com/docs/apis/rest-api/builds#create-a-build
+// (see also the comment block on buildkite.CreateBuildRequest). We
+// expose only the small subset users genuinely need: routing
+// (pipeline/org), a message override, commit/branch pins (useful
+// for manual triggers, which carry no commit), and env/meta_data
+// extensions that merge on top of the identity values tack injects
+// itself.
+type workflowConfig struct {
+	Tack tackConfig `yaml:"tack"`
+}
+
+// tackConfig is the per-provider block under the top-level `tack:`
+// key. Right now the only nested provider is Buildkite.
+type tackConfig struct {
+	Buildkite buildkiteConfig `yaml:"buildkite"`
+}
+
+// buildkiteConfig is the Buildkite-specific subset of workflowConfig.
+//
+// `org` lets a workflow target a Buildkite organisation other than
+// the spindle's default — useful when one tack instance fronts
+// multiple orgs. The configured API token must have access to that
+// org or the build creation request will 401/403; we surface that
+// error verbatim rather than guessing.
+//
+// `clean_checkout` is forwarded verbatim to Buildkite. CleanCheckout
+// is a *bool so omitting it leaves Buildkite's own default in place
+// instead of always shipping `false`. Author reuses the API
+// client's author type and is likewise forwarded verbatim on the
+// create request.
+type buildkiteConfig struct {
+	Pipeline      string            `yaml:"pipeline"`
+	Org           string            `yaml:"org"`
+	Message       string            `yaml:"message"`
+	Commit        string            `yaml:"commit"`
+	Branch        string            `yaml:"branch"`
+	Env           map[string]string `yaml:"env"`
+	MetaData      map[string]string `yaml:"meta_data"`
+	CleanCheckout *bool             `yaml:"clean_checkout"`
+	Author        *buildkite.Author `yaml:"author"`
+}
+
+// parseWorkflowConfig decodes a workflow YAML body into workflowConfig.
+// An empty body is treated as a structural error so spawnWorkflow can
+// short-circuit cleanly: a workflow with no body has nothing for tack
+// to do anyway.
+func parseWorkflowConfig(raw string) (*buildkiteConfig, error) {
+	if strings.TrimSpace(raw) == "" {
+		return nil, errors.New("workflow body is empty")
+	}
+	var cfg workflowConfig
+	if err := yaml.Unmarshal([]byte(raw), &cfg); err != nil {
+		return nil, fmt.Errorf("parse workflow yaml: %w", err)
+	}
+	bk := cfg.Tack.Buildkite
+	if bk.Pipeline == "" {
+		return nil, errors.New("workflow yaml: `tack.buildkite.pipeline` is required")
+	}
+	return &bk, nil
+}
+
 // buildkiteProvider implements Provider.
 //
 // webhookSecret + webhookMode live on the provider rather than on
···
 // "everything Buildkite-y": colocating the auth knob with the API
 // client and the state translator keeps configuration drift to one
 // place and makes the http.go side pure transport.
+//
+// defaultOrg is the Buildkite organisation the configured API token
+// belongs to. Workflows may opt into a different org via their YAML
+// `org` field; the API token then needs to be authorised against it.
 type buildkiteProvider struct {
 	br            *broker
 	st            *store
 	log           *slog.Logger
 	client        *buildkite.Client
-	pipelineSlug  string
+	defaultOrg    string
 	webhookSecret string
 	webhookMode   buildkite.WebhookMode
 }
···
 var _ Provider = (*buildkiteProvider)(nil)
 
 // newBuildkiteProvider wires a provider to its Buildkite client and
-// to the broker it publishes pipeline.status records on. pipelineSlug
-// is the Buildkite pipeline that all builds get fired on (see file
-// header for why there's only one). webhookSecret/webhookMode govern
-// inbound /webhooks/buildkite request authentication.
+// to the broker it publishes pipeline.status records on. defaultOrg
+// is the org the API token authenticates against and the org used
+// when a workflow doesn't specify its own. webhookSecret/webhookMode
+// govern inbound /webhooks/buildkite request authentication.
 func newBuildkiteProvider(
 	br *broker,
 	st *store,
 	client *buildkite.Client,
-	pipelineSlug string,
+	defaultOrg string,
 	webhookSecret string,
 	webhookMode buildkite.WebhookMode,
 	log *slog.Logger,
···
 		st:            st,
 		log:           log.With("component", "provider", "kind", "buildkite"),
 		client:        client,
-		pipelineSlug:  pipelineSlug,
+		defaultOrg:    defaultOrg,
 		webhookSecret: webhookSecret,
 		webhookMode:   webhookMode,
 	}
···
 }
 
 // Spawn satisfies Provider. For each workflow it fires a separate
-// Buildkite build off the configured pipeline so each workflow gets
-// its own status timeline. The actual API call runs on a goroutine —
-// CreateBuild is one HTTP round-trip, but we still want Spawn to be
-// non-blocking per the interface contract.
+// Buildkite build off the pipeline named in that workflow's YAML so
+// each workflow gets its own status timeline. The actual API call
+// runs on a goroutine — CreateBuild is one HTTP round-trip, but we
+// still want Spawn to be non-blocking per the interface contract.
 //
 // On a successful create we persist the build UUID → (knot, rkey,
 // workflow) mapping and publish a "pending" pipeline.status so the
···
 		return
 	}
 
-	// Derive build inputs once. Every workflow on this trigger
-	// targets the same commit/branch — only the workflow name
-	// varies between the per-workflow goroutines below.
-	commit, branch := triggerCommitAndBranch(trigger)
-	if commit == "" {
-		// Buildkite's create-build API requires a commit; we'd
-		// rather log loudly and skip than fire builds on "HEAD"
-		// and silently get whatever main happens to look like.
-		p.log.Error("trigger has no commit; refusing to spawn",
-			"knot", knot, "rkey", pipelineRkey,
-		)
-		return
-	}
-
 	for _, wf := range workflows {
 		if wf == nil || wf.Name == "" {
 			continue
 		}
 		wf := wf
-		go p.spawnWorkflow(ctx, knot, pipelineRkey, commit, branch, wf)
+		go p.spawnWorkflow(ctx, knot, pipelineRkey, trigger, wf)
 	}
 }
 
···
 	ctx context.Context,
 	knot string,
 	pipelineRkey string,
-	commit string,
-	branch string,
+	trigger *tangled.Pipeline_TriggerMetadata,
 	wf *tangled.Pipeline_Workflow,
 ) {
 	logger := p.log.With(
···
 		"workflow", wf.Name,
 	)
 
-	pipelineURI := pipelineATURI(knot, pipelineRkey)
-	meta := map[string]string{
-		bkMetaKnot:         knot,
-		bkMetaPipelineRkey: pipelineRkey,
-		bkMetaWorkflow:     wf.Name,
+	cfg, err := parseWorkflowConfig(wf.Raw)
+	if err != nil {
+		// Bad workflow YAML is a user-facing config error: log it
+		// loudly and skip. Firing a build off some default would
+		// be more confusing than doing nothing.
+		logger.Error("invalid workflow config; refusing to spawn", "err", err)
+		return
 	}
-	env := envFromTuple(knot, pipelineRkey, wf)
+	logger = logger.With("pipeline", cfg.Pipeline)
 
-	req := buildkite.CreateBuildRequest{
-		Commit:                      commit,
-		Branch:                      branch,
-		Message:                     fmt.Sprintf("tangled: %s", wf.Name),
-		Env:                         env,
-		MetaData:                    meta,
-		IgnorePipelineBranchFilters: true,
+	req, err := p.buildCreateRequest(cfg, trigger, knot, pipelineRkey, wf)
+	if err != nil {
+		logger.Error("build create request", "err", err)
+		return
 	}
 
-	build, err := p.client.CreateBuild(ctx, p.pipelineSlug, req)
+	org := cfg.Org
+	if org == "" {
+		org = p.defaultOrg
+	}
+
+	build, err := p.client.CreateBuild(ctx, org, cfg.Pipeline, req)
 	if err != nil {
-		logger.Error("create buildkite build", "err", err)
+		logger.Error("create buildkite build", "err", err, "org", org)
 		return
 	}
 	logger.Info("buildkite build created",
 		"build_uuid", build.ID,
 		"build_number", build.Number,
 		"web_url", build.WebURL,
+		"org", org,
 	)
 
+	pipelineURI := pipelineATURI(knot, pipelineRkey)
 	if err := p.st.InsertBuildkiteBuild(ctx, BuildkiteBuildRef{
 		BuildUUID:    build.ID,
 		BuildNumber:  build.Number,
-		PipelineSlug: p.pipelineSlug,
+		PipelineSlug: cfg.Pipeline,
 		Knot:         knot,
 		PipelineRkey: pipelineRkey,
 		Workflow:     wf.Name,
···
 	}
 }
 
+// buildCreateRequest folds the parsed workflow config and the
+// Tangled trigger metadata into a single Buildkite create-build
+// payload. Trigger metadata supplies commit/branch (overridable
+// from the workflow YAML); the workflow YAML supplies the Buildkite
+// routing knobs (pipeline/org) and the small handful of build
+// options we expose.
+//
+// `ignore_pipeline_branch_filters` is hard-coded to true: Tangled
+// refs frequently don't match arbitrary Buildkite pipeline branch
+// filters, and a build silently dropped at create time is a worse
+// failure mode than running one we shouldn't have. Users wanting
+// the filter back are expected to drop the filter on the Buildkite
+// pipeline itself.
+//
+// Returns an error when neither the trigger nor the YAML supplies a
+// commit — Buildkite's API requires one and we'd rather log+skip
+// than fire a build that resolves to "whatever main happens to be".
+func (p *buildkiteProvider) buildCreateRequest(
+	cfg *buildkiteConfig,
+	trigger *tangled.Pipeline_TriggerMetadata,
+	knot, pipelineRkey string,
+	wf *tangled.Pipeline_Workflow,
+) (buildkite.CreateBuildRequest, error) {
+	commit, branch := triggerCommitAndBranch(trigger)
+	if cfg.Commit != "" {
+		commit = cfg.Commit
+	}
+	if cfg.Branch != "" {
+		branch = cfg.Branch
+	}
+	if commit == "" {
+		return buildkite.CreateBuildRequest{}, errors.New(
+			"trigger has no commit and workflow yaml sets none",
+		)
+	}
+
+	message := fmt.Sprintf("tangled: %s", wf.Name)
+	if cfg.Message != "" {
+		message = cfg.Message
+	}
+
+	// User env/meta_data merge on top of tack's identity defaults;
+	// on a key collision the user value wins.
+	env := envFromTuple(knot, pipelineRkey, wf)
+	for k, v := range cfg.Env {
+		env[k] = v
+	}
+	meta := map[string]string{
+		bkMetaKnot:         knot,
+		bkMetaPipelineRkey: pipelineRkey,
+		bkMetaWorkflow:     wf.Name,
+	}
+	for k, v := range cfg.MetaData {
+		meta[k] = v
+	}
+
+	cleanCheckout := false
+	if cfg.CleanCheckout != nil {
+		cleanCheckout = *cfg.CleanCheckout
+	}
+
+	req := buildkite.CreateBuildRequest{
+		Commit:                      commit,
+		Branch:                      branch,
+		Message:                     message,
+		Env:                         env,
+		MetaData:                    meta,
+		Author:                      cfg.Author,
+		CleanCheckout:               cleanCheckout,
+		IgnorePipelineBranchFilters: true,
+	}
+
+	// Auto-populate Buildkite's PR fields from the Tangled PR
+	// trigger when present. Buildkite doesn't get a PR number from
+	// us (Tangled doesn't surface one through the trigger), but
+	// the base branch alone is enough for `pull_request_base_branch`-
+	// gated step filters to work.
+	if trigger != nil && trigger.PullRequest != nil {
+		req.PullRequestBaseBranch = trigger.PullRequest.TargetBranch
+	}
+
+	return req, nil
+}
+
 // Logs satisfies Provider. We resolve the (knot, rkey, workflow)
 // tuple to a Buildkite build via the store, fetch the current jobs
 // list, then drain each job's plain-text log into the channel as one
···
 ) (<-chan LogLine, error) {
 	ref, err := p.st.LookupBuildkiteBuildByTuple(ctx, knot, pipelineRkey, workflow)
 	if err != nil {
-		return nil, fmt.Errorf("lookup build for logs: %w", err)
+		return nil, fmt.Errorf("lookup build mapping: %w", err)
 	}
 	if ref == nil {
 		return nil, ErrLogsNotFound
 	}
 
-	// Fresh fetch so we get the current job set, not whatever was
-	// returned at create time (when most jobs are still nil). The
-	// upstream's not-found is mapped to the Provider-shaped one
-	// here because the /logs handler only knows about ErrLogsNotFound.
-	build, err := p.client.GetBuild(ctx, ref.PipelineSlug, ref.BuildNumber)
+	// Resolve the org against which we should pull jobs/logs. We
+	// don't persist it on BuildkiteBuildRef today (the slug + token
+	// have always been enough); fall back to the provider default,
+	// which is correct for the common single-org install. A
+	// future migration can add a column when multi-org installs
+	// need it.
+	org := p.defaultOrg
+
+	build, err := p.client.GetBuild(ctx, org, ref.PipelineSlug, ref.BuildNumber)
 	if err != nil {
 		if errors.Is(err, buildkite.ErrNotFound) {
 			return nil, ErrLogsNotFound
 		}
-		return nil, fmt.Errorf("get build for logs: %w", err)
+		return nil, fmt.Errorf("get build: %w", err)
 	}
 
-	out := make(chan LogLine, 64)
+	out := make(chan LogLine, 32)
 	go func() {
 		defer close(out)
 		stepID := 0
 		for _, job := range build.Jobs {
-			// Only "script" jobs have agent-produced logs.
-			// Waiter / manual / trigger jobs have no body to
-			// fetch; skip them so we don't hit Buildkite with
-			// 404-bound requests.
 			if job.Type != "" && job.Type != "script" {
+				// Skip non-script jobs (waiter, manual,
+				// trigger). They have no log to fetch and
+				// surfacing empty steps just clutters the
+				// appview.
 				continue
 			}
-
 			name := job.Name
 			if name == "" {
-				name = job.ID
+				name = fmt.Sprintf("job %s", job.ID)
 			}
 
-			// Job-level start frame so the appview can bound
-			// timing per job.
 			if !sendLine(ctx, out, LogLine{
 				Kind: LogKindControl,
 				Time: time.Now(),
···
 				return
 			}
 
-			body, err := p.client.GetJobLog(ctx, ref.PipelineSlug, ref.BuildNumber, job.ID)
+			body, err := p.client.GetJobLog(ctx, org, ref.PipelineSlug, ref.BuildNumber, job.ID)
 			if err != nil {
 				p.log.Debug("fetch job log",
 					"err", err,
···
 		return trigger.Push.NewSha, refToBranch(trigger.Push.Ref)
 	case trigger.PullRequest != nil:
 		// PRs build the source commit on the source branch.
-		// Buildkite's pipeline can opt into PR-aware behaviour
-		// via pull_request_id (not currently plumbed through).
 		return trigger.PullRequest.SourceSha, trigger.PullRequest.SourceBranch
 	default:
 		// Manual triggers and any future kinds: fall back to the
provider_buildkite_test.go  (+124 −4)
···
 	logger := slog.Default()
 	p := newBuildkiteProvider(
 		br, st,
-		buildkite.NewClient("tok", "myorg"),
-		"mypipe",
+		buildkite.NewClient("tok"),
+		"myorg",
 		secret, mode,
 		logger,
 	)
···
 		},
 	}
 	workflows := []*tangled.Pipeline_Workflow{
-		{Name: "test.yml", Raw: "steps:\n  - run: true\n"},
+		{Name: "test.yml", Raw: "tack:\n  buildkite:\n    pipeline: mypipe\n"},
 	}
 
 	p.Spawn(context.Background(), "knot.example.com", "rkey-1", trigger, workflows)
···
 
 	p.Spawn(context.Background(), "knot.example.com", "rkey-1",
 		&tangled.Pipeline_TriggerMetadata{Manual: &tangled.Pipeline_ManualTriggerData{}},
-		[]*tangled.Pipeline_Workflow{{Name: "test.yml"}},
+		[]*tangled.Pipeline_Workflow{{Name: "test.yml",
+			Raw: "tack:\n  buildkite:\n    pipeline: mypipe\n"}},
 	)
 
 	// Give any rogue goroutine a moment.
 	time.Sleep(50 * time.Millisecond)
 	if called {
 		t.Fatal("CreateBuild called despite missing commit")
+	}
+	rows, _ := st.EventsAfter(context.Background(), 0)
+	if len(rows) != 0 {
+		t.Fatalf("got %d events, want 0", len(rows))
+	}
+}
+
+// TestBuildkiteSpawnWorkflowConfig pins the YAML → create-build
+// translation: pipeline + org from YAML pick the request URL,
+// message/env/meta_data come through, and the trigger's PR target
+// branch lands as `pull_request_base_branch`. Together these cover
+// the "smuggle Buildkite parameters through workflow YAML" path.
+func TestBuildkiteSpawnWorkflowConfig(t *testing.T) {
+	type captured struct {
+		path string
+		body buildkite.CreateBuildRequest
+	}
+	gotCh := make(chan captured, 1)
+	bk := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		var body buildkite.CreateBuildRequest
+		_ = json.NewDecoder(r.Body).Decode(&body)
+		gotCh <- captured{path: r.URL.Path, body: body}
+		w.WriteHeader(http.StatusCreated)
+		_ = json.NewEncoder(w).Encode(buildkite.Build{ID: "uuid-9", Number: 9})
+	})
+	p, _, _, _ := newBuildkiteTestProvider(t, buildkite.WebhookModeToken, "s", bk)
+
+	raw := strings.Join([]string{
+		"tack:",
+		"  buildkite:",
+		"    pipeline: workflow-pipe",
+		"    org: workflow-org",
+		"    message: smuggled message",
+		"    env:",
+		"      CUSTOM: value",
+		"    meta_data:",
+		"      custom: meta",
+		"    clean_checkout: true",
+		"    author:",
+		"      name: Author",
+		"      email: a@example.com",
+	}, "\n") + "\n"
+
+	trigger := &tangled.Pipeline_TriggerMetadata{
+		PullRequest: &tangled.Pipeline_PullRequestTriggerData{
+			SourceSha:    "deadbeef",
+			SourceBranch: "feature",
+			TargetBranch: "main",
+		},
+	}
+
+	p.Spawn(context.Background(), "knot.example.com", "rkey-x", trigger,
+		[]*tangled.Pipeline_Workflow{{Name: "ci.yml", Raw: raw}})
+
+	select {
+	case got := <-gotCh:
+		// URL must reflect YAML org + pipeline.
+		if !strings.Contains(got.path, "/organizations/workflow-org/pipelines/workflow-pipe/") {
+			t.Fatalf("path = %q", got.path)
+		}
+		if got.body.Commit != "deadbeef" || got.body.Branch != "feature" {
+			t.Fatalf("commit/branch = %q/%q", got.body.Commit, got.body.Branch)
+		}
+		if got.body.Message != "smuggled message" {
+			t.Fatalf("message = %q", got.body.Message)
+		}
+		if got.body.Env["CUSTOM"] != "value" {
+			t.Fatalf("env[CUSTOM] missing: %+v", got.body.Env)
+		}
+		// tack defaults must still be present (user keys merge,
+		// don't replace).
+		if got.body.Env["TACK_WORKFLOW"] != "ci.yml" {
+			t.Fatalf("env[TACK_WORKFLOW] missing: %+v", got.body.Env)
+		}
+		if got.body.MetaData["custom"] != "meta" ||
+			got.body.MetaData[bkMetaWorkflow] != "ci.yml" {
+			t.Fatalf("meta_data wrong: %+v", got.body.MetaData)
+		}
+		if !got.body.CleanCheckout {
+			t.Fatalf("clean_checkout not set")
+		}
+		// IgnorePipelineBranchFilters is always true (see
+		// buildCreateRequest).
+		if !got.body.IgnorePipelineBranchFilters {
+			t.Fatalf("ignore_pipeline_branch_filters not defaulted to true")
+		}
+		if got.body.PullRequestBaseBranch != "main" {
+			t.Fatalf("pr base branch = %q; want main",
+				got.body.PullRequestBaseBranch)
+		}
+		if got.body.Author == nil || got.body.Author.Email != "a@example.com" {
+			t.Fatalf("author = %+v", got.body.Author)
+		}
+	case <-time.After(2 * time.Second):
+		t.Fatal("CreateBuild not called")
+	}
+}
+
+// TestBuildkiteSpawnInvalidYAML proves a workflow without the
+// required `pipeline` field is skipped — no API call, no DB row, no
+// status. A misconfigured workflow shouldn't be silently swept onto
+// some default pipeline.
+func TestBuildkiteSpawnInvalidYAML(t *testing.T) {
+	called := false
+	bk := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		called = true
+	})
+	p, st, _, _ := newBuildkiteTestProvider(t, buildkite.WebhookModeToken, "s", bk)
+
+	p.Spawn(context.Background(), "knot.example.com", "rkey-z",
+		&tangled.Pipeline_TriggerMetadata{
+			Push: &tangled.Pipeline_PushTriggerData{NewSha: "abc", Ref: "refs/heads/main"},
+		},
+		[]*tangled.Pipeline_Workflow{
+			{Name: "broken.yml", Raw: "steps:\n  - run: true\n"},
+		},
+	)
+
+	time.Sleep(50 * time.Millisecond)
+	if called {
+		t.Fatal("CreateBuild called for workflow missing pipeline")
 	}
 	rows, _ := st.EventsAfter(context.Background(), 0)
 	if len(rows) != 0 {