feat: read-through indexing job processing
- job retry with exponential backoff
- update local SQLite configuration with PRAGMA settings for better concurrency
- smoke checks/tests for API endpoints using uv
···
Start the Ionic/Vite app:

```bash
-pnpm dev
-# or: just dev
+pnpm dev # or: just dev
```

That serves the client from `apps/twisted` with Vite.

-To run the Go API locally, make sure `packages/api/.env` has at least:
+To run the Go API locally for routine experimentation, no Turso credentials are required.
+
+Start the API in local file mode:
+
+```bash
+pnpm api:run:api # or: just api-dev
+```

-- `TURSO_DATABASE_URL`
-- `TURSO_AUTH_TOKEN`
+This serves the API and search site on `http://localhost:8080` using
+`packages/api/twister-dev.db`.

-Then start the API:
+To run the API against remote Turso instead:

```bash
-pnpm api:run:api
-# or: just api-dev
+just api-dev remote
```

-This serves the API and search site on `http://localhost:8080`.
+To run the indexer in local file mode as well:

-To run the indexer as well, `packages/api/.env` also needs:
+```bash
+pnpm api:run:indexer # or: just api-run-indexer
+```
+
+To run the indexer against remote Turso, `packages/api/.env` needs:

- `TAP_URL`
- `TAP_AUTH_PASSWORD`
- `INDEXED_COLLECTIONS`

-Then start the indexer in a separate terminal:
-
```bash
-pnpm api:run:indexer
-# or: just api-run-indexer
+just api-run-indexer remote
```

Typical local setup is three terminals:
···
Highest priority. This work blocks further investment in semantic search, hybrid ranking, and broader discovery features.

-- [ ] Stabilize local development and experimentation around a local `file:` database
+- [x] Stabilize local development and experimentation around a local `file:` database
- [x] Document backup, restore, and disk-growth procedures for the experimental local DB
- [x] Research production backend options: PostgreSQL, Turso remote/libSQL, and Turso embedded replicas
- [x] Write a production storage decision record with workload and operational tradeoffs, using `docs/adr/pg.md` and `docs/adr/turso.md`
- [x] Define the migration path from the experimental local setup to the chosen production backend
-- [ ] Add cURL smoke tests for `healthz`, `readyz`, `search`, `documents`, indexing, and activity in `scripts/api/`
+- [x] Add a durable read-through indexing job queue for records fetched through the API
+- [x] Add API smoke tests for `healthz`, `readyz`, `search`, `documents`, indexing, and activity in `scripts/api/`
- desertthunder.dev DID: `did:plc:xg2vq45muivyy3xwatcehspu`
- Twisted AT URI: `at://did:plc:xg2vq45muivyy3xwatcehspu/sh.tangled.repo/3mho6hukiei22`
- Profile AT URI: `at://did:plc:xg2vq45muivyy3xwatcehspu/sh.tangled.actor.profile/self`
- Follow AT URI (desertthunder.dev follows npmx): `at://did:plc:xg2vq45muivyy3xwatcehspu/sh.tangled.graph.follow/3mhofstanru22`
- Star AT URI (desertthunder.dev stars microcosm-rs): `at://did:plc:lulmyldiq4sb2ikags5sfb25/sh.tangled.repo/3lvsxzinfz222`
- ~~Add `just` targets for smoke-test runs locally and against a remote base URL~~ directly invoking the scripts is fine.
-- [ ] Add a durable read-through indexing job queue for records fetched through the API
- [ ] Reuse the existing normalization and upsert path for on-demand indexing jobs
- [ ] Trigger indexing jobs from repo, issue, PR, profile, and similar fetch handlers
- [ ] Add dedupe, retries, and observability for indexing jobs
justfile (+6 −4)
···
api-build:
    just --justfile packages/api/justfile build

-api-dev:
-    just --justfile packages/api/justfile run-api
+# Run API. Usage: just api-dev [mode], mode: local|remote (default local)
+api-dev mode="local":
+    just --justfile packages/api/justfile run-api {{mode}}

-api-run-indexer:
-    just --justfile packages/api/justfile run-indexer
+# Run indexer. Usage: just api-run-indexer [mode], mode: local|remote (default local)
+api-run-indexer mode="local":
+    just --justfile packages/api/justfile run-indexer {{mode}}

api-test:
    just --justfile packages/api/justfile test
···
The server listens on `:8080` by default. Logs are printed as text when `--local` is set.

+## API Smoke Tests
+
+Smoke checks for the API surface live in a uv-managed Python project at
+`scripts/api/`.
+
+From the repo root:
+
+```sh
+uv run --project scripts/api twister-api-smoke
+```
+
+Optional base URL override:
+
+```sh
+TWISTER_API_BASE_URL=http://localhost:8080 \
+  uv run --project scripts/api twister-api-smoke
+```
+
## Experimental Local DB Operations

The experimental local database lives at `packages/api/twister-dev.db` when you run Twister from `packages/api` with `--local`.
packages/api/internal/api/actors.go (+16)
···
	if err != nil {
		return nil, fmt.Errorf("list repos for %s: %w", actor.DID, err)
	}
+	s.enqueueXRPCList(r.Context(), entries)

	for _, entry := range entries {
		name, _ := entry.Value["name"].(string)
···
		s.actorError(w, err)
		return
	}
+	s.enqueueXRPCRecord(r.Context(), rec.URI, rec.CID, rec.Value)

	var bsky *bskyProfileResponse
	if linked, _ := rec.Value["bluesky"].(bool); linked {
···
		s.actorError(w, err)
		return
	}
+	s.enqueueXRPCList(r.Context(), entries)

	records := make([]recordEntry, len(entries))
	for i, e := range entries {
···
		s.actorError(w, err)
		return
	}
+	s.enqueueXRPCRecord(r.Context(), rec.URI, rec.CID, rec.Value)

	writeJSON(w, http.StatusOK, map[string]any{
		"did": repo.DID,
···
		writeJSON(w, http.StatusBadGateway, errorBody("upstream_error", "failed to fetch issues"))
		return
	}
+	s.enqueueXRPCList(r.Context(), issues)

	var records []issueEntry
	for _, e := range issues {
···
		writeJSON(w, http.StatusBadGateway, errorBody("upstream_error", "failed to fetch pulls"))
		return
	}
+	s.enqueueXRPCList(r.Context(), pulls)

	var records []pullEntry
	for _, e := range pulls {
···
		writeJSON(w, http.StatusBadGateway, errorBody("upstream_error", "failed to fetch issues"))
		return
	}
+	s.enqueueXRPCList(r.Context(), issues)

	records := make([]issueEntry, len(issues))
	for i, e := range issues {
···
		writeJSON(w, http.StatusBadGateway, errorBody("upstream_error", "failed to fetch pulls"))
		return
	}
+	s.enqueueXRPCList(r.Context(), pulls)

	records := make([]pullEntry, len(pulls))
	for i, e := range pulls {
···
		writeJSON(w, http.StatusBadGateway, errorBody("upstream_error", "failed to fetch follows"))
		return
	}
+	s.enqueueXRPCList(r.Context(), entries)

	records := make([]recordEntry, len(entries))
	for i, e := range entries {
···
		writeJSON(w, http.StatusBadGateway, errorBody("upstream_error", "failed to fetch strings"))
		return
	}
+	s.enqueueXRPCList(r.Context(), entries)

	records := make([]recordEntry, len(entries))
	for i, e := range entries {
···
		s.actorError(w, err)
		return
	}
+	s.enqueueXRPCRecord(r.Context(), rec.URI, rec.CID, rec.Value)

	_, stateMap, err := s.fetchIssuesAndStates(r, actor.PDS, actor.DID)
	if err != nil {
···
		writeJSON(w, http.StatusBadGateway, errorBody("upstream_error", "failed to fetch comments"))
		return
	}
+	s.enqueueXRPCList(r.Context(), entries)

	var records []recordEntry
	for _, e := range entries {
···
		s.actorError(w, err)
		return
	}
+	s.enqueueXRPCRecord(r.Context(), rec.URI, rec.CID, rec.Value)

	_, statusMap, err := s.fetchPullsAndStatuses(r, actor.PDS, actor.DID)
	if err != nil {
···
		writeJSON(w, http.StatusBadGateway, errorBody("upstream_error", "failed to fetch comments"))
		return
	}
+	s.enqueueXRPCList(r.Context(), entries)

	var records []recordEntry
	for _, e := range entries {
···
	}

	stateMap := make(map[string]string, len(states))
+	s.enqueueXRPCList(r.Context(), states)
	for _, e := range states {
		issueURI, _ := e.Value["issue"].(string)
		state, _ := e.Value["state"].(string)
···
	}

	statusMap := make(map[string]string, len(statuses))
+	s.enqueueXRPCList(r.Context(), statuses)
	for _, e := range statuses {
		pullURI, _ := e.Value["pull"].(string)
		status, _ := e.Value["status"].(string)
···
+CREATE TABLE IF NOT EXISTS indexing_jobs (
+    document_id TEXT PRIMARY KEY,
+    did TEXT NOT NULL,
+    collection TEXT NOT NULL,
+    rkey TEXT NOT NULL,
+    cid TEXT NOT NULL,
+    record_json TEXT NOT NULL,
+    status TEXT NOT NULL,
+    attempts INTEGER NOT NULL DEFAULT 0,
+    last_error TEXT,
+    scheduled_at TEXT NOT NULL,
+    updated_at TEXT NOT NULL
+);
+
+CREATE INDEX IF NOT EXISTS idx_indexing_jobs_status_scheduled
+    ON indexing_jobs(status, scheduled_at, updated_at);
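The composite index above lines up with a worker claim query of roughly this shape. The worker itself is not part of the hunks shown, so the statements below are a sketch of how the schema is presumably consumed, not the actual indexer code:

```go
package main

import "fmt"

// claimSQL selects the oldest due pending job. The WHERE and ORDER BY columns
// mirror idx_indexing_jobs_status_scheduled (status, scheduled_at, updated_at),
// so SQLite can satisfy both the filter and the ordering from the index
// instead of scanning the table.
const claimSQL = `SELECT document_id, did, collection, rkey, cid, record_json, attempts
FROM indexing_jobs
WHERE status = 'pending' AND scheduled_at <= ?
ORDER BY scheduled_at, updated_at
LIMIT 1`

// markFailedSQL records a failure and reschedules the job; the caller binds a
// new scheduled_at computed from the incremented attempt count (the
// exponential backoff named in the commit message).
const markFailedSQL = `UPDATE indexing_jobs
SET status = 'pending', attempts = attempts + 1, last_error = ?,
    scheduled_at = ?, updated_at = ?
WHERE document_id = ?`

func main() {
	fmt.Println(claimSQL)
	fmt.Println(markFailedSQL)
}
```

Keeping `last_error` and `attempts` on the row also gives the observability the task list asks for: a plain `SELECT` over `indexing_jobs` shows what is stuck and why.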
···
build:
    CGO_ENABLED=0 go build -ldflags "{{ldflags}}" -o twister ./main.go

-run-api:
-    go run -ldflags "{{ldflags}}" ./main.go api
+# Run the API server. Usage: just run-api [mode], mode: local|remote (default local)
+run-api mode="local":
+    if [ "{{mode}}" = "local" ]; then \
+        go run -ldflags "{{ldflags}}" ./main.go api --local; \
+    elif [ "{{mode}}" = "remote" ]; then \
+        go run -ldflags "{{ldflags}}" ./main.go api; \
+    else \
+        echo "invalid mode '{{mode}}' (expected local or remote)" >&2; \
+        exit 1; \
+    fi

-run-indexer:
-    go run -ldflags "{{ldflags}}" ./main.go indexer
+# Run the indexer. Usage: just run-indexer [mode], mode: local|remote (default local)
+run-indexer mode="local":
+    if [ "{{mode}}" = "local" ]; then \
+        go run -ldflags "{{ldflags}}" ./main.go indexer --local; \
+    elif [ "{{mode}}" = "remote" ]; then \
+        go run -ldflags "{{ldflags}}" ./main.go indexer; \
+    else \
+        echo "invalid mode '{{mode}}' (expected local or remote)" >&2; \
+        exit 1; \
+    fi

test:
    go test ./...
scripts/api/README.md (+24)
···
+# Twister API Smoke Checks
+
+Python smoke checks for Twister API endpoints, managed with uv.
+
+## Usage
+
+From the repo root:
+
+```sh
+# Run all
+uv run --project scripts/api twister-api-smoke
+# Run specific checks (healthz | readyz | search | documents | indexing | activity)
+uv run --project scripts/api twister-api-smoke --check healthz
+```
+
+## Options
+
+- `--verbose` for detailed output of API responses (JSON)
+- `--base-url` (or env `TWISTER_API_BASE_URL`, default `http://localhost:8080`)
+- `--query` for search check (default `twisted`)
+- `--document-id` for documents check
+- `--actor-handle` for indexing check (default `desertthunder.dev`)
+- `--repo-at-uri` for repo fixture indexing/search checks
+- `--profile-at-uri` for profile fixture indexing/search checks