Monorepo for Aesthetic.Computer (aesthetic.computer)

jeffrey-platter: scaffold index, archive, face-curation, neo-jeffrey gen

papers/jeffrey-platter/ — sub-platter index (sibling to whistlegraph-platter),
pointer-only README + canonical POI manifest.json (55 shoot + 38 masters + 38
candids = 131 entries, lifted from give.aesthetic.computer/index.html which
now fetches it at runtime via loadJeffreysManifest + buildImageIndex). sync.mjs
keeps system/public/give.aesthetic.computer/jeffreys-manifest.json in step.
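
For reference, the runtime load amounts to roughly this (a sketch, not the verbatim give-page code; the real `loadJeffreysManifest` lives in system/public/give.aesthetic.computer/index.html):

```js
// Sketch of the manifest fetch described above: pull the synced copy and
// hand it to the slideshow's image-index builder.
async function loadJeffreysManifest() {
  const res = await fetch("./jeffreys-manifest.json");
  if (!res.ok) throw new Error(`jeffreys-manifest fetch failed: ${res.status}`);
  return res.json();
}
// const manifest = await loadJeffreysManifest();
// buildImageIndex(manifest); // merge into the slideshow's image index
```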

portraits/jeffrey/ — bulk archive + face curation pipeline. Local-first, no
API dependency for the identity step:
  bin/ig-import-cookies.py   browser cookie → instaloader session (works around
                             2FA-flagged accounts where the password flow strips
                             the auth cookie)
  bin/ig-archive.fish        bulk pull of a profile (posts + highlights + stories,
                             fast-update incremental)
  bin/ig-index.mjs           per-account JSON summary by date
  bin/fetch-corpus.mjs       pulls CDN refs (shoot/masters/candids) per manifest
  bin/face-match.py          local insightface (buffalo_l) identity match against
                             reference embeddings — free, ~50 ms/image, 95%+
                             accurate on clear faces
  bin/face-describe.py       GPT-4o vision tags matched images with the rev2
                             scene-graph schema (subject/environment/photography/
                             domain/tags/caption_hint), with a cross-check on
                             face-match identity
  bin/face-browser.py        static HTML browser of curated/described records
                             with year/domain/sim filters and video preview
  bin/generate-neo.py        OpenAI gpt-image-1.5 / gpt-image-2 with multi-ref
                             identity grounding (default SHOOT_REFS; optional
                             --use-selfies adds 5 IG-platter selfies)

silo/instagrapi.service — systemd unit staged for IG bridge migration (not
deployed; the platter doesn't need it — the local instaloader cookie path
bypassed the original block)

reports/instagram-api-migration-2026-03-29.md — status update: bypass landed,
bulk archive complete

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

+2820 -66
+176
papers/jeffrey-platter/README.md
# Jeffrey Platter

An index of canonical photographic references for **Jeffrey Alan Scudder** — face, body, hands — across the repo, the assets CDN, the vault, and the social silos. A sub-platter within the [papers platter](../SCORE.md), parallel to [whistlegraph-platter](../whistlegraph-platter/).

> The platter exists so that any pipeline (CV layout, image generation, press kit, lecture slide, video composite) can resolve "a photograph of jeffrey" to a concrete URL or file path with known POIs, focal points, and provenance — instead of reaching for whatever's nearby.

This is **index-only**: pointers to where canonical images live. Masters live on the assets CDN and in the vault. No binary duplication into `papers/`.

---

## 1. In this repo

### Pieces (capture surfaces)

These are the AC pieces that produce new photographic frames of jeffrey when he uses them:

- [system/public/aesthetic.computer/disks/selfie.mjs](../../system/public/aesthetic.computer/disks/selfie.mjs) — *Selfie* — decorated photographic selfie palette (designed with @mollysoda)
- [system/public/aesthetic.computer/disks/snap.mjs](../../system/public/aesthetic.computer/disks/snap.mjs) — *Snap* — still photo capture, saves to painting
- [system/public/aesthetic.computer/disks/cap.mjs](../../system/public/aesthetic.computer/disks/cap.mjs) — *Cap* — video capture (caps/tapes), portrait-native
- [system/public/aesthetic.computer/disks/camera.mjs](../../system/public/aesthetic.computer/disks/camera.mjs) — base camera piece
- [system/public/aesthetic.computer/disks/phand.mjs](../../system/public/aesthetic.computer/disks/phand.mjs) — *Phand* — peter-hand palette (hand reference frames)

### POI manifest (single source of truth)

- **[manifest.json](manifest.json)** — POI manifest for the three CDN buckets (55 `shoot/` headshots + 38 masters + 38 candids = 131 entries). Each item carries `focal: [x%, y%]`, `pois: [{t: f|b|h, box}]`, and `aspect`. POIs originally detected via OpenCV DNN + Haar cascades. Resolving the percentages to pixels is sketched after the consumers list below.
- **[sync.mjs](sync.mjs)** — copies `manifest.json` to consumer-served paths. Run after editing the manifest. Currently writes one target: `system/public/give.aesthetic.computer/jeffreys-manifest.json`.

### Consumers (where canonical jeffrey-images are already used)

- [system/public/justanothersystem.org/cv.html](../../system/public/justanothersystem.org/cv.html#L485-L488) — bio portrait slideshow with Ken Burns; pulls `jeffery-av--07/06/10.jpg` from the `shoot/` bucket. OG/Twitter card image is `jeffery-av--07.jpg`. *Hardcoded URLs; doesn't read the manifest yet (TBD).*
- [system/public/give.aesthetic.computer/index.html](../../system/public/give.aesthetic.computer/index.html) — full Jeffreys Ken Burns canvas slideshow. Fetches `./jeffreys-manifest.json` at runtime via `loadJeffreysManifest()` and merges it into the slideshow's image index (`buildImageIndex()`). Originally inlined the same data as JS object literals; lifted to JSON 2026-04-28.
- [recap/](../../recap/) — the recap pipeline uses the **`jeffrey-pvc` Professional Voice Clone** via `/api/say`. The audio analogue of canonical-jeffrey: not visual, but it lives in the same canonical-self category. See [recap/SCORE.md](../../recap/SCORE.md).
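
All geometry in the manifest is percentage-based, so consumers scale it at render time. A minimal sketch of resolving one item to pixels (`resolvePois` is an illustrative name, not repo code; it assumes `aspect` is width ÷ height, which matches the portrait values in the manifest):

```js
// Resolve a manifest item's percentage-based POI geometry to pixels.
// `focal` is [x%, y%]; each POI `box` is [x%, y%, w%, h%] of the full frame.
function resolvePois(item, widthPx) {
  const heightPx = Math.round(widthPx / item.aspect); // aspect = width / height
  const toPx = ([x, y, w, h]) => ({
    x: Math.round((x / 100) * widthPx),
    y: Math.round((y / 100) * heightPx),
    w: Math.round((w / 100) * widthPx),
    h: Math.round((h / 100) * heightPx),
  });
  return {
    focal: {
      x: Math.round((item.focal[0] / 100) * widthPx),
      y: Math.round((item.focal[1] / 100) * heightPx),
    },
    pois: item.pois.map((p) => ({ type: p.t, ...toPx(p.box) })),
  };
}
```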
### Adjacent textual / video material

- [papers/cv/cv.tex](../cv/cv.tex), [papers/cv/cv.pdf](../cv/cv.pdf) — formal CV (text + dates)
- [grants/lacma-2026/](../../grants/lacma-2026/) — LACMA pitch package; uses jeffrey-pvc narration + 6 photos from `jeffreys/jpg/` (per `SESSION-LOG.md` line 195)
- TBD: lecture recordings cataloged under [papers/lectures/](../lectures/) — cross-reference any with on-camera jeffrey

---

## 2. External (assets CDN + sites)

### `assets.aesthetic.computer/jeffreys/`

Hosted on DigitalOcean Spaces. Sync via `npm run assets:sync:down` / `npm run assets:sync:up` (see [CLAUDE.md](../../CLAUDE.md)).

#### `jeffreys/shoot/` — Professional AV photoshoot (face-focused headshots)

**55 headshots** named `jeffery-av--01.jpg` through `jeffery-av--55.jpg`, uploaded 2026-01-05. Bucket audited 2026-04-28 via `aws s3 ls`; all 55 are present and cataloged in [manifest.json](manifest.json) under `buckets.shoot.items`.

**Resolution tiers** (sizes vary 10× within the sequence — uploads were sized for different consumers):

| Tier     | Range        | Size       | Use                               |
| -------- | ------------ | ---------- | --------------------------------- |
| `master` | `--01..--10` | 20–26 MB   | Training crops, print, archival   |
| `mid`    | `--11..--35` | 1.7–5.3 MB | Slideshows, web reference         |
| `web`    | `--36..--55` | 220–340 KB | Thumbnails / smallest backgrounds |

Per-tier annotation lives in each item's `tier` and `size` fields. All headshots share uniform POI framing: `aspect: 0.667`, `focal: [50, 35]`, `pois: [{t:"f", box:[30,15,40,40]}]` — face-centered portrait. The framing is a placeholder; a re-run of OpenCV face detection on `master`-tier frames would yield true bounding boxes.

Note the spelling: filenames use `jeffery-` (one r), not `jeffrey-`.

**Currently consumed**:

- give.aesthetic.computer Ken Burns slideshow — re-enabled 2026-04-28; merges all 55 into `allImagesData` (previously commented out — only 8 were cataloged then)
- justanothersystem.org cv.html — hardcoded URLs for `--06/07/10` (OG card uses `--07`)

#### `jeffreys/<NAME>.{heic,HEIC,jpeg,JPEG}` — iPhone master files

**38 masters** sit at the top of the `jeffreys/` prefix (not in a subdirectory), 1:1 with `jpg/` candids. These are the iPhone-original HEIC/JPEG files — the JPGs in `jpg/` are derivatives. Cataloged in [manifest.json](manifest.json) under `buckets.masters.items` with `candid_key` cross-references and per-file size.

Use this bucket for **training crops** — the masters are the camera originals and preserve detail and color depth that the JPEG re-encode loses (byte sizes are comparable between the two per the manifest; HEIC simply encodes more efficiently). Don't surface masters in slideshows (HEIC has spotty browser support); use the `jpg/` candids for those. Resolving a candid to its master URL is sketched just below.

Mixed extensions in the bucket: 32 are `heic`/`HEIC`, 6 are `jpeg`/`JPEG`. Manifest keys preserve the actual extension on the CDN.
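
The manifest encodes the candid ↔ master pairing in both directions (`candid_key` on masters, `master` on candids). A minimal sketch of resolving both URLs for one candid (`candidUrls` is an illustrative name, not repo code):

```js
// Build CDN URLs for a candid and its HEIC/JPEG master from the manifest.
// Candid keys omit the extension (key_includes_extension: false) and the
// url_pattern re-adds ".jpg"; master keys include the real extension.
function candidUrls(manifest, key) {
  const candid = manifest.buckets.candids.items[key];
  if (!candid) throw new Error(`unknown candid: ${key}`);
  return {
    candid: manifest.buckets.candids.url_pattern.replace("{name}", key),
    master: manifest.buckets.masters.url_pattern.replace("{name}", candid.master),
  };
}
// candidUrls(m, "IMG_2124").master
//   → "https://assets.aesthetic.computer/jeffreys/IMG_2124.jpeg"
```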
#### `jeffreys/jpg/` — Candids (JPG derivatives of the masters)

38 enumerated candids (`IMG_NNNN.jpg` + `FullSizeRender`) with hand-tuned focal points and POI bounding boxes. Full data lives in [manifest.json](manifest.json) under `buckets.candids.items`. Includes:

`FullSizeRender`, `IMG_0260`, `IMG_0675`, `IMG_0686`, `IMG_0688`, `IMG_0798`, `IMG_1111`, `IMG_1577`, `IMG_1616`, `IMG_1737`, `IMG_1809`, `IMG_2124`, `IMG_2208`, `IMG_2280`, `IMG_2498`, `IMG_2630`, `IMG_2658`, `IMG_2668`, `IMG_2905`, `IMG_2913`, `IMG_3017`, `IMG_3234`, `IMG_4281`, `IMG_4312`, `IMG_4606`, `IMG_4894`, `IMG_4997`, `IMG_5043`, `IMG_5050`, `IMG_5272`, `IMG_5644`, `IMG_6342`, `IMG_6367`, `IMG_6435`, `IMG_8080`, `IMG_8188`, `IMG_8989`, `IMG_9795`.

Bucket audited 2026-04-28 — exactly 38 entries, the manifest is complete, no orphans, no uncataloged files. Each candid has a `master` field pointing at its HEIC/JPEG counterpart in `jeffreys/`; prefer the master for training, use the JPG for slideshows and embeds.

POI types: `f` = face, `b` = body, `h` = hand. Aspects vary (0.562 / 0.563 / 0.667 / 0.75 / 0.8 / 1.333). `IMG_2124` and `IMG_2658` are referenced by name in [give.aesthetic.computer/index.html](../../system/public/give.aesthetic.computer/index.html) as the splash/background plates.

TBD: locate the script that generated the existing focal/POI values (OpenCV DNN + Haar cascades, per the give-page comment) and check it in to `portraits/jeffrey/bin/`. The manifest carries the *output* of that pipeline; the pipeline itself isn't in the repo yet.

### Social silos (canonical public faces)

- **Instagram (jeffrey solo, post-2023)** — https://www.instagram.com/whistlegraph/
- **Instagram (AC)** — https://www.instagram.com/aesthetic.computer/ — silo target account
- **TikTok @whistlegraph** — https://www.tiktok.com/@whistlegraph (~2.7M peak)
- **YouTube** — https://www.youtube.com/channel/UCZ_5AuCebRbm9t9_Y7SrckQ
- **X / Twitter @whistlegraph** — https://x.com/whistlegraph
- TBD: jas.life — closed-source / private; never surfaced through this index. Route any jas.life-sourced material through the vault, not the public CDN.

### Institutional pages carrying jeffrey portraits

- **KADIST artist page** — https://kadist.org/people/jeffrey-alan-scudder/ — TBD: confirm portrait usage and licensing
- **Schneider Museum of Art** — https://sma.sou.edu/whistlegraph/ — TBD: pull headshots used in event listings
- **Rhizome / New Museum *First Look*** — TBD: 2022-05-14 debut, may carry press portraits
- **Feral File** — https://feralfile.com/artists/Whistlegraph — TBD: artist page may carry a headshot

---

## 3. Local bulk archive (gitignored)

Bulk image data lives at [`portraits/jeffrey/`](../../portraits/jeffrey/) under gitignored subdirectories — these aren't secrets (the vault is for SECRET_FILES only), just bulky regenerable cache:

- **[`portraits/jeffrey/ig-archive/<account>/`](../../portraits/jeffrey/ig-archive/)** — Instagram bulk dumps. As of 2026-04-29: `whistlegraph/` ≈ 4.1 GB, 6,104 JPGs spanning 2014-08-23 → 2026-04-28 (posts + highlights + active stories). Pulled via `portraits/jeffrey/bin/ig-archive.fish`.
- **[`portraits/jeffrey/curated/`](../../portraits/jeffrey/curated/)** — outputs of the face-match → vision-describe pipeline: `jeffrey-match.jsonl` (insightface identity scores), `jeffrey-described.jsonl` (GPT-4o scene-graph metadata), `thumbnails/` (384px), `index.html` (static browser).
- **TBD: `portraits/jeffrey/selfies/`** — selfie corpus (raw output from `selfie.mjs`/`snap.mjs`/`cap.mjs` runs); promote the keepers to `assets/jeffreys/jpg/` after curation.
- **TBD: `portraits/jeffrey/press/`** — press headshots and high-res stills sent to institutions.
- **TBD: `portraits/jeffrey/shoot-raw/`** — uncompressed masters from the AV photoshoot (the CDN carries only JPGs, no camera originals).

The only jeffrey-related thing that lives in the vault is the **instaloader session cookie** at `aesthetic-computer-vault/silo/instaloader-sessions/whistlegraph` — that one IS a credential (an authenticated IG session) and stays vault-side.

---

## 4. Instagram archive ingestion (bypassed 2026-04-29; silo bridge still pending)

The silo Instagram bridge is the path to bulk-importing jeffrey's solo + AC photo archives into the platter.

- **Bridge code**: [silo/server.mjs](../../silo/server.mjs) on the silo VPS, port 3003
- **Migration report**: [reports/instagram-api-migration-2026-03-29.md](../../reports/instagram-api-migration-2026-03-29.md)
- **Status (2026-03-29)**: login rejected on `instagram-private-api`, `@i7m/instagram-cli`, and `subzeroid/instagrapi-rest`. Errors range from `IgLoginBadPasswordError` to "IP added to the blacklist". Migration to `instagrapi 2.3.0` (Python, actively maintained) sketched but not landed.
- **Block resolution**: bypassed 2026-04-29 — instaloader cookie-import (Chrome) succeeded; the bulk archive of @whistlegraph (4.1 GB, 6,104 frames, 2014–2026) lives at [`portraits/jeffrey/ig-archive/whistlegraph/`](../../portraits/jeffrey/ig-archive/). Silo bridge migration is independent and still pending — only needed if the silo dashboard needs live IG profile/feed queries.

When ingestion comes online, image filenames should map IG shortcodes to date-prefixed names (`YYYY-MM-DD-<shortcode>.jpg`) so timeline queries are easy (a sketch follows). POI re-detection is a separate post-ingest pass.
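
A minimal sketch of that rename mapping, given a post timestamp and shortcode (the function name and the example shortcode are illustrative; the actual ingest script is TBD):

```js
// Map an IG post to the platter's date-prefixed filename scheme.
function archiveName(takenAt, shortcode, ext = "jpg") {
  const d = new Date(takenAt);
  const pad = (n) => String(n).padStart(2, "0");
  const date = `${d.getUTCFullYear()}-${pad(d.getUTCMonth() + 1)}-${pad(d.getUTCDate())}`;
  return `${date}-${shortcode}.${ext}`;
}
// archiveName("2014-08-23T17:05:00Z", "r9XQkWhAbC")  (hypothetical shortcode)
//   → "2014-08-23-r9XQkWhAbC.jpg"
```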
---

## 5. For canonical image generation (the goal)

The motivating use case for this platter is to feed image-generation pipelines a stable, well-annotated reference set of jeffrey so generated portraits stay on-model.

**Pipeline location: [`portraits/jeffrey/`](../../portraits/jeffrey/)** at the monorepo root. It consumes [manifest.json](manifest.json) and the CDN buckets; it is the agreed home for training scripts, LoRA artifacts, IP-Adapter ref sets, and any sample outputs.

**Currently in place**

- A three-bucket archive (`shoot/` headshots + top-level masters + `jpg/` candids) hosted on the CDN
- Per-image POI metadata (face/body/hand bounding boxes + focal point + aspect) for all 131 entries in [manifest.json](manifest.json) — the 55 `shoot/` entries carry placeholder framing. Lifted out of `give.aesthetic.computer/index.html` 2026-04-28.
- Voice analogue (`jeffrey-pvc` PVC) already wired into `/api/say` — establishes the pattern of canonical-self assets being callable as a service

**Gaps to close**

- TBD: agree on what "canonical" means for generation:
  - Face LoRA / DreamBooth training corpus — needs ≥20 high-quality face crops with diverse lighting/angles. The 55 `shoot/` headshots are too uniform on their own (all `focal: [50, 35]`); candids need to be mixed in.
  - IP-Adapter / FaceID reference set — needs a smaller curated set of 5–10 very clean references.
  - Identity-preserving prompt suffix — needs a stable text description of jeffrey's appearance, derived from photos. Adjacent to but distinct from voice clone metadata.
- TBD: licensing & consent rails — even though jeffrey is the subject and operator, downstream pipelines that publish generated portraits should have an explicit allowlist of where they're allowed to render him.

**Adjacent infrastructure to study**

- [arxiv-penrose/](../arxiv-penrose/) — Penrose pipeline for AC illustrations. Not for portraits, but the "diagrams from data" pattern (declarative spec → rendered image) is a useful analogue.
- [recap/bin/scout.mjs](../../recap/bin/scout.mjs) — content-query resolver. Could be extended with a `{ jeffrey: "shoot" | "candid" | "any", count: N }` query that returns N canonical jeffrey URLs for use in slide compositions (see the sketch after the extension checklist at the end of this file).

---

## 6. Timeline (rough spine)

- **2026-01-03 to 2026-01-05** — candids and HEIC masters uploaded to `jeffreys/` and `jeffreys/jpg/` (38 files each); AV shoot uploaded to `jeffreys/shoot/` (55 frames) on 2026-01-05
- **2026-01 → 2026-04** — POI manifest authored by hand inside `give.aesthetic.computer/index.html`; consumed by `cv.html` (justanothersystem.org) and the Ken Burns slideshow
- **2026-03-29** — silo IG migration attempted, blocked at the login layer ([report](../../reports/instagram-api-migration-2026-03-29.md))
- **2026-04-28** — platter index created; POI manifest lifted from `give.aesthetic.computer/index.html` to standalone [manifest.json](manifest.json); give-page now fetches the synced copy at runtime; bucket audit added the `masters` bucket and `tier`/`size`/`master` annotations; headshots merge re-enabled in the give-page; `portraits/jeffrey/` scaffolded; silo `instagrapi.service` staged
- **TBD: candid corpus origin** — `IMG_NNNN.jpg` filenames are iPhone defaults; the masters carry EXIF dates that should be parsed and added to the manifest as `taken_at`

---

## How to extend this index

1. When new material appears (a new shoot, a new selfie-session keeper, a press photo placement, an ingested IG batch), add a line under the matching section with a link or asset path.
2. Use the `TBD:` prefix for known-missing items so they're greppable.
3. Don't check binaries into `papers/`. Masters live on the CDN or in the vault; this file points at them.
4. Done 2026-04-28: the POI manifest now lives in standalone [manifest.json](manifest.json); keep §2 pointing at it as the source of truth if the served copy ever moves.
5. If a sibling `manifest.json` or `pois/` directory grows, note it in §1 (consumers) so other pipelines can find it.
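
A hedged sketch of the scout extension proposed in §5, assuming the manifest shape documented above (`pickJeffreys` is an illustrative name, not existing scout code):

```js
// Resolve { jeffrey: "shoot" | "candid" | "any", count: N } to canonical URLs.
import { readFileSync } from "fs";

function pickJeffreys(query, manifestPath = "papers/jeffrey-platter/manifest.json") {
  const { buckets } = JSON.parse(readFileSync(manifestPath, "utf8"));
  const pools = {
    shoot: [buckets.shoot],
    candid: [buckets.candids],
    any: [buckets.shoot, buckets.candids],
  }[query.jeffrey] ?? [];
  const urls = pools.flatMap((bucket) =>
    Object.keys(bucket.items).map((name) =>
      bucket.url_pattern.replace("{name}", name),
    ),
  );
  return urls.slice(0, query.count); // deterministic head; randomize if needed
}
```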
+164
papers/jeffrey-platter/manifest.json
```json
{
  "$schema": "./manifest.schema.json",
  "version": 1,
  "generated": "2026-04-28",
  "note": "Canonical jeffrey-image POI manifest. Source of truth lives here; served copy at system/public/give.aesthetic.computer/jeffreys-manifest.json (run papers/jeffrey-platter/sync.mjs to refresh). Three buckets: shoot/ headshots, masters/ HEIC+JPEG originals, candids/ JPG derivatives. Filename spelling 'jeffery-' (one r) is intentional and matches the CDN.",
  "buckets": {
    "shoot": {
      "label": "Professional AV photoshoot — 55 face-focused headshots, uniform framing",
      "url_pattern": "https://assets.aesthetic.computer/jeffreys/shoot/{name}",
      "key_includes_extension": true,
      "audited": "2026-04-28: aws s3 ls confirmed 55 contiguous --01..--55",
      "tiers": "master (>10MB, --01..--10), mid (1–10MB, --11..--35), web (<1MB, --36..--55)",
      "items": {
        "jeffery-av--01.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":24579271,"tier":"master"},
        "jeffery-av--02.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":23031131,"tier":"master"},
        "jeffery-av--03.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":23782694,"tier":"master"},
        "jeffery-av--04.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":20180498,"tier":"master"},
        "jeffery-av--05.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":21157608,"tier":"master"},
        "jeffery-av--06.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":23912884,"tier":"master"},
        "jeffery-av--07.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":26175514,"tier":"master"},
        "jeffery-av--08.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":25539600,"tier":"master"},
        "jeffery-av--09.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":24963289,"tier":"master"},
        "jeffery-av--10.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":25466226,"tier":"master"},
        "jeffery-av--11.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":3442326,"tier":"mid"},
        "jeffery-av--12.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4923364,"tier":"mid"},
        "jeffery-av--13.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4916244,"tier":"mid"},
        "jeffery-av--14.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4650693,"tier":"mid"},
        "jeffery-av--15.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4516921,"tier":"mid"},
        "jeffery-av--16.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4489196,"tier":"mid"},
        "jeffery-av--17.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":3292944,"tier":"mid"},
        "jeffery-av--18.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":1672328,"tier":"mid"},
        "jeffery-av--19.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":3705244,"tier":"mid"},
        "jeffery-av--20.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4236846,"tier":"mid"},
        "jeffery-av--21.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4263089,"tier":"mid"},
        "jeffery-av--22.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4233171,"tier":"mid"},
        "jeffery-av--23.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":3663425,"tier":"mid"},
        "jeffery-av--24.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4192960,"tier":"mid"},
        "jeffery-av--25.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":3794618,"tier":"mid"},
        "jeffery-av--26.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":3676864,"tier":"mid"},
        "jeffery-av--27.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4195383,"tier":"mid"},
        "jeffery-av--28.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4090050,"tier":"mid"},
        "jeffery-av--29.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4130648,"tier":"mid"},
        "jeffery-av--30.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4933430,"tier":"mid"},
        "jeffery-av--31.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":5068198,"tier":"mid"},
        "jeffery-av--32.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":5205568,"tier":"mid"},
        "jeffery-av--33.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":5294940,"tier":"mid"},
        "jeffery-av--34.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":5283671,"tier":"mid"},
        "jeffery-av--35.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4789782,"tier":"mid"},
        "jeffery-av--36.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":272541,"tier":"web"},
        "jeffery-av--37.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":268923,"tier":"web"},
        "jeffery-av--38.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":296328,"tier":"web"},
        "jeffery-av--39.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":305648,"tier":"web"},
        "jeffery-av--40.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":226963,"tier":"web"},
        "jeffery-av--41.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":225962,"tier":"web"},
        "jeffery-av--42.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":223762,"tier":"web"},
        "jeffery-av--43.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":224105,"tier":"web"},
        "jeffery-av--44.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":304325,"tier":"web"},
        "jeffery-av--45.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":340465,"tier":"web"},
        "jeffery-av--46.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":338624,"tier":"web"},
        "jeffery-av--47.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":339362,"tier":"web"},
        "jeffery-av--48.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":345660,"tier":"web"},
        "jeffery-av--49.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":242741,"tier":"web"},
        "jeffery-av--50.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":244176,"tier":"web"},
        "jeffery-av--51.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":251330,"tier":"web"},
        "jeffery-av--52.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":254304,"tier":"web"},
        "jeffery-av--53.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":283319,"tier":"web"},
        "jeffery-av--54.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":246485,"tier":"web"},
        "jeffery-av--55.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":248833,"tier":"web"}
      }
    },
    "masters": {
      "label": "iPhone master files (HEIC + JPEG) — 1:1 with candids/, higher quality. Pick masters over derivatives for training crops.",
      "url_pattern": "https://assets.aesthetic.computer/jeffreys/{name}",
      "key_includes_extension": true,
      "audited": "2026-04-28: aws s3 ls assets-aesthetic-computer/jeffreys/ — 38 masters found, 1:1 with candids/",
      "items": {
        "FullSizeRender.heic": {"candid_key":"FullSizeRender","size":516985},
        "IMG_0260.heic": {"candid_key":"IMG_0260","size":1978771},
        "IMG_0675.JPEG": {"candid_key":"IMG_0675","size":4743995},
        "IMG_0686.heic": {"candid_key":"IMG_0686","size":1063196},
        "IMG_0688.heic": {"candid_key":"IMG_0688","size":954060},
        "IMG_0798.jpeg": {"candid_key":"IMG_0798","size":2332501},
        "IMG_1111.heic": {"candid_key":"IMG_1111","size":1818360},
        "IMG_1577.heic": {"candid_key":"IMG_1577","size":3453048},
        "IMG_1616.heic": {"candid_key":"IMG_1616","size":2838527},
        "IMG_1737.heic": {"candid_key":"IMG_1737","size":1830876},
        "IMG_1809.heic": {"candid_key":"IMG_1809","size":4224096},
        "IMG_2124.jpeg": {"candid_key":"IMG_2124","size":1332909},
        "IMG_2208.heic": {"candid_key":"IMG_2208","size":1569939},
        "IMG_2280.heic": {"candid_key":"IMG_2280","size":876871},
        "IMG_2498.heic": {"candid_key":"IMG_2498","size":2148839},
        "IMG_2630.HEIC": {"candid_key":"IMG_2630","size":1490103},
        "IMG_2658.HEIC": {"candid_key":"IMG_2658","size":1608793},
        "IMG_2668.heic": {"candid_key":"IMG_2668","size":1195972},
        "IMG_2905.heic": {"candid_key":"IMG_2905","size":2304901},
        "IMG_2913.heic": {"candid_key":"IMG_2913","size":803125},
        "IMG_3017.heic": {"candid_key":"IMG_3017","size":2562824},
        "IMG_3234.heic": {"candid_key":"IMG_3234","size":806715},
        "IMG_4281.jpeg": {"candid_key":"IMG_4281","size":2303766},
        "IMG_4312.jpeg": {"candid_key":"IMG_4312","size":2657700},
        "IMG_4606.heic": {"candid_key":"IMG_4606","size":970231},
        "IMG_4894.heic": {"candid_key":"IMG_4894","size":1428189},
        "IMG_4997.heic": {"candid_key":"IMG_4997","size":1804730},
        "IMG_5043.HEIC": {"candid_key":"IMG_5043","size":1156692},
        "IMG_5050.heic": {"candid_key":"IMG_5050","size":1131254},
        "IMG_5272.jpeg": {"candid_key":"IMG_5272","size":1978886},
        "IMG_5644.heic": {"candid_key":"IMG_5644","size":1366318},
        "IMG_6342.jpeg": {"candid_key":"IMG_6342","size":1844946},
        "IMG_6367.HEIC": {"candid_key":"IMG_6367","size":1528443},
        "IMG_6435.HEIC": {"candid_key":"IMG_6435","size":2105174},
        "IMG_8080.HEIC": {"candid_key":"IMG_8080","size":1420859},
        "IMG_8188.heic": {"candid_key":"IMG_8188","size":903213},
        "IMG_8989.HEIC": {"candid_key":"IMG_8989","size":2106687},
        "IMG_9795.heic": {"candid_key":"IMG_9795","size":2240240}
      }
    },
    "candids": {
      "label": "Personal candids — face/body/hand POIs",
      "url_pattern": "https://assets.aesthetic.computer/jeffreys/jpg/{name}.jpg",
      "key_includes_extension": false,
      "audited": "2026-04-28: aws s3 ls confirmed all 38 entries match bucket; no orphans, no uncataloged. All have HEIC/JPEG masters at jeffreys/{name}.<ext> — see masters bucket.",
      "items": {
        "FullSizeRender": {"focal":[38.5,34.8],"pois":[{"t":"f","box":[33.8,30,9.5,9.7]},{"t":"b","box":[59.5,63.9,20.2,12.4]},{"t":"b","box":[87.6,89.3,8.4,5.2]}],"aspect":0.75,"size":829094,"master":"FullSizeRender.heic","master_size":516985},
        "IMG_0260": {"focal":[41,38.1],"pois":[{"t":"f","box":[32.3,29.3,17.4,17.5]},{"t":"b","box":[15.9,50,5.9,3.6]}],"aspect":0.75,"size":1492041,"master":"IMG_0260.heic","master_size":1978771},
        "IMG_0675": {"focal":[54,37.2],"pois":[{"t":"f","box":[47.8,31.8,12.3,10.9]}],"aspect":0.667,"size":4743995,"master":"IMG_0675.JPEG","master_size":4743995},
        "IMG_0686": {"focal":[57.4,25.4],"pois":[{"t":"b","box":[53.6,23.1,7.5,4.6]},{"t":"b","box":[54.1,47.2,6.9,4.2]}],"aspect":0.75,"size":1145284,"master":"IMG_0686.heic","master_size":1063196},
        "IMG_0688": {"focal":[72.3,63],"pois":[{"t":"b","box":[69.1,59.3,6.5,7.4]},{"t":"b","box":[39.4,46.8,4.1,4.6]}],"aspect":0.563,"size":1035484,"master":"IMG_0688.heic","master_size":954060},
        "IMG_0798": {"focal":[46.6,6.4],"pois":[{"t":"b","box":[44.8,5.3,3.7,2.3]}],"aspect":0.75,"size":2332501,"master":"IMG_0798.jpeg","master_size":2332501},
        "IMG_1111": {"focal":[50,50],"pois":[],"aspect":0.562,"size":1412575,"master":"IMG_1111.heic","master_size":1818360},
        "IMG_1577": {"focal":[37,70.6],"pois":[{"t":"b","box":[35.5,69.7,2.8,1.7]}],"aspect":0.75,"size":3285470,"master":"IMG_1577.heic","master_size":3453048},
        "IMG_1616": {"focal":[47.8,51.7],"pois":[{"t":"f","box":[42.5,45.2,10.7,13]},{"t":"b","box":[14.1,68,12.4,7.6]}],"aspect":0.75,"size":2863682,"master":"IMG_1616.heic","master_size":2838527},
        "IMG_1737": {"focal":[50,50],"pois":[],"aspect":0.75,"size":1539214,"master":"IMG_1737.heic","master_size":1830876},
        "IMG_1809": {"focal":[48,53.2],"pois":[{"t":"b","box":[45,51.4,6.1,3.7]}],"aspect":0.75,"size":3742829,"master":"IMG_1809.heic","master_size":4224096},
        "IMG_2124": {"focal":[54.8,35.8],"pois":[{"t":"f","box":[49.7,31.8,10.1,7.9]},{"t":"b","box":[3.1,48.8,32,14.8]}],"aspect":0.563,"size":1332909,"master":"IMG_2124.jpeg","master_size":1332909},
        "IMG_2208": {"focal":[40,21.8],"pois":[{"t":"f","box":[33.6,14.5,12.8,14.6]},{"t":"b","box":[23.4,5.1,10.6,6.5]}],"aspect":0.75,"size":1788410,"master":"IMG_2208.heic","master_size":1569939},
        "IMG_2280": {"focal":[57.4,33.4],"pois":[{"t":"f","box":[42.5,16.4,29.8,34]}],"aspect":0.8,"size":1016167,"master":"IMG_2280.heic","master_size":876871},
        "IMG_2498": {"focal":[21.8,61.4],"pois":[{"t":"b","box":[12.6,55.8,18.3,11.2]}],"aspect":0.75,"size":1710740,"master":"IMG_2498.heic","master_size":2148839},
        "IMG_2630": {"focal":[20.3,59.6],"pois":[{"t":"b","box":[6.8,44.8,27,29.4]},{"t":"b","box":[30.7,44,8.6,22.9]}],"aspect":1.333,"size":1452701,"master":"IMG_2630.HEIC","master_size":1490103},
        "IMG_2658": {"focal":[46.6,43.7],"pois":[{"t":"f","box":[39.2,34.5,14.8,18.4]},{"t":"b","box":[58.5,68.9,8.4,5.2]}],"aspect":0.75,"size":1992798,"master":"IMG_2658.HEIC","master_size":1608793},
        "IMG_2668": {"focal":[39.4,24.5],"pois":[{"t":"b","box":[30.4,19,18,11]},{"t":"b","box":[40.7,68.8,8.6,5.3]}],"aspect":0.75,"size":1365145,"master":"IMG_2668.heic","master_size":1195972},
        "IMG_2905": {"focal":[47.5,40.1],"pois":[{"t":"b","box":[28.3,28.4,38.3,23.5]}],"aspect":0.75,"size":2497398,"master":"IMG_2905.heic","master_size":2304901},
        "IMG_2913": {"focal":[69.9,19.9],"pois":[{"t":"b","box":[67.7,18.5,4.5,2.7]},{"t":"b","box":[87,36.8,4.1,2.5]}],"aspect":0.75,"size":1042326,"master":"IMG_2913.heic","master_size":803125},
        "IMG_3017": {"focal":[37.6,81.5],"pois":[{"t":"b","box":[31.7,77.9,11.8,7.3]}],"aspect":0.75,"size":2446374,"master":"IMG_3017.heic","master_size":2562824},
        "IMG_3234": {"focal":[42.4,48.8],"pois":[{"t":"f","box":[36.9,43.2,11.1,11.3]},{"t":"b","box":[27.1,34.8,15.4,9.4]}],"aspect":0.75,"size":1015178,"master":"IMG_3234.heic","master_size":806715},
        "IMG_4281": {"focal":[50.2,38.1],"pois":[{"t":"f","box":[44.5,31.3,11.4,13.6]},{"t":"b","box":[70.5,59.8,8.4,12.6]}],"aspect":0.75,"size":2303766,"master":"IMG_4281.jpeg","master_size":2303766},
        "IMG_4312": {"focal":[4.1,28.1],"pois":[{"t":"f","box":[0,18.6,8.2,19]},{"t":"b","box":[32.5,63.2,3.2,2]}],"aspect":0.75,"size":2657700,"master":"IMG_4312.jpeg","master_size":2657700},
        "IMG_4606": {"focal":[53.1,56.5],"pois":[{"t":"f","box":[28.6,31.6,48.9,49.8]}],"aspect":0.75,"size":1044546,"master":"IMG_4606.heic","master_size":970231},
        "IMG_4894": {"focal":[48,73.6],"pois":[{"t":"b","box":[44.4,71.4,7.2,4.4]},{"t":"b","box":[71.8,49.7,4.6,6.8]}],"aspect":0.75,"size":1564246,"master":"IMG_4894.heic","master_size":1428189},
        "IMG_4997": {"focal":[54,55.8],"pois":[{"t":"b","box":[52.3,53.2,3.4,5.1]}],"aspect":0.75,"size":2173587,"master":"IMG_4997.heic","master_size":1804730},
        "IMG_5043": {"focal":[40.8,43.4],"pois":[{"t":"f","box":[30.8,32.3,20.1,22.1]},{"t":"b","box":[31.9,69.9,4.8,3]}],"aspect":0.75,"size":1239460,"master":"IMG_5043.HEIC","master_size":1156692},
        "IMG_5050": {"focal":[65.6,53.7],"pois":[{"t":"f","box":[60.1,48.3,10.9,10.8]},{"t":"b","box":[21.9,43,16.5,10.1]}],"aspect":0.75,"size":1515666,"master":"IMG_5050.heic","master_size":1131254},
        "IMG_5272": {"focal":[62.3,53.4],"pois":[{"t":"b","box":[61,51.3,2.7,4.1]}],"aspect":0.75,"size":1978886,"master":"IMG_5272.jpeg","master_size":1978886},
        "IMG_5644": {"focal":[49,17.8],"pois":[{"t":"f","box":[44.1,13.9,9.8,8]},{"t":"b","box":[32.7,19.9,11,5.1]}],"aspect":0.563,"size":1380329,"master":"IMG_5644.heic","master_size":1366318},
        "IMG_6342": {"focal":[62.3,28],"pois":[{"t":"f","box":[55.1,18.5,14.5,18.9]},{"t":"b","box":[10.7,69.4,23.7,14.6]}],"aspect":0.75,"size":1844946,"master":"IMG_6342.jpeg","master_size":1844946},
        "IMG_6367": {"focal":[44.1,34.3],"pois":[{"t":"f","box":[37.8,28.7,12.7,11.1]},{"t":"b","box":[60.4,4.8,3.6,2.2]}],"aspect":0.75,"size":1416580,"master":"IMG_6367.HEIC","master_size":1528443},
        "IMG_6435": {"focal":[48,37.8],"pois":[{"t":"f","box":[37.9,27.2,20.2,21.1]},{"t":"b","box":[35.9,86,4.8,2.9]}],"aspect":0.75,"size":1604594,"master":"IMG_6435.HEIC","master_size":2105174},
        "IMG_8080": {"focal":[42.6,27.7],"pois":[{"t":"f","box":[36,20.8,13.1,13.9]},{"t":"b","box":[15.9,8.3,66.6,40.9]}],"aspect":0.75,"size":1580558,"master":"IMG_8080.HEIC","master_size":1420859},
        "IMG_8188": {"focal":[48.9,33.9],"pois":[{"t":"f","box":[37.9,24,21.8,19.9]},{"t":"b","box":[35,88.1,9.8,6]}],"aspect":0.75,"size":968035,"master":"IMG_8188.heic","master_size":903213},
        "IMG_8989": {"focal":[79.2,86.1],"pois":[{"t":"b","box":[76.7,84.6,5,3.1]}],"aspect":0.75,"size":1830983,"master":"IMG_8989.HEIC","master_size":2106687},
        "IMG_9795": {"focal":[31.2,22],"pois":[{"t":"f","box":[23.8,15.6,14.8,12.9]},{"t":"b","box":[21.3,51.5,4.3,2]}],"aspect":0.562,"size":2033875,"master":"IMG_9795.heic","master_size":2240240}
      }
    }
  }
}
```
+20
papers/jeffrey-platter/sync.mjs
```js
#!/usr/bin/env node

import { readFileSync, writeFileSync, mkdirSync } from "fs";
import { dirname, join } from "path";

const HERE = new URL(".", import.meta.url).pathname;
const REPO_ROOT = join(HERE, "..", "..");
const SOURCE = join(HERE, "manifest.json");
const TARGETS = [
  join(REPO_ROOT, "system/public/give.aesthetic.computer/jeffreys-manifest.json"),
];

const json = readFileSync(SOURCE, "utf8");
JSON.parse(json); // validate the manifest; throws (and aborts the sync) if malformed

for (const target of TARGETS) {
  mkdirSync(dirname(target), { recursive: true });
  writeFileSync(target, json, "utf8");
  console.log(`wrote ${target.replace(REPO_ROOT + "/", "")}`);
}
```
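
Usage note: run `node papers/jeffrey-platter/sync.mjs` after editing the manifest. Paths resolve relative to the script via `import.meta.url`, so any working directory works.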
+12
portraits/jeffrey/.gitignore
```gitignore
corpus/
crops/
out/
models/
weights/
ig-archive/
curated/
sessions/
*.safetensors
*.ckpt
*.pt
*.bin
```
+86
portraits/jeffrey/README.md
# Portraits / Jeffrey

Image-generation pipeline for canonical jeffrey portraits. Consumes the POI manifest at [`papers/jeffrey-platter/manifest.json`](../../papers/jeffrey-platter/manifest.json) and the `assets.aesthetic.computer/jeffreys/` CDN buckets cataloged there.

> The platter is the index (read-only pointers). This directory is the workshop (training, generation, sample outputs). The visual analogue of [`recap/`](../../recap/) for the `jeffrey-pvc` voice — same canonical-self pattern, applied to faces instead of audio.

## Status

Scaffolded 2026-04-28. The corpus fetcher is wired; everything downstream of "have a local corpus" is TBD.

## Pipeline (planned)

```
papers/jeffrey-platter/manifest.json

  ▼ bin/fetch-corpus.mjs
corpus/<bucket>/<filename>   (gitignored — downloaded from CDN)
corpus/index.json            (manifest + fetch metadata)

  ▼ TBD: bin/face-crops.mjs  (apply manifest POI face boxes to crop training tiles)
crops/face/*.jpg             (gitignored — N×N face tiles for LoRA/IP-Adapter)

  ▼ TBD: training            (LoRA / DreamBooth / IP-Adapter ref set — pick one)
weights/jeffrey-{lora|ipadapter}.safetensors

  ▼ TBD: bin/generate.mjs    (sample portraits given a prompt)
out/<run-id>/                (gitignored — generated samples)
```

## Commands

### Pull canonical corpus (CDN → local)

```bash
node bin/fetch-corpus.mjs            # all three buckets (shoot + masters + candids)
node bin/fetch-corpus.mjs --shoot    # 55 AV-shoot headshots only
node bin/fetch-corpus.mjs --masters  # 38 HEIC/JPEG iPhone originals (best for training)
node bin/fetch-corpus.mjs --candids  # 38 JPG derivatives
node bin/fetch-corpus.mjs --force    # re-download even if local copies exist
```

The fetcher reads `papers/jeffrey-platter/manifest.json` directly — there is no separate config. To add a new image to the corpus, add it to the manifest and re-run. A sketch of the fetch loop follows.
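
For orientation, the per-bucket fetch presumably amounts to something like this (a sketch under assumptions; the real logic lives in `bin/fetch-corpus.mjs` and the `fetchBucket` name is illustrative):

```js
// Download every item in one manifest bucket into corpus/<bucket>/,
// skipping files that already exist unless force is set.
import { mkdirSync, existsSync } from "fs";
import { writeFile } from "fs/promises";
import { join, dirname } from "path";

async function fetchBucket(name, bucket, { force = false } = {}) {
  for (const key of Object.keys(bucket.items)) {
    // Candid keys omit the extension; their url_pattern re-adds ".jpg".
    const url = bucket.url_pattern.replace("{name}", key);
    const dest = join("corpus", name, url.split("/").pop());
    if (existsSync(dest) && !force) continue;
    mkdirSync(dirname(dest), { recursive: true });
    const res = await fetch(url);
    if (!res.ok) throw new Error(`${url}: HTTP ${res.status}`);
    await writeFile(dest, Buffer.from(await res.arrayBuffer()));
  }
}
```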
### Bulk-archive Instagram

Two paths to bootstrap an instaloader session — pick whichever your account allows:

```fish
# Path A (recommended): import the session from a logged-in browser
# - Works around 2FA-flagged accounts where the password API path
#   silently strips the auth cookie (this is what bit @whistlegraph)
# - One Keychain prompt on first Chrome run, then silent thereafter
bin/ig-import-cookies.py chrome whistlegraph
bin/ig-import-cookies.py firefox aesthetic.computer  # or chrome / brave / arc / edge / safari

# Path B (fallback): log in with a password
# - Only works on accounts without 2FA / checkpoints
IG_PASSWORD='...' bin/ig-login.py whistlegraph

# Then archive the full timeline (de-dupes via --fast-update on re-runs)
bin/ig-archive.fish whistlegraph
bin/ig-archive.fish aesthetic.computer
```

Sessions persist at `aesthetic-computer-vault/silo/instaloader-sessions/<account>` (the only piece that stays in the vault — it's an authenticated cookie). Archives land at `portraits/jeffrey/ig-archive/<account>/` (gitignored). Re-running `ig-archive.fish` only fetches new posts (by shortcode); safe to cron.

**On account safety**: Instagram's anti-bot detection has flagged @whistlegraph before (see [reports/instagram-api-migration-2026-03-29.md](../../reports/instagram-api-migration-2026-03-29.md)). The cookie-import path is safer than the password flow because it piggybacks on a session Instagram has already accepted in your browser — there is no fresh login event for the anti-bot system to evaluate.

## Open decisions

- **Training approach** — face LoRA (~20–60 face tiles, fine-tunes the model) vs IP-Adapter / FaceID (5–10 reference images, no training, identity injected at inference time) vs a hybrid. Different tradeoffs on faithfulness, generation speed, and base-model lock-in. Pick before scripting `bin/face-crops.mjs`, since the crop format depends on it (one possible shape is sketched at the end of this file).
- **Base model** — Flux.1-dev, SDXL, SD 3.5, or something else. Affects which adapter formats apply.
- **Compute** — local on jas's mac (MPS), runpod/lambda (cloud GPU), or fal/replicate (managed). The manifest + corpus are portable; only `weights/` and `out/` are environment-specific.
- **Identity-preserving prompt suffix** — a stable text fragment that describes jeffrey's appearance, for use as a generation hint. Adjacent to but distinct from voice clone metadata.
- **Consent / use rails** — even though jeffrey is the operator and subject, downstream pipelines that *publish* generated portraits should have an explicit allowlist of where they can render him.

## Why here, not under `papers/`

`papers/jeffrey-platter/` is **index-only** — pointers to where canonical material lives. This directory holds the actual machinery (scripts, weights, samples) that consumes the index. Same separation as `papers/whistlegraph-platter/` (index) ↔ `system/public/aesthetic.computer/disks/whistlegraph.mjs` (the practice surface that uses it).

## See also

- [papers/jeffrey-platter/README.md](../../papers/jeffrey-platter/README.md) — canonical platter index
- [papers/jeffrey-platter/manifest.json](../../papers/jeffrey-platter/manifest.json) — POI manifest (source of truth)
- [recap/SCORE.md](../../recap/SCORE.md) — voice analogue (`jeffrey-pvc` PVC pipeline)
- [reports/instagram-api-migration-2026-03-29.md](../../reports/instagram-api-migration-2026-03-29.md) — IG ingestion gating
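
One possible shape for the TBD `bin/face-crops.mjs` flagged under Open decisions, assuming the [sharp](https://sharp.pixelplumbing.com/) image library and square LoRA-style tiles (both assumptions; JPG inputs assumed, since sharp's HEIC support is build-dependent):

```js
// Crop one square face tile from a corpus image using the manifest's
// percentage-based face box ([x%, y%, w%, h%] of the full frame).
import sharp from "sharp";

async function cropFace(imagePath, box, outPath, tile = 512) {
  const { width, height } = await sharp(imagePath).metadata();
  const [x, y, w, h] = box;
  const rect = {
    left: Math.round((x / 100) * width),
    top: Math.round((y / 100) * height),
    width: Math.round((w / 100) * width),
    height: Math.round((h / 100) * height),
  };
  // "cover" keeps the crop's aspect by center-cropping to the square tile.
  await sharp(imagePath)
    .extract(rect)
    .resize(tile, tile, { fit: "cover" })
    .toFile(outPath);
}
```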
+410
portraits/jeffrey/bin/face-browser.py
··· 1 + #!/opt/homebrew/Cellar/instaloader/4.15.1_1/libexec/bin/python3 2 + """ 3 + Build a static HTML browser for the curated jeffrey-described.jsonl. 4 + 5 + Generates: 6 + - curated/thumbnails/<filename>.jpg — 384px-max thumbnails (one per record) 7 + - curated/index.html — single-file browser, all data embedded 8 + 9 + Open by serving the curated/ dir over HTTP: 10 + cd portraits/jeffrey/curated 11 + python3 -m http.server 8000 12 + open http://localhost:8000/ 13 + 14 + The browser shows: filter sidebar (year, domain, sim, confirmed), masonry-ish 15 + grid of thumbnails with date/domain/sim overlays, click any card for a modal 16 + with full subject/environment/photography fields and the original-resolution 17 + image. 18 + 19 + Usage: 20 + python face-browser.py \\ 21 + --described portraits/jeffrey/curated/jeffrey-described.jsonl \\ 22 + --match portraits/jeffrey/curated/jeffrey-match.jsonl \\ 23 + --curated-dir portraits/jeffrey/curated 24 + """ 25 + 26 + from __future__ import annotations 27 + 28 + import argparse 29 + import json 30 + import sys 31 + from collections import Counter 32 + from pathlib import Path 33 + 34 + from PIL import Image 35 + 36 + REPO_ROOT = Path(__file__).resolve().parent.parent.parent.parent 37 + DEFAULT_CURATED = REPO_ROOT / "portraits" / "jeffrey" / "curated" 38 + 39 + THUMB_MAX = 384 40 + 41 + 42 + def make_thumbnail(src: Path, dst: Path) -> bool: 43 + if dst.exists() and dst.stat().st_size > 0: 44 + return False 45 + try: 46 + with Image.open(src) as img: 47 + img.thumbnail((THUMB_MAX, THUMB_MAX), Image.Resampling.LANCZOS) 48 + if img.mode != "RGB": 49 + img = img.convert("RGB") 50 + dst.parent.mkdir(parents=True, exist_ok=True) 51 + img.save(dst, "JPEG", quality=78, optimize=True) 52 + return True 53 + except Exception as e: 54 + print(f" warn: thumbnail failed for {src.name}: {e}", file=sys.stderr) 55 + return False 56 + 57 + 58 + HTML_TEMPLATE = """<!DOCTYPE html> 59 + <html lang="en"> 60 + <head> 61 + <meta charset="UTF-8"> 62 + <title>Jeffrey Platter — Browser</title> 63 + <style> 64 + :root { 65 + --bg: #1a1a1f; 66 + --fg: #e8e8e8; 67 + --dim: #888; 68 + --card: #2a2a30; 69 + --accent: #4ecdc4; 70 + --pink: #cd5c9b; 71 + --gold: #d4a017; 72 + } 73 + * { box-sizing: border-box; } 74 + body { margin: 0; background: var(--bg); color: var(--fg); font: 14px/1.5 -apple-system, BlinkMacSystemFont, "Helvetica Neue", sans-serif; } 75 + header { padding: 14px 18px; border-bottom: 1px solid #333; display: flex; gap: 14px; flex-wrap: wrap; align-items: baseline; } 76 + h1 { margin: 0; font-size: 18px; font-weight: 600; } 77 + .stats { color: var(--dim); font-size: 12px; } 78 + .layout { display: grid; grid-template-columns: 220px 1fr; min-height: calc(100vh - 50px); } 79 + aside { padding: 14px; border-right: 1px solid #333; overflow-y: auto; max-height: calc(100vh - 50px); position: sticky; top: 0; } 80 + aside h3 { margin: 12px 0 6px; font-size: 11px; color: var(--dim); text-transform: uppercase; letter-spacing: 0.05em; } 81 + .filter-list { list-style: none; padding: 0; margin: 0; } 82 + .filter-list li { padding: 3px 0; cursor: pointer; user-select: none; } 83 + .filter-list li:hover { color: var(--accent); } 84 + .filter-list li.active { color: var(--accent); font-weight: 600; } 85 + .filter-list li .count { color: var(--dim); font-size: 11px; margin-left: 4px; } 86 + input[type=range] { width: 100%; } 87 + input[type=text] { width: 100%; padding: 6px; background: var(--card); border: 1px solid #444; color: var(--fg); border-radius: 3px; } 88 + .toggle 
{ display: flex; align-items: center; gap: 6px; padding: 4px 0; cursor: pointer; } 89 + main { padding: 14px; } 90 + .grid { display: grid; grid-template-columns: repeat(auto-fill, minmax(220px, 1fr)); gap: 8px; } 91 + .card { background: var(--card); border-radius: 4px; overflow: hidden; cursor: pointer; position: relative; transition: transform 0.1s; } 92 + .card:hover { transform: scale(1.02); outline: 2px solid var(--accent); } 93 + .card .thumb { width: 100%; aspect-ratio: 1; object-fit: cover; display: block; background: #111; } 94 + .card .meta { padding: 6px 8px; font-size: 11px; } 95 + .card .date { color: var(--dim); } 96 + .card .domain { color: var(--accent); } 97 + .card .sim { position: absolute; top: 4px; right: 4px; background: rgba(0,0,0,0.75); padding: 2px 5px; border-radius: 3px; font-size: 10px; font-family: ui-monospace, monospace; } 98 + .card .badge-rejected { position: absolute; top: 4px; left: 4px; background: rgba(205,92,155,0.9); padding: 2px 5px; border-radius: 3px; font-size: 10px; font-weight: bold; } 99 + .card .badge-video { position: absolute; bottom: 36px; left: 4px; background: rgba(0,0,0,0.75); padding: 2px 6px; border-radius: 3px; font-size: 12px; font-weight: bold; color: var(--gold); } 100 + #modal { display: none; position: fixed; inset: 0; background: rgba(0,0,0,0.92); z-index: 100; padding: 20px; overflow-y: auto; } 101 + #modal.open { display: block; } 102 + .modal-grid { max-width: 1100px; margin: 0 auto; display: grid; grid-template-columns: 1fr 360px; gap: 20px; } 103 + .modal-img { width: 100%; max-height: 80vh; object-fit: contain; background: #111; border-radius: 4px; } 104 + .modal-info { color: var(--fg); } 105 + .modal-info h2 { margin: 0 0 6px; font-size: 14px; color: var(--accent); } 106 + .modal-info .field { margin-bottom: 10px; font-size: 12px; } 107 + .modal-info .label { color: var(--dim); font-size: 10px; text-transform: uppercase; letter-spacing: 0.05em; } 108 + .modal-info .value { color: var(--fg); } 109 + .modal-info .tag { display: inline-block; background: var(--card); padding: 2px 6px; border-radius: 2px; margin: 2px 4px 2px 0; font-size: 11px; } 110 + #close { position: fixed; top: 12px; right: 16px; cursor: pointer; font-size: 24px; color: var(--fg); z-index: 101; user-select: none; } 111 + .empty { color: var(--dim); padding: 40px; text-align: center; } 112 + </style> 113 + </head> 114 + <body> 115 + <header> 116 + <h1>Jeffrey Platter</h1> 117 + <span class="stats" id="stats">…</span> 118 + </header> 119 + <div class="layout"> 120 + <aside> 121 + <input type="text" id="search" placeholder="Search description / tags…"> 122 + 123 + <h3>Status</h3> 124 + <label class="toggle"><input type="checkbox" id="confirmedOnly" checked> Confirmed only (vision)</label> 125 + <label class="toggle"><input type="checkbox" id="rejectedOnly"> Rejected only (false positives)</label> 126 + 127 + <h3>Min similarity (face-match)</h3> 128 + <input type="range" id="simSlider" min="0.5" max="0.95" step="0.05" value="0.5"> 129 + <div style="font-family: ui-monospace, monospace; color: var(--dim); font-size: 11px;" id="simValue">0.50</div> 130 + 131 + <h3>Year</h3> 132 + <ul class="filter-list" id="yearList"></ul> 133 + 134 + <h3>Domain</h3> 135 + <ul class="filter-list" id="domainList"></ul> 136 + </aside> 137 + <main> 138 + <div class="grid" id="grid"></div> 139 + </main> 140 + </div> 141 + 142 + <div id="modal"> 143 + <span id="close">×</span> 144 + <div class="modal-grid" id="modalGrid"></div> 145 + </div> 146 + 147 + <script> 148 + const 
RECORDS = __RECORDS__; 149 + 150 + function bucketSim(s) { 151 + if (s >= 0.8) return "0.8+"; 152 + if (s >= 0.7) return "0.7–0.8"; 153 + if (s >= 0.6) return "0.6–0.7"; 154 + return "0.5–0.6"; 155 + } 156 + 157 + const state = { 158 + year: null, 159 + domain: null, 160 + confirmedOnly: true, 161 + rejectedOnly: false, 162 + minSim: 0.5, 163 + search: "", 164 + }; 165 + 166 + function filtered() { 167 + return RECORDS.filter(r => { 168 + if (state.confirmedOnly && !r.is_jeffrey_confirmed) return false; 169 + if (state.rejectedOnly && r.is_jeffrey_confirmed) return false; 170 + if ((r.match_similarity || 0) < state.minSim) return false; 171 + if (state.year && r.date && !r.date.startsWith(state.year)) return false; 172 + if (state.domain && r.domain !== state.domain) return false; 173 + if (state.search) { 174 + const q = state.search.toLowerCase(); 175 + const blob = JSON.stringify({ 176 + s: r.subject, e: r.environment, p: r.photography, 177 + t: r.tags, c: r.caption_hint, d: r.domain, x: r.rel_path 178 + }).toLowerCase(); 179 + if (!blob.includes(q)) return false; 180 + } 181 + return true; 182 + }); 183 + } 184 + 185 + function renderGrid() { 186 + const items = filtered(); 187 + document.getElementById("stats").textContent = 188 + `${items.length} of ${RECORDS.length} · ${RECORDS.filter(r => r.is_jeffrey_confirmed).length} confirmed total`; 189 + const grid = document.getElementById("grid"); 190 + if (items.length === 0) { 191 + grid.innerHTML = `<div class="empty">no matches</div>`; 192 + return; 193 + } 194 + grid.innerHTML = items.slice(0, 600).map((r, i) => ` 195 + <div class="card" data-idx="${RECORDS.indexOf(r)}"> 196 + <img class="thumb" loading="lazy" src="thumbnails/${r.rel_path}" alt=""> 197 + <div class="sim">${(r.match_similarity || 0).toFixed(2)}</div> 198 + ${r.has_video ? '<div class="badge-video">▶</div>' : ''} 199 + ${!r.is_jeffrey_confirmed ? '<div class="badge-rejected">REJECTED</div>' : ''} 200 + <div class="meta"> 201 + <div class="date">${r.date || ""}</div> 202 + <div class="domain">${r.domain || ""}</div> 203 + </div> 204 + </div> 205 + `).join(""); 206 + if (items.length > 600) { 207 + grid.innerHTML += `<div class="empty">…and ${items.length - 600} more (refine filters to see)</div>`; 208 + } 209 + } 210 + 211 + function renderFilters() { 212 + const items = RECORDS.filter(r => 213 + (!state.confirmedOnly || r.is_jeffrey_confirmed) && 214 + (!state.rejectedOnly || !r.is_jeffrey_confirmed) && 215 + (r.match_similarity || 0) >= state.minSim 216 + ); 217 + 218 + const years = {}; 219 + const domains = {}; 220 + items.forEach(r => { 221 + if (r.date) { const y = r.date.slice(0, 4); years[y] = (years[y] || 0) + 1; } 222 + if (r.domain) domains[r.domain] = (domains[r.domain] || 0) + 1; 223 + }); 224 + 225 + const yl = document.getElementById("yearList"); 226 + yl.innerHTML = `<li class="${state.year === null ? 'active' : ''}" data-y="">all <span class="count">${items.length}</span></li>` + 227 + Object.entries(years).sort().map(([y, n]) => 228 + `<li class="${state.year === y ? 'active' : ''}" data-y="${y}">${y} <span class="count">${n}</span></li>` 229 + ).join(""); 230 + 231 + const dl = document.getElementById("domainList"); 232 + dl.innerHTML = `<li class="${state.domain === null ? 'active' : ''}" data-d="">all <span class="count">${items.length}</span></li>` + 233 + Object.entries(domains).sort((a, b) => b[1] - a[1]).map(([d, n]) => 234 + `<li class="${state.domain === d ? 
'active' : ''}" data-d="${d}">${d} <span class="count">${n}</span></li>` 235 + ).join(""); 236 + } 237 + 238 + function render() { renderFilters(); renderGrid(); } 239 + 240 + document.getElementById("yearList").addEventListener("click", e => { 241 + const li = e.target.closest("li"); 242 + if (!li) return; 243 + state.year = li.dataset.y || null; 244 + render(); 245 + }); 246 + document.getElementById("domainList").addEventListener("click", e => { 247 + const li = e.target.closest("li"); 248 + if (!li) return; 249 + state.domain = li.dataset.d || null; 250 + render(); 251 + }); 252 + document.getElementById("confirmedOnly").addEventListener("change", e => { 253 + state.confirmedOnly = e.target.checked; 254 + if (e.target.checked) { state.rejectedOnly = false; document.getElementById("rejectedOnly").checked = false; } 255 + render(); 256 + }); 257 + document.getElementById("rejectedOnly").addEventListener("change", e => { 258 + state.rejectedOnly = e.target.checked; 259 + if (e.target.checked) { state.confirmedOnly = false; document.getElementById("confirmedOnly").checked = false; } 260 + render(); 261 + }); 262 + document.getElementById("simSlider").addEventListener("input", e => { 263 + state.minSim = parseFloat(e.target.value); 264 + document.getElementById("simValue").textContent = state.minSim.toFixed(2); 265 + render(); 266 + }); 267 + document.getElementById("search").addEventListener("input", e => { 268 + state.search = e.target.value; 269 + renderGrid(); 270 + }); 271 + 272 + document.getElementById("grid").addEventListener("click", e => { 273 + const card = e.target.closest(".card"); 274 + if (!card) return; 275 + const r = RECORDS[parseInt(card.dataset.idx, 10)]; 276 + const s = r.subject || {}; 277 + const en = r.environment || {}; 278 + const p = r.photography || {}; 279 + const mediaTag = r.has_video 280 + ? `<video class="modal-img" src="originals/${r.video_path}" controls autoplay muted loop playsinline></video>` 281 + : `<img class="modal-img" src="originals/${r.rel_path}" alt="">`; 282 + const dim = (r.width && r.height) ? `${r.width}×${r.height}` : "?"; 283 + document.getElementById("modalGrid").innerHTML = ` 284 + ${mediaTag} 285 + <div class="modal-info"> 286 + <h2>${r.rel_path}</h2> 287 + <div class="field"><span class="label">date · resolution · type</span><div class="value">${r.date || "?"} · ${dim} · ${r.has_video ? "video" : "image"}</div></div> 288 + <div class="field"><span class="label">domain</span><div class="value">${r.domain || "?"}</div></div> 289 + <div class="field"><span class="label">match similarity</span><div class="value">${(r.match_similarity || 0).toFixed(3)} → ${r.match_ref || ""}</div></div> 290 + <div class="field"><span class="label">vision confirmed</span><div class="value">${r.is_jeffrey_confirmed ? "yes" : "NO"} (${(r.confidence || 0).toFixed(2)})</div></div> 291 + ${s.description ? `<div class="field"><span class="label">subject — appearance</span><div class="value">${s.description}</div></div>` : ""} 292 + ${s.expression ? `<div class="field"><span class="label">expression</span><div class="value">${s.expression}</div></div>` : ""} 293 + ${s.pose ? `<div class="field"><span class="label">pose</span><div class="value">${s.pose}</div></div>` : ""} 294 + ${en.location ? `<div class="field"><span class="label">location</span><div class="value">${en.location} · ${en.time_of_day || ""}</div></div>` : ""} 295 + ${en.lighting ? 
`<div class="field"><span class="label">lighting</span><div class="value">${en.lighting}</div></div>` : ""} 296 + ${en.background_details ? `<div class="field"><span class="label">background</span><div class="value">${en.background_details}</div></div>` : ""} 297 + ${p.camera_angle ? `<div class="field"><span class="label">photography</span><div class="value">${p.camera_angle} · ${p.style} · ${p.framing}</div></div>` : ""} 298 + ${r.tags && r.tags.length ? `<div class="field"><span class="label">tags</span><div class="value">${r.tags.map(t => `<span class="tag">${t}</span>`).join("")}</div></div>` : ""} 299 + ${r.caption_hint ? `<div class="field"><span class="label">caption hint</span><div class="value"><em>${r.caption_hint}</em></div></div>` : ""} 300 + </div> 301 + `; 302 + document.getElementById("modal").classList.add("open"); 303 + }); 304 + document.getElementById("close").addEventListener("click", () => { 305 + document.getElementById("modal").classList.remove("open"); 306 + }); 307 + document.getElementById("modal").addEventListener("click", e => { 308 + if (e.target.id === "modal") document.getElementById("modal").classList.remove("open"); 309 + }); 310 + 311 + render(); 312 + </script> 313 + </body> 314 + </html> 315 + """ 316 + 317 + 318 + def main() -> int: 319 + p = argparse.ArgumentParser() 320 + p.add_argument("--described", required=True) 321 + p.add_argument("--match", required=True) 322 + p.add_argument("--curated-dir", default=str(DEFAULT_CURATED)) 323 + args = p.parse_args() 324 + 325 + described_path = Path(args.described) 326 + match_path = Path(args.match) 327 + curated = Path(args.curated_dir) 328 + thumbs_dir = curated / "thumbnails" 329 + 330 + # Load described records 331 + described: list[dict] = [] 332 + for line in described_path.read_text().splitlines(): 333 + try: 334 + described.append(json.loads(line)) 335 + except json.JSONDecodeError: 336 + pass 337 + 338 + # Index match records by path so we can pick up sim scores even for rejects 339 + match_by_path: dict[str, dict] = {} 340 + for line in match_path.read_text().splitlines(): 341 + try: 342 + r = json.loads(line) 343 + match_by_path[r["path"]] = r 344 + except (json.JSONDecodeError, KeyError): 345 + pass 346 + 347 + # Build thumbnails 348 + print(f"generating thumbnails to {thumbs_dir}…", file=sys.stderr) 349 + n_new = 0 350 + for r in described: 351 + src = Path(r["path"]) 352 + if not src.exists(): 353 + continue 354 + dst = thumbs_dir / r["rel_path"] 355 + if make_thumbnail(src, dst): 356 + n_new += 1 357 + print(f"thumbnails: {n_new} new, {len(described) - n_new} existed", file=sys.stderr) 358 + 359 + # Slim records for browser (drop heavy fields). Detect sibling video files 360 + # and probe original-image dimensions so the modal can show real quality. 
361 + slim = [] 362 + for r in described: 363 + m = match_by_path.get(r["path"], {}) 364 + src = Path(r["path"]) 365 + mp4 = src.with_suffix(".mp4") 366 + width = height = None 367 + if src.exists(): 368 + try: 369 + with Image.open(src) as img: 370 + width, height = img.size 371 + except Exception: 372 + pass 373 + slim.append({ 374 + "rel_path": r.get("rel_path"), 375 + "date": r.get("date"), 376 + "shortcode": r.get("shortcode"), 377 + "match_similarity": r.get("match_similarity") or m.get("best_similarity"), 378 + "match_ref": r.get("match_ref") or m.get("best_match_ref"), 379 + "is_jeffrey_confirmed": r.get("is_jeffrey_confirmed"), 380 + "confidence": r.get("confidence"), 381 + "subject": r.get("subject"), 382 + "environment": r.get("environment"), 383 + "photography": r.get("photography"), 384 + "domain": r.get("domain"), 385 + "n_other_people": r.get("n_other_people"), 386 + "tags": r.get("tags"), 387 + "caption_hint": r.get("caption_hint"), 388 + "has_video": mp4.exists(), 389 + "video_path": mp4.name if mp4.exists() else None, 390 + "width": width, 391 + "height": height, 392 + }) 393 + 394 + html = HTML_TEMPLATE.replace("__RECORDS__", json.dumps(slim, ensure_ascii=False)) 395 + out_html = curated / "index.html" 396 + out_html.write_text(html) 397 + print(f"wrote {out_html}", file=sys.stderr) 398 + print("\nopen with:", file=sys.stderr) 399 + print(f" cd {curated}", file=sys.stderr) 400 + print(f" python3 -m http.server 8000", file=sys.stderr) 401 + print(f" open http://localhost:8000/", file=sys.stderr) 402 + 403 + # Summary 404 + confirmed = [r for r in slim if r["is_jeffrey_confirmed"]] 405 + print(f"\nincluded: {len(slim)} records ({len(confirmed)} confirmed)", file=sys.stderr) 406 + return 0 407 + 408 + 409 + if __name__ == "__main__": 410 + sys.exit(main())
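Each card in the browser is backed by one slim record; a representative entry (values illustrative, field set per the `slim.append` call above) looks like:

```json
{
  "rel_path": "2021-07-10_CRI095Vl7AO_1.jpg",
  "date": "2021-07-10",
  "shortcode": "CRI095Vl7AO",
  "match_similarity": 0.71,
  "match_ref": "jeffery-av--04.jpg",
  "is_jeffrey_confirmed": true,
  "confidence": 0.92,
  "subject": { "description": "…", "expression": "soft smile", "pose": "…" },
  "environment": { "location": "…", "time_of_day": "afternoon", "lighting": "…", "background_details": "…" },
  "photography": { "camera_angle": "eye-level frontal", "style": "casual phone selfie", "framing": "medium shot waist-up" },
  "domain": "solo-portrait",
  "n_other_people": 0,
  "tags": ["coffee", "kitchen", "morning"],
  "caption_hint": null,
  "has_video": false,
  "video_path": null,
  "width": 1080,
  "height": 1350
}
```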
+390
portraits/jeffrey/bin/face-describe.py
··· 1 + #!/opt/homebrew/Cellar/instaloader/4.15.1_1/libexec/bin/python3 2 + """ 3 + Phase 2: rich scene-graph description for face-matched jeffrey images. 4 + 5 + Reads the JSONL produced by face-match.py, filters to is_jeffrey=true (or all 6 + images with a face), and sends each through GPT-4o vision with reference 7 + photos for identity grounding. Output is structured JSON aligned with the 8 + everyday.tina.zone-inspired scene-graph schema from the rev2 feasibility doc: 9 + 10 + subject: description, expression, pose 11 + environment: location, time_of_day, lighting, background_details 12 + photography: camera_angle, style, framing 13 + + domain (categorical theme), n_other_people, is_jeffrey_confirmed, 14 + tags, caption_hint 15 + 16 + This metadata is dual-purpose: 17 + - immediate: a curated, browsable jeffrey-by-date index with descriptions 18 + - downstream: training data for jeffrey's scene-graph composer 19 + (see vault/personal/2026-other/jeffrey-image-model-feasibility.md) 20 + 21 + Usage: 22 + python face-describe.py \\ 23 + --match-jsonl $VAULT/jeffrey-platter/curated/jeffrey-match.jsonl \\ 24 + --output $VAULT/jeffrey-platter/curated/jeffrey-described.jsonl 25 + 26 + Args: 27 + --match-jsonl <path> input JSONL from face-match.py. Required. 28 + --output <path> JSONL output (resumes by skipping already-described). Required. 29 + --min-similarity <f> only describe images with at least this face-match score. 30 + Default 0.5 (matches face-match's default threshold). 31 + --include-other-faces also describe images with non-jeffrey faces (cross-check). 32 + --limit <int> process at most N records (smoke testing). Default: all. 33 + --concurrency <int> parallel API requests. Default: 4. 34 + --model <id> OpenAI model. Default: gpt-4o. 35 + """ 36 + 37 + from __future__ import annotations 38 + 39 + import argparse 40 + import asyncio 41 + import base64 42 + import json 43 + import os 44 + import sys 45 + from datetime import datetime, timezone 46 + from pathlib import Path 47 + 48 + import cv2 49 + from openai import APIError, AsyncOpenAI, RateLimitError 50 + 51 + REPO_ROOT = Path(__file__).resolve().parent.parent.parent.parent 52 + VAULT_ENV = REPO_ROOT / "aesthetic-computer-vault" / ".devcontainer" / "envs" / "devcontainer.env" 53 + SHOOT_DIR = REPO_ROOT / "portraits" / "jeffrey" / "corpus" / "shoot" 54 + 55 + # Two refs is enough for identity grounding; more bloats every request. 56 + REFERENCE_PHOTOS = [ 57 + SHOOT_DIR / "jeffery-av--01.jpg", 58 + SHOOT_DIR / "jeffery-av--07.jpg", 59 + ] 60 + 61 + SYSTEM_PROMPT = """\ 62 + You are tagging a personal photograph of a specific person — referred to as \ 63 + "jeffrey" — for a structured scene-graph index. The two reference images at \ 64 + the top of the user message are confirmed photos of jeffrey from a recent \ 65 + photoshoot. Use them to verify identity. He may look different across years \ 66 + (2014–2026): different hair length/color, beard variation, glasses on/off, \ 67 + older or younger. Be flexible on age but strict on facial structure. 68 + 69 + Return JSON with these fields: 70 + 71 + - is_jeffrey_confirmed (boolean): is jeffrey clearly the subject (or one of \ 72 + the subjects) of this candidate image? false if you can't tell, or if the \ 73 + visible face is not jeffrey. 74 + - confidence (0-1): your confidence in is_jeffrey_confirmed. 75 + - subject (object — fill only if is_jeffrey_confirmed): 76 + - description: jeffrey's appearance in THIS photo. 
Hair (length, color, \ 77 + state), beard (none / stubble / short / full / specific shape), clothing \ 78 + (specific items + colors + condition), accessories (glasses, hats, jewelry). \ 79 + 1-2 sentences, dense and visual. 80 + - expression: short phrase, e.g. "soft smile", "open-mouthed laugh", \ 81 + "focused, eyes down", "neutral, slight tension at mouth". 82 + - pose: what jeffrey is physically doing — "leaning into a kitchen counter, \ 83 + holding a coffee cup", "seated cross-legged drawing on a tablet", "mid-stride \ 84 + on a sidewalk, looking sideways at the camera". 85 + - environment (object): 86 + - location: specific where possible — "small home studio", "Brooklyn \ 87 + sidewalk at night", "art gallery interior with white walls". Avoid generic \ 88 + "indoors". 89 + - time_of_day: "morning" / "afternoon" / "evening" / "night" / "unclear". 90 + - lighting: short phrase — "warm afternoon window light", "harsh fluorescent \ 91 + overhead", "cyan night-club uplighting", "soft ambient lamp". 92 + - background_details: 1-2 sentences on what's visibly behind/around jeffrey \ 93 + — furniture, objects, art on walls, signage, other people in background. 94 + - photography (object): 95 + - camera_angle: "eye-level frontal", "low-angle from waist", "high-angle \ 96 + overhead", "side profile", "over-the-shoulder", etc. 97 + - style: short phrase — "casual phone selfie", "Gen Z photo dump", "staged \ 98 + event portrait", "candid 35mm film", "fluorescent-lit interior snapshot". \ 99 + Capture the *aesthetic*. 100 + - framing: "tight close-up of face", "medium shot waist-up", "wide environmental". 101 + - domain (string): pick one of the following categories that best describes \ 102 + what jeffrey is doing or where he is. If none fits well, pick "other". 103 + Categories: art (drawing, painting, whistlegraph), coding, performance \ 104 + (stage, AV show, music), lecture-or-teaching, repair (fixing things), \ 105 + cooking-or-eating, solo-portrait, with-fia, social (with friends/group), \ 106 + travel-or-outdoor, studio-or-workshop, kidlisp, other. 107 + - n_other_people (integer): count of other clearly-visible people (not \ 108 + jeffrey, not background blur). 109 + - tags (array of strings): 3-7 freeform short tags capturing notable specifics \ 110 + — e.g. ["coffee", "kitchen", "morning", "casual"] or ["projector", "stage", \ 111 + "audience", "blue lighting"]. 112 + - caption_hint (string or null): 1 short sentence in the voice of an IG \ 113 + caption that would fit this photo. Optional — null if nothing comes to mind. 114 + 115 + Output JSON ONLY. 
If is_jeffrey_confirmed is false, set subject to null and \ 116 + keep environment/photography/domain/n_other_people/tags filled in for the \ 117 + image as it stands.\ 118 + """ 119 + 120 + OUTPUT_SCHEMA = { 121 + "type": "object", 122 + "additionalProperties": False, 123 + "properties": { 124 + "is_jeffrey_confirmed": {"type": "boolean"}, 125 + "confidence": {"type": "number"}, 126 + "subject": { 127 + "type": ["object", "null"], 128 + "additionalProperties": False, 129 + "properties": { 130 + "description": {"type": "string"}, 131 + "expression": {"type": "string"}, 132 + "pose": {"type": "string"}, 133 + }, 134 + "required": ["description", "expression", "pose"], 135 + }, 136 + "environment": { 137 + "type": "object", 138 + "additionalProperties": False, 139 + "properties": { 140 + "location": {"type": "string"}, 141 + "time_of_day": {"type": "string"}, 142 + "lighting": {"type": "string"}, 143 + "background_details": {"type": "string"}, 144 + }, 145 + "required": ["location", "time_of_day", "lighting", "background_details"], 146 + }, 147 + "photography": { 148 + "type": "object", 149 + "additionalProperties": False, 150 + "properties": { 151 + "camera_angle": {"type": "string"}, 152 + "style": {"type": "string"}, 153 + "framing": {"type": "string"}, 154 + }, 155 + "required": ["camera_angle", "style", "framing"], 156 + }, 157 + "domain": {"type": "string"}, 158 + "n_other_people": {"type": "integer"}, 159 + "tags": {"type": "array", "items": {"type": "string"}}, 160 + "caption_hint": {"type": ["string", "null"]}, 161 + }, 162 + "required": [ 163 + "is_jeffrey_confirmed", "confidence", "subject", "environment", 164 + "photography", "domain", "n_other_people", "tags", "caption_hint", 165 + ], 166 + } 167 + 168 + 169 + def load_openai_key() -> str: 170 + if "OPENAI_API_KEY" in os.environ: 171 + return os.environ["OPENAI_API_KEY"] 172 + if VAULT_ENV.exists(): 173 + for line in VAULT_ENV.read_text().splitlines(): 174 + if line.startswith("OPENAI_API_KEY="): 175 + return line.split("=", 1)[1].strip().strip('"').strip("'") 176 + sys.exit("OPENAI_API_KEY not set and not in vault devcontainer.env") 177 + 178 + 179 + def encode_image(path: Path, max_dim: int = 1024) -> str: 180 + img = cv2.imread(str(path)) 181 + if img is None: 182 + raise ValueError(f"could not read {path}") 183 + h, w = img.shape[:2] 184 + if max(h, w) > max_dim: 185 + scale = max_dim / max(h, w) 186 + img = cv2.resize(img, (int(w * scale), int(h * scale))) 187 + ok, buf = cv2.imencode(".jpg", img, [cv2.IMWRITE_JPEG_QUALITY, 88]) 188 + if not ok: 189 + raise ValueError(f"jpeg encode failed for {path}") 190 + return base64.standard_b64encode(buf.tobytes()).decode("ascii") 191 + 192 + 193 + def build_user_content(image_b64: str, ref_b64s: list[str]) -> list[dict]: 194 + content = [{"type": "text", "text": "REFERENCE photos of jeffrey (not the candidate):"}] 195 + for ref_b64 in ref_b64s: 196 + content.append({ 197 + "type": "image_url", 198 + "image_url": {"url": f"data:image/jpeg;base64,{ref_b64}", "detail": "low"}, 199 + }) 200 + content.append({"type": "text", "text": "CANDIDATE image — tag this:"}) 201 + content.append({ 202 + "type": "image_url", 203 + "image_url": {"url": f"data:image/jpeg;base64,{image_b64}", "detail": "high"}, 204 + }) 205 + return content 206 + 207 + 208 + async def describe_one( 209 + client: AsyncOpenAI, 210 + ref_b64s: list[str], 211 + record: dict, 212 + model: str, 213 + sem: asyncio.Semaphore, 214 + ) -> dict: 215 + out: dict = { 216 + "path": record["path"], 217 + "rel_path": 
record["rel_path"], 218 + "date": record.get("date"), 219 + "shortcode": record.get("shortcode"), 220 + "match_similarity": record.get("best_similarity"), 221 + "match_ref": record.get("best_match_ref"), 222 + "described_at": datetime.now(timezone.utc).isoformat(), 223 + "model": model, 224 + "error": None, 225 + } 226 + try: 227 + image_b64 = encode_image(Path(record["path"])) 228 + except Exception as e: 229 + out["error"] = f"encode: {type(e).__name__}: {e}" 230 + return out 231 + 232 + user_content = build_user_content(image_b64, ref_b64s) 233 + 234 + async with sem: 235 + for attempt in range(3): 236 + try: 237 + response = await client.chat.completions.create( 238 + model=model, 239 + messages=[ 240 + {"role": "system", "content": SYSTEM_PROMPT}, 241 + {"role": "user", "content": user_content}, 242 + ], 243 + response_format={ 244 + "type": "json_schema", 245 + "json_schema": { 246 + "name": "scene_graph_tag", 247 + "strict": True, 248 + "schema": OUTPUT_SCHEMA, 249 + }, 250 + }, 251 + max_tokens=900, 252 + ) 253 + parsed = json.loads(response.choices[0].message.content or "{}") 254 + out.update(parsed) 255 + out["_usage"] = { 256 + "prompt_tokens": response.usage.prompt_tokens, 257 + "completion_tokens": response.usage.completion_tokens, 258 + "cached_tokens": ( 259 + getattr(response.usage, "prompt_tokens_details", None) 260 + and response.usage.prompt_tokens_details.cached_tokens 261 + ) or 0, 262 + } 263 + return out 264 + except RateLimitError: 265 + await asyncio.sleep(2 ** attempt + 1) 266 + except APIError as e: 267 + out["error"] = f"api: {type(e).__name__}: {e}" 268 + if attempt < 2: 269 + await asyncio.sleep(2 ** attempt) 270 + continue 271 + return out 272 + except Exception as e: 273 + out["error"] = f"{type(e).__name__}: {e}" 274 + return out 275 + return out 276 + 277 + 278 + async def main_async(args: argparse.Namespace) -> int: 279 + api_key = load_openai_key() 280 + client = AsyncOpenAI(api_key=api_key) 281 + 282 + # Load matches 283 + match_path = Path(args.match_jsonl).expanduser().resolve() 284 + if not match_path.exists(): 285 + sys.exit(f"match jsonl not found: {match_path}") 286 + matches: list[dict] = [] 287 + for line in match_path.read_text().splitlines(): 288 + try: 289 + r = json.loads(line) 290 + except json.JSONDecodeError: 291 + continue 292 + if r.get("error"): 293 + continue 294 + sim = r.get("best_similarity", 0) 295 + if r.get("is_jeffrey") or (args.include_other_faces and r.get("n_faces", 0) > 0): 296 + if sim >= args.min_similarity: 297 + matches.append(r) 298 + matches.sort(key=lambda r: r.get("best_similarity", 0), reverse=True) 299 + print(f"matched candidates: {len(matches)}", file=sys.stderr) 300 + 301 + # Resume 302 + out_path = Path(args.output).expanduser().resolve() 303 + out_path.parent.mkdir(parents=True, exist_ok=True) 304 + already: set[str] = set() 305 + if out_path.exists(): 306 + for line in out_path.read_text().splitlines(): 307 + try: 308 + already.add(json.loads(line)["path"]) 309 + except (json.JSONDecodeError, KeyError): 310 + pass 311 + matches = [r for r in matches if r["path"] not in already] 312 + if args.limit: 313 + matches = matches[: args.limit] 314 + print(f"to describe: {len(matches)} ({len(already)} already done)", file=sys.stderr) 315 + if not matches: 316 + return 0 317 + 318 + # Load reference photos 319 + ref_b64s = [] 320 + for ref in REFERENCE_PHOTOS: 321 + if not ref.exists(): 322 + sys.exit(f"reference photo missing: {ref}") 323 + ref_b64s.append(encode_image(ref, max_dim=512)) 324 + print(f"loaded 
{len(ref_b64s)} reference photos", file=sys.stderr) 325 + 326 + sem = asyncio.Semaphore(args.concurrency) 327 + totals = {"prompt": 0, "completion": 0, "cached": 0, "confirmed": 0, "rejected": 0, "errors": 0} 328 + 329 + with out_path.open("a") as f: 330 + tasks = [describe_one(client, ref_b64s, r, args.model, sem) for r in matches] 331 + for i, coro in enumerate(asyncio.as_completed(tasks), 1): 332 + record = await coro 333 + usage = record.pop("_usage", None) 334 + f.write(json.dumps(record) + "\n") 335 + f.flush() 336 + if usage: 337 + totals["prompt"] += usage["prompt_tokens"] 338 + totals["completion"] += usage["completion_tokens"] 339 + totals["cached"] += usage["cached_tokens"] 340 + if record.get("error"): 341 + totals["errors"] += 1 342 + tag = "✗" 343 + elif record.get("is_jeffrey_confirmed"): 344 + totals["confirmed"] += 1 345 + tag = "✓" 346 + else: 347 + totals["rejected"] += 1 348 + tag = "·" 349 + sim = record.get("match_similarity") or 0 350 + err = f" ERR={record['error'][:60]}" if record.get("error") else "" 351 + domain = record.get("domain", "—") 352 + print( 353 + f"[{i}/{len(matches)}] {tag} sim={sim:.3f} {domain:<22} {record['rel_path'][:35]}{err}", 354 + file=sys.stderr, 355 + ) 356 + 357 + cost_in = totals["prompt"] * 2.50e-6 358 + cost_out = totals["completion"] * 10e-6 359 + cache_savings = totals["cached"] * 1.25e-6 # 50% off cached input tokens 360 + print( 361 + f"\ndone. confirmed={totals['confirmed']} rejected={totals['rejected']} errors={totals['errors']}", 362 + file=sys.stderr, 363 + ) 364 + print( 365 + f"tokens: prompt={totals['prompt']} completion={totals['completion']} cached={totals['cached']}", 366 + file=sys.stderr, 367 + ) 368 + print( 369 + f"approx cost: ${cost_in + cost_out:.2f} (${cost_in:.2f} in + ${cost_out:.2f} out, " 370 + f"~${cache_savings:.2f} saved by cache)", 371 + file=sys.stderr, 372 + ) 373 + return 0 374 + 375 + 376 + def main() -> int: 377 + p = argparse.ArgumentParser() 378 + p.add_argument("--match-jsonl", required=True) 379 + p.add_argument("--output", required=True) 380 + p.add_argument("--min-similarity", type=float, default=0.5) 381 + p.add_argument("--include-other-faces", action="store_true") 382 + p.add_argument("--limit", type=int, default=0) 383 + p.add_argument("--concurrency", type=int, default=4) 384 + p.add_argument("--model", default="gpt-4o") 385 + args = p.parse_args() 386 + return asyncio.run(main_async(args)) 387 + 388 + 389 + if __name__ == "__main__": 390 + sys.exit(main())
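Runs are append-only and resume by `path`, so a quick post-run tally is just a pass over the JSONL (a sketch, assuming `jq` is available):

```sh
# confirmed vs rejected, skipping errored records
jq -r 'select(.error == null)
       | if .is_jeffrey_confirmed then "confirmed" else "rejected" end' \
  "$VAULT/jeffrey-platter/curated/jeffrey-described.jsonl" | sort | uniq -c
```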
+238
portraits/jeffrey/bin/face-match.py
··· 1 + #!/opt/homebrew/Cellar/instaloader/4.15.1_1/libexec/bin/python3 2 + """ 3 + Local face-match: identify which images contain jeffrey, with similarity scores. 4 + 5 + Pipeline: 6 + 1. Build a reference embedding-set from confirmed-jeffrey photos 7 + (the shoot/--01..--10 master-tier headshots by default). 8 + 2. For each input image: detect faces (RetinaFace via insightface), 9 + embed each face (ArcFace), compute cosine similarity to the closest 10 + reference, threshold to a yes/no decision. 11 + 3. Emit one JSONL record per image. 12 + 13 + This is the IDENTITY step. No API calls, no costs. ~50–100 ms per image 14 + on CPU. Run before the optional description layer (face-describe.py), 15 + which calls a vision API only on the matched subset. 16 + 17 + Usage: 18 + python face-match.py \\ 19 + --input "$VAULT/jeffrey-platter/ig-archive/whistlegraph/*.jpg" \\ 20 + --output $VAULT/jeffrey-platter/curated/jeffrey-match.jsonl 21 + 22 + Args: 23 + --input <glob> glob of input image paths. Required. 24 + --output <path> JSONL output. Resumes by skipping already-tagged paths. Required. 25 + --references <glob> glob of reference photos. Default: shoot/--01..--10 masters. 26 + --threshold <float> cosine-similarity threshold for is_jeffrey. Default: 0.5 27 + (insightface default). Lower = more recall, more false positives. 28 + --limit <int> process at most N images (smoke testing). Default: all. 29 + --min-face-size <int> reject faces smaller than this (pixels). Default: 60. 30 + 31 + Output JSONL record: 32 + { 33 + "path": "/abs/path", 34 + "rel_path": "filename.jpg", 35 + "date": "YYYY-MM-DD" or null, 36 + "shortcode": "..." or null, 37 + "n_faces": 2, 38 + "best_similarity": 0.71, 39 + "best_match_ref": "jeffery-av--04.jpg", 40 + "is_jeffrey": true, 41 + "faces": [{"bbox": [x,y,w,h], "similarity": 0.71, "matched_ref": "..."}, ...], 42 + "error": null 43 + } 44 + """ 45 + 46 + from __future__ import annotations 47 + 48 + import argparse 49 + import gc 50 + import json 51 + import re 52 + import sys 53 + from datetime import datetime, timezone 54 + from glob import glob 55 + from pathlib import Path 56 + 57 + import cv2 58 + import numpy as np 59 + from insightface.app import FaceAnalysis 60 + 61 + REPO_ROOT = Path(__file__).resolve().parent.parent.parent.parent 62 + DEFAULT_REFS_GLOB = str(REPO_ROOT / "portraits/jeffrey/corpus/shoot/jeffery-av--0[1-9].jpg") 63 + ALSO_REF_10 = str(REPO_ROOT / "portraits/jeffrey/corpus/shoot/jeffery-av--10.jpg") 64 + 65 + DATE_RE = re.compile(r"(\d{4}-\d{2}-\d{2})_([A-Za-z0-9_-]+?)(?:_\d+)?\.") 66 + 67 + 68 + def parse_filename(name: str) -> tuple[str | None, str | None]: 69 + m = DATE_RE.search(name) 70 + if not m: 71 + return None, None 72 + return m.group(1), m.group(2) 73 + 74 + 75 + def cosine_sim(a: np.ndarray, b: np.ndarray) -> float: 76 + return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))) 77 + 78 + 79 + def build_references(app: FaceAnalysis, ref_paths: list[Path]) -> list[tuple[str, np.ndarray]]: 80 + """Return [(ref_name, embedding), ...] 
for every face found in references.""" 81 + refs: list[tuple[str, np.ndarray]] = [] 82 + for path in ref_paths: 83 + img = cv2.imread(str(path)) 84 + if img is None: 85 + print(f"warn: could not read reference {path}", file=sys.stderr) 86 + continue 87 + faces = app.get(img) 88 + if not faces: 89 + print(f"warn: no face detected in reference {path.name}", file=sys.stderr) 90 + continue 91 + # Pick the largest face (the shoot photos are face-centered, but be safe) 92 + face = max(faces, key=lambda f: (f.bbox[2] - f.bbox[0]) * (f.bbox[3] - f.bbox[1])) 93 + refs.append((path.name, face.normed_embedding)) 94 + if not refs: 95 + sys.exit("no usable reference faces — check reference paths") 96 + return refs 97 + 98 + 99 + def best_match(emb: np.ndarray, refs: list[tuple[str, np.ndarray]]) -> tuple[float, str]: 100 + sims = [(cosine_sim(emb, ref_emb), name) for name, ref_emb in refs] 101 + sims.sort(reverse=True) 102 + return sims[0] 103 + 104 + 105 + def main() -> int: 106 + p = argparse.ArgumentParser(description="Local face-match against jeffrey references.") 107 + p.add_argument("--input", required=True) 108 + p.add_argument("--output", required=True) 109 + p.add_argument("--references", default=None, help="glob; default: shoot/--01..--10") 110 + p.add_argument("--threshold", type=float, default=0.5) 111 + p.add_argument("--limit", type=int, default=0) 112 + p.add_argument("--min-face-size", type=int, default=60) 113 + args = p.parse_args() 114 + 115 + # Resolve refs 116 + if args.references: 117 + ref_paths = [Path(p) for p in sorted(glob(args.references))] 118 + else: 119 + ref_paths = [Path(p) for p in sorted(glob(DEFAULT_REFS_GLOB))] 120 + if Path(ALSO_REF_10).exists(): 121 + ref_paths.append(Path(ALSO_REF_10)) 122 + if not ref_paths: 123 + sys.exit("no reference photos found — run fetch-corpus.mjs --shoot first") 124 + print(f"references: {len(ref_paths)} ({ref_paths[0].name} … {ref_paths[-1].name})", file=sys.stderr) 125 + 126 + # Resolve inputs + resume 127 + inputs = sorted(Path(p) for p in glob(args.input)) 128 + if not inputs: 129 + sys.exit(f"no inputs matched: {args.input}") 130 + out_path = Path(args.output).expanduser().resolve() 131 + out_path.parent.mkdir(parents=True, exist_ok=True) 132 + already: set[str] = set() 133 + if out_path.exists(): 134 + for line in out_path.read_text().splitlines(): 135 + try: 136 + already.add(json.loads(line)["path"]) 137 + except (json.JSONDecodeError, KeyError): 138 + pass 139 + inputs = [p for p in inputs if str(p.resolve()) not in already] 140 + if args.limit: 141 + inputs = inputs[: args.limit] 142 + print(f"to process: {len(inputs)} images ({len(already)} already done)", file=sys.stderr) 143 + if not inputs: 144 + return 0 145 + 146 + # Init insightface — buffalo_l is the default 512-dim ArcFace 147 + print("loading insightface model (buffalo_l)…", file=sys.stderr) 148 + app = FaceAnalysis(name="buffalo_l", providers=["CPUExecutionProvider"]) 149 + app.prepare(ctx_id=0, det_size=(640, 640)) 150 + 151 + refs = build_references(app, ref_paths) 152 + print(f"built {len(refs)} reference embeddings", file=sys.stderr) 153 + 154 + n_jeffrey = 0 155 + n_no_face = 0 156 + n_other_face = 0 157 + n_err = 0 158 + 159 + with out_path.open("a") as f: 160 + for i, image_path in enumerate(inputs, 1): 161 + date, shortcode = parse_filename(image_path.name) 162 + record = { 163 + "path": str(image_path.resolve()), 164 + "rel_path": image_path.name, 165 + "date": date, 166 + "shortcode": shortcode, 167 + "tagged_at": datetime.now(timezone.utc).isoformat(), 
168 + "error": None, 169 + } 170 + try: 171 + img = cv2.imread(str(image_path)) 172 + if img is None: 173 + raise ValueError("could not decode image") 174 + faces = app.get(img) 175 + # Filter tiny faces (often false positives) 176 + faces = [ 177 + fa for fa in faces 178 + if (fa.bbox[2] - fa.bbox[0]) >= args.min_face_size 179 + and (fa.bbox[3] - fa.bbox[1]) >= args.min_face_size 180 + ] 181 + record["n_faces"] = len(faces) 182 + if not faces: 183 + record["best_similarity"] = 0.0 184 + record["best_match_ref"] = None 185 + record["is_jeffrey"] = False 186 + record["faces"] = [] 187 + n_no_face += 1 188 + else: 189 + face_records = [] 190 + for face in faces: 191 + sim, ref_name = best_match(face.normed_embedding, refs) 192 + x1, y1, x2, y2 = face.bbox 193 + face_records.append({ 194 + "bbox": [int(x1), int(y1), int(x2 - x1), int(y2 - y1)], 195 + "similarity": round(sim, 4), 196 + "matched_ref": ref_name, 197 + }) 198 + face_records.sort(key=lambda r: r["similarity"], reverse=True) 199 + best = face_records[0] 200 + record["best_similarity"] = best["similarity"] 201 + record["best_match_ref"] = best["matched_ref"] 202 + record["is_jeffrey"] = best["similarity"] >= args.threshold 203 + record["faces"] = face_records 204 + if record["is_jeffrey"]: 205 + n_jeffrey += 1 206 + else: 207 + n_other_face += 1 208 + except Exception as e: 209 + record["error"] = f"{type(e).__name__}: {e}" 210 + record["is_jeffrey"] = False 211 + n_err += 1 212 + 213 + f.write(json.dumps(record) + "\n") 214 + f.flush() 215 + 216 + if i % 50 == 0: 217 + gc.collect() 218 + 219 + tag = "✓" if record.get("is_jeffrey") else (" " if record.get("n_faces") else "·") 220 + err = f" ERR={record['error']}" if record.get("error") else "" 221 + sim = record.get("best_similarity", 0.0) 222 + if i % 25 == 0 or i == len(inputs) or err: 223 + print( 224 + f"[{i}/{len(inputs)}] {tag} {record['rel_path']} " 225 + f"faces={record.get('n_faces', 0)} sim={sim:.3f}{err}", 226 + file=sys.stderr, 227 + ) 228 + 229 + print( 230 + f"\ndone. jeffrey: {n_jeffrey}, other-face: {n_other_face}, no-face: {n_no_face}, errors: {n_err}", 231 + file=sys.stderr, 232 + ) 233 + print(f"output: {out_path}", file=sys.stderr) 234 + return 0 235 + 236 + 237 + if __name__ == "__main__": 238 + sys.exit(main())
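Before committing to a `--threshold`, it helps to see where the similarity mass sits; a sketch (jq + awk assumed available) that histograms `best_similarity` in 0.05-wide buckets for images where a face was found:

```sh
jq -r 'select(.n_faces > 0) | .best_similarity' \
  "$VAULT/jeffrey-platter/curated/jeffrey-match.jsonl" \
  | awk '{ b = int($1 * 20) / 20; h[b]++ } END { for (b in h) printf "%.2f %d\n", b, h[b] }' \
  | sort -n
```

A visible valley in the histogram is usually a better cut point than the stock 0.5.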
+355
portraits/jeffrey/bin/face-tag.py
··· 1 + #!/opt/homebrew/Cellar/instaloader/4.15.1_1/libexec/bin/python3 2 + """ 3 + Walk a directory of images, identify which contain jeffrey, and tag each with 4 + expression / framing / description metadata via Claude Opus 4.7 vision. 5 + 6 + Pipeline per image: 7 + 1. opencv Haar cascade — does this image have any face? (~5ms, free) 8 + - If no face: emit a face=false record and skip the API call. 9 + 2. Claude Opus 4.7 vision — given reference photos of jeffrey (cached) + 10 + this candidate, return structured JSON: is_jeffrey, confidence, expression, 11 + framing, description, n_other_people. 12 + 3. Append the record to a JSONL output file. 13 + 14 + Reference photos live in the system prompt with cache_control, so subsequent 15 + requests within a 5-minute window pay ~0.1× for the reference tokens. With 16 + a sustained run, cache reads dominate cost. 17 + 18 + Usage: 19 + IG_ARCHIVE=$VAULT/jeffrey-platter/ig-archive/whistlegraph 20 + python face-tag.py --input "$IG_ARCHIVE/*.jpg" \\ 21 + --output ~/jeffreys-tagged.jsonl 22 + 23 + Args: 24 + --input <glob> glob of input image paths (jpg/png/webp). Required. 25 + --output <path> JSONL output file. Resumes by skipping paths already in 26 + this file. Required. 27 + --limit <int> process at most N images (smoke testing). Default: all. 28 + --model <id> Claude model. Default: claude-opus-4-7. Override with 29 + claude-sonnet-4-6 to cut cost ~5×, claude-haiku-4-5 to 30 + cut ~20× (lower fidelity). 31 + --skip-no-face if set, omit no-face records from output (default: keep). 32 + --concurrency <int> parallel API requests. Default: 5. Higher → faster but 33 + more rate-limit pressure. 34 + 35 + Output JSONL record (one per line): 36 + { 37 + "path": "/abs/path/to/image.jpg", 38 + "rel_path": "image.jpg", 39 + "has_face": true, 40 + "is_jeffrey": true, 41 + "confidence": 0.92, 42 + "expression": "soft smile", 43 + "framing": "selfie", 44 + "description": "jeffrey holding a coffee in a kitchen", 45 + "n_other_people": 0, 46 + "model": "claude-opus-4-7", 47 + "tagged_at": "2026-04-29T...", 48 + "error": null 49 + } 50 + """ 51 + 52 + from __future__ import annotations 53 + 54 + import argparse 55 + import asyncio 56 + import base64 57 + import json 58 + import os 59 + import sys 60 + from datetime import datetime, timezone 61 + from glob import glob 62 + from pathlib import Path 63 + 64 + import cv2 65 + from anthropic import APIError, AsyncAnthropic, RateLimitError 66 + 67 + REPO_ROOT = Path(__file__).resolve().parent.parent.parent.parent 68 + SHOOT_DIR = REPO_ROOT / "portraits" / "jeffrey" / "corpus" / "shoot" 69 + VAULT_ENV = REPO_ROOT / "aesthetic-computer-vault" / "lith" / ".env" 70 + 71 + # Reference photos — chosen to span lighting/angle without being too many. 72 + # All from the AV shoot, master tier (--01..--10 are highest quality). 73 + # Keeping it tight (3 refs) for cache size. Add more if accuracy is poor. 74 + REFERENCE_PHOTOS = [ 75 + SHOOT_DIR / "jeffery-av--01.jpg", 76 + SHOOT_DIR / "jeffery-av--04.jpg", 77 + SHOOT_DIR / "jeffery-av--07.jpg", 78 + ] 79 + 80 + SYSTEM_INSTRUCTIONS = """\ 81 + You are identifying whether a specific person — referred to as "jeffrey" — \ 82 + appears in a photograph, and tagging the photo with metadata. 83 + 84 + The reference images that follow are confirmed photos of jeffrey at the time \ 85 + of the AV photoshoot. Use them to recognize his face structure, hair, beard if \ 86 + present, and overall appearance. 
Account for: aging (photos may be from any \ 87 + year between 2014 and 2026), changes in hair length/color, beard variation, \ 88 + glasses on/off, lighting, and angle. 89 + 90 + For each candidate image you receive, return JSON with these fields: 91 + - is_jeffrey: boolean — is jeffrey visible in this image (any portion of him, \ 92 + even partial face, even from behind if recognizable)? 93 + - confidence: number from 0 to 1 — your confidence in is_jeffrey. Use ≥0.8 \ 94 + when the face is clear and matches; 0.5-0.8 when partial or angled; <0.5 when \ 95 + uncertain. If is_jeffrey is false, this is your confidence he is NOT present. 96 + - expression: short phrase describing facial expression (e.g. "soft smile", \ 97 + "laughing", "neutral focus", "serious", "open mouth singing"), or null if no \ 98 + face is visible or jeffrey is not present. 99 + - framing: one of "selfie" (jeffrey only, arm-distance), "portrait" \ 100 + (staged/composed shot of jeffrey), "candid" (unposed jeffrey), "group" \ 101 + (jeffrey with others), "background" (jeffrey not the subject), "no_jeffrey". 102 + - description: 1-2 sentence factual description of what's in the image — \ 103 + setting, activity, what's visible. Include jeffrey's role if present. 104 + - n_other_people: integer count of other people visible (excluding jeffrey). 105 + 106 + Be honest about uncertainty. False positives (tagging non-jeffrey as jeffrey) \ 107 + are worse than false negatives. If the face is too small/blurry/angled to tell, \ 108 + set is_jeffrey=false with low confidence and a description noting the ambiguity.\ 109 + """ 110 + 111 + OUTPUT_SCHEMA = { 112 + "type": "object", 113 + "properties": { 114 + "is_jeffrey": {"type": "boolean"}, 115 + "confidence": {"type": "number"}, 116 + "expression": {"type": ["string", "null"]}, 117 + "framing": { 118 + "type": "string", 119 + "enum": ["selfie", "portrait", "candid", "group", "background", "no_jeffrey"], 120 + }, 121 + "description": {"type": "string"}, 122 + "n_other_people": {"type": "integer"}, 123 + }, 124 + "required": ["is_jeffrey", "confidence", "framing", "description", "n_other_people"], 125 + "additionalProperties": False, 126 + } 127 + 128 + 129 + def load_anthropic_key() -> str: 130 + if "ANTHROPIC_API_KEY" in os.environ: 131 + return os.environ["ANTHROPIC_API_KEY"] 132 + if VAULT_ENV.exists(): 133 + for line in VAULT_ENV.read_text().splitlines(): 134 + if line.startswith("ANTHROPIC_API_KEY="): 135 + return line.split("=", 1)[1].strip().strip('"').strip("'") 136 + sys.exit("ANTHROPIC_API_KEY not set and not found in vault/lith/.env") 137 + 138 + 139 + _face_cascade = None 140 + 141 + 142 + def has_face(image_path: Path) -> bool: 143 + """Cheap local check: does this image plausibly contain at least one face?""" 144 + global _face_cascade 145 + if _face_cascade is None: 146 + cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml" 147 + _face_cascade = cv2.CascadeClassifier(cascade_path) 148 + img = cv2.imread(str(image_path)) 149 + if img is None: 150 + return False 151 + gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY) 152 + faces = _face_cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=4, minSize=(40, 40)) 153 + return len(faces) > 0 154 + 155 + 156 + def encode_image(path: Path, max_dim: int = 1024) -> tuple[str, str]: 157 + """Return (base64_data, media_type) for an image, downsampled to max_dim.""" 158 + img = cv2.imread(str(path)) 159 + if img is None: 160 + raise ValueError(f"could not read {path}") 161 + h, w = img.shape[:2] 162 + 
if max(h, w) > max_dim: 163 + scale = max_dim / max(h, w) 164 + img = cv2.resize(img, (int(w * scale), int(h * scale))) 165 + ok, buf = cv2.imencode(".jpg", img, [cv2.IMWRITE_JPEG_QUALITY, 88]) 166 + if not ok: 167 + raise ValueError(f"jpeg encode failed for {path}") 168 + return base64.standard_b64encode(buf.tobytes()).decode("ascii"), "image/jpeg" 169 + 170 + 171 + def build_system_blocks(reference_paths: list[Path]) -> list[dict]: 172 + """System prompt: text instructions + reference jeffrey photos. Cached.""" 173 + blocks: list[dict] = [{"type": "text", "text": SYSTEM_INSTRUCTIONS}] 174 + for ref in reference_paths: 175 + if not ref.exists(): 176 + sys.exit(f"reference photo missing: {ref} — run fetch-corpus.mjs --shoot first") 177 + b64, mt = encode_image(ref, max_dim=768) 178 + blocks.append({ 179 + "type": "image", 180 + "source": {"type": "base64", "media_type": mt, "data": b64}, 181 + }) 182 + # cache_control on the LAST block of the stable prefix caches everything before it 183 + blocks[-1]["cache_control"] = {"type": "ephemeral"} 184 + return blocks 185 + 186 + 187 + async def tag_image( 188 + client: AsyncAnthropic, 189 + system_blocks: list[dict], 190 + image_path: Path, 191 + model: str, 192 + ) -> dict: 193 + """Send one image to Claude vision; return parsed result dict.""" 194 + b64, mt = encode_image(image_path) 195 + response = await client.messages.create( 196 + model=model, 197 + max_tokens=400, 198 + system=system_blocks, 199 + messages=[{ 200 + "role": "user", 201 + "content": [ 202 + {"type": "image", "source": {"type": "base64", "media_type": mt, "data": b64}}, 203 + {"type": "text", "text": "Tag this photo per the schema."}, 204 + ], 205 + }], 206 + output_config={"format": {"type": "json_schema", "schema": OUTPUT_SCHEMA}}, 207 + ) 208 + text = next((b.text for b in response.content if b.type == "text"), "") 209 + parsed = json.loads(text) 210 + parsed["_usage"] = { 211 + "input_tokens": response.usage.input_tokens, 212 + "output_tokens": response.usage.output_tokens, 213 + "cache_creation_input_tokens": response.usage.cache_creation_input_tokens or 0, 214 + "cache_read_input_tokens": response.usage.cache_read_input_tokens or 0, 215 + } 216 + return parsed 217 + 218 + 219 + async def process_one( 220 + client: AsyncAnthropic, 221 + system_blocks: list[dict], 222 + image_path: Path, 223 + model: str, 224 + sem: asyncio.Semaphore, 225 + ) -> dict: 226 + """Per-image pipeline: face detect → (maybe) Claude → record.""" 227 + record: dict = { 228 + "path": str(image_path.resolve()), 229 + "rel_path": image_path.name, 230 + "tagged_at": datetime.now(timezone.utc).isoformat(), 231 + "model": model, 232 + "error": None, 233 + } 234 + try: 235 + record["has_face"] = has_face(image_path) 236 + except Exception as e: 237 + record["has_face"] = False 238 + record["error"] = f"face_detect: {e}" 239 + return record 240 + 241 + if not record["has_face"]: 242 + record["is_jeffrey"] = False 243 + record["confidence"] = 0.0 244 + record["expression"] = None 245 + record["framing"] = "no_jeffrey" 246 + record["description"] = "no face detected" 247 + record["n_other_people"] = 0 248 + return record 249 + 250 + async with sem: 251 + for attempt in range(3): 252 + try: 253 + tagged = await tag_image(client, system_blocks, image_path, model) 254 + record.update(tagged) 255 + return record 256 + except RateLimitError: 257 + await asyncio.sleep(2 ** attempt + 1) 258 + except APIError as e: 259 + record["error"] = f"api: {type(e).__name__}: {e}" 260 + if attempt < 2: 261 + await 
asyncio.sleep(2 ** attempt) 262 + continue 263 + return record 264 + except Exception as e: 265 + record["error"] = f"{type(e).__name__}: {e}" 266 + return record 267 + return record 268 + 269 + 270 + async def main_async(args: argparse.Namespace) -> int: 271 + api_key = load_anthropic_key() 272 + client = AsyncAnthropic(api_key=api_key) 273 + 274 + paths = sorted(Path(p) for p in glob(args.input)) 275 + if not paths: 276 + sys.exit(f"no inputs matched: {args.input}") 277 + 278 + out_path = Path(args.output).expanduser().resolve() 279 + out_path.parent.mkdir(parents=True, exist_ok=True) 280 + already_tagged: set[str] = set() 281 + if out_path.exists(): 282 + for line in out_path.read_text().splitlines(): 283 + try: 284 + already_tagged.add(json.loads(line)["path"]) 285 + except (json.JSONDecodeError, KeyError): 286 + pass 287 + paths = [p for p in paths if str(p.resolve()) not in already_tagged] 288 + if args.limit: 289 + paths = paths[: args.limit] 290 + 291 + print( 292 + f"to process: {len(paths)} ({len(already_tagged)} already tagged in output)", 293 + file=sys.stderr, 294 + ) 295 + if not paths: 296 + return 0 297 + 298 + system_blocks = build_system_blocks(REFERENCE_PHOTOS) 299 + sem = asyncio.Semaphore(args.concurrency) 300 + 301 + totals = {"input": 0, "output": 0, "cache_create": 0, "cache_read": 0, "jeffrey": 0, "no_face": 0} 302 + with out_path.open("a") as f: 303 + tasks = [process_one(client, system_blocks, p, args.model, sem) for p in paths] 304 + for i, coro in enumerate(asyncio.as_completed(tasks), 1): 305 + record = await coro 306 + usage = record.pop("_usage", None) # pop before writing so _usage never leaks into the JSONL 307 + if args.skip_no_face and not record.get("has_face"): 308 + pass # drop no-face records from output when requested 309 + else: 310 + f.write(json.dumps(record) + "\n") 311 + f.flush() 312 + if usage: 313 + totals["input"] += usage["input_tokens"] 314 + totals["output"] += usage["output_tokens"] 315 + totals["cache_create"] += usage["cache_creation_input_tokens"] 316 + totals["cache_read"] += usage["cache_read_input_tokens"] 317 + if record.get("is_jeffrey"): 318 + totals["jeffrey"] += 1 319 + if not record.get("has_face"): 320 + totals["no_face"] += 1 321 + tag = "✓" if record.get("is_jeffrey") else (" " if record.get("has_face") else "·") 322 + err = f" ERR={record['error']}" if record.get("error") else "" 323 + conf = record.get("confidence", 0) 324 + print( 325 + f"[{i}/{len(paths)}] {tag} {record['rel_path']} c={conf:.2f}{err}", 326 + file=sys.stderr, 327 + ) 328 + 329 + print( 330 + f"\ndone. {totals['jeffrey']} jeffrey / {totals['no_face']} no-face / " 331 + f"{len(paths) - totals['no_face'] - totals['jeffrey']} other-faces", 332 + file=sys.stderr, 333 + ) 334 + print( 335 + f"tokens: input={totals['input']} output={totals['output']} " 336 + f"cache_create={totals['cache_create']} cache_read={totals['cache_read']}", 337 + file=sys.stderr, 338 + ) 339 + return 0 340 + 341 + 342 + def main() -> int: 343 + p = argparse.ArgumentParser(description="Face-tag images via Claude vision.") 344 + p.add_argument("--input", required=True, help="glob of input image paths") 345 + p.add_argument("--output", required=True, help="JSONL output path") 346 + p.add_argument("--limit", type=int, default=0) 347 + p.add_argument("--model", default="claude-opus-4-7") 348 + p.add_argument("--skip-no-face", action="store_true") 349 + p.add_argument("--concurrency", type=int, default=5) 350 + args = p.parse_args() 351 + return asyncio.run(main_async(args)) 352 + 353 + 354 + if __name__ == "__main__": 355 + sys.exit(main())
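Once a run finishes, the JSONL slices directly; e.g. pulling high-confidence selfie paths (a sketch, field names per the record schema in the docstring):

```sh
jq -r 'select(.is_jeffrey and .confidence >= 0.8 and .framing == "selfie") | .path' \
  ~/jeffreys-tagged.jsonl
```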
+79
portraits/jeffrey/bin/fetch-corpus.mjs
··· 1 + #!/usr/bin/env node 2 + 3 + import { readFileSync, mkdirSync, existsSync, statSync, writeFileSync } from "fs"; 4 + import { join } from "path"; 5 + 6 + const HERE = new URL(".", import.meta.url).pathname; 7 + const ROOT = join(HERE, ".."); 8 + const REPO_ROOT = join(ROOT, "..", ".."); 9 + const MANIFEST = join(REPO_ROOT, "papers/jeffrey-platter/manifest.json"); 10 + const CORPUS = join(ROOT, "corpus"); 11 + 12 + const args = new Set(process.argv.slice(2)); 13 + const flags = ["--shoot", "--masters", "--candids"]; 14 + const selected = flags.filter((f) => args.has(f)).map((f) => f.slice(2)); 15 + const buckets = selected.length > 0 ? selected : ["shoot", "masters", "candids"]; 16 + const force = args.has("--force"); 17 + 18 + const manifest = JSON.parse(readFileSync(MANIFEST, "utf8")); 19 + 20 + let fetched = 0, 21 + skipped = 0, 22 + failed = 0; 23 + 24 + for (const bucketName of buckets) { 25 + const bucket = manifest.buckets[bucketName]; 26 + if (!bucket) { 27 + console.error(`unknown bucket: ${bucketName}`); 28 + continue; 29 + } 30 + const dest = join(CORPUS, bucketName); 31 + mkdirSync(dest, { recursive: true }); 32 + 33 + for (const key of Object.keys(bucket.items)) { 34 + const filename = bucket.key_includes_extension ? key : `${key}.jpg`; 35 + const url = bucket.url_pattern.replace("{name}", key); 36 + const localPath = join(dest, filename); 37 + 38 + if (!force && existsSync(localPath) && statSync(localPath).size > 0) { 39 + skipped++; 40 + continue; 41 + } 42 + 43 + process.stdout.write(`fetching ${bucketName}/${filename} ... `); 44 + try { 45 + const res = await fetch(url); 46 + if (!res.ok) { 47 + console.log(`HTTP ${res.status}`); 48 + failed++; 49 + continue; 50 + } 51 + const buf = Buffer.from(await res.arrayBuffer()); 52 + writeFileSync(localPath, buf); 53 + console.log(`${(buf.length / 1024).toFixed(0)} KB`); 54 + fetched++; 55 + } catch (err) { 56 + console.log(`error: ${err.message}`); 57 + failed++; 58 + } 59 + } 60 + } 61 + 62 + const indexPath = join(CORPUS, "index.json"); 63 + const index = { 64 + generated: new Date().toISOString(), 65 + manifest_version: manifest.version, 66 + buckets: Object.fromEntries( 67 + buckets.map((b) => [ 68 + b, 69 + { 70 + count: Object.keys(manifest.buckets[b].items).length, 71 + url_pattern: manifest.buckets[b].url_pattern, 72 + }, 73 + ]), 74 + ), 75 + }; 76 + writeFileSync(indexPath, JSON.stringify(index, null, 2)); 77 + 78 + console.log(`\nfetched: ${fetched}, skipped: ${skipped}, failed: ${failed}`); 79 + console.log(`corpus: ${CORPUS.replace(REPO_ROOT + "/", "")}`);
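The fetcher only reads a handful of manifest fields; the shape it assumes looks roughly like this (keys and values illustrative — `papers/jeffrey-platter/manifest.json` is the source of truth):

```json
{
  "version": 1,
  "buckets": {
    "shoot": {
      "url_pattern": "https://assets.aesthetic.computer/…/{name}.jpg",
      "key_includes_extension": false,
      "items": {
        "jeffery-av--01": { "focal": [50, 38], "aspect": 0.8, "pois": [{ "t": "f", "box": [412, 188, 300, 300] }] }
      }
    }
  }
}
```

`fetch-corpus.mjs` only cares about the keys of `items`, `url_pattern`, and `key_includes_extension`; the POI payload rides along for downstream consumers.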
+262
portraits/jeffrey/bin/generate-neo.py
··· 1 + #!/opt/homebrew/Cellar/instaloader/4.15.1_1/libexec/bin/python3 2 + """ 3 + Generate stylized "neo-jeffrey" portraits using OpenAI gpt-image-1.5 / gpt-image-2, 4 + conditioned on reference photos from the platter for identity. Saves results to ~/Desktop. 5 + 6 + Four variants by default (all photographic, AC palette accents): 7 + neo-jeffrey-editorial, neo-jeffrey-cyberpunk-swords, 8 + neo-jeffrey-cyberpunk, neo-jeffrey-lifestyle 9 + 10 + This is the first hands-on rendering of the "scene-graph + AC brand vocabulary" 11 + direction from the rev2 feasibility doc — a cheap exercise to feel out tone 12 + before committing to gen.mjs / gen.js / Gemini. 13 + """ 14 + 15 + from __future__ import annotations 16 + 17 + import argparse 18 + import base64 19 + import sys 20 + from datetime import datetime 21 + from pathlib import Path 22 + 23 + from openai import OpenAI 24 + 25 + REPO_ROOT = Path(__file__).resolve().parent.parent.parent.parent 26 + VAULT_ENV = REPO_ROOT / "aesthetic-computer-vault" / ".devcontainer" / "envs" / "devcontainer.env" 27 + SHOOT_DIR = REPO_ROOT / "portraits" / "jeffrey" / "corpus" / "shoot" 28 + DESKTOP = Path.home() / "Desktop" 29 + 30 + # Multiple references → stronger identity preservation. The default set is the 31 + # canonical AV-shoot hero shots; --use-selfies adds varied IG-platter selfies 32 + # from across the years (2015..2025) to teach the model his casual look across 33 + # eras. gpt-image-1.5 / 2 accept up to 16 reference images per call. 34 + SHOOT_REFS = [ 35 + SHOOT_DIR / "jeffery-av--07.jpg", 36 + SHOOT_DIR / "jeffery-av--01.jpg", 37 + SHOOT_DIR / "jeffery-av--04.jpg", 38 + ] 39 + ARCHIVE_DIR = REPO_ROOT / "portraits" / "jeffrey" / "ig-archive" / "whistlegraph" 40 + SELFIE_REFS = [ 41 + ARCHIVE_DIR / "2018-12-02_Bq4ckGFFNtW.jpg", 42 + ARCHIVE_DIR / "2020-09-02_CEpxlO2FOvD.jpg", 43 + ARCHIVE_DIR / "2021-07-10_CRI095Vl7AO_1.jpg", 44 + ARCHIVE_DIR / "2025-01-25_DFQ2lHPzN_W.jpg", 45 + ARCHIVE_DIR / "2017-04-10_BStid5yjTHq.jpg", 46 + ] 47 + DEFAULT_REFS = SHOOT_REFS 48 + 49 + VARIANTS = [ 50 + { 51 + "name": "neo-jeffrey-editorial", 52 + "prompt": """\ 53 + Photographic portrait of the man in the reference photo, shot for aesthetic.computer — a creative-coding \ 54 + platform. This is a real photograph, not an illustration. Photo-realistic. 55 + 56 + Identity: same face, same medium-length brown hair, same overall bearing as the reference. Soft warm \ 57 + smile, relaxed eyes, eye-contact with the camera. He is recognizably him. Do NOT youthify, smooth, or \ 58 + prettify the skin — keep real texture, keep his actual features. 59 + 60 + Setting and styling: clean editorial portrait, studio shot. Cream off-white seamless backdrop \ 61 + (#f5f5f5) with a single soft prop or accent in one of these colors: hot pink #ff6b9d, cyan #4ecdc4, \ 62 + gold #ffd93d. He could be wearing a simple tee or sweatshirt in one of those colors. Casual, lived-in, \ 63 + not-glossy. Slight visible film grain ok. 64 + 65 + Lighting: soft natural daylight from one side, gentle fill, warm white balance. The kind of friendly \ 66 + welcome-portrait you'd see on the home page of a beloved indie creative-coding website. 67 + 68 + Framing: medium-close shot, head and shoulders, square 1:1 aspect ratio. Eye-level, slight 3/4 angle. \ 69 + Photographic, NOT illustrated, NOT painted, NOT AI-poster-glossy.\ 70 + """, 71 + }, 72 + { 73 + "name": "neo-jeffrey-cyberpunk-swords", 74 + "prompt": """\ 75 + Photographic cyberpunk-noir portrait of the man in the reference photos. 
Real photograph, \ 76 + photo-realistic, cinematic — NOT illustrated, NOT painted, NOT a video-game render. 77 + 78 + Identity: same face, same medium-length brown hair, recognizable as him across the various \ 79 + references. Intense fierce battle-ready expression — mouth half-open in a guttural ready-yell, \ 80 + eyes hard and locked forward, brow lowered. Defiant, fearsome, NOT joyful, NOT smiling. 81 + 82 + Styling: dark techwear / cyberpunk fashion — black leather or matte tactical jacket with subtle \ 83 + luminescent piping along the seams, high collar, fingerless gloves, combat boots, maybe a worn \ 84 + bandana around the neck. Damp skin catching the neon light. 85 + 86 + Setting: a neon-soaked alley somewhere in 2049 — wet asphalt reflecting hot pink (#ff6b9d) and \ 87 + cyan (#4ecdc4) holographic signage in a foreign script, distant gold (#ffd93d) advertisement glow. \ 88 + Heavy volumetric atmosphere — light rain, mist, deep shadows broken by vivid practical neons. \ 89 + Blade Runner 2049 / Cyberpunk 2077 / Ghost in the Shell mood. Hard rim light from a magenta sign \ 90 + behind him. 91 + 92 + Action and props: he is wielding TWO glowing energy katanas — one in each hand, blades crossed in \ 93 + a defensive ready stance just below face level. The katanas are sci-fi weapons: matte-black hilts \ 94 + with chrome accents, blades made of shimmering plasma in saturated neon colors — the right blade \ 95 + glows hot pink (#ff6b9d), the left blade glows cyan (#4ecdc4). The plasma blades cast colored \ 96 + light onto his face from below, painting half his face pink and half cyan, illuminating his \ 97 + features dramatically. Wisps of glowing plasma trail off the blade edges. 98 + 99 + Mood: deadly cyberpunk samurai energy. Unmistakably scary. The aesthetic.computer absurdist edge \ 100 + appears in the over-the-top neon palette of the blades, not in any cuteness — this version is \ 101 + purely menacing. 102 + 103 + Framing: full-body shot, vertical 1024x1536. Slight low angle from waist height looking up — he \ 104 + fills the frame head to boots in a wide planted stance, with the alley extending behind him. \ 105 + Dramatic, heroic-villain composition. Shallow depth of field, crisp on him, blurred neon bokeh \ 106 + behind. Photographic.\ 107 + """, 108 + }, 109 + { 110 + "name": "neo-jeffrey-cyberpunk", 111 + "prompt": """\ 112 + Photographic cyberpunk-noir portrait of the man in the reference photo. Real photograph, \ 113 + photo-realistic, cinematic — NOT illustrated, NOT painted, NOT a still from a video game. 114 + 115 + Identity: same face, same medium-length brown hair, recognizable as him. Intense fierce expression — \ 116 + mouth open in a guttural, defiant yell, eyes hard and locked on the camera, brow lowered. \ 117 + Menacing energy — he looks like he means it. NOT joyful, NOT smiling. 118 + 119 + Styling: dark techwear / cyberpunk fashion — black leather or matte tactical jacket with subtle \ 120 + luminescent piping, high collar, maybe a worn bandana around the neck. Maybe wet hair. Damp skin \ 121 + catching the neon. Single small chrome accent earring or eyebrow ring. 122 + 123 + Setting: a neon-soaked alley somewhere in 2049 — wet asphalt reflecting hot pink (#ff6b9d) and cyan \ 124 + (#4ecdc4) holographic signage in a foreign script, distant gold (#ffd93d) advertisement. Heavy \ 125 + volumetric atmosphere — light rain, mist, deep shadows broken by vivid practical neons. Blade Runner \ 126 + 2049 / Cyberpunk 2077 / Akira urban-noir mood. 
Hard rim light from a magenta sign behind him. 127 + 128 + Action and prop: he holds up — toward the camera — a CLEARLY-TOY plastic bubble gun. The toy is \ 129 + manifestly a kids' toy: bright translucent neon-pink plastic body, oversized cartoonish trigger, \ 130 + visible see-through bubble mixture reservoir on top, rounded chunky kid-friendly proportions. The \ 131 + absurd pink-translucent plastic toy in his angry tactical cyberpunk grip is the whole joke of the \ 132 + photo. Just three or four lone soap bubbles drift lazily out of the muzzle and float into the rainy \ 133 + neon air — sparse, not a swarm. 134 + 135 + Mood: serious cyberpunk menace, deadly tactical pose, undercut by the manifestly-silly translucent \ 136 + plastic toy and the few soft bubbles. Tonal contradiction. The aesthetic.computer absurdist energy. 137 + 138 + Framing: full-body shot, vertical 1024x1536. Slight low angle from waist height looking up — he \ 139 + fills the frame head to boots, with the alley extending behind him. Wide enough to show the full \ 140 + toy in his hand, his stance (planted, aggressive), his whole outfit, and the neon environment \ 141 + above and around. Photographic, shallow depth of field, gentle motion blur on the bubbles.\ 142 + """, 143 + }, 144 + { 145 + "name": "neo-jeffrey-lifestyle", 146 + "prompt": """\ 147 + Photographic candid lifestyle portrait of the man in the reference photo, shot for aesthetic.computer. \ 148 + This is a real photograph, not an illustration. Photo-realistic. 149 + 150 + Identity: same face, same medium-length brown hair, same actual features as the reference. Soft warm \ 151 + smile, eyes alive, looking gently at or just past the camera. He is recognizably him. Keep real skin \ 152 + texture and his actual face — do NOT smooth or AI-prettify. 153 + 154 + Setting: at home in a relaxed creative workspace — a wooden desk with a laptop and a few sketchbooks \ 155 + or a sketchpad and pencils, a window throwing warm afternoon light, plants. Subtle pops of \ 156 + aesthetic.computer brand color woven into the scene: a hot-pink mug (#ff6b9d), a cyan post-it (#4ecdc4), \ 157 + or a small gold object (#ffd93d). The colors are accents, not dominant. 158 + 159 + Mood: warm, candid, off-guard, approachable. Like a friendly photo a friend would take of him while \ 160 + he's mid-thought. Slight shallow depth of field with gentle bokeh. 161 + 162 + Framing: medium shot waist-up, square 1:1, slight 3/4 angle. Photographic, NOT illustrated, NOT \ 163 + painted, NOT corporate-headshot, NOT AI-glossy.\ 164 + """, 165 + }, 166 + ] 167 + 168 + 169 + def load_openai_key() -> str: 170 + import os 171 + if "OPENAI_API_KEY" in os.environ: 172 + return os.environ["OPENAI_API_KEY"] 173 + if VAULT_ENV.exists(): 174 + for line in VAULT_ENV.read_text().splitlines(): 175 + if line.startswith("OPENAI_API_KEY="): 176 + return line.split("=", 1)[1].strip().strip('"').strip("'") 177 + sys.exit("OPENAI_API_KEY not set and not in vault devcontainer.env") 178 + 179 + 180 + def main() -> int: 181 + p = argparse.ArgumentParser() 182 + p.add_argument("--refs", nargs="+", default=None, 183 + help="reference photo(s); multiple = stronger identity. 
" 184 + "Default: SHOOT_REFS (3 staged headshots).") 185 + p.add_argument("--use-selfies", action="store_true", 186 + help="also include 5 IG-platter selfies as refs (2015..2025) " 187 + "for varied identity grounding across eras.") 188 + p.add_argument("--out", default=str(DESKTOP), help="output directory") 189 + p.add_argument("--size", default="1024x1536", 190 + help="1024x1024 / 1024x1536 (portrait) / 1536x1024 (landscape) / auto") 191 + p.add_argument("--quality", default="high", choices=["low", "medium", "high", "auto"]) 192 + p.add_argument("--input-fidelity", default="high", choices=["low", "high"], 193 + help="how closely to preserve identity from refs; high = stronger") 194 + p.add_argument("--variant", choices=[v["name"] for v in VARIANTS] + ["all"], default="all") 195 + p.add_argument("--n", type=int, default=1, help="variants per prompt") 196 + p.add_argument("--model", default="gpt-image-2", 197 + help="gpt-image-2 (default, latest) | gpt-image-1.5 | gpt-image-1") 198 + args = p.parse_args() 199 + 200 + if args.refs: 201 + refs = [Path(r).expanduser().resolve() for r in args.refs] 202 + else: 203 + refs = list(SHOOT_REFS) 204 + if args.use_selfies: 205 + refs.extend(SELFIE_REFS) 206 + refs = [Path(r).expanduser().resolve() for r in refs] 207 + for r in refs: 208 + if not r.exists(): 209 + sys.exit(f"reference not found: {r}") 210 + out_dir = Path(args.out).expanduser().resolve() 211 + out_dir.mkdir(parents=True, exist_ok=True) 212 + 213 + selected = VARIANTS if args.variant == "all" else [v for v in VARIANTS if v["name"] == args.variant] 214 + timestamp = datetime.now().strftime("%Y-%m-%d_%H%M%S") 215 + 216 + client = OpenAI(api_key=load_openai_key()) 217 + print(f"references ({len(refs)}): {', '.join(r.name for r in refs)}", file=sys.stderr) 218 + print(f"out: {out_dir}", file=sys.stderr) 219 + print(f"settings: size={args.size} quality={args.quality} fidelity={args.input_fidelity} n={args.n}", 220 + file=sys.stderr) 221 + 222 + for variant in selected: 223 + print(f"\ngenerating {variant['name']}…", file=sys.stderr) 224 + try: 225 + files = [open(r, "rb") for r in refs] 226 + try: 227 + response = client.images.edit( 228 + model=args.model, 229 + image=files, 230 + prompt=variant["prompt"], 231 + size=args.size, 232 + quality=args.quality, 233 + input_fidelity=args.input_fidelity, 234 + n=args.n, 235 + ) 236 + finally: 237 + for f in files: 238 + f.close() 239 + 240 + for i, item in enumerate(response.data): 241 + if not item.b64_json: 242 + print(f" ERROR: variant {i} no image returned", file=sys.stderr) 243 + continue 244 + suffix = f"_{i+1}" if args.n > 1 else "" 245 + out_name = f"{variant['name']}{suffix}_{timestamp}.png" 246 + out_path = out_dir / out_name 247 + out_path.write_bytes(base64.b64decode(item.b64_json)) 248 + print(f" → {out_path}", file=sys.stderr) 249 + 250 + usage = getattr(response, "usage", None) 251 + if usage: 252 + ip = getattr(usage, "input_tokens", 0) 253 + op = getattr(usage, "output_tokens", 0) 254 + print(f" tokens: input={ip} output={op}", file=sys.stderr) 255 + except Exception as e: 256 + print(f" ERROR: {type(e).__name__}: {e}", file=sys.stderr) 257 + 258 + return 0 259 + 260 + 261 + if __name__ == "__main__": 262 + sys.exit(main())
+54
portraits/jeffrey/bin/ig-archive.fish
··· 1 + #!/usr/bin/env fish 2 + # Bulk-archive an Instagram account into portraits/jeffrey/ig-archive/. 3 + # Requires a session file (from ig-import-cookies.py or ig-login.py) to exist. 4 + # 5 + # Usage: 6 + # ./bin/ig-archive.fish whistlegraph 7 + # ./bin/ig-archive.fish aesthetic.computer 8 + # 9 + # Sessions live at portraits/jeffrey/sessions/<account>. 10 + # Archive lands at portraits/jeffrey/ig-archive/<account>/. 11 + 12 + set username $argv[1] 13 + if test -z "$username" 14 + echo "usage: ig-archive.fish <username>" 15 + exit 64 16 + end 17 + 18 + set repo_root (realpath (status dirname)/../../..) 19 + set session_file "$repo_root/portraits/jeffrey/sessions/$username" 20 + # Instaloader's --dirname-pattern={profile} creates the per-user subdir; root 21 + # above it so we don't end up with .../whistlegraph/whistlegraph/. 22 + set out_dir "$repo_root/portraits/jeffrey/ig-archive" 23 + 24 + if not test -f "$session_file" 25 + echo "session not found: $session_file" 26 + echo "bootstrap one of:" 27 + echo " bin/ig-import-cookies.py chrome $username # safer for 2FA accounts" 28 + echo " IG_PASSWORD='...' bin/ig-login.py $username # password flow" 29 + exit 1 30 + end 31 + 32 + mkdir -p "$out_dir" 33 + cd "$out_dir" 34 + 35 + # --no-profile-pic: skip avatar (we already have headshots in shoot/) 36 + # --no-captions: skip captions txt; metadata is in the .json 37 + # (no resume flag needed: --fast-update tracks state between runs) 38 + # --fast-update: skip files we already have (de-dupe between runs) 39 + # --post-metadata-txt '': suppress per-post txt files 40 + # (add --geotags to the call below to also pull geotag JSON: tiny; useful provenance) 41 + instaloader \ 42 + --login=$username \ 43 + --sessionfile=$session_file \ 44 + --dirname-pattern={profile} \ 45 + --filename-pattern={date_utc:%Y-%m-%d}_{shortcode} \ 46 + --fast-update \ 47 + --no-captions \ 48 + --no-profile-pic \ 49 + --post-metadata-txt='' \ 50 + profile $username 51 + 52 + echo "" 53 + echo "archive: $out_dir/$username" 54 + echo "image count: "(ls $out_dir/$username/*.jpg 2>/dev/null | wc -l)
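The same pull can also be driven from Python when a script needs it, via instaloader's library API rather than the CLI wrapper; a sketch assuming the session file from ig-import-cookies.py already exists and using a placeholder account:

```python
# Sketch: the ig-archive.fish pull through instaloader's Python API.
# Assumes sessions/<username> exists; username is a placeholder.
import instaloader

username = "whistlegraph"
L = instaloader.Instaloader(
    dirname_pattern="{profile}",                        # per-user subdir
    filename_pattern="{date_utc:%Y-%m-%d}_{shortcode}",
    post_metadata_txt_pattern="",                       # no per-post txt files
    download_comments=False,
)
L.load_session_from_file(username, filename=f"sessions/{username}")
profile = instaloader.Profile.from_username(L.context, username)
# profile_pic=False and fast_update=True mirror the wrapper's flags.
L.download_profiles([profile], profile_pic=False, fast_update=True)
```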
+105
portraits/jeffrey/bin/ig-import-cookies.py
··· 1 + #!/opt/homebrew/Cellar/instaloader/4.15.1_1/libexec/bin/python3 2 + """ 3 + Import an Instagram session from a logged-in browser into an instaloader 4 + session file. Works around password-flow login bugs on accounts where 5 + 2FA / soft-locks silently strip the auth cookie. 6 + 7 + Usage: 8 + ig-import-cookies.py <browser> <username> 9 + 10 + <browser>: chrome | firefox | safari | brave | edge | arc | chromium | opera 11 + <username>: instagram username (used to name the session file) 12 + 13 + Requires browser-cookie3. On macOS, Chromium-family browsers will prompt for 14 + Keychain access on first run — click "Always Allow" so subsequent runs are 15 + silent. Firefox cookies are unencrypted so no prompt. 16 + 17 + The script: 18 + 1. Pulls instagram.com cookies from the named browser via browser-cookie3. 19 + 2. Calls instaloader.test_login() to confirm a real session is present. 20 + 3. Saves an instaloader session file at 21 + portraits/jeffrey/sessions/<username>. 22 + """ 23 + 24 + from __future__ import annotations 25 + 26 + import sys 27 + from pathlib import Path 28 + 29 + import browser_cookie3 30 + import instaloader 31 + 32 + REPO_ROOT = Path(__file__).resolve().parent.parent.parent.parent 33 + SESSION_DIR = REPO_ROOT / "portraits" / "jeffrey" / "sessions" 34 + 35 + BROWSERS = { 36 + "chrome": browser_cookie3.chrome, 37 + "firefox": browser_cookie3.firefox, 38 + "safari": browser_cookie3.safari, 39 + "brave": browser_cookie3.brave, 40 + "edge": browser_cookie3.edge, 41 + "arc": browser_cookie3.arc, 42 + "chromium": browser_cookie3.chromium, 43 + "opera": browser_cookie3.opera, 44 + } 45 + 46 + 47 + def main() -> int: 48 + if len(sys.argv) != 3: 49 + print(__doc__.strip(), file=sys.stderr) 50 + return 64 51 + 52 + browser_name, username = sys.argv[1].lower(), sys.argv[2] 53 + if browser_name not in BROWSERS: 54 + print( 55 + f"unknown browser: {browser_name}. Supported: {', '.join(BROWSERS)}", 56 + file=sys.stderr, 57 + ) 58 + return 64 59 + 60 + print(f"reading instagram.com cookies from {browser_name}…", file=sys.stderr) 61 + try: 62 + cookies = BROWSERS[browser_name](domain_name="instagram.com") 63 + except browser_cookie3.BrowserCookieError as e: 64 + print(f"failed to read {browser_name} cookies: {e}", file=sys.stderr) 65 + return 1 66 + 67 + cookie_dict = {c.name: c.value for c in cookies} 68 + if "sessionid" not in cookie_dict or not cookie_dict["sessionid"]: 69 + print( 70 + "no sessionid cookie found — make sure you're logged in to " 71 + f"instagram.com in {browser_name}.", 72 + file=sys.stderr, 73 + ) 74 + return 2 75 + 76 + L = instaloader.Instaloader(max_connection_attempts=1) 77 + L.context._session.cookies.update(cookie_dict) 78 + 79 + detected = L.test_login() 80 + if not detected: 81 + print( 82 + "test_login() failed — cookies are present but Instagram doesn't " 83 + "recognize them. Refresh the IG tab and retry.", 84 + file=sys.stderr, 85 + ) 86 + return 3 87 + if detected.lower() != username.lower(): 88 + print( 89 + f"warning: browser is logged in as @{detected}, not @{username}. 
" 90 + f"Saving session for @{detected} instead.", 91 + file=sys.stderr, 92 + ) 93 + username = detected 94 + 95 + L.context.username = username 96 + SESSION_DIR.mkdir(parents=True, exist_ok=True) 97 + session_path = SESSION_DIR / username 98 + L.save_session_to_file(str(session_path)) 99 + print(f"session saved: {session_path.relative_to(REPO_ROOT)}") 100 + print(f"authenticated as @{username}") 101 + return 0 102 + 103 + 104 + if __name__ == "__main__": 105 + sys.exit(main())
+103
portraits/jeffrey/bin/ig-index.mjs
··· 1 + #!/usr/bin/env node 2 + // Scan a completed instaloader archive and emit a per-account index summarizing 3 + // what's there: post count, date range, media counts (image vs video), total size, 4 + // and a flat list of shortcodes with date + media filenames. 5 + // 6 + // Usage: 7 + // node bin/ig-index.mjs whistlegraph 8 + // 9 + // Reads: portraits/jeffrey/ig-archive/<user>/ 10 + // Writes: portraits/jeffrey/ig-archive/<user>-index.json 11 + 12 + import { readdirSync, statSync, writeFileSync } from "fs"; 13 + import { join } from "path"; 14 + 15 + const HERE = new URL(".", import.meta.url).pathname; 16 + const REPO_ROOT = join(HERE, "..", "..", ".."); 17 + 18 + const username = process.argv[2]; 19 + if (!username) { 20 + console.error("usage: ig-index.mjs <username>"); 21 + process.exit(64); 22 + } 23 + 24 + const archiveDir = join( 25 + REPO_ROOT, 26 + "portraits/jeffrey/ig-archive", 27 + username, 28 + ); 29 + const indexPath = join( 30 + REPO_ROOT, 31 + "portraits/jeffrey/ig-archive", 32 + `${username}-index.json`, 33 + ); 34 + 35 + const files = readdirSync(archiveDir); 36 + const byShortcode = new Map(); 37 + 38 + for (const file of files) { 39 + // Filenames look like: 2020-12-04_CIXgfX8FdhI.jpg 40 + // or: 2020-12-04_CIXgfX8FdhI_2.jpg (carousel) 41 + // or: 2020-12-04_CIXgfX8FdhI.json.xz (metadata) 42 + // or: 2020-12-04_CIXgfX8FdhI.mp4 (video) 43 + const m = file.match(/^(\d{4}-\d{2}-\d{2})_([A-Za-z0-9_-]+?)(?:_\d+)?\.(jpg|mp4|json\.xz)$/); 44 + if (!m) continue; 45 + const [, date, shortcode, ext] = m; 46 + if (!byShortcode.has(shortcode)) { 47 + byShortcode.set(shortcode, { 48 + date, 49 + images: [], 50 + videos: [], 51 + has_metadata: false, 52 + bytes: 0, 53 + }); 54 + } 55 + const entry = byShortcode.get(shortcode); 56 + const fullPath = join(archiveDir, file); 57 + const size = statSync(fullPath).size; 58 + entry.bytes += size; 59 + if (ext === "jpg") entry.images.push(file); 60 + else if (ext === "mp4") entry.videos.push(file); 61 + else if (ext === "json.xz") entry.has_metadata = true; 62 + } 63 + 64 + const posts = [...byShortcode.entries()] 65 + .map(([shortcode, e]) => ({ shortcode, ...e })) 66 + .sort((a, b) => a.date.localeCompare(b.date)); 67 + 68 + const totalBytes = posts.reduce((s, p) => s + p.bytes, 0); 69 + const dates = posts.map((p) => p.date); 70 + const imageCount = posts.reduce((s, p) => s + p.images.length, 0); 71 + const videoCount = posts.reduce((s, p) => s + p.videos.length, 0); 72 + const carouselCount = posts.filter((p) => p.images.length + p.videos.length > 1).length; 73 + 74 + const index = { 75 + generated: new Date().toISOString(), 76 + username, 77 + archive_dir: archiveDir.replace(REPO_ROOT + "/", ""), 78 + post_count: posts.length, 79 + date_range: { first: dates[0] || null, last: dates[dates.length - 1] || null }, 80 + media: { 81 + images: imageCount, 82 + videos: videoCount, 83 + carousels: carouselCount, 84 + }, 85 + total_bytes: totalBytes, 86 + total_size_human: humanSize(totalBytes), 87 + posts, 88 + }; 89 + 90 + writeFileSync(indexPath, JSON.stringify(index, null, 2)); 91 + 92 + console.log(`wrote ${indexPath.replace(REPO_ROOT + "/", "")}`); 93 + console.log( 94 + `${posts.length} posts (${imageCount} images, ${videoCount} videos, ${carouselCount} carousels)`, 95 + ); 96 + console.log(`${dates[0] ?? "?"} → ${dates[dates.length - 1] ?? 
"?"}, ${humanSize(totalBytes)}`); 97 + 98 + function humanSize(bytes) { 99 + if (bytes >= 1e9) return (bytes / 1e9).toFixed(2) + " GB"; 100 + if (bytes >= 1e6) return (bytes / 1e6).toFixed(2) + " MB"; 101 + if (bytes >= 1e3) return (bytes / 1e3).toFixed(2) + " KB"; 102 + return bytes + " B"; 103 + }
+83
portraits/jeffrey/bin/ig-login.py
··· 1 + #!/opt/homebrew/Cellar/instaloader/4.15.1_1/libexec/bin/python3 2 + """ 3 + Login to Instagram via instaloader and save a session for re-use. 4 + 5 + Usage: 6 + IG_PASSWORD='...' python3 ig-login.py whistlegraph 7 + 8 + Reads the password from $IG_PASSWORD (never from argv, so it doesn't 9 + land in process listings or shell history). Saves the session cookie to 10 + portraits/jeffrey/sessions/<username>. 11 + 12 + If 2FA is required, prints a 2FA-needed marker and exits 2 — re-run 13 + interactively (drop the IG_PASSWORD env var) so instaloader can prompt 14 + for the code. 15 + """ 16 + 17 + from __future__ import annotations 18 + 19 + import os 20 + import sys 21 + from pathlib import Path 22 + 23 + import instaloader 24 + from instaloader.exceptions import ( 25 + BadCredentialsException, 26 + ConnectionException, 27 + TwoFactorAuthRequiredException, 28 + ) 29 + 30 + REPO_ROOT = Path(__file__).resolve().parent.parent.parent.parent 31 + SESSION_DIR = REPO_ROOT / "portraits" / "jeffrey" / "sessions" 32 + 33 + 34 + def main() -> int: 35 + if len(sys.argv) != 2: 36 + print("usage: ig-login.py <username>", file=sys.stderr) 37 + return 64 38 + 39 + username = sys.argv[1] 40 + password = os.environ.get("IG_PASSWORD") 41 + if not password: 42 + print("error: IG_PASSWORD env var is required", file=sys.stderr) 43 + return 64 44 + 45 + SESSION_DIR.mkdir(parents=True, exist_ok=True) 46 + session_path = SESSION_DIR / username 47 + 48 + L = instaloader.Instaloader( 49 + save_metadata=False, 50 + download_pictures=False, 51 + download_videos=False, 52 + download_video_thumbnails=False, 53 + download_geotags=False, 54 + download_comments=False, 55 + ) 56 + 57 + try: 58 + L.login(username, password) 59 + except TwoFactorAuthRequiredException: 60 + print("2fa_required", file=sys.stderr) 61 + return 2 62 + except BadCredentialsException as e: 63 + print(f"bad_credentials: {e}", file=sys.stderr) 64 + return 3 65 + except ConnectionException as e: 66 + print(f"connection_error: {e}", file=sys.stderr) 67 + return 4 68 + 69 + L.save_session_to_file(str(session_path)) 70 + print(f"session saved: {session_path.relative_to(REPO_ROOT)}") 71 + 72 + profile = instaloader.Profile.from_username(L.context, username) 73 + print( 74 + f"logged in as @{profile.username}: " 75 + f"{profile.full_name!r}, " 76 + f"{profile.mediacount} posts, " 77 + f"{profile.followers} followers" 78 + ) 79 + return 0 80 + 81 + 82 + if __name__ == "__main__": 83 + sys.exit(main())
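For the interactive re-run the exit-2 marker points at, instaloader's two-factor flow picks up where `login()` raised; a sketch, assuming `Instaloader.two_factor_login` (present in instaloader 4.x) and a placeholder account:

```python
# Sketch: finish the 2FA path interactively after login() raises.
import os

import instaloader
from instaloader.exceptions import TwoFactorAuthRequiredException

username = "whistlegraph"  # placeholder
L = instaloader.Instaloader()
try:
    L.login(username, os.environ["IG_PASSWORD"])
except TwoFactorAuthRequiredException:
    code = input("2FA code: ").strip()
    L.two_factor_login(code)  # completes the pending login
L.save_session_to_file(f"sessions/{username}")
```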
+62
reports/instagram-api-migration-2026-03-29.md
··· 71 71 ``` 72 72 73 73 This replaces the direct `instagram-private-api` npm dependency with a more robust, actively maintained Python backend. 74 + 75 + --- 76 + 77 + ## Update 2026-04-28 — unblock prep 78 + 79 + Re-engaging this migration now that the jeffrey-platter is consuming silo/IG output. Status of the original next-steps list: 80 + 81 + 1. **Verify password** — *not done* — password hasn't been re-tried since the 2026-03-29 attempts. **User action required**: log into instagram.com as `@whistlegraph` and confirm the password in MongoDB `secrets` (`_id: "instagram"`) actually authenticates without challenge. 82 + 2. **Try interactive login** — *not done* — `instagram-cli auth login -u` on silo over SSH is the cleanest path to bootstrap a session that instagrapi can later restore from cookies. **User action required**: SSH to silo and run it after step 1 confirms the password. 83 + 3. **instagrapi migration** — *not done* — silo/server.mjs still imports `IgApiClient` from `instagram-private-api`. The HTTP-shim refactor is gated on a successful login (don't refactor speculative code paths). 84 + 4. **Systemd service** — *staged* — see [silo/instagrapi.service](../silo/instagrapi.service). Loopback-bound (127.0.0.1:8000); deploy with: 85 + ```bash 86 + scp silo/instagrapi.service silo:/etc/systemd/system/instagrapi.service 87 + ssh silo "systemctl daemon-reload && systemctl enable --now instagrapi" 88 + ssh silo "systemctl status instagrapi" 89 + ``` 90 + 5. **Proxy fallback** — *not done* — only relevant if step 1 fails on residential IP too. Skip until needed. 91 + 92 + ### Runbook for the user 93 + 94 + ```text 95 + 1. Confirm `@whistlegraph` password works at https://www.instagram.com/ 96 + → If yes: continue. 97 + → If no: reset password, update MongoDB `secrets._id="instagram"`. 98 + 99 + 2. Deploy instagrapi.service: 100 + scp silo/instagrapi.service silo:/etc/systemd/system/ 101 + ssh silo "systemctl daemon-reload && systemctl enable --now instagrapi" 102 + ssh silo "systemctl status instagrapi" 103 + ssh silo "curl -s http://127.0.0.1:8000/docs | head" # sanity 104 + 105 + 3. Bootstrap a session interactively (silo terminal): 106 + ssh silo 107 + cd /opt/instagrapi-rest && source venv/bin/activate 108 + python -c " 109 + from instagrapi import Client 110 + cl = Client() 111 + cl.login('whistlegraph', '<password>') 112 + cl.dump_settings('/opt/instagrapi-rest/sessions/whistlegraph.json') 113 + print(cl.user_info_by_username('whistlegraph').full_name) 114 + " 115 + → solve any 2FA / email challenge prompts inline. 116 + 117 + 4. Once a session JSON exists, the silo HTTP-shim refactor can land 118 + (replace IgApiClient calls in silo/server.mjs with fetches to 119 + http://127.0.0.1:8000/...). That's a follow-up task; not staged 120 + here to avoid speculative code that can't be tested without step 3. 121 + 122 + 5. After silo refactor lands, jeffrey-platter §4 unblocks: bulk-pull 123 + @whistlegraph and @aesthetic.computer media into 124 + aesthetic-computer-vault/jeffrey-platter/ig-archive/ for canonical 125 + image generation. 126 + ``` 127 + 128 + ### Why no silo/server.mjs PR yet 129 + 130 + The `IgApiClient` → instagrapi-rest swap touches ~200 lines (login, challenge, 131 + 2FA, profile, feed, session restore). Doing it before login works means 132 + landing untestable code that may need to change once we see what 133 + instagrapi-rest's actual error shapes look like under failure. Step 3 134 + above gives us a working baseline; the refactor can then be shaped to 135 + match it.
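One follow-up worth pinning down now: after step 3's dump, subsequent runs should restore the session JSON rather than re-login cold, which is what keeps the challenge rate down. A sketch of the usual instagrapi pattern, with the runbook's paths kept as placeholders:

```python
# Sketch: reuse the dumped instagrapi session on later runs.
# load_settings() primes the client; login() is then a near no-op
# while the stored session stays valid.
from instagrapi import Client

cl = Client()
cl.load_settings("/opt/instagrapi-rest/sessions/whistlegraph.json")
cl.login("whistlegraph", "<password>")
print(cl.user_info_by_username("whistlegraph").full_name)
```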
+22
silo/instagrapi.service
··· 1 + [Unit] 2 + Description=instagrapi-rest (Instagram Private API HTTP shim for silo) 3 + After=network.target 4 + 5 + [Service] 6 + Type=simple 7 + User=root 8 + Group=root 9 + WorkingDirectory=/opt/instagrapi-rest 10 + ## Bind to localhost only — silo/server.mjs is the sole consumer and reaches 11 + ## it over the loopback interface. Public exposure would let anyone with 12 + ## silo's IP scrape via the API. 13 + ExecStart=/opt/instagrapi-rest/venv/bin/uvicorn main:app --host 127.0.0.1 --port 8000 14 + Restart=on-failure 15 + RestartSec=5 16 + StandardOutput=journal 17 + StandardError=journal 18 + 19 + LimitNOFILE=65536 20 + 21 + [Install] 22 + WantedBy=multi-user.target
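Since the unit binds to loopback only, the post-deploy sanity check has to run on silo itself; a sketch of the probe the runbook does with curl, in Python:

```python
# Sketch: loopback health probe for instagrapi-rest (run on silo, since
# the service is deliberately not reachable from outside).
import urllib.request

with urllib.request.urlopen("http://127.0.0.1:8000/docs", timeout=5) as resp:
    print(resp.status)  # 200 → uvicorn is up and serving
```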
+35 -66
system/public/give.aesthetic.computer/index.html
··· 7725 7725 }); 7726 7726 7727 7727 // ====== Jeffreys Ken Burns Canvas Slideshow ====== 7728 - // POI (Points of Interest) data: faces, bodies, hands 7729 - // Detected using OpenCV DNN + Haar cascades 7730 - // POI types: 'f'=face, 'b'=body, 'h'=hand 7731 - 7732 - // Headshots - professional AV shoot photos (face-focused) 7733 - // URL pattern: https://assets.aesthetic.computer/jeffreys/shoot/{filename} 7734 - // Reduced set for less frequent appearance 7735 - const headshotsData = { 7736 - 'jeffery-av--05.jpg': { focal: [50, 35], pois: [{"t": "f", "box": [30, 15, 40, 40]}], aspect: 0.667, src: 'headshots' }, 7737 - 'jeffery-av--12.jpg': { focal: [50, 35], pois: [{"t": "f", "box": [30, 15, 40, 40]}], aspect: 0.667, src: 'headshots' }, 7738 - 'jeffery-av--20.jpg': { focal: [50, 35], pois: [{"t": "f", "box": [30, 15, 40, 40]}], aspect: 0.667, src: 'headshots' }, 7739 - 'jeffery-av--28.jpg': { focal: [50, 35], pois: [{"t": "f", "box": [30, 15, 40, 40]}], aspect: 0.667, src: 'headshots' }, 7740 - 'jeffery-av--35.jpg': { focal: [50, 35], pois: [{"t": "f", "box": [30, 15, 40, 40]}], aspect: 0.667, src: 'headshots' }, 7741 - 'jeffery-av--42.jpg': { focal: [50, 35], pois: [{"t": "f", "box": [30, 15, 40, 40]}], aspect: 0.667, src: 'headshots' }, 7742 - 'jeffery-av--48.jpg': { focal: [50, 35], pois: [{"t": "f", "box": [30, 15, 40, 40]}], aspect: 0.667, src: 'headshots' }, 7743 - 'jeffery-av--55.jpg': { focal: [50, 35], pois: [{"t": "f", "box": [30, 15, 40, 40]}], aspect: 0.667, src: 'headshots' }, 7744 - }; 7745 - 7746 - const jeffreysData = { 7747 - 'FullSizeRender': { focal: [38.5, 34.8], pois: [{"t": "f", "box": [33.8, 30.0, 9.5, 9.7]}, {"t": "b", "box": [59.5, 63.9, 20.2, 12.4]}, {"t": "b", "box": [87.6, 89.3, 8.4, 5.2]}], aspect: 0.75 }, 7748 - 'IMG_0260': { focal: [41.0, 38.1], pois: [{"t": "f", "box": [32.3, 29.3, 17.4, 17.5]}, {"t": "b", "box": [15.9, 50.0, 5.9, 3.6]}], aspect: 0.75 }, 7749 - 'IMG_0675': { focal: [54.0, 37.2], pois: [{"t": "f", "box": [47.8, 31.8, 12.3, 10.9]}], aspect: 0.667 }, 7750 - 'IMG_0686': { focal: [57.4, 25.4], pois: [{"t": "b", "box": [53.6, 23.1, 7.5, 4.6]}, {"t": "b", "box": [54.1, 47.2, 6.9, 4.2]}], aspect: 0.75 }, 7751 - 'IMG_0688': { focal: [72.3, 63.0], pois: [{"t": "b", "box": [69.1, 59.3, 6.5, 7.4]}, {"t": "b", "box": [39.4, 46.8, 4.1, 4.6]}], aspect: 0.563 }, 7752 - 'IMG_0798': { focal: [46.6, 6.4], pois: [{"t": "b", "box": [44.8, 5.3, 3.7, 2.3]}], aspect: 0.75 }, 7753 - 'IMG_1111': { focal: [50, 50], pois: [], aspect: 0.562 }, 7754 - 'IMG_1577': { focal: [37.0, 70.6], pois: [{"t": "b", "box": [35.5, 69.7, 2.8, 1.7]}], aspect: 0.75 }, 7755 - 'IMG_1616': { focal: [47.8, 51.7], pois: [{"t": "f", "box": [42.5, 45.2, 10.7, 13.0]}, {"t": "b", "box": [14.1, 68.0, 12.4, 7.6]}], aspect: 0.75 }, 7756 - 'IMG_1737': { focal: [50, 50], pois: [], aspect: 0.75 }, 7757 - 'IMG_1809': { focal: [48.0, 53.2], pois: [{"t": "b", "box": [45.0, 51.4, 6.1, 3.7]}], aspect: 0.75 }, 7758 - 'IMG_2124': { focal: [54.8, 35.8], pois: [{"t": "f", "box": [49.7, 31.8, 10.1, 7.9]}, {"t": "b", "box": [3.1, 48.8, 32.0, 14.8]}], aspect: 0.563 }, 7759 - 'IMG_2208': { focal: [40.0, 21.8], pois: [{"t": "f", "box": [33.6, 14.5, 12.8, 14.6]}, {"t": "b", "box": [23.4, 5.1, 10.6, 6.5]}], aspect: 0.75 }, 7760 - 'IMG_2280': { focal: [57.4, 33.4], pois: [{"t": "f", "box": [42.5, 16.4, 29.8, 34.0]}], aspect: 0.8 }, 7761 - 'IMG_2498': { focal: [21.8, 61.4], pois: [{"t": "b", "box": [12.6, 55.8, 18.3, 11.2]}], aspect: 0.75 }, 7762 - 'IMG_2630': { focal: [20.3, 59.6], pois: [{"t": "b", "box": [6.8, 
44.8, 27.0, 29.4]}, {"t": "b", "box": [30.7, 44.0, 8.6, 22.9]}], aspect: 1.333 }, 7763 - 'IMG_2658': { focal: [46.6, 43.7], pois: [{"t": "f", "box": [39.2, 34.5, 14.8, 18.4]}, {"t": "b", "box": [58.5, 68.9, 8.4, 5.2]}], aspect: 0.75 }, 7764 - 'IMG_2668': { focal: [39.4, 24.5], pois: [{"t": "b", "box": [30.4, 19.0, 18.0, 11.0]}, {"t": "b", "box": [40.7, 68.8, 8.6, 5.3]}], aspect: 0.75 }, 7765 - 'IMG_2905': { focal: [47.5, 40.1], pois: [{"t": "b", "box": [28.3, 28.4, 38.3, 23.5]}], aspect: 0.75 }, 7766 - 'IMG_2913': { focal: [69.9, 19.9], pois: [{"t": "b", "box": [67.7, 18.5, 4.5, 2.7]}, {"t": "b", "box": [87.0, 36.8, 4.1, 2.5]}], aspect: 0.75 }, 7767 - 'IMG_3017': { focal: [37.6, 81.5], pois: [{"t": "b", "box": [31.7, 77.9, 11.8, 7.3]}], aspect: 0.75 }, 7768 - 'IMG_3234': { focal: [42.4, 48.8], pois: [{"t": "f", "box": [36.9, 43.2, 11.1, 11.3]}, {"t": "b", "box": [27.1, 34.8, 15.4, 9.4]}], aspect: 0.75 }, 7769 - 'IMG_4281': { focal: [50.2, 38.1], pois: [{"t": "f", "box": [44.5, 31.3, 11.4, 13.6]}, {"t": "b", "box": [70.5, 59.8, 8.4, 12.6]}], aspect: 0.75 }, 7770 - 'IMG_4312': { focal: [4.1, 28.1], pois: [{"t": "f", "box": [0.0, 18.6, 8.2, 19.0]}, {"t": "b", "box": [32.5, 63.2, 3.2, 2.0]}], aspect: 0.75 }, 7771 - 'IMG_4606': { focal: [53.1, 56.5], pois: [{"t": "f", "box": [28.6, 31.6, 48.9, 49.8]}], aspect: 0.75 }, 7772 - 'IMG_4894': { focal: [48.0, 73.6], pois: [{"t": "b", "box": [44.4, 71.4, 7.2, 4.4]}, {"t": "b", "box": [71.8, 49.7, 4.6, 6.8]}], aspect: 0.75 }, 7773 - 'IMG_4997': { focal: [54.0, 55.8], pois: [{"t": "b", "box": [52.3, 53.2, 3.4, 5.1]}], aspect: 0.75 }, 7774 - 'IMG_5043': { focal: [40.8, 43.4], pois: [{"t": "f", "box": [30.8, 32.3, 20.1, 22.1]}, {"t": "b", "box": [31.9, 69.9, 4.8, 3.0]}], aspect: 0.75 }, 7775 - 'IMG_5050': { focal: [65.6, 53.7], pois: [{"t": "f", "box": [60.1, 48.3, 10.9, 10.8]}, {"t": "b", "box": [21.9, 43.0, 16.5, 10.1]}], aspect: 0.75 }, 7776 - 'IMG_5272': { focal: [62.3, 53.4], pois: [{"t": "b", "box": [61.0, 51.3, 2.7, 4.1]}], aspect: 0.75 }, 7777 - 'IMG_5644': { focal: [49.0, 17.8], pois: [{"t": "f", "box": [44.1, 13.9, 9.8, 8.0]}, {"t": "b", "box": [32.7, 19.9, 11.0, 5.1]}], aspect: 0.563 }, 7778 - 'IMG_6342': { focal: [62.3, 28.0], pois: [{"t": "f", "box": [55.1, 18.5, 14.5, 18.9]}, {"t": "b", "box": [10.7, 69.4, 23.7, 14.6]}], aspect: 0.75 }, 7779 - 'IMG_6367': { focal: [44.1, 34.3], pois: [{"t": "f", "box": [37.8, 28.7, 12.7, 11.1]}, {"t": "b", "box": [60.4, 4.8, 3.6, 2.2]}], aspect: 0.75 }, 7780 - 'IMG_6435': { focal: [48.0, 37.8], pois: [{"t": "f", "box": [37.9, 27.2, 20.2, 21.1]}, {"t": "b", "box": [35.9, 86.0, 4.8, 2.9]}], aspect: 0.75 }, 7781 - 'IMG_8080': { focal: [42.6, 27.7], pois: [{"t": "f", "box": [36.0, 20.8, 13.1, 13.9]}, {"t": "b", "box": [15.9, 8.3, 66.6, 40.9]}], aspect: 0.75 }, 7782 - 'IMG_8188': { focal: [48.9, 33.9], pois: [{"t": "f", "box": [37.9, 24.0, 21.8, 19.9]}, {"t": "b", "box": [35.0, 88.1, 9.8, 6.0]}], aspect: 0.75 }, 7783 - 'IMG_8989': { focal: [79.2, 86.1], pois: [{"t": "b", "box": [76.7, 84.6, 5.0, 3.1]}], aspect: 0.75 }, 7784 - 'IMG_9795': { focal: [31.2, 22.0], pois: [{"t": "f", "box": [23.8, 15.6, 14.8, 12.9]}, {"t": "b", "box": [21.3, 51.5, 4.3, 2.0]}], aspect: 0.562 } 7785 - }; 7728 + // POI manifest source of truth: papers/jeffrey-platter/manifest.json, 7729 + // synced to ./jeffreys-manifest.json by papers/jeffrey-platter/sync.mjs. 7730 + // Schema: { buckets: { shoot, candids } }. POI types: 'f'=face, 'b'=body, 'h'=hand. 
7731 + let headshotsData = {}; 7732 + let jeffreysData = {}; 7733 + async function loadJeffreysManifest() { 7734 + const res = await fetch('./jeffreys-manifest.json', { cache: 'no-cache' }); 7735 + const manifest = await res.json(); 7736 + headshotsData = manifest.buckets.shoot.items; 7737 + jeffreysData = manifest.buckets.candids.items; 7738 + } 7786 7739 7787 7740 // AC Screenshots - photographic development moments (environmental/atmospheric) 7788 7741 // URL pattern: https://assets.aesthetic.computer/screenshots/images/{imageRef} ··· 7880 7833 'november-7-2023-at-10-49-pm.webp': { focal: [47, 52], pois: [{"t":"s","box":[34.3,44.5,24.6,15.8]}], aspect: 0.75, src: 'screenshots' } 7881 7834 }; 7882 7835 7883 - // Merge all image data (headshots temporarily disabled) 7884 - const allImagesData = { ...jeffreysData, ...screenshotsData /*, ...headshotsData */ }; 7885 - const jeffreysImages = Object.keys(jeffreysData); 7836 + // Merge all image data. Filled in by buildImageIndex() after 7837 + // loadJeffreysManifest() resolves. 7838 + let allImagesData = {}; 7839 + let jeffreysImages = []; 7886 7840 const screenshotsImages = Object.keys(screenshotsData); 7887 - // const headshotsImages = Object.keys(headshotsData); 7888 - const allImages = [...jeffreysImages, ...screenshotsImages /*, ...headshotsImages */]; 7841 + let headshotsImages = []; 7842 + let allImages = []; 7843 + function buildImageIndex() { 7844 + allImagesData = { ...jeffreysData, ...screenshotsData, ...headshotsData }; 7845 + jeffreysImages = Object.keys(jeffreysData); 7846 + headshotsImages = Object.keys(headshotsData); 7847 + allImages = [...jeffreysImages, ...screenshotsImages, ...headshotsImages]; 7848 + } 7889 7849 const DEBUG_FACES = false; // Set true for labels and coordinates 7890 7850 const SHOW_POI_BOXES = true; // Matrix-style detection boxes (aesthetic) 7891 7851 const POI_GLOW = true; // Soft radial glow on detected faces/bodies ··· 9252 9212 observer.observe(container); 9253 9213 } 9254 9214 9255 - // Initialize slideshow when DOM ready 9215 + // Initialize slideshow when DOM ready (after manifest fetch resolves) 9216 + async function startJeffreysSlideshow() { 9217 + try { 9218 + await loadJeffreysManifest(); 9219 + } catch (err) { 9220 + console.warn('Jeffreys manifest fetch failed; slideshow will run with empty data', err); 9221 + } 9222 + buildImageIndex(); 9223 + initJeffreysSlideshow(); 9224 + } 9256 9225 if (document.readyState === 'loading') { 9257 - document.addEventListener('DOMContentLoaded', initJeffreysSlideshow); 9226 + document.addEventListener('DOMContentLoaded', startJeffreysSlideshow); 9258 9227 } else { 9259 - initJeffreysSlideshow(); 9228 + startJeffreysSlideshow(); 9260 9229 } 9261 9230 9262 9231 // Language selector (dropdown)
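If the fetch succeeds but the manifest drifts from the shape `buildImageIndex()` expects, the slideshow just runs empty, so a schema sanity-check after edits is cheap insurance; a minimal sketch against the served copy, using the field names visible in the manifest below:

```python
# Sketch: check the served manifest against the fields the slideshow reads.
import json

with open("system/public/give.aesthetic.computer/jeffreys-manifest.json") as f:
    manifest = json.load(f)

for bucket in ("shoot", "candids"):  # the two buckets index.html consumes
    items = manifest["buckets"][bucket]["items"]
    for name, item in items.items():
        assert len(item["focal"]) == 2, name                       # [x%, y%]
        assert all(len(p["box"]) == 4 for p in item["pois"]), name
        assert item["aspect"] > 0, name
    print(f"{bucket}: {len(items)} items ok")
```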
+164
system/public/give.aesthetic.computer/jeffreys-manifest.json
··· 1 + { 2 + "$schema": "./manifest.schema.json", 3 + "version": 1, 4 + "generated": "2026-04-28", 5 + "note": "Canonical jeffrey-image POI manifest. Source of truth lives here; served copy at system/public/give.aesthetic.computer/jeffreys-manifest.json (run papers/jeffrey-platter/sync.mjs to refresh). Three buckets: shoot/ headshots, masters/ HEIC+JPEG originals, candids/ JPG derivatives. Filename spelling 'jeffery-' (one r) is intentional and matches the CDN.", 6 + "buckets": { 7 + "shoot": { 8 + "label": "Professional AV photoshoot — 55 face-focused headshots, uniform framing", 9 + "url_pattern": "https://assets.aesthetic.computer/jeffreys/shoot/{name}", 10 + "key_includes_extension": true, 11 + "audited": "2026-04-28: aws s3 ls confirmed 55 contiguous --01..--55", 12 + "tiers": "master (>10MB, --01..--10), mid (1–10MB, --11..--35), web (<1MB, --36..--55)", 13 + "items": { 14 + "jeffery-av--01.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":24579271,"tier":"master"}, 15 + "jeffery-av--02.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":23031131,"tier":"master"}, 16 + "jeffery-av--03.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":23782694,"tier":"master"}, 17 + "jeffery-av--04.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":20180498,"tier":"master"}, 18 + "jeffery-av--05.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":21157608,"tier":"master"}, 19 + "jeffery-av--06.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":23912884,"tier":"master"}, 20 + "jeffery-av--07.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":26175514,"tier":"master"}, 21 + "jeffery-av--08.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":25539600,"tier":"master"}, 22 + "jeffery-av--09.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":24963289,"tier":"master"}, 23 + "jeffery-av--10.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":25466226,"tier":"master"}, 24 + "jeffery-av--11.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":3442326,"tier":"mid"}, 25 + "jeffery-av--12.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4923364,"tier":"mid"}, 26 + "jeffery-av--13.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4916244,"tier":"mid"}, 27 + "jeffery-av--14.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4650693,"tier":"mid"}, 28 + "jeffery-av--15.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4516921,"tier":"mid"}, 29 + "jeffery-av--16.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4489196,"tier":"mid"}, 30 + "jeffery-av--17.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":3292944,"tier":"mid"}, 31 + "jeffery-av--18.jpg": 
{"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":1672328,"tier":"mid"}, 32 + "jeffery-av--19.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":3705244,"tier":"mid"}, 33 + "jeffery-av--20.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4236846,"tier":"mid"}, 34 + "jeffery-av--21.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4263089,"tier":"mid"}, 35 + "jeffery-av--22.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4233171,"tier":"mid"}, 36 + "jeffery-av--23.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":3663425,"tier":"mid"}, 37 + "jeffery-av--24.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4192960,"tier":"mid"}, 38 + "jeffery-av--25.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":3794618,"tier":"mid"}, 39 + "jeffery-av--26.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":3676864,"tier":"mid"}, 40 + "jeffery-av--27.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4195383,"tier":"mid"}, 41 + "jeffery-av--28.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4090050,"tier":"mid"}, 42 + "jeffery-av--29.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4130648,"tier":"mid"}, 43 + "jeffery-av--30.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4933430,"tier":"mid"}, 44 + "jeffery-av--31.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":5068198,"tier":"mid"}, 45 + "jeffery-av--32.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":5205568,"tier":"mid"}, 46 + "jeffery-av--33.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":5294940,"tier":"mid"}, 47 + "jeffery-av--34.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":5283671,"tier":"mid"}, 48 + "jeffery-av--35.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":4789782,"tier":"mid"}, 49 + "jeffery-av--36.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":272541,"tier":"web"}, 50 + "jeffery-av--37.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":268923,"tier":"web"}, 51 + "jeffery-av--38.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":296328,"tier":"web"}, 52 + "jeffery-av--39.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":305648,"tier":"web"}, 53 + "jeffery-av--40.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":226963,"tier":"web"}, 54 + "jeffery-av--41.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":225962,"tier":"web"}, 55 + "jeffery-av--42.jpg": 
{"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":223762,"tier":"web"}, 56 + "jeffery-av--43.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":224105,"tier":"web"}, 57 + "jeffery-av--44.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":304325,"tier":"web"}, 58 + "jeffery-av--45.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":340465,"tier":"web"}, 59 + "jeffery-av--46.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":338624,"tier":"web"}, 60 + "jeffery-av--47.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":339362,"tier":"web"}, 61 + "jeffery-av--48.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":345660,"tier":"web"}, 62 + "jeffery-av--49.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":242741,"tier":"web"}, 63 + "jeffery-av--50.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":244176,"tier":"web"}, 64 + "jeffery-av--51.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":251330,"tier":"web"}, 65 + "jeffery-av--52.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":254304,"tier":"web"}, 66 + "jeffery-av--53.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":283319,"tier":"web"}, 67 + "jeffery-av--54.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":246485,"tier":"web"}, 68 + "jeffery-av--55.jpg": {"focal":[50,35],"pois":[{"t":"f","box":[30,15,40,40]}],"aspect":0.667,"src":"headshots","size":248833,"tier":"web"} 69 + } 70 + }, 71 + "masters": { 72 + "label": "iPhone master files (HEIC + JPEG) — 1:1 with candids/, higher quality. 
Pick masters over derivatives for training crops.", 73 + "url_pattern": "https://assets.aesthetic.computer/jeffreys/{name}", 74 + "key_includes_extension": true, 75 + "audited": "2026-04-28: aws s3 ls assets-aesthetic-computer/jeffreys/ — 38 masters found, 1:1 with candids/", 76 + "items": { 77 + "FullSizeRender.heic": {"candid_key":"FullSizeRender","size":516985}, 78 + "IMG_0260.heic": {"candid_key":"IMG_0260","size":1978771}, 79 + "IMG_0675.JPEG": {"candid_key":"IMG_0675","size":4743995}, 80 + "IMG_0686.heic": {"candid_key":"IMG_0686","size":1063196}, 81 + "IMG_0688.heic": {"candid_key":"IMG_0688","size":954060}, 82 + "IMG_0798.jpeg": {"candid_key":"IMG_0798","size":2332501}, 83 + "IMG_1111.heic": {"candid_key":"IMG_1111","size":1818360}, 84 + "IMG_1577.heic": {"candid_key":"IMG_1577","size":3453048}, 85 + "IMG_1616.heic": {"candid_key":"IMG_1616","size":2838527}, 86 + "IMG_1737.heic": {"candid_key":"IMG_1737","size":1830876}, 87 + "IMG_1809.heic": {"candid_key":"IMG_1809","size":4224096}, 88 + "IMG_2124.jpeg": {"candid_key":"IMG_2124","size":1332909}, 89 + "IMG_2208.heic": {"candid_key":"IMG_2208","size":1569939}, 90 + "IMG_2280.heic": {"candid_key":"IMG_2280","size":876871}, 91 + "IMG_2498.heic": {"candid_key":"IMG_2498","size":2148839}, 92 + "IMG_2630.HEIC": {"candid_key":"IMG_2630","size":1490103}, 93 + "IMG_2658.HEIC": {"candid_key":"IMG_2658","size":1608793}, 94 + "IMG_2668.heic": {"candid_key":"IMG_2668","size":1195972}, 95 + "IMG_2905.heic": {"candid_key":"IMG_2905","size":2304901}, 96 + "IMG_2913.heic": {"candid_key":"IMG_2913","size":803125}, 97 + "IMG_3017.heic": {"candid_key":"IMG_3017","size":2562824}, 98 + "IMG_3234.heic": {"candid_key":"IMG_3234","size":806715}, 99 + "IMG_4281.jpeg": {"candid_key":"IMG_4281","size":2303766}, 100 + "IMG_4312.jpeg": {"candid_key":"IMG_4312","size":2657700}, 101 + "IMG_4606.heic": {"candid_key":"IMG_4606","size":970231}, 102 + "IMG_4894.heic": {"candid_key":"IMG_4894","size":1428189}, 103 + "IMG_4997.heic": {"candid_key":"IMG_4997","size":1804730}, 104 + "IMG_5043.HEIC": {"candid_key":"IMG_5043","size":1156692}, 105 + "IMG_5050.heic": {"candid_key":"IMG_5050","size":1131254}, 106 + "IMG_5272.jpeg": {"candid_key":"IMG_5272","size":1978886}, 107 + "IMG_5644.heic": {"candid_key":"IMG_5644","size":1366318}, 108 + "IMG_6342.jpeg": {"candid_key":"IMG_6342","size":1844946}, 109 + "IMG_6367.HEIC": {"candid_key":"IMG_6367","size":1528443}, 110 + "IMG_6435.HEIC": {"candid_key":"IMG_6435","size":2105174}, 111 + "IMG_8080.HEIC": {"candid_key":"IMG_8080","size":1420859}, 112 + "IMG_8188.heic": {"candid_key":"IMG_8188","size":903213}, 113 + "IMG_8989.HEIC": {"candid_key":"IMG_8989","size":2106687}, 114 + "IMG_9795.heic": {"candid_key":"IMG_9795","size":2240240} 115 + } 116 + }, 117 + "candids": { 118 + "label": "Personal candids — face/body/hand POIs", 119 + "url_pattern": "https://assets.aesthetic.computer/jeffreys/jpg/{name}.jpg", 120 + "key_includes_extension": false, 121 + "audited": "2026-04-28: aws s3 ls confirmed all 38 entries match bucket; no orphans, no uncataloged. 
All have HEIC/JPEG masters at jeffreys/{name}.<ext> — see masters bucket.", 122 + "items": { 123 + "FullSizeRender": {"focal":[38.5,34.8],"pois":[{"t":"f","box":[33.8,30,9.5,9.7]},{"t":"b","box":[59.5,63.9,20.2,12.4]},{"t":"b","box":[87.6,89.3,8.4,5.2]}],"aspect":0.75,"size":829094,"master":"FullSizeRender.heic","master_size":516985}, 124 + "IMG_0260": {"focal":[41,38.1],"pois":[{"t":"f","box":[32.3,29.3,17.4,17.5]},{"t":"b","box":[15.9,50,5.9,3.6]}],"aspect":0.75,"size":1492041,"master":"IMG_0260.heic","master_size":1978771}, 125 + "IMG_0675": {"focal":[54,37.2],"pois":[{"t":"f","box":[47.8,31.8,12.3,10.9]}],"aspect":0.667,"size":4743995,"master":"IMG_0675.JPEG","master_size":4743995}, 126 + "IMG_0686": {"focal":[57.4,25.4],"pois":[{"t":"b","box":[53.6,23.1,7.5,4.6]},{"t":"b","box":[54.1,47.2,6.9,4.2]}],"aspect":0.75,"size":1145284,"master":"IMG_0686.heic","master_size":1063196}, 127 + "IMG_0688": {"focal":[72.3,63],"pois":[{"t":"b","box":[69.1,59.3,6.5,7.4]},{"t":"b","box":[39.4,46.8,4.1,4.6]}],"aspect":0.563,"size":1035484,"master":"IMG_0688.heic","master_size":954060}, 128 + "IMG_0798": {"focal":[46.6,6.4],"pois":[{"t":"b","box":[44.8,5.3,3.7,2.3]}],"aspect":0.75,"size":2332501,"master":"IMG_0798.jpeg","master_size":2332501}, 129 + "IMG_1111": {"focal":[50,50],"pois":[],"aspect":0.562,"size":1412575,"master":"IMG_1111.heic","master_size":1818360}, 130 + "IMG_1577": {"focal":[37,70.6],"pois":[{"t":"b","box":[35.5,69.7,2.8,1.7]}],"aspect":0.75,"size":3285470,"master":"IMG_1577.heic","master_size":3453048}, 131 + "IMG_1616": {"focal":[47.8,51.7],"pois":[{"t":"f","box":[42.5,45.2,10.7,13]},{"t":"b","box":[14.1,68,12.4,7.6]}],"aspect":0.75,"size":2863682,"master":"IMG_1616.heic","master_size":2838527}, 132 + "IMG_1737": {"focal":[50,50],"pois":[],"aspect":0.75,"size":1539214,"master":"IMG_1737.heic","master_size":1830876}, 133 + "IMG_1809": {"focal":[48,53.2],"pois":[{"t":"b","box":[45,51.4,6.1,3.7]}],"aspect":0.75,"size":3742829,"master":"IMG_1809.heic","master_size":4224096}, 134 + "IMG_2124": {"focal":[54.8,35.8],"pois":[{"t":"f","box":[49.7,31.8,10.1,7.9]},{"t":"b","box":[3.1,48.8,32,14.8]}],"aspect":0.563,"size":1332909,"master":"IMG_2124.jpeg","master_size":1332909}, 135 + "IMG_2208": {"focal":[40,21.8],"pois":[{"t":"f","box":[33.6,14.5,12.8,14.6]},{"t":"b","box":[23.4,5.1,10.6,6.5]}],"aspect":0.75,"size":1788410,"master":"IMG_2208.heic","master_size":1569939}, 136 + "IMG_2280": {"focal":[57.4,33.4],"pois":[{"t":"f","box":[42.5,16.4,29.8,34]}],"aspect":0.8,"size":1016167,"master":"IMG_2280.heic","master_size":876871}, 137 + "IMG_2498": {"focal":[21.8,61.4],"pois":[{"t":"b","box":[12.6,55.8,18.3,11.2]}],"aspect":0.75,"size":1710740,"master":"IMG_2498.heic","master_size":2148839}, 138 + "IMG_2630": {"focal":[20.3,59.6],"pois":[{"t":"b","box":[6.8,44.8,27,29.4]},{"t":"b","box":[30.7,44,8.6,22.9]}],"aspect":1.333,"size":1452701,"master":"IMG_2630.HEIC","master_size":1490103}, 139 + "IMG_2658": {"focal":[46.6,43.7],"pois":[{"t":"f","box":[39.2,34.5,14.8,18.4]},{"t":"b","box":[58.5,68.9,8.4,5.2]}],"aspect":0.75,"size":1992798,"master":"IMG_2658.HEIC","master_size":1608793}, 140 + "IMG_2668": {"focal":[39.4,24.5],"pois":[{"t":"b","box":[30.4,19,18,11]},{"t":"b","box":[40.7,68.8,8.6,5.3]}],"aspect":0.75,"size":1365145,"master":"IMG_2668.heic","master_size":1195972}, 141 + "IMG_2905": {"focal":[47.5,40.1],"pois":[{"t":"b","box":[28.3,28.4,38.3,23.5]}],"aspect":0.75,"size":2497398,"master":"IMG_2905.heic","master_size":2304901}, 142 + "IMG_2913": 
{"focal":[69.9,19.9],"pois":[{"t":"b","box":[67.7,18.5,4.5,2.7]},{"t":"b","box":[87,36.8,4.1,2.5]}],"aspect":0.75,"size":1042326,"master":"IMG_2913.heic","master_size":803125}, 143 + "IMG_3017": {"focal":[37.6,81.5],"pois":[{"t":"b","box":[31.7,77.9,11.8,7.3]}],"aspect":0.75,"size":2446374,"master":"IMG_3017.heic","master_size":2562824}, 144 + "IMG_3234": {"focal":[42.4,48.8],"pois":[{"t":"f","box":[36.9,43.2,11.1,11.3]},{"t":"b","box":[27.1,34.8,15.4,9.4]}],"aspect":0.75,"size":1015178,"master":"IMG_3234.heic","master_size":806715}, 145 + "IMG_4281": {"focal":[50.2,38.1],"pois":[{"t":"f","box":[44.5,31.3,11.4,13.6]},{"t":"b","box":[70.5,59.8,8.4,12.6]}],"aspect":0.75,"size":2303766,"master":"IMG_4281.jpeg","master_size":2303766}, 146 + "IMG_4312": {"focal":[4.1,28.1],"pois":[{"t":"f","box":[0,18.6,8.2,19]},{"t":"b","box":[32.5,63.2,3.2,2]}],"aspect":0.75,"size":2657700,"master":"IMG_4312.jpeg","master_size":2657700}, 147 + "IMG_4606": {"focal":[53.1,56.5],"pois":[{"t":"f","box":[28.6,31.6,48.9,49.8]}],"aspect":0.75,"size":1044546,"master":"IMG_4606.heic","master_size":970231}, 148 + "IMG_4894": {"focal":[48,73.6],"pois":[{"t":"b","box":[44.4,71.4,7.2,4.4]},{"t":"b","box":[71.8,49.7,4.6,6.8]}],"aspect":0.75,"size":1564246,"master":"IMG_4894.heic","master_size":1428189}, 149 + "IMG_4997": {"focal":[54,55.8],"pois":[{"t":"b","box":[52.3,53.2,3.4,5.1]}],"aspect":0.75,"size":2173587,"master":"IMG_4997.heic","master_size":1804730}, 150 + "IMG_5043": {"focal":[40.8,43.4],"pois":[{"t":"f","box":[30.8,32.3,20.1,22.1]},{"t":"b","box":[31.9,69.9,4.8,3]}],"aspect":0.75,"size":1239460,"master":"IMG_5043.HEIC","master_size":1156692}, 151 + "IMG_5050": {"focal":[65.6,53.7],"pois":[{"t":"f","box":[60.1,48.3,10.9,10.8]},{"t":"b","box":[21.9,43,16.5,10.1]}],"aspect":0.75,"size":1515666,"master":"IMG_5050.heic","master_size":1131254}, 152 + "IMG_5272": {"focal":[62.3,53.4],"pois":[{"t":"b","box":[61,51.3,2.7,4.1]}],"aspect":0.75,"size":1978886,"master":"IMG_5272.jpeg","master_size":1978886}, 153 + "IMG_5644": {"focal":[49,17.8],"pois":[{"t":"f","box":[44.1,13.9,9.8,8]},{"t":"b","box":[32.7,19.9,11,5.1]}],"aspect":0.563,"size":1380329,"master":"IMG_5644.heic","master_size":1366318}, 154 + "IMG_6342": {"focal":[62.3,28],"pois":[{"t":"f","box":[55.1,18.5,14.5,18.9]},{"t":"b","box":[10.7,69.4,23.7,14.6]}],"aspect":0.75,"size":1844946,"master":"IMG_6342.jpeg","master_size":1844946}, 155 + "IMG_6367": {"focal":[44.1,34.3],"pois":[{"t":"f","box":[37.8,28.7,12.7,11.1]},{"t":"b","box":[60.4,4.8,3.6,2.2]}],"aspect":0.75,"size":1416580,"master":"IMG_6367.HEIC","master_size":1528443}, 156 + "IMG_6435": {"focal":[48,37.8],"pois":[{"t":"f","box":[37.9,27.2,20.2,21.1]},{"t":"b","box":[35.9,86,4.8,2.9]}],"aspect":0.75,"size":1604594,"master":"IMG_6435.HEIC","master_size":2105174}, 157 + "IMG_8080": {"focal":[42.6,27.7],"pois":[{"t":"f","box":[36,20.8,13.1,13.9]},{"t":"b","box":[15.9,8.3,66.6,40.9]}],"aspect":0.75,"size":1580558,"master":"IMG_8080.HEIC","master_size":1420859}, 158 + "IMG_8188": {"focal":[48.9,33.9],"pois":[{"t":"f","box":[37.9,24,21.8,19.9]},{"t":"b","box":[35,88.1,9.8,6]}],"aspect":0.75,"size":968035,"master":"IMG_8188.heic","master_size":903213}, 159 + "IMG_8989": {"focal":[79.2,86.1],"pois":[{"t":"b","box":[76.7,84.6,5,3.1]}],"aspect":0.75,"size":1830983,"master":"IMG_8989.HEIC","master_size":2106687}, 160 + "IMG_9795": {"focal":[31.2,22],"pois":[{"t":"f","box":[23.8,15.6,14.8,12.9]},{"t":"b","box":[21.3,51.5,4.3,2]}],"aspect":0.562,"size":2033875,"master":"IMG_9795.heic","master_size":2240240} 161 + } 
162 + } 163 + } 164 + }
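Resolving a manifest key to a concrete CDN URL falls straight out of each bucket's `url_pattern` (note `key_includes_extension`: candids keys omit `.jpg`, and their pattern appends it); a minimal sketch:

```python
# Sketch: manifest key → CDN URL, using the bucket metadata above.
import json

with open("papers/jeffrey-platter/manifest.json") as f:
    manifest = json.load(f)

def cdn_url(bucket_name: str, key: str) -> str:
    # candids keys omit the extension; their url_pattern already ends in .jpg
    return manifest["buckets"][bucket_name]["url_pattern"].format(name=key)

print(cdn_url("shoot", "jeffery-av--36.jpg"))
# https://assets.aesthetic.computer/jeffreys/shoot/jeffery-av--36.jpg
print(cdn_url("candids", "IMG_5644"))
# https://assets.aesthetic.computer/jeffreys/jpg/IMG_5644.jpg
```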