it's what's on the tin; culls raw photos


feat: init

+2422
+19
.gitignore
```
# Xcode
build/
DerivedData/
*.xcworkspace
!*.xcodeproj/project.pbxproj
xcuserdata/
*.xcuserstate

# Swift Package Manager
.build/
.swiftpm/

# macOS
.DS_Store
*.swp
*~

# App-specific caches
*.thumbnails/
```
+185
SPEC.md
# Building a macOS photo culling app in Swift is highly feasible

**A Narrative Select–style photo culling app can be built almost entirely with Apple's native frameworks.** The critical path — RAW preview extraction, keyboard-driven UI, blur detection, and shot grouping — maps cleanly onto ImageIO, Vision, Metal Performance Shaders, and SwiftUI APIs that ship with macOS. Only two features require meaningful third-party effort: aesthetic quality scoring (Core ML model conversion) and robust XMP sidecar writing (no Apple API covers the full spec). On Apple Silicon, the entire analysis pipeline — thumbnail extraction, face detection, blur scoring, and feature-print generation — processes **~20–55ms per photo**, meaning a 2,000-image shoot can be fully analyzed in under two minutes.

---

## RAW preview extraction is effectively a solved problem

Apple's **ImageIO** framework supports every major RAW format natively: CR2, CR3, ARW, NEF, DNG, RAF, and ORF. The key insight for performance is that nearly all RAW files embed full-resolution JPEG previews (the same image shown on the camera LCD), and ImageIO can extract these without triggering a RAW demosaic.

The critical API is `CGImageSourceCreateThumbnailAtIndex` with the option `kCGImageSourceCreateThumbnailFromImageIfAbsent`. When an embedded preview exists — which it does in virtually all camera RAW files — this function extracts and downscales the JPEG in **~15–50ms per file** on Apple Silicon. Compare this to full RAW decoding via `CIRAWFilter`, which takes **~3 seconds** on first invocation (Metal shader compilation) and ~50–200ms on subsequent calls. The embedded preview path is 10–100× faster.
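A minimal sketch of that fast path (the option keys are the real ImageIO constants; the function name and the 2048px cap are illustrative choices):

```swift
import ImageIO
import CoreGraphics

/// Extracts the embedded JPEG preview from a RAW file without demosaicing.
/// Returns nil if the file cannot be opened or no thumbnail can be produced.
func embeddedPreview(for url: URL, maxPixelSize: Int = 2048) -> CGImage? {
    let options: [CFString: Any] = [
        // Use the embedded preview when present; only decode if absent.
        kCGImageSourceCreateThumbnailFromImageIfAbsent: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize,
        // Apply EXIF orientation so portrait shots come out upright.
        kCGImageSourceCreateThumbnailWithTransform: true,
        // Don't let ImageIO retain the full decoded image in memory.
        kCGImageSourceShouldCache: false
    ]
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    return CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary)
}
```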
For the dual import mode specifically, the architecture is straightforward. Scan the import directory for RAW+JPEG pairs by matching basenames. Display the sidecar JPEG or the extracted embedded preview in the grid. Only invoke `CIRAWFilter` when the user opens a single image for detailed inspection. The `CIRAWFilter.previewImage` property (macOS 12+) also provides the embedded preview as a `CIImage`, but `CGImageSourceCreateThumbnailAtIndex` is faster for bulk thumbnail generation because it avoids Core Image pipeline overhead.

**CR3 format support** arrived in macOS Catalina (10.15), but each specific Canon camera model requires Apple to add support incrementally. There is a known issue in macOS Sequoia 15.1 where CR3 files with HDR PQ–enabled HEVC previews cause excessive CPU usage in the system's `ImageThumbnailExtension` process. DNG has basic support even for unlisted cameras, making it a reliable fallback. Apple maintains a current list of supported RAW cameras that covers iOS 18, macOS Sequoia 15, and visionOS 2.

**Difficulty: Easy.** Built-in APIs handle everything. Development estimate: 1–2 weeks for the RAW+JPEG pair manager and thumbnail extraction pipeline.

---

## Keyboard-driven culling maps well onto SwiftUI's focus system

SwiftUI on macOS 14 (Sonoma) introduced `.onKeyPress`, the native modifier for handling keyboard input without AppKit bridging. It supports filtering by specific keys, character sets, and key phases (down, up, repeat), and returns `.handled` or `.ignored` to control event propagation. For a culling app, the mapping is direct:

```swift
.onKeyPress(keys: ["p"]) { _ in markAsPick(); return .handled }
.onKeyPress(keys: ["x"]) { _ in markAsReject(); return .handled }
.onKeyPress(characters: .decimalDigits) { press in
    if let digit = Int(press.characters), (1...5).contains(digit) {
        setRating(digit); return .handled
    }
    return .ignored
}
.onKeyPress(.rightArrow) { _ in nextPhoto(); return .handled }
```

**The critical requirement is that the view must be both `.focusable()` and `.focused()`.** This is the number-one debugging issue developers encounter — if no view has focus, `onKeyPress` never fires. Apply `.focusable()` *before* `.focused($isFocused)`, set focus on appear (sometimes requiring a short `DispatchQueue.main.asyncAfter` delay), and use `.focusEffectDisabled()` to suppress the blue focus ring that macOS draws around focused views.

For apps that must support macOS versions prior to Sonoma, `NSEvent.addLocalMonitorForEvents(matching: .keyDown)` remains viable. This intercepts key events at the window level regardless of which SwiftUI view has focus, which is actually advantageous for a culling app where keyboard shortcuts should work globally. The tradeoff is that it bypasses SwiftUI's declarative event handling.
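A sketch of that fallback, assuming pick/reject callbacks like the ones above; returning `nil` consumes the event so handled keys don't beep:

```swift
import AppKit

/// Installs a window-level key monitor. Keep the returned token and pass it
/// to NSEvent.removeMonitor(_:) when the culling view goes away.
func installCullingKeyMonitor(onPick: @escaping () -> Void,
                              onReject: @escaping () -> Void) -> Any? {
    NSEvent.addLocalMonitorForEvents(matching: .keyDown) { event in
        guard let chars = event.charactersIgnoringModifiers else { return event }
        switch chars {
        case "p": onPick(); return nil    // consume the event
        case "x": onReject(); return nil
        default:  return event            // let everything else through
        }
    }
}
```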
Three SwiftUI-specific gotchas deserve attention. First, focus can be lost when the user clicks other UI elements like sidebars or toolbars — the viewer must reclaim focus programmatically. Second, mixing AppKit views via `NSViewRepresentable` can cause focus to get "stuck" because SwiftUI's focus system doesn't perfectly map to AppKit's first-responder chain. Third, on macOS, Tab and Shift+Tab navigate focus between focusable views only when "Use keyboard navigation" is enabled in System Preferences — keep the number of focusable views minimal to avoid confusing tab behavior.

**Difficulty: Easy to moderate.** The core keyboard handling is straightforward; edge cases around focus management require testing. Development estimate: 1–2 weeks.

---

## XMP sidecars require careful schema work but no third-party libraries

XMP sidecar files are XML documents following Adobe's XMP specification. The core metadata for a culling app uses the `xmp:` namespace (`http://ns.adobe.com/xap/1.0/`):

- **Star ratings**: `xmp:Rating` as an integer, values **0–5** (0 = unrated, 1–5 = star ratings, **-1 = rejected** in Adobe Bridge)
- **Color labels**: `xmp:Label` as text — Lightroom uses `"Red"`, `"Yellow"`, `"Green"`, `"Blue"`, `"Purple"`
- **Pick/reject flags**: **Not stored in XMP at all.** Lightroom's pick/reject flags exist only in the Lightroom catalog database and are never exported to sidecar files. This is a critical discovery for cross-app compatibility — use `xmp:Rating = -1` for a Bridge-compatible reject, or a specific color label like `"Red"` to indicate rejection in a Lightroom-importable way.

A minimal valid XMP sidecar file is surprisingly small:

```xml
<?xpacket begin="" id="W5M0MpCehiHzreSzNTczkc9d"?>
<x:xmpmeta xmlns:x="adobe:ns:meta/">
  <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
    <rdf:Description rdf:about=""
        xmlns:xmp="http://ns.adobe.com/xap/1.0/"
        xmp:Rating="3"
        xmp:Label="Green"
        xmp:CreatorTool="MyCullingApp 1.0">
    </rdf:Description>
  </rdf:RDF>
</x:xmpmeta>
<?xpacket end="w"?>
```

Writing this in Swift requires no dependencies — a string template with interpolated values works perfectly. For reading, macOS provides `XMLDocument` with XPath support, making it trivial to parse existing sidecars. Apple also provides `CGImageMetadataCreateFromXMPData` and `CGImageMetadataCreateXMPData` for converting between XMP byte streams and structured metadata objects, though these APIs are documented inconsistently and developers report that "custom tags just disappear" when round-tripping through them.
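A sketch of that template approach, using a hypothetical `writeSidecar` helper that mirrors the minimal file above:

```swift
import Foundation

/// Renders and writes a minimal XMP sidecar next to the given photo.
/// Assumes rating is -1 (rejected) or 0...5; label is a Lightroom color name or nil.
func writeSidecar(for photoURL: URL, rating: Int, label: String?) throws {
    let labelAttribute = label.map { "\n        xmp:Label=\"\($0)\"" } ?? ""
    let xmp = """
    <?xpacket begin="" id="W5M0MpCehiHzreSzNTczkc9d"?>
    <x:xmpmeta xmlns:x="adobe:ns:meta/">
      <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
        <rdf:Description rdf:about=""
            xmlns:xmp="http://ns.adobe.com/xap/1.0/"
            xmp:Rating="\(rating)"\(labelAttribute)
            xmp:CreatorTool="Cull 1.0">
        </rdf:Description>
      </rdf:RDF>
    </x:xmpmeta>
    <?xpacket end="w"?>
    """
    // <basename>.xmp for Lightroom compatibility (see the naming note below).
    let sidecarURL = photoURL.deletingPathExtension().appendingPathExtension("xmp")
    try xmp.write(to: sidecarURL, atomically: true, encoding: .utf8)
}
```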
**File naming matters for compatibility.** Use `<basename>.xmp` (e.g., `IMG_1234.xmp`) for Lightroom compatibility. darktable uses `<basename>.<extension>.xmp` (e.g., `IMG_1234.CR3.xmp`) and will read Lightroom-format sidecars on import but never writes to them. For maximum cross-app compatibility, write `<basename>.xmp` files and let darktable create its own parallel sidecars. Always read an existing sidecar before writing, to avoid clobbering Camera Raw develop settings that Lightroom may have stored.

**Difficulty: Moderate.** The schema is well documented, but the pick-flag gap and cross-app compatibility require careful design decisions. Development estimate: 1–2 weeks.

---

## Blur detection has a fast GPU path and a smart hybrid architecture

The most effective blur detection strategy for a photo culling app combines two complementary approaches: **Metal Performance Shaders for global sharpness** and **Vision framework for face-specific quality**.

**The MPS Laplacian path is the fastest option.** `MPSImageLaplacian` applies an optimized Laplacian edge-detection kernel on the GPU, and `MPSImageStatisticsMeanAndVariance` computes the variance of the result via GPU reduction, outputting a 2×1-pixel texture containing the mean and variance. The entire pipeline — source texture → Laplacian → variance — executes in **~1–5ms on Apple Silicon** for typical photo resolutions. Sharp images produce high Laplacian variance; blurry images produce low variance. This classic approach (Pech-Pacheco et al., 2000) detects defocus blur excellently, handles motion blur moderately well, but struggles with intentional bokeh where sharp subjects coexist with blurred backgrounds.

**Apple's `VNDetectFaceCaptureQualityRequest`** (macOS 10.15+) solves the bokeh problem for portrait photography. It returns a **0.0–1.0 quality score** per detected face, incorporating sharpness, lighting, pose, expression, and eye openness into a single trained metric. This is essentially Apple's built-in "best face" selector. It takes **~5–15ms per image** and handles the exact scenario where global Laplacian variance fails: a perfectly sharp portrait with creamy bokeh.

The recommended hybrid architecture runs both in parallel:

1. Extract the embedded JPEG preview, downscale to ~512px
2. Run `VNDetectFaceCaptureQualityRequest` → face quality scores (if faces exist)
3. Run `MPSImageLaplacian` → `MPSImageStatisticsMeanAndVariance` on the full image or face-cropped regions
4. Combine scores: face quality + Laplacian variance → composite quality metric
5. Optionally run a NIMA Core ML model for aesthetic quality scoring

For blink detection specifically, the Vision framework has no dedicated API, but `VNDetectFaceLandmarksRequest` returns 76-point face landmarks including full eye contours. Computing the **Eye Aspect Ratio** (EAR = vertical eye distance / horizontal eye distance) from these landmarks detects closed eyes reliably — open eyes have EAR ≈ 0.2–0.3, closed eyes drop below 0.2. The legacy `CIDetector` also exposes `CIFaceFeature`'s `leftEyeClosed` and `rightEyeClosed` booleans, though with lower accuracy.
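A sketch of the EAR check, assuming landmarks from `VNDetectFaceLandmarksRequest`. Note this uses the eye contour's bounding extents as a simplified proxy for the classic landmark-pair formula, so the 0.2 threshold should be re-tuned:

```swift
import Vision

/// Simplified Eye Aspect Ratio: vertical extent / horizontal extent of the
/// eye contour. The classic EAR uses specific landmark pairs; this bounding
/// proxy behaves similarly for open-vs-closed classification.
func eyeAspectRatio(_ eye: VNFaceLandmarkRegion2D) -> CGFloat? {
    let points = eye.normalizedPoints
    guard let minX = points.map(\.x).min(), let maxX = points.map(\.x).max(),
          let minY = points.map(\.y).min(), let maxY = points.map(\.y).max(),
          maxX > minX else { return nil }
    return (maxY - minY) / (maxX - minX)
}

/// Returns true when both detected eyes appear closed.
/// `face` comes from a VNDetectFaceLandmarksRequest result.
func looksLikeBlink(_ face: VNFaceObservation, threshold: CGFloat = 0.2) -> Bool {
    guard let left = face.landmarks?.leftEye, let right = face.landmarks?.rightEye,
          let leftEAR = eyeAspectRatio(left), let rightEAR = eyeAspectRatio(right)
    else { return false }
    return leftEAR < threshold && rightEAR < threshold
}
```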
An alternative CPU path uses Apple's Accelerate framework: `vImageConvolve_PlanarF()` applies the Laplacian kernel, and `vDSP_normalize()` computes the standard deviation. Apple provides an official sample project, "Finding the Sharpest Image in a Sequence of Captured Images," demonstrating this exact approach. It's SIMD-optimized on Apple Silicon and runs in **~2–10ms** per image.

For aesthetic quality beyond blur and sharpness, the **PhotoAssessment** project on GitHub provides a pre-converted NIMA (Neural Image Assessment) Core ML model that scores images on a 1–10 quality scale. MobileNet-based NIMA inference takes **~2–5ms** on the Neural Engine. More sophisticated models like MUSIQ exist but require complex Core ML conversion due to variable input sizes and custom position encodings.

**Difficulty: Easy for basic blur detection** (the MPS path is ~20 lines of code), **moderate for the full hybrid pipeline**, **hard for aesthetic quality scoring** (model conversion and threshold tuning). Development estimate: 2–4 weeks for the complete quality assessment system.

---

## Shot grouping works best with temporal clustering plus Vision feature prints

The most production-proven approach, validated by the ShutterSlim app (which reached #2 in the German App Store processing 35,000-photo libraries), combines EXIF timestamp clustering with Apple's `VNGenerateImageFeaturePrintRequest` for visual similarity.

**Step 1: Temporal clustering.** Read `kCGImagePropertyExifDateTimeOriginal` via `CGImageSourceCopyPropertiesAtIndex` (~2–4ms per file, no pixel decoding). Sort by timestamp and cluster using a simple gap threshold. A **10-minute gap** works well for grouping shots from a photo shoot — in ShutterSlim's testing on 35,000 photos, this produced ~5,300 clusters with a median size of 2–3 photos. For burst-shot detection specifically, a 1–5-second threshold identifies rapid-fire sequences.

**Step 2: Visual similarity within time clusters.** `VNGenerateImageFeaturePrintRequest` generates a dense semantic embedding per image. The critical implementation detail is that **Revision 2** (macOS 14+) produces normalized **768-dimensional** vectors with distances in the 0.0–~2.0 range, while **Revision 1** (macOS 10.15+) produces **2048-dimensional** unnormalized vectors with distances in the 0.0–~40.0 range. A production-tested threshold for Revision 2 is **~0.35** for near-duplicate grouping. The `computeDistance(_:to:)` method uses Euclidean distance internally (confirmed by framework decompilation).

Feature print generation takes **~15–50ms per image** on Apple Silicon — the neural network inference is the bottleneck. For 2,000 images, expect ~30–100 seconds for initial generation, so results should be cached to SQLite or Core Data keyed by photo ID and modification date. Subsequent launches then only process new images.

For a fast pre-filter, **dHash (difference hash)** identifies exact duplicates in under 1ms per image. The CocoaImageHashing library provides a native Swift implementation supporting dHash, pHash, and aHash with built-in data parallelism. A Hamming distance threshold of **2 bits** (on a 128-bit hash) catches compression and resize variants with minimal false positives. This catches trivially identical images before the heavier Vision pipeline runs.
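A minimal 64-bit dHash sketch, assuming a 9×8 grayscale downscale with one bit per horizontal gradient (a smaller hash than the 128-bit variant quoted above, but the same technique):

```swift
import CoreGraphics

/// Computes a 64-bit difference hash: downscale to 9×8 grayscale, then emit
/// one bit per pixel indicating whether it is brighter than its right neighbor.
func dHash(_ image: CGImage) -> UInt64? {
    let width = 9, height = 8
    var pixels = [UInt8](repeating: 0, count: width * height)
    let drawn = pixels.withUnsafeMutableBytes { buffer -> Bool in
        guard let context = CGContext(
            data: buffer.baseAddress, width: width, height: height,
            bitsPerComponent: 8, bytesPerRow: width,
            space: CGColorSpaceCreateDeviceGray(),
            bitmapInfo: CGImageAlphaInfo.none.rawValue
        ) else { return false }
        context.interpolationQuality = .medium
        context.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))
        return true
    }
    guard drawn else { return nil }

    var hash: UInt64 = 0
    for row in 0..<height {
        for col in 0..<(width - 1) {
            hash <<= 1
            if pixels[row * width + col] > pixels[row * width + col + 1] {
                hash |= 1
            }
        }
    }
    return hash
}

/// Hamming distance between two hashes; a small distance flags a near-duplicate.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}
```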
**CLIP embeddings via Core ML are overkill for duplicate detection.** Apple's own MobileCLIP models are available pre-converted on HuggingFace (`apple/coreml-mobileclip`) and run at 3–10ms per image, but they add 11–173MB to app size and provide cross-modal understanding (text↔image) that a culling app doesn't need. `VNFeaturePrintObservation` achieves comparable visual similarity detection with zero dependencies.

**Difficulty: Moderate.** The temporal clustering is trivial; feature print generation and threshold tuning require experimentation, and caching adds implementation surface. Development estimate: 2–3 weeks.

---

## The thumbnail pipeline needs a three-tier cache and careful memory management

Displaying 1,000–2,000 RAW thumbnails smoothly in a SwiftUI `LazyVGrid` is achievable but requires deliberate architecture. The recommended approach uses three tiers:

**Tier 1 — In-memory cache** via `NSCache` with a 500-item count limit and a 100MB total cost limit. `NSCache` is thread-safe and auto-evicts under memory pressure. This serves thumbnails for visible and recently visible cells.

**Tier 2 — Disk cache** storing generated thumbnails as JPEG files (~20–50KB each at 0.7 quality, 400px) in the app's Caches directory, keyed by a hash of the source file URL. This survives app restarts.

**Tier 3 — On-demand extraction** via `CGImageSourceCreateThumbnailAtIndex` from the original RAW file. Use `TaskGroup` with 8–16 concurrent tasks for parallel generation across Apple Silicon's performance cores.

Realistic benchmarks on M-series chips for 1,000 RAW files:

| Operation | Per image | 1,000 images (8-way parallel) |
|-----------|-----------|-------------------------------|
| EXIF metadata read | 2–4ms | ~0.3–0.5s |
| Embedded JPEG preview (400px) | 15–50ms | **~2–6s** |
| Full RAW decode (CIRAWFilter) | 50–3,000ms | Minutes (impractical) |
| QLThumbnailGenerator (cached) | <5ms | <1s |

**LazyVGrid handles 1,000+ items smoothly** when cell views are simple, but it has a critical difference from `UICollectionView`: **it does not implement cell reuse**. Images loaded into cells that scroll off-screen remain in memory. The fix is explicit: set the image to `nil` in `.onDisappear` and reload in `.task`. Use `kCGImageSourceShouldCache: false` when creating image sources to prevent ImageIO from retaining full decoded images. The `.task` modifier automatically cancels when the view disappears, preventing wasted work for cells that scroll off-screen before loading completes.

For progressive rendering, show a gray placeholder immediately, then the low-resolution embedded thumbnail (~128px, extracted in ~5ms), then the full-quality thumbnail (~512px). `QLThumbnailGenerator.generateRepresentations(for:update:)` provides this natively with three quality tiers (icon → low quality → full), but direct `CGImageSourceCreateThumbnailAtIndex` is faster for bulk RAW processing because it avoids IPC overhead (`QLThumbnailGenerator` runs out of process).

**Difficulty: Moderate.** The individual pieces are straightforward, but making the full pipeline feel instant and managing memory correctly across thousands of images requires careful engineering. Development estimate: 2–3 weeks.

---

## What's easy, what's moderate, and what's hard

| Feature | Difficulty | Dependencies | Dev estimate | Key APIs |
|---------|-----------|-------------|-------------|----------|
| RAW preview extraction | **Easy** | None (built-in) | 1–2 weeks | `CGImageSourceCreateThumbnailAtIndex`, `CIRAWFilter.previewImage` |
| Keyboard culling UI | **Easy–Moderate** | None | 1–2 weeks | `.onKeyPress` (macOS 14+), `@FocusState`, `.focusable()` |
| XMP sidecar read/write | **Moderate** | None | 1–2 weeks | `XMLDocument`, string templates, `CGImageMetadataCreateFromXMPData` |
| Global blur detection | **Easy** | None | 1 week | `MPSImageLaplacian`, `MPSImageStatisticsMeanAndVariance` |
| Face quality scoring | **Easy** | None | 3–5 days | `VNDetectFaceCaptureQualityRequest` |
| Blink detection | **Moderate** | None | 1 week | `VNDetectFaceLandmarksRequest` + EAR calculation |
| Near-duplicate detection | **Moderate** | None | 2–3 weeks | `VNGenerateImageFeaturePrintRequest`, `computeDistance` |
| Shot/scene grouping | **Moderate** | None | 2–3 weeks | EXIF timestamp clustering + feature print similarity |
| Thumbnail pipeline | **Moderate** | None | 2–3 weeks | `CGImageSource`, `NSCache`, `LazyVGrid`, `TaskGroup` |
| Aesthetic quality scoring | **Hard** | Core ML model (PhotoAssessment/NIMA) | 2–4 weeks | `coremltools` conversion, `MLModel` inference |
| CLIP-based features | **Hard** | MobileCLIP model (~11–173MB) | 3–4 weeks | `apple/coreml-mobileclip` from HuggingFace |

**Total estimated development time for a competent Swift developer: 12–20 weeks** for a full-featured MVP, with the core culling workflow (import, display, keyboard-flag, export XMP) achievable in 4–6 weeks.

---

## Gotchas and recommendations that will save you weeks

**The pick-flag gap is the biggest workflow surprise.** Lightroom's pick/reject flags are catalog-only and never appear in XMP. Design the app to use `xmp:Rating = -1` for Adobe Bridge–compatible rejection, and document that Lightroom users should use color labels or star ratings as their pick/reject signal. This is a user-education issue, not a technical one.

**VNFeaturePrint revision differences can silently break duplicate detection.** Revision 1 (macOS 10.15–13) produces 2,048-float vectors with distances ~0–40; Revision 2 (macOS 14+) produces 768-float normalized vectors with distances ~0–2. Thresholds from one revision are meaningless for the other. Pin to a specific revision with `request.revision`, or detect the OS version and adjust thresholds accordingly.
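A sketch of pinning, assuming the app targets macOS 14+ and can therefore hard-code the Revision 2 threshold:

```swift
import Vision

/// Builds a feature print request pinned to Revision 2 so the ~0.35
/// near-duplicate threshold stays valid across future OS releases.
func makeFeaturePrintRequest() -> VNGenerateImageFeaturePrintRequest {
    let request = VNGenerateImageFeaturePrintRequest()
    request.revision = VNGenerateImageFeaturePrintRequestRevision2
    return request
}
```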
**CR3 files with HDR PQ cause known system issues** on macOS Sequoia 15.1, triggering excessive CPU usage in `ImageThumbnailExtension`. Test with Canon R5 Mark II files specifically.

**SwiftUI's focus system is fragile on macOS.** Invest in a robust focus-management layer early — create a single "keyboard target" view that always reclaims focus after any UI interaction. Consider keeping `NSEvent.addLocalMonitorForEvents` as a fallback even if targeting macOS 14+.

**Cache feature prints aggressively.** Neural network inference at ~15–50ms per image is the most expensive per-image operation in the pipeline. Store feature print vectors in SQLite (768 floats × 4 bytes ≈ 3KB per image, or ~6MB for 2,000 images). A re-scan then only processes new or modified files.

**Key WWDC sessions for reference**: "Capture and Process ProRAW Images" (2021, session 10160) for RAW handling; "Demystify SwiftUI Performance" (2023) for grid optimization; "Images and Graphics Best Practices" (2018) for image downsampling; "Optimize your Core ML usage" (2022) for Vision/ML profiling. The Apple sample project "Finding the Sharpest Image in a Sequence of Captured Images" provides a complete Accelerate-based blur detection implementation.

The bottom line: **every core feature of a Narrative Select competitor can be built with zero third-party dependencies** using ImageIO, Vision, MPS, Core Image, and SwiftUI. The only feature that benefits from an external model is aesthetic quality scoring (NIMA via Core ML), and even that has an open-source pre-converted model available in the PhotoAssessment GitHub project.
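One spec item the sources below don't yet implement is step 4 of the hybrid pipeline, the composite quality metric. A minimal sketch; the variance ceiling and weights are placeholder assumptions to tune on real shoots:

```swift
/// Combines face quality (already 0–1) with Laplacian variance into a single
/// 0–1 score. The 500.0 ceiling and the 60/40 weighting are illustrative
/// starting points, not measured constants.
func compositeQuality(blurVariance: Double?, faceQuality: Double?) -> Double? {
    // Normalize variance into 0–1 against an assumed "sharp enough" ceiling.
    let sharpness = blurVariance.map { min($0 / 500.0, 1.0) }
    switch (sharpness, faceQuality) {
    case let (s?, f?): return 0.4 * s + 0.6 * f  // faces dominate for portraits
    case let (s?, nil): return s                 // no faces: global sharpness only
    case let (nil, f?): return f
    default: return nil
    }
}
```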
+11
cull/Assets.xcassets/AccentColor.colorset/Contents.json
```json
{
  "colors" : [
    {
      "idiom" : "universal"
    }
  ],
  "info" : {
    "author" : "xcode",
    "version" : 1
  }
}
```
+58
cull/Assets.xcassets/AppIcon.appiconset/Contents.json
```json
{
  "images" : [
    {
      "idiom" : "mac",
      "scale" : "1x",
      "size" : "16x16"
    },
    {
      "idiom" : "mac",
      "scale" : "2x",
      "size" : "16x16"
    },
    {
      "idiom" : "mac",
      "scale" : "1x",
      "size" : "32x32"
    },
    {
      "idiom" : "mac",
      "scale" : "2x",
      "size" : "32x32"
    },
    {
      "idiom" : "mac",
      "scale" : "1x",
      "size" : "128x128"
    },
    {
      "idiom" : "mac",
      "scale" : "2x",
      "size" : "128x128"
    },
    {
      "idiom" : "mac",
      "scale" : "1x",
      "size" : "256x256"
    },
    {
      "idiom" : "mac",
      "scale" : "2x",
      "size" : "256x256"
    },
    {
      "idiom" : "mac",
      "scale" : "1x",
      "size" : "512x512"
    },
    {
      "idiom" : "mac",
      "scale" : "2x",
      "size" : "512x512"
    }
  ],
  "info" : {
    "author" : "xcode",
    "version" : 1
  }
}
```
+6
cull/Assets.xcassets/Contents.json
```json
{
  "info" : {
    "author" : "xcode",
    "version" : 1
  }
}
```
+16
cull/CullApp.swift
```swift
import SwiftUI

@main
struct CullApp: App {
    @State private var session = CullSession()
    @State private var thumbnailCache = ThumbnailCache()

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(session)
                .environment(thumbnailCache)
        }
        .windowStyle(.automatic)
    }
}
```
+132
cull/Models/CullSession.swift
```swift
import Foundation
import SwiftUI

@Observable
final class CullSession {
    var sourceFolder: URL?
    var groups: [PhotoGroup] = []
    var selectedGroupIndex: Int = 0
    var selectedPhotoIndex: Int = 0

    var isImporting: Bool = false
    var importProgress: Double = 0

    var selectedGroup: PhotoGroup? {
        guard groups.indices.contains(selectedGroupIndex) else { return nil }
        return groups[selectedGroupIndex]
    }

    var selectedPhoto: Photo? {
        guard let group = selectedGroup,
              group.photos.indices.contains(selectedPhotoIndex) else { return nil }
        return group.photos[selectedPhotoIndex]
    }

    var allPhotos: [Photo] {
        groups.flatMap(\.photos)
    }

    // MARK: - Navigation

    func moveToNextGroup() {
        guard !groups.isEmpty else { return }
        selectedGroupIndex = (selectedGroupIndex + 1) % groups.count
        selectedPhotoIndex = 0
    }

    func moveToPreviousGroup() {
        guard !groups.isEmpty else { return }
        selectedGroupIndex = (selectedGroupIndex - 1 + groups.count) % groups.count
        selectedPhotoIndex = 0
    }

    func moveToNextPhoto() {
        guard let group = selectedGroup else { return }
        if selectedPhotoIndex < group.photos.count - 1 {
            selectedPhotoIndex += 1
        } else {
            moveToNextGroup()
        }
    }

    func moveToPreviousPhoto() {
        if selectedPhotoIndex > 0 {
            selectedPhotoIndex -= 1
        } else {
            moveToPreviousGroup()
            selectedPhotoIndex = max(0, (selectedGroup?.photos.count ?? 1) - 1)
        }
    }

    func selectGroup(at index: Int) {
        guard groups.indices.contains(index) else { return }
        selectedGroupIndex = index
        selectedPhotoIndex = 0
    }

    func selectPhoto(at index: Int) {
        guard let group = selectedGroup, group.photos.indices.contains(index) else { return }
        selectedPhotoIndex = index
    }

    // MARK: - Lookahead

    /// Returns the next N photos from the current position across group boundaries
    func photosAhead(_ count: Int) -> [Photo] {
        var result: [Photo] = []
        var gi = selectedGroupIndex
        var pi = selectedPhotoIndex + 1

        while result.count < count && gi < groups.count {
            let group = groups[gi]
            while pi < group.photos.count && result.count < count {
                result.append(group.photos[pi])
                pi += 1
            }
            gi += 1
            pi = 0
        }
        return result
    }

    /// Returns the previous N photos from the current position across group boundaries
    func photosBehind(_ count: Int) -> [Photo] {
        var result: [Photo] = []
        var gi = selectedGroupIndex
        var pi = selectedPhotoIndex - 1

        while result.count < count && gi >= 0 {
            let group = groups[gi]
            while pi >= 0 && result.count < count {
                result.append(group.photos[pi])
                pi -= 1
            }
            gi -= 1
            if gi >= 0 { pi = groups[gi].photos.count - 1 }
        }
        return result
    }

    // MARK: - Culling Actions

    func setRating(_ rating: Int) {
        guard (1...5).contains(rating) else { return }
        selectedPhoto?.rating = rating
    }

    func togglePick() {
        guard let photo = selectedPhoto else { return }
        photo.flag = photo.flag == .pick ? .none : .pick
    }

    func toggleReject() {
        guard let photo = selectedPhoto else { return }
        photo.flag = photo.flag == .reject ? .none : .reject
    }

    func clearRatingAndFlag() {
        guard let photo = selectedPhoto else { return }
        photo.rating = 0
        photo.flag = .none
    }
}
```
+49
cull/Models/Photo.swift
```swift
import Foundation
import UniformTypeIdentifiers

enum PhotoFlag: Equatable {
    case none
    case pick
    case reject
}

@Observable
final class Photo: Identifiable {
    let id: UUID
    let url: URL
    let basename: String

    /// Paired file — e.g. if this is a RAW, pairedURL points to the JPEG (and vice versa)
    var pairedURL: URL?

    var rating: Int = 0 // 0 = unrated, 1–5
    var flag: PhotoFlag = .none

    // Populated asynchronously by QualityAnalyzer
    var blurScore: Double?
    var faceQualityScore: Double?

    // Populated by ShotGrouper
    var captureDate: Date?

    var isRAW: Bool {
        guard let utType = UTType(filenameExtension: url.pathExtension) else { return false }
        return utType.conforms(to: .rawImage)
    }

    var isJPEG: Bool {
        guard let utType = UTType(filenameExtension: url.pathExtension) else { return false }
        return utType.conforms(to: .jpeg)
    }

    init(url: URL) {
        self.id = UUID()
        self.url = url
        self.basename = url.deletingPathExtension().lastPathComponent
    }
}

extension Photo: Hashable {
    static func == (lhs: Photo, rhs: Photo) -> Bool { lhs.id == rhs.id }
    func hash(into hasher: inout Hasher) { hasher.combine(id) }
}
```
+23
cull/Models/PhotoGroup.swift
```swift
import Foundation

@Observable
final class PhotoGroup: Identifiable {
    let id: UUID
    var photos: [Photo]

    var representativePhoto: Photo? { photos.first }

    var earliestDate: Date? {
        photos.compactMap(\.captureDate).min()
    }

    init(photos: [Photo]) {
        self.id = UUID()
        self.photos = photos
    }
}

extension PhotoGroup: Hashable {
    static func == (lhs: PhotoGroup, rhs: PhotoGroup) -> Bool { lhs.id == rhs.id }
    func hash(into hasher: inout Hasher) { hasher.combine(id) }
}
```
+86
cull/Services/PhotoExporter.swift
```swift
import Foundation

enum ExportFileType: String, CaseIterable, Identifiable {
    case raw = "RAW Only"
    case jpeg = "JPEG Only"
    case both = "RAW + JPEG"

    var id: String { rawValue }
}

enum ExportMode: String, CaseIterable, Identifiable {
    case copy = "Copy"
    case move = "Move"

    var id: String { rawValue }
}

struct ExportOptions {
    var destination: URL
    var fileType: ExportFileType = .both
    var mode: ExportMode = .copy
    var minimumRating: Int = 1 // export photos rated >= this
    var includePickedOnly: Bool = false
}

struct ExportResult {
    let exported: Int
    let skipped: Int
    let errors: [String]
}

struct PhotoExporter {
    static func export(photos: [Photo], options: ExportOptions) async throws -> ExportResult {
        let fm = FileManager.default
        try fm.createDirectory(at: options.destination, withIntermediateDirectories: true)

        var exported = 0
        var skipped = 0
        var errors: [String] = []

        let eligible = photos.filter { photo in
            if photo.flag == .reject { return false }
            if options.includePickedOnly { return photo.flag == .pick }
            return photo.rating >= options.minimumRating
        }

        for photo in eligible {
            let urlsToExport = urlsForExport(photo: photo, fileType: options.fileType)

            for sourceURL in urlsToExport {
                let destURL = options.destination.appendingPathComponent(sourceURL.lastPathComponent)
                do {
                    if fm.fileExists(atPath: destURL.path) {
                        try fm.removeItem(at: destURL)
                    }
                    switch options.mode {
                    case .copy:
                        try fm.copyItem(at: sourceURL, to: destURL)
                    case .move:
                        try fm.moveItem(at: sourceURL, to: destURL)
                    }
                    exported += 1
                } catch {
                    errors.append("\(sourceURL.lastPathComponent): \(error.localizedDescription)")
                }
            }

            skipped += urlsToExport.isEmpty ? 1 : 0
        }

        return ExportResult(exported: exported, skipped: skipped, errors: errors)
    }

    private static func urlsForExport(photo: Photo, fileType: ExportFileType) -> [URL] {
        switch fileType {
        case .both:
            var urls = [photo.url]
            if let paired = photo.pairedURL { urls.append(paired) }
            return urls
        case .raw:
            return photo.isRAW ? [photo.url] : (photo.pairedURL.map { [$0] } ?? [])
        case .jpeg:
            return photo.isJPEG ? [photo.url] : (photo.pairedURL.map { [$0] } ?? [])
        }
    }
}
```
+106
cull/Services/PhotoImporter.swift
```swift
import Foundation
import ImageIO
import UniformTypeIdentifiers

struct PhotoImporter {
    static let supportedExtensions: Set<String> = [
        "cr2", "cr3", "arw", "nef", "dng", "raf", "orf", "rw2",
        "jpg", "jpeg", "heic", "heif", "tiff", "tif", "png"
    ]

    struct ImportResult {
        let photos: [Photo]
        let paired: Int // count of RAW+JPEG pairs found
    }

    static func importFolder(_ url: URL) async throws -> ImportResult {
        let resourceKeys: Set<URLResourceKey> = [.isRegularFileKey, .contentTypeKey]
        guard let enumerator = FileManager.default.enumerator(
            at: url,
            includingPropertiesForKeys: Array(resourceKeys),
            options: [.skipsHiddenFiles, .skipsPackageDescendants]
        ) else {
            throw ImportError.cannotReadFolder
        }

        var filesByBasename: [String: [URL]] = [:]
        var allURLs: [URL] = []

        let urls: [URL] = enumerator.compactMap { $0 as? URL }
        for fileURL in urls {
            let ext = fileURL.pathExtension.lowercased()
            guard supportedExtensions.contains(ext) else { continue }
            allURLs.append(fileURL)
            let basename = fileURL.deletingPathExtension().lastPathComponent
            filesByBasename[basename, default: []].append(fileURL)
        }

        // Build photo objects (no I/O yet)
        var photos: [Photo] = []
        var pairedCount = 0
        var processed: Set<URL> = []

        for (_, urls) in filesByBasename {
            let rawURLs = urls.filter { isRAWExtension($0.pathExtension) }
            let jpegURLs = urls.filter { isJPEGExtension($0.pathExtension) }

            if let rawURL = rawURLs.first, let jpegURL = jpegURLs.first {
                let photo = Photo(url: rawURL)
                photo.pairedURL = jpegURL
                photos.append(photo)
                processed.insert(rawURL)
                processed.insert(jpegURL)
                pairedCount += 1
            }

            for url in urls where !processed.contains(url) {
                let photo = Photo(url: url)
                photos.append(photo)
                processed.insert(url)
            }
        }

        // Read EXIF dates sequentially (header-only reads are fast, ~1ms each)
        for photo in photos {
            let dateURL = photo.pairedURL ?? photo.url
            photo.captureDate = readCaptureDate(from: dateURL)
        }

        photos.sort { ($0.captureDate ?? .distantPast) < ($1.captureDate ?? .distantPast) }

        return ImportResult(photos: photos, paired: pairedCount)
    }

    nonisolated static func readCaptureDate(from url: URL) -> Date? {
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
              let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any],
              let exif = properties[kCGImagePropertyExifDictionary as String] as? [String: Any],
              let dateString = exif[kCGImagePropertyExifDateTimeOriginal as String] as? String
        else { return nil }

        let formatter = DateFormatter()
        formatter.dateFormat = "yyyy:MM:dd HH:mm:ss"
        formatter.locale = Locale(identifier: "en_US_POSIX")
        return formatter.date(from: dateString)
    }

    private static func isRAWExtension(_ ext: String) -> Bool {
        let raw: Set<String> = ["cr2", "cr3", "arw", "nef", "dng", "raf", "orf", "rw2"]
        return raw.contains(ext.lowercased())
    }

    private static func isJPEGExtension(_ ext: String) -> Bool {
        let jpeg: Set<String> = ["jpg", "jpeg"]
        return jpeg.contains(ext.lowercased())
    }
}

enum ImportError: LocalizedError {
    case cannotReadFolder

    var errorDescription: String? {
        switch self {
        case .cannotReadFolder: "Could not read the selected folder."
        }
    }
}
```
+119
cull/Services/QualityAnalyzer.swift
```swift
import CoreImage
import Metal
import MetalPerformanceShaders
import Vision

struct QualityAnalyzer {
    static func analyzeBlur(imageURL: URL) async -> Double? {
        guard let device = MTLCreateSystemDefaultDevice() else { return nil }

        let ciImage: CIImage?
        if let source = CGImageSourceCreateWithURL(imageURL as CFURL, nil) {
            let options: [CFString: Any] = [
                kCGImageSourceCreateThumbnailFromImageIfAbsent: true,
                kCGImageSourceThumbnailMaxPixelSize: 512,
                kCGImageSourceShouldCache: false,
                kCGImageSourceCreateThumbnailWithTransform: true
            ]
            if let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary) {
                ciImage = CIImage(cgImage: cgImage)
            } else {
                ciImage = nil
            }
        } else {
            ciImage = nil
        }

        guard let ci = ciImage,
              let cgImage = CIContext().createCGImage(ci, from: ci.extent)
        else { return nil }

        let width = cgImage.width
        let height = cgImage.height

        let textureDescriptor = MTLTextureDescriptor.texture2DDescriptor(
            pixelFormat: .r32Float, width: width, height: height, mipmapped: false
        )
        textureDescriptor.usage = [.shaderRead, .shaderWrite]

        guard let sourceTexture = device.makeTexture(descriptor: textureDescriptor),
              let laplacianTexture = device.makeTexture(descriptor: textureDescriptor)
        else { return nil }

        // Convert to a grayscale float texture
        let colorSpace = CGColorSpaceCreateDeviceGray()
        guard let context = CGContext(
            data: nil, width: width, height: height,
            bitsPerComponent: 32, bytesPerRow: width * 4,
            space: colorSpace,
            bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue | CGBitmapInfo.floatComponents.rawValue | CGBitmapInfo.byteOrder32Little.rawValue).rawValue
        ) else { return nil }
        context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))

        guard let data = context.data else { return nil }
        sourceTexture.replace(
            region: MTLRegionMake2D(0, 0, width, height),
            mipmapLevel: 0,
            withBytes: data,
            bytesPerRow: width * 4
        )

        // Laplacian + variance
        guard let commandQueue = device.makeCommandQueue(),
              let commandBuffer = commandQueue.makeCommandBuffer()
        else { return nil }

        let laplacian = MPSImageLaplacian(device: device)
        laplacian.encode(commandBuffer: commandBuffer, sourceTexture: sourceTexture, destinationTexture: laplacianTexture)

        let varianceDesc = MTLTextureDescriptor.texture2DDescriptor(
            pixelFormat: .r32Float, width: 2, height: 1, mipmapped: false
        )
        varianceDesc.usage = [.shaderRead, .shaderWrite]
        guard let varianceTexture = device.makeTexture(descriptor: varianceDesc) else { return nil }

        let stats = MPSImageStatisticsMeanAndVariance(device: device)
        stats.encode(commandBuffer: commandBuffer, sourceTexture: laplacianTexture, destinationTexture: varianceTexture)

        commandBuffer.commit()
        // MTLCommandBuffer has no async `completed()` API; block this
        // background task briefly until the GPU work finishes.
        commandBuffer.waitUntilCompleted()

        var result = [Float](repeating: 0, count: 2)
        varianceTexture.getBytes(
            &result,
            bytesPerRow: 8,
            from: MTLRegionMake2D(0, 0, 2, 1),
            mipmapLevel: 0
        )

        return Double(result[1]) // variance = sharpness score
    }

    static func analyzeFaceQuality(imageURL: URL) async -> Double? {
        guard let source = CGImageSourceCreateWithURL(imageURL as CFURL, nil) else { return nil }
        let options: [CFString: Any] = [
            kCGImageSourceCreateThumbnailFromImageIfAbsent: true,
            kCGImageSourceThumbnailMaxPixelSize: 1024,
            kCGImageSourceShouldCache: false,
            kCGImageSourceCreateThumbnailWithTransform: true
        ]
        guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary) else { return nil }

        let request = VNDetectFaceCaptureQualityRequest()
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])

        guard let results = request.results, !results.isEmpty else { return nil }
        return results.map { Double($0.faceCaptureQuality ?? 0) }.max()
    }

    static func analyze(photo: Photo) async {
        async let blur = analyzeBlur(imageURL: photo.pairedURL ?? photo.url)
        async let face = analyzeFaceQuality(imageURL: photo.pairedURL ?? photo.url)

        let (blurResult, faceResult) = await (blur, face)
        await MainActor.run {
            photo.blurScore = blurResult
            photo.faceQualityScore = faceResult
        }
    }
}
```
+194
cull/Services/ShotGrouper.swift
```swift
import Vision
import ImageIO

struct ShotGrouper {
    /// Time gap threshold for temporal clustering (seconds)
    static let timeGapThreshold: TimeInterval = 30

    /// Threshold for merging adjacent temporal clusters (seconds)
    /// Groups within this time window get merged if visually similar
    static let mergeTimeThreshold: TimeInterval = 5

    /// Feature print distance threshold for visual similarity (Revision 2, macOS 14+)
    static let similarityThreshold: Float = 0.35

    /// Full grouping: temporal + visual similarity + merge close shots
    static func group(photos: [Photo], progress: (@Sendable (Double) async -> Void)? = nil) async -> [PhotoGroup] {
        guard !photos.isEmpty else { return [] }

        // Step 1: Temporal clustering
        let timeClusters = clusterByTime(photos)
        let totalWork = Double(photos.count)
        var completed = 0.0

        // Step 2: Generate feature prints for all photos
        var featurePrintMap: [UUID: VNFeaturePrintObservation] = [:]
        await withTaskGroup(of: (UUID, VNFeaturePrintObservation?).self) { group in
            for photo in photos {
                let id = photo.id
                group.addTask {
                    let fp = await generateFeaturePrint(for: photo)
                    return (id, fp)
                }
            }
            for await (id, fp) in group {
                if let fp { featurePrintMap[id] = fp }
                completed += 1
                if let progress {
                    await progress(completed / totalWork)
                }
            }
        }

        // Step 3: Sub-cluster by visual similarity within each time cluster
        var groups: [PhotoGroup] = []
        for cluster in timeClusters {
            if cluster.count <= 1 {
                groups.append(PhotoGroup(photos: cluster))
                continue
            }

            let fps = cluster.compactMap { photo -> (Photo, VNFeaturePrintObservation)? in
                guard let fp = featurePrintMap[photo.id] else { return nil }
                return (photo, fp)
            }

            if fps.isEmpty {
                groups.append(PhotoGroup(photos: cluster))
                continue
            }

            let subGroups = clusterByVisualSimilarity(fps, allPhotos: cluster)
            groups.append(contentsOf: subGroups)
        }

        // Step 4: Merge adjacent groups that are very close in time AND visually similar
        groups = mergeAdjacentGroups(groups, featurePrintMap: featurePrintMap)

        return groups
    }

    private static func clusterByTime(_ photos: [Photo]) -> [[Photo]] {
        let sorted = photos.sorted { ($0.captureDate ?? .distantPast) < ($1.captureDate ?? .distantPast) }
        var clusters: [[Photo]] = []
        var current: [Photo] = []

        for photo in sorted {
            if let last = current.last,
               let lastDate = last.captureDate,
               let thisDate = photo.captureDate,
               thisDate.timeIntervalSince(lastDate) > timeGapThreshold {
                clusters.append(current)
                current = []
            }
            current.append(photo)
        }
        if !current.isEmpty { clusters.append(current) }
        return clusters
    }

    private static func clusterByVisualSimilarity(
        _ featurePrints: [(Photo, VNFeaturePrintObservation)],
        allPhotos: [Photo]
    ) -> [PhotoGroup] {
        var assigned = Set<UUID>()
        var groups: [PhotoGroup] = []

        for (i, (photo, fp)) in featurePrints.enumerated() {
            guard !assigned.contains(photo.id) else { continue }

            var cluster = [photo]
            assigned.insert(photo.id)

            for j in (i + 1)..<featurePrints.count {
                let (otherPhoto, otherFP) = featurePrints[j]
                guard !assigned.contains(otherPhoto.id) else { continue }

                // Start at infinity so a failed distance computation
                // never counts as similar.
                var distance = Float.greatestFiniteMagnitude
                try? fp.computeDistance(&distance, to: otherFP)

                if distance < similarityThreshold {
                    cluster.append(otherPhoto)
                    assigned.insert(otherPhoto.id)
                }
            }

            groups.append(PhotoGroup(photos: cluster))
        }

        // Add any photos that failed feature print generation
        let ungrouped = allPhotos.filter { !assigned.contains($0.id) }
        if !ungrouped.isEmpty {
            groups.append(PhotoGroup(photos: ungrouped))
        }

        return groups
    }

    /// Merge adjacent groups if they're within mergeTimeThreshold and visually similar
    private static func mergeAdjacentGroups(
        _ groups: [PhotoGroup],
        featurePrintMap: [UUID: VNFeaturePrintObservation]
    ) -> [PhotoGroup] {
        guard groups.count > 1 else { return groups }

        var merged: [PhotoGroup] = [groups[0]]

        for i in 1..<groups.count {
            let current = groups[i]
            let previous = merged[merged.count - 1]

            let shouldMerge = areGroupsCloseInTime(previous, current) &&
                areGroupsVisuallySimilar(previous, current, featurePrintMap: featurePrintMap)

            if shouldMerge {
                // Merge into previous
                previous.photos.append(contentsOf: current.photos)
            } else {
                merged.append(current)
            }
        }

        return merged
    }

    private static func areGroupsCloseInTime(_ a: PhotoGroup, _ b: PhotoGroup) -> Bool {
        guard let aLast = a.photos.last?.captureDate,
              let bFirst = b.photos.first?.captureDate else { return false }
        return abs(bFirst.timeIntervalSince(aLast)) <= mergeTimeThreshold
    }

    private static func areGroupsVisuallySimilar(
        _ a: PhotoGroup,
        _ b: PhotoGroup,
        featurePrintMap: [UUID: VNFeaturePrintObservation]
    ) -> Bool {
        // Compare representative photos (first of each group)
        guard let aRep = a.photos.first, let bRep = b.photos.first,
              let aFP = featurePrintMap[aRep.id], let bFP = featurePrintMap[bRep.id]
        else { return false }

        // Same guard as above: a failed computation must not merge groups.
        var distance = Float.greatestFiniteMagnitude
        try? aFP.computeDistance(&distance, to: bFP)
        return distance < similarityThreshold
    }

    private static func generateFeaturePrint(for photo: Photo) async -> VNFeaturePrintObservation? {
        let url = photo.pairedURL ?? photo.url
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }

        let options: [CFString: Any] = [
            kCGImageSourceCreateThumbnailFromImageIfAbsent: true,
            kCGImageSourceThumbnailMaxPixelSize: 512,
            kCGImageSourceShouldCache: false,
            kCGImageSourceCreateThumbnailWithTransform: true
        ]
        guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary) else { return nil }

        let request = VNGenerateImageFeaturePrintRequest()
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])

        return request.results?.first
    }
}
```
+306
cull/Services/ThumbnailCache.swift
```swift
import AppKit
import CryptoKit
import ImageIO

@MainActor @Observable
final class ThumbnailCache {
    private let memoryCache = NSCache<NSString, NSImage>()
    private let previewCache = NSCache<NSString, NSImage>()
    private var previewKeys = Set<String>()
    private let diskCacheURL: URL
    private let maxPixelSize: Int

    init(maxPixelSize: Int = 400) {
        self.maxPixelSize = maxPixelSize
        self.diskCacheURL = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
            .appendingPathComponent("sh.dunkirk.Cull.thumbnails", isDirectory: true)

        memoryCache.countLimit = 500
        memoryCache.totalCostLimit = 100 * 1024 * 1024 // 100 MB

        previewCache.countLimit = 110
        previewCache.totalCostLimit = 512 * 1024 * 1024 // 512 MB

        try? FileManager.default.createDirectory(at: diskCacheURL, withIntermediateDirectories: true)
    }

    // MARK: - Synchronous lookups (instant, memory only)

    func cachedThumbnail(for photo: Photo) -> NSImage? {
        memoryCache.object(forKey: photo.url.absoluteString as NSString)
    }

    func cachedPreview(for photo: Photo) -> NSImage? {
        previewCache.object(forKey: photo.url.absoluteString as NSString)
    }

    // MARK: - Async loading

    func thumbnail(for photo: Photo) async -> NSImage? {
        let key = photo.url.absoluteString
        let sourceURL = photo.pairedURL ?? photo.url

        if let cached = memoryCache.object(forKey: key as NSString) {
            return cached
        }

        let diskPath = diskCacheURL.appendingPathComponent(stableDiskKey(for: photo.url))
        let pixelSize = maxPixelSize

        let image: NSImage? = await Task.detached(priority: .userInitiated) { () -> NSImage? in
            if let diskImage = NSImage(contentsOf: diskPath) {
                return diskImage
            }
            guard let extracted = Self.extractThumbnailSync(from: sourceURL, maxPixelSize: pixelSize) else { return nil }
            Self.saveToDisk(extracted, at: diskPath)
            return extracted
        }.value

        if let image {
            memoryCache.setObject(image, forKey: key as NSString)
        }
        return image
    }

    func previewImage(for photo: Photo) async -> NSImage? {
        let key = photo.url.absoluteString

        if let cached = previewCache.object(forKey: key as NSString) {
            return cached
        }

        let url = photo.pairedURL ?? photo.url

        let image: NSImage? = await Task.detached(priority: .userInitiated) { () -> NSImage? in
            Self.loadFullPreviewSync(from: url)
        }.value

        if let image {
            previewCache.setObject(image, forKey: key as NSString)
            previewKeys.insert(key)
        }
        return image
    }

    // MARK: - Preloading

    /// Load all thumbnails into memory, awaiting completion. Reports progress.
    func preloadAllThumbnails(
        photos: [Photo],
        progress: (@Sendable (Double) async -> Void)? = nil
    ) async {
        let thumbWork: [(String, URL, URL)] = photos.map { photo in
            (photo.url.absoluteString, photo.pairedURL ?? photo.url, photo.url)
        }

        let totalItems = Double(thumbWork.count)
        var completed = 0.0
        let pixelSize = maxPixelSize
        let diskCache = diskCacheURL
        let mc = memoryCache
        let batchSize = 8

        for batchStart in stride(from: 0, to: thumbWork.count, by: batchSize) {
            let batchEnd = min(batchStart + batchSize, thumbWork.count)
            let batch = Array(thumbWork[batchStart..<batchEnd])
            await withTaskGroup(of: (String, NSImage?).self) { group in
                for (key, sourceURL, photoURL) in batch {
                    let diskPath = diskCache.appendingPathComponent(Self.stableDiskKey(for: photoURL))
                    group.addTask {
                        if let diskImage = NSImage(contentsOf: diskPath) {
                            return (key, diskImage)
                        }
                        guard let extracted = Self.extractThumbnailSync(from: sourceURL, maxPixelSize: pixelSize) else {
                            return (key, nil)
                        }
                        Self.saveToDisk(extracted, at: diskPath)
                        return (key, extracted)
                    }
                }
                for await (key, image) in group {
                    if let image {
                        mc.setObject(image, forKey: key as NSString)
                    }
                    completed += 1
                    if let progress {
                        await progress(completed / totalItems)
                    }
                }
            }
        }
    }

    func preload(photos: [Photo]) {
        let work: [(String, URL, URL)] = photos.compactMap { photo in
            let key = photo.url.absoluteString
            guard memoryCache.object(forKey: key as NSString) == nil else { return nil }
            return (key, photo.pairedURL ?? photo.url, photo.url)
        }
        guard !work.isEmpty else { return }

        let pixelSize = maxPixelSize
        let diskCache = diskCacheURL
        let mc = memoryCache

        Task.detached(priority: .utility) {
            for batchStart in stride(from: 0, to: work.count, by: 8) {
                let batch = Array(work[batchStart..<min(batchStart + 8, work.count)])
                await withTaskGroup(of: (String, NSImage?).self) { group in
                    for (key, sourceURL, photoURL) in batch {
                        let diskPath = diskCache.appendingPathComponent(Self.stableDiskKey(for: photoURL))
                        group.addTask {
                            if let diskImage = NSImage(contentsOf: diskPath) {
                                return (key, diskImage)
                            }
                            guard let extracted = Self.extractThumbnailSync(from: sourceURL, maxPixelSize: pixelSize) else {
                                return (key, nil)
                            }
                            Self.saveToDisk(extracted, at: diskPath)
                            return (key, extracted)
                        }
                    }
                    for await (key, image) in group {
                        if let image {
                            await MainActor.run { mc.setObject(image, forKey: key as NSString) }
                        }
                    }
                }
            }
        }
    }

    /// Awaitable: load previews and report progress. Used during import.
    func preloadAllPreviews(
        photos: [Photo],
        progress: (@Sendable (Double) async -> Void)? = nil
    ) async {
        let work: [(String, URL)] = photos.map { photo in
            (photo.url.absoluteString, photo.pairedURL ?? photo.url)
        }

        let totalItems = Double(work.count)
        var completed = 0.0
        let pc = previewCache
        let batchSize = 4

        for batchStart in stride(from: 0, to: work.count, by: batchSize) {
            let batch = Array(work[batchStart..<min(batchStart + batchSize, work.count)])
            await withTaskGroup(of: (String, NSImage?).self) { group in
                for (key, url) in batch {
                    group.addTask {
                        (key, Self.loadFullPreviewSync(from: url))
                    }
                }
                for await (key, image) in group {
                    if let image {
                        pc.setObject(image, forKey: key as NSString)
                        previewKeys.insert(key)
                    }
                    completed += 1
                    if let progress {
                        await progress(completed / totalItems)
                    }
                }
            }
        }
    }

    /// Fire-and-forget: preload previews in background. Used during navigation.
    func preloadPreviews(photos: [Photo]) {
        let work: [(String, URL)] = photos.compactMap { photo in
            let key = photo.url.absoluteString
            guard previewCache.object(forKey: key as NSString) == nil else { return nil }
            return (key, photo.pairedURL ?? photo.url)
        }
        guard !work.isEmpty else { return }

        let pc = previewCache

        Task.detached(priority: .utility) {
            for batchStart in stride(from: 0, to: work.count, by: 4) {
                let batch = Array(work[batchStart..<min(batchStart + 4, work.count)])
                await withTaskGroup(of: (String, NSImage?).self) { group in
                    for (key, url) in batch {
                        group.addTask {
                            (key, Self.loadFullPreviewSync(from: url))
                        }
                    }
                    for await (key, image) in group {
                        if let image {
                            await MainActor.run {
                                pc.setObject(image, forKey: key as NSString)
                                self.previewKeys.insert(key)
                            }
                        }
                    }
                }
            }
        }
    }

    /// Remove previews that are outside the current window
    func evictPreviews(keeping photos: [Photo]) {
        let keepKeys = Set(photos.map { $0.url.absoluteString })
        for key in previewKeys where !keepKeys.contains(key) {
            previewCache.removeObject(forKey: key as NSString)
            previewKeys.remove(key)
        }
    }

    // MARK: - Sync image extraction

    nonisolated private static func extractThumbnailSync(from url: URL, maxPixelSize: Int) -> NSImage? {
        let options: [CFString: Any] = [
            kCGImageSourceCreateThumbnailFromImageIfAbsent: true,
            kCGImageSourceThumbnailMaxPixelSize: maxPixelSize,
            kCGImageSourceShouldCache: false,
            kCGImageSourceCreateThumbnailWithTransform: true
        ]
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
              let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary)
        else { return nil }
        return NSImage(cgImage: cgImage, size: NSSize(width: cgImage.width, height: cgImage.height))
    }

    nonisolated private static func loadFullPreviewSync(from url: URL) -> NSImage? {
        // Use the thumbnail API with kCGImageSourceCreateThumbnailFromImageAlways
        // to force a full decode + downscale while respecting EXIF orientation
        let options: [CFString: Any] = [
            kCGImageSourceCreateThumbnailFromImageAlways: true,
            kCGImageSourceThumbnailMaxPixelSize: 2560,
            kCGImageSourceShouldCache: false,
            kCGImageSourceCreateThumbnailWithTransform: true
        ]
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
              let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary)
        else { return nil }
        return NSImage(cgImage: cgImage, size: NSSize(width: cgImage.width, height: cgImage.height))
    }

    // MARK: - Utilities

    private func stableDiskKey(for url: URL) -> String {
        Self.stableDiskKey(for: url)
    }

    nonisolated private static func stableDiskKey(for url: URL) -> String {
        let data = Data(url.absoluteString.utf8)
        let digest = SHA256.hash(data: data)
        return digest.map { String(format: "%02x", $0) }.joined() + ".jpg"
    }

    nonisolated private static func saveToDisk(_ image: NSImage, at url: URL) {
        guard let tiff = image.tiffRepresentation,
              let bitmap = NSBitmapImageRep(data: tiff),
              let jpegData = bitmap.representation(using: .jpeg, properties: [.compressionFactor: 0.7])
        else { return }
        try? jpegData.write(to: url)
    }

    func clearCache() {
        memoryCache.removeAllObjects()
        previewCache.removeAllObjects()
        try? FileManager.default.removeItem(at: diskCacheURL)
        try? FileManager.default.createDirectory(at: diskCacheURL, withIntermediateDirectories: true)
    }
}
```
+130
cull/Views/ContentView.swift
··· 1 + import SwiftUI 2 + 3 + struct ContentView: View { 4 + @Environment(CullSession.self) private var session 5 + @State private var showExportSheet = false 6 + @FocusState private var isViewerFocused: Bool 7 + 8 + var body: some View { 9 + Group { 10 + if session.sourceFolder == nil { 11 + ImportView() 12 + } else if session.isImporting { 13 + VStack(spacing: 16) { 14 + Text("Analyzing photos...") 15 + .font(.title3) 16 + .foregroundStyle(.secondary) 17 + ProgressView(value: session.importProgress) 18 + .frame(width: 300) 19 + Text("\(Int(session.importProgress * 100))%") 20 + .font(.caption) 21 + .foregroundStyle(.tertiary) 22 + } 23 + .frame(maxWidth: .infinity, maxHeight: .infinity) 24 + } else if session.groups.isEmpty { 25 + VStack(spacing: 12) { 26 + Image(systemName: "photo.badge.exclamationmark") 27 + .font(.system(size: 48)) 28 + .foregroundStyle(.secondary) 29 + Text("No supported photos found") 30 + .font(.title3) 31 + .foregroundStyle(.secondary) 32 + Button("Choose Another Folder") { session.sourceFolder = nil } 33 + } 34 + .frame(maxWidth: .infinity, maxHeight: .infinity) 35 + } else { 36 + cullingView 37 + } 38 + } 39 + .frame(minWidth: 1000, minHeight: 600) 40 + } 41 + 42 + private var cullingView: some View { 43 + HStack(spacing: 0) { 44 + // Left: Groups column 45 + GroupListView() 46 + .frame(width: 120) 47 + 48 + Divider() 49 + 50 + // Middle: Photos in selected group 51 + GroupDetailView() 52 + .frame(width: 160) 53 + 54 + Divider() 55 + 56 + // Right: Large preview 57 + PhotoViewer() 58 + } 59 + .focusable() 60 + .focused($isViewerFocused) 61 + .focusEffectDisabled() 62 + // Narrative-style: ↑/↓ = photos, ←/→ = scenes/groups 63 + .onKeyPress(.upArrow) { session.moveToPreviousPhoto(); return .handled } 64 + .onKeyPress(.downArrow) { session.moveToNextPhoto(); return .handled } 65 + .onKeyPress(.leftArrow) { session.moveToPreviousGroup(); return .handled } 66 + .onKeyPress(.rightArrow) { session.moveToNextGroup(); return .handled } 67 + .onKeyPress(keys: ["p"]) { _ in session.togglePick(); return .handled } 68 + .onKeyPress(keys: ["x"]) { _ in session.toggleReject(); return .handled } 69 + .onKeyPress(keys: ["0"]) { _ in session.clearRatingAndFlag(); return .handled } 70 + .onKeyPress(characters: .decimalDigits) { press in 71 + if let digit = Int(press.characters), (1...5).contains(digit) { 72 + session.setRating(digit) 73 + return .handled 74 + } 75 + return .ignored 76 + } 77 + .onKeyPress(keys: ["e"]) { _ in showExportSheet = true; return .handled } 78 + .onAppear { isViewerFocused = true } 79 + .onChange(of: session.selectedGroupIndex) { isViewerFocused = true } 80 + .onChange(of: session.selectedPhotoIndex) { isViewerFocused = true } 81 + .sheet(isPresented: $showExportSheet) { 82 + ExportSheet() 83 + } 84 + .toolbar { 85 + ToolbarItem(placement: .automatic) { 86 + HStack(spacing: 4) { 87 + Button { session.togglePick() } label: { 88 + Image(systemName: "checkmark.circle") 89 + } 90 + .help("Pick (P)") 91 + 92 + Button { session.toggleReject() } label: { 93 + Image(systemName: "xmark.circle") 94 + } 95 + .help("Reject (X)") 96 + } 97 + } 98 + 99 + ToolbarItem(placement: .automatic) { 100 + HStack(spacing: 2) { 101 + ForEach(1...5, id: \.self) { star in 102 + Button { session.setRating(star) } label: { 103 + Image(systemName: star <= (session.selectedPhoto?.rating ?? 0) ? "star.fill" : "star") 104 + .foregroundStyle(star <= (session.selectedPhoto?.rating ?? 0) ? 
.yellow : .secondary) 105 + } 106 + .help("Rate \(star)") 107 + } 108 + } 109 + } 110 + 111 + ToolbarItem(placement: .automatic) { 112 + Spacer() 113 + } 114 + 115 + ToolbarItem(placement: .automatic) { 116 + Button { showExportSheet = true } label: { 117 + Image(systemName: "square.and.arrow.up") 118 + } 119 + .help("Export (E)") 120 + } 121 + 122 + ToolbarItem(placement: .automatic) { 123 + Button { session.sourceFolder = nil } label: { 124 + Image(systemName: "folder") 125 + } 126 + .help("Open Folder") 127 + } 128 + } 129 + } 130 + }
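A subtlety in ContentView's key handling worth calling out: the `.decimalDigits` handler returns `.ignored` for anything outside 1–5, so `0` never collides with the dedicated clear binding and unhandled digits keep propagating up the focus chain. A self-contained illustration of that `.handled`/`.ignored` split (the view and state names here are invented for the demo):

```swift
import SwiftUI

// Minimal illustration of .handled/.ignored propagation with overlapping
// onKeyPress modifiers, mirroring the 0-vs-1...5 split in ContentView.
struct KeyPressDemo: View {
    @State private var last = "none"

    var body: some View {
        Text(last)
            .focusable()
            .onKeyPress(keys: ["0"]) { _ in
                last = "cleared"            // dedicated binding for 0
                return .handled
            }
            .onKeyPress(characters: .decimalDigits) { press in
                guard let digit = Int(press.characters), (1...5).contains(digit) else {
                    return .ignored         // 0 and 6-9 fall through untouched
                }
                last = "rated \(digit)"
                return .handled
            }
    }
}
```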
+133
cull/Views/ExportSheet.swift
··· 1 + import SwiftUI 2 + 3 + struct ExportSheet: View { 4 + @Environment(CullSession.self) private var session 5 + @Environment(\.dismiss) private var dismiss 6 + 7 + @State private var fileType: ExportFileType = .both 8 + @State private var exportMode: ExportMode = .copy 9 + @State private var minimumRating: Int = 1 10 + @State private var pickedOnly: Bool = false 11 + @State private var destination: URL? 12 + @State private var isExporting: Bool = false 13 + @State private var result: ExportResult? 14 + 15 + private var eligibleCount: Int { 16 + session.allPhotos.filter { photo in 17 + if pickedOnly && photo.flag != .pick { return false } 18 + if photo.flag == .reject { return false } 19 + return photo.rating >= minimumRating 20 + }.count 21 + } 22 + 23 + var body: some View { 24 + VStack(spacing: 20) { 25 + Text("Export Photos") 26 + .font(.title2.bold()) 27 + 28 + Form { 29 + Picker("File Type", selection: $fileType) { 30 + ForEach(ExportFileType.allCases) { type in 31 + Text(type.rawValue).tag(type) 32 + } 33 + } 34 + 35 + Picker("Mode", selection: $exportMode) { 36 + ForEach(ExportMode.allCases) { mode in 37 + Text(mode.rawValue).tag(mode) 38 + } 39 + } 40 + 41 + Picker("Minimum Rating", selection: $minimumRating) { 42 + Text("All (unrated included)").tag(0) 43 + ForEach(1...5, id: \.self) { rating in 44 + HStack(spacing: 1) { 45 + ForEach(1...rating, id: \.self) { _ in 46 + Image(systemName: "star.fill") 47 + .font(.caption2) 48 + } 49 + } 50 + .tag(rating) 51 + } 52 + } 53 + 54 + Toggle("Picked only", isOn: $pickedOnly) 55 + 56 + HStack { 57 + if let destination { 58 + Text(destination.lastPathComponent) 59 + .lineLimit(1) 60 + .truncationMode(.middle) 61 + } else { 62 + Text("No destination selected") 63 + .foregroundStyle(.secondary) 64 + } 65 + Spacer() 66 + Button("Choose...") { chooseDestination() } 67 + } 68 + } 69 + .formStyle(.grouped) 70 + 71 + Text("\(eligibleCount) photos will be \(exportMode == .move ? "moved" : "copied")") 72 + .foregroundStyle(.secondary) 73 + 74 + if let result { 75 + VStack(spacing: 4) { 76 + Text("Exported \(result.exported) files") 77 + .foregroundStyle(.green) 78 + if !result.errors.isEmpty { 79 + Text("\(result.errors.count) errors") 80 + .foregroundStyle(.red) 81 + } 82 + } 83 + } 84 + 85 + HStack { 86 + Button("Cancel") { dismiss() } 87 + .keyboardShortcut(.cancelAction) 88 + 89 + Button("Export") { runExport() } 90 + .buttonStyle(.borderedProminent) 91 + .disabled(destination == nil || isExporting || eligibleCount == 0) 92 + .keyboardShortcut(.defaultAction) 93 + } 94 + } 95 + .padding(20) 96 + .frame(width: 400) 97 + } 98 + 99 + private func chooseDestination() { 100 + let panel = NSOpenPanel() 101 + panel.canChooseDirectories = true 102 + panel.canChooseFiles = false 103 + panel.canCreateDirectories = true 104 + panel.message = "Choose export destination" 105 + 106 + if panel.runModal() == .OK { 107 + destination = panel.url 108 + } 109 + } 110 + 111 + private func runExport() { 112 + guard let destination else { return } 113 + isExporting = true 114 + 115 + Task { 116 + let options = ExportOptions( 117 + destination: destination, 118 + fileType: fileType, 119 + mode: exportMode, 120 + minimumRating: minimumRating, 121 + includePickedOnly: pickedOnly 122 + ) 123 + let exportResult = try? await PhotoExporter.export( 124 + photos: session.allPhotos, 125 + options: options 126 + ) 127 + await MainActor.run { 128 + result = exportResult 129 + isExporting = false 130 + } 131 + } 132 + } 133 + }
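ExportSheet references four types defined in PhotoExporter.swift, earlier in this diff. Inferred purely from the sheet's call sites, their shapes are roughly as follows; the case names other than `.both`, `.copy`, and `.move`, the display strings, and the `String` error payload are assumptions:

```swift
import Foundation

// Plausible shapes for the PhotoExporter.swift types used above, inferred
// from usage; PhotoExporter.swift earlier in this diff is authoritative.
enum ExportFileType: String, CaseIterable, Identifiable {
    case both = "RAW + JPEG"
    case rawOnly = "RAW only"
    case jpegOnly = "JPEG only"
    var id: String { rawValue }
}

enum ExportMode: String, CaseIterable, Identifiable {
    case copy = "Copy"
    case move = "Move"
    var id: String { rawValue }
}

struct ExportOptions {
    let destination: URL
    let fileType: ExportFileType
    let mode: ExportMode
    let minimumRating: Int
    let includePickedOnly: Bool
}

struct ExportResult {
    let exported: Int
    let errors: [String]    // assumption: human-readable failure descriptions
}
```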
+94
cull/Views/GroupDetailView.swift
··· 1 + import SwiftUI 2 + 3 + struct GroupDetailView: View { 4 + @Environment(CullSession.self) private var session 5 + @Environment(ThumbnailCache.self) private var cache 6 + 7 + var body: some View { 8 + ScrollViewReader { proxy in 9 + ScrollView { 10 + if let group = session.selectedGroup { 11 + LazyVStack(spacing: 2) { 12 + ForEach(Array(group.photos.enumerated()), id: \.element.id) { index, photo in 13 + PhotoThumbnail( 14 + photo: photo, 15 + isSelected: index == session.selectedPhotoIndex 16 + ) 17 + .id(photo.id) 18 + .onTapGesture { 19 + session.selectPhoto(at: index) 20 + } 21 + } 22 + } 23 + .padding(4) 24 + } 25 + } 26 + .onChange(of: session.selectedPhotoIndex) { _, _ in 27 + if let photo = session.selectedPhoto { 28 + withAnimation { 29 + proxy.scrollTo(photo.id, anchor: .center) 30 + } 31 + } 32 + } 33 + } 34 + } 35 + } 36 + 37 + private struct PhotoThumbnail: View { 38 + let photo: Photo 39 + let isSelected: Bool 40 + @Environment(ThumbnailCache.self) private var cache 41 + @State private var thumbnail: NSImage? 42 + 43 + var body: some View { 44 + VStack(spacing: 2) { 45 + ZStack(alignment: .topLeading) { 46 + if let thumbnail { 47 + Image(nsImage: thumbnail) 48 + .resizable() 49 + .aspectRatio(contentMode: .fill) 50 + .frame(width: 148, height: 100) 51 + .clipped() 52 + } else { 53 + Rectangle() 54 + .fill(.quaternary) 55 + .frame(width: 148, height: 100) 56 + } 57 + 58 + // Flag badge 59 + if photo.flag != .none { 60 + Image(systemName: photo.flag == .pick ? "checkmark.circle.fill" : "xmark.circle.fill") 61 + .foregroundStyle(photo.flag == .pick ? .green : .red) 62 + .font(.caption) 63 + .padding(4) 64 + } 65 + } 66 + 67 + // Rating stars 68 + if photo.rating > 0 { 69 + HStack(spacing: 1) { 70 + ForEach(1...5, id: \.self) { star in 71 + Image(systemName: star <= photo.rating ? "star.fill" : "star") 72 + .font(.system(size: 8)) 73 + .foregroundStyle(star <= photo.rating ? Color.yellow : Color.gray) 74 + } 75 + } 76 + } 77 + } 78 + .clipShape(RoundedRectangle(cornerRadius: 6)) 79 + .overlay { 80 + RoundedRectangle(cornerRadius: 6) 81 + .strokeBorder(isSelected ? Color.accentColor : .clear, lineWidth: 2) 82 + } 83 + .opacity(photo.flag == .reject ? 0.5 : 1.0) 84 + .onAppear { 85 + if let cached = cache.cachedThumbnail(for: photo) { 86 + thumbnail = cached 87 + } 88 + } 89 + .task(id: photo.id) { 90 + guard thumbnail == nil else { return } 91 + thumbnail = await cache.thumbnail(for: photo) 92 + } 93 + } 94 + }
+86
cull/Views/GroupListView.swift
··· 1 + import SwiftUI 2 + 3 + struct GroupListView: View { 4 + @Environment(CullSession.self) private var session 5 + @Environment(ThumbnailCache.self) private var cache 6 + 7 + var body: some View { 8 + ScrollViewReader { proxy in 9 + ScrollView { 10 + LazyVStack(spacing: 2) { 11 + ForEach(Array(session.groups.enumerated()), id: \.element.id) { index, group in 12 + GroupThumbnail( 13 + group: group, 14 + index: index, 15 + isSelected: index == session.selectedGroupIndex 16 + ) 17 + .id(group.id) 18 + .onTapGesture { 19 + session.selectGroup(at: index) 20 + } 21 + } 22 + } 23 + .padding(4) 24 + } 25 + .onChange(of: session.selectedGroupIndex) { _, newIndex in 26 + if let group = session.groups[safe: newIndex] { 27 + withAnimation { 28 + proxy.scrollTo(group.id, anchor: .center) 29 + } 30 + } 31 + } 32 + } 33 + } 34 + } 35 + 36 + private struct GroupThumbnail: View { 37 + let group: PhotoGroup 38 + let index: Int 39 + let isSelected: Bool 40 + @Environment(ThumbnailCache.self) private var cache 41 + @State private var thumbnail: NSImage? 42 + 43 + var body: some View { 44 + ZStack(alignment: .bottomTrailing) { 45 + if let thumbnail { 46 + Image(nsImage: thumbnail) 47 + .resizable() 48 + .aspectRatio(contentMode: .fill) 49 + .frame(width: 112, height: 80) 50 + .clipped() 51 + } else { 52 + Rectangle() 53 + .fill(.quaternary) 54 + .frame(width: 112, height: 80) 55 + } 56 + 57 + Text("\(group.photos.count)") 58 + .font(.caption2.bold()) 59 + .padding(.horizontal, 5) 60 + .padding(.vertical, 2) 61 + .background(.ultraThinMaterial, in: Capsule()) 62 + .padding(4) 63 + } 64 + .clipShape(RoundedRectangle(cornerRadius: 6)) 65 + .overlay { 66 + RoundedRectangle(cornerRadius: 6) 67 + .strokeBorder(isSelected ? Color.accentColor : .clear, lineWidth: 2) 68 + } 69 + .onAppear { 70 + guard let photo = group.representativePhoto else { return } 71 + if let cached = cache.cachedThumbnail(for: photo) { 72 + thumbnail = cached 73 + } 74 + } 75 + .task(id: group.representativePhoto?.id) { 76 + guard thumbnail == nil, let photo = group.representativePhoto else { return } 77 + thumbnail = await cache.thumbnail(for: photo) 78 + } 79 + } 80 + } 81 + 82 + extension Collection { 83 + subscript(safe index: Index) -> Element? { 84 + indices.contains(index) ? self[index] : nil 85 + } 86 + }
+153
cull/Views/ImportView.swift
··· 1 + import SwiftUI 2 + import UniformTypeIdentifiers 3 + 4 + struct ImportView: View { 5 + @Environment(CullSession.self) private var session 6 + @Environment(ThumbnailCache.self) private var cache 7 + @State private var isDragging = false 8 + 9 + var body: some View { 10 + VStack(spacing: 20) { 11 + Image(systemName: "photo.on.rectangle.angled") 12 + .font(.system(size: 64)) 13 + .foregroundStyle(.secondary) 14 + 15 + Text("Open a folder of photos to start culling") 16 + .font(.title2) 17 + .foregroundStyle(.secondary) 18 + 19 + Button("Choose Folder") { 20 + openFolder() 21 + } 22 + .buttonStyle(.borderedProminent) 23 + .controlSize(.large) 24 + 25 + Text("or drag a folder here") 26 + .font(.caption) 27 + .foregroundStyle(.tertiary) 28 + } 29 + .frame(maxWidth: .infinity, maxHeight: .infinity) 30 + .background { 31 + RoundedRectangle(cornerRadius: 12) 32 + .strokeBorder(isDragging ? Color.accentColor : Color.clear, lineWidth: 3) 33 + .padding(20) 34 + } 35 + .onDrop(of: [.fileURL], isTargeted: $isDragging) { providers in 36 + guard let provider = providers.first else { return false } 37 + _ = provider.loadObject(ofClass: URL.self) { url, _ in 38 + guard let url, url.hasDirectoryPath else { return } 39 + Task { @MainActor in 40 + startImport(url) 41 + } 42 + } 43 + return true 44 + } 45 + } 46 + 47 + private func openFolder() { 48 + let panel = NSOpenPanel() 49 + panel.canChooseDirectories = true 50 + panel.canChooseFiles = false 51 + panel.allowsMultipleSelection = false 52 + panel.message = "Select a folder containing photos" 53 + 54 + guard panel.runModal() == .OK, let url = panel.url else { return } 55 + startImport(url) 56 + } 57 + 58 + @MainActor 59 + private func startImport(_ url: URL) { 60 + session.sourceFolder = url 61 + session.isImporting = true 62 + session.importProgress = 0.02 // small initial bump so bar is visible 63 + 64 + let s = session 65 + let c = cache 66 + 67 + Task { 68 + do { 69 + let result = try await PhotoImporter.importFolder(url) 70 + 71 + // Feature print grouping — run off main actor 72 + var lastReported = 0.0 73 + let groups = await ShotGrouper.group(photos: result.photos) { p in 74 + let mapped = p * 0.95 75 + guard mapped - lastReported > 0.02 else { return } 76 + lastReported = mapped 77 + await MainActor.run { 78 + withAnimation(.linear(duration: 0.3)) { 79 + s.importProgress = mapped 80 + } 81 + } 82 + } 83 + 84 + // Phase 2: Load thumbnails into memory (95-98%) 85 + let allPhotos = groups.flatMap(\.photos) 86 + var lastCacheReported = 0.95 87 + await c.preloadAllThumbnails(photos: allPhotos) { p in 88 + let mapped = 0.95 + p * 0.03 89 + guard mapped - lastCacheReported > 0.005 else { return } 90 + lastCacheReported = mapped 91 + await MainActor.run { 92 + withAnimation(.linear(duration: 0.2)) { 93 + s.importProgress = mapped 94 + } 95 + } 96 + } 97 + 98 + // Phase 3: Preload first 50 full-res previews (98-100%) 99 + let initialPreviews = Array(allPhotos.prefix(50)) 100 + var lastPreviewReported = 0.98 101 + await c.preloadAllPreviews(photos: initialPreviews) { p in 102 + let mapped = 0.98 + p * 0.02 103 + guard mapped - lastPreviewReported > 0.005 else { return } 104 + lastPreviewReported = mapped 105 + await MainActor.run { 106 + withAnimation(.linear(duration: 0.2)) { 107 + s.importProgress = mapped 108 + } 109 + } 110 + } 111 + 112 + await MainActor.run { 113 + s.importProgress = 1.0 114 + s.groups = groups 115 + s.selectedGroupIndex = 0 116 + s.selectedPhotoIndex = 0 117 + s.isImporting = false 118 + } 119 + 120 + // Quality analysis in 
background — batched to avoid overwhelming GPU 121 + let analysisWork: [(UUID, URL)] = allPhotos.map { ($0.id, $0.pairedURL ?? $0.url) } 122 + let photosByID: [UUID: Photo] = Dictionary(uniqueKeysWithValues: allPhotos.map { ($0.id, $0) }) 123 + Task.detached(priority: .background) { 124 + for batchStart in stride(from: 0, to: analysisWork.count, by: 4) { 125 + let batch = Array(analysisWork[batchStart..<min(batchStart + 4, analysisWork.count)]) 126 + await withTaskGroup(of: (UUID, Double?, Double?).self) { group in 127 + for (id, url) in batch { 128 + group.addTask { 129 + let blur = await QualityAnalyzer.analyzeBlur(imageURL: url) 130 + let face = await QualityAnalyzer.analyzeFaceQuality(imageURL: url) 131 + return (id, blur, face) 132 + } 133 + } 134 + for await (id, blur, face) in group { 135 + await MainActor.run { 136 + if let photo = photosByID[id] { 137 + photo.blurScore = blur 138 + photo.faceQualityScore = face 139 + } 140 + } 141 + } 142 + } 143 + } 144 + } 145 + } catch { 146 + await MainActor.run { 147 + s.sourceFolder = nil 148 + s.isImporting = false 149 + } 150 + } 151 + } 152 + } 153 + }
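The background phase above calls `QualityAnalyzer.analyzeBlur(imageURL:)`, defined earlier in this diff. As an illustrative stand-in (not the shipped implementation, which per SPEC.md can push this work to Metal Performance Shaders), here is a CPU sketch of the standard metric, variance of a 3×3 Laplacian response: sharp frames score high, defocused or shaken frames score low. The function name and `maxPixelSize` default are invented for the sketch:

```swift
import Foundation
import CoreGraphics
import ImageIO
import Accelerate

// CPU sketch of Laplacian-variance blur scoring; analysis runs on a small
// decode (embedded preview if present) rather than the full image.
func laplacianVarianceSketch(at url: URL, maxPixelSize: Int = 512) -> Double? {
    let opts: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageIfAbsent: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ]
    guard let src = CGImageSourceCreateWithURL(url as CFURL, nil),
          let cg = CGImageSourceCreateThumbnailAtIndex(src, 0, opts as CFDictionary)
    else { return nil }

    // Render into an 8-bit grayscale buffer
    let w = cg.width, h = cg.height
    guard w > 2, h > 2 else { return nil }
    var pixels = [UInt8](repeating: 0, count: w * h)
    let ok = pixels.withUnsafeMutableBytes { buf -> Bool in
        guard let ctx = CGContext(data: buf.baseAddress, width: w, height: h,
                                  bitsPerComponent: 8, bytesPerRow: w,
                                  space: CGColorSpaceCreateDeviceGray(),
                                  bitmapInfo: CGImageAlphaInfo.none.rawValue)
        else { return false }
        ctx.draw(cg, in: CGRect(x: 0, y: 0, width: w, height: h))
        return true
    }
    guard ok else { return nil }

    // 3x3 Laplacian (edge response); interior pixels only
    var response = [Float](repeating: 0, count: w * h)
    for y in 1..<(h - 1) {
        for x in 1..<(w - 1) {
            let i = y * w + x
            let lap = 4 * Int(pixels[i])
                - Int(pixels[i - 1]) - Int(pixels[i + 1])
                - Int(pixels[i - w]) - Int(pixels[i + w])
            response[i] = Float(lap)
        }
    }

    // Variance of the response via vDSP (stddev squared)
    var mean: Float = 0, sd: Float = 0
    vDSP_normalize(response, 1, nil, 1, &mean, &sd, vDSP_Length(response.count))
    return Double(sd) * Double(sd)
}
```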
+110
cull/Views/PhotoViewer.swift
··· 1 + import SwiftUI 2 + 3 + struct PhotoViewer: View { 4 + @Environment(CullSession.self) private var session 5 + @Environment(ThumbnailCache.self) private var cache 6 + @State private var displayImage: NSImage? 7 + @State private var displayedPhotoID: UUID? 8 + 9 + private let lookaheadCount = 50 10 + private let lookbehindCount = 50 11 + 12 + var body: some View { 13 + ZStack { 14 + Color.black 15 + 16 + if let displayImage { 17 + Image(nsImage: displayImage) 18 + .resizable() 19 + .aspectRatio(contentMode: .fit) 20 + .frame(maxWidth: .infinity, maxHeight: .infinity) 21 + } 22 + 23 + if let photo = session.selectedPhoto { 24 + VStack { 25 + Spacer() 26 + HStack(spacing: 12) { 27 + if photo.flag == .pick { 28 + Label("Pick", systemImage: "checkmark.circle.fill") 29 + .foregroundStyle(.green) 30 + } else if photo.flag == .reject { 31 + Label("Reject", systemImage: "xmark.circle.fill") 32 + .foregroundStyle(.red) 33 + } 34 + 35 + Spacer() 36 + 37 + HStack(spacing: 2) { 38 + ForEach(1...5, id: \.self) { star in 39 + Image(systemName: star <= photo.rating ? "star.fill" : "star") 40 + .foregroundStyle(star <= photo.rating ? .yellow : .white.opacity(0.3)) 41 + } 42 + } 43 + .font(.title3) 44 + 45 + Spacer() 46 + 47 + Text(photo.url.lastPathComponent) 48 + .foregroundStyle(.white.opacity(0.6)) 49 + .font(.caption) 50 + } 51 + .padding(.horizontal, 16) 52 + .padding(.vertical, 10) 53 + .background(.ultraThinMaterial) 54 + } 55 + } 56 + } 57 + .onChange(of: session.selectedPhoto?.id) { 58 + guard let photo = session.selectedPhoto else { 59 + displayImage = nil 60 + displayedPhotoID = nil 61 + return 62 + } 63 + displayedPhotoID = photo.id 64 + // Instant: show whatever we have cached synchronously 65 + if let cached = cache.cachedPreview(for: photo) { 66 + displayImage = cached 67 + } else if let thumb = cache.cachedThumbnail(for: photo) { 68 + displayImage = thumb 69 + } 70 + } 71 + .task(id: session.selectedPhoto?.id) { 72 + guard let photo = session.selectedPhoto else { return } 73 + let photoID = photo.id 74 + 75 + // Load current photo's full-res preview 76 + if cache.cachedPreview(for: photo) == nil { 77 + if let full = await cache.previewImage(for: photo) { 78 + guard displayedPhotoID == photoID else { return } 79 + displayImage = full 80 + } 81 + } 82 + 83 + // Debounce: wait briefly before preloading window 84 + // If user is holding arrow keys, this task gets cancelled before preload fires 85 + guard displayedPhotoID == photoID else { return } 86 + try? await Task.sleep(for: .milliseconds(150)) 87 + guard !Task.isCancelled, displayedPhotoID == photoID else { return } 88 + 89 + let ahead = session.photosAhead(lookaheadCount) 90 + let behind = session.photosBehind(lookbehindCount) 91 + let window = behind + [photo] + ahead 92 + cache.preloadPreviews(photos: window) 93 + cache.evictPreviews(keeping: window) 94 + } 95 + .onAppear { 96 + if let photo = session.selectedPhoto { 97 + displayedPhotoID = photo.id 98 + if let cached = cache.cachedPreview(for: photo) { 99 + displayImage = cached 100 + } else if let thumb = cache.cachedThumbnail(for: photo) { 101 + displayImage = thumb 102 + } 103 + // Preload initial window 104 + let ahead = session.photosAhead(lookaheadCount) 105 + let behind = session.photosBehind(lookbehindCount) 106 + cache.preloadPreviews(photos: behind + [photo] + ahead) 107 + } 108 + } 109 + } 110 + }
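PhotoViewer's preload window depends on `session.photosAhead(_:)` and `session.photosBehind(_:)`, implemented in CullSession.swift earlier in this diff. A hypothetical sketch, under the assumption that the window runs over a flattened, display-ordered photo list and crosses group boundaries:

```swift
// Hypothetical sketch of CullSession's photosAhead(_:)/photosBehind(_:);
// the flattened-list, boundary-crossing behavior is an assumption.
struct PreloadWindowSketch<Photo> {
    let flattened: [Photo]    // e.g. groups.flatMap(\.photos)
    let currentIndex: Int     // selected photo's position in `flattened`

    func photosAhead(_ n: Int) -> [Photo] {
        let start = currentIndex + 1
        guard start < flattened.count else { return [] }
        return Array(flattened[start..<min(flattened.count, start + n)])
    }

    func photosBehind(_ n: Int) -> [Photo] {
        let start = max(0, currentIndex - n)
        return Array(flattened[start..<currentIndex])
    }
}
```

The viewer then hands `behind + [photo] + ahead` to `preloadPreviews(photos:)` and `evictPreviews(keeping:)`, so the preview cache tracks a sliding window of roughly a hundred images around the selection.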
+406
cull/cull.xcodeproj/project.pbxproj
··· 1 + // !$*UTF8*$! 2 + { 3 + archiveVersion = 1; 4 + classes = { 5 + }; 6 + objectVersion = 77; 7 + objects = { 8 + 9 + /* Begin PBXBuildFile section */ 10 + 0B0EC2722F72210B004523FA /* Assets.xcassets in Resources */ = {isa = PBXBuildFile; fileRef = 0B0EC2712F72210B004523FA /* Assets.xcassets */; }; 11 + 0B0EC28A2F722491004523FA /* ImportView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC2872F722491004523FA /* ImportView.swift */; }; 12 + 0B0EC28B2F722491004523FA /* ExportSheet.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC2842F722491004523FA /* ExportSheet.swift */; }; 13 + 0B0EC28C2F722491004523FA /* ShotGrouper.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC2802F722491004523FA /* ShotGrouper.swift */; }; 14 + 0B0EC28D2F722491004523FA /* Photo.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC27A2F722491004523FA /* Photo.swift */; }; 15 + 0B0EC28E2F722491004523FA /* GroupListView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC2862F722491004523FA /* GroupListView.swift */; }; 16 + 0B0EC28F2F722491004523FA /* GroupDetailView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC2852F722491004523FA /* GroupDetailView.swift */; }; 17 + 0B0EC2902F722491004523FA /* PhotoViewer.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC2882F722491004523FA /* PhotoViewer.swift */; }; 18 + 0B0EC2912F722491004523FA /* QualityAnalyzer.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC27F2F722491004523FA /* QualityAnalyzer.swift */; }; 19 + 0B0EC2922F722491004523FA /* CullApp.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC2782F722491004523FA /* CullApp.swift */; }; 20 + 0B0EC2932F722491004523FA /* ContentView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC2832F722491004523FA /* ContentView.swift */; }; 21 + 0B0EC2942F722491004523FA /* CullSession.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC2792F722491004523FA /* CullSession.swift */; }; 22 + 0B0EC2952F722491004523FA /* PhotoGroup.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC27B2F722491004523FA /* PhotoGroup.swift */; }; 23 + 0B0EC2962F722491004523FA /* PhotoImporter.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC27E2F722491004523FA /* PhotoImporter.swift */; }; 24 + 0B0EC2972F722491004523FA /* PhotoExporter.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC27D2F722491004523FA /* PhotoExporter.swift */; }; 25 + 0B0EC2982F722491004523FA /* ThumbnailCache.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0B0EC2812F722491004523FA /* ThumbnailCache.swift */; }; 26 + /* End PBXBuildFile section */ 27 + 28 + /* Begin PBXFileReference section */ 29 + 0B0EC26A2F722109004523FA /* cull.app */ = {isa = PBXFileReference; explicitFileType = wrapper.application; includeInIndex = 0; path = cull.app; sourceTree = BUILT_PRODUCTS_DIR; }; 30 + 0B0EC2712F72210B004523FA /* Assets.xcassets */ = {isa = PBXFileReference; lastKnownFileType = folder.assetcatalog; path = Assets.xcassets; sourceTree = "<group>"; }; 31 + 0B0EC2782F722491004523FA /* CullApp.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = CullApp.swift; sourceTree = "<group>"; }; 32 + 0B0EC2792F722491004523FA /* CullSession.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = CullSession.swift; sourceTree = "<group>"; }; 33 + 0B0EC27A2F722491004523FA /* Photo.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = Photo.swift;
sourceTree = "<group>"; }; 34 + 0B0EC27B2F722491004523FA /* PhotoGroup.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = PhotoGroup.swift; sourceTree = "<group>"; }; 35 + 0B0EC27D2F722491004523FA /* PhotoExporter.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = PhotoExporter.swift; sourceTree = "<group>"; }; 36 + 0B0EC27E2F722491004523FA /* PhotoImporter.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = PhotoImporter.swift; sourceTree = "<group>"; }; 37 + 0B0EC27F2F722491004523FA /* QualityAnalyzer.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = QualityAnalyzer.swift; sourceTree = "<group>"; }; 38 + 0B0EC2802F722491004523FA /* ShotGrouper.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ShotGrouper.swift; sourceTree = "<group>"; }; 39 + 0B0EC2812F722491004523FA /* ThumbnailCache.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ThumbnailCache.swift; sourceTree = "<group>"; }; 40 + 0B0EC2832F722491004523FA /* ContentView.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ContentView.swift; sourceTree = "<group>"; }; 41 + 0B0EC2842F722491004523FA /* ExportSheet.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ExportSheet.swift; sourceTree = "<group>"; }; 42 + 0B0EC2852F722491004523FA /* GroupDetailView.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = GroupDetailView.swift; sourceTree = "<group>"; }; 43 + 0B0EC2862F722491004523FA /* GroupListView.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = GroupListView.swift; sourceTree = "<group>"; }; 44 + 0B0EC2872F722491004523FA /* ImportView.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ImportView.swift; sourceTree = "<group>"; }; 45 + 0B0EC2882F722491004523FA /* PhotoViewer.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = PhotoViewer.swift; sourceTree = "<group>"; }; 46 + /* End PBXFileReference section */ 47 + 48 + /* Begin PBXFrameworksBuildPhase section */ 49 + 0B0EC2672F722109004523FA /* Frameworks */ = { 50 + isa = PBXFrameworksBuildPhase; 51 + buildActionMask = 2147483647; 52 + files = ( 53 + ); 54 + runOnlyForDeploymentPostprocessing = 0; 55 + }; 56 + /* End PBXFrameworksBuildPhase section */ 57 + 58 + /* Begin PBXGroup section */ 59 + 0B0EC2612F722109004523FA = { 60 + isa = PBXGroup; 61 + children = ( 62 + 0B0EC2712F72210B004523FA /* Assets.xcassets */, 63 + 0B0EC2782F722491004523FA /* CullApp.swift */, 64 + 0B0EC27C2F722491004523FA /* Models */, 65 + 0B0EC2822F722491004523FA /* Services */, 66 + 0B0EC2892F722491004523FA /* Views */, 67 + ); 68 + sourceTree = "<group>"; 69 + }; 70 + 0B0EC27C2F722491004523FA /* Models */ = { 71 + isa = PBXGroup; 72 + children = ( 73 + 0B0EC2792F722491004523FA /* CullSession.swift */, 74 + 0B0EC27A2F722491004523FA /* Photo.swift */, 75 + 0B0EC27B2F722491004523FA /* PhotoGroup.swift */, 76 + ); 77 + path = Models; 78 + sourceTree = "<group>"; 79 + }; 80 + 0B0EC2822F722491004523FA /* Services */ = { 81 + isa = PBXGroup; 82 + children = ( 83 + 0B0EC27D2F722491004523FA /* PhotoExporter.swift */, 84 + 0B0EC27E2F722491004523FA /* PhotoImporter.swift */, 85 + 0B0EC27F2F722491004523FA /* QualityAnalyzer.swift */, 86 + 0B0EC2802F722491004523FA /* ShotGrouper.swift */, 87 + 0B0EC2812F722491004523FA /* ThumbnailCache.swift */, 88 + ); 89 + path 
= Services; 90 + sourceTree = "<group>"; 91 + }; 92 + 0B0EC2892F722491004523FA /* Views */ = { 93 + isa = PBXGroup; 94 + children = ( 95 + 0B0EC2832F722491004523FA /* ContentView.swift */, 96 + 0B0EC2842F722491004523FA /* ExportSheet.swift */, 97 + 0B0EC2852F722491004523FA /* GroupDetailView.swift */, 98 + 0B0EC2862F722491004523FA /* GroupListView.swift */, 99 + 0B0EC2872F722491004523FA /* ImportView.swift */, 100 + 0B0EC2882F722491004523FA /* PhotoViewer.swift */, 101 + ); 102 + path = Views; 103 + sourceTree = "<group>"; 104 + }; 105 + /* End PBXGroup section */ 106 + 107 + /* Begin PBXNativeTarget section */ 108 + 0B0EC2692F722109004523FA /* cull */ = { 109 + isa = PBXNativeTarget; 110 + buildConfigurationList = 0B0EC2752F72210B004523FA /* Build configuration list for PBXNativeTarget "cull" */; 111 + buildPhases = ( 112 + 0B0EC2662F722109004523FA /* Sources */, 113 + 0B0EC2672F722109004523FA /* Frameworks */, 114 + 0B0EC2682F722109004523FA /* Resources */, 115 + ); 116 + buildRules = ( 117 + ); 118 + dependencies = ( 119 + ); 120 + name = cull; 121 + packageProductDependencies = ( 122 + ); 123 + productName = cull; 124 + productReference = 0B0EC26A2F722109004523FA /* cull.app */; 125 + productType = "com.apple.product-type.application"; 126 + }; 127 + /* End PBXNativeTarget section */ 128 + 129 + /* Begin PBXProject section */ 130 + 0B0EC2622F722109004523FA /* Project object */ = { 131 + isa = PBXProject; 132 + attributes = { 133 + BuildIndependentTargetsInParallel = 1; 134 + LastSwiftUpdateCheck = 2630; 135 + LastUpgradeCheck = 2630; 136 + TargetAttributes = { 137 + 0B0EC2692F722109004523FA = { 138 + CreatedOnToolsVersion = 26.3; 139 + }; 140 + }; 141 + }; 142 + buildConfigurationList = 0B0EC2652F722109004523FA /* Build configuration list for PBXProject "cull" */; 143 + developmentRegion = en; 144 + hasScannedForEncodings = 0; 145 + knownRegions = ( 146 + en, 147 + Base, 148 + ); 149 + mainGroup = 0B0EC2612F722109004523FA; 150 + minimizedProjectReferenceProxies = 1; 151 + preferredProjectObjectVersion = 77; 152 + productRefGroup = 0B0EC2612F722109004523FA; 153 + projectDirPath = ""; 154 + projectRoot = ""; 155 + targets = ( 156 + 0B0EC2692F722109004523FA /* cull */, 157 + ); 158 + }; 159 + /* End PBXProject section */ 160 + 161 + /* Begin PBXResourcesBuildPhase section */ 162 + 0B0EC2682F722109004523FA /* Resources */ = { 163 + isa = PBXResourcesBuildPhase; 164 + buildActionMask = 2147483647; 165 + files = ( 166 + 0B0EC2722F72210B004523FA /* Assets.xcassets in Resources */, 167 + ); 168 + runOnlyForDeploymentPostprocessing = 0; 169 + }; 170 + /* End PBXResourcesBuildPhase section */ 171 + 172 + /* Begin PBXSourcesBuildPhase section */ 173 + 0B0EC2662F722109004523FA /* Sources */ = { 174 + isa = PBXSourcesBuildPhase; 175 + buildActionMask = 2147483647; 176 + files = ( 177 + 0B0EC28A2F722491004523FA /* ImportView.swift in Sources */, 178 + 0B0EC28B2F722491004523FA /* ExportSheet.swift in Sources */, 179 + 0B0EC28C2F722491004523FA /* ShotGrouper.swift in Sources */, 180 + 0B0EC28D2F722491004523FA /* Photo.swift in Sources */, 181 + 0B0EC28E2F722491004523FA /* GroupListView.swift in Sources */, 182 + 0B0EC28F2F722491004523FA /* GroupDetailView.swift in Sources */, 183 + 0B0EC2902F722491004523FA /* PhotoViewer.swift in Sources */, 184 + 0B0EC2912F722491004523FA /* QualityAnalyzer.swift in Sources */, 185 + 0B0EC2922F722491004523FA /* CullApp.swift in Sources */, 186 + 0B0EC2932F722491004523FA /* ContentView.swift in Sources */, 187 + 0B0EC2942F722491004523FA /* CullSession.swift in Sources 
*/, 188 + 0B0EC2952F722491004523FA /* PhotoGroup.swift in Sources */, 189 + 0B0EC2962F722491004523FA /* PhotoImporter.swift in Sources */, 190 + 0B0EC2972F722491004523FA /* PhotoExporter.swift in Sources */, 191 + 0B0EC2982F722491004523FA /* ThumbnailCache.swift in Sources */, 192 + ); 193 + runOnlyForDeploymentPostprocessing = 0; 194 + }; 195 + /* End PBXSourcesBuildPhase section */ 196 + 197 + /* Begin XCBuildConfiguration section */ 198 + 0B0EC2732F72210B004523FA /* Debug */ = { 199 + isa = XCBuildConfiguration; 200 + buildSettings = { 201 + ALWAYS_SEARCH_USER_PATHS = NO; 202 + ASSETCATALOG_COMPILER_GENERATE_SWIFT_ASSET_SYMBOL_EXTENSIONS = YES; 203 + CLANG_ANALYZER_NONNULL = YES; 204 + CLANG_ANALYZER_NUMBER_OBJECT_CONVERSION = YES_AGGRESSIVE; 205 + CLANG_CXX_LANGUAGE_STANDARD = "gnu++20"; 206 + CLANG_ENABLE_MODULES = YES; 207 + CLANG_ENABLE_OBJC_ARC = YES; 208 + CLANG_ENABLE_OBJC_WEAK = YES; 209 + CLANG_WARN_BLOCK_CAPTURE_AUTORELEASING = YES; 210 + CLANG_WARN_BOOL_CONVERSION = YES; 211 + CLANG_WARN_COMMA = YES; 212 + CLANG_WARN_CONSTANT_CONVERSION = YES; 213 + CLANG_WARN_DEPRECATED_OBJC_IMPLEMENTATIONS = YES; 214 + CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR; 215 + CLANG_WARN_DOCUMENTATION_COMMENTS = YES; 216 + CLANG_WARN_EMPTY_BODY = YES; 217 + CLANG_WARN_ENUM_CONVERSION = YES; 218 + CLANG_WARN_INFINITE_RECURSION = YES; 219 + CLANG_WARN_INT_CONVERSION = YES; 220 + CLANG_WARN_NON_LITERAL_NULL_CONVERSION = YES; 221 + CLANG_WARN_OBJC_IMPLICIT_RETAIN_SELF = YES; 222 + CLANG_WARN_OBJC_LITERAL_CONVERSION = YES; 223 + CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR; 224 + CLANG_WARN_QUOTED_INCLUDE_IN_FRAMEWORK_HEADER = YES; 225 + CLANG_WARN_RANGE_LOOP_ANALYSIS = YES; 226 + CLANG_WARN_STRICT_PROTOTYPES = YES; 227 + CLANG_WARN_SUSPICIOUS_MOVE = YES; 228 + CLANG_WARN_UNGUARDED_AVAILABILITY = YES_AGGRESSIVE; 229 + CLANG_WARN_UNREACHABLE_CODE = YES; 230 + CLANG_WARN__DUPLICATE_METHOD_MATCH = YES; 231 + COPY_PHASE_STRIP = NO; 232 + DEBUG_INFORMATION_FORMAT = dwarf; 233 + DEVELOPMENT_TEAM = M67B42LX8D; 234 + ENABLE_STRICT_OBJC_MSGSEND = YES; 235 + ENABLE_TESTABILITY = YES; 236 + ENABLE_USER_SCRIPT_SANDBOXING = YES; 237 + GCC_C_LANGUAGE_STANDARD = gnu17; 238 + GCC_DYNAMIC_NO_PIC = NO; 239 + GCC_NO_COMMON_BLOCKS = YES; 240 + GCC_OPTIMIZATION_LEVEL = 0; 241 + GCC_PREPROCESSOR_DEFINITIONS = ( 242 + "DEBUG=1", 243 + "$(inherited)", 244 + ); 245 + GCC_WARN_64_TO_32_BIT_CONVERSION = YES; 246 + GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR; 247 + GCC_WARN_UNDECLARED_SELECTOR = YES; 248 + GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE; 249 + GCC_WARN_UNUSED_FUNCTION = YES; 250 + GCC_WARN_UNUSED_VARIABLE = YES; 251 + LOCALIZATION_PREFERS_STRING_CATALOGS = YES; 252 + MACOSX_DEPLOYMENT_TARGET = 26.2; 253 + MTL_ENABLE_DEBUG_INFO = INCLUDE_SOURCE; 254 + MTL_FAST_MATH = YES; 255 + ONLY_ACTIVE_ARCH = YES; 256 + SDKROOT = macosx; 257 + SWIFT_ACTIVE_COMPILATION_CONDITIONS = "DEBUG $(inherited)"; 258 + SWIFT_OPTIMIZATION_LEVEL = "-Onone"; 259 + }; 260 + name = Debug; 261 + }; 262 + 0B0EC2742F72210B004523FA /* Release */ = { 263 + isa = XCBuildConfiguration; 264 + buildSettings = { 265 + ALWAYS_SEARCH_USER_PATHS = NO; 266 + ASSETCATALOG_COMPILER_GENERATE_SWIFT_ASSET_SYMBOL_EXTENSIONS = YES; 267 + CLANG_ANALYZER_NONNULL = YES; 268 + CLANG_ANALYZER_NUMBER_OBJECT_CONVERSION = YES_AGGRESSIVE; 269 + CLANG_CXX_LANGUAGE_STANDARD = "gnu++20"; 270 + CLANG_ENABLE_MODULES = YES; 271 + CLANG_ENABLE_OBJC_ARC = YES; 272 + CLANG_ENABLE_OBJC_WEAK = YES; 273 + CLANG_WARN_BLOCK_CAPTURE_AUTORELEASING = YES; 274 + CLANG_WARN_BOOL_CONVERSION = YES; 275 + 
CLANG_WARN_COMMA = YES; 276 + CLANG_WARN_CONSTANT_CONVERSION = YES; 277 + CLANG_WARN_DEPRECATED_OBJC_IMPLEMENTATIONS = YES; 278 + CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR; 279 + CLANG_WARN_DOCUMENTATION_COMMENTS = YES; 280 + CLANG_WARN_EMPTY_BODY = YES; 281 + CLANG_WARN_ENUM_CONVERSION = YES; 282 + CLANG_WARN_INFINITE_RECURSION = YES; 283 + CLANG_WARN_INT_CONVERSION = YES; 284 + CLANG_WARN_NON_LITERAL_NULL_CONVERSION = YES; 285 + CLANG_WARN_OBJC_IMPLICIT_RETAIN_SELF = YES; 286 + CLANG_WARN_OBJC_LITERAL_CONVERSION = YES; 287 + CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR; 288 + CLANG_WARN_QUOTED_INCLUDE_IN_FRAMEWORK_HEADER = YES; 289 + CLANG_WARN_RANGE_LOOP_ANALYSIS = YES; 290 + CLANG_WARN_STRICT_PROTOTYPES = YES; 291 + CLANG_WARN_SUSPICIOUS_MOVE = YES; 292 + CLANG_WARN_UNGUARDED_AVAILABILITY = YES_AGGRESSIVE; 293 + CLANG_WARN_UNREACHABLE_CODE = YES; 294 + CLANG_WARN__DUPLICATE_METHOD_MATCH = YES; 295 + COPY_PHASE_STRIP = NO; 296 + DEBUG_INFORMATION_FORMAT = "dwarf-with-dsym"; 297 + DEVELOPMENT_TEAM = M67B42LX8D; 298 + ENABLE_NS_ASSERTIONS = NO; 299 + ENABLE_STRICT_OBJC_MSGSEND = YES; 300 + ENABLE_USER_SCRIPT_SANDBOXING = YES; 301 + GCC_C_LANGUAGE_STANDARD = gnu17; 302 + GCC_NO_COMMON_BLOCKS = YES; 303 + GCC_WARN_64_TO_32_BIT_CONVERSION = YES; 304 + GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR; 305 + GCC_WARN_UNDECLARED_SELECTOR = YES; 306 + GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE; 307 + GCC_WARN_UNUSED_FUNCTION = YES; 308 + GCC_WARN_UNUSED_VARIABLE = YES; 309 + LOCALIZATION_PREFERS_STRING_CATALOGS = YES; 310 + MACOSX_DEPLOYMENT_TARGET = 26.2; 311 + MTL_ENABLE_DEBUG_INFO = NO; 312 + MTL_FAST_MATH = YES; 313 + SDKROOT = macosx; 314 + SWIFT_COMPILATION_MODE = wholemodule; 315 + }; 316 + name = Release; 317 + }; 318 + 0B0EC2762F72210B004523FA /* Debug */ = { 319 + isa = XCBuildConfiguration; 320 + buildSettings = { 321 + ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon; 322 + ASSETCATALOG_COMPILER_GLOBAL_ACCENT_COLOR_NAME = AccentColor; 323 + CODE_SIGN_STYLE = Automatic; 324 + COMBINE_HIDPI_IMAGES = YES; 325 + CURRENT_PROJECT_VERSION = 1; 326 + DEVELOPMENT_TEAM = M67B42LX8D; 327 + ENABLE_APP_SANDBOX = YES; 328 + ENABLE_HARDENED_RUNTIME = YES; 329 + ENABLE_PREVIEWS = YES; 330 + ENABLE_USER_SELECTED_FILES = readonly; 331 + GENERATE_INFOPLIST_FILE = YES; 332 + INFOPLIST_KEY_NSHumanReadableCopyright = ""; 333 + LD_RUNPATH_SEARCH_PATHS = ( 334 + "$(inherited)", 335 + "@executable_path/../Frameworks", 336 + ); 337 + MARKETING_VERSION = 1.0; 338 + PRODUCT_BUNDLE_IDENTIFIER = sh.dunkirk.cull; 339 + PRODUCT_NAME = "$(TARGET_NAME)"; 340 + REGISTER_APP_GROUPS = YES; 341 + STRING_CATALOG_GENERATE_SYMBOLS = YES; 342 + SWIFT_APPROACHABLE_CONCURRENCY = YES; 343 + SWIFT_DEFAULT_ACTOR_ISOLATION = MainActor; 344 + SWIFT_EMIT_LOC_STRINGS = YES; 345 + SWIFT_UPCOMING_FEATURE_MEMBER_IMPORT_VISIBILITY = YES; 346 + SWIFT_VERSION = 5.0; 347 + }; 348 + name = Debug; 349 + }; 350 + 0B0EC2772F72210B004523FA /* Release */ = { 351 + isa = XCBuildConfiguration; 352 + buildSettings = { 353 + ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon; 354 + ASSETCATALOG_COMPILER_GLOBAL_ACCENT_COLOR_NAME = AccentColor; 355 + CODE_SIGN_STYLE = Automatic; 356 + COMBINE_HIDPI_IMAGES = YES; 357 + CURRENT_PROJECT_VERSION = 1; 358 + DEVELOPMENT_TEAM = M67B42LX8D; 359 + ENABLE_APP_SANDBOX = YES; 360 + ENABLE_HARDENED_RUNTIME = YES; 361 + ENABLE_PREVIEWS = YES; 362 + ENABLE_USER_SELECTED_FILES = readonly; 363 + GENERATE_INFOPLIST_FILE = YES; 364 + INFOPLIST_KEY_NSHumanReadableCopyright = ""; 365 + LD_RUNPATH_SEARCH_PATHS = ( 366 + "$(inherited)", 367 + 
"@executable_path/../Frameworks", 368 + ); 369 + MARKETING_VERSION = 1.0; 370 + PRODUCT_BUNDLE_IDENTIFIER = sh.dunkirk.cull; 371 + PRODUCT_NAME = "$(TARGET_NAME)"; 372 + REGISTER_APP_GROUPS = YES; 373 + STRING_CATALOG_GENERATE_SYMBOLS = YES; 374 + SWIFT_APPROACHABLE_CONCURRENCY = YES; 375 + SWIFT_DEFAULT_ACTOR_ISOLATION = MainActor; 376 + SWIFT_EMIT_LOC_STRINGS = YES; 377 + SWIFT_UPCOMING_FEATURE_MEMBER_IMPORT_VISIBILITY = YES; 378 + SWIFT_VERSION = 5.0; 379 + }; 380 + name = Release; 381 + }; 382 + /* End XCBuildConfiguration section */ 383 + 384 + /* Begin XCConfigurationList section */ 385 + 0B0EC2652F722109004523FA /* Build configuration list for PBXProject "cull" */ = { 386 + isa = XCConfigurationList; 387 + buildConfigurations = ( 388 + 0B0EC2732F72210B004523FA /* Debug */, 389 + 0B0EC2742F72210B004523FA /* Release */, 390 + ); 391 + defaultConfigurationIsVisible = 0; 392 + defaultConfigurationName = Release; 393 + }; 394 + 0B0EC2752F72210B004523FA /* Build configuration list for PBXNativeTarget "cull" */ = { 395 + isa = XCConfigurationList; 396 + buildConfigurations = ( 397 + 0B0EC2762F72210B004523FA /* Debug */, 398 + 0B0EC2772F72210B004523FA /* Release */, 399 + ); 400 + defaultConfigurationIsVisible = 0; 401 + defaultConfigurationName = Release; 402 + }; 403 + /* End XCConfigurationList section */ 404 + }; 405 + rootObject = 0B0EC2622F722109004523FA /* Project object */; 406 + }