# gos3dir

A lightweight S3-compatible server that uses a local directory as storage.
Perfect for local development, testing, and offline S3 workflows.
_Do not use in production._

## Installation

```bash
go install tangled.org/juanlu.space/gos3dir@latest
```

Or build from source:

```bash
git clone <repository-url>
cd gos3dir
go build
```

## Usage

Start the server by pointing it to a directory:

```bash
gos3dir /path/to/data/dir
```

The server listens on `http://localhost:8041`.
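
To check that the server is up, you can hit the root endpoint with plain `curl`; since there is no authentication, an unsigned request should return an S3-style bucket listing (empty on a fresh data directory):

```bash
# GET / lists all buckets.
curl http://localhost:8041/
```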

## Configure AWS CLI v2

Add a profile to `~/.aws/config`:

```ini
[profile gos3dir-dev]
endpoint_url = http://localhost:8041
s3 =
    addressing_style = path
```

Add dummy credentials to `~/.aws/credentials`:

```ini
[gos3dir-dev]
aws_access_key_id = test
aws_secret_access_key = test
```
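
Recent AWS CLI v2 releases (2.13 and later) also honor the `AWS_ENDPOINT_URL` environment variable, so one-off commands can skip the profile; bucket-level commands may still need the path-style setting above, since gos3dir only supports path-style requests:

```bash
AWS_ENDPOINT_URL=http://localhost:8041 \
AWS_ACCESS_KEY_ID=test \
AWS_SECRET_ACCESS_KEY=test \
aws s3 ls
```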

## Examples

List buckets:
```bash
aws s3 ls --profile gos3dir-dev
```

Create a bucket:
```bash
aws s3 mb s3://my-bucket --profile gos3dir-dev
```

Upload a file:
```bash
aws s3 cp file.txt s3://my-bucket/ --profile gos3dir-dev
```

List objects in a bucket:
```bash
aws s3 ls s3://my-bucket --profile gos3dir-dev
```

Delete an object:
```bash
aws s3 rm s3://my-bucket/file.txt --profile gos3dir-dev
```

Delete an empty bucket:
```bash
aws s3 rb s3://my-bucket --profile gos3dir-dev
```
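
Syncing a directory should also work, with one caveat: above the CLI's `multipart_threshold` (8 MB by default) it switches to multipart uploads, which gos3dir does not support, so raise that threshold in the profile if you need larger files. Here `./local-dir` is a placeholder path:

```bash
aws s3 sync ./local-dir s3://my-bucket/ --profile gos3dir-dev
```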

## Supported Operations

- List buckets (`GET /`)
- List objects (`GET /{bucket}`)
- Download objects (`GET /{bucket}/{key}`)
- Create bucket (`PUT /{bucket}`)
- Upload object (`PUT /{bucket}/{key}`)
- Delete empty bucket (`DELETE /{bucket}`)
- Delete object (`DELETE /{bucket}/{key}`)
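
Because these are plain unauthenticated HTTP endpoints, you can also drive them directly with `curl` (the bucket and file names below are just examples):

```bash
# Full round trip: create a bucket, upload an object, read it back, clean up.
curl -X PUT http://localhost:8041/my-bucket
curl -X PUT --data-binary @file.txt http://localhost:8041/my-bucket/file.txt
curl http://localhost:8041/my-bucket/file.txt
curl -X DELETE http://localhost:8041/my-bucket/file.txt
curl -X DELETE http://localhost:8041/my-bucket
```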

## Lakehouse formats

This has been tested with Polars + Delta Lake:

```python
In [24]: import polars as pl
    ...: df = pl.DataFrame({"id": [0, 1, 2, 3], "col": ["a", "b", "c", "d"]})

In [25]: df.write_delta("s3://deltalake/df", storage_options={"AWS_ENDPOINT_URL": "http://localhost:8041", "AWS_ALLOW_HTTP": "true"})

In [26]: pl.read_delta("s3://deltalake/df", storage_options={"AWS_ENDPOINT_URL": "http://localhost:8041", "AWS_ALLOW_HTTP": "true"})
Out[26]:
shape: (4, 2)
┌─────┬─────┐
│ id  ┆ col │
│ --- ┆ --- │
│ i64 ┆ str │
╞═════╪═════╡
│ 0   ┆ a   │
│ 1   ┆ b   │
│ 2   ┆ c   │
│ 3   ┆ d   │
└─────┴─────┘
```

## Future work

We would like to fix these at some point:
- Proper deletion of dangling "directories"
- Whatever is needed for open table formats (DuckLake, Apache Iceberg, Delta Lake) to work almost perfectly
- Virtual-hosted-style addressing (for now, only path-style addressing is supported)

If you see more gaps, feel free to open an issue, but it might be deemed out of scope (see below).

## Limitations

These are by design and will not be fixed:
- No authentication or authorization
- No multipart uploads
- No versioning
- No `.` or `..` segments in object keys
- No bit-for-bit compatibility with Amazon S3 server responses