CharlesW
7 months ago
Pocketbase is already the poor man's BaaS, and is minimalist compared to the two others mentioned.
> Data stored in human-readable CSVs
The choice to not use a database when two near-perfect tiny candidates exist, and furthermore to choose the notorious CSV format for storing data, is absolutely mystifying. One can use their Wasm builds if platform-specific binaries offend.
SOLAR_FIELDS
7 months ago
I just deployed a Wasm-built SQLite with FTS5 enabled and it's insane what it is capable of. It's basically Elasticsearch entirely on the client. It's not quite as robust as ES, but it's like 80% of the way there, and, I repeat, it runs client-side on your phone or any other SQLite-supported device.
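For the curious, a minimal sketch of the same FTS5 idea, using Python's bundled sqlite3 module instead of wa-sqlite and made-up document rows (assumes the SQLite build ships with FTS5 enabled, which most do):

  import sqlite3

  con = sqlite3.connect(":memory:")  # toy in-memory index of doc pages
  con.execute("CREATE VIRTUAL TABLE docs USING fts5(title, body)")
  con.executemany(
      "INSERT INTO docs (title, body) VALUES (?, ?)",
      [
          ("Deploy guide", "How to deploy the internal docs site"),
          ("Search notes", "FTS5 gives ranked full-text search on the client"),
      ],
  )

  # Ranked full-text query, roughly what an Elasticsearch match query would do
  for (title,) in con.execute(
      "SELECT title FROM docs WHERE docs MATCH ? ORDER BY rank", ("deploy",)
  ):
      print(title)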
tommica
7 months ago
How large of a bundle is it? And are we talking about Wikipedia stuffed into SQLite, or only a few hundred pages of internal docs?
SOLAR_FIELDS
7 months ago
I'm using wa-sqlite, and the standalone wasm package is 714kb. The use case is a few hundred pages of internal docs.
bsaul
7 months ago
How large is the Wasm package for an empty SQLite database, together with the client library to access it?
SOLAR_FIELDS
7 months ago
the standalone wasm package is 714kb
loeber
7 months ago
In 2025, pretending that a CSV can be a reasonable alternative to a database because it is "smaller" is just wild. Totally unconscionable.
r0fl
7 months ago
I use CSV files to run multiple sites with 40,000+ pages each, close to 1 million pages total.
Super fast.
Can't hack me, because those CSV files are stored elsewhere and only pulled in at build time.
Free, ultra fast, no latency. Every alternative I’ve tried is slower and eventually costs money.
CSV files stored on GitHub/vercel/netlify/cloudflare pages can scale to millions of rows for free if divided properly
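A rough sketch of that build-time flow, with a hypothetical pages.csv and invented column names, just to make it concrete:

  import csv
  from pathlib import Path

  out = Path("dist")
  out.mkdir(exist_ok=True)

  # One CSV of page data, pulled in at build time only; columns: slug, title, body
  with open("pages.csv", newline="", encoding="utf-8") as f:
      for row in csv.DictReader(f):
          html = f"<h1>{row['title']}</h1>\n<p>{row['body']}</p>"
          (out / f"{row['slug']}.html").write_text(html, encoding="utf-8")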
vineyardmike
7 months ago
Can't argue with what works, but...
All these benefits also apply to SQLite, but SQLite is also typed, indexed, and works with tons of tools and libraries.
It can even be stored as a static file on various serving options mentioned above. Even better, it can be served on a per-page basis, so you can download just the index to the client, who can query for specific chunks of the database, further reducing the bandwidth required to serve.
deepsun
7 months ago
Just to be pedantic, SQLite is not really typed. I'd call them type-hints, like in Python. Their (bad IMHO) arguments for it: https://www.sqlite.org/flextypegood.html
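For anyone who hasn't bumped into it, a quick illustration of that behaviour using Python's sqlite3 module and an ordinary (non-STRICT) table:

  import sqlite3

  con = sqlite3.connect(":memory:")
  con.execute("CREATE TABLE t (n INTEGER)")
  con.execute("INSERT INTO t VALUES (?)", ("not a number",))  # accepted anyway
  print(con.execute("SELECT n, typeof(n) FROM t").fetchone())  # ('not a number', 'text')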
newlisp
7 months ago
fer
7 months ago
> Just to be pedantic, SQLite is not really typed. I'd call them type-hints, like in Python
Someone already chimed in for SQLite, so it's worth mentioning that Python is strongly typed, just dynamically typed. Everyone has seen TypeError; you'll get that even without hints. It becomes particularly obvious when using Cython: the dynamic part is gone and you have to type your stuff manually. Type hints are indeed just hints, but for your IDE, or mypy, or you (for clarity).
It's a bit like saying C++ isn't typed because you can use "auto".
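A two-line version of that point (the annotated and unannotated versions fail identically, no mypy involved):

  def add(a: int, b: int) -> int:  # the hints are ignored at runtime
      return a + b

  add(1, "2")  # TypeError: unsupported operand type(s) for +: 'int' and 'str'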
edmundsauto
7 months ago
Don't you think it's better in this dimension than CSV, though? It seems to me like it's strictly better than the other option discussed.
dragonwriter
7 months ago
A sibling comment posted a blind link whose contents address this, but (for the benefit of people who aren't likely to follow such links): recent versions of SQLite support STRICT tables, which are rigidly typed, if you have a need for that instead of the default loose type-affinity system.
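A minimal sketch of what STRICT buys you, again via Python's sqlite3 (assumes the underlying SQLite is 3.37+, which recent Python builds bundle):

  import sqlite3

  con = sqlite3.connect(":memory:")
  con.execute("CREATE TABLE t (n INTEGER) STRICT")
  try:
      con.execute("INSERT INTO t VALUES (?)", ("not a number",))
  except sqlite3.IntegrityError as e:
      print(e)  # cannot store TEXT value in INTEGER column t.n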
ezekiel68
7 months ago
TBH this is why I've never messed with SQLite.
If I want to bother with a SQL database, I at least want the benefit of the physical layer compressing data to the declared types, and PostgreSQL scales down surprisingly well to lower-resource (by 2025 standards) environments.
MobiusHorizons
7 months ago
How exactly do you anticipate using Postgres on client? Or are you ignoring the problem statement and saying it’s better to run a backend?
throwaway032023
7 months ago
felipeccastro
7 months ago
Not sure why this was downvoted, but I'd be very interested in learning how well pglite compares to SQLite (pros and cons of each, maturity, etc.).
MobiusHorizons
7 months ago
Interesting. TIL
Drew_
7 months ago
It sounds like you use CSVs to build static websites, not store or update any dynamic data. That's not even remotely comparable.
tansan
7 months ago
The way you write this makes it sound like your websites are pulling from the CSV per request. However, you're building static websites and uploading them to a CDN. I don't think SQL is needed here, and CSV makes life way easier, but you could swap your CSV for any other storage format in this strategy and it would work the same.
AbraKdabra
7 months ago
So... SQLite with less features basically.
Spivak
7 months ago
Every file format is SQLite with fewer features.
deepsun
7 months ago
Unless it's Apache Arrow or Parquet.
Moto7451
7 months ago
For both fun and profit I’ve used the Parquet extension for SQLite to have the “Yes” answer to the question of “SQLite or Parquet?”
AbraKdabra
7 months ago
jfc, is there anything Apache doesn't have software for?
akudha
7 months ago
Is this a static website? If yes, what do you use to build?
ncruces
7 months ago
In 2020 Tailscale used a JSON file.
CharlesW
7 months ago
If you continue reading, you'll see that they were forced to ditch JSON for a proper key-value database.
ncruces
7 months ago
I know. Now see how far JSON got them.
So why wouldn't you just use a text format to persist a personal website a handful of people might use?
I created one of the SQLite drivers, but why would you bring in a dependency that might not be available in a decade unless you really need it? (SQLite will be there in 2035, but maybe not the current Go drivers)
deepsun
7 months ago
It's self-restriction, like driving a car without using the rear-view mirror, or always using "while" loops instead of "for" loops.
It's great for an extra challenge. Or for writing good literature.
moritzwarhier
7 months ago
You didn't really answer the dependency argument though.
Until the data for a static website becomes large enough to make JSON parsing a bottleneck, where is the problem?
I know, it's not generally suitable to store data for quick access of arbitrary pieces without parsing the whole file.
But if you use it at build time anyway (that's how I read the argument), it's pretty likely that you will never reach the bottleneck that would make you require a DBMS. Your site is static; you don't need to serve any database requests.
There is also huge overhead in powering static websites by a full-blown DBMS, in the worst case serving predictable requests without caching.
So many websites are powered by MySQL while essentially being static... and there are often unnecessarily complicated layers of caching to allow that.
But I'm not arguing against these layers per se (the end result is the same), it's just that, if your ecosystem is already built on JSON as data storage, it might be completely unneeded to pull in another dependency.
Not the same as restricting syntax within one programming language.
jpc0
7 months ago
> SQLite will be there in 2035, but maybe not the current Go drivers
Go binaries are statically linked; unless you expect the ELF/PE formats to no longer exist in 2035, your binary will still run just the same.
And if not, well, there will be an SQLite driver in 2035, and other than 5 lines of init code I don't interact with the SQLite driver but rather the SQL abstraction in Go.
And if it's such an issue, then directly target the SQLite C API, which will also still be there in 2035.
nullishdomain
7 months ago
Not so sure about this. At scale, sure, but how many apps are out there that perform basic CRUD for a few thousand records max and don't need the various benefits and guarantees a DB provides?
makeitdouble
7 months ago
I assume the parent's despair is about CSV's many traps and parsing quirks.
I'd also be hard pressed to find any real reason to choose CSV over JSONL, for instance. Parsing is fast and utterly standard, it's predictable, and if your data is really simple, JSONL files will be super simple too.
At its simplest, the difference between a CSV line and a JSON array is 4 characters.
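To make the comparison concrete, here's the same made-up record both ways; note the CSV round-trip loses the number's type and needs quoting rules:

  import csv, io, json

  csv_line = 'alice,42,"reads, writes"\n'
  jsonl_line = '["alice", 42, "reads, writes"]\n'

  print(next(csv.reader(io.StringIO(csv_line))))  # ['alice', '42', 'reads, writes']
  print(json.loads(jsonl_line))                   # ['alice', 42, 'reads, writes']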
nickjj
7 months ago
If you ignore size as a benefit, CSV files still have a lot of value:
- It's plain text
- It's super easy to diff
- It's a natural fit for saving it in a git repo
- It's searchable using standard tools (grep, etc.)
- It's easy to backup and restore
- You don't need to worry about it getting corrupt
- There are many tools designed to read it to produce X types of outputs
A few months ago I wrote my own CLI-driven, CSV-based income and expense tracker at https://github.com/nickjj/plutus. It helps me do quarterly taxes in a few minutes, and I can get an in-depth look at my finances on demand in 1 command.
My computer built in 2014 can parse 100,000 CSV rows in 560ms, which is already 10x more items than I really have. I also spent close to zero effort trying to optimize the script for speed. It's a zero-dependency, single-file Python script using "human idiomatic" code.
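Not the plutus script itself, just a rough sketch of that kind of zero-dependency parse and tally, with an invented ledger.csv and column names:

  import csv, time

  start = time.perf_counter()
  with open("ledger.csv", newline="", encoding="utf-8") as f:
      rows = list(csv.DictReader(f))  # e.g. columns: date, category, amount
  total = sum(float(r["amount"]) for r in rows)
  print(f"{len(rows)} rows, total {total:.2f}, in {time.perf_counter() - start:.3f}s")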
Overall I'm very pleased with the decision to use a single CSV file instead of a database.
skeeter2020
7 months ago
I agree with both your main points. It's not like PB has a bunch of cruft and fat to trim; the BD of the project is very aggressive about constraining scope, which is one of the reasons it's so good. The CSV thing feels like an academic exercise. The "I can't open an SQLite database in my text editor" argument is a little thin, considering many such tools are lighter weight than text editors, and "reading" a database (in any format) is seldom the goal. You probably want to query it, so the first thing you'd need to do here is import the CSV into DuckDB and write a bunch of queries with "WHERE active=1".
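Which, to be fair, is only a couple of lines with the duckdb Python package (file name and column invented; assumes a reasonably recent duckdb):

  import duckdb

  # Query the CSV directly; DuckDB infers the schema on the fly
  print(duckdb.sql("SELECT * FROM 'users.csv' WHERE active = 1").fetchall())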
waldrews
7 months ago
An append-only, plain-text CSV format that you can concatenate to from a script, edit or query in a spreadsheet, and that is still fast because of the in-memory pointer cache seems like a big win (assuming you're in the target scaling category).
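e.g. the entire write path can be one append from a script (hypothetical file and fields):

  import csv
  from datetime import date

  # Append one record; the file stays a plain, spreadsheet-friendly CSV
  with open("entries.csv", "a", newline="", encoding="utf-8") as f:
      csv.writer(f).writerow([date.today().isoformat(), "coffee", -3.50])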
bravesoul2
7 months ago
For local use cases this could be useful. Run locally. Do your thing. Edit with Excel or tool of choice.
Also one less dependency.
zffr
7 months ago
What’s the other candidate besides pocketbase?