>You import heavy libraries just to do fuzzy string matching.
So instead of a heavy library for string matching, you use a heavy library for a whole SQL engine that happens to include fuzzy string matching?
Instead of having the AI write JavaScript code that does the fuzzy string matching, they have it write SQL.
This is an emerging pattern that’s surprisingly powerful: thick clients that embed wasm query engines (pglite, duckdb) and do powerful analytics (with or without AI agents writing the queries).
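As a concrete illustration of the pattern, here's a minimal sketch of fuzzy matching through DuckDB-WASM in the browser, assuming the @duckdb/duckdb-wasm package; the `names` table and the query are made up for the example:

```typescript
import * as duckdb from "@duckdb/duckdb-wasm";

async function fuzzyDemo(): Promise<void> {
  // Resolve a CDN-hosted WASM bundle and boot the async engine in a worker
  // (same-origin setups can pass the worker URL directly like this).
  const bundle = await duckdb.selectBundle(duckdb.getJsDelivrBundles());
  const worker = new Worker(bundle.mainWorker!);
  const db = new duckdb.AsyncDuckDB(new duckdb.ConsoleLogger(), worker);
  await db.instantiate(bundle.mainModule, bundle.pthreadWorker);

  const conn = await db.connect();
  // Hypothetical data; DuckDB's built-in jaro_winkler_similarity does the
  // fuzzy matching, so the "matching library" is just SQL.
  await conn.query(`CREATE TABLE names(name VARCHAR)`);
  await conn.query(`INSERT INTO names VALUES ('duckdb'), ('ducksdb'), ('pglite')`);
  const result = await conn.query(`
    SELECT name, jaro_winkler_similarity(name, 'duckdb') AS score
    FROM names
    ORDER BY score DESC
  `);
  console.log(result.toArray());
  await conn.close();
}
```

The same connection works whether a human or an agent writes the query, which is what makes the thick-client-plus-query-engine combo attractive.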
Below are two examples using DuckDB under the hood for similar purposes. Like the author, I'm excited about this type of architecture making semi-advanced analytics more attainable if you're not a data engineer.
- https://pandas-ai.com/
- https://marimo.io/
We actually got around this by just using Google Sheets as a memory layer (rough sketch below); this solves the underlying problem and has other benefits:
- it's free
- it's remote, so there's no local resource use and the data persists
- the user can check the results afterwards, as well as share them
Quick demo: https://www.youtube.com/watch?v=Sas8iWMCBA4
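For anyone curious what that looks like in code, here's a minimal sketch of the append-a-row idea, assuming the googleapis npm package and a service-account credential; the spreadsheet ID, sheet name, and row shape are hypothetical, not our actual setup:

```typescript
import { google } from "googleapis";

async function remember(spreadsheetId: string, fact: string): Promise<void> {
  // Service-account auth picked up from the environment.
  const auth = new google.auth.GoogleAuth({
    scopes: ["https://www.googleapis.com/auth/spreadsheets"],
  });
  const sheets = google.sheets({ version: "v4", auth });

  // Append one row per memory entry; the sheet doubles as a shareable,
  // user-inspectable log that persists across agent runs.
  await sheets.spreadsheets.values.append({
    spreadsheetId,
    range: "Memory!A:B",
    valueInputOption: "RAW",
    requestBody: { values: [[new Date().toISOString(), fact]] },
  });
}
```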
Yeah, this is the natural next step. We're currently combining LLMs and compute -- mostly in the form of giving agents tools, then terminal access, and now, most recently, sandboxes. The most logical next step is to give them specialized compute engines and frameworks for their tasks.
I've been building SQL agents recently, and nothing is better than just giving them access to Trino.
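Concretely, that can be as small as one tool the agent calls with raw SQL. A minimal sketch, assuming the trino-client npm package; the server address, catalog, and tool shape are placeholder assumptions, not my actual setup:

```typescript
import { Trino, BasicAuth } from "trino-client";

const trino = Trino.create({
  server: "http://localhost:8080",
  catalog: "hive",
  schema: "default",
  auth: new BasicAuth("agent"),
});

// Exposed to the LLM as a tool: the agent writes SQL, Trino does the compute.
async function runSql(sql: string): Promise<unknown[]> {
  const rows: unknown[] = [];
  const iter = await trino.query(sql);
  // Results stream back page by page; collect the data rows.
  for await (const result of iter) {
    if (result.data) rows.push(...result.data);
  }
  return rows;
}
```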
There's no explanation of what the 3MB in the title means.