File over App: A Philosophy for Digital Longevity

57 points | posted 5 days ago
by surprisetalk

23 Comments

uludag

16 hours ago

I completely agree with this take. There's an immense feeling of calm when your digital life revolves around files you completely own. I've managed my entire productivity around a single org-mode file for the past five years and it perfectly adapts to my needs. Add Syncthing on top of that and I have a robust system I can use on mobile that works perfectly for me.

I feel I could leave and come back in 20 years and still use my org-mode productivity file, Emms[1] playlist, verb[2] HTTP request book, etc. exactly as I left them. What would my data in Notion, Postman, etc. look like in 20 years if left unattended?

[1] https://www.gnu.org/software/emms/
[2] https://github.com/federicotdn/verb

brendyn

16 hours ago

I heard Syncthing has recently given up on developing its Android app. Unfortunately, the filesystem on Android has become more and more restrictive, so using one app to globally manage files that reside in each individual app's container is becoming more difficult. I once synced 70k small files to Android and it took two months to complete due to the virtual filesystem.

reddalo

19 hours ago

This is exactly why I like to document my APIs using Bruno [1] instead of Postman or Insomnia.

Bruno creates plain-text files that I can easily read with any text editor, and as an additional bonus, I can version my files using Git.

[1] https://www.usebruno.com/

rishikeshs

19 hours ago

This is very good. I shall add it to the list!

dzsekijo

16 hours ago

At my previous workplace I used Workflowy to keep my to-do list. I had a script that generated OPML from data pulled from Bugzilla / GH Issues / Jira, which I could then paste into Workflowy and, from then on, update interactively in their web UI. The Workflowy data was synced daily to Dropbox (as OPML), which in turn was synced to my local machine, so I had all the data at hand in a usable format even offline.

(OK, OPML is not exactly human-friendly as such; when I needed to access it locally, I parsed it and used it in an interactive programming environment.)

This was a nice hybrid workflow.
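The issues-to-OPML half of a workflow like this can be sketched with the standard library alone. The issue records and field names below are hypothetical stand-ins for whatever a tracker API returns:

```python
import xml.etree.ElementTree as ET

# Hypothetical issue records, standing in for data pulled from
# Bugzilla / GH Issues / Jira via their APIs.
issues = [
    {"id": "GH-101", "title": "Fix login timeout"},
    {"id": "JIRA-7", "title": "Update billing schema"},
]

def issues_to_opml(issues):
    """Render a flat list of issues as an OPML outline string."""
    opml = ET.Element("opml", version="2.0")
    body = ET.SubElement(opml, "body")
    for issue in issues:
        # OPML stores each item's text in the 'text' attribute of <outline>.
        ET.SubElement(body, "outline", text=f'{issue["id"]}: {issue["title"]}')
    return ET.tostring(opml, encoding="unicode")

print(issues_to_opml(issues))
```

The same `ElementTree` module can parse the OPML back for local, offline use, which is roughly the "interactive programming environment" step described above.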

magic_hamster

20 hours ago

Text files are amazing for longevity - however, they are obviously limited in the type of data they can contain (text).

Personally I create a lot of visual data such as images, drawings and video, as well as audio. I try to have everything accessible in the most rudimentary lossless format, but in this domain there are so many tradeoffs. It would be interesting to read (or perhaps write) a similar post for this sort of data.

rishikeshs

19 hours ago

I had the same thought while writing this. I do not have an answer other than plain text.

But in terms of preservation and archiving, I'm thinking of storing the files in binary as a backup.

kelvinjps10

16 hours ago

Just link to the files? In Markdown you can just link to the files.

XorNot

19 hours ago

I have been swinging around to the opinion that SQLite + files is perhaps the universal data format after all. Files give you efficient blob storage, and the SQLite database can in fact encode most types of constraints and structure.
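A minimal sketch of that split, using Python's built-in `sqlite3`: structured metadata lives in SQLite (with constraints), while large blobs stay as plain files referenced by path. The table and column names are illustrative, not from any real schema:

```python
import sqlite3

# Structured metadata in SQLite; the actual bytes live in ordinary files
# on disk, referenced by path.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE attachments (
        id      INTEGER PRIMARY KEY,
        title   TEXT NOT NULL,
        path    TEXT NOT NULL UNIQUE,  -- file on disk holds the blob
        sha256  TEXT NOT NULL          -- integrity check for the blob
    )
""")
conn.execute(
    "INSERT INTO attachments (title, path, sha256) VALUES (?, ?, ?)",
    ("Holiday photo", "blobs/2024/photo-001.jpg", "ab12cd34"),
)

# Queries, uniqueness, and foreign keys come for free; the filesystem
# still gives you efficient, tool-agnostic blob storage.
row = conn.execute(
    "SELECT path FROM attachments WHERE title = ?", ("Holiday photo",)
).fetchone()
print(row[0])
```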

miki123211

17 hours ago

The problem with files is that they usually don't provide what users need or expect in 2024.

Having multiple users editing one file at the same time is hard, especially if they're non-technical and they don't understand git diffs. To make that work, you need CRDTs (or operational transforms), and those can't really be represented nicely in plain text formats.

Even something like a music library, where you have n devices authorized to make changes, each device keeps an offline copy, and all changes get "synced up" when devices come online, is just far, far easier to implement with a server guarding a database than with a raw file on some cloud drive.

munhitsu

11 hours ago

A CRDT at a very low level is an append-only operation log with snapshots. This can easily be stored in a file. The trick is merging with another version you just got from a friend. But if the operations are idempotent, appending two operation logs is simple.

Yes, ideally the OS would provide a container that it natively merges, but meanwhile nothing prevents apps from storing their data in, say, SQLite-based append-only logs and, when they need to resolve a conflict/import/merge, just appending the new operations.
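A toy sketch of that idea, assuming each operation carries a globally unique id so appends are idempotent (this glosses over ordering and conflict semantics, which a real CRDT must define):

```python
import sqlite3

def open_log():
    """An append-only operation log; op_id makes appends idempotent."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE ops (op_id TEXT PRIMARY KEY, op TEXT NOT NULL)")
    return db

def append(db, op_id, op):
    # INSERT OR IGNORE: replaying an operation we already have is a no-op.
    db.execute("INSERT OR IGNORE INTO ops VALUES (?, ?)", (op_id, op))

def merge(db, other):
    """Merge a friend's log by appending every operation we haven't seen."""
    for op_id, op in other.execute("SELECT op_id, op FROM ops"):
        append(db, op_id, op)

mine, theirs = open_log(), open_log()
append(mine, "a1", "add track X")
append(theirs, "a1", "add track X")   # shared history: duplicate is ignored
append(theirs, "b2", "add track Y")
merge(mine, theirs)
print(mine.execute("SELECT COUNT(*) FROM ops").fetchone()[0])  # prints 2
```

Because the merge is just "append everything, ignore duplicates", merging the same log twice, or in either direction, converges to the same set of operations.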

miningape

16 hours ago

I do agree for general users, but the post seems targeted at a more technical audience. I don't expect HR or finance to care about file types and storage beyond being something they know how to work with. I do expect developers to care though, and developers also (generally) prefer "self-managed" solutions like git over CRDT for editing files.

I really disagree with the idea in software design that even when you are making a tool for technical people, it must be implemented in a way that non-technical people expect. Like how Postman is trying to be a Google Docs for HTTP requests. Programming languages are our favourite technical tools, and we don't expect features that are nice for non-techies; we expect features that confuse non-techies but make our lives easier!

miningape

17 hours ago

This is why I'm building my own Postman replacement which runs request files stored locally (it looks something like the ES query console). It will give me a lot of flexibility to change and share my requests and configurations, as well as actually letting me store my files locally.

(I'm also building an assertion feature that lets me assert on the shape/data of the request/response, which will come in very useful when testing API changes.)

Here's what a request to the pokeapi looks like with a query parameter to limit the results:

```
@baseUrl = "https://pokeapi.co/api/v2"

GET /pokemon
- query:
    limit: 20
```

luis_cho

16 hours ago

Take a look at .http files. They are a file-based alternative to Postman. I don't think they have the assertion feature, though.
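For comparison, the request above might look roughly like this in the .http format (as supported by tools such as VS Code's REST Client extension and the JetBrains HTTP Client; exact feature support varies by tool):

```http
@baseUrl = https://pokeapi.co/api/v2

### List pokemon, limited to 20 results
GET {{baseUrl}}/pokemon?limit=20
Accept: application/json
```

Variables are declared with `@name = value` and interpolated with `{{name}}`; `###` separates requests within one file.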

miningape

16 hours ago

This is incredibly similar (we even landed on the same @base syntax)! Thanks for sharing, I think I'll have to look up some parsers to see if there are any tricks I've missed.

I'm also not sure whether .http files provide a way to share configuration across a set of files or inherit properties, similar to how in Postman you can share a set of properties within a folder.

rhl314

18 hours ago

Text files are amazing, and when you need to structure data you can use SQLite.

I am using it for https://loadjitsu.io/. I'm still looking for a good solution to seamlessly sync a local SQLite database to the cloud for backup when the user wants it.

benfortuna

20 hours ago

Isn't this "philosophy" just a rehash of Open File Formats:

https://en.wikipedia.org/wiki/List_of_open_file_formats

Albeit limited to just plain-text formats.

rishikeshs

19 hours ago

Hi, author here!

I think when I scribbled down this thought, the idea was more about formats that MIGHT endure the test of time. Yes, being free of copyright restrictions matters a lot. But I'm very skeptical of formats like, say, WebP. Will it last for decades? I don't know!

TacticalCoder

18 hours ago

File over online app. I'm all for it.

But I'm not concerned about the ability to read this or that obscure file format in three decades: just look at the retro community accessing files in old formats for the C64 or whatever old machine.

In a way we now have "app as file": be it a container build file or a complete VM, we can emulate pretty much anything and everything as long as it doesn't depend on something online.

Any app, on any OS, as long as it doesn't require a proprietary online server, can be emulated or virtualized.

I can run the old DOS programs I wrote back in 1990 or so, decoding weird picture file formats. I've got an emulator running on a Pi hooked to an adapter in my vintage arcade cab emulating thousands of arcade games.

If anything, it is easier to access all those old apps and file formats today than back in the day, because you can manipulate them from a much more powerful system.

Rant done; off to Proxmox to create a container installing QEMU to emulate a Raspberry Pi 2 (it's more convenient to test in an emulator and then deploy later on the real thing).
