He's also referred to Linux as "the GNU Emacs of all terminal emulators".
> It's now the GNU Emacs of all terminal emulators.
> (Linus Torvalds, regarding the fact that Linux started off as a terminal emulator.)
http://neil.franklin.ch/Jokes_and_Fun/Linux_Quotes.html
That's the best reference I can find, but even if it's totally legit it doesn't make any sense to me.
Perhaps it's a reference to how Emacs does "everything"; some have also joked that Emacs is an OS with a text editor.
> I will never not find this kind of project incredibly impressive
I wouldn’t call it incredibly impressive. The path to writing a minimal multi-tasking kernel was beaten decades ago.
Writing a kernel that can boot and do a few things is ‘just’ a matter of being somewhat smart and having some perseverance. Doing it for RISC-V complicates things a bit compared to x86, but there, too, the information about initialising the hardware is often easy to obtain (for example: https://wiki.osdev.org/RISC-V_Meaty_Skeleton_with_QEMU_virt_... though I don’t know whether this project used it; see the sketch after this comment).
I think the author agrees, given (FTA) that they wrote “This is a redo of an exercise I did for my undergraduate course in operating systems”.
It’s work, maybe even nice work, but I think anybody with a degree in software engineering could do this. Yes, the OS will likely have bugs and will certainly have rough edges, but getting something that can multi-task, with processes shielded from each other by an MMU, isn’t hard anymore.
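To make the "initialising the hardware" step above concrete, here is a minimal sketch of the kind of bare-metal starting point that wiki page walks through. It assumes QEMU's RISC-V `virt` machine, which maps an NS16550A-compatible UART at 0x10000000, running with no firmware (qemu-system-riscv64 -machine virt -bios none), plus a few lines of boot assembly and a linker script (omitted here) that set up a stack and jump into C. The names kmain, uart_putc, and uart_puts are purely illustrative and not taken from the project being discussed.

```c
/*
 * Minimal bare-metal "hello" sketch for QEMU's RISC-V `virt` machine.
 * Assumptions: NS16550A UART mapped at 0x10000000 (QEMU's virt layout),
 * no firmware (-bios none), and boot assembly that sets up a stack and
 * calls kmain(). Register offsets are the standard 16550 layout.
 */
#define UART0       ((volatile unsigned char *)0x10000000)
#define UART_THR    0     /* transmit holding register */
#define UART_LSR    5     /* line status register      */
#define LSR_THRE    0x20  /* "THR empty" bit           */

static void uart_putc(char c) {
    /* Busy-wait until the UART can accept another byte, then send it. */
    while ((UART0[UART_LSR] & LSR_THRE) == 0)
        ;
    UART0[UART_THR] = c;
}

static void uart_puts(const char *s) {
    while (*s)
        uart_putc(*s++);
}

void kmain(void) {
    uart_puts("hello from a toy RISC-V kernel\n");
    for (;;)
        ;  /* nothing else to do yet: park the hart */
}
```

Everything the parent mentions (a scheduler, per-process isolation via the MMU) layers on top of the same pattern: read the privileged spec and the device tree, then poke the right memory-mapped registers and CSRs.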
To you, maybe. The subset of the population that is interested, smart, and persevering enough to do this is extremely tiny.
I am 99.99% sure that fewer than 20% of Australian graduates could do this, and honestly I wouldn't be surprised to hear that the actual figure is <1%.
I was studying at Monash, which is considered a solid university here, and holy moly are the standards low. I had classmates in the second year of my machine learning postgrad asking me things like "What is machine learning?", and they all graduated and found jobs anyway.
I'd agree with this. I did a double degree in Comp Sci/Comp Sys Eng at RMIT (1998-2002), and even from that era I'd say it's largely true. Of the people who did my course (and those I knew from other degrees like Comp Sys Eng/Business), very few are still doing deep technical programming for a career, or hobby programming on deep technical non-web things on the side. The rest are mostly working for places like consulting companies, banks, big data, Telstra, etc., in management roles like project manager, scrum master, solutions architect, or change management. A lot of folks, I think, were just not that interested in things like writing an OS, how virtual memory works, or how the hardware works, so they gravitated out of software development roles and into management. Nothing wrong with that, but I just think not everyone is interested in or capable of writing an OS!
What did they find jobs in? Australia has like one tech company.
Which is odd since their universities have built two of the most interesting CS projects I can think of (Mercury and L4). And WWWJDIC I suppose.
Folks end up at all sorts of places. Like I mentioned above, the banks hoover up a lot of graduates. There are a lot of smaller local companies doing web stuff. The consulting companies all have a presence here (KPMG, Accenture, Fujitsu Consulting, etc).
Is there a book one can read to learn how to create one?
That is an excellent recommendation.
For operating systems, anything Andy Tanenbaum did is world class.
This made me look up what he has been up to: there is a 2023 edition of "Modern Operating Systems" that covers cloud virtualization and Android along with everything that came before. Hm, tempting.
We really need a Tech Writer Hall of Fame. W. Richard Stevens, Andrew Tanenbaum, P.J. Plauger. Others?
> I think everybody with a degree in software engineering could do this
Ideally this would be true, but it hasn't been my experience at all, at least with American graduates; I can't speak for other countries.
My CS undergrad in the UK had us write an ARM kernel with scheduling and IPC, though it didn’t require us to use the MMU.
It is equally valid to say that Stallman's decision to start writing a C compiler and Unix utilities (in 1984, whereas the Linux project started in late 1991) paved the way to getting an open-source version of Unix installed on billions of machines.
I agree - there were a number of kernels that were "open source" and released close enough in time to Linux (e.g. 386BSD in '92) that I could see any of them winning the "community battle" and taking that space instead, but there was no real credible "development toolchain" equivalent until decades later.
Though I'm unsure how differing licenses might have affected this - I suspect that really early in its development the "copyleft" nature of the GPL didn't make much of a difference for Linux, as from what I remember most commercial uses of Linux didn't come until it had already gained significant momentum.
The copyleft nature was essential to good driver support. It set things up such that, for corporations making drivers, the easiest path was to get the driver upstreamed. There were a bunch of hoops they could have jumped through to avoid that (as some did, like Nvidia), but upstreaming became a sort of default.
Copyleft encourages a collaborative relationship between entities because it makes playing things close to the chest with IP involve more legal effort (if it's possible at all).
Yes, I can see that stalling development, since (at best) it turns into a pile of private forks rather than a cohesive project, but from what I remember that was already after Linux had "won" the "open source kernel" race.
Commercial support for Linux was... sparse... before the early 2000s.
> [Stallman/GNU] getting an open source version of Unix installed on billions of machines.
Agreed. Funnily enough, GNU tools/compilers also ended up getting installed on a lot of proprietary UNIXes, because proprietary UNIX was mostly shit (in user space!) - at least most of the ones I had the misfortune to have to work on.
I first came across GNU tools on NeXTSTEP, which wasn't too bad.
If Stallman had started with a kernel, very few people would have had the legal right to run any utilities or apps on the new kernel. GNU's utilities and apps (e.g., Emacs), on the other hand, were immediately useful (i.e., without breaking any copyright law or violating any software license) to a large population, namely anyone with an account on a proprietary Unix system. That explains why Stallman chose to start with the userland.