Rochus
2 months ago
> about rust. Is it better than, for example, oberon, or it's descendents?
It's a completely different language with different focus and features. "better" depends on the requirements you want to solve with the language. Rust can do memory management at compile time whereas Oberon and its descendants (such as my Oberon+ version) use a garbage collector at runtime. Rust is much bigger and more complex than Oberon, but it also has features which make it more suitable for systems programming. Simpler is not automatically better. If a language is too "simple" (i.e. minimalist), the complexity often shifts from the language specification to the application code, requiring more boilerplate or "clever" tricks to solve complex problems.
And it was not only simplicity (as Vidar assumed): Oberon indeed had a different approach to OO in that there were no bound procedures. The idea was (as you can observe e.g. in the Oberon systems and in publications by Wirth and his PhD students) that "type extension" (i.e. inheritance) is used to create extensible message hierarchies, which are then handled polymorphically by procedures accepting a VAR parameter of the most general message type. In such a handler, the messages are dispatched by the IS operator or the WITH or CASE statement. This has effects similar to what is achievable with sum types, and interestingly it is closer to Alan Kay's view of OO than the implementation found in Smalltalk-80 (notably, Wirth arrived at this message-centric model independently of Kay's work in the Smalltalk lineage).
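A rough transliteration of this pattern into Rust may make it concrete (the message names are invented for illustration; in Oberon the variants would be record types extending a base Msg record, and the handler would branch with WITH/IS rather than match):

```rust
// Sketch of Wirth-style message dispatch. An enum plays the role of
// the extensible message hierarchy; match plays the role of WITH/CASE.
enum Msg {
    Draw { x: i32, y: i32 },
    Resize { w: u32, h: u32 },
    Other, // messages this handler does not understand are ignored
}

// The "handler": one procedure receiving the most general message type.
fn handle(msg: &Msg) -> String {
    match msg {
        Msg::Draw { x, y } => format!("draw at ({x}, {y})"),
        Msg::Resize { w, h } => format!("resize to {w}x{h}"),
        Msg::Other => String::from("ignored"),
    }
}

fn main() {
    println!("{}", handle(&Msg::Draw { x: 1, y: 2 }));
    println!("{}", handle(&Msg::Resize { w: 640, h: 480 }));
}
```

The key structural point is that the dispatch logic lives in the handler procedure, not in methods bound to the types themselves.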
mikethe
2 months ago
OO is a big issue here. Or rather, in general. I have read "The Art of Unix Programming", where a very big fuss is made over object-orientation. It's like they want to take a really big gun that means big business, and shoot it. Ever since reading that, I yelp whenever I see it mentioned, and squint in suspicion at the offender. So when a titan like Wirth endorses it.. ..there is obviously something of great importance that needs to be clearly understood, and that is -not- obvious.
The same thing when I see mentions of C++. Except that I haven't yet found endorsements of it from anybody of Wirth's stature. Sites like suckless really do not help.
Related: https://plan9.io/wiki/plan9/lfaq/index.html#HARDWARE_AND_SOF...
" Is it object-oriented?
No, not in the conventional sense. It is written in a strict dialect of ISO/ANSI C. In a wider sense, its general design of making all its ``objects'' look like files to which one talks in a well-defined protocol shows a related approach. "
Rochus
a month ago
Raymond argues in the referenced book that while OOP has value in specific domains, its benefits have been oversold and its typical usage often conflicts with the core Unix philosophy of simplicity and transparency. I think that's a valid position, and hyping and overselling is a continuous fad in the programming industry, which is famously susceptible to what Fred Brooks called the "Silver Bullet" syndrome: the desperate search for a single magical technology that will solve the inherent difficulties of software engineering. The "simplicity and transparency" Raymond champions are indeed virtues, but they are also slogans used to rally a specific community. Just as "Memory Safety" is the rallying cry for Rust today, "Small is Beautiful" was the rallying cry for Unix in the 1970s.
The industry seems to advance not by finding a static "truth," but by lurching between opposing philosophies, each powered by its own wave of activism. I recommend ignoring the "moral" arguments and focusing on the engineering trade-offs. Most "revolutions" are just rediscoveries of old ideas: Functional Programming is largely Lisp (1958) re-branded, and the "Anti-OO" movement is partly a return to procedural modularity (Pascal/C). Our job is not to be "pure" but to build working, maintainable software using the best tools we can find for the specific problem at hand.
mikethe
a month ago
Greetings again..
I have now read more of the book on oberon, and confess that I lost traction. It seems very clear that he is using objects, and my bias against them is very strong. I have heard it said that both oberon and plan9 are beautiful, but completely different.
I'd be interested in your take on the two of them: when, if, each of them should be used, and why. I am currently attempting to install plan9/9front/9legacy et al. I understand that oberon, being much smaller, could be very useful for learning purposes, but if you don't actually approve of OO, then.. is it really the kind of thing you want to be teaching people? Obviously, you want people to see a bit of everything, just so they are aware of alternate ways of doing things. At the same time, most people will assume that the first thing you show them is the way you actually want them to do it, and will treat the alternatives as what you present them as: curiosities, potentially useful. So, if the objective is plan9, then what would actually be needed would be a .. well.. a tutorial for plan9. A tutorial that gradually built the operating system, explaining the logic step-by-step. There -is- actually at least one tutorial that does such a thing, that I am aware of, but what it takes as its inspiration, I haven't checked out yet. This is the one: https://os.phil-opp.com/
Rochus
a month ago
> my bias against them is very strong
Why? OO is just a tool like any other (e.g. functional programming, and so on). From an engineering perspective (where engineering is essentially the art of solving problems using technical means), making a fundamental a priori decision for or against a particular tool makes little sense.
Oberon and Plan9 are indeed completely different. Oberon is in addition both a programming language (with OO features) and an operating system (using those features). I wouldn't "use" either; both are very valuable for studying and experimenting. Oberon excels in "minimalism" ("write a decently complete operating system with very little effort and means"), while Plan9 is the "spiritual successor" of Unix, which refined existing concepts and added innovations (e.g. for multithreading and inter-thread communication). The answer to your questions depends on your specific situation. If you're a student, then I would recommend optimizing your time for what you would like to do professionally later. Universities tend to be "ivory towers", feeding students a lot of stuff for historic or "academic" (i.e. random preference and interest of the professors) reasons with little respect for "efficiency" (i.e. usefulness per hour spent). Basics are very important, but the "basics" which you learn at universities in "computer science" are rather overrated and not very useful for most people, in my humble opinion. I would recommend reading books and experimenting with different programming paradigms and languages if you want to go for a programming career. How operating systems work is mostly important if you want to build them yourself professionally or do low-level programming like OS drivers or embedded systems. My answer is therefore somewhat more general than you might have hoped for, but it depends greatly on where you are now and where you want to go.
If you want a 'step-by-step tutorial' like the phil-opp blog, Oberon is actually closer to that experience. The 'Project Oberon' book literally builds the compiler and OS from scratch. Plan 9 is different; it's a production research system, so you learn it by reading the source and man pages, not by building it.
mikethe
a month ago
"Why?"
Because of what Raymond said. He said two things that I remember fairly clearly:
1) If you have to resort to OO to solve your problems, you probably have another unresolved problem at a higher level. This is from near the beginning of the book.
2) OO succeeds mainly by increasing the global complexity of the problem.
All I'm doing is answering your question. It is true I wasn't totally clear about this. It is very clear that you understand his detailed logic better than I do. Reading the rest of your post now.
Rochus
a month ago
Ok, I see. Raymond wrote this in 2003, at the peak of the "Java OO madness", when deep inheritance and over-engineered, totally inefficient systems were common. In The Art of Unix Programming, Raymond clarifies that he isn't against "objects" as a conceptual tool but against the "gospel of OO" that dominated the 1990s. His main argument is not that OO is inherently "evil", but that OO designs often become "spaghetti-like tangles" of inheritance hierarchies that hide logic behind "thick layers of glue". He represents the "Unix Philosophy", which is a specific culture, and he was writing to defend this culture against the rising tide of Java and C++, which he saw as a threat to the simplicity and "text-based" interoperability of Unix. Interestingly, Wirth's Oberon and its OO approach actually align well with much of Raymond's philosophy. Raymond would certainly agree that OO is just one of many tools in the toolbox of an engineer, and that our job is to choose the right tools for the job.
mikethe
a month ago
Well, we basically agree on how I should go about it, which is good :-)
Jehanne is now on my radar..
mikethe
a month ago
"Should be cast in bronze" comes to mind.
mikethe
a month ago
To clarify, you obviously understand OO better than most, certainly much much much better than I do. I happen to have Wirth's book on oberon here, have started reading it. So possibly, eventually, I will reach your level. :-)
cxr
a month ago
> "type extension" (i.e. inheritance) is used to create extensible message hierarchies, which are then handled polymorphically by procedures accepting a VAR parameter of the most general message type. In such a handler, the messages are dispatched by the IS operator or the WITH or CASE statement. This has effects similar to what is achievable with sum types, and interestingly it is closer to Alan Kay's view of OO
That seems off. Kay's comments have always struck me as in line with Brad Cox's views. Cox's book uses this example a lot as a poor/insufficient substitute for dynamic dispatch.
Rochus
a month ago
> Kay's comments have always struck me as in line with Brad Cox's views
It is unlikely that the two would have agreed on this. Kay's view is actually based on messages, as implemented in Erlang, for example, and to some extent in Smalltalk-72. Cox, on the other hand, implemented the object and dispatch model of Smalltalk-80, which Ingalls invented and published in 1978, almost exactly, even with the same method lookup caching.
> Cox's book uses this example a lot as a poor/insufficient substitute for dynamic dispatch.
The Wirth and Smalltalk approaches are both fundamentally "late-bound search" mechanisms, differing mainly in whether the search state is optimizable by a central engine (VM) or fixed in the user's explicit control flow (handler). This is a classic example of dualism in computer science, specifically the Expression Problem (or the Data/Operation duality). You are simply traversing a 2D matrix of (Types × Operations), just choosing a different axis as primary. The two are mathematically isomorphic: both perform a directed graph traversal to find the code that matches (CurrentType, CurrentMessage). Only the ergonomics and the possibility for caching differ. Smalltalk hides the dispatch loop in the VM, which can do caching, so the dispatch effort goes from O(N) to O(1) in time; Wirth exposes the dispatch loop in the WITH statements, where the dispatch effort remains O(N).
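The two axes of that matrix can be sketched in Rust (the Shape/area names are invented for illustration, not taken from either system):

```rust
// Axis 1 (Wirth/handler style): operations are free functions, the
// set of types is enumerated in a match — adding an operation is easy,
// adding a type means revisiting every match.
enum Shape { Circle(f64), Square(f64) }

fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle(r) => std::f64::consts::PI * r * r,
        Shape::Square(w) => w * w,
    }
}

// Axis 2 (Smalltalk style): each type carries its operations behind a
// late-bound interface — adding a type is easy, adding an operation
// means revisiting every type.
trait HasArea { fn area(&self) -> f64; }
struct Circle(f64);
struct Square(f64);
impl HasArea for Circle { fn area(&self) -> f64 { std::f64::consts::PI * self.0 * self.0 } }
impl HasArea for Square { fn area(&self) -> f64 { self.0 * self.0 } }

fn main() {
    // Same (type, operation) cell reached from either axis:
    println!("{}", area(&Shape::Square(2.0)));
    println!("{}", area(&Shape::Circle(1.0)));
    let shapes: Vec<Box<dyn HasArea>> = vec![Box::new(Circle(1.0)), Box::new(Square(2.0))];
    let total: f64 = shapes.iter().map(|s| s.area()).sum();
    println!("{total}");
}
```

Both halves compute the same cells of the matrix; only which dimension is "open" to extension differs.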
cxr
a month ago
Perhaps I'm misreading a different (opposite/orthogonal) intent from what you meant when you wrote the quoted passage in your initial comment. Some form of dynamism is required, else it fails the "extreme late binding" criterion that Kay insists is fundamental to his view of OO.
I'm not familiar enough with Smalltalk-72 or what Ingalls did that makes it so different from the Smalltalk-80 that Cox read about in Byte.
> differing mainly in whether the search state is optimizable by a central engine (VM) or fixed in the user's explicit control flow (Handler). This is a classic example of dualism in computer science, specifically the Expression Problem (or the Data/Operation duality). You are simply traversing a 2D matrix of (Types × Operations), just choosing a different axis as primary.
If you are doing whole-system development and have control over the entire thing (a "closed world" system), then it is that simple. But whether it's an open world or a closed world changes things.
Cox is fond of a simple example that he repeats in his book (fairly early on—it's on something like page 9) to demonstrate that dynamic dispatch is fundamentally necessary because it means you don't have to have panoptic control/involvement over the objects in a system (with all types known at compile time). If you're programming every operation with switch statements that select code paths based on objects' type tags, not only do you have to go visit all N routines where each of those N operations are implemented to update them when introducing a single type, but it also requires a priori knowledge of all types to be baked into the system that you release at the time of release, whereas OO on a live system means that you can introduce new types even after the initial system has shipped to the user.
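A minimal sketch of that argument in Rust terms (the names are hypothetical, not from Cox's book): code shipped against an interface keeps working for types defined afterwards, whereas a match over a closed enum must be edited for each new variant.

```rust
// "Shipped" code: written once, knows nothing about concrete types.
trait Describable {
    fn describe(&self) -> String;
}

fn report(items: &[Box<dyn Describable>]) -> String {
    items.iter().map(|i| i.describe()).collect::<Vec<_>>().join(", ")
}

// "User" code, added after the fact: a brand-new type participates
// without any change to report() — no switch statements to revisit.
struct Printer;
impl Describable for Printer {
    fn describe(&self) -> String { "a printer".into() }
}

fn main() {
    let items: Vec<Box<dyn Describable>> = vec![Box::new(Printer)];
    println!("{}", report(&items));
}
```

In a tag-and-switch design, introducing Printer would instead require editing every routine that switches on the type tag.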
Rochus
a month ago
Sure. Late binding in Smalltalk-80 means that a bytecode method is selected via a hash table per class and the address of the internalized selector string (atom) as a key. In Oberon, procedures are natively compiled, but each module is dynamically loaded; a module can implement a handler for a message and be separately compiled and loaded by name, so again late binding.
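A toy model of that per-class lookup, sketched in Rust (invented names; a real Smalltalk-80 VM keys on interned selector atoms and adds a method cache, which is omitted here):

```rust
use std::collections::HashMap;

// Each class owns a hash table mapping selectors to code; lookup walks
// the superclass chain at call time, which is what makes it late-bound.
type Method = fn() -> &'static str;

struct Class {
    superclass: Option<usize>, // index into the class table
    methods: HashMap<&'static str, Method>,
}

fn lookup(classes: &[Class], mut class: usize, selector: &str) -> Option<Method> {
    loop {
        if let Some(m) = classes[class].methods.get(selector) {
            return Some(*m);
        }
        // End of the chain would trigger doesNotUnderstand in Smalltalk.
        class = classes[class].superclass?;
    }
}

fn demo_classes() -> Vec<Class> {
    let mut object = Class { superclass: None, methods: HashMap::new() };
    object.methods.insert("printString", (|| "an Object") as Method);
    let mut point = Class { superclass: Some(0), methods: HashMap::new() };
    point.methods.insert("x", (|| "3") as Method);
    vec![object, point]
}

fn main() {
    let classes = demo_classes();
    // "x" is found on Point; "printString" is inherited from Object.
    println!("{}", lookup(&classes, 1, "x").unwrap()());
    println!("{}", lookup(&classes, 1, "printString").unwrap()());
}
```

The caching Rochus mentions would memoize (class, selector) pairs so repeated sends skip this chain walk.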
If you're interested in the difference between Smalltalk versions, I recommend Ingalls's most recent paper: https://dl.acm.org/doi/10.1145/3386335. In contrast to Smalltalk >= 76, ST-72 had no bytecode, but sent tokens (synchronously) to object instances for parsing (which its authors called "message passing").
> If you are doing whole-system development and have control over the entire thing, then it is that simple
The dispatch mechanism and the described duality is the same, whether whole-system or not.
> Kay's view of OO is a matter of "open world" versus "closed world"
Smalltalk was always a "closed world" (you are always in the same image, but code can be compiled on the fly at runtime), and all calls were synchronous. Since the Oberon compiler and system treats each module as a dynamic loadable entity and supports loading by name, it actually supports the "open world" approach. Interestingly, Kay's view is likely best represented in Erlang, where there are true messages sent asynchronously.
> If you're programming every operation with switch statements that select code paths based on objects' type tags
As mentioned, Oberon traverses the "2D matrix" from the other side. Each module may or may not handle a message in a WITH (i.e. switch by type) statement, but modules per se are dynamic. So the "a priori knowledge" only applies to the message type.
mikethe
2 months ago
"Simpler is not automatically better. If a language is too "simple" (i.e. minimalist), the complexity often shifts from the language specification to the application code, requiring more boilerplate or "clever" tricks to solve complex problems."
..fascinating..
..it always comes down, again and again, to what is the problem you need to solve, and from there, what is the best solution. And at that point, once you have identified the problem.. in a way it's a toss-up whether you implement your own solution, or just happen to have heard of a solution to a similar or equivalent (or the same) problem, developed by somebody else. ..or how willing you are to dig in search of said hypothetical already-existing solution. And said hypotheticals must then themselves be evaluated, which also takes time.