HeyLaughingBoy
3 days ago
Reading this thread leaves me with the impression that most posters advocating learning assembly language have never had to use it in a production environment. It sucks!
For the overwhelming majority of programmers, assembly offers absolutely no benefit. I learned (MC6809) assembly after learning BASIC. I went on to become an embedded systems programmer in an era where compilers were still pretty expensive, and I worked for a cheapskate. I wrote an untold amount of assembly for various microcontrollers over the first 10 years of my career. I honestly can't say I got any more benefit out of it than programming in C; it just made everything take so much longer.
I once, for a side gig, had to write a 16-bit long-division routine on a processor with only one 8-bit accumulator. That was the point at which I declared that I'd never write another assembly program. Luckily, by then gcc supported some smaller processors, so I could switch to the Atmel AVR series.
wat10000
3 days ago
I've written assembly in a production environment. I love it and wish I could do more.
But the context where I'm doing it is very different from the context where you had to write a division routine from scratch! We never use assembly where a higher-level language would be good enough. It's only used for things that can't be written in C at all, either because it needs an instruction that the C compiler won't emit, or it involves some special calling convention that C can't express.
However, I read assembly in production all the time. Literally every day on the job. It's absolutely essential for crashes that won't reproduce locally, issues caused by compiler bugs, or extremely sensitive performance work. Now, lots of programmers very rarely have to deal with those sorts of things, but when it comes up, they'll be asking for help from the ones who know this stuff.
xxpor
3 days ago
+1 on reading.
I hardly consider myself an expert ARM ASM programmer (or even an amateur...), but a baseline ability to read it, even if you have to look up a bunch of instructions every time, can be super useful for performance work, especially if you have the abstract computer-engineering know-how to back it up.
For example, it turns out that gcc 7.3 (for arm64) doesn't optimize

    foo() ? bar() : baz();

the same as

    if (foo()) {
        bar();
    } else {
        baz();
    }

The former was compiled into a branchless set of instructions, while the latter had a branch!
ferguess_k
3 days ago
Just curious, what kind of work do you do? Sounds like SRE in FAANG that deals with production systems. I work as a DE and the lowest level thing I have to read is JVM dump from spark. Man I envy you.
wat10000
3 days ago
Close, OS development at a FAANG.
ferguess_k
20 hours ago
Wow this is a dream job.
a_cardboard_box
3 days ago
> I once, for a side gig, had to write a 16-bit long-division routine on a processor with only one 8-bit accumulator. That was the point at which I declared that I'd never write another assembly program.
This is exactly the kind of job I'd enjoy! A perfectly doable technical challenge with clear requirements. Some people like solving Sudoku puzzles, I like solving programming puzzles.
I guess I'm just not "the overwhelming majority of programmers".
HeyLaughingBoy
3 days ago
> doable technical challenge with clear requirements
That's a Project Management issue, not an implementation concern.
In my case, there was no requirement that said "use 16-bit long division." However, we had committed to a particular processor family (MC68HC05), and the calculation precision required 16-bit math. IIRC, there was a compiler available, but it cost more than the rest of the project and the code it produced wouldn't have fit into the variant of the processor that I was using anyway.
The actual requirement would have looked more like "detect a 0.1% decrease in signal that persists for 10 seconds, then do X."
Animats
3 days ago
Oh, yes, that era. I had to program a MC68HC11 in Forth because the C compiler was so expensive.
markus_zhang
3 days ago
Man what a blast!
commandlinefan
3 days ago
> the kind of job I'd enjoy
I feel the same way, but I also can't help but imagine the boss jumping up and down and throwing chairs and screaming "how can you not be done yet? You're a programmer and this is a program and it's been three _hours_ already".
markus_zhang
3 days ago
I totally agree. I read and commented the source code of Woz's SWEET16 and it was a blast to fully understand it.
But of course, might not be that rosy if under great time constraints.
favorited
3 days ago
I think the majority of programmers would enjoy it, but most would first need to pick an ISA (something older is probably going to be more approachable for beginners), learn enough about it to understand basic arithmetic instructions, learn enough about the dev tools to be able to assemble, link, and execute their code, etc.
For most folks, that's going to be a couple days of prep work before they can get to the fun part of solving the puzzle.
1vuio0pswjnm7
3 days ago
"I guess I'm just not the "overwhelming majority of programmers"."
The "overwhelming" majority of programmers may be underwhelming
Some readers may be unimpressed by programmers who complain about and criticise assembly language, e.g., claiming it offers "no benefit" to others, especially when no one is forcing these programmers to use it
WalterBright
3 days ago
I did a lot of assembler programming before discovering C. I learned C in maybe an hour because of that.
Not knowing assembler means programmers have a bit of a blind spot towards what are expensive things to do in C vs what generates the best code.
For example, debugging a program sometimes requires looking at the generated assembler. Recently I was wondering why my deliberate null pointer dereference wasn't generating an exception. Looking at the assembler, there were no instructions generated for it. It turns out that since a null pointer dereference was undefined behavior, the compiler didn't need to generate any instructions for it.
ozim
3 days ago
I build web applications that run on top of databases, web servers and frameworks.
I do need to understand how indexes in the DB engine work, I need to understand that there might be socket exhaustion in the system, and I do need to understand how my framework allocates data on the heap vs. the stack.
Dropping down to the instruction level is for the web-server, DB, and framework developers, not for me. I do have a clue how the low level works, but there is no need for it in my work.
That is the part where the parent poster is correct: there are better ways for developers to spend their time. Trust your database, web servers, and framework, and learn how those work in depth; you can skip assembler, because all of those will take a lot of time anyway, and most likely those are the layers you should (and can) tweak to fix performance, not assembler.
marssaxman
3 days ago
> since a null pointer dereference was undefined behavior, the compiler didn't need to generate any instructions for it.
I deeply hate this attitude in modern compiler design.
WalterBright
3 days ago
Me too. My compilers don't do that.
marssaxman
3 days ago
I'm glad to hear it. Thank you for caring.
cogman10
3 days ago
The issue is it's a moving target. What was expensive yesterday could be fast today based on compiler optimizations (and potentially vice versa).
Further, changes in the ISA can open up gains in performance that weren't available in yesteryear. An example of this would be SIMD instruction usage.
It's not a bad idea to know enough assembly language to understand why code is slow. However, the general default should be to avoid writing it. Your time would be better spent getting a newer version of your compiler and potentially enabling things like PGO.
zahlman
3 days ago
> most posters advocating learning assembly language have never had to use it in a production environment... For the overwhelming majority of programmers, assembly offers absolutely no benefit.
I don't follow. Why should assembly have to be useful or pleasant in a production environment, for learning it to be useful?
I was taught a couple different flavours of assembly in university, and I found it quite useful for appreciating what the machine actually does. Certainly more so than C. Abstractions do ultimately have to be rooted in something.
atoav
3 days ago
You and the post you commented on both make a valid point. If we're talking about using assembly as a broad general-purpose programming environment, that would be a mess (which is precisely why it has no broad adoption). When we talk about assembly as a niche special-purpose solution, we come to a different conclusion; coincidentally, that is where assembly is still used today: environments where we need highly optimized code.
Your point about education is orthogonal to the point made. I agree with you that learning assembly can be a good way to teach people how computers work on a low level, but that has nothing to do with whether it is useful as a skill to learn.
As someone teaching similar things at the university level to a non-tech audience, I always have to carefully weigh how many "practically useless" lessons a typical art student can stomach, and which kind of lesson will just deter them, potentially forever.
zahlman
3 days ago
> I agree with you that learning assembly can be a good way to teach people how computers work on a low level, but that has nothing to do with whether it is useful as a skill to learn.
I don't understand the distinction you're trying to make. The post I was replying to specifically discussed "learning assembly language". My entire point is that "learning assembly language" has purposes other than creating practical real-world programs in assembly.
bongodongobob
3 days ago
Is it useful to learn bagpipes? I guess learning for its own sake is good, but if you want to join a band, guitar or keyboards are going to be a better bet and learning bagpipes first isn't going to do much for you.
barrkel
3 days ago
Do bagpipes explain the mystery of sand performing calculations and taking actions? Do they give you an intuition for connecting how CPUs and memory accesses and cache hierarchies work with high level code, in such a way that you can start to understand why one version of code might be faster or slower than another?
If you can't see through field accesses and function calls to memory indirections, anything you might read about how TLBs and caches and branch prediction work doesn't connect to much.
bongodongobob
3 days ago
Guess what, almost no one knows how to program in assembly and yet everything is working out pretty good.
barrkel
2 days ago
I can say the performance of Windows Explorer lately, compared to how it was in Windows NT, does not impress me.
strken
3 days ago
If a guitar was an abstraction layer that was implemented by low-level bagpipes then a) that would be awesome and b) guitar players would find their guitar playing to benefit from bagpipe lessons. At the very least they'd be able to understand and maintain their guitar better.
bongodongobob
3 days ago
The connection is that they both play notes. You can play the same songs on both but no one wants to hear bagpipes.
SAI_Peregrinus
3 days ago
You can't play most of the same songs on both. Bagpipes (well, most forms of bagpipe, there are dozens, but unqualified people tend to mean the Scottish "Great Highland Bagpipe") are a diatonic instrument playing a just-intonation scale tuned not to cause discordant notes with their own drones, while guitars are a chromatic instrument fretted to play an equal-tempered intonation. The GHB plays in something rather close to the modern Mixolydian A mode with an augmented 4th, not any of the major or minor keys of modern Western music. The GHB and the guitar are entirely incompatible instruments, unless you're talking about a classical guitar with tied-on gut frets that could be replaced to allow playing the GHB scale.
SAI_Peregrinus
a day ago
To clarify in case of insult: by "but unqualified" I mean "but, without qualification as to which type of bagpipe". I had no intention to insinuate that "unqualified people" are the only ones who talk about the Scottish Great Highland Bagpipe! I myself will say that I "play the bagpipes" or "am a bagpiper" when referring to the GHB, even though I also play Ceilidh pipes sometimes (a different, smaller sort of Scottish bagpipe with a different drone tuning). I don't play the Irish Uillean Pipes, Galician Gaita, Northumbrian Smallpipes, any of the German, French, Italian, Greek, or other sort of bagpipes. Unqualified, bagpipe usually means Great Highland Bagpipe.
HeyLaughingBoy
2 days ago
I have no idea what any of that means, but I love the deep knowledge that it expresses :-)
zahlman
2 days ago
I'll try to simplify: you can't readily adjust the tuning of a bagpipe; it's tuned in a way that would make it sound horribly dissonant against other instruments (for important music-theory reasons); and it doesn't even play all the notes, so you can't play in all the major and minor keys of Western (== European from c. 1580 onward + American) music tradition - you're stuck with scales that only make sense for the genre of music that's specifically written for the instrument.
strken
2 days ago
What you're trying to say is that assembly is like the bagpipes and impractical. What I'm trying to say is that's a terrible metaphor because the main reason to learn assembly is understanding what your non-assembly code is actually executed as.
spc476
3 days ago
Learning the accordion didn't hurt Weird Al's career, nor did using the flute hurt Ian Anderson (lead vocalist and flutist of Jethro Tull).
bongodongobob
3 days ago
These are edge cases. Way to miss the point.
mabster
2 days ago
I started my career in assembly, and my use of it has reduced over time. Toward the end of my gamedev work I was still reading a lot of assembly but no longer writing it (using intrinsics instead). It was definitely a lot slower to write.
But there are a number of things we did that are not available or difficult in C:
- Guaranteed tail calls
- Returning multiple values without touching memory
- using the stack pointer as a general purpose pointer for writing to memory
- Changing the stack pointer to support coroutines
- Using our own register / calling convention (e.g. setting aside a register to be available to all routines)
- Unpicking the stack to reduce register setup for commonly used routines or fast longjmps
- VM "jump tables" without requiring an indirection to know where to jump to
ajross
3 days ago
> Reading this thread leaves me with the impression that most posters advocating learning assembly language have never had to use it in a production environment. It sucks!
It absolutely sucks. But it's not scary. In my world (Zephyr RTOS kernel) I routinely see people go through all kinds of contortions to build abstractions around what are actually pretty straightforward hardware interfaces. Like, here's an interrupt controller with four registers. And here's a C API in the HAL with 28 functions and a bunch of function pointer typedefs and handler mechanics you're supposed to use to talk to it. Stuff like that.
It's really common to see giant abstractions built around things that could literally be two-instruction inline assembly blocks. And the reason is that everyone is scared of "__asm__".
K0balt
3 days ago
This exactly. If you are doing embedded programming, direct register access and manipulation is often a far, far superior option, and you don't have to be some kind of "assembly sensei" to do it if you just have a very basic idea of how things work. It doesn't mean you write programs in assembly… it means that when you need to do something the hardware is going to do for you anyway, you know how to ask for it directly without having to load a 2-Kloc library. This is especially true when using Python or JS bytecode on the MCU. Actually, using Python with assembly is really the best of both worlds in many cases.
HeyLaughingBoy
3 days ago
Yeah, but doesn't the Zephyr device tree abstraction actually expect you to do that? I mean, I appreciate the elegance and the desire for portability, but all I could think of as I read those docs was "here's a couple months of work for something that should take ten minutes."
ajross
3 days ago
DTS[1] is there to parametrize things (like MMIO addresses and IRQ assignments) that need to be parametrized. The discussion here is about needless abstraction at the level of C code.
In, say, the interrupt controller case: there's a lot of value in having a generic way for boards to declare what the IRQ for a device actually is. But the driver should be stuffing that into the hardware or masking interrupt priorities or whatever using appropriately constructed assembly where needed, and not a needless layer of C code that just wraps the asm blocks anyway.
[1] And to be clear I'm not that much of a devicetree booster and have lots of complaints within the space, both about the technology itself and the framework with which it's applied. But none that would impact the point here.
ferguess_k
3 days ago
Most of us do not have the chance to use it in production. I think that's where the fancy came from.
We are also getting burned out by the modern Agile web/data/whatever development scene and would like to drill really deep into one specific area without stakeholders breathing down our necks every few hours, which assembly programming conveniently provides.
I also consider the grit (forced or voluntary) to be a trial by fire that significantly improved two important things: the programmer's understanding of the system, and the programmer's capability to run low-level code in his brain. Is it suffering? Definitely, but it is a suffering that brings technical prowess.
Most of us do not have the privilege to suffer properly. Would you prefer to suffer from incomplete documentation, very low-level code, and banging your head against a wall over tough technical problems, or from incomplete documentation, layers and layers of abstraction, stakeholders changing requirements every day, and actually knowing very little about technical stuff? I think it is an easy choice, at least for me. If there is an assembly language / C job that is willing to take me in, I'll do it for half the salary I'm earning.
flohofwoe
3 days ago
On 8-bit home computer CPUs like the 6502 or Z80, high level programming languages like C simply were not an option, you left too much performance on the table (not to mention BASIC which was easily 100x slower than handwritten assembly).
Forth was quite acceptable performance wise, but that's barely above a good macro assembler.
And after the 8-bitters, assembly coding on the Amiga was pure pleasure - also for large programs, since apart from the great 68k ISA the entire Amiga hardware and operating system was written to make assembly coding convenient (and even though C was much better on the 68k, most serious programs used a mix of C and assembly).
(also, while writing assembly code today isn't all that important, reading assembly code definitely is when looking at compiler output and trying to figure out why and how the compiler butchered my high level code).
jamesfinlayson
3 days ago
> (also, while writing assembly code today isn't all that important, reading assembly code definitely is when looking at compiler output and trying to figure out why and how the compiler butchered my high level code).
Agreed - I wouldn't be able to write any x86 assembly without a bit of help, but having done some game reverse engineering I've learned enough to make sense of compiler generated code.
pjmlp
3 days ago
To add to that, there is a reason why even all modern JITs also have ways to look into generated code.
Anyone curious how their JVM, CLR, V8, ART, Julia,.... gets massaged into machine code only needs to learn about the related tools on the ecosystem.
Some of them are available on online playgrounds like Compiler Explorer, Sharplab, ....
acegopher
3 days ago
> the entire Amiga hardware and operating system was written to make assembly coding convenient
I am curious what specific examples do you have of the HW and OS being made/written to make ASM convenient?
flohofwoe
3 days ago
The hardware could be controlled via memory-mapped 16-bit registers, e.g. checking whether the left mouse button is down is a single instruction:

    btst #6, $bfe001

The OS used a simple assembly-friendly calling convention: parameters were passed in registers instead of on the stack (and the API documentation mentioned which parameters were expected in which registers), and the reference manuals usually had both C and assembly examples, etc... basically lots of little things to make the lives of assembly coders easier. This YouTube playlist gives a nice overview of assembly coding on the Amiga (mostly via direct hardware access though): https://www.youtube.com/playlist?list=PLc3ltHgmiidpK-s0eP5hT...
kragen
2 days ago
Also, using a 68000 instead of a shitty Intel processor was a huge boon to assembly programming. Ultimately Intel won in the market and eventually even shipped processors that weren't profoundly unpleasant at the assembly level, but the 68000 is still a much more pleasant architecture for the assembly programmer. ARM is nicer still, but this was before ARM's existence as a separate company from Acorn.
mystified5016
3 days ago
If you have a less-popular CPU, compilers today can be utter trash. GCC doesn't understand the TinyAVR core and emits insane assembly. For example, when iterating over an array, instead of putting the array pointer in the Z register and using the post-increment load instruction, it will add to the pointer, read, subtract from the pointer, and loop. It also uses the slower load instruction. Overall, looping over an array in C is 4 times slower than assembly and consumes three times as much program space. Try examining the assembly from your next program; you'll probably be quite surprised at how awful it is.
I had to implement Morton ordering on this platform. The canonical C for this blows up to over 300 instructions per iteration. I unrolled the loop, used special CPU hardware and got the entire thing in under 100 instructions.
Compilers, even modern ones, are not magic and only understand CPUs popular enough to receive specific attention from compiler devs. If your CPU is unpopular, you're doing optimizations yourself.
Assembly doesn't matter to arduino script kiddies, but it's still quite important if you care at all about execution speed, binary size, resource usage.
Const-me
3 days ago
I think writing assembly indeed offers no benefit for most developers. However, being able to read and understand assembly is generally useful.
It enables debugging binaries and crash dumps without complete source code, like DLLs shipped with Windows or third-party DLLs. It also lets you understand what compilers (both traditional and JIT) did to your source code, which is useful when doing performance optimization.
anta40
2 days ago
I write mobile apps for a living (mostly Java/Kotlin, a little bit of Flutter/RN), so yeah, I agree assembly is practically useless for professional work.
But for tinkering (e.g writing GBA/NES games), hell why not? It's fun.
craftkiller
3 days ago
I never used it in production and yet learning it absolutely provided me with benefits. I didn't understand pointers until I spent a weekend learning assembly.
pjmlp
3 days ago
As someone that has spent some time in the 8 and 16 bit demoscene, I used my share of Z80, 80x86 and 68000.
It is all a matter of having high quality macro assemblers, and people that actually care to write structured documented code in Assembly.
When they don't care, usually not even writing in C and C++ will save the kind of code they write.
grishka
3 days ago
I've also heard the opinion that modern compilers are better at generating optimized code than someone writing assembly by hand. Not sure how true it is, but considering the unfathomable complexity of modern CPUs, it does feel believable.
mabster
2 days ago
As a low level performance guy I trust the compiler nowadays, especially with deep instruction pipelines. The compiler is beatable - a lot of the decisions are heuristic - but it takes a lot of work to beat it.
GianFabien
2 days ago
Last time I looked, Intel CPUs had something like 1,700 instructions, and every generation comes with an even more expanded ISA. I doubt that compilers use even a fraction of the ISA, especially considering that binaries are often expected to run on a wide range of older CPUs. I know that there are intrinsic functions which provide access to some of the powerful yet special-purpose instructions. It is unrealistic to expect the compiler to make effective use of all the fancy instructions you paid for with your latest hardware upgrade.
genewitch
2 days ago
> It is unrealistic to expect the compiler to make effective use of all the fancy instructions you paid for with your latest hardware upgrade.
I'd add "yet". I've ruminated that the reason new machines with similar shapes (quad-core to quad-core of a newer generation) don't immediately seem like as large a jump as they ought to, performance-wise, is that it takes time for people other than Intel to update their compilers to make effective use of the new instructions. icc is obviously going to generate faster-executing code on new Intel hardware sooner (in the sense of how long after the CPU is released, not `time`), but gcc will take longer to catch up.
There's a sweet spot from about 1-4 years after initial release where hardware speeds up, but toward the end of that run programs bloat and wipe out all the benefits of the new instructions, leading to needing a new CPU that isn't that much faster than the one you replaced.
Yet.
Which reminds me I need to benchmark a Linux kernel compile to see if my above supposition is correct, I have the timings from when I first bought it, as compared to a 10 year old HP 40 core machine (ryzen 5950 is 5% faster but used 1/4th the wall power.)
grishka
2 days ago
These kinds of SIMD instructions are usually used by things like media codecs and DSPs. They would include several versions of the performance-critical number-crunching code and would pick the best one at runtime depending on which SIMD instructions your CPU supports.
mystified5016
2 days ago
If and only if someone has taken time to write specific optimizations for your specific CPU.
In embedded land, if your microcontroller is unpopular, you don't get much in the way of optimization. The assembly GCC generates is frankly hot steaming trash and an intern with an hour of assembly experience can do better. This is not in any way an exaggeration.
I've run into several situations where hand-optimized assembly is tens of times faster than optimized C mangled by GCC.
I do not trust compilers anymore unless it's specifically for x86_64, and only for CPUs made this decade.
noitpmeder
10 hours ago
I'm curious! Can you provide an example of something gcc does poorly that you think such an intern could actually improve upon?
m463
3 days ago
I've used "machine code" plenty of times in production as inline assembly encoded in other languages.
I vaguely recall Ada inline assembly looking like function calls, with arguments that sometimes referenced high-level-language variables.
Unrelated to that, I distinguish between machine code, which is binary/hex, and assembly as symbolic or macro assembler, which can actually have high-level macros and other niceties.
And one thing I can say for sure. I took assembly language as my second computer course, and it definitely added a lifelong context as to how machines worked, what everything translated to and whether it was fast or efficient.
HeyLaughingBoy
3 days ago
LOL. As a poor college student, I couldn't even afford an assembler. Programming my CoCo (Radio Shack Color Computer) had to be done either with the built-in BASIC, or by POKEing in machine codes for programs that I hand assembled. One of the nice things about Motorola (CoCo was based on MC6809E) assembly languages is that the processors were very regular and it was easy to remember the opcodes and operation structures.
A friend of mine who also had a CoCo wrote an assembler as a term project.
WalterBright
3 days ago
Sometimes you don't really want to write in assembler. Take loading a constant into a register on AArch64: the instructions to do it are pretty wacky, and it's hard to see whether you wrote the correct combination to load the value. Best to let the compiler do it for you (or use godbolt.org to get the right mix). The same goes for floating-point constants.
Once I got the code sequences for this right on my AArch64 code generator, I don't have to ever figure it out again!
fuhsnn
3 days ago
This post was very helpful when I tried to figure it out: https://dougallj.wordpress.com/2021/10/30/bit-twiddling-opti...
WalterBright
3 days ago
I could have used that during the hours I spent figuring it out!
vardump
3 days ago
I expect a sufficiently good macro assembler should be able to do it as well.
WalterBright
3 days ago
Then you might as well use a HLL.
kragen
3 days ago
Programming in assembly is slow. It takes a long time to make things that way; as Julia Ecklar sings, it's kind of like construction work with a toothpick for a tool. (https://www.youtube.com/watch?v=WZCs4Eyalxc) But that's also true of knitting (https://journal.stuffwithstuff.com/2025/05/30/consider-knitt...), crochet, plasterwork, childrearing, calligraphy, gardening, carving marble, hand-soldering electronics, watching sunrises, and solving crossword puzzles.
If you have a six-day deadline, probably it would be better to use a high-level language instead.
But, when you have time for them, all of these things are intrinsically rewarding. Not all the time! And not for everyone! But for some of us, some of the time, they can all be very enjoyable. And sometimes that slow effort can achieve a result that you can't get any other way.
I haven't written that much assembly, myself. Much less than you have. If I had to write everything in assembly for years, maybe I wouldn't enjoy it anymore. I've written a web server, a Tetris game, some bytecode interpreters, a threading library, a 64-byte VGA graphics demo, a sort of skeletal music synthesizer, and an interpreter for an object-oriented language with pattern-matching and multiple dispatch, as well as a couple of compilers in high-level languages targeting assembly or machine code. All of these were either 8086, ARM, RISC-V, i386, or amd64; I never had to suffer through 6809 or various microcontrollers.
Maybe most important, I've never written assembly code that someone else depended on working. Those programs I've mostly written in Python, which I regret now. It's much faster that way. However, I've found it useful in practice for debugging C and C++ programs.
I think that a farmer who says, "For the vast majority of consumers, gardening offers absolutely no benefit," is missing the point. It's not about easier access to parsley and chives. Similarly for an author who says, "For the vast majority of readers, solving crossword puzzles offers absolutely no benefit."
So I don't think assembly sucks.
alcover
3 days ago
> as Julia Ecklar sings, it's kind of like construction work with a toothpick for a tool.
I was taught assembler
in my second year of school.
It's kinda like construction work —
with a toothpick for a tool.
So when I made my senior year,
I threw my code away,
And learned the way to program
that I still prefer today.
Now, some folks on the Internet
put their faith in C++.
They swear that it's so powerful,
it's what God used for us.
And maybe it lets mortals dredge
their objects from the C.
But I think that explains
why only God can make a tree.
For God wrote in Lisp code
When he filled the leaves with green.
The fractal flowers and recursive roots:
The most lovely hack I've seen.
And when I ponder snowflakes,
never finding two the same,
I know God likes a language
with its own four-letter name.
Now, I've used a SUN under Unix,
so I've seen what C can hold.
I've surfed for Perls, found what Fortran's for,
Got that Java stuff down cold.
Though the chance that I'd write COBOL code
is a SNOBOL's chance in Hell.
And I basically hate hieroglyphs,
so I won't use APL.
Now, God must know all these languages,
and a few I haven't named.
But the Lord made sure, when each sparrow falls,
that its flesh will be reclaimed.
And the Lord could not count grains of sand
with a 32-bit word.
Who knows where we would go to
if Lisp weren't what he preferred?
And God wrote in Lisp code
Every creature great and small.
Don't search the disk drive for man.c,
When the listing's on the wall.
And when I watch the lightning burn
Unbelievers to a crisp,
I know God had six days to work,
So he wrote it all in Lisp.
Yes, God had a deadline.
So he wrote it all in Lisp.
drob518
3 days ago
True. I love assembly language, but I write my code in Clojure.
chubot
3 days ago
That's kind of how I feel about C. C is fun, because you get to see "everything"
But C is slow to create -- it is like using a toothpick
Writing from scratch is slow, and using C libraries also sucks. Certainly libc sucks, e.g. returning pointers to static buffers, global vars for Unicode, etc.
So yeah I have never written Assembly that anybody needs to work, but I think of it as "next level slow"
---
Probably the main case where C is nice is where you are working for a company that has developed high quality infrastructure over decades. And I doubt there is any such company in existence for Assembly
commandlinefan
3 days ago
yeah, it's not _scary_. It's just tedious.