I’ve never written a single line of code in assembly, and I’m now curious
A little late to this comment, but there are some assembly video games out! They’re puzzles that give you the gist of how assembly works.
I really enjoyed TIS-100. I just never got around to beating it.
Oh, that’s pretty cool. Thank you
I learned Z80 assembly back when the cutting edge of technology was a ZX Spectrum, and 68k assembly when I upgraded to an Amiga. That knowledge served me quite well in my early career in industrial automation - it was hard real-time coding on eZ80s and 65C02 processors, but the knowledge transfers.
Back in the day, when input was mapped straight to a memory location and the display output was another memory location, assembly seemed like magic. Read the byte that corresponds to the right-hand middle row of the keyboard, check whether a certain bit is set in that byte, and if it is, a key is held down. Call your subroutine that copies a sequence of bytes into a known location. Boom, pressing a key updates the screen. Awesome.
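Roughly the same idea in C terms, just to make the flow concrete - the address, the bit, and the glyph below are made up for illustration (and the real Spectrum actually reads the keyboard through an I/O port), but the shape is the same:

    #include <stdint.h>

    #define KEY_ROW  ((volatile uint8_t *)0xBFFE)  /* made-up input address */
    #define SCREEN   ((volatile uint8_t *)0x4000)  /* made-up display memory */
    #define KEY_BIT  (1u << 4)                     /* made-up bit for one key */

    static const uint8_t glyph[8] = {              /* 8x8 bitmap to draw */
        0x42, 0x42, 0x42, 0x7E, 0x42, 0x42, 0x42, 0x00
    };

    void poll_key(void) {
        if (*KEY_ROW & KEY_BIT) {                  /* bit set: key held down */
            for (int i = 0; i < 8; i++)            /* copy bytes to the screen */
                SCREEN[i] = glyph[i];
        }
    }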
Modern assembly (x64 and the like) has masses of rules about pointer alignment for the stack, which you deal with so often you might as well write a macro for it. Since the OS doesn’t let you write to system memory any more (a good thing), you need to make system calls and call library functions to do the same things you used to do directly. You do that so often that you might as well write a macro for that as well. Boom, now your assembly looks almost exactly like C. Might as well learn that instead.
In fact, that’s almost the purpose of C - a more readable, somewhat portable assembly language. Experienced C developers know which sequence of opcodes to expect from any language construct. It’s quite a simple mapping in that regard.
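A toy example of that mapping (exact output varies by compiler and flags, but the shape is predictable):

    int add_one(int x) {
        return x + 1;
    }

    /* On x86-64 you'd expect a compiler to emit roughly:
     *     lea  eax, [rdi + 1]   ; result = first argument + 1
     *     ret
     */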
It’s handy to know a little assembly occasionally, but unless you’re writing, e.g., crypto implementations, which must take exactly the same time and power to execute regardless of the input, it’s impractical for almost any purpose nowadays.
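(The canonical example there is constant-time comparison. In C it looks something like this - a textbook sketch of the idea, not a vetted library routine:)

    #include <stddef.h>
    #include <stdint.h>

    /* Returns 0 if the buffers match, nonzero otherwise. Unlike memcmp,
     * it always touches every byte, so the running time doesn't depend
     * on where the first mismatch happens to be. */
    int ct_compare(const uint8_t *a, const uint8_t *b, size_t len) {
        uint8_t diff = 0;
        for (size_t i = 0; i < len; i++)
            diff |= a[i] ^ b[i];   /* accumulate differences, no branches */
        return diff;
    }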
Very interesting. Thank you. I may start looking into C instead. I’ll still watch a couple of videos on assembly, just for the hell of it.
Really? It was required when I was in college. We did MIPS, x86, and PIC.
I like it because there are no mysterious things happening to your bits. Every line is an instruction executed. You control the machine. It’s power. It gives you power over the machines.
I went to college for Microbiology and became a programmer on my own after, so nope, never written a single line in assembly and never thought of checking it out either. Just never really crossed my mind. I might start messing with it soon.
I… Don’t recommend it. Rust if anything.
It’s a neat party trick? Helps you understand how a processor works? But for anything modern, it’s way more work than it’s worth.
Go for it, if it’s to satisfy your own curiosity, but there’s virtually no practical use for it these days. I had a personal interest in it at uni, and a project involving coding in assembly for an imaginary processor was a small part of one optional CS course. Over the years I’ve dabbled with asm for 32-bit Intel PCs and various retro consoles; at the moment I’m writing something for the Atari 2600.
In the past, assembly was useful for squeezing performance out of low-powered and embedded systems. But now that “embedded” includes SoCs with clock speeds in the hundreds of MHz and several megabytes of RAM, and optimizing compilers have improved greatly, the tiny potential performance gain (and you have to be very good before you can match or beat most optimizing compilers) is almost always outweighed by the overhead of hand-writing and maintaining assembly.
If you’re curious, I recommend this channel. It often delves deep into the code to explain stuff, as well as how the hardware works. Really fascinating!
This is a very interesting channel. Thank you
That wasn’t required in my CS program; instead we had to design our own instruction set and assembler. Obviously it was an approximation, though.
Get an 8-bit computer emulator, and learn 6502 or Z-80 assembly.
Usborne’s Machine Code for Beginners, or any book by Rodnay Zaks.
It gets deeper from there, and modern CPUs are kind of awful to hand-hack assembly on, but you’ll at least learn how the computer really works!