Maven (famous)@lemmy.zip to Programmer Humor@programming.dev · 1 day ago
**Simple Optimization Trick** [image] · 71 comments
HugeNerd@lemmy.ca · 13 hours ago
Have you seen the insane complexity of modern CPUs? Ain't no one hand coding that like a 6502 in 1985.
musubibreakfast@lemmy.world · 13 hours ago
I wonder if there's anyone alive right now who would be capable of such a task.
Blackmist@feddit.uk · 12 hours ago
If the hardware was fixed, I don't see why not. Might not be as fast as the optimisations compilers do these days, though. If you have to support thousands of types of GPU and CPU and everything else, then fuck no.
skuzz@discuss.tchncs.de · 12 hours ago
Even if one did, say using x86, it would still just be interpreted by the CPU into the CPU's native opcodes, as the legacy instruction sets are interpreted/translated.
BigDanishGuy@sh.itjust.works · 11 hours ago
> as the legacy instruction sets are interpreted/translated.

Wth? That's it, I'm sticking to the AVR then
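For the curious: the translation skuzz describes can be sketched in a few lines. This is a toy Python model with entirely made-up opcodes and micro-ops (nothing here is real x86), just to illustrate the idea that a front-end decoder cracks each "legacy" instruction into one or more simpler internal operations before anything executes.

```python
# Toy decode-and-execute sketch. The opcodes (LOAD/ADDM/STORE) and
# micro-ops (set/add/mov) are invented for illustration; real x86
# decoding into micro-ops is vastly more complex.

program = [
    ("LOAD", "A", 2),        # A = 2
    ("ADDM", "A", 40),       # a "complex" legacy-style add
    ("STORE", "A", "out"),   # write A to a named destination
]

def decode(instr):
    """Translate one legacy instruction into a list of micro-ops."""
    op, *args = instr
    if op == "LOAD":
        reg, val = args
        return [("set", reg, val)]
    if op == "ADDM":
        reg, val = args
        # a complex instruction cracks into multiple micro-ops
        return [("set", "tmp", val), ("add", reg, "tmp")]
    if op == "STORE":
        reg, dest = args
        return [("mov", dest, reg)]
    raise ValueError(f"unknown opcode: {op}")

def run(program):
    """Execute the micro-op stream produced by the decoder."""
    state = {}
    for instr in program:
        for uop in decode(instr):
            kind = uop[0]
            if kind == "set":
                state[uop[1]] = uop[2]
            elif kind == "add":
                state[uop[1]] += state[uop[2]]
            elif kind == "mov":
                state[uop[1]] = state[uop[2]]
    return state

print(run(program)["out"])  # 42
```

The point of the thread stands: even if someone hand-wrote the legacy instruction stream, what the core actually schedules and executes is the decoded micro-op stream, which the programmer never sees.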