A Polish programmer running on fumes recently accomplished what may soon become impossible: beating an advanced AI model from OpenAI.
No 🤣
So long as the approach is predictive or algorithm-based, it won’t ever be better at the job.
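To spell out what I mean by “predictive”: at its core, a language model picks the statistically likely continuation of the text it has seen so far. Here’s an absurdly simplified Python sketch of that principle, a bigram counter of my own invention, nowhere near a real LLM, just to show the shape of the job:

```python
from collections import Counter, defaultdict

# Toy illustration of "predictive" generation: given the current word, emit
# whichever word most often followed it in the training text. A real LLM is
# vastly more sophisticated, but the core job is the same: output the likely
# continuation, not the result of reasoning about a problem.

training_text = "the model predicts the next word the model has seen before"

def build_bigrams(text):
    """Count, for each word, which words follow it and how often."""
    counts = defaultdict(Counter)
    words = text.split()
    for current, nxt in zip(words, words[1:]):
        counts[current][nxt] += 1
    return counts

def predict_next(counts, word):
    """Return the most frequent continuation, or None if the word is unseen."""
    if word not in counts:
        return None
    return counts[word].most_common(1)[0][0]

bigrams = build_bigrams(training_text)
print(predict_next(bigrams, "the"))     # -> "model" (followed "the" twice)
print(predict_next(bigrams, "sorted"))  # -> None: never seen, nothing to predict
```

Scale that idea up by billions of parameters and it gets far more capable, but the job description is still “continue the text plausibly”, not “solve the problem”.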
Oml. As someone who can’t wait for AI to advance far enough that I don’t have to fucking code anymore, I’m so sick of this bullshit narrative.
The reality headline is “Barely functional expert outperforms extremely fast junior hire still in probation period.”
We already have AI that can beat this guy, but it takes so much work that it’s still years and years away from covering the things we consider everyday tasks, let alone the complexities of coding, which require the fundamentals of “intelligence”: independent planning and decision-making.
It’s like that other recent news where the media was flabbergasted that the basic coding in a 1970s Atari chess game bested an LLM. “Yeah, no shit. It’s an LLM. Do you even know what AI stands for, or is it just synonymous with magic at this point?”
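To make the contrast concrete: what that Atari-era program does is plain deterministic game-tree search. Here’s a rough Python sketch of the idea using tic-tac-toe instead of chess (my own toy example, nothing to do with the actual Atari code): it enumerates every legal move, scores the outcomes, and picks the best by brute force, which is why it never plays an illegal move or forgets where the pieces are.

```python
# Toy game-tree search (minimax) for tic-tac-toe. A 1970s chess program does
# the same kind of thing at a much bigger scale: enumerate legal moves, score
# the resulting positions, pick the best. No statistics, no training data,
# no prediction of what text "looks right" next.

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if someone has three in a row, else None."""
    for a, b, c in WIN_LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Exhaustively search the game tree from this position.
    Returns (score, best_move): +1 means X wins, -1 means O wins, 0 a draw."""
    won = winner(board)
    if won:
        return (1 if won == 'X' else -1), None
    moves = [i for i, cell in enumerate(board) if cell == ' ']
    if not moves:
        return 0, None                          # board full, no winner: draw
    best_score, best_move = None, None
    for move in moves:
        board[move] = player
        score, _ = minimax(board, 'O' if player == 'X' else 'X')
        board[move] = ' '                       # undo the move (backtrack)
        better = (best_move is None
                  or (player == 'X' and score > best_score)
                  or (player == 'O' and score < best_score))
        if better:
            best_score, best_move = score, move
    return best_score, best_move

if __name__ == "__main__":
    score, move = minimax([' '] * 9, 'X')
    print(f"best opening square for X: {move}, forced result: {score} (0 = draw)")
```

Every move it plays is provably legal and exactly as good as the search says, because the search is the whole method. An LLM, by contrast, can happily output a plausible-looking but illegal move, because plausible-looking is the thing it actually optimises for.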
Common sense is as valuable as a computer science degree at this level.
Edit: OpenAI’s route to doing what they say they can do is using what they can currently do to assist the work of actually getting there… BECAUSE IT’S ALWAYS BEEN A BIG FUCKING JOB. We don’t have any massive breakthroughs; at best we’ve gotten shortcuts. Until we can physically overcome processing limitations, figure out quantum computing or something similar, NO. Just, no. Ffs, it takes climate-destroying levels of computing power just to be at the level of absolute shit we have right now.
I am SO sick of this general-public narrative because people have just been handed food that’s always been there, now served on a plate, and they have no fucking idea how the food got onto the plate. If they did, they’d realise there’s no magic. It’s the same as it’s always been; someone just decided to improve it a little for the consumer.
/rant
I, too, am excited for the mines.
I expect that by then there’ll be more to it than scraping comments off StackOverflow, grounded in a “this is the best source, therefore the output is the best” fallacy.
The knowledge and logic components of intelligence would be nice-to-haves in the artificial version too.