Didn’t read the article but headlines like this are extremely naive. If a person takes seriously the existential risk factor of AGI/ASI then that’s by definition the most important thing one can pay attention to. That’s like asking a deeply religious person to stop worrying about hell.
But to build on your analogy: we don’t make regulations based on a religious doctrine anymore in most countries. If your religion says no one is allowed to wear mixed fabrics or eat pork, that’s fine if you’re not doing that, but we’re not banning those things for all of society.
There’s about as much proof for an existential AGI threat as there is for a deity, so let’s not make policies based on either. Focus instead on the real potential and already proven harms of AI.