Yes, but your response is misleading: it insinuates that we have ways to keep those people off of the Fediverse, and we don’t. It’s not possible; all of the kinds of people you’re saying he wants on the Fediverse are already on the Fediverse. Also, small doesn’t equal safer: the Fediverse was small until last year, and for years minorities have dealt with racism and harassment. What he said is ultimately right: we need to make the tools. With better tools people can largely have their preferred experiences; it doesn’t have to be an either/or situation.
I agree that small doesn’t equal safer; in other articles I’ve quoted Mekka as saying that for many Black Twitter users there’s more racism and Nazis on the fediverse than on Twitter. And I agree that better tools will be good. The question is whether, with current tools, growth under the principles of Big Fedi leads to more or less safety. Evan assumes that safety can be maintained: “There may be some bad people too, but we’ll manage them.” Given that the tools aren’t sufficient to manage the bad people today, that seems like an unrealistic assumption to me.
And yes, there are ways to keep these people off the fediverse (although they’re not perfect). Gab isn’t on the fediverse today because everybody defederated it. OANN isn’t on the fediverse today because everybody threatened to defederate the instance that (briefly) hosted them, and as a result the instance decided to enforce its terms of service. There’s a difference between Evan’s position that he wants them to have accounts on the fediverse, and the alternate view that we don’t want them to have accounts on the fediverse (although we may not always be able to prevent it).
OANN and Gab are just two examples of backing down. What about the child porn instances? They are still on the Fediverse; they’re just blocked by lots of instances. Using Gab as the example gives people a false sense of safety.
Or, using Gab provides a sense of what’s possible.
And child porn is a great example – and CSAM more generally. Today’s fediverse would have less CSAM if the CSAM instances weren’t on it. Why hasn’t that happened? The reason that many instances give for not blocking the instances that are well-known sources of CSAM is that CSAM isn’t the only thing on those instances. And it’s true: these instances have lots of people talking about all kinds of things, and only a relatively small number of people spreading CSAM. So not blocking them is completely in alignment with the Big Fedi views Evan articulates: everybody (even CSAM-spreaders) should have an account, and it’s more important to have the good (non-CSAM) people on the fediverse than to keep the bad (CSAM-spreading) people off.
A different view is that whoa, even a relatively small number of people spreading CSAM is way too many, that today’s fediverse would be better if they weren’t on it, and that if the instances allowing CSAM are providing a haven for them, then those instances shouldn’t be on the fediverse. It seems to me that view would result in less CSAM on the fediverse, which I see as a good thing.