

ChatGPT is just enforcing the 4th law of robotics.
I Have No Mouth, and I Must Scream.
It was originally published as part of a book compilation, and it was kind of hard to find a file with just that story and without typos all over the place.
One small old book that used to be hard to find in good quality. I’ve had it seeding for years, and since it’s so small and at risk of being lost, I’ve never taken it down.
They can and they will just lobby the Commission or the EU Parliament if needed.
Do you have a proper robots.txt file?
Do they do weird things like invalid URLs or invalid POST attempts? Weird user agents?
Millions of hits from the same IP sounds much more like vulnerability probing than a crawler.
If that’s the case, fail2ban or CrowdSec. It should be easy to set up a rule to ban an inhuman number of hits per second on certain resources.
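For the rate rule, a fail2ban pair along these lines should do it (a minimal sketch for a typical nginx setup; the jail name, thresholds, and log path are assumptions to adapt):

    # /etc/fail2ban/filter.d/fast-crawler.conf (hypothetical filter name)
    [Definition]
    # Match every request line in the nginx access log; the actual
    # rate limit comes from findtime/maxretry in the jail below.
    failregex = ^<HOST> -.*"(GET|POST|HEAD)

    # /etc/fail2ban/jail.local
    [fast-crawler]
    enabled = true
    port = http,https
    filter = fast-crawler
    logpath = /var/log/nginx/access.log
    # more than 100 hits within 10 seconds from one IP: ban it for a day
    findtime = 10
    maxretry = 100
    bantime = 86400

CrowdSec ships ready-made scenarios for aggressive crawling if you prefer that route.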
How do you know it’s “AI” scrapers?
I’ve had my server up since before AI was a thing.
It’s totally normal to get thousands of bot hits and to get scraped.
I use CrowdSec to mitigate it. But you will always get bot hits.
You can also use the shorter version: .clone();
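A minimal sketch of what I mean, assuming this is Rust (the surrounding code isn’t shown, so the longer form here is only illustrative):

    // Both forms produce an owned copy; the method-call syntax is shorter.
    fn main() {
        let a = String::from("hello");
        let b = String::clone(&a); // longer, fully-qualified form
        let c = a.clone();         // shorter method-call form
        println!("{a} {b} {c}");
    }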
I don’t know who these people are. And they have managed, in record time, to make me never want to hear about them again.
Nah. That analogy does not work.
The piracy situation is more like this: you have made a cool statue, and you charge people money to look at it. Then someone comes, looks at your statue, and goes away without paying.
There’s no thief; nothing was stolen at any point. The one who came looking without paying was probably never going to pay for a ticket anyway, and the statue can still be looked at by anyone. Nothing is lost in the process, no harm is done. Some guy just looked at a statue without paying for it.
Piglet is still there in the morning though.
I have it on Docker with two volumes, ./config and ./cache (sketched below).
I back those up before each update.
A bad Jellyfin update should not mess with your media folder in any way, though you should have backups of that as well, as a rule of thumb.
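Something like this, for reference (a minimal sketch; the media path and port are placeholders, not my exact setup):

    # docker-compose.yml (minimal sketch, placeholder paths)
    services:
      jellyfin:
        image: jellyfin/jellyfin
        ports:
          - "8096:8096"
        volumes:
          - ./config:/config
          - ./cache:/cache
          # media mounted read-only, so an update cannot touch it
          - /path/to/media:/media:ro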
A good human translator is always the best solution.
But if the choice were between crappy Google Translate and an LLM, I would take the LLM translation.
There’s no excuse for a big studio; they should hire translators. But for indie creators without a budget it can be the best way to get their creation to more people.
I’ve been using Jellyfin for years.
My best recommendation is: DELAY UPDATES, and back up before you update.
I have a history of updates breaking everything, so you should be careful with them.
All software recommends backing up before an update, but with Jellyfin the shit is real: you really want to back up.
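The backup itself can be as simple as this (a sketch assuming the Docker setup above; adjust the service name and paths to your install):

    # Stop the container, snapshot config and cache, then update.
    docker compose stop jellyfin
    tar czf "jellyfin-backup-$(date +%F).tar.gz" config cache
    docker compose pull jellyfin
    docker compose up -d jellyfin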
I’m glad. But don’t get your hopes up because of this. The Commission could (and probably will) just say “we have considered it and we are going to do nothing”.
I signed.
But I’m 100% sure that the decision will be “no”.
But anyway it’s good to make more people aware of the issue.
In my experience LLMs get vastly better results than traditional translation software.
Also, Google Translate is not traditionally suited for long, coherent text. One particular issue is tone: a proper translation takes into account not only the words but also the tone and the way the subject is being treated. Google Translate cannot take that into account; with an LLM, the user can tweak the tone to match that of the original text with better accuracy.
And anyway, if Google Translate is not already using LLMs for translation, it soon will be. The results are just better. It’s one of the tasks that language models are actually good at.
Anyhow, what’s the issue if an automatic translation is done with one piece of software or another? Just use whatever gives the best results and is most convenient for the developer.
How does it differentiate an “AI crawler” from any other crawler? A search engine crawler? Someone gathering data to offer statistics? Archiving?
This is not good. They are most likely doing the crawling themselves and then selling the data to the highest bidder. That bidder could obviously be OpenAI, for all we know.
They just know that by throwing in the sentence “this is anti-AI”, a lot of people are not going to question anything.
Last year. It still had a lot of lag when moving around on my machine.
Taking planes: another big CO2 contributor. The sky is full of planes burning fuel.