The browser addon is not open source anymore, and they were kind of hostile when responding to questions about that.
The server software’s open source version also only supports the free tier’s features. There was a fork that was supposed to add the paid features, but it’s not maintained.
Yes, it’s still open source: https://github.com/languagetool-org/languagetool
And yes, it’s worth it. However, I think that LLaMA-based ML models supplementing LanguageTool will be the future.
Very cool.
Worth what? It’s free! And yes, it’s open source. It can also be self-hosted if you’re paranoid.
I self-host it in a Docker container. You will have to download about 4 gigabytes of “n-gram” data. And there are no AI features in the self-hosted version.
do you have a guide?
Here’s my podman-compose.yml file (you can use it as your docker-compose.yml as well):
version: "3"
services:
  languagetool:
    image: erikvl87/languagetool:latest
    container_name: languagetool
    ports:
      - 8010:8010  # Using the default port from the image
    environment:
      - langtool_languageModel=/ngrams  # OPTIONAL: Using ngrams data
      - Java_Xms=512m  # OPTIONAL: Setting a minimal Java heap size of 512 MiB
      - Java_Xmx=1g    # OPTIONAL: Setting a maximum Java heap size of 1 GiB
    volumes:
      - path/to/ngrams:/ngrams
    restart: always
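To bring it up and make sure the API answers, here’s a quick sanity check (assuming the container ends up reachable on localhost:8010; swap in docker-compose if you’re not on podman):

    # Start the container in the background
    podman-compose up -d

    # Ask the server to check a sentence; it should return JSON with a grammar match
    curl -s -d "language=en-US" -d "text=This are a test." http://localhost:8010/v2/check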
Just download the ngrams from this link and change the path in the file. You can use this with the Firefox or Thunderbird addon: go to the advanced settings in the addon preferences and enter http(s)://YOUR_IP_OR_DOMAIN/v2 as the other server option.
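LanguageTool publishes the n-gram archives on its download page, so filling in the blanks looks roughly like this (the archive name below is a placeholder you’ll need to look up yourself, and the path should match the volume in the compose file):

    # Fetch and unpack the n-gram data for your language into the directory
    # mounted as /ngrams above (placeholder filename, check languagetool.org for the real one)
    mkdir -p path/to/ngrams
    wget https://languagetool.org/download/ngram-data/NGRAMS_FOR_YOUR_LANGUAGE.zip
    unzip NGRAMS_FOR_YOUR_LANGUAGE.zip -d path/to/ngrams

    # Optional: confirm the /v2 endpoint is reachable at the address you give the addon
    curl -s https://YOUR_IP_OR_DOMAIN/v2/languages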