TL;DR: They’re adding opt-in alt-text generation for blind people, and an opt-in AI chat sidebar where you can choose the model used (including self-hosted ones).

    • dustyData@lemmy.world · 6 months ago

      Self-hosted and locally run models also go a long way. 90% of LLM applications don’t require users to surrender their devices, data, privacy and security to big corporations. But that is exactly how the space is being run right now.

      • LWD@lemm.ee · 6 months ago

        And yet, Mozilla went for the 10% that do violate your privacy and give your data to the biggest corporations: Google, Microsoft, OpenAI.

        What happened to the Mozilla Manifesto?

        • xor@lemmy.blahaj.zone · 6 months ago

          The alternative is only supporting self-hosted LLMs, though, right?

          Imagine the scenario: you’re a visually impaired, non-technical user. You want to use the alt-text generation. You’re not going to go and host your own LLM, you’re just going to give up and leave it.

          In the same way, Firefox supports search engines that sell your data, because a normal, non-technical user just wants to Google stuff, not read a series of blog posts about why they should actually be using something else.

          • LWD@lemm.ee · 6 months ago

            The alt text generation is done locally. That was the big justification Mozilla used when they announced the feature.

            I’m talking about the non-local ChatGPT stuff.

            • xor@lemmy.blahaj.zone · 6 months ago

              Ah, I missed that the alt text specifically is local, but the point stands: allowing (opt-in) access to a third-party service is reasonable, even if that service doesn’t have the same privacy standards as Mozilla itself.

              To pretty much every non-technical user, an AI sidebar that won’t work with ChatGPT (the equivalent of Google search in my earlier example) may as well not be there at all.

              They don’t want to self-host an LLM; they want the box where ChatGPT goes.

              • LWD@lemm.ee · 6 months ago

                But the alt-text generation already leverages a locally run LLM. So either Mozilla is going to bake hundreds of extra megabytes of data into their installs, or people with accessibility issues are going to have to download something extra anyway. (IIRC it’s the latter.)

                We could talk all day about things Mozilla could add out of the box that would make the user experience better. How about an ad blocker? They could be like Opera, Brave, Vivaldi, or even the most ambitious Firefox fork, LibreWolf.

                But for some reason they went with injecting something into Firefox that nobody was asking for, and I don’t think it aligns at all with the average Firefox user’s needs or wants. Normies don’t use Firefox; they use a browser that doesn’t trigger “switch to Chrome or Edge” messages. And if there was some subset of Firefox users who were begging Mozilla for AI, I never saw them. Where were they?

    • LWD@lemm.ee · 6 months ago

      If it was truly opt-in, it could be an extension. They should not be bundling this with the browser, bloating it more in the process.

      AI already has ethical issues, and environmental issues, and privacy issues, and centralization issues. You technically can run your own local AI, but they hook up to the big data-hungry ones out of the box.

      Look at the Firefox subreddit. One month ago, people were criticizing the thought of adding AI to Firefox. Two months ago, same thing. Look at the Firefox community. See how many times people requested AI.

      • barryamelton@lemmy.ml · 6 months ago

        If it was truly opt-in, it could be an extension. They should not be bundling this with the browser, bloating it more in the process.

        The extension API doesn’t have enough access for this.

        You technically can run your own local AI, but they hook up to the big data-hungry ones out of the box.

        While it is opt-in and disabled by default, this is the real problem.

        • LWD@lemm.ee · 6 months ago

          What are they missing? So far, all they’ve added is a sidebar and a couple of extra right-click menu additions, and both of those are available to all extensions.
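
          For reference, a bare-bones WebExtension can already register both of those surfaces. Here’s a rough sketch in TypeScript, assuming the standard WebExtensions browser global (e.g. via webextension-polyfill typings); the ids and wording are made up:

              // Background script of a hypothetical extension; its manifest would
              // declare "permissions": ["menus"] and a "sidebar_action" entry
              // pointing at the extension's own sidebar page.

              // Add a right-click item that appears when text is selected.
              browser.menus.create({
                id: "ask-model",                  // illustrative id
                title: 'Ask model about "%s"',    // %s is replaced with the selected text
                contexts: ["selection"],
              });

              // On click, open the extension's sidebar (a menu click counts as the
              // user action that sidebarAction.open() requires).
              browser.menus.onClicked.addListener((info) => {
                if (info.menuItemId === "ask-model") {
                  browser.sidebarAction.open();
                }
              });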

      • slazer2au@lemmy.world · 6 months ago

        Look at the Firefox subreddit. One month ago, people were criticizing the thought of adding AI to Firefox. Two months ago, same thing. Look at the Firefox community. See how many times people requested AI.

        I believe what most people, myself included, were concerned about was the AI features being enabled automatically, so that you then have to go and disable them, the way every other application does to inflate metrics.

        Because this is opt-in, as the blog says, I’m OK with it being there and disabled.

  • ScreaminOctopus@sh.itjust.works · 6 months ago

    Will you need your own account for the proprietary ones? Mozilla paying for these feels like it couldn’t be sustainable long term, which is worrying.

  • Xuderis@lemmy.world · 6 months ago

    But what does it DO? How is it actually useful? An accessibility PDF reader is nice, but AI can do more than that.

    Our initial offering will include ChatGPT, Google Gemini, HuggingChat, and Le Chat Mistral

    This is great, but again, what for?

    • Blisterexe@lemmy.zip (OP) · 6 months ago

      A lot of people use LLMs a lot, so it’s useful for them. But it’s also nice for summarizing long articles you don’t have the time to read; not as good as reading them, but better than skimming.

      • Rogério Bordini@ursal.zone · 6 months ago

        @Blisterexe @Xuderis It’s true; as a researcher, these models have helped me a lot in speeding up reading and finding specific information in scientific articles. As long as it is privacy-respecting, I view this implementation favorably.

        • Blisterexe@lemmy.zip (OP) · 6 months ago

          It lets you use any model: while it supports ChatGPT, it also lets you use a self-hosted model if you edit about:config.
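
          For anyone who wants to try that, the about:config prefs involved look roughly like this (pref names recalled from Mozilla’s Nightly blog post, so treat them as an assumption and double-check in your version):

              browser.ml.chat.enabled          true                    (turns the chatbot sidebar on)
              browser.ml.chat.hideLocalhost    false                   (shows the localhost option in the provider list)
              browser.ml.chat.provider         http://localhost:8080   (URL of your own model server, e.g. a llamafile)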

  • pitbuster@lemmy.ml · 6 months ago

    They’re adding opt-in alt-text generation for blind people

    No, that’s not useful at all, but Mozilla refused to listen to the blind community.

      • pitbuster@lemmy.ml · 6 months ago

        Because good alt text needs to be highly context-dependent, you can’t automate it. The better alternatives we have right now are crowd-sourced alt-text sites, where volunteers write the descriptions.

        • xor@lemmy.blahaj.zone · 6 months ago

          There are plenty of situations where even contextless generated alt text is a huge improvement over no alt text at all.

          • pitbuster@lemmy.ml · 6 months ago

            You should read what the blind community thinks about it instead of making blanket assumptions.

        • ahal@lemmy.ca · 6 months ago

          I think you have a very optimistic view of how far crowd-sourcing this is going to take us.

          BTW, you think web developers aren’t already using editors that use AI to generate alt text automatically? AI alt text is going to be everywhere regardless.

          Also I’m not saying that’s a good thing. It’s just an inevitable thing.

          • pitbuster@lemmy.ml · 6 months ago

            Also I’m not saying that’s a good thing. It’s just an inevitable thing.

            Then why respond, when I was talking about its usefulness and the fact that the blind community was not heard by the tech bros?

        • pitbuster@lemmy.ml · 6 months ago

          Because good alt text needs a lot of context, it must be done by humans, for humans, at our current state of tech.

          • johnyma22@lemmy.ml · 6 months ago

            The old ‘let perfection be the enemy of good’ argument…

            Surely this is a step in the right direction?

            • pitbuster@lemmy.ml · 6 months ago

              It’s not. Listen to the blind community instead of making assumptions (I mentioned that in my first comment).