• fubarx@lemmy.world · 2 days ago

    Advantages of running things locally:

    • Saving on electricity, bandwidth, and processing
    • Able to customize for individuals or families
    • Enhanced privacy
    • Option for future federated/mesh applications
    • Keeps running when network/cloud goes down (Hello, AWS!)
  • artyom@piefed.social · 3 days ago

    > This isn’t speculative, it’s real and running, and it doesn’t pose a lot of the ethical dilemmas other AI applications face. Here’s why I think this matters: The consumer doesn’t have to do anything beyond pressing a button to use it.

    1. Whose data is it trained on? Seems like an ethical dilemma to me.

    2. Even worse than with a web-based LLM, people are unlikely to fact-check the often-incorrect information it feeds them.

    3. Using it will not be the complicated part. Setting it up will be.

    • manualoverride@lemmy.world · 2 days ago

      > 1. Whose data is it trained on? Seems like an ethical dilemma to me.

      Using a standalone LLM for personal use doesn’t seem like an ethical dilemma to me: it’s already been trained on the data, and if the data was accessible on the web or via a library, then I don’t see the harm.

      Getting small amounts of medium-trust information on a subject is a good way to get someone interested enough to read a book, watch a YouTube video, or find a website for more information and validate the AI response.

      • artyom@piefed.social · 2 days ago

        > Using a standalone LLM for personal use doesn’t seem like an ethical dilemma to me

        What is the ethical dilemma, exactly, and why/how is this different?

        > Getting small amounts of medium-trust information on a subject is a good way to get someone interested enough to read a book, watch a YouTube video, or find a website for more information and validate the AI response.

        Again, how is this different? At least the web-based ones actually link to where the info came from…

        • manualoverride@lemmy.world · 2 days ago

          We’re talking about home-use AI searches… you said it was unethical, so maybe you should define exactly why you think that?

          Today I wanted to know what the tyre pressures should be for my 2002 Corolla, and AI gave me the answer. I would not have bought a book or gone past the first page of Google for that information.

          The possible ethical dilemma is depriving someone of compensation because I used their research without generating any revenue for them. In reality, I would never have bought a book on tyre pressures or car maintenance, and it’s unlikely I would ever have visited a site where adverts would have paid the contributors.

          Another dilemma is power consumption, but once the model is made, that power is already spent, and my tiny LLM query is going to use far less power locally than a web-based search.
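
          As a rough sanity check on that claim, every number below is an assumption for illustration (a laptop-class GPU draw, a short generation, and a commonly cited ~0.3 Wh ballpark for one cloud LLM query), not a measurement:

          ```python
          # Back-of-envelope energy comparison; ALL figures are assumed.
          local_watts = 60          # assumed GPU draw while generating
          local_seconds = 5         # assumed time for a short answer
          local_joules = local_watts * local_seconds        # 300 J

          cloud_wh = 0.3            # oft-quoted ballpark per cloud query
          cloud_joules = cloud_wh * 3600                    # 1080 J

          print(f"local ~{local_joules} J vs cloud ~{cloud_joules:.0f} J")
          ```

          Either way, per-query energy is tiny next to training, which is the point: the training power is spent whether or not I run queries.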

          For a company that might make money, or achieve cost savings, by using AI trained on data that some intended only for human consumption, I can see how this is not always ethical.

          • artyom@piefed.social · 2 days ago

            > maybe you should define exactly why you think that?

            It’s very simple: copyright. You’re benefitting from someone else’s work without providing them with any compensation for it. That doesn’t suddenly change because the compute happens on your personal computer.

            > Today I wanted to know what the tyre pressures should be for my 2002 Corolla, and AI gave me the answer

            If you had actually looked it up, you might have gotten the correct answer, and learned that it’s printed on the driver’s door jamb of every car.

            > my tiny LLM query is going to use far less power locally than a web-based search

            Why would you think your local LLM would be any more efficient than a web-based one?

            • manualoverride@lemmy.world · 2 days ago

              This was exactly my point: when it’s for home use, the chance of my depriving anyone of revenue is negligible.

              If I’m running a home assistant anyway, not having that assistant constantly connected to the web, relaying my audio out for remote processing and sending results back, will use less power.
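
              As an aside, the audio part really can stay on the machine; a minimal sketch using the open-source whisper package (the model size and file name are illustrative):

              ```python
              # Local speech-to-text: the recording never leaves the box.
              # Requires: pip install openai-whisper (and ffmpeg on PATH).
              import whisper

              model = whisper.load_model("base")        # small CPU-friendly model
              result = model.transcribe("command.wav")  # illustrative file name
              print(result["text"])                     # the recognized command
              ```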

              Finally, thanks to the solar panels on my roof, I can guarantee my searches are powered by 100% sunshine.