• Stovetop@lemmy.world · 17 days ago

    Finally Apple is ready to use all that training data they say they don’t collect.

        • AliasAKA@lemmy.world · 17 days ago

          Depends. If they get access to the code OpenAI is using, they could absolutely try to leapfrog them. They could also just be looking at ways to get near ChatGPT4 performance locally, on an iPhone. They’d need a lot of tricks, but succeeding there would be a pretty big win for Apple.

          • technocrit@lemmy.dbzer0.com · 16 days ago

            People are really racing to destroy the planet so their phone can make a crappy summary of what’s on wikipedia.

            • AliasAKA@lemmy.world · 16 days ago

              Not even a summary of what’s on Wikipedia, usually a summary of the top 5 SEO crap webpages for any given query.

          • abhibeckert@lemmy.world · (edited) · 16 days ago

            > near ChatGPT4 performance locally, on an iPhone

            Last I checked, iPhones don’t have terabytes of RAM. Nothing that runs on a small battery-powered device is ever going to be in the ballpark of ChatGPT. At least not in the foreseeable future.

            • AliasAKA@lemmy.world · 16 days ago

              They don’t, but with quantization and distillation, as well as clever use of fast SSD storage (they published a paper on exactly this topic last year), you can get a really decent model to run on-device. People are already doing this with models like OpenHermes and Mistral (granted, those are 7B models, but I could easily see Apple doubling the RAM and optimizing models with the research I mentioned above, and getting 40B models running entirely locally). If the base model is good, a 40B model could take care of the vast majority of Siri queries without ever reaching out to a server.

              For what it’s worth, according to their WWDC notes, that’s basically what they’re trying to do.
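              The memory argument above is easy to sanity-check with some back-of-the-envelope arithmetic. A rough sketch (the parameter counts and bit widths are illustrative assumptions, not Apple’s actual figures):

              ```python
              # Rough weight-storage footprint for an LLM at a given quantization level.
              # Ignores activations and KV cache, so real usage would be somewhat higher.

              def model_memory_gb(params_billion: float, bits_per_weight: float) -> float:
                  """Approximate GB needed just to hold the weights."""
                  return params_billion * 1e9 * bits_per_weight / 8 / 1e9

              # A 7B model (e.g. Mistral) at 4-bit quantization:
              print(model_memory_gb(7, 4))   # 3.5 GB -> plausible in phone RAM

              # A hypothetical 40B model at 4-bit:
              print(model_memory_gb(40, 4))  # 20.0 GB -> would need more RAM plus SSD-offloading tricks
              ```

              Which is why the comment above points at doubled RAM and flash-offloading research: 4-bit quantization alone gets a 7B model onto a phone, but 40B needs both.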

    • BigFig@lemmy.world · 17 days ago

      But every Apple user has assured me that iPhones are so much more secure, and that Apple isn’t like mean ol’ Google and toooootally doesn’t collect all the same data from you.

      • Alphane Moon@lemmy.world · 17 days ago

        They will also assure you that Apple totally doesn’t collaborate with the CCP or give them full access to all Chinese users’ data.

        Apple users like to assure people of many things. :)

    • Lucidlethargy@sh.itjust.works · 16 days ago

      It’s just ChatGPT. Nothing new here. Apple hasn’t innovated anything truly new in over a decade. The closest you can get is their processors, and even those are nothing like the innovations they brought us 15–20 years ago.

  • cybersandwich@lemmy.world · 17 days ago

    Buncha wet blankets on Lemmy. JFC.

    I know there is a ton of hype around AI, but at least there is actually something there (unlike crypto).

    This is the most exciting thing to happen with computing in a while and if you read Lemmy you’d think everything is bleak and hopeless.

    There is so much opportunity to change the way we interact with computers and innovate.

    • Clbull@lemmy.world · 16 days ago

      I’m actually worried about the next few years.

      All it takes is for a deep learning algorithm to learn and perform menial tasks better than humans, and that’s it. Suddenly whole industries of workers could be made redundant, which would spike unemployment rates. We are not ready for that.

      And before you say Universal Basic Income will save us, UBI is little more than a leftist pipe dream that would bankrupt any nation that tries to pursue it as an actual policy.

    • technocrit@lemmy.dbzer0.com · (edited) · 16 days ago

      AI peeps are so insecure that they gotta attack crypto (vaguely). Despite its many problems, crypto is a viable alternative to the legacy system of fiat currency that’s literally creating war and destroying the planet. It’s the only truly international currency. AI solves no problem nearly as important.