• Rose56 · 14 points · 6 hours ago (edited)

    Do they care? No! Will they push more AI? Yes! Will they listen to the consumers? I don’t think so.
    The same thing has happened with a lot of products over the years. Companies push new stuff that we don’t want, and a year later it becomes a regular thing! They push AI day by day, from AI chat help on websites to in-app AI assistants. Do consumers like it? No, but you’re still gonna find it everywhere! And now they push it into computers, and look what happens! No sales!

    Call me crazy, but at some point, they need to look at their data or their consumers and do the right thing.

    • V0ldek@awful.systems · 13 points · 3 hours ago

      It’s maddening that they did actually take away the headphone jack from all modern phones and there’s nothing we can do about it even though it objectively sucks

        • CarrotsHaveEars@lemmy.ml · 1 point · 47 minutes ago

          “The perfect size of the screen is ((3.5 + (year - 2010) * 0.5)) inches.”

          STFU. Make phones small like iPhone 4 again.
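
For what it’s worth, the mocked trend line is easy to plug numbers into; a toy sketch (the formula is the commenter’s satire, not a real industry metric):

```python
def perfect_size(year: int) -> float:
    # The satirical "perfect screen size" formula quoted above, in inches.
    return 3.5 + (year - 2010) * 0.5

print(perfect_size(2010))  # 3.5 (the iPhone 4 era)
print(perfect_size(2025))  # 11.0
```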

      • Korhaka@sopuli.xyz · 1 point · 3 hours ago

        there’s nothing we can do about it

        Outright rejection of their shit, I won’t buy new smartphones from them. Currently using a dumbphone although the case is breaking and they don’t make this one any more. Nokia could work but costs quite a bit tbh. Getting rid of the phone entirely is tempting.

        If I ever buy a smartphone again it will be the cheapest second-hand thing I can find. Maybe I won’t even take it out of the house; it can stay at home like a landline and will be restricted to LAN connections at most.

    • Donkter@lemmy.world · 2 points · 2 hours ago

      Some More News had the best take on this, imo. It’s clear: these companies spent the last five-plus years pouring billions into AI. And right now, AI isn’t up to snuff, it can’t do everything that was promised, so now these companies are seeking a return on their investment.

      Unfortunately for us, these companies are brands like Google, Apple, and Microsoft, which means that their solution to AI not being a sellable product on its own is to shove it into everything they produce without asking the consumer. They now have to figure out how to make money off of it, which is difficult because AI doesn’t do anything supremely useful right now. Its writing is passable but amateurish, its legal work has too many flaws for it to replace anyone, its art is 100% clockable and nobody likes it, and its ability to serve better ads is unproven so far. So we get shit like AI-powered laptops that don’t sell.

    • Bytemeister@lemmy.world · +7/−1 · 6 hours ago (edited)

      Microsoft pushing a feature that most users will never use or care about? Never!

      Laughs in Windows 8 optimized for touchscreens

    • DeathsEmbrace@lemm.ee · +6/−1 · 6 hours ago

      It’s because they’re looking at data, and a lot of you forget that. They don’t care if realistically everyone hates it, so long as the data says everyone would use and benefit from it. Why is that so much more important? If you look at the marketing behind AI, they faked this entire industry by showing companies the “right” data to get them to back it up, but it’s just manipulation from the industry to make something profitable, like NFTs.

  • Rin@lemm.ee · 23 points · 11 hours ago

    AI on phones peaked with MS Cortana on W10 Mobile circa 2014. “Remind me to jack off when I’m home”. And it fucking did what I wanted. I didn’t even have to say words, I could type it into a text box… it also worked offline.

    • filcuk@lemmy.zip · 11 points · 8 hours ago

      Bad news for people who use google: they’ve removed the same feature, so their assistant is more useless than Cortana a decade ago (only a mild exaggeration)

    • morbidcactus · 8 points · 9 hours ago

      Seriously missed an opportunity to bring that back as their agent.

      Legitimately though, Cortana was pretty great. There was a feature to help plan commutes (before I went more or less full remote); all it really did was watch traffic and adjust a suggested time to depart, but it was pretty nice.

      I say it every time someone mentions WP7/8/10: those Lumia devices were fantastic and I totally miss mine. The 1020 had a fantastic camera on it, especially for the early 2010s.

      • lohky@lemmy.world · 4 points · 7 hours ago

        I loved my Lumia. I have the windows phone launcher on my phone currently haha

  • RaptorBenn@lemmy.world · 44 points · 13 hours ago

    Imagine that: a fledgling new technology ham-fistedly inserted into every part of the user experience, offering meager functionality in exchange for the most aggressive data-privacy invasion ever attempted at this scale, and no one likes it.

  • normalexit@lemmy.world · 15 points · 11 hours ago

    If I want AI, I have a multitude of options. It’s baked into my editors and easily available on the web. I just paste some crap into a text box and we’re off to the races.

    I don’t want it in my OS. I don’t want it embedded in my phone. I’ll keep disabling it as long as that is an option.

    • drislands@lemmy.world · 13 points · 11 hours ago

      A friend of mine is a streamer. On his discord, the topic of the Switch 2 came up, and one of his fans stated their desire for it to support 3D TV. Rather than saying my gut reaction – “are you crazy?” – I simply asked why. I consider it a great moment of personal self control.

      • TastyWheat@lemmy.world · 8 points · 11 hours ago

        I mean, the thought of big-screen 3DS emulation would be pretty fun, but yeah, that technology died a decade ago. That’s like asking why the Switch 2 doesn’t have a slot for SNES carts!

      • ByteJunk@lemmy.world · +11/−2 · 15 hours ago

        3D TVs I can see happening, if there’s some breakthrough that fixes the current tech shortcomings.

        But NFTs, and blockchain in general? Hahahahhah.

        • Bytemeister@lemmy.world · +2/−1 · 6 hours ago

          Too many separate components need to improve.

          1. Hardware. Right now, 3D TVs require special glasses, or they only support a single viewer in a very narrow viewing range.

          2. Content. Movies made for 3D with depth effects are better than old shows remastered to have “pop out” effects. I saw Pacific Rim in IMAX 3D and it was amazing. I also saw Nightmare Before Christmas remastered for 3D and it was fucking terrible.

          3. Infrastructure. Cable/Service providers need to provide services capable of streaming 3D movies consistently with solid performance.

          4. User acceptance. To me, the market is still prioritizing picture, sound, and frame rate over 3D effects. People just don’t care for it right now.

    • froztbyte@awful.systems · 9 points · 14 hours ago

      now that you mention it, kinda surprised I haven’t ever seen a spate of custom 3D-printed turbo buttons from overclocker circles

  • RvTV95XBeo@sh.itjust.works · 51 points · 18 hours ago

    Maybe I’m just getting old, but I honestly can’t think of any practical use case for AI in my day-to-day routine.

    ML algorithms are just fancy statistics machines, and to that end, I can see plenty of research and industry applications where large datasets need to be assessed (weather, medicine, …) with human oversight.

    But for me in my day to day?

    I don’t need a statistics bot making decisions for me at work, because if it was that easy I wouldn’t be getting paid to do it.

    I don’t need a giant calculator telling me when to eat or sleep or what game to play.

    I don’t need a Roomba with a graphics card automatically replying to my text messages.

    Handing over my entire life’s data just so an ML algorithm might be able to tell me what that one website I visited 3 years ago that sold kangaroo testicles was isn’t a filing system. There’s nothing I care about losing enough to go to the effort of setting up Copilot, but not enough to just, you know, bookmark it, or save it with a clear enough file name.

    Long rant, but really, what does copilot actually do for me?

    • Don_alForno@feddit.org · 14 points · 16 hours ago

      Our boss all but ordered us to have IT set this shit up on our PCs. So far I’ve been stalling, but I don’t know how long I can keep doing it.

    • Ledericas@lemm.ee · 10 points · 17 hours ago

      Same here, I mostly don’t even use it on the phone. My bro is into it though, thinking AI-generated pictures are good.

      • RvTV95XBeo@sh.itjust.works · 10 points · 17 hours ago

        It’s a fun party trick for like a second, but at no point today did I need a picture of a goat in a sweater smoking three cigarettes while playing tic-tac-toe with a llama dressed as the Dalai Lama.

        • bampop@lemmy.world · +1/−8 · 14 hours ago

          It’s great if you want to do a kids’ party invitation or something like that

          • meowMix2525@lemm.ee · 4 points · 8 hours ago

            That wasn’t that hard to do in the first place, and certainly isn’t worth the drinking water to cool whatever computer made that calculation for you.

      • Dragonstaff@leminal.space · 5 points · 8 hours ago

        We’ve had speech to text since the 90s. Current iterations have improved, like most technology has improved since the 90s. But, no, I wouldn’t buy a new computer with glaring privacy concerns for real time subtitles in movies.

      • zurohki@aussie.zone · 5 points · 8 hours ago

        I tried feeding Japanese audio to an LLM to generate English subs and it started translating silence and music as requests to donate to anime fansubbers.

        No, really. Fansubbed anime would put their donation message over the intro music or when there wasn’t any speech to sub and the LLM learned that.

      • Bytemeister@lemmy.world · 1 point · 6 hours ago

        You’re thinking too small. AI could automatically dub the entire movie while mimicking the actors’ voices and simultaneously moving their lips and mouths to form the words correctly.

        It would just take your daily home power usage to do a single 2hr movie.

    • AbsentBird@lemm.ee · 8 points · 18 hours ago

      The only feature that actually seems useful for on-device AI is voice to text that doesn’t need an Internet connection.

      • RvTV95XBeo@sh.itjust.works · 9 points · 17 hours ago

        As someone who hates orally dictating my thoughts, that’s a no from me dawg, but I can kinda understand the appeal (though I’ll note offline speech-to-text has been around for like a decade pre-AI)

    • ByteJunk@lemmy.world · +8/−7 · 15 hours ago

      I use it to speed up my work.

      For example, I can give it a database schema and describe what I need to achieve, and most of the time it will throw out a pretty good approximation or even get it right on the first go, depending on complexity and how well I phrase the request. I could write these myself, of course, but not in 2 seconds.

      Same with text formatting, for example. I regularly need to format long strings in specific ways, adding brackets and changing upper/lower capitalization. It does it in a second, and really well.
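
For comparison, the kind of bracket-and-case reformatting described here is a one-liner in most languages, no model required; a minimal sketch in Python (the sample strings are invented for illustration):

```python
# Hypothetical input strings; the transform: uppercase, then wrap in brackets.
names = ["getUser", "set_flag", "parseRow"]
formatted = [f"[{name.upper()}]" for name in names]
print(formatted)  # ['[GETUSER]', '[SET_FLAG]', '[PARSEROW]']
```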

      Then there’s just convenience things. At what date and time will something end if it starts in two weeks and takes 400h to do? There’s tools for that, or I could figure it out myself, but I mean the AI is just there and does it in a sec…
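
That last scheduling question is plain date arithmetic in any standard library; a sketch in Python, assuming an arbitrary example start date:

```python
from datetime import datetime, timedelta

# Assumed example: "two weeks from now" counted from Jan 1, 2025, 09:00.
start = datetime(2025, 1, 1, 9, 0) + timedelta(weeks=2)
end = start + timedelta(hours=400)  # 400 elapsed clock hours later

print(start)  # 2025-01-15 09:00:00
print(end)    # 2025-02-01 01:00:00
```

Note this counts elapsed hours; restricting it to working hours would take a short loop over business days.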

      • V0ldek@awful.systems · 2 points · 3 hours ago

        changing upper/lower capitalization

        That’s literally a built-in VSCode command my dude, it does it in milliseconds and doesn’t require switching a window or even a conscious thought from you

      • morbidcactus · 6 points · 8 hours ago

        Gotta be real, LLMs for queries make me uneasy. We’re already in a place where data modeling isn’t as common and people don’t put indexes or relationships between tables (and some tools didn’t really support those either). They might be alright at describing tables (Databricks has it baked in, for better or worse, for example; it’s usually pretty good at a quick summary of what a table is for), but throwing an LLM on that doesn’t really inspire confidence.

        If your data model is highly normalised, with fks everywhere, good naming and well documented, yeah totally I could see that helping, but if that’s the case you already have good governance practices (which all ML tools benefit from AFAIK). Without that, I’m totally dreading the queries, people already are totally capable of generating stuff that gives DBAs a headache, simple cases yeah maybe, but complex queries idk I’m not sold.

        Data understanding is part of the job anyhow, that’s largely conceptual which maybe LLMs could work as an extension for, but I really wouldn’t trust it to generate full on queries in most of the environments I’ve seen, data is overwhelmingly super messy and orgs don’t love putting effort towards governance.

      • self@awful.systems · 26 points · 14 hours ago

        it’s really embarrassing when the promptfans come here to brag about how they’re using the technology that’s burning the earth and it’s just basic editor shit they never learned. and then you watch these fuckers “work” and it’s miserably slow cause they’re prompting the piece of shit model in English, waiting for the cloud service to burn enough methane to generate a response, correcting the output and re-prompting, all to do the same task that’s just a fucking key combo.

        Same with text formatting, for example. I regularly need to format long strings in specific ways, adding brackets and changing upper/lower capitalization. It does it in a second, and really well.

        how in fuck do you work with strings and have this shit not be muscle memory or an editor macro? oh yeah, by giving the fuck up.

        • CarrotsHaveEars@lemmy.ml · 7 points · 7 hours ago (edited)

          (100% natural rant)

          I can change a whole fucking sentence to FUCKING UPPERCASE by just pressing vf.gU in fucking vim, using a fraction of the energy it takes to run a fucking marathon, which in turn is only a fraction of the energy the fucking AI cloud cluster uses to spit out the same shit. The comparison is like a ping pong ball to the Earth, then to the fucking sun!

          Alright, bros, listen up. All these great tasks you claim AI does faster and better? I can write up a script or something to do them even faster and better. Fucking A! That surge of high you get when you use AI comes from not knowing how to do it, or whether it’s even possible. You!

          You prompt bros are blasting shit tons of energy just to achieve the same quality of work, if not worse, in a much fucking longer time.

          And somehow these executives claim AI improves fucking productivity‽

          • self@awful.systems · 3 points · 2 hours ago

            exactly. in Doom Emacs (and an appropriately configured vim), you can surround the word under the cursor with brackets with ysiw] where the last character is the bracket you want. it’s incredibly fast (especially combined with motion commands, you can do these faster than you can think) and very easy to learn, if you know vim.

            and I think that last bit is where the educational branch of our industry massively fucked up. a good editor that works exactly how you like (and I like the vim command language for realtime control and lisp for configuration) is like an electrician’s screwdriver or another semi-specialized tool. there’s a million things you can do with it, but we don’t teach any of them to programmers. there’s no vim or emacs class, and I’ve seen the quality of your average bootcamp’s vscode material. your average programmer bounces between fad editors depending on what’s being marketed at the time, and right now LLMs are it. learning to use your tools is considered a snobby elitist thing, but it really shouldn’t be — I’d gladly trade all of my freshman CS classes for a couple semesters learning how to make vim and emacs sing and dance.

            and now we’re trapped in this industry where our professionals never learned to use a screwdriver properly, so instead they bring their nephew to test for live voltage by licking the wires. and when you tell them to stop electrocuting their nephew and get the fuck out of your house, they get this faraway look in their eyes and start mumbling about how you’re just jealous that their nephew is going to become god first, because of course it’s also a weirdo cult underneath it all, that’s what happens when you vilify the concept of knowing fuck all about anything.

          • Hexarei@programming.dev · 2 points · 6 hours ago (edited)

            The only things I’ve seen it do better than I could manage with a script or in Vim are things that require natural language comprehension. Like, “here’s an email forwarded to an app, find anything that sounds like a deadline” or “given this job description, come up with a reasonable title summary for the page it shows up on”… But even then those are small things that could be entirely omitted from the functionality of an app without any trouble on the user. And there’s also the hallucinations and being super wrong sometimes.

            The whole thing is a mess

      • Samskara@sh.itjust.works · 8 points · 12 hours ago

        adding brackets and changing upper/lower capitalization

        I have used a system wide service in macOS for that for decades by now.

    • Flipper@feddit.org · +2/−6 · 16 hours ago

      Apparently it’s useful for extracting information out of a text into a format you specify. A friend is using it to extract transactions out of 500-year-old texts. However, to get rid of hallucinations the temperature needs to be 0. So the only way is to self-host.

      • OhNoMoreLemmy@lemmy.ml · 9 points · 15 hours ago

        Setting the temperature to 0 doesn’t get rid of hallucinations.

        It might slightly increase accuracy, but it’s still going to go wrong.

      • daellat@lemmy.world · 8 points · 16 hours ago

        Well, LLMs are capable (but prone to hallucination) and cost an absolute fuckton of energy. There have been purpose-trained, efficient ML models that we’ve used for years. Document understanding and computer vision are great, just don’t use an LLM for them.

  • Detun3d@lemm.ee · +12/−8 · 14 hours ago

    Gen AI should be private, secure, local, and easier for its users to train to fit their own needs. The closest thing to this at the moment seems to be Kobold.

    • VagueAnodyneComments@lemmy.blahaj.zone · 1 point · 2 hours ago

      Nah we’re up to running Qwen3 and Deepseek r1 locally with accessible hardware at this point so we have access to what you describe. Ollama is the app.

      The problem continues to be that LLMs are not suitable for many applications, and where they are useful, they are sloppy and inconsistent.

      My laptop is one of the ones they are talking about in the article. It has an AMD NPU, it’s a 780M APU that also runs games about as well as an older budget graphics card. It handles running local models really well for its size and power draw. Running local models is still lame as hell, not how I end up utilizing the hardware. 😑

  • Kewlio251@midwest.social · 35 points · 21 hours ago

    My problem is that it’s not that fucking useful. I got the Pixel 9 specifically because of its advertised AI chip for the assistant and I swear it’s just gotten worse since the Pixel 7. I used to be able to ask Google anything through the assistant, and now 90% of my questions are answered with “can’t find the information.”

    They also advertised (or at least heavily alluded to) the use of the AI chip when you are in low-network areas, but it works just as well outside of 4G+ coverage as it ever did without the stupid chip.

    What’s the point of adding AI-branded nonsense if there’s no practical use for it? And that doesn’t even start to cover the issues with AI’s reliability as a source of information. Garbage in = garbage out.

    • JayGray91@piefed.social · 3 points · 14 hours ago

      When Gemini can find the information, it adds flowery “social” bullshit before, in the middle of, and after the information I asked for, wasting my time

    • Ledericas@lemm.ee · 4 points · 17 hours ago (edited)

      I didn’t get a Pixel for that reason after my Pixel 5a died; the Exynos chip is significantly weaker than other flagship phones’, and they sacrificed battery capacity/efficiency after the 5a (which was a very defective phone) just to prop up AI.

      We know Google was saving money by not using Qualcomm/Snapdragon chips, which most others are using. AI is just their excuse so they can put less effort into making a quality product.

  • smiletolerantly@awful.systems · 51 points · 23 hours ago (edited)

    That’s not fair! I care! A lot!

    Just had to buy a new laptop for new place of employment. It took real time, effort, and care, but I’ve finally found a recent laptop matching my hardware requirements and sense of aesthetics at a reasonable price, without that hideous copilot button :)

      • smiletolerantly@awful.systems · 6 points · 9 hours ago

        Decided on this:

        Still had some issues under Linux / NixOS a couple of weeks ago (hardware-wise everything worked, but specific programs, esp. Librewolf, will randomly start eating CPU and battery out of nowhere, with what look like no-ops). Haven’t investigated further yet.

        • JayGray91@piefed.social · 4 points · 7 hours ago

          Sweet, glad to know it generally works with Linux. This is available in my part of the world. Been shopping around for a personal for-work laptop since my company is stingy, and I plan to move on anyway.

          • smiletolerantly@awful.systems · 1 point · 21 minutes ago

            It generally works, yes, but I’d hold off for another month or two in the hopes of the issues being resolved in the kernel

        • psivchaz@reddthat.com · 2 points · 6 hours ago

          I really wanted to like that laptop but the screen is so incredibly glossy that unless you’re in a totally dark room it becomes a mirror.

          • smiletolerantly@awful.systems · 1 point · 19 minutes ago

            I think it’s a matter of preference. Haven’t noticed the screen being a mirror yet, but then again I feel like any even mildly matte screen looks like it’s being viewed through a veil…

            I am a bit worried/curious about how the oled will deal with my very static waybars though, lol

      • thermal_shock@lemmy.world · 5 points · 22 hours ago (edited)

        How are they bootlocked? You just need the right ISO. I’ve done it: I didn’t know they came with Linux for this particular client, and they put Windows on it; I had to get a specific ISO to reinstall when they borked it.