• cygnus · 3 months ago

    Who can blame them when the industry itself misrepresents its product, starting with the name.

    • Windex007@lemmy.world · 3 months ago

      It’s SELF reported “knowing little to nothing”.

      As a computer scientist, I can confidently say that most people who confidently say they are well versed in the subject actually still know little to nothing.

    • FaceDeer@fedia.io · 3 months ago

      You’re providing a prime example of misunderstanding. The term AI has been in professional use for a wide variety of algorithms, including machine learning and neural nets like LLMs, since the Dartmouth conference in 1956. It’s the people who only know what AI is like from Star Trek and other such sources that are misinformed.

      • xmunk@sh.itjust.works · 3 months ago

        As a programmer intimately familiar with LLMs and training evaluation… would you mind if I rephrased your comment to “How dare you use the common meaning of our obscure industry jargon that’s mostly just marketing bullshit anyways!”

        The ship for “What does AI mean?” has fucking sailed. AI is an awful term that, in my experience, is vanishingly rarely used by developers outside of “Robots that will kill us” and “Marketing bullshit”. The term needs to die - it implies something much closer to “AGI in a mechasuit with miniguns” rather than “My python code can recognize fuzzy numbers!”

        • howrar · 3 months ago

          Do we have a better word for what has historically been known as AI? I see lots of complaints about X not being AI, but no proposal for what to call them.

          • FaceDeer@fedia.io · 3 months ago

            I don’t know what you mean by “historical”, because the stuff we’ve got now is what is historically known as AI.

            If you mean the Star Trek stuff, though, then the specific terms for those are AGI (Artificial General Intelligence, an AI that’s capable of doing basically everything a human can) and ASI (Artificial Super Intelligence, an AI that’s capable of doing more than what a human can).

            We don’t have AGI yet, but there’s no reason to assume we can’t eventually figure it out. Brains are made of matter, so by fiddling with bits of matter we should eventually be able to make it do whatever a brain can. We have an example showing what’s possible, we just need to figure out how to make one of our own.

            ASI is a little more speculative since we don’t have any known examples of naturally-occurring superintelligence. But I also think it’s a bit unlikely that humans just happen to be the smartest things that can exist.

            • Nik282000 · 3 months ago

              If you mean the Star Trek stuff, though, then the specific terms for those are AGI

              Even in Star Trek only Data, Lore (and Peanut-hamper) were intelligent; all the computers ran on what is being called ‘AI’ now: massive DBs and search algorithms.

              • FaceDeer@fedia.io · 3 months ago

                The ship’s computer could whip up an AGI (Moriarty) in response to a simple command. The Federation later systematized this in the form of emergency holographic officers.

              • xmunk@sh.itjust.works · 3 months ago

                Search algorithms are, depending on the specifics, potentially “AI” now. If we’re tokenizing out vectors and running a straight match query (e.g. postgres full text search), that’s not AI - that’s just string matching. Some of the offerings get into NN-guided or LLM-powered search… these tend to suck, though, because they’re unpredictable and inconsistent. That may just be the novelty of the technology - we’ve had decades to work on small-word exclusion and language-specific dictionary mapping, so it’s possible the consistency will improve. And at least when it comes to searching, everything really good already uses weird heuristics, so it’s not like we can reason about why specific results are preferred; we just know they’re consistent.
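
                For illustration, a rough toy sketch (made-up documents, and a crude bag-of-words vector standing in for a real embedding model) of the difference between straight string matching and similarity-style retrieval:

                ```python
                # Toy comparison: exact keyword matching vs. ranking by vector similarity.
                from collections import Counter
                import math

                docs = {
                    "doc1": "fuzzy number recognition in python",
                    "doc2": "marketing copy about robots",
                    "doc3": "postgres full text search basics",
                }

                def keyword_match(query, docs):
                    """Straight string matching: a doc either contains every term or it doesn't."""
                    terms = query.lower().split()
                    return [name for name, text in docs.items()
                            if all(t in text.lower() for t in terms)]

                def bow_vector(text):
                    """Crude bag-of-words 'embedding' (a stand-in for a learned model)."""
                    return Counter(text.lower().split())

                def cosine(a, b):
                    dot = sum(a[t] * b[t] for t in a)
                    na = math.sqrt(sum(v * v for v in a.values()))
                    nb = math.sqrt(sum(v * v for v in b.values()))
                    return dot / (na * nb) if na and nb else 0.0

                def similarity_search(query, docs):
                    """Rank every doc by similarity instead of requiring an exact hit."""
                    q = bow_vector(query)
                    return sorted(docs, key=lambda name: cosine(q, bow_vector(docs[name])),
                                  reverse=True)

                print(keyword_match("python numbers", docs))      # [] - no exact hit for "numbers"
                print(similarity_search("python numbers", docs))  # still returns a ranking, doc1 first
                ```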

            • howrar · 3 months ago

              the stuff we’ve got now is what is historically known as AI.

              Yeah, and people are complaining that we shouldn’t call it AI anymore because the colloquial usage of the word has changed, so I want to know what alternatives exist.

                • howrar · 3 months ago

                  Yes, you’ve provided the terms that I’m familiar with. That’s not what I’m asking for though. I’m asking for alternatives from people who don’t agree with this terminology.

          • Nik282000 · 3 months ago

            Neural networks, deep learning, generative pre-trained transformers…

            • howrar · 3 months ago

              Those are all very narrow subtopics within AI. A replacement term for “AI” would have to be more general and include the things you’ve listed.

              • Nik282000 · 3 months ago

                Nondeterministic Computing. There is no intelligence in what is now called ‘AI’.

                • FaceDeer@fedia.io · 3 months ago

                  That’s even more “wrong,” though. Plenty of AI is deterministic, and plenty of nondeterministic computing isn’t AI.

                • howrar · 3 months ago

                  Counterexample: There exists an optimal deterministic policy for any MDP.
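
                  In case a concrete example helps - a tiny value-iteration sketch on a made-up two-state MDP (toy numbers, nothing standard): the greedy policy extracted from the converged values picks exactly one action per state, i.e. it's deterministic and optimal.

                  ```python
                  # Value iteration on a made-up 2-state, 2-action MDP.
                  GAMMA = 0.9
                  STATES, ACTIONS = [0, 1], [0, 1]
                  # P[s][a] = list of (next_state, probability); R[s][a] = immediate reward (toy numbers).
                  P = {0: {0: [(0, 1.0)], 1: [(1, 1.0)]},
                       1: {0: [(0, 1.0)], 1: [(1, 1.0)]}}
                  R = {0: {0: 0.0, 1: 1.0},
                       1: {0: 0.0, 1: 2.0}}

                  V = {s: 0.0 for s in STATES}
                  for _ in range(200):  # plenty of sweeps to converge on this toy problem
                      V = {s: max(R[s][a] + GAMMA * sum(p * V[s2] for s2, p in P[s][a])
                                  for a in ACTIONS)
                           for s in STATES}

                  # Greedy policy w.r.t. the converged values: one action per state, no randomness needed.
                  policy = {s: max(ACTIONS,
                                   key=lambda a: R[s][a] + GAMMA * sum(p * V[s2] for s2, p in P[s][a]))
                            for s in STATES}
                  print(policy)  # {0: 1, 1: 1}
                  ```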

          • corsicanguppy · 3 months ago

            Prep-cook. Magician’s Apprentice. Understudy. Artificial intern.

            I use these tools to barf code examples. It’s like asking the prep-cook to get the stock going so you can do other things.

      • cygnus · 3 months ago (edited)

        You’re technically correct, but these products are marketed as though they were like Star Trek’s computer. Do you think it’s a coincidence that Google Assistant’s codename was “Project Majel”?

        • FaceDeer@fedia.io · 3 months ago

          I don’t see what this has to do with the meaning of the term AI.

          If a marketing department somewhere starts trying to push the slogan “ice cream for your car!” as a way to sell gasoline, would it make sense to start complaining that the stuff Ben & Jerry’s is selling isn’t actually gasoline?

          • cygnus · 2 months ago

            This is more like all the petro companies and gas stations coordinating to call gasoline “ice cream”, with the media picking up the term as well, so that everybody suddenly starts calling gasoline ice cream. Some of us are on the sidelines reminding people that “ice cream” has a distinct and different meaning.

  • Nik282000 · 3 months ago

    Almost half of Canadians know little about Windows, macOS, Android, iOS, Linux, TCP/IP networking, file formats, disk partitioning, encryption, SSL certificates…

    There are a thousand things that Canadians SHOULD know about before they even think about ‘AI’.

    • yeehaw · 3 months ago

      Orrrr how electricity works, how to do an oil change on a car, etc. Nobody can know everything so I think the article title is kinda stupid.

    • girlfreddy · 3 months ago

      Problem is people treat their computers and programs exactly like they treat their cars … as long as it starts and goes they don’t care. But when there’s a problem all hell breaks loose.

      Companies do the same thing, putting less importance on IT depts than on stock buybacks.

  • small_crow · 3 months ago

    And the rest overestimate how much they know.

    • RickyWars1 · 3 months ago

      Yup came to comment this. The number is certainly more than half. Second line:

      More than two in five (43%) of Canadians say they know very little or nothing about the topic, reports TECHNATION.

      This is just self-reported.

  • xmunk@sh.itjust.works · 3 months ago

    To any Canadians who don’t know - it’s marketing bullshit. Taco Bell talking about the “crunch factor” of their latest creation is more meaningful than some company saying “We do it with AI now.”

  • /home/pineapplelover@lemm.ee · 3 months ago

    Idk if I understand AI. By AI does it mean it can think and not a glorified if-else statement? Is AI different from machine learning? Idk man.

    • Mkengine@feddit.de · 2 months ago

      When people talk about AI, they’re generally referring to systems or machines that can perform tasks which typically require human intelligence. These tasks might include things like recognizing speech, translating languages, or making decisions. AI isn’t about simulating human consciousness or emotions but about replicating the ability to perform specific intelligent tasks.

      AI systems can range from simple, rule-based algorithms (which might seem like glorified if-else statements) to complex, learning systems. This is where machine learning comes in. Machine learning is actually a subset of AI. It’s a way of achieving AI where the system learns from data. Instead of being explicitly programmed to perform a task, the system is given huge amounts of data and learns patterns or rules from it. Over time, it can make predictions or decisions based on what it has learned.

      So, not all AI is machine learning, but all machine learning is AI. Hope this clears things up a bit!
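
      As a rough illustration (toy task and data, not from any real product), here is the same tiny decision made both ways - hand-written rules versus a rule learned from labelled examples:

      ```python
      # Toy task: flag a message as "spam". Rule-based vs. learned-from-data.

      # 1) Rule-based: behaviour is fixed by hand-written rules (the if-else style).
      def rule_based_is_spam(message):
          suspicious_words = {"winner", "free", "prize"}
          return any(word in message.lower() for word in suspicious_words)

      # 2) Machine learning (extremely simplified): the decision rule is derived from examples.
      def learn_threshold(examples):
          """Learn how many exclamation marks make a message 'spam' (one-feature toy model)."""
          spam = [msg.count("!") for msg, is_spam in examples if is_spam]
          ham = [msg.count("!") for msg, is_spam in examples if not is_spam]
          # Put the threshold halfway between the two class averages.
          return (sum(spam) / len(spam) + sum(ham) / len(ham)) / 2

      training_data = [
          ("You are a winner!!! Claim your prize!!!", True),
          ("Free money, act now!!", True),
          ("Lunch tomorrow?", False),
          ("Meeting moved to 3pm.", False),
      ]
      threshold = learn_threshold(training_data)

      def learned_is_spam(message):
          return message.count("!") > threshold

      print(rule_based_is_spam("Free prize inside"))  # True, because a rule says so
      print(learned_is_spam("Wow!!!!!"))              # True, because the training data said so
      ```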

        • Mkengine@feddit.de · 2 months ago (edited)

          I would say most video games where you can play against the computer (Age of Empires, Call of Duty, etc.) use human-set rules without machine learning. Computer enemies show the same behavior regardless of your specific knowledge of the game. I think nowadays there may already be some games where the computer learns from your behavior via machine learning, but this is not the norm.

          There were also chatbots before ChatGPT existed, which in their most basic form give human programmed answers to specific questions.
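
          Both of those boil down to the same thing - hand-written rules. A hypothetical toy sketch of that kind of “game AI” decision function, no learning involved:

          ```python
          # Toy sketch of classic "game AI": the enemy reacts to the current situation
          # using fixed, human-written rules - no data, no learning, fully predictable.
          def enemy_action(distance_to_player, enemy_health):
              if enemy_health < 20:
                  return "retreat"        # badly hurt: always flee
              if distance_to_player < 5:
                  return "melee_attack"   # close range: swing
              if distance_to_player < 30:
                  return "chase"          # medium range: move toward the player
              return "patrol"             # otherwise wander a fixed route

          print(enemy_action(distance_to_player=3, enemy_health=80))   # melee_attack
          print(enemy_action(distance_to_player=50, enemy_health=80))  # patrol
          ```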

  • Sprawlie@lemmy.world · 2 months ago

    Almost half of Canadians know little to nothing

    probably the most accurate thing I have read all day

  • Swordgeek · 2 months ago (edited)

    The headline missed a few critical words.

    Almost half of Canadians realize they know little to nothing about AI. The rest are deluded.

    I could be charitable and say that 0.01% of Canadians know something about AI, but I’m probably overestimating by leaps and bounds.

    • corsicanguppy · 3 months ago

      I’ve been playing with these ‘AI’ tools, and I’ve gone from ‘hey cool’ to ‘useless git’ in about 3 months.

      It’s a neat trick, but it’s not intelligence; and too many people - especially suits - don’t seem to get that. It’s gonna be fun watching them Broadcom themselves into a smoking ruin.

  • OttoVonNoob · 3 months ago

    I’ve been making a game as a hobby. The more I learn to program, the more I see that AI is great at simple tasks like spellchecking, but it needs fuel (other people’s work) to do anything. On top of that, if you ask it 2+2 it’s going to run every addition statement in existence to determine the best route to 2+2.

  • Lexam · 3 months ago

    Oh well look at that. I thought it was just a new way to spell “eh” ai.