College professors are going back to paper exams and handwritten essays to fight students using ChatGPT

The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.

  • kromem@lemmy.world · +7/−16 · 1 year ago

    Is AI going to go away?

    In the real world, will those students be working from a textbook, or from a browser with some form of AI accessible in a few years?

    What exactly is being measured and evaluated? Or has the world changed, and existing infrastructure is struggling to cling to the status quo?

    Were those years of students being forced to learn cursive in the age of the computer a useful application of their time? Or math classes where a calculator wasn’t allowed?

    I can only imagine how useful a programming class would be, then, where you have to write code with a pen on a blank sheet of paper, no linter in sight.

    Maybe the focus on where and how knowledge is applied needs to be revisited in light of a changing landscape.

    For example, how much more practically useful would test questions be that present a hallucinated wrong answer from ChatGPT and task the students with identifying what is wrong? Or a cross-discipline question that expects ChatGPT usage yet remains challenging because of its scope or nuance?
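    A concrete version of that "hallucinated wrong answer" question might look like the following sketch. The snippet and its planted bug are invented for illustration, in the style of code ChatGPT plausibly produces:

```python
# Exam task: "An AI assistant wrote this binary search. It looks
# plausible but is wrong. Identify and fix the bug."

def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1."""
    lo, hi = 0, len(items) - 1
    while lo < hi:  # planted bug: should be `lo <= hi`, otherwise a
        mid = (lo + hi) // 2  # range narrowed to one element is skipped
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

# With the bug, a target that ends up alone in the search range is missed:
print(binary_search([1, 3, 5, 7], 7))  # prints -1; a fixed version returns 3
```

    Grading such a question rewards exactly the skill being argued for here: recognising when confident-looking output is wrong.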

    I get that it’s difficult to adjust to something that’s changed everything in the field within months.

    But it’s quite likely that a fair bit of how education has been done over the past 20 years of the digital age (itself a gradual transition to a world where the Internet exists) needs major reworking to adapt to these changes rather than simply oppose them; otherwise academia ends up in a bubble, further and further detached from real-world practice.

    • SkiDude@lemmy.world · +28 · 1 year ago

      If you’re going to take a class to learn how to do X, but never actually learn how to do X because you’re letting a machine do all the work, why even take the class?

      In the real world, even if you’re using all the newest, cutting-edge stuff, you still need to understand the concepts behind what you’re doing. You still have to know what to put into the tool, and you have to be able to tell whether what comes out actually works.
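      One minimal way to "know that what comes out works" is to wrap the tool's output in checks you wrote yourself. A sketch, where `slugify` stands in for a hypothetical AI-generated helper:

```python
def slugify(title):
    """Hypothetical AI-generated helper we did not write ourselves."""
    return "-".join(title.lower().split())

# Acceptance checks written from our own understanding of the
# requirements, not copied from the tool's output:
cases = {
    "Hello World": "hello-world",
    "  Spaced   Out  ": "spaced-out",
}
for raw, expected in cases.items():
    assert slugify(raw) == expected, (raw, slugify(raw))
print("all checks passed")
```

      Writing those expected values requires understanding the task, which is the point: the tool does not remove the need to know what correct looks like.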

      If the tool, AI, whatever, is smart enough to accomplish the task without you actually knowing anything, what the hell are you useful for?

      • barsoap@lemm.ee · +3/−4 · 1 year ago

        “why even take the class?”

        To have a piece of paper to get a job.

        For CS this is nothing new: we have been dealing with graduates who can’t program, and with self-taught geniuses, since before the AI boom, so paper credentials just aren’t as important.

        • pinkdrunkenelephants@sopuli.xyz · +4/−2 · 1 year ago

          You don’t need a piece of paper to get a decent job. People go to college to get into fields they personally care about. If all you care about is money, go work in sales.

          • barsoap@lemm.ee · +3 · 1 year ago

            Companies love paper certificates because that means they can outsource judging applicant qualifications. How bad that is differs from company to company and field to field.

            I mean, if you’re a small non-IT company, say a machine shop, and you’re looking for a devops engineer, and you have no IT department and no one understands anything about what that person is going to do beyond “run our website and sales platform”… what else are you going to rely on but paper certificates? Hire a consultancy?

            • pinkdrunkenelephants@sopuli.xyz · +2 · 1 year ago

              Most companies don’t actually require degrees for decent-paying work. Even a lot of IT hiring is based more on documented experience than on having a degree. A degree alone simply doesn’t cut it in that field; you have to prove you can actually do things, which a degree can’t really demonstrate anymore. Degrees are for academic fields.

              Source: Went to college, then switched to sales which required no outside education, learned and earned a lot more.

      • prosp3kt@lemmy.world · +2/−7 · 1 year ago

        But that’s actually most of the jobs we have nowadays. AI is replacing repetitive work such as magazine writing or scriptwriting.

            • barsoap@lemm.ee · +3 · 1 year ago

              “And junior programmers”

              …no. Juniors are hard enough to mentor into writing sensible code in the first place; adding AI to that only makes things worse.

              The long-term impacts of AI, beyond what’s already happening (and having an actual positive impact on products and companies; discount that Hollywood scriptwriting stuff), will be in industrial automation and logistics/transportation: production lines that can QC on their own, plus a whole army of truck and taxi drivers. AI systems will augment fields such as medicine, but not replace actual doctors. Think suggesting alternative diagnoses and recommending suitable confirmatory tests, combating routine tunnel vision precisely by being less adaptive than human doctors.

            • pinkdrunkenelephants@sopuli.xyz · +2 · 1 year ago

              Any platform that does that is going to collapse. Not enough people will watch AI-generated garbage for it to be viable, and those who won’t will simply split off from the internet and its entertainment, shrinking and splitting the economy.

            • ZeroHora@lemmy.ml · +1 · 1 year ago

              I understand that they’ll be replaced, or at least that the producers want that, but I don’t think it’s because the work is repetitive; it’s more that a lot of them are needed.

    • orangeboats@lemmy.world · +12 · edited · 1 year ago

      As an anecdote, though: I once saw someone simply forwarding (i.e. copy-pasting) their exam questions to ChatGPT. Their answers were just ChatGPT responses, paraphrased to look less GPT-ish. I am not even sure they understood the questions themselves.

      In this case, the only skill being tested… is English paraphrasing.

    • HexesofVexes@lemmy.world · +10 · 1 year ago

      I’ll field this because it does raise some good points:

      It all boils down to how much you trust what is essentially matrix multiplication, trained on the internet, with some rather arbitrarily chosen initial conditions. Early on, when AI started cropping up in the news, I tested the validity of its answers:

      1. For topics aimed at 10–18-year-olds, it does pretty well. Its answers are generic, and it makes mistakes every now and then.

      2. For 1st–3rd-year degree topics, it really starts to make dangerous errors, but it’s a good tool for summarising material from textbooks.

      3. Masters+, it spews (very convincing) bollocks most of the time.

      Recognising the mistakes in (1) requires checking against the course notes, something most students manage. Recognising the mistakes in (2) is often something a stronger student can manage, but not a weaker one. As for (3), you need to be an expert to recognise the mistakes (at one point it literally misinterpreted my own work back at me).

      The irony is that education in its current format is already working with AI: it is teaching people how to correct the errors it produces. Theming assessment around an AI is a great idea, until you have to create such an assessment (the very fact that the field is moving fast means everything you teach about it is out of date by the time a student needs it for work).

      However, I do agree that education as a whole needs overhauling. How to do this? Maybe fund it a bit better, so we’re able to hire folks to help develop better courses. At the moment, every “great course” you’ve ever taken was paid for in blood (i.e. 50-hour weeks of teaching/marking/prepping/meeting arbitrary research requirements).

      • Armok: God of Blood@lemmy.world · +1/−3 · 1 year ago

        (1) seems to be a legitimate problem. (2) is just filtering the stronger students from the weaker ones with extra steps. (3) isn’t an issue unless a professor teaching graduate classes can’t tell BS from truth in their own field. If that’s the case, I’d call the professor’s lack of knowledge a larger issue than the student’s.

        • jarfil@lemmy.world · +3 · 1 year ago

          You may not know this, but “Masters” is about uncovering knowledge nobody had before, not even the professor. That’s where peer reviews and shit like LK-99 happen.

          • Womble@lemmy.world · +2 · 1 year ago

            It really isn’t. You don’t start doing properly original research until a year or two into a PhD. At best a masters project is going to be doing something like taking an existing model and applying it to an adjacent topic to the one it was designed for.

      • zephyreks · +1/−3 · 1 year ago

        On the other hand, what if the problem is simply one that’s no longer important for most people? Isn’t technological advancement supposed to introduce abstraction that people can develop on?

        • average650@lemmy.world · +6 · 1 year ago

          The point is that students can’t get to the higher-level concepts if they’re just regurgitating what ChatGPT says.

        • MBM@lemmings.world · +1 · 1 year ago

          If you never learn how to do the basics without ChatGPT, it’s a big jump to figure out the advanced topics where ChatGPT no longer helps you.

    • pinkdrunkenelephants@sopuli.xyz · +3/−1 · edited · 1 year ago

      Textbooks, as in physical paper, are never going to just go away. They offer way too many advantages over reading digital books.