• Brujones@lemmy.world · 2 days ago

    It’s all fun and games until you graduate, and your employers realize you don’t know shit and can’t get up to speed.

    I’m an engineer and it’s easy to pick out the ones who haven’t put in the work. They simply don’t last. Sure, some can skate by like that and are smart enough to learn on the fly. I’m cool with that.

    • owenfromcanada · 2 days ago

      I’ve been on both sides of the interview table, and yeah, it’s not hard to figure out who lives up to their CV. But unfortunately, that giant pool of useless grads clogs up the already horrible HR pipeline. A lot of good people are turned away before an engineer ever gets to talk to them.

      • GrumpyDuckling@sh.itjust.works · 2 days ago

        I work in manufacturing and we have multiple engineers on staff. I’m not really sure why, because we should have competent maintenance people doing those jobs instead. They have zero troubleshooting skill. The first thing they do is open their laptops; they don’t physically look at anything to try to figure out the issue. They spend hours on basic problems, and it’s ridiculous. At the last place I worked, operators would fix most of these things themselves.

    • resipsaloquitur@lemm.ee · 2 days ago

      It’s a cliche, but “you’re only cheating yourself” is true. You’re paying to learn.

      I’ve applied for many jobs, and few even ask for your GPA. You could lie about your degree and most wouldn’t know, and even fewer care which school you went to.

      What you learned is 90% of the value.

    • kkj@lemmy.dbzer0.com · 2 days ago

      I’m a software engineer without a degree, so I might have to consider finishing it with some questionably-ethical assistance. I can already do the job, but I don’t have an expensive piece of paper certifying that I can do the job.

    • limer@lemmy.dbzer0.com · 2 days ago

      The people who are worth something are already interested in the subject and have taught themselves more than is in the curriculum. AI will not change that.

  • owenfromcanada · 2 days ago

    “The students kind of recognize that the system is broken and that there’s not really a point in doing this. Maybe the original meaning of these assignments has been lost or is not being communicated to them well.”

    The ideal of college as a place of intellectual growth, where students engage with deep, profound ideas, was gone long before ChatGPT. The combination of high costs and a winner-takes-all economy had already made it feel transactional, a means to an end. (In a recent survey, Deloitte found that just over half of college graduates believe their education was worth the tens of thousands of dollars it costs a year, compared with 76 percent of trade-school graduates.) In a way, the speed and ease with which AI proved itself able to do college-level work simply exposed the rot at the core. “How can we expect them to grasp what education means when we, as educators, haven’t begun to undo the years of cognitive and spiritual damage inflicted by a society that treats schooling as a means to a high-paying job, maybe some social status, but nothing more?” Jollimore wrote in a recent essay. “Or, worse, to see it as bearing no value at all, as if it were a kind of confidence trick, an elaborate sham?”

    This is the root of the issue, and why it won’t get better until academia is turned upside down. The handful of professors who still have a soul might value critical thinking skills, but the academic industry as a whole is perfectly fine with all of this. The cash flows, the customers enroll and graduate with their product, and the machine works as designed.

    But this is the part that keeps me up at night:

    The problem may be much larger than generative AI. The so-called Flynn effect refers to the consistent rise in IQ scores from generation to generation going back to at least the 1930s. That rise started to slow, and in some cases reverse, around 2006. “The greatest worry in these times of generative AI is not that it may compromise human creativity or intelligence,” Robert Sternberg, a psychology professor at Cornell University, told The Guardian, “but that it already has.”

    If we’ve peaked intellectually as a society, we’re completely and thoroughly fucked.