• owenfromcanada

    “The students kind of recognize that the system is broken and that there’s not really a point in doing this. Maybe the original meaning of these assignments has been lost or is not being communicated to them well.”

    The ideal of college as a place of intellectual growth, where students engage with deep, profound ideas, was gone long before ChatGPT. The combination of high costs and a winner-takes-all economy had already made it feel transactional, a means to an end. (In a recent survey, Deloitte found that just over half of college graduates believe their education was worth the tens of thousands of dollars it costs a year, compared with 76 percent of trade-school graduates.) In a way, the speed and ease with which AI proved itself able to do college-level work simply exposed the rot at the core. “How can we expect them to grasp what education means when we, as educators, haven’t begun to undo the years of cognitive and spiritual damage inflicted by a society that treats schooling as a means to a high-paying job, maybe some social status, but nothing more?” Jollimore wrote in a recent essay. “Or, worse, to see it as bearing no value at all, as if it were a kind of confidence trick, an elaborate sham?”

    This is the root of the issue, and why it won't get better until academia is turned upside down. The handful of professors who still have a soul might value critical-thinking skills, but the academic industry as a whole is perfectly fine with all of this. The cash flows, the customers enroll and graduate with their product, and the machine works as designed.

    But this is the part that keeps me up at night:

    The problem may be much larger than generative AI. The so-called Flynn effect refers to the consistent rise in IQ scores from generation to generation going back to at least the 1930s. That rise started to slow, and in some cases reverse, around 2006. “The greatest worry in these times of generative AI is not that it may compromise human creativity or intelligence,” Robert Sternberg, a psychology professor at Cornell University, told The Guardian, “but that it already has.”

    If we’ve peaked intellectually as a society, we’re completely and thoroughly fucked.