Like countless faculty this post-ChatGPT academic year, I had the eerie sensation of occasionally reading papers that sounded off. To be sure, they weren’t off in the ways a professor’s grading brain is trained to scan for: concepts misconstrued, non sequitur paragraphs, typos carelessly lurking. Quite the opposite. These were competent, articulate, polished, and yet not-quite-human papers. Generative AI style, if it can be said there is such a thing, feels Wikipedian: Early on, it was called “beige”; a “generic cake mix.” Like the similarly crowdsourced Wikipedia, it sounds like everyone and no one at the same time. And, says Naomi Baron, a linguist who studies computer-mediated communication, just as when Wikipedia first came along and “People said, ‘Wait a second, do you […]
Original web page at www.bostonglobe.com