Report From a High School Classroom
More compelling arguments for getting left behind by AI
I am married to an English teacher in a public high school. He teaches sophomores and seniors, honors classes and dual-credit college classes. This is his 6th year at his school, and he taught college for 15 years before that. He’s now taught reading and writing for 20 years.
This year, he claims, is his hardest year of teaching yet. (He taught through Covid, first in online college classes and then in high school as the kids returned to in-person learning.)
The reasons: AI specifically, compounded more generally by the ubiquity of screens and the attention economy.
Now, as an old punk, I am especially aware of the perennial panic of “the kids these days!” I am not usually one to raise alarm bells about such things, or to fail to recognize normal generational shifts, even when they seem weird, frustrating, or alien to older generations.
But, OMG, we have let the tech bros destroy a generation of undeveloped brains.
I see the effects in myself, a middle-aged person who grew up in the analog era, how I now have less patience to read something long and hard. How I feel naked and vulnerable if I leave my house without my phone. How I feel compelled to pick it up even when I don’t need to. How when I do pick it up to do something specific—check an email or look something up—I often do something else entirely and put it down before I remember what I was trying to do and pick it up yet again.
We adopted this technology thinking we could handle it, as grown humans with fully-formed brains. But we can’t handle it. And then we gave it to kids, who don’t have fully-formed brains. We gave them a robot that could read and write before we taught them to read and write.
Now, they don’t know how to read and write.
There is a whole other side story to this about just how fucked education funding is, but suffice to say public schools are under A LOT of pressure to graduate kids.
What this adds up to is my partner (and countless other teachers) being put under pressure to pass and graduate seniors who literally do not know what a paragraph is. They do not know basic vocabulary and grammar or how to form a sentence. They have used ChatGPT to do their assignments since middle school. They never learned how to write, how to read, how to think critically or for themselves. It’s been creeping, and a long time coming, but the tipping point this year is undeniable.
(Remember these are kids taking dual-credit college classes.)
Sure, we didn’t just wholesale let this happen. But schools failed to pay for plagiarism detection software; the AI detection doesn’t always work well when they do have it; it isn’t always easy to tell; the kids have their phones in the classroom and extra browser tabs open on their laptops; and the teachers themselves can’t always stay ahead of the curve enough to know the novel ways kids find to avoid work. They can’t always tell which paper used AI, and even when they can, the principal might not back them up and parents will argue (and they do).
Most schools have failed to come up with meaningful AI policies, or any at all. It’s especially hard when districts have tech companies convincing them to have their teachers adopt AI themselves, or at least “teach responsible use of AI.” After all, we don’t want our students left behind and not prepared for the inevitability of widespread AI use (excuse me while I barf in my mouth about that inevitability—divine right of kings, anyone?).
The corporatization of education rears its ugly head in the pressure to adopt technologies that are not merely unhelpful but actively hinder education.
A couple years ago, he started polling his students on their screen time, just for his own data and curiosity about what kind of correlations he could or couldn’t find. He has students who spend as many as 8 hours a day on their phones. Interestingly, in his admittedly small data set, screen time didn’t necessarily correlate with grades.
But he did notice that, once he started physically taking phones away (his administration does not enforce our state-wide cell phone ban and I got sick of hearing him complain so I bought him a lockbox for his classroom), his students were far more engaged and in better spirits (and talking to each other more). He says their writing is getting better as he forces them to write on paper, in front of him. But he has a lot of making up to do with kids who are about to go to college—more than he can reasonably do in one school year or semester.
Getting back to 8 hours a day—8 hours a day. Most or a huge chunk of their waking hours are spent on things meant to divert and track their attention: TikTok videos and YouTube and social media, constantly trying to feed them passive engagement, not allowing them a moment to process what is being thrown at them or grapple with something deep or difficult. 8 hours of what the algorithm knows they want to see, nothing to challenge what they already think. 8 hours that asks nothing of them, except to keep watching.
Patience and focus are skills that need to be cultivated. Learning happens when we sit with something difficult and keep at it.
If we never learn how to patiently practice a skill, we will of course take a shortcut that is handed to us. This is not the fault of the kids. So many things are awesome about this generation. They are funny and sharp and thoughtful about so much. They see through the bullshit and know that many of them will never make a decent living, own a house, get out from under debt, or do better than their parents and grandparents. They know they’re inheriting a burning planet. I don’t blame them for being a little nihilistic in their willingness to offload some of the things they need to think about. My generation was nihilistic over much less.
If you ask them, so many of them know that AI wastes water, that Black and Brown communities and poor, rural communities unfairly bear the environmental consequences of data centers. They know that it’s better for them to learn how to think critically. They want to be well-read and understand hard things.
But then they have a bunch of homework, and a job, or a sports practice, and the unending drama of navigating social life as a teen, and that essay due for English class could get done instantly with an app, already on their phone, paid for by their parent who works at the university and talks about how we’ll be left behind if we don’t adopt this.
And teenagers are always looking for an easy way out. I remember. Do you? We were all at our peak of questioning those in charge while deftly avoiding responsibility ourselves.
Again, it’s not (entirely) their fault. We are supposed to be the grown-ups in the room, and we have convinced them that it is ok to offload their ability to think and communicate (which are not only our birthrights as sentient beings, but also our hopes for democracy and a decent future) to our tech bro overlords, those guys who don’t care about us and only want to pillage us and our environment for our labor, our attention, and our resources.
Consider this your bonus argument against the inevitability of AI. The arguments about AI (and even just screens more broadly) typically assume an adult brain, and often consider the future in terms of environmental damage. But the future damage is now. It’s visible if we pay attention to kids.


As a staff person at a (supposedly) elite, Ivy League university, I can say that plenty of students are getting into VERY EXPENSIVE colleges with no reading comprehension skills, no ability to follow instructions, and seemingly no understanding of when they can and cannot get away with that shit. It's really sobering. My office is supposed to send students abroad for weeks or months at a time, and they can't even fill out a visa application when we give them step-by-step instructions. How are they going to survive overseas?
I have had students write to our office email in such a way that I can tell they are approaching that communication as if there's an AI on the other end that will just spoon-feed them the information they seek. They haven't done any of their own initial research. They haven't even gone to our website. It's a little bizarre. And I agree, it's largely not their fault. But it's also super frustrating to work with them.
A close friend who teaches 7th grade science has been forced to adopt an AI classroom platform. She said not only is LLM use preventing her students from developing critical thinking skills, but the mandatory AI interfacing now prevents her, as an educator, from giving accurate individual feedback. Seems like the loop is already complete, and it's not going anywhere, because there's an AI development czar on the BoE making 200k/yr that the entire board defers to. It's extra amusing/horrifying because I've been working AI training gigs since my industry imploded, mostly on safety-adjacent projects (dangerous sycophancy, identifying out-of-distribution harm vectors, etc.), and the kinds of behaviors my projects are trying to eradicate are the exact ones promoted as positives by their AI platform. If I still had the conspiracy theory mindset of my youth, I would think the problem here, making everyone dumber, was the actual goal.