Simulation of Victorian lace with digital artifacts created with Gemini V3.

This is a transitional moment in time. We have introduced the world to narrow AI through LLMs and presented it as the final solution to our creative needs. It's a leveling moment where anyone, regardless of their skill in writing, art, or programming, can now become as capable as the average person gifted with talent. Great artists will use AI to elevate their work. Art will go in new and unexpected directions. But what I want to talk about is the fallout.

Those who exploit the gifted now have justification to stop employing them and turn creative duties over to the machines. The loss of income and trades is already a problem for many who relied on those careers, and this kind of automation transition is nothing new in the history of mankind. But what is unique to this transition is that the trades we are eliminating (programming and engineering, notably) are the very trades that built the tools that automated the jobs that brought us to this point.

For the last 30 years the IT workers and engineers of the world, myself included, have worked to automate every industry. Improve quality. Improve safety. Improve quality of life. And when those industries displaced their workers, we told them to pivot. Learn programming. We'll always need programmers. Now programming is becoming a legacy career too, and I can see the same wave coming for this profession that came for theirs. What is the next option for us now? I genuinely don't know.

I realize some of you are calling me a Luddite at this point. Good. I'll take that title. The original Luddites weren't technophobes. They were skilled lacemakers and artisans who correctly identified that what replaced them would produce inferior work at scale while destroying the craft knowledge needed to do better. The lace survived. The lacemakers didn't. The lace got worse. And eventually nobody remembered what good lace looked like, so nobody noticed.


Here is what concerns me most. Many companies are seeing the automation savings right now. They see a worker who is 100% compliant, who works tirelessly, who will not voice safety or ethics concerns, who will perform exactly to the letter with no oversight or understanding of how the resulting decision came to be. The work and the decisions for how that work is executed are moving into the black box. Not soon. Now. And when something goes wrong inside that box (not if, when) who is left who understands the system well enough to open it up and fix it?

The IT industry has introduced the pinnacle of efficiency and we're already turning out the lights.

That is the lesson we didn't learn from the Luddites. When we stop practicing, we stop training, and we stop learning. The creations that follow become echoes.


When I wrote this in 2024, I described LLMs as only being able to create from examples of what came before. That was a simplification. What I've learned since, and what makes this scarier, not less, is that these systems can produce genuinely novel outputs through emergent behaviors that nobody fully understands. They aren't just echoing the past. Sometimes they're mutating it in ways their own creators can't explain. From an oversight and accountability standpoint, that's worse than simple repetition. The black box isn't just opaque. It's unpredictable.

LLMs do not think. They do not comprehend. That is beyond their function by design. They are not AGI. They are not consciousness. If we eliminate the people who understand how these systems work, who advances the field? Who audits the output? Who raises the flag when something is wrong?

Have we kneecapped ourselves? There's an episode of Star Trek: The Next Generation, "When The Bough Breaks," where an entire civilization has lost the knowledge of how their own technology works. When it fails, they're helpless. It's a useful piece of fiction to sit with right now.

I think there will always be self-taught minds who push forward. But as we see with contemporary lacemakers trying to keep their art alive, they are fewer, and the knowledge thins with every generation that doesn't practice it.


When I drafted this eighteen months ago, I ended by answering those "who…" questions with "I don't know. I hope not." I'm not going to end it that way now. What I believe is this: the people who understand how these systems actually work are more necessary now than they have ever been. Not less. The answer to automation isn't to walk away from the craft. It's to stay in the room, stay in the conversation, and make sure that when the black box needs to be opened, someone is still there who knows how.

The Luddites were right about the lace. I'd rather not prove them right about everything else.

Bryan Carter is a technology executive and writer based in Phoenix, AZ.