Last weekend, I got frustrated with my habit-tracking app.
Not dramatically frustrated — the quiet kind, where you keep using something even though it doesn't quite fit, because finding something better feels like more trouble than it's worth. I wanted something simple. Something that tracked the small daily things I was trying to build into my life, without the noise and the gamification and the streaks. Nothing on the market did exactly what I wanted.
So I built one.
Here's what I should tell you about myself: I am not an app developer. I've spent my career in technology, but building mobile applications is its own craft — one I had never learned. A year ago, if you'd told me I was going to build a fully functional app from scratch over a weekend, I would have told you that you had the wrong person.
But I didn't write a single line of code. I described what I wanted, in plain English, to an AI. It built the app. Then it told me exactly how to test it, how to install it on my phone, and how to watch it run for the first time. At no point did it occur to me that I needed to do anything other than describe what I wanted and respond to what came back.
When I held my phone and opened something I had made — something that worked, something that was mine — I felt two things at once. The first was genuine exhilaration. The second was a quiet, creeping worry that had nothing to do with me.
I thought about every person who had spent years learning to build what I had just built in a weekend. And I thought: they don't know yet.
That feeling — wonder and worry arriving together — is what this piece is about.
Not the technology itself. Not the code, or the models, or the technical details that tend to make people's eyes glaze over. This is about what's already happening in the world around you, right now, quietly, while most people are going about their lives.
This is my attempt, after years of failing to find the right words, to finally explain it to everyone I love.
— — —
It's Already Here
A few weeks ago, I was sitting across from a loan officer at my bank. He was sharp, personable, good at his job in every way that mattered. We talked through my finances, my goals, what I was looking for. He asked the right questions. He made me feel taken care of.
And for roughly half of our time together, he typed.
Every answer I gave him went into a system, manually, one field at a time. I watched his eyes move from my face to the screen and back again, over and over, the conversation punctuated by the soft sound of keys. He wasn't doing anything wrong. He was doing exactly what his job required.
But I kept thinking: this shouldn't exist anymore.
Not him — he was wonderful. But that task. That specific, repetitive act of translating a human conversation into a database, one keystroke at a time. Technology that could handle that entirely on its own has existed for a while now. The reason it wasn't happening in that room had nothing to do with capability. It had to do with the fact that somewhere above him, someone with the authority to make that change hadn't made it yet.
He was doing his job perfectly. And his job was already half obsolete. He just hadn't been told.
That's the part that stayed with me on the drive home. Not the technology. The silence around it.
Most of the people whose work is being quietly reshaped by AI right now have no idea it's happening. Not because they're not paying attention. Because nobody is telling them. The conversation about what AI is and what it's already doing is almost entirely contained within a small circle of people in technology — and the rest of the world is going about its day, building careers and raising families, without any sense of what's already changing underneath them.
— — —
This Isn't Just About Tech Jobs
Consider the court reporter.
You've probably seen one without really noticing. The person sitting to the side of the courtroom, fingers moving impossibly fast across a small specialized keyboard, capturing every word in a room where words carry legal weight. It takes years to learn. The certification is hard-earned. For a long time, it was genuinely irreplaceable — because the accuracy required in a legal proceeding demanded a human being who understood context, who could tell one speaker from another, who could flag something unclear rather than guess.
AI can now do this. Not approximately. Not almost. With accuracy that meets or exceeds human performance, in real time, at a fraction of the cost.
Nobody held a press conference about it.
Or think about the paralegal. Someone who spent years in school learning the architecture of the legal system. How to research case law. How to draft documents. How to prepare a case for someone whose name will be on the brief but whose work they are quietly, invisibly carrying. AI now does that work — document review, legal research, contract summarization — in minutes.
These aren't people who chose the wrong career. They chose carefully. They invested in themselves. They built something real.
And the world changed the rules without telling them.
That's what makes this moment different from disruptions that came before. When ATMs arrived, bank tellers saw them. When self-checkout appeared, cashiers knew what was coming. The change was visible. You could point to it, argue about it, prepare for it.
This time the change is quieter. It's happening inside systems, inside workflows, in the back-end of industries you interact with every day without seeing how they work. By the time it becomes visible, many of the decisions will already have been made.
— — —
The Privilege of Seeing It
I have thought in this language my whole life.
Not code, exactly. Something underneath code. The instinct that a machine should already know what you need, that the information is all there, that the friction between human and computer is a problem waiting to be solved. I felt that before I had words for it. Long before I had a career around it.
For a long time I didn't realize how unusual that was. It felt like seeing — just the way I processed the world. It took years of watching other people's eyes glaze over mid-conversation for me to understand that what felt obvious to me was genuinely foreign to most people. Not because they were less capable. Because they had never needed to think this way.
That gap is something I've struggled with.
I've spent years trying to explain to people I love what is happening. My family. Friends outside of tech. People whose instincts and talents run in completely different directions from mine. And I've failed them more often than I'd like to admit. I'd start talking about AI, and within a few sentences I could feel the conversation slipping, their attention drifting not from a lack of interest, but because I was speaking a language they had no reason to have learned.
This piece is my attempt to do better.
Because here's what I've come to understand: being close to this technology doesn't make me more important. It makes me more responsible. I'm not standing at the front of a wave that's coming for everyone else. I'm standing somewhere with better visibility, and the only thing that matters is whether I use it to help people see.
The people building and deploying AI right now are a remarkably small group. They are not malicious — most of them are genuinely trying to do something good. But they are making decisions, right now, about systems that will touch every single person reading this. And the smaller the circle of people engaged in that conversation, the more those decisions reflect a narrow set of experiences and priorities.
The room is too small. And most people don't even know there's a room.
— — —
We've Seen This Before
Let's go back further than most people would think to go.
Picture a child. Seven years old, maybe eight. Small enough to fit into spaces adults couldn't reach, which is precisely why they were there. Working fourteen-hour days in deafening noise, in air thick with dust that slowly destroyed their lungs. The people who put them there didn't think of themselves as villains. There were quotas to meet, costs to cut, competitors to beat. The child was a resource. Cheaper than an adult. Easier to replace.
And when the machinery took a finger, or a hand, or worse — the machinery kept running.
This wasn't an anomaly. This was the system working exactly as designed. Because when transformative power moves faster than the rules meant to contain it, the people with the least power become the ones who pay the price. Not because those at the top are uniquely evil. Because unchecked systems optimize for efficiency, and human beings — their dignity, their safety, their futures — are inefficiencies.
We told ourselves we learned from that.
Then came oil. Entire towns built around a single company, where workers were paid in currency that could only be spent at company stores, housed in company homes, their lives owned in every practical sense by the same entity that signed their paychecks. When they organized, when they pushed back, the response was swift and brutal. Not because the people in charge were monsters. Because the system had no mechanism to make them care.
We told ourselves we learned from that too.
Then came social media. This one is recent enough that most people reading this lived through it. Platforms engineered — deliberately, measurably, with full internal knowledge — to be as addictive as possible. Algorithms designed not to inform or connect, but to inflame, because outrage kept people scrolling longer. We watched teenage girls develop eating disorders at scale. We watched misinformation spread faster than truth. We watched the mental health of an entire generation quietly, systematically eroded — and the people running those platforms knew. The internal research said so. They optimized anyway.
We told ourselves we would learn from that.
We are at the beginning of something categorically larger than any of these moments. And the window to shape it is right now.
The AI systems being built and deployed today are not neutral tools. They are making decisions about your life already. Whether your resume gets seen by a human being or filtered out in seconds by an algorithm trained on data that encodes decades of existing bias. Whether the financial product being offered to you is genuinely in your interest or engineered to extract maximum value from your specific vulnerabilities. Whether the information you see, the prices you're quoted, the opportunities surfaced to you — are shaped by systems optimizing for someone else's outcomes, not yours.
And here is the part that should make you feel something: most of this is invisible. There is no moment where you find out. There is no letter, no notification, no explanation. The decision was made, upstream, by a system you never interacted with, built by people you'll never meet, governed by rules that don't yet exist.
Ten years ago, technology already existed that could track movement, behavior, and association at a scale most people would find difficult to believe. What exists today is something most of us can't fully picture. And what will exist in five years, if the people building it remain the only ones deciding how it's used, is something we should all be thinking very hard about.
The child in the mill had no idea that the system consuming their childhood was one that society would one day look back on with horror. They were just living in it. Trying to survive it.
We are not powerless the way they were. We have something they didn't.
We have the chance to see it coming.
— — —
This Time, We Don't Have Decades
Every major technological shift in human history has been disruptive. That's not new.
When the steam engine arrived, it rewired entire societies. Farmers became factory workers. Cities swelled. Families that had lived the same way for generations found themselves inside a world that no longer needed what they knew. It was painful and disorienting and it took decades.
But it took decades.
That time — as brutal as the transition was — allowed something important to happen. Labor movements formed. Laws were written. Institutions adapted. Not perfectly, not fairly, not without tremendous suffering. But there was enough time for society to find its footing. Enough time for the people most affected to organize, to push back, to have some say in the shape of what came next.
We don't have that this time.
What took decades with electricity, with the automobile, with the internet is now taking months. The gap between a technology existing and that technology being everywhere has collapsed almost entirely. The AI that feels new and experimental to most people today will be embedded in hiring decisions, medical systems, legal proceedings, and financial products before most people have had a chance to form an opinion about it.
And when transformative power moves faster than the people it affects can respond, history tells us exactly what happens.
It concentrates.
This isn't speculation. It's the pattern. Every time a technology moved faster than governance, the people who got there first wrote the rules. They didn't have to be villains to do it. They just had to move fast while everyone else was still figuring out what was happening.
The difference this time is the stakes are total. We're not talking about one industry, or one country, or one corner of the economy. We're talking about cognition itself. The ability to think, to analyze, to create, to decide — these are the things that make human contribution valuable across every field, every profession, every corner of the world. And the tools that augment or replace those abilities are being built and controlled by a very small number of people and organizations.
That's not inevitable. But it becomes inevitable the longer the conversation stays inside the room it's currently in.
— — —
The Great Experiment
I came to this country when I was ten years old.
I didn't know what I was walking into. I didn't know the language well enough. I remember a kid saying "hell" and not understanding him, because where I came from, we said "etch," not "aitch." I didn't know the culture, the rhythms, the unwritten rules of a place that was now supposed to be home.
What I did know, slowly, over years, was this: America had given me something. Not a guarantee. Not a free pass. But a door. An open door that said — whoever you are, wherever you came from, whatever you had or didn't have — you can walk through this and build something. You can contribute. You can matter.
That door is what this country has always been, at its best. Not perfect. Not always open equally to everyone — that's a wound we're still tending. But directionally, aspirationally, fundamentally — the promise of America is that the carpenter and the nurse and the teacher and the immigrant kid who can't say H get to have a say in what this place becomes. That no single group of people, no matter how powerful or how brilliant, gets to decide the future for everyone else.
That was the radical idea. That was the experiment.
Two hundred and fifty years ago, a group of people — flawed, brilliant, frightened, determined — sat in rooms arguing through the night about what kind of world they were building. They were building it for people they would never meet. For generations that didn't exist yet. For a future they could only imagine. And the thing that made what they built remarkable wasn't that they got everything right. It was that they tried to build a system where ordinary people could keep correcting it. Keep shaping it. Keep showing up.
We are in one of those rooms right now.
Not in Philadelphia. Not in the 1780s. But in the same kind of moment — a hinge in history where the decisions being made will echo for two hundred, three hundred, five hundred years. Where the shape of everything that comes after will be determined by what we do, or fail to do, in the years immediately ahead.
The people building AI right now are not villains. Most of them are brilliant and well-intentioned and genuinely trying to do something good. But they are few. And they are moving fast. And the systems they are building will touch every single person alive today and every person who comes after them.
History will look back on this decade the way we look back on the 1780s. As the moment when everything was being decided. When the arguments were happening. When the choices were still open.
The question it will ask of us is simple.
Were you in the room?
Not as an engineer. Not as a politician. Not as someone who understood every technical detail of how these systems work. Just as a citizen. Just as a person who understood that something enormous was happening and decided that their voice — imperfect, uncertain, still learning — belonged in the conversation.
The immigrant who couldn't say H gets to be in that room. The nurse. The teacher. The grandmother. The court reporter. The loan officer typing everything you said into a system that shouldn't exist anymore.
All of them. Every single one.
That is what democracy was built to protect. The right of ordinary people to have a say in the world their children will inherit.
Think back to that child in the mill. Seven years old. No one showed up for them. Not because people didn't care — but because the people who cared didn't know, and the people who knew didn't have the power, and the people who had the power had already made their calculations. By the time society looked up and saw what had happened, a generation had already paid the price.
We are not powerless the way that child was.
We have something they never had — the ability to see it coming. The warning is available. The conversation is possible. The door is still open.
The children who will inherit this world didn't get a vote on any of it. They weren't consulted when these systems were built. They will simply wake up one day in the world we chose to build, or chose to let be built for us.
They are counting on us to show up — not because we have all the answers, not because we understand every line of code or every boardroom decision being made right now, but because we are here. We are alive in this moment. And this moment is asking something of us.
Be in the conversation. Bring someone else into it. Ask the questions out loud. Talk to your kids, your parents, your neighbors, your coworkers. You don't need to understand the technology. You just need to decide that your voice belongs in the room where the future is being decided.
Because somewhere out there, there is a child who will inherit whatever we build or fail to build in these years. They can't speak yet. They can't vote yet. They can't show up yet.
But we can.
Don't let them grow up in a world we were too quiet to shape.