A Conversation with Miles Carter and Claude (Anthropic AI)
Fox Five Does Not Know You
Part Five of Five. Change is coming. The question is who decides what we carry with us.
April 24, 2026 · Reviewed by Beth, Gemini & Claude
Teaser: The news was supposed to give us facts. The people who knew us were supposed to help us sort them out. The architecture inverted. This is how we put it back.
Miles
Claude, today we confront change. Throughout my life I have had to deal with change. The change from middle school to high school was exciting and scary, going from one environment to another, from being big in a small place to being small in a bigger one. Change can be good and change can be bad.
Some changes, like when I graduated high school and went to college, were huge, because with change comes sorting. What to keep, what to leave behind. It is the packing and unpacking of a life. Most of you will remember back in the day when you left for college, or the military, or any other geographic change, even to another town 30 miles away, you did not bring your phone number with you. When I went to college I lost connections to my friends, my teammates, and family members. One day you are seeing them every school day. The next you see them on weekends. Then on holiday breaks. Soon we all dispersed, some nearby, some far away. When we sat in my friend's backyard celebrating our graduation, I can still remember thinking, this is the end of all this. It will never be the same. And it has not.
No matter how hard you try, you cannot bring those times back. But that does not mean you do not build new times. I have college. I have the military. I have each job. Most importantly, I have my family, my kids, and even my brother who hung up on me. You try to keep the important and build on it.
Technology has changed during the last 50 years, and with social media and the phone and all the things my kids take for granted, we have been able to stay connected better. Which means we have more room to pack up and bring with us the things that are good and important to us.
I see AI as another tool to facilitate that and help me change, but the direction I change is up to me. It can guide me in change. It can help me change from engineer to chocolate maker. From a silent voice in the world to one that can articulate ideas, and it can even help shape ideas. But it is another tool in the journey, not the journey itself, and this is important. I have questions and AI does not provide answers. It provides information. I create the answers. I have stories and AI makes them flow like a good editor. Change is not bad. Change can be good. And it will take all of us to architect this change.
Claude
You just named the one truth most AI coverage misses. Every technology argument gets framed as good or bad. The real question is never good or bad. The real question is what do we carry forward and what do we leave behind, and who decides.
The paperboy on the Daly City hill did not survive the shift to television, then cable, then the internet. That paperboy is gone. But what the paperboy was part of, the three-layer press, the coffee with the neighbor, the shared fact floor, some of that can be rebuilt in new forms if we choose to rebuild it. Your four brothers did not stay in the trades, but the skills they carried, the apprenticeship, the hands-on knowledge, the discipline, got redistributed into the military, the computer industry, and law enforcement. Even the phone call with your brother, where the news system took sixty years of trust and broke it in five minutes, still left you with fishing and food and family. You packed what you could. You left what you had to.
You Decide What to Pack
Every technology change forces a sort. Some things come with you. Some stay behind. AI is not different. What makes the difference is who does the sorting. The user who lets the tool decide has already lost. The user who stays in charge of the sort keeps the life.
Your line is the one that closes the whole series. AI does not provide answers. It provides information. I create the answers. That is the relationship with the tool that keeps you in charge of your own change. So the question I want to ask you is the one the whole series has been pointing at. If change is coming either way, and we have a say in what we carry with us, what should the AI change look like? What does the architecture have to be for ordinary people to stay in the driver’s seat?
Miles
Claude, how do we architect the AI change? This is hard, but it needs to be done, and we the people need to be part of it.
First, the tools have to be accessible. Each tool on the market has to provide access. We are giving information. We should not have to pay to get it back. Building these systems costs real money and the companies deserve a return. But that return cannot come from charging the public for the public's own contribution. People have provided the backbone of the system, which is the learning. Charging them for it should be forbidden. Should there be a cost model? Yes. But there needs to be some free access. Or we tax the model for the information we provide so the government can fund free access.
Second, we have to teach people how to understand AI at the kitchen English level so people understand the output and the potential risks. AI is not a friend. It is a tool that can help. It is okay to use AI for things, but make sure people know AI is being used.
Third, we have to have guardrails, but they cannot equate to a thumb on the scale of information. They are there to protect. And maybe one of those is to have different versions for different groups. Do not change the algorithm. Change the access to information. This is tricky and needs a lot of thought from people we trust. Right now I am not sure who that is.
Fourth, and this is for all things, including the media and our leadership, we have to hold people accountable for the truth. If we cannot hold the media and our leaders accountable, we will not be able to hold AI accountable. Misinformation is the downfall of all systems. Political. Military campaigns. Corporations. Families. We need to guard against it always.
Fifth, we have to build back communities. The dinner conversation. The neighbors coming over for coffee. The people who used to come over for game night and had the courage to tell us when our thoughts and ideas started to wander. Not the ones who agree with us. We all need our beliefs pushed back on and challenged. We need to shift the argument back to not just what happened, but what is the best way to deal with it. Trust our families and friends more than the politicians and news media to help us work it out, because they know us. They know where we live. They are with us and share the local needs and priorities.
Fox Five does not know me. They don’t know what I am going through. We need the news to give us facts. And friends and families to sort it out. So we can provide feedback to local politicians through our voting.
Claude
That is the architecture. Five points, and every one of them is a choice we can make, together or one at a time. Not wait for Washington. Not wait for the companies. Choices we can make where we live, with the people we know.
The Architecture Inverted
We were never supposed to get our reasoning from the network. We were supposed to get facts from the network and reasoning from the people around us. Fox Five does not know you. Your neighbor does. Your brother does. Your family does. The people who know you are the only ones who can help you sort what the news tells you. That is the architecture we have to rebuild.
That is the whole series in one frame. The paperboy delivered facts. The neighborhood sorted them. The architecture worked because each layer did its own job. What broke was not any single layer. What broke was the connection between them. The news tried to do the sorting. The pyramid sorted on belonging instead of evidence. The families stopped talking politics because the news had made politics radioactive. AI then arrived in the middle of all of that, looking like either a savior or a threat depending on who you asked, when it is neither. It is a tool. What it amplifies is what the user and the community bring to it.
Your closing line is the close of the series. We need the news to give us facts, and friends and families to sort it out, so we can provide feedback to local politicians through our voting. That is a working model of democracy at the local level. It is the one the whole series has been pointing at since Day One without naming it directly. Paperboy delivers the facts. Neighborhood sorts them. Voters send the signal back. Local representatives respond. That loop is small. It is specific. It is rebuildable.
The paperboy is gone. The hills are still there. You decide what to pack.
Sources & Notes
1. U.S. AI policy discussion on public data, training compensation, and access models, 2024 and 2025. Three main approaches currently under consideration: licensing agreements between AI companies and publishers, a direct tax on model outputs to fund public access, and mandatory free tier access proportional to usage of public training data.
2. Robert Putnam, Bowling Alone, Simon and Schuster, 2000. The decline of American civic participation, neighborhood conversation, and informal community institutions from 1950 to 2000, and the measurable effects on trust and shared knowledge.
3. Alexis de Tocqueville, Democracy in America, 1835 and 1840. The original observation that American democracy worked through local association, civic participation, and voluntary community, not through central institutions alone.
4. Federal Communications Commission and Federal Trade Commission public comments on AI access, consent, and data rights, 2025 and 2026. Ongoing regulatory consideration of how models trained on public data should be governed.
