For the last several years, I’ve been a strong advocate for using AI, especially for small business owners.
I’ve written about it. Taught it. Encouraged entrepreneurs to experiment with it rather than fear it. I still believe AI is an extraordinary tool. It reduces workload, improves decision-making, and helps small firms compete with larger ones. Used deliberately, it can be incredibly powerful.
So my more recent cautionary tone isn’t a reversal. Businesses have to use AI just to remain competitive. But the broader AI narrative needs recalibration.
Somewhere along the way, I realized the public conversation drifted toward extremes. On one side, apocalyptic predictions. On the other side, utopian promises. Machines replace everyone. Or machines free everyone.
Both narratives felt incomplete. What I became more interested in wasn’t what might happen in some distant future. It was what was already starting to happen quietly, right now.
And the more I paid attention, the more I saw patterns forming beneath the surface.
The Shift from Possible to Probable
Early discussions about AI revolved around what was possible. Could it pass a test? Could it write code? Could it diagnose a disease? Could it outperform humans? That’s interesting, but possibility isn’t what shapes society. Probability does.
I began to ponder:
- What becomes likely once incentives take over?
- What happens when systems scale?
- What happens when cost compression collides with human institutions?
Researching the answers to these types of questions began to shift my focus. Not toward dystopia. Not toward hype. But toward structure.
Four Structural Shifts
As I stepped back, four themes kept surfacing.
- First, the quiet disappearance of entry-level cognitive work. AI doesn’t just eliminate jobs. It eliminates on-ramps. Apprenticeship layers compress. Firms become more efficient in the short term but risk hollowing out their own future expertise.
- Second, the erosion of shared reality. When convincing synthetic audio and video can be generated cheaply and at scale, trust becomes fragile. Verification becomes costlier. Reputation becomes more exposed.
- Third, the shift from productivity to platform power. AI increases output, but when delivered through concentrated infrastructure, leverage migrates upstream. Platforms capture more of the compounding value than the practitioners who use it.
- Fourth, the physical layer beneath it all. AI isn’t abstract. It runs on energy. Data centers, power grids, water systems, and geopolitical considerations all come into play as the demand for AI accelerates.
None of these shifts is dramatic in isolation, but they compound slowly over time. That's what makes them easy to miss.
This Isn’t Anti-AI
Let me be clear: I am not arguing against AI adoption. Far from it. I use it daily. I teach others how to use it responsibly. I believe small firms that ignore it put themselves at risk. But adoption without reflection is different from thoughtful integration.
Every major technological leap in history has produced second-order effects that were not obvious at the outset. Railroads reshaped labor. Social media reshaped attention and discourse. Automation reshaped manufacturing communities.
AI will do the same. The question isn’t whether we move forward. We will. The question is whether we understand the structural shifts forming underneath the surface while we still have room to adjust.
Why This Matters for Small Business Owners
If you run a business, advise clients, allocate capital, or help shape policy, you’re not just choosing whether to adopt a tool. You’re participating in a system that is redefining work, trust, power, and infrastructure simultaneously. That deserves more than enthusiasm. It deserves examination.
The goal isn’t to slow progress for its own sake. It’s to avoid optimizing ourselves into fragility. Because once structural shifts harden, they become much harder to reverse.
These four vectors form the backbone of my book, The Quiet Disruption, where I explore how AI is reshaping work, power, and trust before most institutions are ready.
If you’d like a concise overview of the full framework, I recorded a series of short summary videos and an audio deep dive that walk through all four structural shifts and how they connect.
You can explore each vector individually, or start with the overview video and audio deep dive to see the broader picture. Either way, my goal is simple: to encourage better questions before momentum answers them for us.
| Video Title | Description |
| --- | --- |
| The Quiet Disruption – Brief Overview | This is a brief visual overview of the ideas behind the book, The Quiet Disruption. Most conversations about AI focus on tools, speed, and productivity. This video looks at something quieter and more consequential, the structural changes already underway that are easy to miss in day-to-day work. This short whiteboard-style walkthrough touches on how AI is reshaping learning, trust, power, and incentives, and why these changes belong in the same conversation rather than being treated as separate issues. This is not a tutorial or a prediction about the future. It’s an attempt to connect the dots and explain why many of the risks and opportunities around AI don’t announce themselves loudly. If you’re a business owner, advisor, or professional who wants to think clearly about a new world dominated by AI without hype or panic, this video is a good place to start. |
| The Quiet Disruption – Audio-First Deep Dive | This is an audio-first deep dive into the thinking behind the book, The Quiet Disruption. This audio conversation explores the deeper, often-overlooked consequences of AI adoption, especially those that unfold slowly and quietly. Topics touched on include the collapse of traditional learning and apprenticeship paths, the erosion of trust through deepfakes, the consolidation of power among platforms, shifting incentives for small businesses and professionals, and the hidden energy and infrastructure costs behind modern AI systems. This audio is intended for thoughtful listeners who want to slow down, reflect, and think more clearly about how AI is reshaping work, judgment, and decision-making. |
| The Learning Ladder Collapse | This video explores one of the quieter but more consequential effects of AI adoption, the breakdown of the apprenticeship and learning ladder. For decades, most professions relied on a progression from novice to expert, where people learned by doing real work under supervision. Increasingly, AI is taking on many of the tasks that once served as learning steps, boosting efficiency in the short term while quietly dismantling the apprenticeship ladder. This video examines how removing early-career work can create talent gaps that don’t appear until years later, when experience can’t be easily replaced. If you’re a business owner, leader, or advisor thinking about workforce development, resilience, and long-term capability, this conversation matters. |
| The Erosion of Trust | This video looks at how deepfakes and synthetic media quietly undermine one of the most important infrastructures in business and society, trust. The risk isn’t just that false content exists. It’s that the presence of AI-generated media changes how evidence is treated, shifting us from assuming information is real to requiring constant verification. That shift introduces friction, cost, and skepticism long before any single failure makes headlines. Rather than focusing on viral examples or technical details, this video examines how trust degrades structurally, and why the consequences compound over time. |
| The AI Power Shift | This video focuses on how AI accelerates the consolidation of power and leverage among platforms that control models, data, and infrastructure. While AI tools appear widely accessible, dependence on centralized platforms quietly shifts control away from individuals and small organizations. Convenience increases, but autonomy, bargaining power, and resilience can decrease. This discussion isn’t about specific companies or market predictions. It’s about understanding how incentives, scale, and dependency reshape who holds leverage over time, often without anyone explicitly choosing that outcome. |
| AI’s Invisible Energy Bill | This video examines an often-overlooked reality behind AI systems, their energy and infrastructure demands. AI feels intangible and infinitely scalable, but intelligence at scale has physical costs. Compute, power, cooling, and location constraints quietly shape what is possible, who can participate, and who gains advantage long before those limits become visible to end users. Rather than debating environmental policy or technical architecture, this video focuses on why these hidden constraints matter for business strategy, access, and long-term planning. |
How will you respond before these structural shifts harden into realities you can no longer influence?