A side effect of AI I didn't see coming
The story of an uncomfortable realization, and three mental models to stay sharp.
I sat down to write this week and felt something I haven't felt in years: paralysis.
The screen was blank, and so was my mind. For a moment, I was confused.
For more than a decade, writing was like breathing to me. In my journalist days, I built a career on my ability to find the right words, structure an argument, and let the ideas flow onto the page. It was a muscle, exercised daily.
But recently, I had a truly uncomfortable realization.
I sat down to write, deliberately closing my LLM stack to start from a pure blank page. And the process felt… foreign. Clumsy. The familiar rhythm and feeling were gone. The muscle I had relied on for so long felt weak, as if it hadn't been properly used in months.
It wasn't a sudden failure. It was the result of a slow, unconscious but steady erosion.
A side effect of artificial intelligence I call the "competence drain."
You might have felt it too, even if you haven't put a name to it.
Delegating tasks to AI can feel like a superpower at first. You can do more, faster.
But think about it for a second. When you delegate writing a tricky email, analyzing data, or even outlining a business strategy entirely to AI, what are you NOT doing? You're not wrestling with the problem, not flexing your analytical muscles, not practicing the nuances of communication.
You get the output, sure. But the process - the part where your learning and skill maintenance happen - gets skipped, over and over. And unfortunately, like any bad habit, it compounds over time.
I’ve experienced this firsthand. Over the last six months, I found myself more and more tempted to outsource parts of my writing to LLMs. I recently realized that my own ability to structure arguments and articulate ideas from scratch had noticeably weakened. The very skill that built my career. The one I was proud of (and praised for) for years.
I was becoming reliant, not augmented. It felt like my brain was getting lazy, opting for the pre-chewed solution instead of doing the work itself.
I started asking myself some questions - how do we fight this? How do we use these powerful tools without letting our own skills atrophy? I don't have a perfect solution, but along the way I've started developing a mental framework to guide my own choices. Maybe it will help you too.
It’s about consciously deciding which "mode" to operate in for any given task:
The Architect Mode
This is for when I have a clear idea but need to delegate the groundwork. I use the AI to draft outlines, structure thoughts, or summarize research. The AI builds the scaffolding, but I am the one who ensures the foundation is solid and does the creative work of finishing the building. This frees up my cognitive energy for the details that matter most.
The Sparring Partner Mode
Here, I don’t want answers; I want better questions. I use the AI to challenge my ideas, play devil's advocate, and help me see my arguments from different angles. This isn't delegation; it's active collaboration that sharpens my thinking rather than replacing it. It often surfaces things I've missed, or more interesting angles to pursue.
Think of the AI as a world-class personal trainer. A good trainer doesn't lift the weights for you. They push you to do one more rep, correct your form, and suggest new exercises to challenge you. Using AI this way strengthens your own intellectual muscles, making your final arguments more robust and well-rounded.
The No-Fly Zone
This is the most important part. I’ve started to consciously define which skills are off-limits for AI. For me, it's the final nuance of writing in my own voice, the strategic decision-making based on intuition, and the empathetic read of a complex situation. These are the skills I must protect and exercise myself, every time.
I see this as the airline captain making the final call during an emergency. The aircraft's advanced autopilot can navigate the route, analyze weather data, and manage the engines. But when severe turbulence hits or an engine shuts down mid-flight, it is the captain's experience, judgment, and intuition that are needed to land the plane safely. That final, gut-level command is the one thing you can't delegate. It’s the source of your unique value.
This isn't about rejecting the incredible leverage AI gives you. It’s about being deliberate.
It's about asking yourself before you hit "generate":
Is this task strengthening my capability, or is it creating a dependency that will make me weaker in the long run?
Your future competence might depend on the answer.
But now, I'm curious. Have you ever felt this "competence drain" in your own work?
What's one skill you would intentionally place in your "no-fly zone"?
I have experienced a similar effect related to my use of GPS. My sense of location and direction declined over the period when my GPS use increased. My ability to navigate returned after I reduced my use of GPS to an absolute minimum.
I think the brain is like a muscle, as you described it. Its ability to carry out certain tasks will decline when it’s not used.
You’re absolutely right that there is cognitive decline in becoming reliant on AI.
As you mentioned, mapping out the areas you want to improve, maintain, and cultivate is a great start, but it also requires creativity in handling the nuances and human experience that aren't always a given with AI.
I myself have tried to double down on my critical thinking, learning abilities, and social skills - competencies that AI will never do better, and more importantly, ones that sit firmly in my no-fly zone.