The Invisible Cost: From Creator to Consumer
A reflection on "Cognitive Leakage" and the Law of Conservation of Cost in the age of AI.
Welcome to my Substack.
This week marks my 10th anniversary as a Technical Consultant. Over the past decade—from struggling with my first lines of C++ to architecting complex legacy migrations—I have observed a recurring and worrying pattern in our industry.
We are relentlessly pursuing “simplification” via high-level abstractions like Low-Code platforms and AI coding assistants. While these tools offer immediate speed, I believe they come with a hidden price tag—a crisis I call “Cognitive Leakage” (the term exists in psychology, but in the context of software evolution I use it to mean the atrophy of mental models).
In this article, I explore:
The Law of Conservation of Cost: Why the effort you save today by taking a shortcut will be paid back with compound interest during future refactoring.
The Creator-Consumer Singularity: How over-reliance on black-box tools degrades engineers from “Creators” into passive “Consumers” of their own systems.
The Neuroscientific Evidence: How recent research (Kosmyna et al., 2025; Oakley et al., 2025) validates that outsourcing cognition leads to the atrophy of our mental models.
This is not a rejection of AI, but a manifesto for maintaining “Cognitive Sovereignty” in an automated world.
1. A Decade’s Cycle
It has been exactly ten years since I started my internship at Thoughtworks during my junior year of university. To mark this milestone, I prepared this article.
The inspiration for this piece came quite serendipitously. A few days ago, our project team held a Weekly Meeting where everyone was asked to introduce themselves. Two questions struck me to the core: “When did you write your first line of code?” and “How did you get to where you are today?”
These questions plunged me into deep contemplation.
I thought back to my university days and the embarrassment I felt during my first programming course, “C Programming Language.” Back then, I didn’t even understand what `printf` was or why a statement had to end with a semicolon. It wasn’t until later, when I discovered WinForms, that I first felt the joy of programming—the sheer convenience of dragging and dropping controls to generate a window was mesmerizing.
However, as I joined Thoughtworks and grew from an intern into a Senior Technical Consultant, I noticed an interesting “regression” in my preferences: I began to gradually reject the visual interfaces that once felt so “convenient,” turning instead to an extreme reliance on command lines and raw code. Even in my choice of programming languages, I shifted from preferring “automatic” high-level languages to favoring C++ — a “manual transmission” language that requires manual memory management.
Looking back on this decade, I suddenly realized a recurring pattern: an extreme dependency on “visual/black-box” tools when starting from zero, followed by an extreme regression to “low-level/white-box” tools upon becoming an expert.
Over the last seven years, I have traveled to various client sites as a Technical Consultant. I have seen countless enterprises attempting to introduce highly encapsulated abstraction layers, trying to bring modern software development back to that “state of convenience” I experienced with WinForms in college. Watching their excitement over this “simplicity,” I became increasingly wary.
I vaguely sensed that behind this “convenience,” there lay a hidden cost—one that we had not yet paid, but would eventually have to repay. And this has been my greatest discovery of the decade.
2. From “Leaky Abstractions” to “Cognitive Leakage”
I have worked as a technical coach and a QA consultant, and I have led large-scale system refactoring. Yet, later in my career, I set myself a goal that seemed somewhat “retro”—to become a C++ Technical Consultant. Many people asked me why. In the past, I could only vaguely answer that I found it “interesting.” It wasn’t until recently, when reflecting on this article’s title, that I realized my obsession with C++ stems from the same source as my preference for the command line: a sense of control.
When writing C++, we are forced to confront underlying details like memory management and pointer references. It is a process of pain and pleasure—take the notorious “memory leak,” for example. This got me thinking: since memory can leak, do similar “leaks” exist in other areas of software engineering?
Not only do they exist, but they are also omnipresent.
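Before generalizing, the memory-leak example is worth making concrete. Here is a minimal C++ sketch (the function names are mine, purely illustrative): the manual version is correct only as long as every code path remembers the `delete[]`, while the RAII version moves that knowledge into the type itself.

```cpp
#include <cassert>
#include <cstddef>
#include <memory>
#include <numeric>

// Manual ownership: correct only if every path remembers to delete[].
// Add an early return or a throw before the delete[] and memory leaks.
int sum_manual(std::size_t n) {
    int* data = new int[n];
    for (std::size_t i = 0; i < n; ++i) data[i] = static_cast<int>(i);
    int total = std::accumulate(data, data + n, 0);
    delete[] data;  // the cleanup knowledge lives only in the programmer's head
    return total;
}

// RAII: the unique_ptr destructor encodes the cleanup, so no code path
// can leak, even when an exception unwinds the stack.
int sum_raii(std::size_t n) {
    auto data = std::make_unique<int[]>(n);
    for (std::size_t i = 0; i < n; ++i) data[i] = static_cast<int>(i);
    return std::accumulate(data.get(), data.get() + n, 0);
}
```

The point is not the lines saved; it is where the knowledge resides. In `sum_manual` the pairing of `new[]` and `delete[]` is a fact the developer must carry; in `sum_raii` the type carries it, and that whole class of leak disappears.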
Take automated testing as an example. We have a concept called “Testing Gaps.” When we cover all functions with unit tests, is the system safe? No, because the connections between functions haven’t been tested—this is the leakage of unit testing. To plug this leak, we introduce integration tests; integration tests have their own gaps, so we introduce end-to-end (E2E) tests; E2E tests still have blind spots, so we introduce manual testing. And what manual testing misses eventually becomes a bug in production.
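The first gap in that funnel is easy to demonstrate. In this deliberately buggy sketch (all names hypothetical), each function passes its own unit test, yet the seam between them is wrong: the producer speaks kilometers while the consumer assumes miles.

```cpp
#include <cassert>

// Unit A: reports a trip distance, in kilometers.
double trip_distance_km() { return 100.0; }

// Unit B: estimates fuel needed, documented as taking *miles* (25 mpg).
double fuel_needed_gallons(double miles) { return miles / 25.0; }

// The integration wires A straight into B without converting units.
// Both units are individually "correct"; the connection is not.
double fuel_for_trip() { return fuel_needed_gallons(trip_distance_km()); }
```

`trip_distance_km()` really does return 100, and `fuel_needed_gallons(100.0)` really does return 4.0, so both unit tests are green. But 100 km is about 62 miles and needs roughly 2.5 gallons, not 4. The defect lives between the functions, exactly where unit tests cannot see.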
This “funnel” effect reveals a universal law: No means of encapsulating underlying complexity can achieve perfect shielding.
When I excitedly researched this, I discovered that Joel Spolsky had proposed the famous “Law of Leaky Abstractions” back in 2002: “All non-trivial abstractions, to some degree, are leaky.” TCP tries to encapsulate the network, but leaks when congestion occurs; SQL tries to encapsulate queries, but leaks when full table scans happen.
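The law is easy to reproduce without even leaving a single language. `std::vector` encapsulates manual arrays, yet the abstraction leaks: a `push_back` beyond capacity reallocates the storage, and any address saved earlier silently goes stale. A minimal sketch:

```cpp
#include <cstdint>
#include <vector>

// Returns true when growth past capacity moves the vector's storage.
// We record the old address as an integer so we never compare (or touch)
// a dangling pointer; dereferencing the old address would be undefined behavior.
bool storage_moved_on_growth() {
    std::vector<int> v;
    v.push_back(42);
    const auto before = reinterpret_cast<std::uintptr_t>(v.data());
    // Fill to capacity, then push once more to force a reallocation.
    while (v.size() < v.capacity()) v.push_back(0);
    v.push_back(7);  // exceeds capacity: elements are moved to new storage
    return reinterpret_cast<std::uintptr_t>(v.data()) != before;
}
```

An expert who has internalized the underlying model expects this; a developer who only knows the `vector` interface meets it as a bizarre heisenbug. That asymmetry is precisely where the human side of the leak begins.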
If I were my ten-years-younger self, I might have stopped there, marveling that “the master was right.” But after seven years of consulting, watching countless enterprises sink into the quagmire despite introducing highly encapsulated tools, I vaguely felt that the theory of “Leaky Abstractions” was still missing a piece of the puzzle.
What was missing? The “Human” element.
“Leaky Abstractions” describes the physical properties of the tool (object): because the tool is imperfect, details inevitably leak out. But we ignored the state of the engineer (subject): when those details leak out, does the person sitting in front of the screen still have the ability to handle them?
This is the core concept I want to propose — “Cognitive Leakage”.
We often assume that high-level abstractions (such as AI coding, Magic Frameworks, automated test generation, and Low-Code) lower the barrier to entry because they mask technical details. But at the same time, they also mask “Process Knowledge.”
For experts, high-level abstractions are efficient. Their brains have already internalized the translation path from “code” to “structure.” When they see a graphical component, they can mentally map out the underlying logic flow.
For beginners or developers who are overly reliant on tools, the situation is entirely different. When the tool performs all the “dismantling, abstraction, and analysis” for you, you effectively lose the opportunity to build your “Mental Meta-Model” for problem-solving.
This is why I believe Low-Code or AI-generated code is perfect for “coding for kids” or building simple “glue tools” (toys)—because these scenarios are simple enough that they don’t require deep cognitive dismantling. But in an enterprise’s core business domain, in complex systems with non-linear logic, once a “Leaky Abstraction” occurs (such as a bizarre performance jitter), the developer who lacks a bottom-layer cognitive model instantly degenerates from a “Creator” into a “Helpless User.”
If “Leaky Abstractions” means tools cannot perfectly encapsulate underlying complexity, then “Cognitive Leakage” is the disuse atrophy of human problem-solving abilities caused by an over-reliance on tools.
You think you are saving time by not learning the underlying details, but in reality, you are merely overdrafting your future cognitive ability. This is the beginning of “Cognitive Leakage.”
3. The Law of Conservation of Cost
Over the past seven years, I have handled numerous legacy system refactorings for clients and have been deeply involved in various delivery models. Through these experiences of “archaeology” and “firefighting,” I found that so-called “technical debt” is often not due to poor technology itself, but rather the system’s inherent complexity.
We often speak of “Tesler’s Law” (The Law of Conservation of Complexity): in the spatial dimension, a system’s inherent complexity cannot be reduced; you can only shift it from the application layer to the platform layer, or from code to configuration.
But beyond this, I have observed another, perhaps more ruthless law, which I call the “Law of Conservation of Cost.” This is the projection of complexity onto the time dimension.
When we examine a project or system at any given point in time, the total amount of cognition required is conserved. To build a core business system, you must pay the corresponding “Cognitive Cost” — the mental labor of understanding the business, dismantling logic, designing architecture, and handling edge cases.
Any “shortcut” technology you adopt (Low-Code, GenAI) cannot eliminate this cost; it can only change the time of payment:
Cognitive Shift Left: During the development phase, we choose to face complexity head-on (writing code, writing tests, DDD modeling). We pay a high price in time and brainpower upfront to build a complete mental meta-model. This may look slow, but we amortize the “cost of understanding” over every single day.
Cognitive Shift Right: We choose to use highly encapsulated tools, skipping tedious details in pursuit of “rapid delivery from zero.” We save costs upfront and get a sense of instant gratification. However, the complexity that should have been digested by the developer during the design phase does not disappear.
This introduces a hidden variable to the “Iron Triangle” of delivery. When time and resources are fixed, and we force an increase in delivery speed by introducing “black-box tools,” what is saved is not the volume of code, but the “Cognitive Volume.”
This undigested complexity accumulates like ghosts in the dark corners of the system. This process of accumulation and forgetting is exactly what we define as “Cognitive Leakage.”
At first, it manifests as bugs during testing; then, as bizarre glitches in the production system; finally, it evolves into a “black-box legacy system” that no one dares to touch and no one understands — known in the architecture world as the notorious “Big Ball of Mud.”
This is Conservation of Cost:
Total Cost of Software Lifecycle = Current Implementation Cost + Future Cognitive Repurchase Cost
Today’s highly abstracted tools, such as Low-Code and AI programming, drastically compress the “Implementation Cost” on the left side of the equation. However, this inevitably causes the “Cognitive Repurchase Cost” on the right side to rise exponentially.
When the day comes that we are forced to maintain this behemoth, we will discover that we must repay all the cognition we “saved” through shortcuts — with compound interest.
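The equation above can be restated a little more formally. This is my own notation, a rough model rather than a derived result:

```latex
C_{\text{total}} = C_{\text{impl}} + C_{\text{repurchase}},
\qquad
C_{\text{repurchase}} \approx C_{\text{skipped}} \cdot (1 + r)^{t}
```

where $C_{\text{skipped}}$ is the cognitive cost avoided during development, $r$ is an interest rate set by how quickly the team’s memory of the system decays, and $t$ is how long the undigested complexity sits in the codebase. The compound-interest phrasing is deliberate: the repurchase grows multiplicatively with time, not linearly.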
4. The Tipping Point: From Creator to Consumer
As the abstraction levels of tools, platforms, and frameworks continue to rise, the mindset of the high-level Developer (Creator) facing a system failure is undergoing a subtle qualitative change. The feeling is less like a confident engineer calmly stepping through a failing automated test, and more like a helpless User (Consumer) watching an App crash on their phone.
I am not questioning the quality of developers, nor am I blaming anyone for not liking to learn. On the contrary, I believe this is a “physiological and subconscious inevitability.” The Cognitive Leakage I describe often happens unconsciously—it is driven by memory decay over time, the “content with superficial understanding” attitude induced by convenient tools, and the “cognitive silos” formed between people as systems expand.
Take myself as an example. When writing an IME app, once I discovered that AI could generate code that compiled and looked decent, I instinctively pasted it in wholesale rather than checking it line by line. Later, when that code ran into issues—because I had supplied only the abstract logic and ignored the implementation details—I was more inclined to paste the error straight back to the AI. Only when the AI failed to solve it was I forced to spend several times the effort combing through the details and analyzing the cause from scratch. This is the “Forced Repurchase” that follows Cognitive Leakage.
Why am I so certain about this? Because the “relationship between a User and an App” is exactly the ultimate form of this extreme abstraction.
Imagine a developer facing a massive “Big Ball of Mud.” Paralyzed by fear and daring not to modify the original logic, they can only add new features by “Copying & Pasting” new paths. At this point, this developer’s effective cognition of the system approaches zero.
In this situation, between “re-learning” and “copy-pasting,” almost everyone chooses the latter. Because as the codebase grows from one line to a million, the cost of learning and memorizing details is not a simple linear growth, but an exponential explosion. Facing this massive “Cognitive Wall,” not everyone has the ability or courage to climb over it.
So I ask: what is the difference between this developer who can only “click and copy” and the Consumer sitting at home tapping on an App?
The cost of simplicity is often future complexity; the cost of speed is often future sluggishness.
In the field of software development, we have finally discovered these two parallel iron laws:
Conservation of Complexity leads to Leaky Abstractions;
Conservation of Cost corresponds to Cognitive Leakage.
Imagine a pilot who has forgotten how to fly, sitting in a cockpit where all the gauges and controls have been replaced by a single “AUTO” button. As long as the weather is clear, he feels like a Captain. But when the storm hits, he realizes in horror that he is merely a passenger in the captain’s seat.
This is the ultimate form of extreme abstraction: the degeneration from Creator to Consumer. And Cognitive Leakage is simply the result of that button working for too long, causing us to completely forget the process of flying.
5. The Essence of Refactoring: Cognitive Repurchase
Let us consider a classic engineering puzzle: Why do teams facing a “Big Ball of Mud” architecture often vacillate between “Rewrite” and “Refactor,” yet frequently fail at both? Either the refactoring plan fizzles out, or the rewritten system quickly degenerates into yet another Big Ball of Mud.
We usually attribute this to system entropy or accumulated business complexity. But in my view, it is because we have incurred a massive, invisible debt through our past development—driven by the pursuit of short-term speed and the use of high-abstraction tools.
This debt is the “Cost of Cognitive Repurchase.”
5.1 The Trap of Simplicity: The Exponential Wall of Difficulty
Looking at the evolution of programming languages, every layer of abstraction increases simplicity.
In system-level development (like C++), developers are forced to confront low-level details, compelling them to build robust Mental Meta-Models.
In highly abstract development (like Low-Code or AI-assisted programming), tools promise convenience with “zero prerequisites.”
This convenience brings a fatal non-linear risk: It seems that 99% of problems can be easily solved by “dragging and dropping” or “prompting” (because encapsulation works well). However, once you encounter the 1% of problems that touch the underlying logic—the “leaks”—the difficulty of solving them skyrockets exponentially.
Because developers have lived in “Easy Mode” for so long, they have lost the ability to analyze underlying problems. This “Cognitive Leakage” caused by “excessive abstraction” makes solving that remaining 1% an insurmountable challenge.
5.2 Organizational Amnesia: The Placebo of Adding People and Pressure
When an organization adopts extreme abstraction (such as blindly trusting AI coding or “zero-prerequisite” Low-Code) to implement core systems, “Cognitive Leakage” is no longer just an individual hazard; it becomes an organizational cancer.
As the system scales and abstraction leaks become frequent, the first reaction of many managers is to: add people, add time, or add KPIs. However, this is not only ineffective but harmful.
Adding People: Bringing in more newcomers who haven’t established underlying cognitive meta-models only dilutes the team’s average cognitive density.
Adding KPIs: Forces developers to cover up problems rather than piercing the abstraction to solve them.
Even if the organization suppresses the issues with short-term “patches,” the system will eventually lose its ability to evolve. This is because the entire organization has suffered “Collective Amnesia” — the loss of the knowledge behind its own requirements. No one knows why this line of code was written, nor does anyone understand the business meaning behind that configuration.
5.3 The Truth of Refactoring: Repayment with Compound Interest
At this stage, the organization is often forced to choose between “Rewrite” or “Refactor.” It is only then that we painfully realize: rewriting isn’t just about writing code; it’s about recovering the lost requirements, logic, and context of the past.
This is the ultimate manifestation of the “Law of Conservation of Cost”: The high cost of refactoring is not essentially for typing code, but for paying the “Cognitive Repurchase Fee.”
We are paying money to buy back the cognition that wasn’t built because we took shortcuts;
We are spending time repairing logic chains that broke due to “contentment with superficial understanding”;
We are rebuilding test safety nets that atrophied due to “over-encapsulation.”
This painful process is the act of repairing organizational and individual Cognitive Leakage. Every penny of cognitive cost we thought we “saved” back then is now marked with high interest, waiting for us to repay it — with compound interest.
6. The Dialectics of Abstraction: Principles and Boundaries
After expounding on the risks of “Cognitive Leakage,” I want to clarify one thing: My proposal of this theory is by no means intended to categorically deny the value of abstraction tools.
Whether frameworks, libraries, Low-Code, visualization tools, or even AI-assisted programming, they possess indisputable value in specific scenarios. For Prototyping, small-scale systems, non-core business domains, or edge scenarios with low requirements for performance, security, and stability, abstraction and convenience are powerful enablers of efficiency. In these areas, using a “Toy Mindset” to rapidly build glue code makes perfect economic sense.
However, Abstraction & Convenience ≠ Universal & Omnipotent.
We must be wary of misapplying “tactical convenience” to the “strategic core.” For systems involving the Core Domain, high business complexity, long lifecycles, or multi-team collaboration, abstract tools must be used with extreme prudence.
This does not mean completely banning tools in such systems, but rather adding “Guard-rails” and “Governance” mechanisms to them.
When using AI, we must enforce mandatory Code Reviews to ensure cognitive synchronization.
When using Low-Code, we must ensure the generated logic remains within the range of our cognitive control.
My stance is not a dogmatic rejection, but a trade-off based on “Context”:
When the system is simple and the lifecycle is short, we can lean towards abstraction and convenience, because the cost of “Cognitive Leakage” is negligible—it is an acceptable “bad debt.”
But when the system is complex and the lifecycle is long, we must return to an emphasis on underlying structure, mental models, and absolute control.
Efficiency and long-term health are not contradictory, but they do not coexist automatically. Only through conscious governance and the deliberate defense of Cognitive Sovereignty can we enjoy the convenience of tools while avoiding becoming their slaves.
7. Reclaiming Cognitive Sovereignty in an Era of Rising Abstraction
In today’s rapidly accelerating technological landscape, everyone talks about Tech@Core. Yet, when faced with seductive buzzwords like “low barrier,” “fast pace,” and “zero cost,” it is hard not to be swayed. This is the human instinct to seek the path of least resistance; there is no need to deny it.
However, before every time you press “Copy & Paste,” and before every decision to adopt a “black-box tool,” it is both worthwhile and necessary to ask yourself one question: “What is the cost of doing this?”
Because in the world of software engineering: a “low barrier” achieved through heavy abstraction usually means the details have been distorted; extremely simple operations usually mean the logic has been sealed inside a black box; and the so-called “zero cost” is nothing but an outright lie.
All systems with long lifecycles bear inevitable costs during their development and evolution. This is the certainty of information entropy, and the iron law of the Conservation of Cost.
Humanity has thrived not by fearing difficulties, but by facing challenges head-on. In this age of advanced tools, laziness can be a momentary respite, but it must never become a permanent creed.
Maintaining your “Cognitive Sovereignty” is not only the whetstone for an excellent engineer but also the only preservative to fight system decay and sustain product vitality.
As engineers, especially in this era, we should deliberately practice, deconstruct, and understand.
Please be the Creator of the system, not the Consumer.
8. Postscript & Preview
As an engineer who has ploughed the fields of code for a decade, I have always sought a more precise way to describe the nature of this “chaos.”
In my next article, I will detail the “Instantaneous Code Entropy Model” that I proposed six years ago. It is a semi-quantitative System Dynamics model designed to reveal the non-linear cost accumulation in software evolution.
I will attempt to plug the concept of “Conservation of Cognitive Cost” discussed today into that differential equation, and combine it with Little’s Law to derive a mathematical conclusion spanning the entire software engineering lifecycle:
How “Cognitive Leakage” leads step-by-step to “Entropy Increase” at the physical layer, and ultimately to the “Collapse of Delivery Rate” at the management layer.
Stay tuned.
References & Further Reading
This reflection builds upon foundational software engineering principles and is supported by emerging research in neuroscience and human-computer interaction. Below are the key works that provide empirical evidence for the concepts of “Cognitive Leakage” and “Conservation of Cost.”
Kosmyna, N., Hauptmann, E., Yuan, Y. T., et al. (2025). Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task. MIT Media Lab.
Provides physiological evidence (EEG data) supporting the “Creator-Consumer Singularity,” showing that AI usage weakens neural connectivity and reduces the sense of ownership over the output.
Oakley, B., Johnston, M., et al. (2025). The Memory Paradox: Why Our Brains Need Knowledge in an Age of AI. In The Artificial Intelligence Revolution: Challenges and Opportunities.
Validates the mechanism of “Cognitive Leakage” by explaining how “metacognitive laziness” and offloading prevent the formation of mental schemata required for deep problem-solving.
Lee, H., Drosos, I., Sarkar, A., et al. (2025). The Impact of Generative AI on Critical Thinking: Self-Reported Reductions in Cognitive Effort and Confidence Effects From a Survey of Knowledge Workers. CHI ’25.
Supports the “Law of Conservation of Cost,” illustrating how higher confidence in AI tools correlates with reduced critical thinking effort, effectively shifting the cost from creation to stewardship.
Spolsky, J. (2002). The Law of Leaky Abstractions. Joel on Software.
The foundational concept upon which “Cognitive Leakage” expands—shifting the focus from the imperfections of tools to the degradation of the user’s capability.
Beck, K. (2025). Why Does Development Slow? Tidy First? (Substack).
Explores the concept of software design as “buying optionality,” aligning with the view that refactoring is an upfront cognitive investment to prevent future system decay.
Sarkar, A. (2024). How to stop AI from killing your critical thinking. TEDx Talk.
Introduces the distinction between “visiting” an idea vs. “inhabiting” it, which parallels the distinction between minimizing Implementation Cost and building Orderliness.


