The fugu guide to jobs in a world of AI
Stop measuring AI by the tasks it fails at. Start noticing the systems it breaks.
In Japan, a licensed fugu chef occupies a unique position in the food economy.
Preparing this pufferfish requires years of training and certification, because eating the dish comes with the risk of fatal poisoning. You would expect that a meal with the possibility of killing you wouldn’t have any takers.
This is the fugu paradox. People pay a premium not despite the risk, but precisely because of it.
The chef’s role is less about culinary creativity and more about the ability to execute without error. Only chefs who have undergone years of training and earned a special license are allowed to serve the fish. Eating fugu signals bravery, perhaps, but, more importantly, it signals access to a rare culinary experience, served only by the most highly trained of chefs.
It’s tempting to treat all this as one of Japan’s many oddities, ranging from train pushers (staff who literally push passengers into packed rush-hour trains) to square watermelons (grown in boxes for easy stacking and packing).
But it’s really a story of how roles change when the associated scarcities change.
Traditionally, seafood value came from physical scarcity. The harder it was to source a fish, the more expensive it was. Logistics and ecology were constraints that made certain forms of seafood luxury goods.
This changed with the arrival of cold chain logistics. By making refrigeration and global transport more reliable, it normalized access to rare fish, and collapsed that scarcity. Eating seafood no longer signalled luxury, simply because seafood wasn’t scarce anymore.
As seafood itself lost its status as a luxury good, something interesting happened.
Fugu - carefully prepared seafood that would otherwise be fatal - rose in value as a luxury good.
Most luxury foods are defined by scarcity of supply, and fugu’s scarcity lies in access to highly trained chefs. The fish itself is abundant; what’s rare is the skill required to prepare it without killing the diner.
Most people don’t actually expect to die from fugu. In Japan, the preparation is so tightly regulated that deaths are super-rare. But the possibility, even if remote, elevates the experience. And the fact that you’re being served by a highly trained chef signals luxury.
The constraint had shifted to preparation skill. You don’t eat fugu because the fish itself is rare. The ability to reliably prepare the dangerous pufferfish is really the new constraint that gives fugu its value.
When constraints in a system change, and what was previously scarce becomes abundant, value shifts within the system. Some forms of labor lose value as the underlying constraint disappears. Others gain value precisely because they resolve what the new system cannot make abundantly accessible.
The fugu chef, today, holds a premium position in Japanese culinary arts. In a globalized seafood market, where most roles have been commoditized, the fugu chef’s role has paradoxically become more valuable.
Much like cold chain logistics removed seafood scarcity, AI changes the scarcity associated with certain types of knowledge work. But in doing so, it will expose new bottlenecks and constraints. These are points in the system where trust, context, and interpretation become the new scarcities. Some jobs will vanish or lose value. But a few, like the fugu chef, will paradoxically gain value.
Understanding how roles change in response to shifting constraints is critical to navigating this transition and identifying new opportunities.
Fugu and the future of work
Every role in an organization exists to resolve a constraint.
This is easy to miss because we associate roles with fixed titles like engineer, doctor, or teacher. But every such role exists to move work along, and what prevents work from moving along is some form of constraint. Remove or shift the constraint, and the logic for that role starts to break down.
The role unbundles when tasks that needed to be performed together no longer need to coexist in the same role. Sometimes this is driven by technology. When automated checkout systems eliminated the need for cashiers to handle both payment and bagging, the cashier wasn’t displaced, but the role was unbundled, and cashiers migrated to other tasks like troubleshooting customer checkouts or upselling items. Other times it’s structural. When Covid-driven remote work decoupled collaboration from location, managerial roles were unbundled and rebundled around collaboration tools like Miro, Slack, and Notion. In both cases, the role as a bundle of co-dependent attributes falls apart.
But as old constraints are removed, value shifts to new constraints.
In the case of the fugu chef, the constraint shifted from procurement of fresh pufferfish to managing risk while preparing the fish.
The new roles that emerge rebundle around the new constraint, often combining a different mix of capabilities.
The newly scarce jobs
The mistake people often make in responding to AI is assuming they are competing with the machine.
Instead, they should be asking:
Where is the machine creating a new constraint that only a human can resolve?
In other words:
Don’t just look at what AI can’t do.
That will put you in a race against an ever-improving machine.
Look, instead, at what it breaks in the system.
If an AI can generate thousands of marketing variants per hour, the constraint shifts to human discernment: who decides what’s likely to land emotionally? If an AI assistant can draft legal memos, the constraint moves to the lawyer’s judgment in spotting where the model has overgeneralized or hallucinated.
With improvements in AI, the role of the radiologist has changed. She no longer gains value by identifying abnormalities in scans alone; AI can do that with remarkable accuracy. Instead, her value lies in interpreting corner cases, communicating risk to patients, and working in case groups with other specialists to resolve complex cases.
Such role shifts happen whenever the system changes. The arrival of GPS-aided navigation transformed the role of the driver from someone who plans a route to someone who looks out for exceptions where the machine gets it wrong and interprets when and how to act on the machine’s guidance.
To remain valuable, individuals and organizations must learn to spot where the constraint has moved, and redesign roles around it.
You might be fascinated with everything AI can increasingly do. But the roles of the future are found in the coordination gaps and new constraints created, not despite, but precisely because of, AI’s relentless execution.
Now that we understand how value migrates when constraints shift, we need to understand what kinds of constraints systems actually face. Because if we want to predict how roles will transform, we first need to map the nature of the constraints themselves. That’s where we turn next.
Rethinking value
In an earlier post, we explored the gap between intrinsic value and economic value: the idea that some forms of work may be meaningful, even vital to human dignity, yet remain economically undervalued.
From Humans as luxury goods in the age of AI:
First, economic value requires scarcity of supply.
Air is vital to life. Its intrinsic value is infinite.
But because it’s abundant, it has no economic value in most cases.
Of course, if you’re going scuba diving, it’s no longer abundant and now commands economic value as compressed air.
Second, economic value requires relevance to demand.
A soldier at war carries a locket with a photo of his family.
To him, that locket is priceless. It reminds him why he’s fighting. It has infinite intrinsic value to him.
But on the open market, the locket might be worth almost nothing, just a piece of metal.
If he lost it, no amount of money could substitute for its value to him, but to others, its economic value is low.
That’s the difference between intrinsic value and economic value.
Teaching, caregiving, and community organizing often fall into this category. They create intrinsic value by fostering connection and growth, but lack economic value because markets fail to price what they can’t easily exchange.
Intrinsic value is about meaning; economic value is about exchange potential.
But there’s another distinction worth making. And this is particularly relevant in an age where AI is restructuring how work gets done.
It’s the distinction between economic value and contextual value.
Contextual value is a measure of how crucial a task is to the performance or stability of a larger system. It’s not about whether a task is hard to do or emotionally meaningful; it’s about whether it is structurally indispensable to a system operating under certain constraints.
When constraints change, the contextual value of tasks in the system changes.
For instance, if AI changes how workflows are executed, certain tasks within those workflows lose contextual value and others gain it.
Data labelling in machine learning is a great example. Prior to the early 2010s, tagging images or annotating text held virtually no value. It was viewed as a low-skill, manual chore. Arguably, it had low intrinsic value. But when supervised learning models became the dominant approach in AI, the constraint in the system changed.
The bottleneck was no longer in computing power or model architecture, but in access to labeled data. Annotators now determined model accuracy. Their work carried high contextual value, and determined the performance of the larger system. Even if the intrinsic value had not changed, the contextual value was now much higher.
And yet, in most cases, these annotators remained poorly compensated. Their economic value didn’t rise in proportion to their contextual value, for one simple reason: while the system depended on them, access to this skill was not scarce.
When a system evolves, new roles emerge around new constraints, and those roles may increase in contextual value. In some cases, this leads directly to high economic value. But in other cases, as with data annotation, contextual value rises without corresponding economic benefits.
Pre-order my upcoming book at 70% off
This post is based on ideas from my upcoming book Reshuffle.
Reshuffle is now available for pre-orders. All pre-orders leading up to the launch date are at 70% off. (Launching June 2025)
Pre-orders are Kindle only.
Hardcover, paperback, and audiobook versions will be available at launch.
The fugu equation
Understanding this distinction matters because it shapes how we think about fairness and leverage.
This is why the fugu chef is such an important illustration. The fugu chef has always had high contextual value. The pufferfish is easily sourced, but unless prepared with high skill, that one meal could be your last.
When the constraint in high-end dining moved away from access to rare ingredients, fugu introduced a new constraint. The fish itself is not rare. The danger lies in its preparation. One slip of the knife, and the meal becomes fatal. The contextual value of the fugu chef is therefore very high.
And so is the economic value.
First, when most other dining experiences became commonplace with easy access to rare fish, access to a fugu chef’s culinary skills gained very high signalling power. That increased the demand for such experiences.
At the same time, the supply of fugu chefs is tightly controlled. The chef must undergo multi-year training, pass rigorous government licensing, and operate in an environment that publicly signals their expertise. The restaurant displays their license. The diner knows they’re trusting that certification with their life.
Fugu chefs in the age of AI
As we reimagine our work alongside AI, we need to look for roles that command both high contextual value and high economic value.
Economic value ensures you get paid well. And contextual value ensures you are critical to the new system.
People make the mistake of looking for tasks machines can’t do to make a case for work that will still require humans. This is an outcome of the automation fallacy - let’s retreat to that which cannot be automated.
Instead, to be valuable, you need to position yourself in a way that the new system can’t move forward without you. And ensure that the position is not only contextually vital but also economically visible. Recognizing the gap between these two forms of value, contextual and economic, is the first step in designing better systems, and in reimagining your role.
When broad humans beat narrow AI
The most valuable roles are those that command economic value while ensuring high contextual value. These roles are rare and not easily displaced. They are, accordingly, paid very well.
In the early days of stock trading, fortunes were made by those who could out-shout or out-signal the competition on the floor of the New York Stock Exchange.
But as digital infrastructure matured and algorithmic trading took over, machines, which didn’t get tired and never made emotional trades, started outperforming human traders. Value shifted as traders who had once thrived on reflexes now found themselves redundant. In their place, new roles of algorithmic strategists and behavioral signal analysts emerged. The tasks of buying and selling were still central to the system, but the human role had transformed from performing the trade to interpreting the market system and identifying second-order patterns the models couldn’t see.
Judgment becomes more valuable in a world of frictionless execution. This is why misplaced prophecies about radiologists becoming redundant don’t actually play out. Radiologists who were paid to examine scans and identify anomalies realized they had a new role when AI could identify tumors better than they could. Their role had shifted to judgment - to deciding what the scan meant in clinical context and what next steps should be triggered given the larger context of the patient. Machines were better than radiologists at image classification. But radiologists now hold high contextual value in the new system, thanks to their clinical judgment and understanding of the larger patient context.
These are examples of role migration where the new role has high contextual value as well as high economic value - the ideal place you want to end up in. And you don’t get there through ‘reskilling’. You get there by understanding what the AI cannot understand - the larger complex system within which it is operating.
What ties all of these examples together is a shift in the decision constraint. When data and information are abundantly available and agentic execution takes over slow and inefficient human execution, knowing what to do based on how you interpret the facts is increasingly valuable. The new roles that emerge are rebundled around judgment.
This transformation has nothing to do with reskilling in the narrow sense. It is about understanding broader contexts that narrowly efficient AI systems are not very good at figuring out.
When multidimensional humans are invisible to unidimensional AI
For centuries, caravans have moved salt across the Sahara.
The salt itself is abundantly available, and is extracted from mines in the north and traded for goods in the south. But moving that salt is a system constraint.
Navigating the shifting dunes of the Sahara, with their invisible hazards and vanishing waypoints, where GPS-based navigation fails to make sense of a landscape that is always in motion, is non-trivial.
Tuareg guides have made these trade routes viable for centuries. These nomads navigate by memory and intuition, aided by generations of ecological awareness of the region. They hold the keys to navigation.
Yet, even though they hold high contextual value, their economic value is limited.
Because economic value depends not just on how essential your role is, but on how visible, concentrated, and hard to substitute your leverage is in the market.
The Tuareg worked in a system where their coordination was distributed across fragmented individuals, not centralized in institutions. They had no access to institutional bargaining that would have helped their rare skill command its market price. Further, the value they created in the salt trade was captured further downstream by markets and middlemen, far removed from the Tuareg navigators themselves.
They were neither visible, nor coordinated, nor close to value capture. The Tuareg remind us that contextual value is necessary, but not sufficient for economic reward.
In the early 2000s, global food retailers began demanding end-to-end traceability from their suppliers. But for many crops, like cocoa, coffee, or cotton, the supply chains were informal and localized. Smallholder farmers and local middlemen became the eyes and ears of the traceability system as they managed tagging and compliance with certifications. The digital traceability systems that they fed data into depended on their inputs, but rarely rewarded them in proportion to their value. The same issues that had plagued the Tuareg - lack of visibility in the system, lack of institutional power, and distance from value capture - plagued these farmers as well.
As AI takes over system-wide coordination, such local actors become more valuable than ever. Logistics platforms like Uber Freight may manage trucking routes and job pricing, but when a truck gets stuck in a blocked alley or runs into a broken gate code, it’s the driver who resolves it. In AI-enabled warehouses, robots do the heavy lifting, but human pickers still handle the exceptions, taking care of the mislabeled bin or the damaged item. As AI-enabled execution scales, these edge cases only multiply.
The role of such local actors becomes central to system resilience. They coordinate between machine execution and the complexity of the environment in which machines operate.
What makes these roles valuable is not that they do what AI cannot, but that they adapt in ways the AI cannot anticipate.
Yet despite their criticality, these actors often remain economically undervalued.
Like the Tuareg guides of the Saharan salt caravans, whose deep environmental knowledge was critical to navigation but whose compensation was decoupled from the trade value they enabled, today’s warehouse workers, delivery drivers, and field technicians operate in roles where contextual value is high, but control is low.
They resolve breakdowns and absorb variability, yet they do so in ways that are distributed, informal, and largely invisible to the system’s central intelligence. The system tracks GPS pings and swipe-ins and swipe-outs at the warehouse, not the hundreds of edge-case judgments the human makes throughout the day.
Because they don’t sit at a leverage point where pricing is negotiated or profits are captured, their labor is treated as a cost to be minimized, not a capability to be invested in.
The system needs them to adapt, but it isn’t built to reward them for doing so.
Find your fugu
At the beginning of this journey, we met the fugu chef, whose role gained value from the new scarcity in the system - access to a rare dining experience that had the potential to kill you but most likely wouldn’t. The chef had high contextual value.
Yet, as we’ve noted since, contextual value does not automatically translate into economic value. Your role may be indispensable and yet not be rewarded. Like the Tuareg caravan guides or the cleanroom janitors in semiconductor fabs, you may perform mission-critical work and yet never show up in the performance review of the algorithm.
Economic value comes from what is recognized and rewarded in the new system.
The mistake is assuming that being essential is enough. It’s not.
So don’t just ask what AI can’t do. Ask what new constraints it creates.
Don’t just look for the human touch that AI can’t provide. Look for what the system can’t yet coordinate, and ensure that you’re positioned in a way that makes your value visible and tradable.
That’s your fugu - your system-critical intervention that makes it impossible to ignore you.
And then, move as close to the point of value capture as possible to ensure that your value creation is traded appropriately. And that you get paid as you should.
Find your fugu. Then make sure you’re the one serving it.
Reshuffle in Europe and the Caribbean
I’ll be traveling on a speaking tour across the Caribbean (late June) and Europe (July-August) on the ideas covered in my book Reshuffle.
If you’d like to discuss a speech at your company, please write in to liz@platformthinkinglabs.com or just send in a reply to this email.