Something I've been noodling on... I think knowledge work, at its broadest, distills to generating ideas that convert into value.
And at an atomic level, that looks like the creation, transformation, and dissemination of tokens: novel, valuable units of thought produced (today, at least) mostly by collective human effort.
Over the weekend I was working with Claude Code and realized that in a single day I generated about 800,000 tokens. By comparison, on a normal workday without AI, I probably generate around 7,000 tokens across Slack, meetings, docs, and email, and consume about 12,000. One AI-assisted day produced roughly 40x my total daily token traffic, and well over 100x my normal output.
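For the curious, the rough math behind those multiples (all figures are my own loose estimates):

800,000 ÷ 7,000 ≈ 114x my normal daily output
800,000 ÷ (7,000 + 12,000) ≈ 42x my total daily traffic, generated plus consumed

Either way you slice it, that's a step change of one to two orders of magnitude.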
The problem has always been that most work output never converts into value for the company or the customer. Docs that do not get read. Prototypes that collect dust. Reports no one checks. Recordings no one replays.
Now it is easier than ever to generate tokens with AI. The volume of output explodes, but without alignment and context, the conversion rate to actual value will continue to drop.
It's a classic tragedy of the commons, where the shared resource being depleted is collective attention.
Every tool is incentivized to drive token count up. Slack sends more notifications to pull you back in. Reporting tools generate more views to keep you engaged. Every person is incentivized to produce more tokens to show progress. Supply grows, focus does not.
But we have very real and finite intake limits. Human capacity to absorb tokens is flat while supply explodes. AI summarization can help compress intake, but AI also multiplies output, so without system-level design the gap only widens.
The result: a dropping value yield per token generated. Only a small fraction of tokens create leverage for the team. Most are exhaust.
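One way to frame it (my shorthand, not a standard metric): token yield = tokens that actually drive a decision or outcome ÷ total tokens generated. AI inflates the denominator far faster than the numerator.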
This is where context becomes critical. LLMs are only as good as the context they have. Individuals operate with narrow context windows. Organizations hold vast repositories of past, present, and future tokens. Without systems to shape and share that context, the conversion rate of tokens to value stays low.
The challenge (and the opportunity) is organizational design. How do you maximize the value of all tokens toward end outcomes: customer impact, dollars, better decisions? How do you route and align tokens so both people and AI agents can operate with shared, high-leverage context?
This is the token economy problem. And solving it may be the most important management challenge of the AI era.
Would love to hear your thoughts.