Although AI assistants are now deeply embedded in society, there has been limited empirical study of how their usage affects human empowerment. We present the first large-scale empirical analysis of disempowerment patterns in real-world AI assistant interactions, analyzing 1.5 million consumer Claude.ai conversations using a privacy-preserving approach. We focus on situational disempowerment potential, which occurs when AI assistant interactions risk leading users to form distorted perceptions of reality, make inauthentic value judgments, or act in ways misaligned with their values.
Technology companies are adding a fourth component to engineering compensation: salary, bonus, options, & inference costs. Levels.fyi pegs the 75th percentile software engineer salary at $375k. Add $100k in inference & the fully loaded cost is $475k. That's 21% in tokens.
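The arithmetic behind that 21% figure can be checked in a few lines; the salary and inference numbers below are simply the ones quoted above, not real payroll data:

```python
# Figures from the post above (hypothetical illustration, not actual comp data)
salary = 375_000        # 75th percentile SWE salary per Levels.fyi
inference = 100_000     # assumed annual inference spend per engineer

fully_loaded = salary + inference
token_share = inference / fully_loaded

print(f"Fully loaded: ${fully_loaded:,}")      # Fully loaded: $475,000
print(f"Inference share: {token_share:.0%}")   # Inference share: 21%
```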
Enterprise AI spending jumped 3.2× in 2025 to $37B (Gartner), while inference costs dropped 90%.
The paradox isn't new; it's cloud economics on fast-forward.
What took cloud 10 years to reveal is happening in AI in 2-3 years:
a) Low unit costs → elastic, untracked adoption
b) Agentic workflows multiply requests 10-50×
c) Cost drivers are architectural (retries, context bloat), not volumetric
d) Finance sees the bill only when it's too late to re-architect
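Points (a)–(c) can be made concrete with a toy cost model. All numbers here are hypothetical placeholders (the blended token price, fan-out, retry rate, and context size are assumptions for illustration), but they show how agentic fan-out, retries, and context bloat compound multiplicatively rather than with headcount:

```python
# Toy cost model for the dynamics above. Every constant is an assumption
# chosen for illustration, not a benchmark.
COST_PER_1K_TOKENS = 0.002  # assumed blended price per 1K tokens

def monthly_cost(users, requests_per_user, agentic_fanout,
                 tokens_per_request, retry_rate):
    """Estimate monthly token spend.

    agentic_fanout: model calls triggered per user request
                    (the post cites 10-50x for agentic workflows).
    retry_rate: fraction of calls repeated on failures/timeouts
                (an architectural, not volumetric, cost driver).
    """
    calls = users * requests_per_user * agentic_fanout
    calls *= (1 + retry_rate)            # retries multiply call volume
    tokens = calls * tokens_per_request  # context bloat multiplies tokens/call
    return tokens / 1000 * COST_PER_1K_TOKENS

# Same user traffic, plain chat vs. an agentic workflow:
chat = monthly_cost(1000, 100, 1, 2_000, 0.05)
agent = monthly_cost(1000, 100, 30, 8_000, 0.15)
print(f"chat: ${chat:,.0f}/mo, agentic: ${agent:,.0f}/mo")
# chat: $420/mo, agentic: $55,200/mo
```

Identical user counts, a ~130× bill: the drivers are architectural knobs (fan-out, retries, context size), which is why finance-side volume tracking alone misses them.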
Rigorous cost governance will be essential to realizing AI ROI.
"One approach is to focus on data quality rather than quantity. ai labs do not simply train their models on the entire internet. They filter and sequence data to maximise how much their models learn. Naveen Rao of Databricks, an ai firm, says that this is the “main differentiator” between ai models on the market. “True information” about the world obviously matters; so does lots of “reasoning”." said Naveen Rao