In a pivotal scene from the 2006 film The Devil Wears Prada, the formidable fashion editor Miranda Priestly delivers a monologue that has become a foundational text for understanding systemic influence. When her assistant, Andy, scoffs at the perceived triviality of choosing between two near-identical blue belts, Priestly meticulously traces the lineage of the cerulean color in Andy's "lumpy" sweater: from Oscar de la Renta's runway collection, through department stores, down to the clearance bin where Andy presumably found it. The lesson is surgical: even those who believe they are standing outside a system are often its unwitting products.

This "cerulean logic," as the argument goes, offers a surprisingly precise framework for the current debate over artificial intelligence. The analogy is not new in spirit — technology critics have long pointed out that opting out of dominant technological paradigms is more difficult than it appears — but the fashion metaphor sharpens the point in a way that abstract arguments about network effects rarely do.

The Illusion of Distance

Many AI skeptics treat the technology as a discrete, avoidable accessory — a "blue belt" they can simply choose not to wear. Generative models are dismissed as vanity projects, productivity gimmicks, or tools relevant only to a narrow class of early adopters. This posture carries a certain intellectual comfort: if AI is merely a trend, it can be safely ignored by those with better taste.

But the analogy to fashion's supply chain cuts against that comfort. The fashion industry does not require individual buy-in to exert influence. Trend decisions made in a handful of ateliers ripple outward through manufacturing, retail, and eventually into the wardrobes of people who have never heard of the designers responsible. The same structural dynamic is at work in AI adoption. Large language models and machine learning systems are not waiting for universal consumer enthusiasm. They are being embedded into enterprise software, logistics platforms, customer service infrastructure, and content moderation pipelines. The end user does not need to consciously adopt AI for AI to shape the information and services that reach them.

This pattern has historical precedent. The spread of relational databases in the 1980s and cloud computing in the 2010s followed a similar trajectory: initial skepticism from end users, rapid adoption at the infrastructure layer, and eventual ubiquity that rendered the question of individual opt-in moot. By the time most people understood what a relational database was, they had been interacting with dozens of them daily for years.

The Bargain Bin Has Already Been Stocked

The more uncomfortable dimension of the cerulean argument is its implication about agency. Andy did not choose cerulean; cerulean chose her, mediated by a supply chain she neither understood nor acknowledged. For AI skeptics, the parallel is direct. AI is already embedded in the logistics of product delivery, the ranking of search results, the drafting of legal boilerplate, and the filtering of email. The tools used to write critiques of AI increasingly rely on AI-assisted spell-checking, grammar correction, and autocomplete. The infrastructure is not waiting for permission.

This does not mean skepticism is without value. Rigorous scrutiny of AI's limitations — its tendency toward confabulation, its opacity, its concentration of power among a small number of model providers — remains essential. The cerulean metaphor is not an argument for uncritical acceptance. Priestly, after all, is not a hero in the film; she is a figure of immense power wielded without accountability. The monologue reveals how systems operate, not that those systems are benign.

The tension worth watching is between two forces that show no sign of reconciling. On one side, AI infrastructure continues to expand beneath the surface of consumer-facing products, making opt-out increasingly theoretical. On the other, the questions skeptics raise about reliability, governance, and concentration of control grow more urgent precisely because the technology is becoming harder to avoid. Whether the appropriate response is engagement, regulation, resistance, or some combination remains an open question — but the premise that one can simply stand outside the system and observe it from a safe distance looks less tenable with each passing quarter.

With reporting from Fast Company.