
Gimme a Minute … No, Make It Fifteen

Edward Tenner, a frequent contributor to the Review, is a research affiliate of the Smithsonian Institution and Rutgers University.

Published August 31, 2023


Earlier this year, a series of news stories and opinion pieces in The New York Times illuminated a cultural uproar that, for a change, has nothing to do with race, religion, gender or gluten. It’s all about the goal of walkable communities, where basic needs can be met within 15 minutes on foot or bicycle. Conservatives apparently had no previous objection to the concept, paying millions for apartments in New York and London near dry cleaners and supermarkets, not to mention three-star restaurants. But to some right-wing populists, the 15-minute city is less a convenience than a conspiracy to corral citizens into high-density neighborhoods where they could be seduced with collectivist, European ideas.

Carlos Moreno, the Colombian-born French academic who promotes the idea, now receives hate mail from folks who see 15-minute neighborhoods as precursors to walled ghettos. Thousands gathered to protest one such plan in Oxford.

True, death threats from QAnon believers are surely overkill for what is nothing more than well-meaning mandarin social engineering, although in hindsight the license-plate monitoring in some of the plans probably is an invitation to the paranoid. What struck me about the plans was the choice of 15 minutes as the magic time horizon. More broadly, why is 15 minutes such a common unit in everyday life?

Why Fifteen?

Professor Moreno did not reply to an email inquiry about this choice, and a paper he co-authored on the subject in the journal Nature does not explain it. The choice must have seemed self-evident. I believe it is, but as with the truths in the Declaration of Independence, self-evidence results not from neurobiological hard-wiring but from centuries of cultural conditioning.

After years of watching TED Talks online (and two appearances on the TED mainstage), I’ve been intrigued by the 15-to-20-minute time horizon as an apparent building block of attention. I found it was possible to compress all the essential ideas of a 45-to-50-minute lecture by a factor of three, but no more. I always assumed that the founder of TED, the design guru Richard Saul Wurman, had relied on decades of cognitive science research to find an optimum attention span — even though by 2018 (my second go-round), talk lengths had shrunk slightly. Yet when I asked a friend working at TED, she said she was unaware of any scientific rationale for the length.

Indeed, when I began to explore the psychological and sociological literature on attention spans, I discovered there was much less agreement about them than I had assumed. The most recent literature review on the subject, by Neil A. Bradbury of the Department of Physiology and Biophysics at Chicago Medical School, revealed that the most common citation of the 15-minute student attention span was about note-taking, not attention. Of other common citations, Bradbury found that many “suffer from methodological flaws and subjectivity in data collection.” In fact — surprise! — the most important variable in attention spans proved to be the instructor.

Fifteen-minute units, as opposed to ten- or twenty-minute ones, are ubiquitous in Western culture. Fifteen minutes has been the standard time for medical appointments in the U.S. — though now more a maximum than a norm, as administrators pressure physicians to stuff in five or six patients per hour rather than four.

The restaurant reservation app OpenTable recommends 15-minute intervals on its page for dining managers; elsewhere, the most frequently given interval for holding a table for late guests is 15 minutes, and many travel sites suggest arriving 15 minutes early to ensure that the table will be ready at the promised time.

 
60 Minutes, the longest-running U.S. prime-time network television program — it debuted in 1968 — owes part of its success to its ingenious division of the hour: three segments of about 15 minutes each, separated by commercials that some viewers may experience as welcome breaks rather than interruptions.

Decades of student folklore have held that students can leave a classroom without penalty if a professor is more than 15 minutes late. The time somehow feels not too brief and not too long. But it was never a good idea: a teacher irritated by the absence of part (or even all) of the class always has the option of lecturing for the remaining half hour or so and including the material on the test.

And there is always the dictum attributed to Andy Warhol that “in the future, everyone will be famous for 15 minutes.” At least three other artists, critics and curators have also been credited with the remark, reinforcing how central the metaphor is to our society. Somehow, it just does not work with 10 minutes or a half-hour.

Really: Why Fifteen?

The ubiquity of 15 minutes, whether as presentation interval, grace period or metaphor, raises the question: how did this standard originate? Like many institutions we take for granted, it has no obvious explanation. But I can suggest a hypothesis for the West. 

Our sense of time was transformed by the first mechanical clocks of the 14th century. Some of the most magnificent have survived centuries of war and revolution, and are in working order today. By the 18th century, clocks that gained or lost only a few minutes a day were the norm. But in the Middle Ages, an error of 15 minutes was a tolerable margin both for mechanical clocks and for an older alternative, water clocks. Both made it possible to divide the day into equal segments that did not vary with the ebb and flow of daylight hours from season to season. By the same token, most sundials had no divisions shorter than a quarter hour.

I am unaware of any text from the age of mechanical timekeeping that explains why clocks customarily chime at 15-minute intervals. (YouTube hosts videos of cathedral clocks chiming quarter-hours.) How far back can we go to find the origin of the quarter-hour interval, which was not widely recognized in antiquity? Glad you asked.

A Northumbrian monk now known as the Venerable Bede noted the quarter-hour in his book The Reckoning of Time, written in 725. Bede recognized the punctus of 15 minutes, which he related to the swiftly passing quarter-hour divisions of sundials — a term that leads indirectly to today’s English word punctual.

It would probably take years of research in medieval and early modern manuscripts and printed books to understand how the quarter-hour interval made its way into customs and collective mentalities. The chiming of countless church clocks, and later of household clocks, must have had something to do with it. Bede’s monastery of Jarrow was probably a 15-minute city in itself; we can only wonder what Bede would have thought about a professor in a still-unbuilt city receiving death threats because of the idea.
