Algae Caviar, Anyone? What We’ll Eat on the Journey to Mars

To anyone who happened to be looking up that morning, perhaps from the deck of a boat off the coast of Portsmouth, New Hampshire, the plane would have appeared to be on an extremely alarming trajectory. It rocketed into the cloudless late-summer sky at a 45-degree angle, slowed momentarily and leveled out, then nosed down toward the ocean, plunging 17,000 vertical feet in a matter of seconds. At the last moment, it leveled out again and began another climb, looking for all the world as though it were being piloted by a hopelessly indecisive hijacker.

Onboard the plane, the mood was euphoric and a little hysterical. The main cabin had been converted into a kind of padded cell, lined with soft white tiles in lieu of seats and overhead bins. Two dozen passengers, clad in blue jumpsuits, lay on their backs on the floor. As the plane neared the crest of its first roller-coaster wave, a member of the flight crew got on the PA. “Pushing over, slow and easy,” he shouted over the roar of the engines. “Release!” Moments before he uttered that final word, the passengers began to levitate. Their feet, hands, and hair lifted first, then their bodies, arms dog-paddling and legs kicking ineffectually as they giggled and grinned like fools for a fleeting, floating instant. “Feet down, coming out,” the crew member said 20 seconds later. The passengers hit the floor ass first and lay spread-eagled, staring at the ceiling.


The plane flew 20 parabolic arcs that day, for a total of around six minutes of weightlessness. Each time gravity loosened its grip, the blue-suited occupants frantically got to work on a range of activities and experiments. I hovered in the middle of the cabin, toes down, hair up, and took in the scene. Up by the cockpit, a square-jawed jock raced to strap himself into a vertical rowing machine. Not far away, a waifish young woman sculpted spidery 3D figures in midair with a hot glue gun, sucking on her lip piercing with a look of deep concentration. Behind me, toward the rear of the fuselage, the world's first musical instrument designed exclusively for performance in microgravity—a sort of metallic octopus called the Telemetron—emitted plaintive digital chimes as it spun. A woman wearing a seahorse-inspired robotic tail rotated serenely, twirling around its flexible ballast like a stripper on a pole.

A few feet away from where I hung, Cady Coleman, a former NASA astronaut with six months of spaceflight experience, took a nostalgic joyride, somersaulting and gliding like a pro. Nearby, silkworms in varying stages of development bounced gently in the hammock of their freshly woven cocoons, largely unnoticed inside a small acrylic box. I struggled to keep hold of my pencil and notebook as I watched industrial designer Maggie Coblentz, immaculately costumed in a Ziggy Stardust-inspired white jumpsuit and matching go-go boots, chase down and swallow a handful of boba pearls, nibbling at them like a goldfish.

The flight had been chartered by Ariel Ekblaw, the intimidatingly accomplished founder of the MIT Media Lab's Space Exploration Initiative. Ekblaw has a round face, long curls, and the earnest demeanor that comes with being a Girl Scout Gold Award winner and high school valedictorian. Her mother set the bar for overachievement in a male-dominated field: She was a reservist instructor in the US Air Force back when female trainers were unheard of, and she would have flown fighter jets if women had been allowed to at the time. But it was Ekblaw's father, a fighter pilot himself, who kindled her obsession with space. He was a sci-fi buff, and Ekblaw grew up devouring his paperback copies of Isaac Asimov and Robert Heinlein. She also watched Star Trek: The Next Generation at a formative age, imprinting on its impossibly optimistic vision of the future. After majoring in physics, math, and philosophy as an undergrad, she earned a master's degree in blockchain research. Then, four years ago, at the age of 23, she decided to return to her first love.

The Space Exploration Initiative's goal is to bring together “artists, scientists, engineers, and designers to build a real-life Starfleet Academy.” Ekblaw and her expanding team of more than 50 collaborators are getting ready for the day when humanity becomes a space-native civilization, as comfortable in the cosmos as we have been on Earth. “People say we're putting the cart before the horse,” Ekblaw concedes. “But the complexities of space are such that we really should be at least designing the cart while the horse is being prepared.”

As the billionaire rocket bros never tire of reminding us, we stand on the cusp of a new era of space travel. In the coming decades, there will be celestial cruises aboard Richard Branson's Virgin Galactic. There may be off-world factories and lunar mining operations, courtesy of Jeff Bezos and Blue Origin. There will probably be hydroponic grow houses at Elon Musk's SpaceX colony on Mars. Even the bureaucrats at NASA have grand plans for the future. But while a new generation of aerospace engineers toils over the tech that will get us into orbit and beyond—reusable launch vehicles, rocket-bearing planes—an important question remains unanswered, Ekblaw says: “What will delight humans in space?”


Even in the near term, this is not a frivolous concern. A one-way trip to Mars will take about nine months, which is a long time to spend inside a hermetically sealed tube hurtling through a cold, dark void. Like all animals, humans require stimulation; without something to break the monotony, most of us end up like a tiger pacing its cage—stressed, depressed, and prone to problematic behaviors. Indeed, many scientists believe that boredom is one of the most serious challenges facing future spacefarers.

Until now, design for space has focused on survival. But Ekblaw thinks it's possible, even essential, to imagine an entirely new microgravitational culture, one that doesn't simply adapt Earth products and technologies but instead conceives them anew. Cady Coleman amused herself by playing her flute on the International Space Station—another astronaut brought his bagpipes—but future travelers might instead pick up a Telemetron. They might wear clothes spun of special zero-g silk, or sculpt delicate forms that couldn't exist on Earth, or choreograph new forms of dance, assisted by their robot tails. They might, in other words, stop seeing themselves as homesick earthlings and begin to feel like stimulated, satisfied spacelings.

Whatever else they do, they'll require nourishment, which is why food is a central focus of the MIT program. NASA and other government space agencies have traditionally treated food as a practical challenge—an extreme version of provisioning for an outback camping trip. But while a highly trained astronaut might be able to subsist on space gorp without losing her mind, what about a civilian with a one-way ticket to Mars? Coblentz, who is leading the Space Exploration Initiative's gastronomic research, argues that, as much as art or music or movement, good food will enable us to thrive as we leave Earth behind. It has always been the glue that connects us to each other and to the environment around us. Our pursuit of food has shaped the evolution of our sensory apparatus—the very tools through which we, as a species, perceive the world. The choices we make every day about food selection, preparation, and consumption lie at the foundation of our identities and relationships and affinities. As the Italian historian Massimo Montanari succinctly put it, food is culture.

This truth will surely endure into our interplanetary future—even as far as the 24th century, if Ekblaw's beloved Star Trek is to be believed. When Captain Jean-Luc Picard narrowly survives an attempted body-snatching by the Borg, a group of pasty techno-supremacists who invade his mind with nanoprobes and threaten to steal his humanity forever, the place he goes to recuperate is his family's ancestral vineyard in France, where his brother still works the soil, tends the vines, and harvests the grapes, and where the meals are made from scratch. Picard was lucky: Real-life spacefarers won't have the option of hightailing it back to Earth to regain their sense of meaning and identity. They'll need to make it fresh in whichever brave new world they find themselves. As Coblentz puts it, “What will the terroir of Mars be?” To find out, she's compiling a speculative guide to the kinds of culinary tools, tastes, and rituals that might help humans feel at home in space—an interplanetary cookbook.

Coblentz grew up just outside of Toronto and spent her summers canoe-tripping in the Canadian wilderness. After high school, she studied design in New Delhi and New York; she favors the all-black wardrobe common to the field. Yet her love of backcountry exploration has translated into a fascination with extreme environments. Before she came to MIT, she investigated the role that food plays in prisons and on the battlefield. Still, outer space presents challenges all its own; before she could begin developing interplanetary recipes, some market research was in order. And so, on a sunny morning in September, she invited Cady Coleman, Italian astronaut Paolo Nespoli, and a handful of MIT colleagues to a daylong workshop at the Media Lab.

The focus group gathered in a fluorescent-lit conference room decorated with large-format photos of lollipops and Buffalo wings and coiled spirals of salami. On the table, Coblentz had laid out small plastic cups of M&Ms, freeze-dried cheese bites, and Tang; these would serve as both snacks and design inspiration. Nespoli showed up with props of his own—some silvery foil packets from NASA's current menu rotation; some cans filched from the Russian supplies and the European Space Agency, including one simply labeled SPACE FOOD; and a translucent plastic package filled with what looked like yellowish plugs of ear wax but were apparently dehydrated mashed potatoes. “Nobody goes to space for the food,” Coleman said.


A European Space Agency can of braised calf cheek. Photograph: Tony Luong

Coblentz began by making her pitch. Humanity's off-world survival, she said, will depend on a diet that can nourish not only travelers' bodies but their minds and souls. Space food must inspire and unite; it must reflect both the grandeur of the endeavor and the majesty of the surroundings. Coleman, a kind-faced, nurturing type who wore a T-shirt depicting a Martian mountain range, nodded. Nespoli, a rugged former special-forces operator from Milan, raised his heavy eyebrows in polite skepticism.

Undaunted, Coblentz invited Coleman and Nespoli to describe their culinary experiences aboard the International Space Station—the challenges, the frustrations, and the highlights. “You know, people ask me, ‘Why don't you cook pasta in space? You're Italian!’ ” Nespoli replied, still seemingly determined to deflate Coblentz's grand aspirations. “And I'm like, ‘Well, I would love to. But you simply cannot.’ I think you will not understand food in space unless you start understanding some of the practical problems that make food in space what it is.”

Those practical problems have been the focus of sustained research for more than half a century. In the earliest days of the original space race, scientists worried that it might not be possible to eat in zero g at all. The human digestive system evolved to function in Earth's gravitational field; prolonged weightlessness might cause choking, constipation, or worse. The problem required research, but at the time there was no way of duplicating the proper conditions on Earth. “Gravity as a physical factor of environment has the outstanding property of being omnipresent and everlasting,” a 1950 technical report explained. “Not a single individual has as yet been away from its influence for more than one or two seconds.”

The scientists attempted a number of workarounds, the most memorable of which involved a German-born aeromedical doctor, Hubertus Strughold, numbing his buttocks with novocaine. Once anesthetized, he had a pilot fly him through a series of acrobatic maneuvers, reasoning that the lack of any seat-of-the-pants sensation would be a decent substitute for weightlessness. According to contemporary accounts, “he found the experience very disagreeable.” (Strughold was one of many former Third Reich scientists who were brought to the US after World War II to work on the space program. Although he was revered for decades as the so-called father of space medicine, his reputation has since been tarnished by his alleged association with Nazi war crimes. He denied any involvement.)

Scrambled eggs à la NASA. Photograph: Tony Luong
NASA beef patty in a bag. Photograph: Tony Luong

By 1955, the Air Force had refined the art of parabolic flight and could reliably provide up to 30 seconds of microgravity at a time. Although some test subjects initially struggled, choking and gasping when they tried to eat or drink, it was clear that scientists' earlier concerns had been overblown. Still, there is a reason planes like the one Ekblaw chartered are known as “vomit comets.” Somewhere between half and three-quarters of all spacefarers suffer from what NASA calls space adaptation syndrome, triggered by a sudden lack of data from the otoliths. These ancient organs in the inner ear, made up of tiny crystals of chalk embedded in a gelatinous membrane, normally tell the brain where it is in relation to Earth's gravitational field.


Most astronauts get over their motion sickness within a few days, but nausea is far from the only hunger suppressant they face. For one thing, there's no way of cracking a window in space, which means the enclosed environment could easily smell, as Ekblaw described it, “like everyone who has ever been there, every meal that has been eaten, and every dump that has been taken.” Coleman was quick to point out that the ISS has an excellent filtration system, but the fight against funkiness never ends. “They tell you if you open a package of food you have to eat it, all of it, if you like it or not,” Nespoli said. “Whatever you have left over, it will start rotting and it will stink. And you are a good disposal machine.” This organic tendency in food—its inevitable trajectory toward decay—is a major headache for space agencies. When Nespoli asked to bring aged Parmigiano-Reggiano aboard the ISS, NASA said no, because the artisans who produced the cheese could not provide its expiration date. (He had better luck with lasagna.)

Maggie Coblentz, the Space Exploration Initiative’s head of food research, created a special helmet for eating in zero g.

Photograph: Tony Luong

Mitigating the malodor, but reinforcing the appetite loss, is a condition known as “space face.” In the absence of gravity, body fluids pool in the head. This is the suspected cause of the irreversible vision problems reported by some astronauts, but it also means that, for many, eating in orbit is like eating with a severe head cold here on Earth. Astronauts have reported cravings for stronger tastes that cut through the flavor-muffling congestion. Coleman says she “liked sugar up there a little bit more” and began taking her coffee sweetened; her crewmate Scott Kelly, who'd never much cared for desserts on the ground, became something of a chocoholic during his year aboard the ISS.

But the “practical problems” Nespoli alluded to exert by far the biggest effect on astronauts' diet. Every pound that NASA transports to and from space costs thousands of dollars, which means food must be lightweight and compact. It also has to last a long time. Like Nespoli's mashed potatoes, many of the dishes on offer—shrimp cocktail, chicken teriyaki, or one of a couple hundred other options—come dehydrated. And they tend to share another property too, Coleman said: “Everything is kind of mushy.” This is a side effect of NASA's all-out war on crumbs. On Earth, crumbs fall; in microgravity, they can end up anywhere, including inside critical equipment or astronauts' lungs. On the earliest space missions, food came in the form of squeezable purées and “intermediate moisture bites” such as bacon squares and brownies, which were coated in a crumb-proof layer of gelatin. Today's menu is more expansive, but certain foods, like bread, remain off limits. In its place is the all-purpose flour tortilla, to which rehydrated sauces and stews adhere thanks to surface tension.


Although it's possible to eat, say, Fig Newtons or Doritos in space, Coleman said such friable indulgences require careful planning. “You really need to open them near a vent so that any crumbs go on the vent,” she explained. “Then you take the vacuum cleaner and you vacuum the vent, like a good space station citizen.” (Identical rules apply to clipping one's fingernails.) Even so, astronauts often notice little edible-looking things drifting by. In Kelly's 2017 memoir, Endurance, he relates a stomach-turning anecdote in which the Italian astronaut Samantha Cristoforetti confesses to having eaten an unidentified floating object she thought was candy but turned out to be garbage.

Nespoli's longed-for spaghetti is not crumbly, but even if he did find a way to cook it, there would be no appropriate way to eat it. For the most part, space cutlery has been reduced to a pair of scissors, for opening packages, and a spoon, for scooping out their contents. (As it happens, Nespoli's ancestral compatriots were the first Europeans to adopt the modern fork. It was a multi-tined improvement on their previous tool—a combination ravioli spear and spaghetti twirling rod.) The process of cooking is similarly simplified. On the ISS, the astronauts typically rehydrate their food by adding hot water from a nozzle mounted on the ceiling, then kneading the package. Dinner is ready to eat at this point, but most dishes are apparently greatly improved by also being warmed inside a slim aluminum briefcase with a heating element in the middle. “This is where it gets crazy,” Nespoli said. “You have a space station that cost a gazillion dollars, built by engineers that can build the most amazing things, and the food warmer is a briefcase that takes 20 minutes and only fits enough food for three people at a time.”

As a result, finding something to eat in the storage containers, rehydrating and kneading it, then warming it can easily take 30 or 40 minutes. Astronauts are always short on time; their days are tightly programmed by mission control, and overruns on repairs or science experiments frequently cut into their already limited window for meals. During the Media Lab focus group, Coleman described a favorite dinner that involved molding rice into sticky balls and then mixing it with Trader Joe's Thai curry, which she'd brought up as part of her personal allowance. “I really loved it,” she said. “But it took me probably twice as long to eat dinner when I did that.” Especially toward the end of her mission, she was more likely to eat a food bar instead, “because it was just efficient.”

By this point in the meeting, Coleman and Nespoli had rattled off an extraordinarily long list of challenges and constraints. Finally, though, they made the admission that Coblentz had been chasing all along: Food was an important part of daily life in orbit—and the subject of many of their fondest memories. Coleman said their entire crew, even the cosmonauts, made a point of eating together on Friday evenings. “It's how you become a team,” she explained, to Coblentz's evident delight. Coleman opened her laptop and flipped through her favorite photographs from her time aboard the ISS. One showed the kitchen table, which juts out into the corridor between the Russian and American segments of the station. “Everybody had bruises on each hip—one for the way there, one for the way back,” she said. “It was exactly in the way.” Of course, there's no real reason for a table to be horizontal in space; packets of food and drink have to be secured using Velcro either way, so it could just as easily lie parallel to the wall. But Coleman said there was an unspoken resistance to such an arrangement. The crew needed a place to “hang around,” she explained, and to ask that most human of questions: “How was your day?”

Nespoli's favorite ISS snapshots involved food too, in a way. He pulled up an image he captured of clouds over Lake Garda, Italy. “That looks like a margherita pizza,” he said. “And then the next picture—that looks like a quattro stagioni pizza.” Earth was pizza, pizza was Earth, and both were entirely out of reach. This was the obstacle Coblentz was determined to surmount.


The first people ever to leave Earth orbit and strike out into space were the three crew members of Apollo 8. They were surprised to find that the most compelling thing they saw on the quarter-of-a-million-mile-long journey lay in the rearview mirror. “We set out to explore the moon and instead discovered the Earth,” astronaut Bill Anders wrote 50 years after the mission's end.

It was Anders who captured the iconic “Earthrise” photo on Christmas Eve of 1968: a shiny blue jewel wreathed in clouds, floating above the pockmarked lunar surface, alone in the pitch-black void. Reflecting on the image in 2018, he recalled the powerful emotions that led him to ignore his assigned task—documenting potential landing sites—and turn his lens toward home. “Once-distant places appeared inseparably close,” he wrote. “Borders that once rendered division vanished. All of humanity appeared joined together.” His sublime experience, an overwhelming feeling of oneness coupled with a sudden awareness of Earth's beauty and fragility, became so common among future generations of astronauts that it earned a name: the overview effect. It offers an escape from the confined, smelly conditions, the mushy, repetitive meals, and the endless checklists. When Coleman was aboard the ISS, she played her flute in the Cupola, a windowed observatory purpose-built for world-watching.

On a journey to Mars, or beyond, that will no longer be an option. Psychologists have no idea how the so-called break-off phenomenon—the sense of detachment that can arise when our planet slips from view—will affect future astronauts' mental state. What's more, any exchange with the now-invisible Earth will be subject to a round-trip lag of as much as 45 minutes. Kelley Slack, one of the experts on NASA's Behavioral Health and Performance team, recently told NBC, “It will be the first time that we've been totally disconnected from Earth.” Since the summer of 1975, when NASA convened a group of experts to discuss permanent settlement in space, researchers have warned of a psychological condition called “solipsism syndrome,” in which reality feels dreamlike and lonely astronauts become prone to self-destructive mistakes. Mars could be the theory's first real test.
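
That lag, for what it's worth, falls straight out of the speed of light. A minimal back-of-the-envelope sketch in Python; the distances are approximate, and nothing here comes from NASA's own figures:

```python
# Back-of-the-envelope check on the Mars communication lag.
# Assumed figures (approximate): Earth-Mars distance ranges from about
# 55 million km at closest approach to about 400 million km when the
# planets are on opposite sides of the sun.
C_KM_PER_S = 299_792  # speed of light in a vacuum, km/s

for label, distance_km in [("closest approach", 55e6), ("farthest", 400e6)]:
    one_way_min = distance_km / C_KM_PER_S / 60
    print(f"{label}: one-way ~{one_way_min:.0f} min, "
          f"round trip ~{2 * one_way_min:.0f} min")

# At the farthest separation: one-way ~22 min, round trip ~44 min,
# which is where the "as much as 45 minutes" figure comes from.
```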

“Food assumes added importance under all conditions of isolation and confinement because normal sources of gratification are denied,” Jack Stuster, an anthropologist and NASA consultant, wrote in Bold Endeavors, his 1996 book on the behavioral issues associated with extreme environments. “Usually, the longer the confinement, the more important food becomes.” Managers of offshore oil rigs, supertankers, and Antarctic research stations all appreciate the importance of food to maintaining group morale and productivity in isolated, remote, and confined situations. Stuster noted that “food has become such an important element onboard fleet ballistic missile submarines that, for years, meals have been served at cloth-covered tables in pleasant paneled dining rooms.”

Outer space is perhaps the most extreme environment humans will ever confront. To mitigate the inevitable burnout, NASA has developed a range of what it calls “countermeasures.” During his yearlong mission aboard the ISS, for instance, Scott Kelly tested a pair of rubber suction trousers, designed to combat fluid shift. (Afterward he reported feeling, for the first time in months, “like I wasn't standing on my head.”) He and his crewmate Kjell Lindgren, the man with the bagpipes, also grew and ate some red romaine lettuce—a first for American astronauts.

Research by Marianna Obrist, a professor of multisensory experiences at the University of Sussex, suggests orbital agriculture could be a promising countermeasure. “In a way, that appreciation of what it takes to grow food and how wonderful and alive fresh food tastes—that's something you don't often think when you are eating here on Earth,” she told me. Perhaps a crunchy leaf of romaine could serve as the edible equivalent of the overview effect. For the foreseeable future, though, onboard farming will never provide more than a tiny portion of a crew's dietary requirements. The MIT team will have to look elsewhere.


Obrist's recent work has documented exactly the void that Coblentz is trying to fill. In anticipation of mass-market space tourism, she and her colleagues conducted a survey in which they asked ordinary people about the eating experiences they would want on a flight to the moon or Mars. The responses were clear: For the shorter lunar trip, travelers were perfectly happy to provision themselves like campers, provided there would be treats. But when it came to the longer Mars journey, the respondents said they'd require a wide variety of flavors, textures, and temperatures. They also felt it would be important to re-create some of the rituals and environments that accompany eating on Earth.

In short, Coblentz said, making better space food means thinking bigger than countermeasures. “If humans are going to thrive in space, we need to design embodied experiences,” she told me. She has even looked to zoos for inspiration. “For predatory animals like tigers, instead of just throwing a carcass into their cage, they might have a hunting contraption that drags and twitches the meat,” she explained. “They're manufacturing this more challenging experience to make eating more engaging for the animals, and I wondered what the space food analogy might be.” Hiding food around a spacecraft to encourage foraging behavior might not be feasible, she concluded, but what about meal preparation? What kinds of culinary transformations are possible in space—and what kinds of rituals could be built out of them?

Like generations of chefs before her, Coblentz began by taking advantage of the local environment. Liquids are known to behave peculiarly in microgravity, forming wobbly blobs rather than streams or droplets. This made her think of molecular gastronomy, in particular the technique of using calcium chloride and sodium alginate to turn liquids into squishy, caviar-like spheres that burst delightfully on the tongue. Coblentz got to work on a special spherification station to test in zero g—basically a plexiglass glove box equipped with preloaded syringes. She would inject a bead of ginger extract into a lemon-flavored bubble, or blood orange into a beet juice globule, creating spheres within spheres that would deliver a unique multipop sensation unattainable on Earth. And unlike their terrestrial counterparts, Coblentz's spheres would float rather than sit on a plate, meaning they could be appreciated in 360 degrees, rather than 180, and garnished accordingly. The entire process, as whimsical as it might seem, could offer future space travelers a welcome chance to express their culinary creativity and enjoy eating as a sensory experience, even if “space face” means the flavors themselves are subdued.

Coblentz holds a dish of algae-based "caviar," designed to remind space-faring earthlings of their faraway home.

Photograph: Tony Luong


Coblentz also had weightier weightless recipes in mind. Many of Earth's most deeply comforting foods rely on the byproducts of microbial digestion. Because metabolism works differently in microgravity, for microbes as well as humans, the resulting flavors might differ too. What would a wheel of space-aged Parmigiano-Reggiano, a loaf of space-risen sourdough bread, or a tube of space-fermented salami taste like? Coblentz is planning to send a batch of miso paste to the ISS later this year, to learn how its flavor profile changes. She has also developed a new way of consuming it. Pondering the station's lack of cutlery, she struck upon the idea of creating silicone “bones”—solid, ivory-colored crescents that resemble oversize macaroni more than the ribs that inspired them. Nibbling and sucking foods directly off a silicone bone might reduce spoon fatigue, she explained, and perhaps even put spacefarers in touch with humanity's most ancient foodways.

Coblentz has also considered sending brine into orbit, to evaporate into salt. As Phil Williams, who recently launched the world's first astropharmacy research program at the University of Nottingham, told me recently, “One of the problems of making crystals on Earth is that you have convective currents.” Driven by gravity, these currents affect the quality of crystal growth. “You can get far bigger crystals with fewer defects in microgravity,” he said. Chefs and foodies already pay a premium for the large, hollow pyramids of Maldon sea salt, a shape preferred for its crunch, its intermittent bursts of saltiness, and its superior adhesion to baked goods. No one yet knows what culinary properties the crystalline perfection of space salt might possess. Many pharmaceuticals rely on crystallization too, and any alteration in those structures can change the drug's therapeutic effects. “There may one day be compounds that we can only make off-planet and bring back,” Williams said, conjuring up a dazzling vision of the future in which drug factories and gourmet brine ponds orbit Earth.

In the weeks leading up to the parabolic flight, as Coblentz surveyed her prototypes, she decided she'd like to spend her precious moments in zero g actually eating stuff, not just fiddling with the spherification station. She would set aside time to inject a few test spheres, but for now she was more interested in replacing some of the ambiance, texture, and flavor that astronauts complain is missing aboard the ISS.

“I've designed a special space food helmet and a tasting menu,” she told me on our last call before we flew. “Have a light breakfast.”

As astronauts and entrepreneurs alike are fond of saying whenever something goes horribly wrong, “space is hard.” The same rule seemingly applied to MIT's zero-gravity flight. Initially slated for March, it was delayed for months, owing to a government shutdown, scheduling conflicts, and then at the last minute—with all the passengers, including the silkworms, ready to go—the FAA's refusal to recertify the plane until a single part was replaced. Finally, the morning dawned. I ate a quarter of a bagel, applied a motion-sickness patch, and boarded the team bus to ride up to an airstrip at Pease Air Force Base in New Hampshire.

We gathered in a hangarlike space haphazardly furnished with plastic tables, folding chairs, a metal detector, and an x-ray machine. Staff from Zero-G Corporation, the company operating the flight, issued us our blue onesies, complete with name badges, and our boarding passes. Flight ZG491 was scheduled to depart at 9 am.

As the passengers suited up and checked their experimental equipment one last time, the preflight briefing began. There would be no somersaults, no flipping, no spinning without permission—seriously, no horsing around of any kind.

“Don't look down,” one staffer warned. “You'll feel like your eyeballs are falling out.”

“Don't take off a ring and try to float it while you take a picture,” said another. “There's still a wedding ring in there somewhere from the last guy that tried that.”


After the briefing, I tried on Maggie Coblentz's food helmet, a sort of giant plastic goldfish bowl with two hand holes carved out. “It was injection-molded for me by people who make aquariums,” she said. “When you put it on, you're in a world of your own—and it catches crumbs. I've tried it in bed.” There was a built-in lazy Susan on which she had mounted five small containers. I spotted boba pearls in one and Pop Rocks in another. The hardware was spray-painted an Instagram-friendly rose gold.

We went through our own private TSA security line, after which Coblentz handed me some contraband boba pearls. As a potential hazard to the equipment onboard, they were approved for flight only on the condition that they remain contained within her helmet. I didn't have a helmet of my own, so I stashed them in my breast pocket, sealed it with Velcro, and boarded the plane. Several rows of seats were installed at the back, and we sat and listened to a modified safety spiel. If the airplane lost pressure, we were told, oxygen masks would not drop automatically; instead, we would have to make our way over to the oxygen boxes mounted along the center aisle and walls. After a perfectly normal takeoff, the seat-belt sign switched off and we all moved forward to our appointed stations, next to the bolted-down equipment.


On the first weightless parabola, my shoelaces came undone. They remained that way for the duration. My instinct was to swim, but that didn't work. Moving gingerly, I hovered to one side, trying not to get in the way as Coblentz injected her spheres. (We wouldn't be eating them on the flight, mostly because there wasn't time to fish them out of the plexiglass box; still, the experiment would serve as proof of concept.) She was struggling too, her arms visibly shaking as she tried to control the speed at which the liquid came out of the syringes. Before either of us had any idea what was going on, it was time to serve the tasting menu.

Coblentz put on her helmet and immediately relaxed. She told me later that it functioned almost like noise-canceling headphones, allowing her to focus on eating amid the uproar. She piped in a soundtrack of frying onions, then opened a canister that released a matching scent—an attempt to increase her appetite and induce salivation, both known to enhance food enjoyment. The helmet became both restaurant and plate as she unleashed a handful of Pop Rocks and boba pearls and chased them in circles. Immediately, Coblentz sneezed: Most of the popping candy appeared to have gone straight up her nose. I set loose my contraband pearls and promptly lost half of them; perhaps they would reappear on a future flight. The few that managed to connect with my mouth bounced around on my tongue, a sensation that made me snort with laughter.

As we entered our final few parabolas, Coblentz sucked miso paste from her silicone bones. I floated the length of the cabin, marveling at an agility and grace I'd never demonstrated on Earth. Behind me, two unfortunate researchers were hunched, barf bags in hand, stricken by space adaptation syndrome. For the rest of us, weightlessness was over far too quickly.

Back at the airfield, Zero-G had laid out a sandwich buffet for our “regravitation celebration.” I dragged myself to it, heavy-limbed and slow. As I lifted my turkey club baguette to my mouth, I could hardly believe I'd have to eat this way for the rest of my life. At least for now, the psychological benefits of earthly terroir seemed hardly worth the price of being permanently rooted to the ground. I glanced at Coblentz. She was draped over a chair, eyes closed, with a huge smile. Slowly, her right arm floated up and she began gently combing Pop Rocks from her hair.




Nicola Twilley is the cohost of Gastropod, a podcast that looks at food through the lens of science and history. She is at work on two books, one about refrigeration and the other about quarantine.

This article appears in the March issue.





Markets Are Eating The World

For the last hundred years, individuals have worked for firms, and, by historical standards, large ones.

That many of us live in suburbs and drive our cars into the city to go to work at a large office building is so normal that it seems like it has always been this way. Of course, it hasn't. In 1870, almost 50 percent of the U.S. population was employed in agriculture.[1] As of 2008, less than 2 percent of the population was directly employed in agriculture; instead, most people worked for these relatively new things called “corporations.”[2]

Many internet pioneers in the ’90s believed that the internet would start to break up corporations by letting people communicate and organize over a vast, open network. This reality has only sort of played out: the “gig economy” and the rise of freelancing are persistent, if not explosive, trends. With the re-emergence of blockchain technology, talk of “the death of the firm” has returned. Is there reason to think this time will be different?

To understand why this time might (or might not) be different, let us first take a brief look back into Coasean economics and mechanical clocks.

In his 1937 paper, “The Nature of the Firm,” the economist R.H. Coase asked: if markets were as efficient as economists believed at the time, why do firms exist at all? Why don't entrepreneurs just go out and hire contractors for every task they need to get done?[3]

If an entrepreneur hires employees, she has to pay them whether they are working or not. Contractors only get paid for the work they actually do. While the firm itself interacts with the market, buying supplies from suppliers and selling products or services to customers, the employees inside of it are insulated. Each employee does not renegotiate their compensation every time they are asked to do something new. But, why not?

Coase’s answer was transaction costs. Contracting out individual tasks can be more expensive than just keeping someone on the payroll because each task involves transaction costs.

Imagine if, instead of answering every email yourself, you hired a contractor who was better than you at dealing with the particular issue in each email. First, it would cost you something to find them. Then you would have to bargain and agree on a price for their services, get them to sign a contract, and potentially take them to court if they didn't answer the email as stipulated.

Duke economist Mike Munger calls these three types of transaction costs triangulation (how hard it is to find and measure the quality of a service), transfer (how hard it is to bargain and agree on a contract for the good or service), and trust (whether the counterparty is trustworthy and whether you have recourse if they aren't).

You might as well just answer the email yourself or, as some executives do, hire a full-time executive assistant. Even if the assistant isn't busy all the time, that's still better than hiring someone one-off for every email, or even every day.

Coase’s thesis was that in the presence of these transaction costs, firms will grow larger as long as they can benefit from doing tasks in-house rather than incurring the transaction costs of having to go out and search, bargain and enforce a contract in the market. They will expand or shrink until the cost of making it in the firm equals the cost of buying it on the market.

The lower the transaction costs are, the more efficient markets will be, and the smaller firms will be.
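
To make that logic concrete, here is a toy model in Python. It is purely illustrative: the task costs are invented, and the three transaction-cost knobs stand in for Munger's triangulation, transfer, and trust. Lowering them tips tasks from “make” to “buy,” and the notional firm shrinks.

```python
# Toy model of Coase's make-vs-buy boundary. All numbers are invented,
# purely to illustrate the logic; they measure nothing real.

def buy_cost(market_price, triangulation, transfer, trust):
    """Total cost of buying a task on the market: price plus Munger's three Ts."""
    return market_price + triangulation + transfer + trust

def firm_size(tasks, transaction_costs):
    """Count the tasks kept in-house: make whenever making beats buying."""
    return sum(
        1 for make, market_price in tasks
        if make < buy_cost(market_price, *transaction_costs)
    )

# (in-house cost, market price) for ten hypothetical tasks
tasks = [(10, 8), (12, 9), (7, 6), (15, 11), (9, 8),
         (11, 7), (14, 12), (8, 5), (13, 10), (10, 9)]

high_friction = (2.0, 2.0, 2.0)   # costly to search, bargain, and enforce
low_friction = (0.2, 0.2, 0.2)    # an efficient, internet-era market

print(firm_size(tasks, high_friction))  # 10: every task stays in-house
print(firm_size(tasks, low_friction))   # 0: every task moves to the market
```

Nothing about the tasks changes between the two runs; only the friction does, and the “firm” evaporates.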

In a world where markets were extremely efficient, it would be very easy to find and measure things (low triangulation costs), it would be very easy to bargain and pay (low transfer costs), and it would be easy to trust the counterparty to fulfill the contract (low trust costs).

In that world, the optimal size of the firm is one person (or a very few people). There’s no reason to have a firm because business owners can just buy anything they need on a one-off basis from the market.[4] Most people wouldn’t have full-time jobs; they would do contract work.

Consumers would need to own very few things. If you needed a fruit dehydrator to prepare for a camping trip twice a year, you could rent one quickly and cheaply. If you wanted to take your family to the beach twice a year, you could easily rent a place just for the days you were there.

On the other hand, in a world that was extremely inefficient, it would be hard to find and measure things (high triangulation costs), it would be difficult to bargain and pay (high transfer costs) and it would be difficult to trust the counterparty to fulfill the contract (high trust costs).

In that world, firms would tend to be large. It would be inefficient to buy things from the market, and so entrepreneurs would tend to accumulate large payrolls. Most people would work full-time jobs for large firms. If you wanted to take your family to the beach twice a year, you would need to own the beach house, because it would be too inefficient to rent; that was the reality before online marketplaces like Airbnb showed up.

Consumers would need to own nearly everything they might conceivably need. Even if they only used their fruit dehydrator twice a year, they’d need to own it because the transaction costs involved in renting it would be too high.

If the structure of the economy is based on transaction costs, then what determines them?

Technological Eras and Transaction Costs

The primary determinant of transaction costs is technology.

The development of the wheel and the domestication of horses and oxen decreased transfer costs by making it possible to move more goods farther. Farmers who could bring their crops to market in an ox cart rather than carrying them by hand could charge less and still make the same profit.

The development of the modern legal system reduced the transaction cost of trust. It was possible to trust that your counterparty would fulfill their contract because they knew you had recourse if they didn’t.

The list goes on: standardized weights and measures, the sail, the compass, the printing press, the limited liability corporation, canals, phones, warranties, container ships and, more recently, smartphones and the internet.

It’s hard to appreciate how impactful many of these technologies have been, because most of them had become so common by the time most of us were born that we take them for granted.

As the author Douglas Adams said, “Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it. Anything invented after you’re thirty-five is against the natural order of things.”

To see how technology affects transaction costs, and how that affects the way our society is organized, let’s consider something which we all think of as “normal and ordinary,” but which has had a huge impact on our lives: the mechanical clock.

The Unreasonable Effectiveness of the Mechanical Clock

In 1314, the city of Caen installed a mechanical clock with the following inscription: “I give the hours voice to make the common folk rejoice.” “Rejoice” is a pretty strong reaction to a clock, but it wasn’t overstated; everyone in Caen was pretty jazzed about the mechanical clock. Why?

A key reason we have jobs today, as opposed to working as slaves or serfs bonded to the land as was common under the feudal system, is a direct result of the clock.

Time was important before the invention of the clock, but it was very hard to measure. Rome was full of sundials, and medieval Europe’s bell towers, where the time was tolled, were the tallest structures in town.[5]

This was not cheap. In the larger and more important belfries, two bell-ringers lived full time, each serving as a check on the other. The bells themselves were usually financed by local guilds that relied on the time kept to tell their workers when they had to start working and when they could go home.

This system was problematic for a few reasons.

For one, it was expensive. Imagine if you had to pool funds together with your neighbors to hire two guys to sit in the tower down the street full time and ring the bell to wake you up in the morning.

For another, the bell could only signal a few events per day. If you wanted to organize a lunch meeting with a friend, you couldn’t ask the belltower to toll just for you. Medieval bell towers had not yet developed snooze functionality.

Finally, sundials suffered from accuracy problems. Something as common as clouds could make it difficult to tell precisely when dawn, dusk, and midday occurred.

In the 14th and 15th centuries, the expensive bell towers of Europe’s main cities got a snazzy upgrade that dramatically reduced transaction costs: the mechanical clock.

The key technological breakthrough that made this possible was the escapement.

The escapement transfers energy to the clock’s pendulum to replace the energy lost to friction and keep it on time. Each swing of the pendulum releases a tooth of the escapement’s wheel gear, allowing the clock’s gear train to advance or “escape” by a set amount. This moves the clock’s hands forward at a steady rate.[6]
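
In modern terms, an escapement makes the clock a counter: each swing releases exactly one tooth, so elapsed time is just the number of ticks multiplied by a fixed period, independent of how hard the weight or spring pushes. A minimal sketch in Python, with an invented one-second pendulum standing in for any real clockwork:

```python
# Minimal sketch of an escapement as a counter. The one-second pendulum
# is an invented round number, not the spec of any historical clock.
PENDULUM_PERIOD_S = 1.0  # one swing released per second

def time_shown(swings):
    """Each swing frees one tooth, so the hands advance by a fixed step."""
    seconds = int(swings * PENDULUM_PERIOD_S)
    minutes, secs = divmod(seconds, 60)
    hours, mins = divmod(minutes, 60)
    return f"{hours:02d}:{mins:02d}:{secs:02d}"

print(time_shown(3_661))  # 01:01:01, however hard the driving weight pushes
```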

The accuracy of early mechanical clocks, plus or minus 10-15 minutes per day, was not notably better than late water clocks and less accurate than the sandglass, yet mechanical clocks became widespread. Why?

  1. Its automatic striking feature meant the clock could be struck every hour at lower cost, making it easier to schedule events than only striking at dawn, dusk and noon.
  2. It was more provably fair than the alternatives, which gave all parties greater confidence that the time being struck was accurate. (Workers were often suspicious that employers could bribe or coerce the bell-ringers to extend the workday, which was harder to do with a mechanical clock.)

Mechanical clocks broadcast by bell towers provided a fair (lower trust costs) and fungible[7] (lower transfer costs) measure of time. Each hour rung on the bell tower could be trusted to be the same length as any other hour.

Most workers in the modern economy earn money based on a time-rate, whether the time period is an hour, a day, a week or a month. This is possible only because we have a measure of time which both employer and employee agree upon. If you hire someone to pressure-wash your garage for an hour, you may argue with them over the quality of the work, but you can both easily agree whether they spent an hour in the garage.

Prior to the advent of the mechanical clock, slavery and serfdom were the primary economic relationships, in part because the transaction cost of measuring time beyond sunup and sundown was so high that workers were chained to their masters or lords.[8]

With a time-rate wage in place, the employer is able to use promotions, raises, and firing to incentivize employees to produce quality work during the time they are being paid for.[9]

In a system based on time-rate wages rather than slavery or serfdom, workers have a choice. If the talented blacksmith can get a higher time-rate wage from a competitor, she’s able to go work for them because there is an objective, fungible measure of time she’s able to trade.

As history has shown, this was a major productivity and quality-of-life improvement for both parties.[10]

It gradually became clear that mechanical time opened up entirely new categories of economic organization and productivity that had hitherto been not just impossible, but unimaginable.

We could look at almost any technology listed above (standardized weights and measures, the sail, the compass, the printing press, etc.) and do a similar analysis of how it affected transaction costs, and eventually how it affected society as a result.

The primary effect is an increase in what we will call coordination scalability.

Coordination Scalability

“It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them.” –Alfred North Whitehead

About 70,000 years ago, there were between six and ten species of the genus Homo. Now, of course, there is just one: Homo sapiens. Why did Homo sapiens prevail over the other species, like Homo neanderthalensis?

Homo sapiens prevailed because of their ability to coordinate. Coordination was made possible by increased neocortical size, which led to an ability to work together in large groups, not just as single individuals. Instead of single individuals hunting, groups could hunt and bring down larger prey more safely and efficiently.[11]

The brain of Homo sapiens has proven able to invent other, external structures which further increased coordination scalability by expanding the network of other people we could rely on.

Maybe the most important of these was language, but we have evolved many others since, including the mechanical clock.

The increased brain size has driven our species through four coordination revolutions: Neolithic, Industrial, Computing, Blockchain.

Neolithic Era: The Emergence of Division of Labor

The first economic revolution was the shift from hunting and gathering to farming.

Coordination scalability among hunter-gatherers was limited to the size of the band, which tended to range from 15 to 150 individuals.[12] The abandonment of a nomadic way of life and the move to agriculture changed this by allowing specialization and the formation of cities.

Agriculture meant that people could, for the first time, accumulate wealth. Farmers could save excess crops to eat later or trade them for farming equipment, baskets or decorations. The problem was that this wealth was suddenly worth stealing and so farmers needed to defend their wealth.

Neolithic societies typically consisted of groups of farmers protected by what Mancur Olson called “stationary bandits,” basically warlords.[13] This allowed the emergence of much greater specialization. Farmers accumulated wealth and paid some to the warlords for protection, but even then there was still some left over, making it possible for individuals to specialize.

A city of 10,000 people requires, but also makes possible, specialists.

The limits of coordination scalability increased from 150 to thousands or, in some cases, tens of thousands. This was not necessarily a boon to human happiness. Anthropologist Jared Diamond called the move to agriculture “the worst mistake in the history of the human race.”[14] The quality of life for individuals declined: lifespans shortened, nutrition worsened (leading to smaller stature), and disease was more prevalent.

But this shift was irresistible: specialization created so much more wealth and power that groups which adopted it came to dominate those that didn’t. The economies of scale in military specialization, in particular, were overwhelming. Hunter-gatherers couldn’t compete.

In the Neolithic era, the State was the limit of coordination scalability.

Industrial Era: Division of Labor Is Eating the World

Alongside the city-state, a new technology started to emerge that would further increase the limits of coordination scalability: money. To illustrate, let us take the European case, from ancient Greece to modernity, though the path in other parts of the world was broadly similar. Around 630 B.C., the Lydian kings recognized the need for small, easily transported coins worth no more than a few days’ labor. They made these ingots in a standard size—about the size of a thumbnail—and weight, and stamped an emblem of a lion’s head on them.

This eliminated one of the most time-consuming (and highest transaction cost) steps in commerce: weighing gold and silver ingots each time a transaction was made. Merchants could easily count the number of coins without worrying about cheating.

Prior to the invention of coins, trade had been limited to big commercial transactions, like buying a herd of cattle. With the reduced transfer cost facilitated by coins, Lydians began trading in the daily necessities of life—grain, olive oil, beer, wine, and wood.[15]

The variety and abundance of goods which could suddenly be traded led to another innovation: the retail market.

Previously, buyers had to go to the homes of sellers for whatever they needed. If you needed olive oil, you had to walk over to the olive oil lady’s house to get it. With the amount of trade that began happening after coinage, a central market emerged. Small stalls lined the market, where each merchant specialized in (and so could produce more efficiently) a particular good—meat, grain, jewelry, bread, cloth, etc. Instead of having to go to the olive oil lady’s house, you could go to her stall and pick up bread from the baker while you were there.

From this retail market in Lydia sprang the Greek agora, the medieval market squares of Europe, the suburban shopping mall and, eventually, the “online shopping malls” of Amazon and Google. Though markets were around as early as Lydia in the 7th century B.C., they really hit their stride in the Industrial Revolution of the 18th century.[16]

Adam Smith was the first to describe in detail the effect of this marketization of the world. Markets made it possible to promote the division of labor across political units, not just within them. Instead of each city or country manufacturing all the goods they needed, different political entities could further divide labor. Coordination scalability started to stretch across political borders.

Coming back to Coase: firms will expand or shrink until the cost of “making” equals the cost of “buying.” In the Industrial era, transaction costs made administrative and managerial coordination (“making”) more efficient than market coordination (“buying”) for most industries, which led to the rise of large firms.

The major efficiency gain of Industrial-era companies over their more “artisanal” forebears was that, using the techniques of mass production, they could produce higher-quality products at a lower price. This was possible only if they were able to enforce standards throughout the supply chain. The triangulation transaction cost can be broken down into search and measurement: a company needed to find the vendor and to be able to measure the quality of the good or service.

In the early Industrial era, the supply chain was extremely fragmented. By bringing all the pieces into the firm, a large vertically integrated company could be more efficient.[17]

As an example, in the 1860s and 1870s, the Carnegie Corporation purchased mines to ensure it had reliable access to the iron ore and coke it needed to make steel. The upstream suppliers were unreliable and non-standardized, and Carnegie Corporation could lower the cost of production by simply owning the whole supply chain.

This was the case in nearly every industry. By bringing many discrete entities under one roof and one system of coordination, companies gained greater economic efficiencies. The multi-unit business corporation replaced the small, single-unit enterprise because administrative coordination enabled greater productivity, through lower transaction costs per task, than had been possible before. Economies of scale flourished.

This system of large firms connected by markets greatly increased coordination scalability. Large multinational firms could stretch across political boundaries and provide goods and services more efficiently.

In Henry Ford’s world, the point where making equaled the cost of buying was pretty big. Ford built a giant plant at River Rouge just outside Detroit between 1917 and 1928 that took in iron ore and rubber at one end and sent cars out the other. At the factory’s peak, 100,000 people worked there. These economies of scale allowed Ford to dramatically drive down the cost of an automobile, making it possible for the middle class to own a car.[18]

As with Carnegie, Ford learned that supplier networks take a while to emerge and grow into something reliable. In 1917, doing everything himself was the only way to get the scale he needed to be able to make an affordable car.

One of the implications of this model was that industrial businesses required huge startup costs.

Any entrepreneur who hoped to compete had to start with a similarly massive amount of capital to build a factory large and efficient enough to go up against Ford’s.

For workers, this meant that someone in a specialized role, like an electrical engineer or an underwriter, did not freelance or work for small businesses. Because the most efficient way to produce products was in large organizations, specialized workers could earn the most by working inside large organizations, be they Ford, AT&T or Chase Bank.

At the peak of the Industrial era, there were two dominant institutions: firms and markets.

Work inside the firm allowed for greater organization and specialization which, in the presence of high transaction costs, was more economically efficient.

Markets were more chaotic and less organized, but also more motivating. Henry Ford engaged with the market and made out just a touch better than any of his workers; there just wasn’t room for many Henry Fords.

This started to dissolve in the second half of the 20th century. Ford no longer takes iron ore and rubber as the inputs to its factories; it has a vast network of upstream suppliers.[19] The design and manufacture of car parts now happens over a long supply chain; the car companies ultimately assemble the parts and sell the cars.

One reason is that supplier networks became more standardized and reliable. Ford can now buy ball bearings and brake pads more efficiently than it can make them, so it does. Each company in the supply chain focuses on what it knows best, and competition forces all of them to constantly improve.

By the 1880s, it cost Carnegie more to operate coke ovens in-house than to buy coke from an independent source, so he sold off the ovens and bought on the open market. Reduced transaction costs, in the form of more standardized and reliable production technology, caused both Ford and the Carnegie Corporation to shrink, just as Coase’s theory would suggest.

The second reason is that if you want to make a car using a network of cooperating companies, you have to be able to coordinate their efforts, and you can do that much better with telecommunication technology broadly and computers specifically. Computers reduce the transaction costs that Coase argued are the raison d’être of corporations. That is a fundamental change.[20]

The Computing Era: Software Is Eating the World

Computers, and the software and networks built on top of them, had a new economic logic driven by lower transaction costs.

Internet aggregators such as Amazon, Facebook, Google, Uber and Airbnb reduced the transaction costs for participants on their platforms. For the industries that these platforms affected, the line between “making” and “buying” shifted toward buying. The line between owning and renting shifted toward renting.

Primarily, this was done through a reduction in triangulation costs (how hard it is to find and measure the quality of a service), and transfer costs (how hard it is to bargain and agree on a contract for the good or service).

Triangulation costs came down for two reasons. One was the proliferation of smartphones, which made it possible for services like Uber and Airbnb to exist. The other was the increasing digitization of the economy. Digital goods are both easier to find (think Googling versus going to the library or opening the Yellow Pages) and easier to measure the quality of (I know exactly how many people read my website each day and how many seconds they are there; the local newspaper does not).

The big improvement in transfer costs was the result of matchmaking: bringing together and facilitating the negotiation of mutually beneficial commercial or retail deals.  

Take Yelp, the popular restaurant review app. Yelp allows small businesses like restaurants, coffee shops, and bars to advertise to an extremely targeted group: individuals close enough to come to the restaurant who searched for a relevant term. A barbecue restaurant in Nashville can show ads only to people searching their zip code for terms like “bbq” and “barbecue.” This enables small businesses that couldn’t afford radio or television advertising to attract customers.

The existence of online customer reviews gives consumers a more trusted way to evaluate the restaurant.

All of the internet aggregators, including Amazon, Facebook, and Google, enabled new service providers by creating a market and standardizing the rules of that market to reduce transaction costs.[21]

The “sharing economy” is more accurately called the “renting economy” from the perspective of consumers, and the “gig economy” from the perspective of producers. Most of the benefits are the result of new markets enabled by lower transaction costs, which allow consumers to rent rather than own, including “renting” someone else’s time rather than employing them full time.

It’s easier to become an Uber driver than a cab driver, and an Airbnb host than a hotel owner. It’s easier to get your product into Amazon than Walmart. It’s easier to advertise your small business on Yelp, Google or Facebook than on a billboard, radio or TV.

Prior to the internet, the product designer was faced with the option of selling locally (which was often too small a market), trying to get into Walmart (which was impossible without significant funding and traction), or simply working for a company that already had distribution in Walmart.

On the internet, they could start distributing nationally or internationally on day one. The “shelf space” of Amazon or Google’s search engine results page was a lot more accessible than the shelf space of Walmart.

As a result, it became possible for people in certain highly specialized roles to work independently of firms entirely. Product designers and marketers could sell products through the internet and the platforms erected on top of it (mostly Amazon and Alibaba in the case of physical products) and have the potential to make as much as, or more than, they could inside a corporation.

This group is highly motivated because their pay is directly based on how many products they sell. The aggregators and the internet were able to reduce the transaction costs that had historically made it economically inefficient or impossible for small businesses and individual entrepreneurs to exist.

The result was that in industries touched by the internet, we saw an industry structure of large aggregators and a long tail [22] of small businesses that were able to use the aggregators to reach previously unreachable, niche segments of the market. Though there aren’t many cities where a high-end cat furniture retail store makes economic sense, on Google or Amazon, it does.

source: stratechery.com

Before (Firms)                  After: Platform        After: Long Tail
Walmart and big box retailers   Amazon                 Niche product designers and manufacturers
Cab companies                   Uber                   Drivers with extra seats
Hotel chains                    Airbnb                 Homeowners with extra rooms
Traditional media outlets       Google and Facebook    Small offline and niche online businesses

For these industries, coordination scalability was far greater and could be seen in the emergence of micro-multinational businesses. Businesses as small as a half dozen people could manufacture in China, distribute products in North America, and employ people from Europe and Asia. This sort of outsourcing and the economic efficiencies it created had previously been reserved for large corporations.

As a result, consumers received cheaper, but also more personalized products from the ecosystem of aggregators and small businesses.

However, the rental economy still represents a tiny fraction of the overall economy. At any given time, only a thin subset of industries are ready to be marketized. What’s been done so far is only a small fraction of what will be done in the next few decades.

Yet, we can already start to imagine a world which Munger calls “Tomorrow 3.0.” You need a drill to hang some shelves in your new apartment. You open an app on your smartphone and tap “rent drill.” An autonomous car picks up a drill and delivers it outside your apartment in a keypad-protected pod and your phone vibrates “drill delivered.” Once you’re done, you put it back in the pod, which sends a message to another autonomous car nearby to come pick it up. The rental costs $5, much less than buying a commercial quality power drill. This is, of course, not limited to drills: it could have been a saw, fruit dehydrator, bread machine or deep fryer.

You own almost nothing, but have access to almost everything.

Neither you nor your neighbors have a job, at least in the traditional sense. You pick up shifts or client work as needed and maybe manage a few small side businesses. After you finish drilling in the shelves, you might sit down at your computer, see what work requests are open, and spend a few hours designing a new graphic or finishing up the monthly financial statements for a client.

This is a world in which triangulation and transfer costs have come down dramatically, resulting in more renting than buying from consumers and more gig work than full-time jobs for producers.

This is a world we are on our way to already, and there aren’t any big, unexpected breakthroughs that need to happen first.

But what about the transaction cost of trust?

In the computer era, the areas that have been affected most are what could be called low-trust industries. If the sleeping mask you order off of Amazon isn’t as high-quality as you thought, that’s not a life or death problem.

What about areas where trust is essential?

Enter stage right: blockchains.

The Blockchain Era: Blockchain Markets Are Eating the World

One area where trust matters a lot is money. Most of the developed world doesn’t think about the possibility of fiat money [23] not being trustworthy because it hasn’t happened in our lifetimes. For those who have experienced untrustworthy money, including major currency devaluations, trusting that your money will be worth roughly the same tomorrow as it is today is a big deal.

Citizens of countries like Argentina and particularly Venezuela have been quicker to adopt bitcoin as a savings vehicle because their economic history made the value of censorship resistance more obvious.

Due to poor governance, the inflation rate in Venezuela averaged 32.42 percent from 1973 until 2017. Argentina was even worse; the inflation rate there averaged 200.80 percent between 1944 and 2017.

The story of North America and Europe is different. In the second half of the 20th century, their monetary policy was comparatively stable.

The Bretton Woods Agreement, struck near the end of the Second World War, aggregated control of most of the globe’s monetary policy in the hands of the United States. The European powers acceded to this in part because the U.S. dollar was backed by gold, meaning that the U.S. government was subject to the physics and geology of gold mining. It could not expand the money supply any faster than gold could be taken out of the ground.

With Nixon’s abandonment of the gold standard in 1971, control over money and monetary policy moved into the hands of a historically small group of central bankers and powerful political and financial leaders, no longer constrained by gold.

Fundamentally, the value of the U.S. dollar today is based on trust. There is no gold in a vault that backs the dollars in your pocket. The dollar, like most fiat currencies today, has value because the market trusts that the officials in charge of monetary policy will manage it responsibly.

It is at this point that the debate around monetary policy devolves. One group imagines this small group of elitist power brokers sitting in a dark room on large leather couches, surrounded by expensive art and mahogany bookshelves filled with copies of The Fountainhead, smoking cigars and plotting against humanity using obscure financial maneuvering.

Another group, quite reasonably, points to the economic prosperity of the last half-century under this system and insists on the quackery of the former group.

A better way to understand the tension between a monetary system based on gold and one based on fiat money has been offered by political science professor Bruce Bueno de Mesquita: “Democracy is a better form of government than dictatorships, not because presidents are intrinsically better people than dictators, but simply because presidents have less agency and power than dictators.”

Bueno de Mesquita calls this Selectorate Theory. The selectorate represents the number of people who have influence in a government, and thus the degree to which power is distributed. The selectorate of a dictatorship will tend to be very small: the dictator and a few cronies. The selectorate in a democracy tends to be much larger, typically encompassing the Executive, Legislative, and Judicial branches and the voters who elect them.

Historically, the size of the selectorate involves a tradeoff between the efficiency and the robustness of the governmental system. Let’s call this the “Selectorate Spectrum.”

Dictatorships can be more efficient than democracies because they don’t have to get many people on board to make a decision. Democracies, by contrast, are more robust, but at the cost of efficiency.

Conservatives and progressives alike bemoan how little their elected representatives get done but happily observe how little their opponents accomplish. A single individual with unilateral power can accomplish far more (good or bad) than a government of “checks and balances.” The long-run health of a government depends on balancing the tradeoff between robustness and efficiency: the number of stakeholders cannot be so large that nothing gets done and the country never adapts, nor so small that an individual or small clique can hijack the government for personal gain.

This tension between centralized efficiency and decentralized robustness exists in many other areas. Firms try to balance the size of the selectorate: large enough that there is some accountability (e.g. a board and shareholder voting) but not so large as to make it impossible to compete in a market, which is why most decisions are centralized in the hands of a CEO.

We can view both the current monetary system and the internet aggregators through the lens of the selectorate. In both areas, the trend over the past few decades is that the robustness of a large selectorate has been traded away for the efficiency of a small one.[24]

A few individuals (heads of central banks, heads of state, corporate CEOs, and the leaders of large financial entities like sovereign wealth and pension funds) can move markets and politics globally with even whispers of significant change. This sort of centralizing in the name of efficiency can sometimes lead to long feedback loops with potentially dramatic consequences.

Said another way, much of what appears efficient in the short term may not be efficient at all, but merely hiding risk somewhere, creating the potential for a blow-up. A large selectorate tends to appear less efficient in the short term, but it can be more robust in the long term, which makes it more efficient in the long term as well. It is the story of the Tortoise and the Hare: slow and steady may lose the first leg, but it wins the race.

In the Beginning, There Was Bitcoin

In October 2008, an anonymous individual or group using the pseudonym Satoshi Nakamoto sent an email to a cryptography mailing list explaining a new system called bitcoin. The opening line of the conclusion summed up the paper:

“We have proposed a system for electronic transactions without relying on trust.”

When the network went live a few months later, in January 2009, Satoshi embedded in the first block the headline of a story running that day in The Times of London:

“The Times 03/Jan/2009 Chancellor on brink of second bailout for banks”

Though we can’t know for sure what was going through Satoshi’s mind at the time, the most likely explanation is that Satoshi was reacting against the decisions made in response to the 2008 Global Financial Crisis by the small selectorate in charge of monetary policy.

Instead of leaving impactful decisions about the monetary system, like a bailout, up to a single individual such as the Chancellor, Satoshi envisioned bitcoin as a more robust monetary system with a larger selectorate, beyond the control of any single individual.

But why create a new form of money? Throughout history, the most common way for individuals to show their objections to their nation’s monetary policy was by trading their currency for some commodity like gold, silver, or livestock that they believed would hold its value better than the government-issued currency.

Gold, in particular, has been used as a form of money for nearly 6,000 years for one primary reason: the stock-to-flow ratio. Because of how gold is deposited in the Earth’s crust, it’s very difficult to mine. Despite all the technological changes of the last few hundred years, the amount of new gold mined in a given year (the flow) has averaged between 1 and 2 percent of the total gold supply (the stock), with very little variation year to year.

As a result, the total gold supply has never increased by more than 1 to 2 percent per year. In comparison to Venezuela’s 32.42 percent inflation and Argentina’s 200.80 percent inflation, gold’s supply inflation is far lower and more predictable.
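The stock-to-flow argument is just arithmetic. A minimal sketch, using rough, commonly cited estimates for gold (the tonnage figures are assumptions for illustration, not figures from this essay):

```python
# Illustrative estimates: roughly 190,000 tonnes of gold mined to date
# (the stock) and roughly 3,000 tonnes of new production per year (the flow).
stock_tonnes = 190_000
flow_tonnes = 3_000

stock_to_flow = stock_tonnes / flow_tonnes         # ~63 years of production
annual_supply_growth = flow_tonnes / stock_tonnes  # ~1.6% per year

print(f"stock-to-flow ratio: {stock_to_flow:.0f}")
print(f"annual supply growth: {annual_supply_growth:.1%}")
```

An annual supply growth of about 1.6 percent sits squarely in the 1 to 2 percent band described above; a high stock-to-flow ratio is just another way of saying that the existing supply dwarfs anything a year of mining can add.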

Viewed through the lens of Selectorate Theory, we can say that gold or other commodity forms of money have a larger selectorate and are more robust than government-issued fiat currency. In the same way a larger group of stakeholders in a democracy constrains the actions of any one politician, the geological properties of gold constrained governments and their monetary policy.

Whether or not these constraints were “good” or “bad” is still a matter of debate. The Keynesian school of economics, which has become the mainstream view, emerged out of John Maynard Keynes’s reaction to the Great Depression, which he thought was greatly exacerbated by the commitment to the gold standard; governments, he argued, should manage monetary policy to soften the cyclical nature of markets.

The Austrian and monetarist schools believe that human behavior is too idiosyncratic to model accurately with mathematics and that minimal government intervention is best. Attempts to intervene can be destabilizing and lead to inflation, so a commitment to the gold standard is the lesser evil in the long run.

Taken in good faith, these schools represent different beliefs about the ideal point on the Selectorate Spectrum. Keynesians believe that greater efficiency could be gained by giving government officials greater control over monetary policy without sacrificing much robustness. Austrians and monetarists argue the opposite, that any short-term efficiency gains actually create huge risks to the long-term health of the system.

Viewed as money, bitcoin has many gold-like properties, embodying something closer to the Austrian and monetarist view of ideal money. For one, we know exactly how many bitcoin will be created (21 million) and the rate at which they will be created. Like gold, this schedule is outside the control of any individual or small group, giving bitcoin a predictable stock-to-flow ratio and making it extremely difficult to inflate.
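Bitcoin’s issuance rule is public and simple: a block subsidy that starts at 50 BTC and halves every 210,000 blocks, roughly every four years. A minimal sketch that sums the schedule shows why the supply can never exceed 21 million:

```python
# Summing bitcoin's halving schedule. Amounts are tracked in satoshis
# (1 BTC = 100,000,000 satoshis), mirroring the protocol's integer math.
SATOSHIS_PER_BTC = 100_000_000
BLOCKS_PER_HALVING = 210_000

def total_supply_btc() -> float:
    subsidy = 50 * SATOSHIS_PER_BTC  # initial block reward, in satoshis
    total = 0
    while subsidy > 0:
        total += BLOCKS_PER_HALVING * subsidy
        subsidy //= 2  # the halving: integer division, like the protocol
    return total / SATOSHIS_PER_BTC

print(total_supply_btc())  # ~20,999,999.98 -- just under 21 million
```

Because the schedule is simple integer arithmetic that every node verifies, changing it would require convincing the network’s entire large selectorate, not a committee.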

Similar to gold, the core bitcoin protocol also makes great trade-offs in terms of efficiency in the name of robustness.[25]

However, bitcoin has two key properties of fiat money which gold lacks: it is very easy to divide and transport. Someone in Singapore can send 1/100th of a bitcoin to someone in Canada in less than an hour. Sending 1/100th of a gold bar would be a bit trickier.

In his 1999 book Cryptonomicon, science fiction author Neal Stephenson imagined a bitcoin-like money built by the grandchild of Holocaust survivors who wanted to create a way for individuals to escape totalitarian regimes without giving up all their wealth. It was difficult, if not impossible, for Jews to carry gold bars out of Germany, but what if all they had to do was remember a 12-word passphrase? How might history have been different?

Seen in this way, bitcoin offers a potentially better trade-off between robustness and efficiency. Its programmatically defined supply schedule means its inflation rate will be lower than gold’s (making it more robust), while its digital nature makes it as divisible and transportable as any fiat currency (making it more efficient).

Using a nifty combination of economic incentives for mining (proof-of-work system) and cryptography (including blockchain), bitcoin allowed individuals to engage in a network that was both open (like a market) and coordinated (like a firm) without needing a single or small group of power brokers to facilitate the coordination.
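The proof-of-work half of that combination can be sketched in a few lines. This is a deliberately simplified stand-in (real bitcoin mining double-hashes an 80-byte block header against a dynamically adjusted target; the function and parameters here are invented for illustration):

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int = 20) -> int:
    """Search for a nonce whose hash falls below the difficulty target."""
    target = 2 ** (256 - difficulty_bits)  # smaller target = more work
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # the proof: expensive to find, trivial to verify
        nonce += 1

print(mine(b"a block of transactions"))
```

The asymmetry is the point: finding the nonce takes, on average, about a million hashes at this difficulty, while checking it takes one. That is what lets an open network of strangers agree on a shared history without a coordinator.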

Said another way, bitcoin was the first example of money going from being controlled by a small group of firm-like entities (central banks) to being market-driven. What cryptocurrency represents is the technology-enabled possibility that anyone can make their own form of money.

Whether or not bitcoin survives, that Pandora’s Box is now open. In the same way computing and the internet opened up new areas of the economy to being eaten by markets, blockchain and cryptocurrency technology have opened up a different area to be eaten by markets: money.

The Future of Public Blockchains

Bitcoin is unique among forms of electronic money because it is both trustworthy and maintained by a large selectorate rather than a small one.

There was a group that started to wonder whether the same underlying technology could be used to develop open networks in other areas by reducing the transaction cost of trust.[26]

One group, the monetary maximalists, thinks not. According to them, public blockchains like bitcoin will only ever be useful as money because it is the area where trust is most important and so you can afford to trade everything else away. The refugee fleeing political chaos does not care that a transaction takes an hour to go through and costs $10 or even $100. They care about having the most difficult to seize, censorship-resistant form of wealth.

Bitcoin, as it exists today, enhances coordination scalability by allowing any two parties to transact without relying on a centralized intermediary and by allowing individuals in unstable political situations to store their wealth in the most difficult-to-seize form ever created.

The second school of thought is that bitcoin is the first example of a canonical, trustworthy ledger with a large selectorate and that there could be other types of ledgers which are able to emulate it.

At its core, money is just a ledger. The amount of money in your personal bank account is a list of all the transactions coming in (paychecks, deposits, etc.) and all the transactions going out (paying rent, groceries, etc.). When you add all those together, you get a balance for your account.

Historically, this ledger was maintained by a single entity, like your bank. In the case of U.S. dollars, the number in circulation can be figured out by taking how much money the U.S. government has printed and released into the market and subtracting how much it has taken back out.
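To make the “money is a ledger” point concrete, here is a toy ledger; the Ledger class and the account entries are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Ledger:
    # Each entry is an (account, signed amount) pair.
    entries: list = field(default_factory=list)

    def record(self, account: str, amount: int) -> None:
        """Positive amounts flow in (paychecks); negative flow out (rent)."""
        self.entries.append((account, amount))

    def balance(self, account: str) -> int:
        """A balance is nothing more than the sum of an account's entries."""
        return sum(amount for acct, amount in self.entries if acct == account)

ledger = Ledger()
ledger.record("alice", 2_000)   # paycheck in
ledger.record("alice", -1_200)  # rent out
print(ledger.balance("alice"))  # 800
```

Whoever maintains and is trusted to maintain this list (a bank, a central bank, or a public blockchain) controls the money; the paragraphs that follow generalize the question of who gets to maintain it.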

What else could be seen as a ledger?

The answer is “nearly everything.” Governments and firms can be seen just as groups of ledgers. Governments maintain ledgers of citizenship, passports, tax obligations, social security entitlements and property ownership. Firms maintain ledgers of employment, assets, processes, customers and intellectual property.

Economists sometimes refer to firms as “a nexus of contracts.” The value of the firm comes from those contracts and how they are structured within the “ledger of the firm.” Google has a contract with users to provide search results, with advertisers to display ads to users looking for specific search terms, and with employees to maintain the quality of their search engine. That particular ledger of contracts is worth quite a lot.

Mechanical time opened up entirely new categories of economic organization. It allowed for trade to be synchronized at great distances; without mechanical time, there would have been no railroads (how would you know when to go?) and no Industrial Revolution. Mechanical time allowed for new modes of employment that lifted people out of serfdom and slavery.[27]

In the same way, it may be that public blockchains make it possible to have ledgers that are trustworthy without requiring a centralized firm to manage them. This would shift the line further in favor of “renting” over “buying” by reducing the transaction cost of trust.

Entrepreneurs may be able to write a valuable app and release it to anyone and everyone who needs that functionality, collecting micro-payments in their wallet as it is used. A product designer could release their design into the wild, and consumers could download it to be printed on their 3D printers almost immediately.[28]

For the first 10 years of bitcoin’s existence, this hasn’t been possible. Using a blockchain has meant paying any price to minimize the transaction cost of trust, but that may not always be the case. Different proposals are already being built out that allow for more transactions to happen without compromising the trust which bitcoin and other crypto-networks offer.

There are widely differing opinions on the best way to scale blockchains. One faction, usually identifying as Web 3/smart-contract platform/Ethereum, believes that scaling quickly at the base layer is essential and can be done with minimal security risk, while the other camp (bitcoin) believes that scaling should be done slowly and only where it does not sacrifice the censorship-resistant nature of blockchains. Just like the debate between the Keynesian and Austrian/monetarist views of monetary policy, these views represent different beliefs about the optimal tradeoff point on the Selectorate Spectrum. But both groups believe that significant progress can be made on making blockchains more scalable without sacrificing too much trust.

Public blockchains may allow aggregation without the aggregators. For certain use cases, perhaps few, perhaps many, public blockchains like bitcoin will allow the organization and coordination benefits of firms and the motivation of markets while maintaining a large selectorate.

Ultimately, what we call society is a series of overlapping and interacting ledgers.

In order for ledgers to function, they must be organized according to rules. Historically, rules have required rulers to enforce them. Because of network effects, these rulers tend to become the most powerful people in society. In medieval Europe, the Pope enforced the rules of Christianity and so he was among the most powerful.

Today, Facebook controls the ledger of our social connections. Different groups of elites control the university ledgers and banking ledgers.

Public blockchains allow people to engage in a coordinated and meritocratic network without requiring a small selectorate.

Blockchains may introduce markets into corners of society that have never before been reached. In doing so, blockchains have the potential to replace ledgers previously run by kings, corporations, and aristocracies. They could extend the logic of the long tail to new industries and lengthen the tail for suppliers and producers by removing rent-seeking behavior and allowing for permissionless innovation.

Public blockchains allow for rules without a ruler. It began with money, but they may move on to corporate ledgers, social ledgers and perhaps eventually, the nation-state ledger.[29]

Acknowledgments: Credit for the phrase “Markets Are Eating the World” to Patri Friedman.


  1. https://www.bls.gov/opub/mlr/1981/11/art2full.pdf
  2. https://www.bls.gov/emp/tables/employment-by-major-industry-sector.htm
  3. http://www3.nccu.edu.tw/~jsfeng/CPEC11.pdf
4. There are, of course, other types of transaction costs than the ones listed here. A frequent one brought up in response to Coase is company culture, which nearly all entrepreneurs and investors agree is an important factor in a firm’s productivity. This is certainly true, but the broader point about the relationship between firm size and transaction costs holds; culture is just another transaction cost.
  5. http://www.fon.hum.uva.nl/rob/Courses/InformationInSpeech/CDROM/Literature/LOTwinterschool2006/szabo.best.vwh.net/synch.html
  6. https://en.wikipedia.org/wiki/Escapement
  7. Fungibility is the property of a good or a commodity whose individual units are interchangeable. For example, one ounce of pure silver is fungible with any other ounce of pure silver. This is not the same for most goods: a dining table chair is not fungible with a fold-out chair.
8. Piece rates, paying for some measurement of a finished output like bushels of apples or balls of yarn, seem fairer. But they suffer from two issues: for one, the output of the labor depends partially on the skill and effort of the laborer, but also on the vagaries of the work environment. This is particularly true in a society like that of medieval Europe, where nearly everyone worked in agriculture. The best farmer in the world can’t make it rain. The employee wants something like insurance that they will still be compensated for the effort in the case of events outside their control, and the employer, who has more wealth and knowledge of market conditions, takes on these risks in exchange for increased profit potential.
  9. For the worker, time doesn’t specify costs such as effort, skill or danger. A laborer would want to demand a higher time-rate wage for working in a dangerous mine than in a field. A skilled craftsman might demand a higher time-rate wage than an unskilled craftsman.
10. The advent of the clock was necessary for the shift from farms to cities. Sunup to sundown worked effectively as a schedule for farmers because summer was typically when the most labor on farms was required, so longer days were useful. For craftsmen or others working in cities, work was not as driven by the seasons, so a trusted measure of time that didn’t vary with the seasons was necessary. The advent of a trusted measure of time led to an increase in the quantity, quality and variety of goods and services because urban, craftsman-type work was now more feasible.
  11. https://unenumerated.blogspot.com/2017/02/money-blockchains-and-social-scalability.html. I am using the phrase “coordination scalability” synonymously with how Nick uses “social scalability.” A few readers suggested that social scalability was a confusing term as it made them think of scaling social networks.
  12. 150 is often referred to as Dunbar’s number, referring to a number calculated by University of Oxford anthropologist and psychologist Robin Dunbar using a ratio of neocortical volume to total brain volume and mean group size. For more see  https://www.newyorker.com/science/maria-konnikova/social-media-affect-math-dunbar-number-friendships. The lower band of 15 was cited in Pankaj Ghemawat’s World 3.0
  13. https://www.jstor.org/stable/2938736
  14. http://discovermagazine.com/1987/may/02-the-worst-mistake-in-the-history-of-the-human-race
  15. Because what else would you want to do besides eat bread dipped in fresh olive oil and drink fresh beer and wine?
  16. From The History of Money by Jack Weatherford.
  17. It also allowed them to squeeze out competitors at different places in the supply chain and put them out of business which Standard Oil did many times before finally being broken up by anti-trust legislation.
  18. http://www.paulgraham.com/re.html
  19. Tomorrow 3.0 by Michael Munger
  20. http://www.paulgraham.com/re.html
21. There were quite a few things, even pre-internet, in the intersection between markets and firms, like approved-vendor auction markets for government contracting and bidding, but they were primarily for very high-ticket items where higher transaction costs could be absorbed. The internet brought that threshold down dramatically, to something as small as a $5 cab ride.
  22. The Long Tail was a concept WIRED editor Chris Anderson used to describe the proliferation of small, niche businesses that were possible after the end of the “tyranny of geography.” https://www.wired.com/2004/10/tail/
  23. From Wikipedia: “Fiat money is a currency without intrinsic value that has been established as money, often by government regulation. Fiat money does not have use value, and has value only because a government maintains its value, or because parties engaging in exchange agree on its value.” By contrast, “Commodity money is created from a good, often a precious metal such as gold or silver.” Almost all of what we call money today, from dollars to euros to yuan, is fiat.
  24. Small institutions can get both coordination and a larger selectorate by using social norms. This doesn’t enable coordination scalability though as it stops working somewhere around Dunbar’s number of 150.
  25. Visa processes thousands of transactions per second, while the bitcoin network’s decentralized structure processes a mere seven transactions per second. The key difference being that Visa transactions are easily reversed or censored whereas bitcoin’s are not.
  26. https://medium.com/@cdixon/crypto-tokens-a-breakthrough-in-open-network-design-e600975be2ef
  27. https://medium.com/cryptoeconomics-australia/the-blockchain-economy-a-beginners-guide-to-institutional-cryptoeconomics-64bf2f2beec4
  28. https://medium.com/cryptoeconomics-australia/the-blockchain-economy-a-beginners-guide-to-institutional-cryptoeconomics-64bf2f2beec4
  29. https://twitter.com/naval/status/877467629308395521