The Great Influenza of 1918

On January 19, 1917, German Foreign Secretary Arthur Zimmermann sent a fateful telegram to Heinrich von Eckardt, Germany’s Ambassador to Mexico. The telegram authorized Eckardt to propose a military alliance with Mexico. Germany would provide Mexico with generous financial aid and the military support needed to recover lost territory in Texas, New Mexico, and Arizona. Germany hoped that this alliance and the opening of a war theater in the southwestern United States would impede American entry into the European theater of the World War.

President Wilson had campaigned for re-election in 1916 on a platform that included staying out of the war, but the discovery of the Zimmermann telegram forced his hand. And once Wilson delivered his war message to Congress on April 2, 1917, he was all in. The United States immediately shifted to a war footing. Wilson intended to be ruthless; nothing would stand in the way of the war effort:

“Once lead this people into war, and they’ll forget there ever was such a thing as tolerance. To fight you must be brutal and ruthless, and the spirit of ruthless brutality will enter into the very fibre of our national life, infecting Congress, the courts, the policeman on the beat, the man in the street.”

– Woodrow Wilson, The Great Influenza, p. 121

With those chilling words, Wilson instituted emergency measures to rapidly mobilize American industry to produce the weapons of war that would be required for victory. The military instituted a draft in June 1917 for men between the ages of 21 and 30, which was later expanded to include men between the ages of 18 and 45. In May 1918, the Sedition Act of 1918 was signed into law. This act prohibited “disloyal, profane, scurrilous, or abusive language” directed against the United States government. The die was cast: America was on a war footing, a massive mobilization of troops and materiel was underway, and freedom of speech was curtailed. This confluence of events would soon bring fateful consequences that no one at the time could foresee.

To understand the influenza pandemic of 1918, one must first acquire some basic knowledge of the war in which the pathogen spread. It is also important to survey the state of modern medicine at the time. John M. Barry’s book, The Great Influenza, was first published in 2004, long before the world was besieged by the Coronavirus pandemic of 2020. Barry sets the stage in a comprehensive manner by devoting nearly a hundred pages to the rapid advance of medicine during the final decades of the nineteenth century, which culminated in the establishment of great institutions such as Johns Hopkins and the Rockefeller Institute. These institutions would be at the forefront of medical research for decades and played an instrumental role in the response to the 1918 pandemic. There is much we can learn from Barry’s account of the distant past, which has many parallels to what we are currently going through in our own pandemic of 2020.

The Civil War resulted in the deaths of 624,511 soldiers, but most did not die in battle. An estimated 388,580 soldiers, or over 62 percent of overall deaths, perished due to disease. War throughout history has resulted in the spread of disease due to the movement of troops and the close proximity of soldiers. The toll taken by disease in the Civil War was unprecedented in scale, but all American wars up to World War I had involved more deaths from infection than from combat. The scientists who were making breakthroughs in medicine during the first decades of the twentieth century were fully aware of this and anticipated that a major epidemic could occur in the process of fighting the current war.

Carver Hospital, Washington D.C. during the Civil War. Photo: Smithsonian Magazine

Surgeon General William C. Gorgas was well aware of the history of infectious disease in wartime. Gorgas had built his reputation two decades earlier after the end of the Spanish-American war when he worked to eradicate yellow fever and malaria during the construction of the Panama Canal. Now, in 1917, Gorgas was faced with the scenario of the U.S. Army expanding from tens of thousands of soldiers to millions in just a few months. President Wilson’s singular focus on the war required millions of men to be trained in numerous “cantonments” housing approximately 50,000 men each.1 Barry describes the process of ramping up this military activity in 1917 noting that men were crammed into overcrowded barracks and many spent the first winter in tents. This was the first time in history when so many men were rapidly brought together from all parts of the country. These men, from the cities and the countryside, had different immunity and vulnerability to diseases.

Gorgas understood the risk of an epidemic spreading within the military cantonments and the fact that any infectious disease would inevitably spread across the country since troops were being moved frequently by rail between various bases before deploying overseas. And the masses of soldiers sailing for Europe would bring with them infectious disease as well.

Gorgas and the scientific community were not helpless or unaware of the risks and took steps to prepare for an epidemic. Army doctors who had trained at the Rockefeller Institute were assigned to cantonments and the military began producing their own vaccines including enough typhoid vaccine for five million soldiers. The military also acquired serums to treat pneumonia and meningitis as well as smallpox vaccine. Special railroad cars funded by the Rockefeller Institute and the American Red Cross were equipped as rolling laboratories for deployment to camps at the first sign of an outbreak.2

All of these efforts did pay off in terms of protecting American soldiers from diseases such as malaria, which killed tens of thousands of French, British, and Italian troops.3 However, none of the efforts were a match for the Great Influenza of 1918. Although we will never have an exact accounting of the final U.S. death toll, epidemiologists today believe that 675,000 Americans out of a total population of 105 million died of the 1918 flu.4 Influenza, despite the best efforts of many brilliant scientists, ended up taking more American lives in 1918 and 1919 than all of the fatalities recorded in the Civil War. Gorgas’s nightmare scenario had come true.

The story of the Great Influenza of 1918 is long and tortured, and one that Barry delves into in great detail. Like most books worth reading, Barry’s work does not lend itself to an easy summary. However, it is worth highlighting two aspects of the book regarding the spread of influenza within very different settings. The influenza first got its foothold within the military establishment and later found its way into the civilian population. In both cases, critical lapses in leadership contributed to catastrophic results and a study of what went wrong can shed light on what policymakers should be doing today.

Camp Funston, Kansas, 1918, Photo: Battle Creek Enquirer

The evidence suggests that the influenza of 1918 began in February in rural Haskell County, Kansas where a severe form of the virus was prevalent. Haskell County is about three hundred miles from Camp Funston where military training was taking place in earnest during that winter. Barry notes that only a trickle of people moved between Haskell and Camp Funston but a large number of soldiers moved between Funston and other army bases and, eventually, to France.5

In March, 1,100 troops at Camp Funston were sick enough to require hospitalization and, shortly thereafter, the illness struck camps in Georgia with over ten percent of forces reporting in sick. Two-thirds of the largest army camps suffered from this form of influenza during the spring and several cities near the bases also had a higher than normal number of cases.

Soon, the flu was prevalent in Europe, particularly in France where U.S. soldiers were rapidly arriving. However, in its initial form, the influenza of 1918 was not particularly lethal and nearly all of the troops recovered. By late May, however, it appeared that the virus had become more virulent. In one station of 1,018 French army troops, 688 soldiers required hospitalization and 49 died.6 This mortality rate among otherwise healthy young men caused Surgeon General Gorgas and others to take note. The illness continued during the summer months in Europe, but by August 20, the virus seemed to have died out entirely.7

Barry goes into some detail regarding how influenza viruses adapt to their hosts.8 Through a phenomenon known as “passage”, the virus adapts to its environment as it propagates from one living animal to another. Passage can cause a virus to become a more efficient killer and, in the case of the 1918 influenza, the virus was becoming more ferocious even as it seemed to temporarily disappear.

Camp Devens, Massachusetts, April 1919 (Photo: Soldiers’ Mail)

Camp Devens proved to be the site of the explosion of the second wave of the virus during the month of September.9 Camp Devens was opened in August 1917 and had suffered from diseases such as measles and pneumonia, perhaps due to the high speed of its construction, the concentration of soldiers, and insufficient sanitary standards.

Despite these disadvantages amid the inherent stresses of preparing for war, the camp had what Barry characterizes as a “first rate” medical staff. In early September of 1918, the camp held over 45,000 men but was designed for only 36,000. The camp’s hospital was capable of housing 1,200 patients and was almost empty on September 6 with just 84 patients. The hospital was so underutilized that the staff was preparing to launch several major scientific investigations.

This calm state of affairs was soon shattered. A number of cases of pneumonia were diagnosed early in the month, and soon several men were hospitalized with symptoms that were believed to be meningitis. The symptoms that were noted did not appear to be related to influenza, and there were no attempts to quarantine the men who were falling ill. Then the situation truly exploded. On September 22, 1,543 soldiers reported ill with influenza. Soon, nearly 20 percent of the camp reported sick, and 75 percent of those who reported sick had to be hospitalized. By September 26, the camp’s medical staff was completely overwhelmed, with nurses and doctors themselves ill and dying. They had to stop admitting new patients to the hospital.

The men were dying of pneumonia, but not any type of ordinary pneumonia that the medical staff was used to seeing. Within hours, many of the victims had pneumonia so severe that cyanosis set in as the lungs became incapable of transferring oxygen to the blood and patients took on a color so dark that there were rumors that the disease was actually the Black Death. Deaths soon averaged about 100 per day amid complete chaos with care almost nonexistent and corpses lining hallways surrounding the overflowing morgue like “cord wood”.

With Surgeon General Gorgas in Europe, his deputy Charles Richard responded quickly with an order to isolate and quarantine all cases and to segregate soldiers from civilians outside Camp Devens. Richard knew that the epidemic was almost certain to spread and that once it was established in a population, it would be nearly impossible to stop. He urged a complete halt to transfers of soldiers, but by this time, it was too late. The contagion had spread far and wide to other bases. The civilian population would be next.

Liberty Loan Parade, Philadelphia, PA, September 28, 1918 (Photo: Washington Post)

On September 28, 1918, several hundred thousand people came out to see a Liberty Loan parade in Philadelphia that stretched at least two miles. The parade, billed as the greatest in the city’s history, featured bands, soldiers, sailors, military equipment, Boy Scouts and other civic groups. The point of the parade, other than to maintain high morale among the civilian population in a time of war, was to raise funds for the war effort through the sale of Liberty Bonds. During the war, each city was given a quota for the sale of Liberty Bonds which were crucial for funding the war effort. The federal government, empowered by the 1918 Sedition Act, used every tool at its disposal to appeal to patriotism and silence any dissent or bad news. The pressure to deliver was intense. And the consequences were catastrophic.10

By mid-September, influenza was raging in the military installations surrounding Philadelphia and reports of contagion in Boston and the Great Lakes region were known as well. Philadelphia’s director of public health, Wilmer Krusen, had taken no action, denied that influenza posed any danger to the city, and decided to make no contingency plans whatsoever. Krusen finally met with several medical experts on September 18, a week after influenza first arrived in the city, but at the time few cases had surfaced within the civilian population.

Krusen was advised to implement a quarantine, but his priority was to keep the public “calm”, and he restricted his response to displaying posters around the city reminding people to take small precautions such as placing a handkerchief over their mouth before sneezing or coughing. Even as the disease accelerated, Krusen insisted that the dead were not victims of an epidemic but rather had succumbed to the “old-fashioned influenza”.

Despite the suppression of free speech during this period, several doctors did more than advise Krusen to take stronger measures and to cancel the parade. They tried to warn reporters that the parade would spread influenza and result in deaths but no newspaper was willing to quote the warning. Barry does note that Krusen would have had no support from the mayor or other public officials if he had attempted to take stronger measures but, as the man in charge of public health for the city, he clearly failed to sound the alarm. In fact, he assured parade goers that they were in no danger!

The aftermath was predictable and severe. Just three days after the parade, on October 1, 117 people in Philadelphia died of the influenza. On October 3, Krusen finally banned public meetings and closed churches, schools, and theaters. But by then, the epidemic had taken firm hold. In just ten days, “the epidemic had exploded from a few hundred civilian cases and one or two deaths a day to hundreds of thousands ill and hundreds of deaths each day.”11 Soon, bodies were piling up in front of houses, and many families could not even get bodies removed from their homes. Undertakers were unable to keep up, and gravediggers were either too ill to work or too scared to dig graves. Hundreds of bodies filled the city morgue, which had capacity for only thirty-six. For those who were able to get into a hospital, nearly one in four died each day. Philadelphia General Hospital had 126 nurses, and they took precautions including wearing masks and gowns. Nevertheless, 43 percent of the staff required hospitalization and ten nurses died.

By the end of the epidemic, more than 12,000 deaths were recorded.12

Mass grave, Philadelphia, PA, 1918. (Photo: Philly Voice)

It is difficult to do justice to the full horror of the 1918 influenza pandemic in an article such as this one and even in a full length book because the scale of the disaster is incomprehensible by modern standards. The Coronavirus pandemic has brought its own horrors to the United States in March and April 2020, but we have not yet seen anything on the scale of what took place over a century ago. Nevertheless, there are important lessons that we can draw from what took place so long ago.

It is important to note that COVID-19 is not a form of influenza but a disease caused by a coronavirus.13 This poses challenges to scientists because this novel coronavirus is relatively unknown compared to the various strains of the influenza virus that have been circulating for decades. Although the initial symptoms are similar to those of the common flu, the rate of transmission is higher and mortality is much higher.

However, we can still learn a great deal from Barry’s account of the Great Influenza of 1918 because the mitigation methods are still basically the same. Social distancing, self-isolation, and quarantine were the only effective tools available to society a century ago and remain the only effective tools today until a safe and effective treatment is found. Many of the steps Barry describes will sound familiar to someone reading the book today because we are in the midst of living through very similar constraints on our activity.

Perhaps the most important lesson is that human nature often resists taking strong preventative action until the situation is bad enough to make the short-term pain of prevention seem justified. That was certainly the case in Philadelphia but also in countless other locations in 1918. Compounding this basic aspect of human nature was the incessant pressure from the federal government to prioritize the war effort above all else and to not tolerate any dissent. The propaganda measures that the government relied on to rally public support left truth and honesty as casualties — with dreadful consequences.

At a time of widespread pessimism, the other lesson we should draw from 1918 is that pandemics eventually die out. The virulent strain of influenza that shattered so many lives in 1918 eventually weakened and finally disappeared by 1920. Today, many people are talking about how our world will never be the same, how people will not want to be in close proximity to others, and some even suggest that people will stop shaking hands. This seems unlikely.

The Roaring 20s

The Great Influenza of 1918 took an unbelievably sad toll on the country but it did pass. Although medicine was not helpless a century ago, it is far more advanced today. The best minds in medicine are working on treatments and vaccines for COVID-19 and society has implemented much more effective social distancing rules even at great economic cost. We are not in the midst of war. We have scientists willing to contradict those in power and reporters willing to publicize contrary views. We also have social media and the internet which makes suppression of free speech much more difficult. All of these things are cause for optimism regarding our ability to bounce back after this pandemic passes.

At the same time, we must realize that we are not out of the woods yet. Many experts believe that we could face a second wave of infections this fall which brings dreadful reminders of how the initially mild influenza of 1918 adapted and returned in a much more virulent form in the fall. The history of COVID-19 is still being written. We have no excuse to ignore the lessons of the past as we chart the rest of our course through the current crisis.

  1. The Great Influenza, p. 145
  2. Ibid., p. 147
  3. Ibid., p. 406
  4. Ibid., p. 397
  5. The source of much of this section can be found in chapter 14 of the book.
  6. Ibid., p. 173
  7. Ibid., p. 174
  8. See Ibid., chapter 15
  9. Ibid., chapter 16
  10. Ibid. See chapters 17 and 19 for an account of the Philadelphia parade and its aftermath.
  11. Ibid., p. 221
  12. For a more detailed account of the Liberty Loan parade, see this Philly Voice article dated September 27, 2018.
  13. See this article from LiveScience for a basic explanation of the differences.

The World Has No Pause Button

Time marches on relentlessly. From the birth to the death of a human being, or any living creature, the passage of time is a constant rhythm, never ceasing its forward progress.

Try as we might to slow the relentless tick of the clock, Father Time remains undefeated.

Modern medicine can put a human being into a medically induced coma in order to better treat an underlying disease, but time still marches on relentlessly, with consequences such as loss of muscle mass that grow more severe the longer the coma lasts. You can temporarily pause consciousness but not the relentless effect of time on the body.

When it comes to a civilization, there isn’t even such a thing as a medically induced coma. The world marches on and the individual actors in the drama experience the world day by day. Attempting to temporarily halt a complex human system risks the societal equivalent of muscular atrophy, both in terms of real productive capacity as well as the morale and underlying psychology of the people within the society.

Father Time remains undefeated, as he always has been and always will be.

The United States and much of the world are in the midst of a disruption that will no doubt be regarded as one of the most significant events of the twenty-first century. From our vantage point in early April 2020, we do not yet know how the coronavirus pandemic itself will turn out or what the economic and social ramifications will be, but it is already clear that we are living through a period of historic significance.

Coronavirus, or COVID-19, is not unique in its severity or in the fact that it has spread widely. The world has seen worse pandemics in the course of human history, with the influenza pandemic of 1918 being the most recent comparable example.1 However, the current coronavirus pandemic is unique in at least three important ways that are worth highlighting.

First, modern transportation systems allowed the virus to spread rapidly over vast distances. The interval between infection and the appearance of symptoms is long enough to allow apparently healthy people to move all over the world before health problems are even noted. Additionally, the presence of asymptomatic carriers who never show any symptoms facilitated greater spread compared to an epidemic such as Ebola, which quickly strikes down its victims. In 1918, a human being could travel several hundred miles over a twenty-four hour period. In 2020, one can travel to nearly any corner of the world in that time. This has dramatic ramifications.

Second, once the extent of the problem could no longer be denied, forceful measures were taken to attempt to reduce transmission of the disease even at tremendous economic cost. Never before have so many countries voluntarily shut down vast portions of their economies in order to save lives. This is even more remarkable in light of the mortality risk being skewed toward the elderly who, from a cold-blooded economic perspective, have fewer “life-years” ahead of them than the young. Society was too slow to react but eventually made an honorable and moral choice to protect our parents and grandparents from harm, even at the cost of massive disruption.

Third, this is the first worldwide pandemic where the suffering and drama have been recorded in real-time for everyone to see, not only on television but on social media. The 1918 influenza pandemic left us with photographs and personal accounts of its victims, the medical community, and political leaders, but we do not have video footage of the crisis. Since the generation that witnessed the 1918 pandemic is almost entirely gone, we do not have living eyewitness accounts to turn to either. In contrast, we are witnessing, in real-time, the suffering of the coronavirus pandemic, and we can track the ever-growing number of cases, deaths, and hospitalizations several times a day if we choose to.

St Louis, Missouri, October 1918. (Photo by Underwood Archives/Getty Images)

John Maynard Keynes, writing during the Great Depression, coined the term “animal spirits” to describe the underlying aspects of human psychology that drive economic decision making and cannot be reduced to calculations on a spreadsheet. Keynes understood that an individual’s decision to do something depended on a certain level of confidence in the future, and that the level of confidence is informed by experience drawn from the past:

“Most, probably, of our decisions to do something positive, the full consequences of which will be drawn out over many days to come, can only be taken as the result of animal spirits – a spontaneous urge to action rather than inaction, and not as the outcome of a weighted average of quantitative benefits multiplied by quantitative probabilities.”

The General Theory of Employment, Interest, and Money, J.M. Keynes, 1936

Human beings are not bloodless value-maximizing algorithms, but flawed and emotional beings who rely on qualitative as well as quantitative analysis when it comes to economic decision making. Those of us who have known older relatives who lived through the Great Depression or the World War II years might recall a sense of frugality and risk aversion that could seem excessively pessimistic at times.

This generation saw hardship and the darker side of life and that influenced their outlook well into the post-war years. Shaped by the Depression and tested by the war, this generation prudently and relentlessly built the post-war economy, igniting a quarter-century economic boom often referred to as America’s “golden age”.2 This generation, and the baby-boomer generation that followed, are now the most vulnerable to the current pandemic, and the ones we seek to protect with the forceful measures being taken to slow the rate of infection and avoid overwhelming our medical system.

Depression-era bread lines

The United States is clearly in the midst of a recession far more severe than the “Great Recession” that coincided with the 2008-09 financial crisis. There is no point in citing statistics because estimates of GDP contraction are all over the place, but it is clear that major segments of the economy have all but shut down in recent weeks and remain shut down as we begin the second quarter. Unemployment will skyrocket from the multi-year lows recorded earlier this year and will most likely hit record levels not seen even during the Great Depression. This paragraph is necessarily qualitative; falsely precise “estimates” from economists are nothing more than guesses at this point and serve no useful purpose. Suffice it to say that conditions are bad, and likely to get worse in the short run.

There are major differences between the modern economy of 2020 and the economy of the Great Depression years. Those of us who are able to work from home are extremely fortunate and would not have this opportunity without the communication revolution brought about by the internet. Those who work in the restaurant and tourism industries are not nearly as fortunate, nor are those who work in the gig economy. The recent $2+ trillion federal spending bill will lessen the impact of this shock with one-time payments to individuals and expansion of unemployment benefits, but these are palliative measures. The same is true for the steps to shore up small businesses. Such steps, while needed in the short run, are the economic equivalent of trying to put the affected sectors of the economy into a medically induced coma.

In a complex system such as the modern economy, there are follow-on effects when major sectors simply shut down. All businesses face fixed costs which continue to be incurred even when revenue falls to zero. The most obvious include expenses such as rent and debt service. A distressed business that cannot pay such expenses will cause ripple effects when its suppliers and creditors face potential cash shortfalls of their own. Like ripples in a pond emanating from a thrown stone, this malign chain reaction courses over the entire economy. No one knows the extent of the damage at this point.

The speed and shape of the recovery is on everyone’s minds at this point, and much will depend on Keynes’s animal spirits. Will the boarded up businesses scattered through countless American cities open up again when governments give the all-clear to do so? Will the customers of these businesses be willing to again go out and spend money in person after weeks or months of self-isolation?

Much will depend on whether the coronavirus pandemic is viewed as a one-time event or as a potentially recurring feature of our lives going forward. If the pandemic is viewed as a horrible, but temporary, interlude in an era of prosperity, then the government’s efforts to induce a “medical coma” of the hardest hit sectors of the economy could well succeed. Sound businesses will emerge with muscle atrophy. Those that were weak even before the crisis may never reopen at all. But the overall system will rebound and regroup.

Consumers, after enduring months of isolation, are likely to return to communal experiences sooner rather than later, as long as they believe that the risks of doing so are contained. Humans are highly social beings and not meant to live in isolation. There will be pent up demand and a sense of excitement to be able to resume our normal lives. A certain segment of the population, whether due to lingering risks for susceptible populations or general fear, might hold back but eventually most everyone will be lured back to restaurants, bars, performances and sporting events. Life will resume. The economy will come out of the “coma”, weakened but still very much alive, and slowly rebuild. We might get the much discussed “V-shaped” recovery and perhaps the economy will reach or exceed 2019 levels of output by 2021.

The alternate scenario is terribly grim. Businesses and consumers will start to view pandemics as a feature of twenty-first century life and will retrench. This is likely to occur if the economy reopens prematurely and is forced to shut down again in the fall if a second wave of infections gains traction. The initial euphoria of getting back to normal life will be crushed if there is a second round of shutdowns. Hopes of a “V-shaped” recovery will evaporate and we will likely enter years of economic stagnation, mired in depression.

Whether the current shutdown lasts another month or another two months is certainly of critical importance when it comes to estimates of second quarter 2020 GDP, which is what economists seem to be focusing on. However, the more important goal at this point is to ensure that the pandemic is contained to the point where it is unlikely to resurface again and require a second shutdown.

How will we know when it is safe to reopen without risking a ruinous second shutdown? Policy makers will have to balance risk of a second wave of infection with the risk of an extended shutdown. The emergence of a safe and effective therapeutic drug that could be used to save lives would be a major milestone. Testing a much larger percentage of the population would also be a step in the right direction. Experts believe that a vaccine could take twelve to eighteen months to develop. The Gates Foundation recently announced a major initiative to speed research into a vaccine. The current shut-downs clearly cannot continue for that long without risking that the patient never wakes up from the coma.

The next few weeks and months could very well set the direction for the next decade or even longer. Policymakers should focus on the science behind the pandemic and only reopen the economy when trusted experts believe that the risk of an uncontrolled second wave is well contained. A second wave of closures and shut-downs must be avoided at all costs. Such an event will likely kill animal spirits and bring about a depression.

  1. The Great Influenza by John M. Barry provides a compelling account of the 1918 pandemic.
  2. This Wikipedia article provides some good information on the post-war boom, along with links to additional information.

The Genius Behind Apple’s Greatest Products

“Jony had a special status. He would come by our house, and our families became close. Steve is never intentionally wounding to him. Most people in Steve’s life are replaceable. But not Jony.”

— Laurene Powell Jobs

Steve Jobs walked onto the stage on the morning of July 9, 1997, unshaven, and wearing shorts and sneakers. Gilbert Amelio had just announced his departure as CEO of Apple moments earlier and Jobs was making his triumphant return to the company he had co-founded more than two decades earlier. He opened by asking the assembled group a rhetorical question:

“What’s wrong with this place?”

“It’s products. The products suck! There’s no sex in them anymore!”

Jony Ive was sitting in the audience and wanted to quit and return to England with his family. At that point, he had worked for Apple for nearly five years and had recently been promoted to lead the design group. However, he was disillusioned by Amelio’s lack of appreciation for design. His group was being asked to perform “skin jobs” where the designers would simply suggest what a product would look like from the outside and then engineers would make it as cheap as possible.

However, there was something about the way Jobs spoke about Apple that caused Ive to reconsider leaving. Jobs spoke about how Apple needed to return to its roots and focus on making great products rather than just making money. To a creative genius, working under the Amelio regime had been demoralizing. Jobs, famous for his “reality distortion field”, clearly inspired Ive at a point where he was extremely discouraged. He decided to stay.

Steve Jobs liked to talk about the concept of serendipity which refers to the magic that can result from chance encounters or events. According to Leander Kahney’s book, Jony Ive: The Genius Behind Apple’s Greatest Products, Jobs nearly overlooked the existence of Ive’s design studio, which was housed away from the main Apple campus, and began a search for an outside designer shortly after he returned. Had Jobs done that, it would have amounted to the opposite of serendipity – a historic error of immense proportions. But Ive quickly sensed the danger of being overlooked and began to market his team’s capabilities within Apple. Jobs eventually took a tour of the design studio and was “bowled over by the creativity and rigor he saw”. According to Ive, he and Jobs saw eye to eye immediately. Jobs became “almost a fixture” at the design studio, dropping by the studio at the end of the day. Ive was a brilliant designer but an inexperienced manager. Jobs was not a designer but had a vision for the company. In combination, they made history.

The great collaboration between Jobs and Ive began during a dark period in Apple’s history. After a number of years of strong profitability at Apple, the release of Microsoft’s Windows 95 operating system flooded the market with cheap computers that had a graphical user interface similar to what Apple had been offering for years. Many consumers were no longer willing to pay a premium price for Apple products. By the first quarter of 1996, the year that Ive became head of the design group, Apple was posting losses and laying off thousands of workers. By the time Jobs returned to Apple in mid-1997, Apple had lost $1.6 billion and market share had dropped from 10 percent to 3 percent. There were serious doubts about the company’s ability to survive.

Jobs set out to dramatically simplify an overly complex product line and focused on introducing consumer and professional versions of desktop and portable computers. Jobs and Ive saw eye to eye when it came to the role of customer feedback in general. Ive had the following to say regarding the design of the iBook, which was the consumer-oriented laptop product:

“We don’t do focus groups — that is the job of the designer. It’s unfair to ask people who don’t have a sense of the opportunities of tomorrow from the context of today to design.”

Jony Ive: The Genius Behind Apple’s Greatest Products, p. 146.

The genius of Apple at the turn of the century was that they excelled at giving customers what they did not yet know that they wanted. Laptops tended to be utilitarian devices without much panache — all function with little thought for form. Both Jobs and Ive had long cared deeply about product aesthetics, believing that the smallest of details could make or break a user’s perception of a product. For example, the iBook was designed to encourage users to handle it with its curved and rubberized surfaces. The iBook was also unique in “awakening” when the lid was lifted, but this required much effort designing a special latch that customers would never see.

Of course, the products that propelled Apple to a trillion dollar valuation two decades later were not desktop or laptop computers but portable electronic devices. This started with Apple’s entry into music with the iPod in 2001. The iPod took off in 2003 when the device and iTunes ecosystem were made compatible with the Windows operating system. Apple sold 40 million iPods in 2005 but it was becoming obvious that mobile phones would eventually offer music functionality as a feature. Apple needed to build a phone.

Although Ive’s team was not involved in the early R&D for the iPod, the team was heavily involved in the iPhone’s design from inception. Kahney highlights Apple’s extreme secrecy during this “bet the company” project. While Ive was aware of the development of the operating system and was in constant communication with Jobs, his design team worked on the design of the iPhone without ever seeing the operating system. Similarly, the software team built the operating system without seeing the early prototypes of the product. Kahney’s access to members of Ive’s team makes the narrative a compelling look into the internal workings of an ultra-secretive organization.

Kahney’s book was published in 2013, two years after the death of Jobs and while Ive was still working at Apple. Throughout the later chapters of the book, the reader cannot help but view Ive as the logical successor to Jobs. In fact, Jobs himself stated that no one at Apple had more operational power than Ive:

“The difference that Jony has made, not only at Apple but in the world, is huge. He is a wickedly intelligent person in all ways. He understands business concepts, marketing concepts. He picks stuff up just like that, click. He understands what we do at our core better than anyone. If I had a spiritual partner at Apple, it’s Jony. Jony and I think up most of the products together and then pull others in and say, ‘Hey, what do you think about this?’ He gets the big picture as well as the most infinitesimal details about each product. And he understands that Apple is a product company. He’s not just a designer. That’s why he works directly for me. He has more operational power than anyone else at Apple except me. There’s no one who can tell him what to do, or to butt out. That’s the way I set it up.”

Steve Jobs by Walter Isaacson, p. 342

One might ask why Jobs did not designate Ive as his successor as CEO instead of Tim Cook. Cook was hired by Jobs in 1998 to run Apple’s supply chain and was instrumental in making the supply chain and manufacturing operations more efficient over the years. If Ive was the creative genius enabling Jobs’s vision to become reality, Cook was the man who made the “trains run on time”. In retrospect, and perhaps even at the time, it seems clear that Jobs, Cook, and Ive were the key people driving forward Apple’s evolution in the 2000s which set the stage for the company’s continued meteoric rise in the 2010s.

Jobs chose to designate Cook as his successor as CEO, no doubt due to his great respect for Cook’s operational skills. After Jobs passed away in 2011, Ive remained at Apple for nearly eight more years before resigning to create his own design firm in mid-2019. After reading Kahney’s book, it is impossible not to view Ive’s departure as a major blow for Apple. Readers who are interested in the evolution of Apple over the past two decades will not be disappointed with Kahney’s book. Ive’s involvement is also described in detail by Walter Isaacson in his 2011 biography of Steve Jobs.

Consumer electronics is a highly competitive industry subject to the shifting tastes and preferences of customers. It is difficult to know what the industry will look like ten years into the future, yet a significant chunk of Apple’s market cap is implicitly discounting strong cash flows decades into the future that are expected to come from products that we may not even know we need yet. Tim Cook’s operational expertise will no doubt serve Apple well in the years to come. For Apple shareholders, it is unfortunate that Jony Ive will not be along for the ride as well.

Disclosure: Individuals associated with The Rational Walk LLC own shares of Berkshire Hathaway. As of December 31, 2019, Apple was Berkshire Hathaway’s largest stock investment.

