Vaccination and the US Declaration of Independence

History lessons, and back to the future today. Leak theories aside, most think Covid-19 ‘jumped’ from a species like a bat to a human; smallpox likely jumped from a cow to a human, and that jump may itself have had something to do with humans beginning to farm animals. Everywhere you look, the history of how we changed and evolved, and the way in which we live, leads to viruses exploiting the new niches we create. History shaped viruses, but it’s reciprocal: viruses, and the issue of vaccines, have in turn determined how history eventually panned out.

Some say the Roman Empire’s demise came about because it didn’t have vaccines. The first decline began in 180 A.D. as smallpox and/or measles decimated Roman troops on their return home from the Middle East, killing up to seven million people in the ‘Plague of Galen’:

https://www.mcgill.ca/oss/article/did-you-know-history/measles-plague-ruined-rome

Other stories of smallpox are analogous: 25 million people were living in the region of the world now referred to as Mexico when Hernando Cortes came ashore in 1519. The ensuing smallpox epidemic wiped out between 5 and 8 million of the local population within just two years. And it’s well chronicled that an ambitious plan to rid the world of this scourge was hatched in 1959: the WHO smallpox vaccination program. The war was waged through relentless skirmishes, and territory was fought over inch by inch, like trench warfare, across the globe, as vaccination edged forward. Smallpox first disappeared in North America (1952), then Europe (1953), South America (1971), followed by Asia (1975), and finally Africa (1977).

Worldwide elimination of smallpox took thirty years. When anyone tells you that everyone has to be vaccinated before we can return to normal, how does that work, given what we know happened with other viruses? To appreciate the remarkable discoveries behind Operation Warp Speed and our own UK efforts, it’s worth taking a step back to look at where vaccines originally came from, and at some things of which Michael Caine is often quoted as saying: “Not a lot of people know that” (my grandmother used to say “well fancy that”). After all, so much has been written on these subjects, what could possibly be new? What new narratives might there be on this subject – isn’t it all well-trodden ground?

Well, about half a century before Jesty’s and then Jenner’s milkmaid cowpox discoveries, in 1718, an English aristocrat named Lady Mary Wortley Montagu, wife to one of the senior British diplomats in Turkey, observed the oriental tradition of ‘variolation’ and demanded that her own son be inoculated using the technique by the embassy surgeon. Variolation, a form of inoculation that uses the smallpox virus itself, is incredibly dangerous compared with the vaccinations we know of today. Inoculation by ‘variolation’ took advantage of the fluid-filled vesicles characteristic of a smallpox infection, using that virus-containing fluid to soak a thread or thin piece of cloth so that the thread became impregnated with the virus, ideally in a milder form. The fluid used would probably come from someone with a mild case rather than from a patient on death’s door from the disease.

Allowed to dry for at least twenty-four hours, the thread could then be preserved for months before its use in variolation. To inoculate an individual, a physician would make a small incision, normally on the arm, just deep enough to draw blood. The physician would then press the smallpox-soaked thread into the incision site and cover the wound for at least a day with cloth or plaster. In the most successful cases, the inoculated person would develop a mild case of smallpox and subsequently be kept quarantined, cared for only by those previously infected or inoculated.

Back in England, in 1721 Lady Mary Wortley Montagu had her daughter inoculated as well, this time in the presence of the British court physicians and the president of the Royal Society, and so the practice began to spread and be studied. However, the latest evidence suggests the story can be pushed back even further in time, according to Anna Durham in her study entitled, ‘Extirpating the Loathsome Smallpox: A Study in the History and Demise of Smallpox, as Aided by Thomas Jefferson’.

Cotton Mather, well known in American history as a Puritan minister and for his support of the Salem witch trials, wrote to the Royal Society of London about variolation. However, his 1716 letter was the result of a conversation with "[his] Negro-Man Onesimus, who is a pretty Intelligent Fellow." Anna Durham’s account published in the journal Rice Historical Review, is that Onesimus, an African slave purchased for Mather by his congregation in 1706, had shown Mather a peculiar scar on his arm. Onesimus explained that this scar was the result of undergoing ‘variolation’, when he had been living in Africa. Mather's letter explained that while this procedure inflicted upon Onesimus "something of ye Smallpox," the slave became, however, "forever preserved" from contracting the dreaded deadly disease ever again.

During 1721 there was a smallpox outbreak in Massachusetts, and Cotton Mather attempted to encourage inoculation (it wasn’t called vaccination then) via variolation. Because Mather's knowledge came primarily from a slave, and because the procedure was practised only in Eastern, non-Christian countries, all but one of the Boston doctors resisted the experiment, much as the conservative English doctors had done around the same time.

The one doctor who took Mather's risk, however, contributed much to making inoculation widespread throughout the colonies. Dr. Zabdiel Boylston inoculated 244 individuals with smallpox, only 6 of whom died. This mortality rate of two and a half percent, compared with a mortality rate of fourteen percent among colonists who did not undergo variolation, demonstrated at that time the effectiveness of the technique. In 1766 Thomas Jefferson underwent variolation himself, and the subsequent protection so impressed him that he made it part of his life’s work to defend the right of the individual to choose variolation should they desire it, against local laws at the time prohibiting it. When several angry citizens of Norfolk, Virginia formed a mob and rioted against a local doctor who performed inoculations, burning his house down, Jefferson in fact took on the duty of defending the doctor, but he lost the case. However, his continued passionate defence of the individual right to protect themselves with this technique meant he may have been involved behind the scenes in drafting legislation legalising variolation, and interestingly the language used in that earlier bill is very close to the wording of the Declaration of Independence.

There is a sense, then, in which the wording of the Declaration of Independence, one of the most famous documents in the world, was influenced by Jefferson’s earlier legal battles over smallpox variolation – which is to say, by an early form of vaccination.

Jenner and orphans

The British doctor Edward Jenner had also observed something strange: those who caught a related disease called cowpox never came down with its deadlier cousin. Thus, in 1796, he began giving people cowpox intentionally, rendering them immune to smallpox and creating the first vaccine. Jesty, a Dorset farmer, performed vaccinations in 1774, 22 years before Jenner's first. Fewster, an apothecary-surgeon who knew Jenner personally, is also reported to have performed the procedure several years before Jenner. And of course there’s Montagu, Mather and Boylston above. Regardless of who did it and who is credited, the breakthrough introduced another dilemma: how could doctors deliver vaccines to people who needed them? Within Europe, distributing the vaccine was manageable. People with cowpox developed blister-like sores filled with a fluid called lymph. Doctors would prick open the sores, smear the lymph on silk threads or lint, and let it dry. They would head to the next town over and mix the crusty lymph with water to reconstitute it. Then they’d scratch the fluid into the arms or legs of people there to give them cowpox. The process was straightforward but laborious.

The real trouble started when doctors tried to vaccinate people who were far away. The lymph could lose its potency traveling even the 215 miles from London to Paris, let alone to the Americas, where it was desperately needed: smallpox outbreaks there were verging on apocalyptic, killing up to 50% of people who got the virus. Every so often threads of dried lymph did survive an ocean journey (a batch reached Newfoundland in 1800), but the lymph was typically rendered impotent after months at sea. Spain especially struggled to reach its colonies in Central and South America, so in 1803, health officials in the country devised a radical new method for distributing the vaccine abroad: orphan boys (a far cry from our current logistical challenges).

The plan involved putting two dozen Spanish orphans on a ship. Right before they left for the colonies, a doctor would give two of them cowpox. After nine or 10 days at sea, the sores on their arms would be nice and ripe. A team of doctors onboard would lance the sores and scratch the fluid into the arms of two more boys. Nine or 10 days later, once those boys developed sores, a third pair would receive fluid, and so on. (The boys were infected in pairs as backup, just in case one’s sore broke too soon.) Overall, with good management and a bit of luck, the ship would arrive in the Americas when the last pair of orphans still had sores to lance. The doctors could then hop off the ship and start vaccinating people.

Given the era, it’s likely that no one asked the orphans whether they wanted to participate, and some seemed too young to consent anyway. They’d been abandoned by their parents, were living in institutions, and had no power to resist. But the Spanish king, Carlos IV, decided to make them a few promises: they would be stuffed with food on the voyage over to make sure they looked hearty and hale upon arrival. After all, no one would want lymph from the arm of a sickly child. Appearance mattered. And they’d get a free education in the colonies, plus the chance at a new life there with an adoptive family. It was a far better shake than they’d get in Spain. When Balmis’s crew started vaccinating people in the Americas, cathedral bells rang, priests said Masses of thanksgiving, and people shot off fireworks and held bullfights in their honour:

The Royal Philanthropic Vaccine Expedition finally set sail in November 1803. Twenty-two orphans, ages 3 to 9, made the journey, accompanied by lead doctor Francisco Xavier de Balmis, his team of assistants, and Isabel Zendal Gómez, the head of the boys’ orphanage, who would care for and comfort them. Despite all the careful planning, the expedition nearly failed. When the ship arrived in modern-day Caracas, Venezuela, in March 1804, just a single sore was left on the arm of a single boy. But it was enough. Balmis immediately started vaccinating onshore, focusing on children, who were most susceptible to smallpox. By some accounts, Balmis and his team vaccinated 12,000 people in 2 months. From Caracas, Balmis’s crew split into two parties. His top deputy, José Salvany Lleopart, led one expedition down into what’s now Colombia, Ecuador, Peru, and Bolivia. The journey was rugged, through thick jungles and over the forbidding Andes Mountains. Still, over the next few years they managed to vaccinate upwards of 200,000 people. Many villages greeted them as saviors.

Meanwhile, Balmis marched up through Mexico, where he vaccinated on the order of 100,000 more people. Midway through the journey, he dropped off the orphans with their new families in Mexico City. Then he headed to Acapulco to prepare for another vaccine expedition, this time to Spanish colonies in the Philippines. He picked up a few dozen more boys in the town, but instead of finding orphans, he hired boys from various families, essentially renting them as vaccine mules for the journey to Asia. The ship arrived in the Philippines on April 15, 1805, and within a few months, Balmis’s team had vaccinated 20,000 people. The expedition was so successful, in fact, that Balmis went rogue and sailed to China in the fall of 1805 to vaccinate people there. The achievement was staggering: Without any modern equipment or transportation, Balmis’s team managed to spread Jenner’s vaccine across the world in less than a decade, vaccinating hundreds of thousands of people and saving perhaps millions of lives.

The road to smallpox vaccination was messy. Spain devised the orphan plan only because prior attempts to transport the vaccine the old-fashioned way, to North America, South America, India, and elsewhere, had foundered, prolonging outbreaks that could have been stopped. If that last boy’s sore had popped before reaching Caracas, the orphan expedition would have failed as well. And while many towns hailed the vaccinators, some doctors who’d grown rich providing quack smallpox treatments were early anti-vaxxers, actively thwarting the campaign by refusing to administer vaccines. Let’s not forget the orphans, either: removed from their childhood home, injected with diseased fluid, and put on a ship for a strange land. The ordeal must have been terrifying.

No matter the era or disease, vaccine campaigns are always daunting logistical challenges, with dozens of points at which the distribution chain can, and often does, break down because of communication hiccups or weak supply-chain links. Despite health agencies the world over spending billions upon billions of dollars on campaigns to eradicate polio, the disease still exists in pockets, in large part because distributing vaccines to certain areas has proved nearly impossible. A campaign for a childhood malaria vaccine erupted in controversy when the UN failed to disclose side effects to parents in Africa, including meningitis and an increased risk of death in young girls. Even in the case of measles, a disease with an established vaccine, a bungled campaign killed 15 children in South Sudan as late as 2017.

On December 9, 1979, the Global Commission for the Certification of Smallpox Eradication signed their names to the statement that "smallpox has been eradicated from the world." It was the first time that a disease had been banished from the earth by the planning and action of the world's public health professionals. And it became a model for later (ongoing) efforts. The disease only spread from human to human, so there had been an unbroken chain of infection for more than three millennia. In the 1960s, before the eradication program, more than half a million people died every year from the disease.

But in country after country, vaccination and isolation programs lowered rates of infection until the numbers dwindled to 1 person who was infected, the last patient. In Botswana in 1974, it was a little girl, Prisca Elias. In Pakistan, 1976, Kausar Parveen. Rahima Banu, Bangladesh, 1976. Amina Salat, Ethiopia, 1976, from left to right:

Finally, someone had to be the last person on Earth to contract smallpox, and that person was Somalia's Ali Maow Maalin. Or was he?

He contracted it on October 12, 1977, while guiding a vehicle carrying two smallpox-infected children. On the 22nd, he fell ill and developed a fever. By October 26, the rash had appeared on his skin. He was misdiagnosed with chickenpox and sent home. As his symptoms developed, it became clear he had smallpox, but Ali Maow Maalin did not want to go to an isolation camp, so he did not tell local disease surveillance officials. A friend, however, eventually reported his condition and collected a reward. All told, he came into face-to-face contact with 91 people, which led the WHO to undertake a massive intervention to prevent the re-spreading of the disease.

An intensive search began to find everyone with whom he had come into contact. In all, 91 face-to-face contacts were identified, 12 of whom had no vaccination scar, and six who had been hospital patients or visitors. Heroic measures were taken, including a search and vaccination of the town and of everyone entering or leaving town at any one of four checkpoints. House-by-house searches throughout the region were conducted monthly, and a national search was completed on December 29. But no one caught the virus.

So, on October 16, 1979, after nearly two years of monitoring, the WHO declared Smallpox Zero. And Ali Maow Maalin went into the history books as the last man on earth to catch wild smallpox. Guess what? He died in 2013 of malaria, after years of working to eradicate polio. Now, the only potential sources of infection are the labs that do research on the virus. Some researchers have been pushing for destruction of lab stocks since the 1990s, but US and Russian bioterror researchers have fought the proposal, saying they need more time to develop defensive measures.

The last smallpox victim?

Interestingly, just as the WHO was about to announce final victory over smallpox, a very strange thing happened which, amazingly enough, still has not been explained to this day. The disease fought back with its last dying gasp. Infectious viruses have a tendency to deliver unexpected counter-punches just when you are convinced they are knocked out. And it was research linked to vaccination which may have been to blame. The very last person to catch and die from smallpox was not found by outreach WHO workers stranded in a remote, impoverished, ill-educated part of the world. It happened in Birmingham. The man above was not ‘the last person’ who died of smallpox.

Janet Parker was a 40-year-old medical photographer working in the anatomy department at Birmingham Medical School. Suddenly, on Friday 11 August 1978, she became mysteriously unwell, developing unsightly, unusual red spots on her back, limbs and face. As smallpox hadn’t been endemic in the UK since the early 1950s and was thought to have been eradicated throughout Europe for decades, the doctors naturally first thought she had chickenpox. In fact, she was actually displaying the classic ‘centrifugal’ rash and influenza-like symptoms of smallpox. But as she was not improving, even worsening, she was admitted to the Catherine-de-Barnes Isolation Hospital in Solihull on 20 August.

She worked one floor above a laboratory where heavily regulated research into smallpox was being conducted. This just happened to be one of the last few places on earth where the dangerous virus was allowed to be stored, though under very close supervision. This is Janet Parker:

Professor Alasdair Geddes, a consultant in infectious disease at East Birmingham Hospital, remembers the night that the fateful diagnosis was made. As reported by the BBC, he recalls asking Professor Henry Bedson, who ran the smallpox laboratory at Birmingham Medical School, about Janet Parker’s test samples, as they were being scrutinised.  By examining skin lesion scrapings under an electron microscope the pathogen could be identified as follows:

By a twist of fate, the on-call microbiologist that day was Professor Bedson. "I said 'can you see anything Henry?', and he never answered," said Prof Geddes, according to the BBC. "So, I gently moved his head aside so I could look down the microscope and there were brick-shaped particles that are characteristic of the smallpox virus”. Professor Geddes recalls Professor Bedson’s reaction of horror. The smallpox virus must have escaped from his strictly sealed-off part of the building and infected the medical photographer, despite the fact she had never visited his laboratory. Professor Geddes describes that first fateful slide under the microscope as looking “like a building site, with bricks lying everywhere”. "I think as soon as he saw it, he knew somehow it had come from his lab. He knew what would follow," Professor Geddes said. But did Professor Bedson really foresee then that the world’s press as well as the WHO would descend, and that he would find himself at the centre of an international media storm?

What must it be like to be the doctors making a diagnosis of a deadly disfiguring disease thought to have been eradicated worldwide? What if you thought it was a mistake at your lab which had re-introduced it back onto the planet? Do we get a clue from what later happened to Professor Bedson, as to how the scientists holed up in any laboratory under suspicion, for example, in Wuhan, might be feeling?

The 49-year-old professor, a recognised international expert on the disease, was clearly distressed by the outbreak. A mass media scrum, with reporters and camera crews camped outside Bedson's house, ensued. On August 30, with Government investigators swarming all over his lab, Professor Bedson, discouraged yet concerned to help, was ordered away and isolated at his home in the suburbs. A few days later he received a negative letter from the WHO. This might have been interpreted as a signal that would effectively have ended his research, which ironically enough, given the current preoccupation of our modern pandemic, was on possible variants of the virus.

As the days went on and Mrs Parker remained in isolation, her condition deteriorated. She was left almost blind in both eyes from the sores, and then she descended into renal failure. Her hair would get trapped in the sores oozing pus between her shoulder blades. As the disease tightened its grip, Janet Parker developed pneumonia, then stopped responding. Relatives were not allowed to visit, so it was a very lonely slide into coma. It was found that she had most likely caught a particularly virulent form of smallpox referred to as the Abid strain.

On 26 May 1978, several months before this incident, Professor Bedson had driven to London to visit Prof Keith Dumbell, a fellow smallpox specialist at St Mary’s Hospital medical school. As reported by Sally Williams writing in The Guardian, Dumbell offered vials of smallpox virus for research purposes. They included an extraordinarily contagious strain, isolated in 1970 from two Pakistani patients, one of whom was a three-year-old boy called Abid. Bedson appears to have driven the virus 120 miles back to Birmingham without, as far as we know, any special precautions. What would have happened if there had been a car crash on the motorway?

His laboratory began investigating the Abid strain in mid-July. As Janet Parker lay in her hospital bed, dying alone, her father Frederick, 77, died on 5 September from an apparent cardiac arrest while in quarantine; induced, perhaps, by the stress of his daughter's illness? An outbreak can claim lives in more than one way, as we are re-learning today. However, no post-mortem examination was ever carried out due to the smallpox infection risk.

With the outbreak having claimed its first victim, and with the Government inquiry underway, a day later, on September 6, 1978, Prof Bedson went out to his garden shed at his home and killed himself by cutting his throat with a kitchen knife. He left a note, which read: "I am sorry to have misplaced the trust which so many of my friends and colleagues have placed in me and my work. Above all to have dragged into disrepute my wife whom I so dearly loved and my beloved children. I realise that this is not the most sensible thing I have done but may I hope in the end allow them to get some peace."

Professor Bedson was rushed to the emergency department of the local hospital and resuscitated, but it was too late to prevent irreversible brain damage; his life support was switched off the next day. At the inquest into his death the coroner, Richard Whittington, blamed Professor Bedson’s death on hounding by the press. The Professor’s phone at home had been ringing off the hook with calls from reporters. But it was a charge the National Union of Journalists took vociferous umbrage at. However, while it would not have been so fashionable a consideration back then, and therefore might have been missed, the isolation of an enforced quarantine may also have played its psychological role.

The problem was that no matter how thorough any inquiry was, it became stymied by the fact that the one person who could possibly have provided an answer to the vital question of how Janet Parker caught smallpox, was now dead. This pattern of suddenly unpredictable behaviour of supposedly dispassionate scientists at the centre of these high-stakes situations will come back to haunt our story of vaccination through recent history.

Five days later, at 03:50 on 11 September 1978, Mrs Parker passed away. The last person on earth to die from a disease which had been the scourge of the human race for three thousand years. And the reason she got it was partly a deeply tragic side-effect of the much-needed research into the disease. Janet’s illness had lasted for thirty days and she was the first person to die from smallpox for five years. Yes, medical science had saved the lives of millions, but it had also, ironically, in a sense killed the last victim. If you are going to understand vaccines, you are going to have to grasp the contrast between the headline-grabbing anecdote and the overwhelming statistics which may be pointing the other way.

Over 340 contacts of Janet Parker were traced and vaccinated. Fortunately, only one – her mother – developed smallpox, and this was aborted by further vaccination and drugs. She had the Abid strain. That strain has been renamed The Parker Virus, and is still alive, kept in a CDC laboratory in the United States.

Janet Parker herself was not allowed to be buried, for fear that her body would contaminate the soil with virus. Staff were given instructions on protecting themselves, including what clothing to wear, but in a twist which replicates the kind of poor planning we have seen with the modern pandemic, there was also some astonishing bungling. The undertaker, Ron Fleet, was left to remove the corpse from the hospital entirely by himself, which meant he struggled to move the body, probably engendering more risk to himself and therefore the outside world. “There was also some kind of liquid and I remember that I was frightened that the bag would split open. The body was covered in sores and scars – it was horrific,” he later said to reporters. When he arrived at the hospital, he had been expecting to take the body from a fridge in the mortuary, but instead found Janet Parker had been stored in a transparent body bag mysteriously packed with wood shavings and sawdust, left on the floor of a garage away from the main hospital building. The Abid strain had simply been too new for her immune system.

This incident illustrates another feature of infectious outbreaks: innocuous actions taken by ordinary people on the ground floor, seemingly out of sight, can have massive consequences for the rest of society. It may be that pandemics are unique as human predicaments in this asymmetry between the seemingly trivial and its potentially massive impact.

Another paradox at the heart of vaccination is that it is such a successful technology that it wipes out a disease and, to an extent, the memory of the ailment as well. It’s never quite so difficult to get people to accept a vaccine when someone nearby becomes seriously ill, since fear drives take-up, but the whole point of population-level vaccination is to prevent you ever having to personally experience the disease or know someone with it. Paradoxically, maybe it was the success of past vaccination programs which made us feel we lived in a world free from contagion, and lulled us into a false sense of security. Perhaps the story of vaccines will be this constant see-sawing between suppressing an epidemic, becoming complacent, and then facing another outbreak.

Some painful experiments

The difficult experiments on orphans and others weren’t just a feature of life centuries ago. Back a few decades to 1942, in the grip of the Second World War, the US military faced an existential threat from within. A hepatitis outbreak was suspected to have infected hundreds of thousands of personnel. There were no animal or cell-culture models for studying the viral liver disease. Desperate to find the source of the outbreak and learn how to contain it, the military joined forces with biomedical researchers, including some from the University of Pennsylvania in Philadelphia and Yale University in New Haven, Connecticut, to launch human experiments that continued for decades after the war.

During the 30-year programme, meticulously chronicled in Dangerous Medicine, researchers infected more than 1,000 people, including over 150 children, with viruses that cause hepatitis. The people enrolled were prison inmates, disabled children, people with severe mental illnesses, and conscientious objectors performing community service in lieu of fighting. Owing to biases in the US prison and psychiatric-hospital populations, a disproportionate number were Black. The long-term consequences will never be fully reckoned: although rarely fatal in the short term, hepatitis can lead to chronic liver disease and cancer years after the initial infection.

These days, horrifying stories of human medical experiments in the mid-twentieth century are well-trodden territory, the most famous being the studies in Tuskegee, Alabama, which withheld treatment from hundreds of Black men with syphilis for decades, starting in the 1930s and ending only in 1972, as I’ve written about. The study initially involved 600 Black men: 399 with syphilis and 201 who did not have the disease. Participants’ informed consent was not collected. Researchers told the men they were being treated for “bad blood,” a local term used to describe several ailments, including syphilis, anemia, and fatigue. In exchange for taking part in the study, the men received free medical exams, free meals, and burial insurance. By 1943, penicillin was the treatment of choice for syphilis and becoming widely available, but the participants in the study were not offered treatment.

In 1972, an Associated Press story about the study was published. As a result, the Assistant Secretary for Health and Scientific Affairs appointed an Ad Hoc Advisory Panel to review the study. The advisory panel concluded that the study was “ethically unjustified”; that is, the “results [were] disproportionately meagre compared with known risks to human subjects involved.” In October 1972, the panel advised stopping the study and soon after medical care was provided to survivors.

Back to the hepatitis experiments, which began in 1942 with the outbreak of the disease among US soldiers and military personnel, an outbreak that was, researchers eventually determined, caused by a contaminated batch of yellow fever vaccine. Launched under the aegis of the war effort, the studies also ended only in 1972, when public and professional sentiment towards experiments on vulnerable populations shifted. Researchers studied a wide range of hepatitis biology, distinguishing between hepatitis A, which is transmitted by contaminated food, and hepatitis B, often spread by contaminated blood products. They looked for ways to inactivate the hepatitis B virus in blood supplies, and tested treatments and means of prevention.

Some of the studies involved deliberately exposing people to infected material, either by injection or through ingestion of “milkshakes” containing hepatitis virus in the form of stool samples mixed with chocolate milk. At least four people died from the disease in the course of these experiments. This is a 1945 newsletter for the Civilian Public Service camp in New Haven, Connecticut, where conscientious objectors were enrolled in hepatitis studies as part of the war:

For decades, the researchers justified their work to colleagues and the press, reframing it to meet the ethos of the times. Initially, it was portrayed as a necessary sacrifice to support the troops. Later, experiments on people in prison were pitched as paths to rehabilitation through service to society. Studies on mentally ill people were an extension of ‘fever therapy’, the idea among some scientists at the time that high body temperature caused by infectious diseases such as malaria and, perhaps, hepatitis might improve some psychiatric conditions. And experiments on disabled children were rationalized as an attempt to generate immunity against a disease that was already frequent in the crowded and unhygienic institution in which they were housed. Although the long-term consequences of hepatitis were not fully understood at the time of the experiments, there were signs as early as the 1940s, and the researchers could have acknowledged them. Eventually, by the 1970s and 1980s, epidemiological studies had shown that carriers of hepatitis B were more likely to develop cirrhosis and liver cancer than those who were not carriers.

Particularly crushing is the naivety about how hepatitis affects children. The immediate symptoms are not as severe in children, so scientists argued that infecting young people would give them immunity that would protect them when they grew older and more vulnerable to severe infections. In fact, children with hepatitis B are much more likely than infected adults to become lifelong carriers, and to experience long-term consequences.

There was a time when we could have casually looked down our noses at mid-twentieth-century ignorance about infectious diseases. But with the world still in the throes of a coronavirus pandemic, there are parallels. Witness how efforts have been focused on the acute impacts of disease (hospitalization, death) without much thought to long-term consequences (disability). Or think of how those with the least agency (children, people in prison, people with severe mental illnesses) have been put at risk by those with the most power. Many clinical trials in healthy individuals still rely on vulnerable populations. Some people move from one to the next in search of food, housing or remuneration in exchange for their participation. People in regions with poor access to health care sometimes have to enrol to get basic medical treatment. And in the United States and Europe, there is still no requirement to provide compensation for long-term disability that might arise from participation in clinical trials. I wonder what research abuses we are justifying to ourselves today.

Seeing something new, now

Understanding novelty in the context of the immune system is related to most everything we’re discussing and debating. What do variants mean for vaccines? Should children get immunised? How can we think about vaccine breakthrough cases in terms of severity or Long Covid? What might endemic COVID look like? What about waning immunity and reinfections? And why is vaccinating globally an urgency? The geneticist Theodosius Dobzhansky once famously said, “Nothing in biology makes sense except in the light of evolution.” This makes something else clear: nothing in this pandemic makes sense except in the light of novelty.

The above shows an electron microscope image of an isolate from the first U.S. case of COVID-19. Not too different to the smallpox pic above but very different in terms of harm.

The issue with SARS-CoV-2 is it’s pretty new to our immune systems. That makes it very dangerous. Viruses that are new to us spread faster and are more lethal than old familiar ones. Some scientists are tempted to chalk this up to evolution. The argument is that a virus that leaves its host alive will outcompete one that kills its host. Viruses do sometimes become less deadly as they adapt to a new host species (like us), but they also sometimes become more deadly. But whether wrong or right for a given virus, this tempting just-so story can be a distraction.

Novelty is bad regardless of virus evolution. When a virus is new, nobody possesses acquired immune protection against it. Acquired immune protection is a different kind of adaptation: not virus evolution, but our own learned, adaptive immunity, which we build over our lifetimes as we encounter new pathogens and learn how to fend them off. If nobody has adaptive immune protection, a virus spreads faster. Even a few immune individuals in a population can meaningfully slow the rate of virus spread, since they are less likely to become infectious and infect others. If there are enough immune individuals, the virus may not be able to spread at all. This is the logic of population immunity and herd immunity. It is important. We talk about it a lot.
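
Here is a minimal sketch of that herd-immunity arithmetic, in Python. Every number in it is an illustrative assumption (R0 here simply means the average number of people one case infects in a fully susceptible population), not an estimate for any particular virus:

```python
# Toy illustration of population ("herd") immunity, not an epidemiological model.
# Assumption: immunity in a fraction of the population simply removes that
# fraction of potential onward infections.

def effective_r(r0: float, immune_fraction: float) -> float:
    """Average onward infections per case when part of the population is immune."""
    return r0 * (1.0 - immune_fraction)

def herd_immunity_threshold(r0: float) -> float:
    """Immune fraction above which each case infects, on average, fewer than one person."""
    return 1.0 - 1.0 / r0

r0 = 3.0  # assumed value, for illustration only
for immune in (0.0, 0.25, 0.50, 0.70):
    print(f"immune fraction {immune:.0%}: effective R = {effective_r(r0, immune):.2f}")
print(f"herd immunity threshold for R0={r0}: {herd_immunity_threshold(r0):.0%}")
```

With these made-up numbers, once roughly two-thirds of people are immune, each case infects fewer than one other person on average and the outbreak shrinks, which is the sense in which even partial immunity slows spread.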

If nobody has adaptive immune protection, a virus causes severe disease in more of the people it infects. This is also important. We don't talk about it enough. Unless we eradicate SARS-CoV-2 (possible, but let’s face it, not likely, especially in the short term), just about everyone is going to encounter the virus sooner or later, especially with Omicron. But those who have adaptive immunity from infection or vaccination may not get sick at all. Even if they do, they will be less likely to get very sick or die. 

Now that we have safe, effective vaccines, we can give people immunity without causing dangerous disease. The more people who see the vaccine before they see SARS-CoV-2, the fewer severe cases, long-term health problems, and deaths. Faster worldwide rollout will save lives. It really is that simple. But why is it that simple? Can't immunity wane over time? Can't the virus evolve, allowing it to infect the immune? Can't it infect or reinfect some immune individuals even without evolving? And if it does, won't we be back in March 2020, facing a fast-spreading virus that causes severe disease? No, as Omicron has shown us. It’s very different to smallpox.

Vaccines are great, but are they still the answer if they fail to prevent infections? It is tempting to answer no, and to think of adaptive antiviral immunity as a castle wall. It might keep out some marauding raiders, but if they breach it, the defenceless townsfolk will be slaughtered just as easily as if there had been no wall in the first place. That's the wrong mental model. Some features of immune protection are indeed wall-like: mucus and antibodies in our noses can restrain infectious respiratory viruses before they ever reach one of our cells and start replicating. 

But we don't just have that wall. We also have a castle guard, who can fight and kill the marauders even if they successfully breach the wall, protecting our townsfolk. And if they're already familiar with the marauders' particular weapons and tactics, they'll have practiced counter-tactics that can give them the upper hand.  What's more, this is where the analogy, already strained, I admit, breaks completely: outside of fantasy novels, medieval castles could not produce an exponentially growing quantity of guardsmen in time to counter an exponentially growing horde of marauders.

With that in mind, I'll now drop the Arthurian analogies and explain the actual mechanisms but remember this: adaptive immunity gives you a head start against a virus infection that you didn't have before, when you were immunologically "naive" to it. That head start could be the difference between a mild fever and ending up on a ventilator. When you're infected with a virus you're never seen before, you first mount an innate immune response. This is largely non-virus-specific; it acts to limit virus replication in ways that are more or less the same regardless of the virus. For example, the innate immune response kills infected cells, preventing them from producing new infectious virus particles (virions).  Later in this first infection, you mount an adaptive immune response. This is a targeted response to the specific new invading virus. Like the innate response, it has many facets ("the immune response is complex", as immunologists love to remind us). But there's one key concept for our purposes here: antigenic recognition. In essence, your body gets more effective at fighting the virus by learning how to recognise some of the proteins that make up a virion. Through a process of trial-and-error, you produce immune cells and antibodies that match those proteins. This gives you both a targeted antiviral arsenal and an early warning system in the case of reinfection.

After you’ve gotten rid of all the viable virus and infected cells in your body (“cleared” the infection), the virus-specific B- and T-cells that you developed don't all go away. You keep some of them around to mount a faster and more powerful response if you’re ever reinfected. On reinfection, your cells notice that you're facing a known pathogen. Your memory immune cells then replicate, growing exponentially to counter the virus's own exponential growth within your body. Sometimes, this "recall response" will crush the infection before you even feel sick. Other times, you'll feel under the weather for a bit as you fight off the virus. But crucially, you have a head start that you didn't have when you were immunologically "naive" to the virus.
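
To make that "head start" concrete, here is a back-of-the-envelope sketch. Every figure in it (the doubling time, the response delays) is an assumption invented for illustration, not measured data:

```python
# A cartoon of the adaptive "head start": the virus grows exponentially until
# an effective immune response starts clearing it. All numbers are made up.

def peak_viral_load(days_until_response: float, doubling_time_days: float = 1.0,
                    start_load: float = 1.0) -> float:
    """How far the virus has grown by the time an effective response kicks in."""
    return start_load * 2 ** (days_until_response / doubling_time_days)

naive_delay = 7.0   # assumed days to mount a primary (first-time) adaptive response
primed_delay = 3.0  # assumed days for a memory ("recall") response

print(f"naive host:  virus reaches ~{peak_viral_load(naive_delay):.0f}x its starting level")
print(f"primed host: virus reaches ~{peak_viral_load(primed_delay):.0f}x its starting level")
# With these made-up numbers the primed host faces about 8x the starting level,
# the naive host about 128x, before clearance even begins.
```

The point of the toy arithmetic is simply that shaving a few days off the response time, which is what immune memory does, cuts the peak of an exponentially growing infection by a large factor.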

There are many viruses for which we do not have effective vaccines. By the time you reach your teenage years, you’ve almost certainly been infected by a bunch of respiratory viruses, some relatively nasty, like influenza A, and others comparatively benign, like rhinoviruses and the alphabet soup of seasonal coronaviruses (sCoVs): HKU1, 229E, OC43, NL63. We think of the sCoVs as a minor nuisance; in most adults, they cause the common cold. 

SARS-CoV-2 is a coronavirus too. So why is it so much nastier than the sCoVs? One major reason: it's new. More precisely, it's new to us. One of the first observations people made about COVID was that it was frighteningly lethal in the elderly, but by and large, children were not getting too sick. Some people were surprised. Conventional wisdom was that influenza hit children and the elderly hardest, while sparing younger adults. Why was SARS-CoV-2 different?

But we need to look a little more closely, because it's hard to reach adulthood without having had the flu. Look at virus severity not by age but by age of first infection, and a pattern emerges: see something for the first time as a kid, and you'll most likely be okay (but only most likely). See it for the first time as an adult, and it can be nasty. The older you get, the worse it becomes to be infected with a virus you've never seen. No wonder most of the orphans came through their inoculations unscathed – many older adults would not have. Just look at these graphs:

https://www.nature.com/articles/s41597-020-00668-y

The above findings show that the severity of most infectious diseases is at its lowest in school-age children. Strikingly, severity was higher among young adults than among school-age children for polio, typhoid, tuberculosis, measles, smallpox, chickenpox, infectious mononucleosis, HIV, influenza, pertussis, Salmonella, yellow fever, typhus, scarlet fever, Ebola, meningococcal meningitis, Japanese encephalitis, cholera, and perhaps Lassa fever. Some infections show a slower rise of severity with age after childhood (brucellosis, plague, coronavirus infections, acute hepatitis B and hepatitis A, St Louis encephalitis, Campylobacter, and Western equine encephalitis), but for most this rise begins well before old age. For diphtheria and Shiga toxin-producing E. coli, severity remains raised in children and adolescents compared to young or middle-aged adults, and for dengue the greatest severity is in school-age children.

A number of factors may explain these observations. The severity of infectious disease depends on the virulence of the infecting organism, the dose or route of infection, and the response of the host. Variation in strain virulence by age is unlikely, and many of the studies reported outbreaks of single strains (e.g. Spanish influenza, Ebola, smallpox, typhoid, cholera). Most infections have a single route or mode of infection, and for HIV the mode of infection does not explain the age pattern.

Adolescents and adults may be exposed to higher doses of infectious agents than are younger children, through caring responsibilities or because they eat more. Greater exposure often increases the risk of acquiring an infection, but the relationship with disease severity is less consistent. An association between infecting dose and severity of disease has been suggested for measles and perhaps chickenpox, and for Salmonella food-poisoning but not typhoid. Fatal cases of SARS and MERS had a slightly shorter incubation period, consistent with a higher infectious dose, but this was not found for influenza, and an association between severity and incubation period could be explained by host susceptibility. For Ebola, infecting dose (as estimated by degree of exposure to people with Ebola virus disease and their bodily fluids) is strongly associated with the risk of disease, but did not influence the case fatality rate. Infectious dose cannot explain the continued rise in severity throughout adulthood.

The J-shaped pattern of severity, high in infancy, dropping in early childhood to a minimum around age 10 years, and rising in adolescence and through adulthood, with a steep increment in old age, mirrors that seen for all-cause mortality across populations. It is also seen for many of the more frequent causes of death including respiratory disease, diarrhoeal disease, and tuberculosis, consistent with a changing age-specific host response.

Co-morbidities tend to increase with age but are generally low in young adults. Interestingly, in a large study of trauma in the US, the case fatality rate, adjusted for severity of injury, only rose from age 55 years onwards, suggesting that the change in resilience to infection at younger ages is immunological, not dependent only on co-morbidities or physiological changes.

Further support comes from the immunological response to vaccine challenge by age. In clinical trials of human papilloma virus vaccine, the antibody levels to 9 different component vaccine sub-types measured 7 months post-vaccination decreased with each year increase in age from 9-26 years, with the steepest decline between ages 10 and 16 years. A similar profile was seen in responses to Hepatitis B vaccine in China, with antibody titres in subjects vaccinated at ages 15-24 years half of those vaccinated at 5-14 years; and in bactericidal antibody responses to Meningococcal group A conjugate vaccine in West Africa, with lower titres in those aged 18-29 years at vaccination than in those aged 2–10 or 11-17 years. However, this is not universal: responses by age to the Meningococcal B vaccine varied by vaccine strain.

Humoral responses to natural infection also indicate that immune function may be optimised in children and adolescents. Antibody titres against persistent herpesviruses such as cytomegalovirus and varicella zoster increase in response to recurrent episodes of viral reactivation so higher titres indicate less effective immune control, or re-infection. In the Gambia, where most people are infected with cytomegalovirus in infancy, cytomegalovirus-specific antibody levels show a ‘J-shaped’ pattern with age, with the lowest levels (suggesting optimal control) in adolescents, rising after age 20 years. Similarly, Varicella zoster is ubiquitous where there is no vaccination, and studies in different settings suggest that antibody levels in seropositive individuals are at their lowest (suggesting optimal control) at 8-9 or 15-19 years.

The immune system changes in infancy and old age are well described. After birth the weak innate immune system of neonates strengthens rapidly, and the adaptive immune system develops an expanded repertoire of memory B and T cells triggered by infections (and vaccinations), the microbiome, and allergens in food and the environment. With ageing, there is a decline in naïve T-cell numbers and T-cell receptor diversity, cytokine production by CD4 and CD8 T-cells is impaired, the CD4 to CD8 ratio is inverted, and the cells of the innate immune response function less well. It is likely that the majority of the data presented here reflect the response to initial pathogen exposure, as those who are already fully protected by acquired immunity will not be included as cases, and those who are partially protected may have mild disease and not be diagnosed. The primary immune response may be particularly affected by ageing. The very high case fatality rates in elderly people observed in response to novel pathogens such as COVID-19, MERS, SARS, St Louis encephalitis, and Ebola, may result from restriction of the naïve T-cell repertoire.

Studies of immune senescence have focussed largely on those over 60 years, but our data indicate that important changes in immune function are apparent at a much younger age. This could be explained by relative ‘senescence’ of the immune system starting in young adulthood or even earlier. Peak immune function in childhood and adolescence, at around age 5-14 years, might represent the intersection of improved function due to maturity and increasing antigenic exposure, and decreasing function due to early senescence. This is supported by assessment of immune parameters, some of which change almost linearly over the lifespan, and by changes in the thymic cortex and medulla which peak in size at age 4-11 years, or earlier.

The severity of infectious disease depends not only on the control of the infectious agent but also on pathological damage arising from the immune response. Dengue is an example where strong immune responses in childhood are associated with increased clinical complications due to antibody-dependent enhancement. It has been suggested that the balance between an insufficient immune response and an over-active immunopathological response is generally more favourable in children. However it seems surprising that a similar balance would be required for the wide range of viral and bacterial infections with similar age-severity patterns presented here, including infections in which protection is mainly antibody-mediated (e.g. smallpox) and those in which it is predominantly cell-mediated (e.g. tuberculosis).

The increase in infectious disease severity during adolescence suggests that puberty could have a role. Sex hormones influence immune responses but in different ways and with inconsistent findings between studies. Overall, testosterone tends to suppress and oestrogens to enhance the immune response. These associations may contribute to the generally higher mortality rate from infectious disease in males and higher incidence of autoimmune disease in females. Numerous differences in immune marker concentrations and response to infection have been described between males and females, and these increase after puberty. However, where data are available, the increase in severity of disease in young adults compared to children and adolescents was seen in both males and females (e.g. Ebola, tuberculosis, Spanish influenza, polio). The decreased immune response to human papilloma virus vaccine with age, and the profile of cytomegalovirus-specific immunity with age, are also similar in males and females.

Environmental exposures and heterologous infections may modulate the magnitude and quality of the immune response at different ages. For tuberculosis it has been argued that not only HIV but some (unknown) sexually and parenterally transmitted virus might trigger progression to disease and explain the age distribution.

Persistent infections, such as herpesviruses, may establish early in life, or increase with age, especially with the changing social contact patterns of adolescence. Cytomegalovirus seropositivity has been associated with immunosenescent changes, including a reduction in the naïve T-cell pool and commensurate increase in CD8 + memory T-cells with a reduced CD4:CD8 ratio, as well as with increased risk of mortality in older age. However, the functional consequence of persistent cytomegalovirus infection appears to vary with age and cytomegalovirus may enhance responses to heterologous infections in younger people. In most countries (and probably everywhere at the time when some of the data they present were collected) almost everyone is infected with cytomegalovirus in early childhood, and continuing exposure and multiple infections are likely. Thus if cytomegalovirus has a role in explaining the age-pattern of severity, it may reflect immunological attrition due to the burden of persistent infection or reinfection rather than simple presence or absence of infection.

The age pattern of severity suggests that natural selection has acted to optimise immune function in childhood and adolescence with a subsequent decline during adulthood. Peak function before reproduction seems counter-intuitive, but according to life history theory there is a trade-off between the energy required for immune response and that required for other functions, such as reproduction and growth. Whilst the energy costs of innate immunity are spread throughout the lifetime the adaptive immune system has high upfront but lower running energy costs. Thymic involution conserves energy when sufficient naïve T-cells have been generated, facilitating the metabolic demands of the adolescent growth spurt and secondary sexual development, but leaving the elderly more vulnerable.

SARS-CoV and SARS-CoV-2 appear to have more extreme variation in severity by age than other infections, with predominantly mild disease in children, and very high case fatality rates in the elderly. Unusually, there is no evidence of higher severity in infants. The reasons for this extreme variation are not yet known, but should be considered in the context of the findings of this review.

Appreciating how responses to infections vary by age outside the extremes of age can have major implications. Understanding the prevalence and transmission of SARS-CoV-2 in children, and how this differs from influenza, is crucial in predicting the role of school closures in the response. Vaccination programmes that increase the average age of infection can paradoxically increase the numbers of people with severe disease. This has led to targeted vaccination of older children for measles and rubella. The higher vaccine response of younger children to human papilloma virus vaccine may lead to a shift in target age groups.

The finding that many infectious diseases are least severe in school-age children suggests that relative immune ageing starts far earlier than has previously been recognised. Improved understanding of the mechanisms underlying this observation may provide novel opportunities for intervention strategies. Children tend to have lower mortality than adults from sepsis, and comparison of their responses is being used in drug discovery; similar approaches could be used for other infections. Comparison of immune response to infections which increase in severity in young adults and those that only increase later may give clues to drug and vaccine design. In the midst of a pandemic and with increasing antimicrobial resistance, new approaches such as these are desperately needed.

Children encounter many viruses to which they have no prior immunity. They compensate with robust innate immune responses that allow them to handle novel infections fairly well. Robust doesn’t equal invincible, though. Without widespread childhood vaccination, infectious diseases kill many children, particularly children under five. A first encounter between the immune system and a virus can end tragically, even for a child. As you age, you get less good at handling novel viruses. And eventually you get less good at handling any virus, novel or familiar, as your immune system ages ("immunosenescence"). The flu, for example, can be very severe in the elderly. But adults, even elderly adults, usually have at least some adaptive immunity to the viruses they face. 

Things can get bad if they don't. Chickenpox is a great example. It's benign in most children, but it's often severe in the unlucky few who make it to adulthood without being infected or vaccinated. When chickenpox vaccines were first rolled out, public health officials worried that the effort could backfire if not enough kids were vaccinated. Suppress the virus, but not fully, and unvaccinated people might avoid the virus until adulthood, only to get a severe case. That’s partly why routine childhood chickenpox vaccination isn’t offered in much of Europe – partial coverage could simply delay infection to adulthood.
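
That age-shift worry can be sketched with the standard textbook approximation that the average age at first infection is roughly A = L / (R0 - 1) for life expectancy L, with vaccine coverage p rescaling R0 to roughly R0(1 - p). The numbers below are assumptions chosen only to illustrate the effect, not chickenpox estimates:

```python
# Textbook approximation (e.g. Anderson & May): mean age at first infection
# A ~= L / (R0 - 1), with life expectancy L. Partial coverage p roughly
# rescales R0 to R0 * (1 - p). Illustrative assumptions only.

def mean_age_at_infection(r_eff: float, life_expectancy: float = 80.0) -> float:
    if r_eff <= 1.0:
        return float("inf")  # transmission fades out; most people escape infection
    return life_expectancy / (r_eff - 1.0)

r0 = 8.0  # assumed R0 for a highly transmissible childhood virus
for coverage in (0.0, 0.4, 0.7):
    r_eff = r0 * (1.0 - coverage)
    print(f"coverage {coverage:.0%}: mean age at first infection ~ "
          f"{mean_age_at_infection(r_eff):.0f} years")
# With these made-up numbers, 40% coverage shifts the average first infection
# from roughly age 11 to roughly age 21, into ages where the disease is nastier.
```

The sketch shows the trade-off in its crudest form: suppress transmission partially and the infections that still occur happen later in life, which for a disease that worsens with age of first infection can mean more severe cases overall.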

Because SARS-CoV-2 is a new virus, by definition it's a virus that every adult avoided until adulthood. Small wonder that so many adults are getting severe cases. Almost everyone sees nuisance common-cold coronaviruses like the seasonal coronavirus OC43 by the time they turn 10:

https://www.nature.com/articles/s41467-020-18450-4

But see OC43 for the first time at age 40, and you might have a pretty rough time of it. Indeed, OC43 can cause high-attack-rate, often lethal outbreaks in nursing homes (where residents have prior immunity but have undergone immunosenescence, leaving their overall immune response less robust):

https://www.ncbi.nlm.nih.gov/labs/pmc/articles/PMC2095096/

Some scientists believe OC43 was the virus behind the 1889-90 global pandemic, which is estimated to have killed around 1 million people:

https://www.ncbi.nlm.nih.gov/labs/pmc/articles/PMC7252012/

In the above article on OC43, the author writes: "If OC43 was the culprit in the 1889/90 pandemic, it has clearly lost its sting in the past 130 years". Has it? Or do we (almost) all now see it in childhood?

The "almost" may be important. I often wonder about the strong similarity between myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS), a rare but severe chronic health condition, and many cases of long COVID. ME/CFS is more common in adults than in children; it often takes hold in adults after a viral infection. What if it is a rare but dangerous consequence of first seeing in your 30s a virus most people first saw in childhood? Evade OC43 or another common virus as a kid, and it could give you post-viral sequelae when it finally hits you in adulthood. 

And so while we don't yet have hard data on the efficacy of the vaccines in preventing long COVID if they fail to prevent infection, the severity-is-novelty principle makes me hopeful. The virus might get you sick, but it won't be new to you. That could matter a lot. These are not new, grand insights. People have made these points throughout the pandemic. But they are easy to forget. And when we forget them, we make mistakes in combating the virus, lethal ones.

People suffer many flu and coronavirus infections over the course of a lifetime, thanks in part to antibody-evading viral evolution ("antigenic evolution"). But until we immunosenesce, these reinfections rarely cause severe or critical illness, even when the virus has evolved partial immune escape. Rarely is not never. A small number of unlucky individuals fail to mount a meaningful, protective adaptive immune response to infection or vaccination. Immune-compromised or immune-suppressed individuals are particularly at risk of failing to develop this adaptive immune protection, as are the immunosenescent. That is where population immunity comes in: it's up to the rest of us to protect them. We do this by protecting ourselves, thus limiting their exposure (from us).

These unfortunate unprotected individuals account for some fraction of documented reinfections and "breakthrough" infections in the vaccinated. But as the virus evolves or our immunity wanes, we expect to see more successfully protected individuals get infected too, and we also expect that they will retain protection against severe disease. We can already see hints of this effect: a study in the UK found that vaccines reduce the risk of hospitalisation and death in elderly individuals with documented infections; they protect the castle even when the wall is breached:

https://www.bmj.com/content/373/bmj.n1088
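For readers wondering how such figures are arrived at, the sketch below shows the simplest version of the calculation: effectiveness against hospitalisation among those already infected is one minus the risk ratio of hospitalisation in vaccinated versus unvaccinated people. The cohort sizes and event counts are invented for illustration and are not taken from the BMJ study above.

```python
# Illustrative only: effectiveness against hospitalisation given infection,
# computed as 1 minus the risk ratio. The counts below are made up.

def effectiveness(events_vax, n_vax, events_unvax, n_unvax):
    """Return (1 - risk ratio) as a percentage."""
    risk_vax = events_vax / n_vax
    risk_unvax = events_unvax / n_unvax
    return 100.0 * (1.0 - risk_vax / risk_unvax)

hosp_vax, infected_vax = 30, 2_000        # hypothetical vaccinated people with documented infection
hosp_unvax, infected_unvax = 120, 2_000   # hypothetical unvaccinated people with documented infection

print(f"Effectiveness against hospitalisation given infection: "
      f"{effectiveness(hosp_vax, infected_vax, hosp_unvax, infected_unvax):.0f}%")
# With these invented counts the answer is 75%: even when the wall is breached
# (infection occurs), the vaccinated are far less likely to end up in hospital.
```

Real analyses, including the BMJ study, adjust for age, timing and other confounders, so published estimates are not computed quite this crudely, but the underlying idea is the same.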

Conclusions: where are we now?

Immune escape is rarely rapid and complete; it's more often gradual. We don't see viruses abruptly change so much that everyone with protective immunity loses all their protection and the virus regains the full severity associated with full novelty. If you want to stay out of the hospital, giving your immune system a preview of the virus is valuable, even if that preview isn't perfectly accurate. Of course, one should never say never. Suppose a near-complete escape variant suddenly emerged: a SARS-CoV-2 virus sufficiently different (sufficiently novel) to cause severe disease in many previously exposed or vaccinated people. Even in that unlikely nightmare scenario, we would not be back to square one. We now have easily updatable vaccine platforms designed for SARS-CoV-2, and we would rush out an updated vaccine matching the new variant. It’s different to smallpox though – we only need the vaccine to prevent severe illness, not infection. The seasonal coronaviruses evolve over time and partially evade our prior antibody immunity:

https://www.theinsight.org/p/vaccine-efficacy-statistical-power

So do seasonal influenza viruses:

https://www.nature.com/articles/nrmicro.2017.118

We don’t need them to stop doing so. This is why talking about a "vaccine-resistant virus" is misleading. When a virus evolves to evade our immune memory, whether induced by infection or vaccination, that doesn't render the virus impossible to vaccinate against; it means that our immunity needs an update. Exposure to the new version of the virus, whether via infection or, ideally, via an updated vaccine, provides that update. The adaptive immune system adapts. Remembering that novelty means severity helps us see that the vaccines provide cause for hope, even if SARS-CoV-2 manages to stay with us for years. SARS-CoV-2 might stick around; the COVID-19 pandemic will struggle to do so. But it also makes clear that those of us in wealthy countries have two choices for how the global pandemic ends: via natural infection or via vaccination.

Will the new normal mean that the disease poses less of a risk? Or will people ignore COVID even as it continues to kill hundreds of thousands of people every year? There is some real reason to anticipate the former, more hopeful, scenario. Viruses are most dangerous when they are introduced into a population that has never had contact with them before. The more "immunologically naive" people are, the more of them are likely to suffer bad outcomes. This suggests that the next few months could provide significant protection against future strains of the virus: once a large portion of the population has been exposed to Omicron, humanity will be a lot less immunologically naive, which might help us handle future strains of the coronavirus without a significant increase in mortality.

This isn’t a foregone conclusion, however. Omicron could turn out to afford those it infects only very brief or very weak immunity against other strains. If we’re unlucky, some future strain could turn out to be (at least) as infectious as Omicron and (at least) as deadly as Delta. Clearly, the severity of future strains is of huge moral significance. And equally clearly, what we should do in response to future waves of the virus depends, at least in part, on the nature of the threat we will face. A model response would also take into account the after-effects of COVID, which seem to last a long time in many patients, including some who initially had mild symptoms. And yet, my guess as to what we will do no longer turns on these matters. We now seem poised to respond to future waves with a collective sigh and a shrug.

The truth of the matter is that virtually all humans have, for virtually all of recorded history, faced daily risks of disease or violent death that are far greater than those that the residents of developed countries currently face. And despite the genuine horrors of the past 24 months, that holds true even now. Is our drive to live life and socialize in the face of such dangers foolhardy? Or is it inspiring? I don’t know. But good or bad, it is unlikely to change. The determination to get on with our lives is deeply and perhaps unchangeably human.

In that sense, the spring of 2020 will be remembered as one of the most extraordinary periods in history, a time when people completely withdrew from social life to slow the spread of a dangerous pathogen. But what was possible for a few months has turned out to be unsustainable for years, let alone decades. Whatever damage Omicron might wreak in the immediate future, we will, most likely, soon lead lives that look a lot more like they did in the spring of 2019 than in the spring of 2020. We can prevent COVID-19, but not SARS-CoV-2, which is fine.

 


Justin Stebbing
Managing Director
