We all know someone who is good at everything they try. They instinctively find their groove, never seem to over-reach and succeed again and again where others fail. It makes you a little crazy, but how can you not love their talent and courage? Aisha Tyler has pulled off “good at everything” in the toughest of all spaces—the entertainment business. She has been brilliant as a standup comic, as an Emmy-winning host of The Talk and ringleader of Whose Line Is It Anyway?, as a star of the beloved animated series Archer, as an unforgettable character on Friends and as a director on The Walking Dead franchise. The next challenge? Her own Brooklyn-based cocktail brand. The challenge for Gerry Strauss was a bit more daunting: slowing Aisha down long enough to cover the vast and ever-expanding landscape of her remarkable career. Is it any surprise that these pages could not contain her?
EDGE: You’ve taken a lot of chances and explored several different directions in your career. This being our “Best Case Scenario” issue, I have to ask: Is there anything that you can think of that you were advised not to do, but did anyway?
AT: I don’t know that I was ever told not to do anything, but I did get a lot of discouraging feedback when I started doing stand-up. I got a lot of This is never going to work. You’re never going to make it anywhere. I’m a grownup and my comedy was always grownup comedy, so my show was always kind of edgy. I had this club owner tell me that I should stop cursing, that my material was too dirty. Then the next guy after me was talking about a little person in the most vulgar fashion and I remember telling this guy he’s a hypocrite. He said, “You’re just a little girl trying to run with the big boys and it’s not going to work.” Later I saw him and he came up to me and he was like, “I always knew you were going—” and I was like, “No, no, you didn’t. No. No, you don’t get to revise history now, buddy. You said I was going to fail.” I’m not a petty person and I don’t hold grudges, but every kid just thinks, Someday I’m going to show them! So I did really enjoy cutting him off.
EDGE: Was standup always a dream of yours?
AT: I wish it was that intentional. As a kid, watching Eddie Murphy’s Raw or Richard Pryor’s Live on the Sunset Strip, I felt like those were magical people that had fallen out of the sky. There was no direct bright line between that and like, Oh, I could do that for a living. I had discovered standup in college but it wasn’t until I got out of school that I realized that comedy could be a vocation. My minor was Environmental Studies and I wanted to work in environmental policy, so I got a job at a conservation group called the Trust for Public Land. I was doing marketing and PR for them. I think the idea was just to take a year or two to work and then go to law school and become an environmental lawyer. In the interim, I realized that working in an office was just highly problematic for me. So I started doing standup, which was obviously the hardest of hard right turns. I was watching a lot of standup on TV and thinking there were some really mediocre comics out there…and I could be mediocre, too. So I tried it once to see if I liked it. I think any standup will tell you that that first set is typically pretty electrifying. Either you fall in love with it right away, or it’s not for you. But just doing it once—and doing it poorly, I might add—I was like, Oh…this is the gig for me!
EDGE: At that point, did success in comedy seem like it would be enough?
AT: Oh, yeah. And it’s not that I’m not an ambitious person. But I found out I could do standup and be able to pay my bills and I thought that might be enough. I was really just focused on trying to be the best standup comic I could be. It was a very mid-level “best-case scenario”—an eh-case scenario, I think [laughs]. It all turned out way better than I envisioned.
EDGE: As you branched out, did you prefer to play fictional characters or find projects, like hosting, where you could be you?
AT: Oh, that’s a good question. Being myself, that’s an easy job, right? For some people, hosting and doing standup live on stage is really discomforting. However, I found hosting is a very easy thing for me to do and I quite enjoy it. But I probably preferred acting because I typically like to lean into the stuff that I find most challenging and most difficult. The host stuff was cake, you know what I mean? Even when I started doing comedy and then got into drama, I really gravitated toward doing drama. I wanted to do something where I knew I was going to have to stretch myself, because I typically prefer the thing at which I’m least proficient.
EDGE: Is that true of directing? I’m thinking of your doing Fear the Walking Dead.
AT: I think so. I find directing to be the most challenging and, on some days, the most frightening—so definitely the most interesting. To go into a space where you’re good at what you do, but you know you have a lot to learn, you know you’re going to be growing and you’re going to constantly be expanding your skillset and your experiences long-term. So yes, it’s definitely true of directing.
EDGE: You won a Daytime Emmy during your time on CBS’s The Talk. What were the biggest challenges of occupying that chair every day?
AT: I don’t mean to be glib, but right from the beginning it was a very easy show to do. As a standup, I was accustomed to speaking extemporaneously and off-the-cuff and being myself. It was a network show and it was for daytime, so sometimes we had to be circumspect and kind of shave off the edges. But as the show became more popular, we were able to speak more freely. I think that’s why it did well when I was there. There was a nice frankness, a kind of emotional openness to the show. It was driven by personal experience rather than politics. I’m a pretty private person, so I did struggle to figure out what about my personal life I wanted to share and what I wanted to keep private. That was always a challenge—to want to be present and forthcoming, to be supportive of the other women, but also wanting to keep some of your life’s details to yourself. I think that’s a normal human inclination, that not everything has to be out there in the open.
EDGE: In regard to your acting résumé, I think your stint as Charlie, Ross Geller’s love interest, is something that will live forever as part of the Friends legacy. Were you nervous inserting yourself into the Ross-and-Rachel dynamic?
AT: No, and I’m sure it was because there was no social media then. I really just didn’t want to suck. I was just trying not to be bad at my job. I will say that there might’ve been a general backlash against anybody who came between Ross and Rachel, but I never had anything but positive feedback, to this day. Charlie Wheeler ended up being a fan favorite. I get 10-year-olds that watch the show now that love her. I think the way that her relationship with Joey and Ross was framed and how it happened, it was all very playful. So yeah, that paleontologist lady, people tended to like her quite a bit. But I was nervous for sure, because it was the best and most popular show on television at the time. It was the peak expression of that kind of comedy and, I think, has held up as a pillar of four-camera comedy.
EDGE: There’s a story out there that you got into acting because of Sam Rockwell—
AT: It is entirely true. We went to the same high school and I thought he was super cute. He went into an improv class and I followed him in there and stayed in there. I mean, not like a stalker [laughs]. I went and hung out with him in improv class and out of school. Luckily, we’re still very close friends to this day, so it all went well.
EDGE: So what’s something about Whose Line Is It Anyway? that most people wouldn’t know?
AT: I’ve said a million times that the guys absolutely don’t know what they’re going to be improvising about until I tell them. There are no cheat sheets, nothing handed out in advance. So what you wouldn’t know—unless you are in the studio for the taping—is that lots of things don’t go well. They flub a lot. They make mistakes. There’s a lot of stuff that’s not safe for television—a lot of cursing—but it’s always a really playful night. It’s all hilarious. Sometimes it’s perfect and sometimes it’s a mess. But they’re so good at what they do that the whole night is really joyful. Another thing people don’t know is that, in my first season on the show, the sound guy kept complaining because I was laughing too loud. He was like, “Aisha, you need to not enjoy this as much.” I was like, That’s an impossibility.
EDGE: Tell me something that you haven’t done but would like to try.
AT: Two years ago a friend of mine gave me a skydive as a Christmas present. They did that knowing that I was going to be really angry about it, because I do not want to jump out of a plane—but the fact that they challenged me to jump out of a plane means now, of course, I have to show them and jump out of a plane [laughs]. I’m terrified. I’m legitimately terrified to do it. So now there’s just this battle between my ego and my terrified inner child over whether I’m going to ever jump out of a plane. This person knows that I have a little bit of a soft spot and they can just goad me and shame me into doing it, because I won’t be like, You can’t tell me what I’m not going to do. So the skydive is looming on the horizon and giving me palpitations, but we’ll see.
EDGE: Don’t forget a GoPro camera. This can be your next film project.
AT: Or a diaper—I’m not worried about the camera—a large adult diaper. EDGE
If you haven’t eaten rice today, have you really eaten at all?
As a culinary historian and historical interpreter, I am never happier than when I’ve got in front of me a solid dish of red rice. This simple, hearty dish—one of the many direct contributions of West Africa to the Southern table—isn’t only an edible link to my genes, my DNA, my blood and bone. It’s a way to my heart. Indeed, rice has played a pivotal role in shaping my identity. My favorite rice dish growing up was my Alabama-born grandmother’s red rice (often misnamed “Spanish rice”)—a tasty, tomato-rich rice pilau with bell peppers, onions and spices. Little did I know that, if you followed that one dish back through all of the mamas and grandmas that came before her, you would go overland from Alabama to South Carolina and then across the Atlantic.
My grandmother’s great-grandmother was born in Charleston, the center of red rice country, and her great-grandmother’s grandmother was born in Sierra Leone, among the Mende people. To this day, one of the staple dishes of Sierra Leone is jollof rice, the West African antecedent of red rice. Prepared in different ways up and down the Atlantic world rice belt, today’s versions of red rice essentially maintain the same orange-red glow, as well as a taste that is pleasantly warm and pairs well with just about any leafy green or protein.
There’s an apocryphal story that rice entered the South through Charleston in 1685. A ship blown off its course from Madagascar to England landed unexpectedly in Charleston, where aid was provided to the crew. The grateful captain repaid the colonial British governor with seed grains from rice, which from then on could be grown in Carolina and used to enrich the colony for all time. Though rice was most likely already here when the ship from Madagascar arrived, this story of rice’s entrance into the South highlights how significant it was for the region. In the antebellum South, if cotton was the king of commodities, then rice was the queen. And the queen brought incomparable economic power. Charleston, and later Savannah, were thriving cosmopolitan trading ports, with fabulous wealth guaranteed by the cultivation of cash crops, which relied on the knowledge and labor of enslaved West and Central Africans.
West Africans from Senegal to Liberia, the western half of Côte d’Ivoire (Ivory Coast), and deep into the interior along the Niger and other rivers, had grown rice for almost four millennia by the time the transatlantic slave trade picked up in earnest. With the spread of Islam and the settlement of the western African coastline by the Portuguese, the indigenous red rice known as Oryza glaberrima and several other wild and cultivated species were joined by Oryza sativa, or “Asian rice.” On the island of Madagascar, some of my other ancestors were growing the latter, their ancestors having brought seed from Indonesia in outrigger canoes. As African and Asian cultures mixed, rice became both a staple and the central feature of Madagascan economic life. In West Africa, too, my forebears knew this reality, with women taking a primary role in the processing of the crop.
It is no accident that my great-grandmothers passed their knowledge of rice culture from generation to generation. In the 1700s, planters from North Carolina to Florida imported thousands of enslaved human beings—many of them women—to properly grow, husk and process rice. They were already rice-production experts. On the other side of the South, along the Gulf coast and up the Mississippi River Valley, the French sent Africans with rice that had originated in Benin and Senegal. Other Africans arrived in the Americas with similar knowledge, having grown Asian rice to supply slave ships sent to the Americas.
Jollof rice, the famous West African dish, is named after the Wolof people of Senegal and Gambia, who themselves call it benachin. Maggi, a bouillon cube ubiquitous in West Africa, has become part of the flavor profile of everything there. If you have access to an international market, it will have Maggi cubes; you can use them to make a Maggi broth to replace the stock in this recipe—just follow the instructions on the package. Be careful…it tends to be salty, so go lightly at first to find your bearings. Makes four servings.
*Kitchen Pepper is an old-school spice mix popular in early American cooking. It contains black pepper, nutmeg, ground allspice, ground cinnamon, ground ginger, ground mace, ground white pepper and red pepper flakes.
Heat oil in a medium saucepan with a tight-fitting lid over medium-high heat. Add the onion and garlic and sauté for 4 to 5 minutes, until soft. Add the tomato paste, turn the heat down to medium-low and cook for about 3 minutes, stirring constantly. Stir in the rice, chili pepper, black pepper and seasoned salt. Cook for 2 to 3 minutes, stirring constantly to prevent the rice from sticking to the bottom of the pan. Add the stock, cover, turn the heat down to low and simmer for about 20 minutes, until the liquid is nearly but not completely absorbed. Remove the lid, place a piece of aluminum foil over the pan, return the lid to the pan over the foil and steam for another 20 minutes.
We can see the importance of rice in African American folklore, which carried over rice’s unique mythology from Africa. Supposedly carried in seed form in the braided hair of African grandmothers, rice offered the enslaved a hidden and sacred link to ancestors and their deities. Among my Mende ancestors, for instance, rice mixed with palm oil fed the ancestors at their graves. For many other groups, too, African rice was a revered food, not just dinner.
They say in Sierra Leone that, if you have not eaten rice that day, then you haven’t really eaten at all. I appreciate that sentiment, as fare like pilau (which in some places is called perloo)—a simple southern chicken-and-rice dish—or a rice crepe stuffed with green onions, Vietnamese herbs and fresh seafood triggers some of my most Pavlovian moments.
But even more important, rice connects me to every other person, southern and global, who is nourished by rice’s traditions and customs.
Michael W. Twitty is a culinary historian and author of the James Beard Award-winning book The Cooking Gene: A Journey Through African American Culinary History in the Old South. Twitty’s new book, Rice, features 51 recipes ranging from Southern classics to international dishes. It explores the culinary history and African diasporic identity of rice. This story is excerpted from RICE: a SAVOR THE SOUTH® cookbook by Michael W. Twitty. ©2021 by the University of North Carolina Press. Used by permission of the publisher. For more information or to order visit uncpress.org.
This tomato pilau is one of the greatest dishes ever to emerge from the Low Country and can be adjusted depending on your tastes. The recipe was inspired by the erudite Damon Lee Fowler—culinary historian and cookbook author from Savannah and a keeper of old Southern culinary traditions—who published it in The Savannah Cookbook in 2008; it is included with his permission. Makes 4 to 6 servings.
Salt, ground cayenne pepper and whole black pepper in a peppermill or Kitchen Pepper*
Put the bacon or salt pork in a Dutch oven and turn the heat to medium. Fry, uncovered, until the fat is rendered and the bacon is crisp. Spoon off all but 2 teaspoons of fat. Add the onion and bell pepper and sauté until translucent, about 5 minutes. Add the rice and stir until it’s well coated and warmed, about 3 or 4 minutes. Add the tomatoes with their juice, stock, Worcestershire sauce, salt, cayenne and a liberal grinding of pepper to taste. Bring to a boil and stir, scraping any loose grains that are sticking to the pan. Loosely cover, reduce the heat as low as possible and let simmer for 25 minutes. Remove it from the heat and allow to steam for 15 minutes before serving.
Occasionally on the news you hear journalists call vaccine research a “cat-and-mouse” game. That’s only partially true. It misses the point that, if the cat doesn’t get its claws into the mouse, sometimes the mouse turns around and kills the cat. This game has been going on for most of recorded history and, as we have become better-educated about the nature of viruses and vaccines over the last 18 months, it’s safe to say that we have gained a greater respect for both the cat and the mouse.
Thankfully, in 2021, the cat seems to have gained the upper hand. Because, for most of human history, let’s face it: The mouse was winning. Until relatively recently, in fact, unchecked viruses, diseases, infections and a fundamental misunderstanding of how the body works doomed most people to an early grave.
Which explains why the idea that human beings might somehow create resistance or immunity to serious illness stretches back centuries—and how we arrived at the current state of vaccine technology. Long before scientists understood how epidemics spread, or even what they were, many cultures in Europe, Asia and Africa subscribed to the belief that exposure to a small amount of virus could boost immune response and prevent large-scale deaths. As the vaccines developed in late 2020 and early 2021 helped us turn the corner on COVID-19, it is worth a look back at how we got to where we are today.
The initial breakthrough dates back more than 1,000 years to China, where we find the first mention of variolation experiments. Variolation takes its name from the scientific name for smallpox, variola. Smallpox killed an estimated one-third of the individuals who contracted the disease, and left its survivors hideously scarred and often sterile or blind. A ship entering an ancient harbor with reports of smallpox aboard was often quarantined until the disease had run its course. The root of the word quarantine, with which we are now all too familiar, is quarantena. It originated with the policy in Venice during the bubonic plague of the 1300s and 1400s, when all ships arriving at the Italian port were compelled to anchor for 40 days before sailors could come ashore. For the record, the “quarantine” imposed on travelers during the coronavirus pandemic isn’t a quarantine at all; technically it is medical isolation.
Variolation involved various methods of introducing a small amount of biological material taken from an infected patient into an uninfected person. One method was to pierce a smallpox pustule with a needle, then scrape the needle across the skin of a healthy individual. Another was the collection of scabs from a smallpox victim, which were then ground up and rubbed into an incision in the skin—or blown up the healthy person’s nose. Sometimes a needle and thread that had been pulled through a smallpox pustule were pulled through a small scratch in the uninfected patient’s skin. The goal was to induce a very mild version of the infection, which (fingers crossed) would subside in a few weeks and create a strong resistance to smallpox.
England, which considered itself the world leader in medical expertise (it wasn’t), was not an early adopter of variolation. However, enough prominent physicians had used it effectively in the 1600s and early 1700s to finally convince the government to approve its use against smallpox in the early 1720s, with the full support of the Royal Family. Remember, at this point scientists had almost no understanding of how viruses and bacteria caused disease. Predictably, there was public outcry against variolation—the precursor to today’s anti-vaxxers—not just in Great Britain, but in the American colonies, as well. To many, giving someone a dreaded disease they didn’t have seemed crazy and dangerous. In Boston, clergyman Cotton Mather was the target of a bomb for his advocacy of variolation. The explosive hurled through his window contained a note that read You dog, dam [sic] you; I’ll inoculate you with this; with a pox to you.
Mather was a controversial public figure who had his hands in just about everything in the Massachusetts colony and was nothing if not a paradox. For example, he was devoted to importing Newtonian science to the American wilderness on the one hand while, on the other, he was the guy who set up the Salem Witch Trials. Mather learned of variolation not from Newton or other British scientists but from a Libyan slave he received in 1703 as a gift, whom he rechristened Onesimus (his real name is lost to history). Curious about the scars on the man’s body, Mather listened as Onesimus explained how North African cultures dealt with smallpox. He then spearheaded a variolation campaign during an outbreak in Boston that yielded spectacular results. And took all the credit, of course.
Within a generation, our Founding Fathers had all hopped on the variolation stagecoach. Benjamin Franklin convinced English physician William Heberden to produce a pamphlet touting the success of smallpox inoculation (a word borrowed from horticulture, originally related to the grafting of plants) and distributed it free of charge throughout the 13 colonies. Two decades earlier, Franklin had lost a son at age four to smallpox. The pamphlet included do-it-yourself instructions for home inoculations—a forerunner of YouTube videos for DIYers. George Washington contracted smallpox as a young man, which some have theorized was responsible for his inability to produce offspring with Martha. True or not, it did make him immune to the smallpox outbreaks that would ravage military camps during the American Revolution. One of Washington’s lesser-known edicts as commander of the Continental Army was that his soldiers had to be variolated. During Thomas Jefferson’s presidency, he personally conducted inoculation experiments on his slaves at Monticello, demonstrating both his enlightened approach to science and his unenlightened regard for human freedom and dignity.
There were worse jobs for young ladies in 18th century Europe than being a milkmaid, but like most occupations it came with its own set of risks. Repeated contact with a cow’s udder, especially by someone with broken or abraded skin common to farm work, was an invitation to contract cowpox, which produced pustules on the hands and forearms. The disease was considered an occupational hazard—rarely serious and, once a girl got it, she didn’t get it again. And, farmers began to notice, those girls didn’t contract smallpox, either—even after close contact with workers or family members who did. In 1774, smallpox began tearing through the county of Dorset in the south of England. A farmer named Benjamin Jesty used fluid from cowpox lesions to successfully inoculate his wife and children against smallpox. Over the next 20 years, similar stories caught the attention of scientists. Another piece of the puzzle was that British cavalry officers tended to dodge smallpox outbreaks, presumably from exposure to horsepox, a close relative of cowpox.
In 1796, English scientist Edward Jenner made the big breakthrough. He took a sample of pus from the lesion of a cowpox-infected milkmaid and injected it into an 8-year-old boy who had never had smallpox. Six weeks later he injected smallpox into the boy, who was the son of his gardener—which poses some uncomfortable questions about Jenner’s moral compass—but lo and behold, there was no reaction! Jenner called this process vaccination, from vacca, the Latin word for cow. Recent DNA research suggests that the disease Jenner was working with may have been transferred from horses to cows, possibly through farriers, whose farm duties often involved shoeing horses and milking cows. In which case you could say that we are all getting equivaccinations this year.
Vaccinations caught on quickly in the United States and, in 1813, President James Madison signed a bill creating the National Vaccine Agency. Part of that bill waived postage fees for vaccine material. England went one better, making vaccination of infants mandatory; if parents refused, they faced possible imprisonment. In 1863, little more than halfway through Abraham Lincoln’s first term as President, he contracted smallpox and was desperately ill for a month before recovering. His valet, William Henry Johnson, was not so lucky. He caught smallpox from Lincoln and was dead by the end of January. He had worked with Lincoln going back to his Springfield days. Some historians posit that the president contracted the disease around the time he delivered his Gettysburg Address, which in retrospect had all the earmarks of a textbook super-spreader event.
The next breakthrough in vaccines was discovered, somewhat accidentally, by Louis Pasteur in 1879. The French biologist was the first person to create a vaccine in a lab. An oversight by an assistant during an experiment with chicken cholera demonstrated that exposure to oxygen makes bacteria less deadly. This led the way to several more discoveries, including the rabies vaccine in 1885. By the turn of the century, researchers in the U.S. and Europe were beginning to get the upper hand on typhoid and cholera and, by World War I, the first vaccines for these diseases were becoming available.
There were setbacks, of course, including a terrible story here in New Jersey, where nine children died from tainted smallpox vaccines. That incident led to the Biologics Control Act, which was passed in 1902. However, then as now, the increasing involvement of the government in public health was not always appreciated. In England and America, there was small but vocal opposition to vaccinations, mostly citing an infringement on personal freedom. Massachusetts was the first state to make smallpox vaccinations mandatory and the resulting lawsuit went all the way to the Supreme Court. In 1905, the court upheld the constitutionality of the program in Jacobson v. Massachusetts. In 1909, Americans learned the story of Mary Mallon, the woman dubbed “Typhoid Mary” by the press. She was what we now call an asymptomatic spreader of the disease, who worked as a cook for several wealthy families and, later, for hospitals, hotels and restaurants. Everywhere she worked (often gaining employment under a fake name) a typhoid outbreak soon followed. Mallon finally had to be quarantined on North Brother Island, near Rikers Island in New York’s East River.
The need for further progress on the vaccine front was made particularly clear in the waning days of the First World War, when the Spanish Flu—which probably originated in a U.S. Army camp in Kansas—spread across the planet and killed tens of millions of people. It was a particularly powerful version of the H1N1 influenza virus; it took a particularly high toll on young, healthy adults, who normally weathered the flu without incident. It spread fast, killed quickly and, if a patient’s lungs filled with fluid, there was nothing doctors could do. If you got it, you went to bed and waited:
On the last day of that year, 1918, I went up to the shop with a bad cold. Influenza had been raging in [Norwood, MA], the hospital was crowded, and there were deaths almost daily. About two hours after going to work, I collapsed. It had got me. I was taken home, went to bed and a doctor was summoned. I was past 70, but he was the first doctor since my birth who had ever been called to treat me. He found me in great pain, temperature 104, but said my lungs were not affected. For several days as I lay in bed, I did a deal of thinking. I thought it was probable that I was near the end of the race, but I had no dread. I was comforted that I had been able to keep up the payments on my life insurance, and that my wife would have the use of it in her old age.
This patient, my great-grandfather George Stewart, survived to write this account before he died, in 1925. He was never the same, however. George was unable to walk more than a short distance or do anything that required strength, including ascending the flight of stairs leading to the second floor of his home. In his defense, I know those stairs. They are weirdly steep and are available if a movie company is interested in remaking the Hitchcock classic Vertigo. The point here is that the world became fixated on understanding how the influenza virus mutates and spreads, the long-term impact for survivors, and what could be done to give humans a fighting chance. A century has passed and we are still addressing these challenges.
The first important step in creating an effective, mass-produced flu vaccine was to develop a process for growing (or culturing) viruses in large quantities. Scientists already knew how to culture bacteria—give them a comfy medium and they’re off to the races—which was the key to making vaccines for bacterial illnesses. However, viruses don’t reproduce on their own. A virus must first infect a cell before using the cell to make copies of itself, and at the time of the Spanish Flu there was no method for growing viruses outside of a living host. The breakthrough came in the mid-1930s, when two English researchers discovered separately that the flu virus could be grown on the membranes of fertilized hens’ eggs—and soon after isolated the first neutralizing antibodies. One of the U.S. scientists working on the flu vaccine at this time was Dr. Jonas Salk, who used what he’d learned in the 1930s to tackle polio in the 1950s.
The U.S. President during this era was Franklin Roosevelt, who had been stricken with polio as a young man. His inability to walk or stand was the best-kept/worst-kept secret in America. During a 1938 radio broadcast, entertainer Eddie Cantor, whose program had an audience of millions, pitched the idea of sending dimes to the White House to help fund polio research. Listeners sent more than 2 million dimes to Washington, triggering a grassroots effort that involved several more national celebrities and eventually became known as The March of Dimes. A year after FDR passed away, his profile replaced the Winged Liberty head on the silver 10-cent piece, commonly known as the Mercury dime. Reach into your pocket and you’ll notice that this coin still bears his image.
Over the next three decades, building on a wave of discoveries by virologists and epidemiologists, life-saving vaccines came fast and furious, including increasingly effective flu vaccines, Diphtheria-Tetanus for infants, and measles and mumps vaccines. The crowning achievement of the post-war era was the creation of an effective injectable polio vaccine by Dr. Salk. Salk’s work followed several research breakthroughs dating back to the 1930s and was ready for testing in 1952, the same year polio cases in the U.S. surged past 50,000. Testing proceeded over the next two years, culminating in the Francis Field Trial, which involved more than 1.8 million schoolchildren. It was the largest medical experiment in history at that time and the results were extremely positive, despite a one-man anti-vax campaign waged by gossip columnist Walter Winchell. Winchell “uncovered” that several monkeys had died during initial testing and labeled the Salk vaccine a killer, likening it to a phony cancer cure—and adding that even if the vaccine were 99% effective, well, that wasn’t good enough. Winchell was what we would now call an “influencer”…more than 150,000 parents pulled their children out of the study.
Nevertheless, the Salk vaccine was licensed in 1955 and, thanks to an ingenious “rocking bottle” manufacturing method developed by a team led by Canadian biochemist Leone Farrell, went instantly into mass production. An interesting sidenote is that when Salk traveled to Toronto to meet with Farrell’s team, she was barred from the reception because it was held in a men’s-only club. Farrell was an astonishing figure in medicine who is largely forgotten today; she later devised a way to accelerate penicillin production.
Over the last 30 years, vaccines have successfully tackled countless health issues in the U.S. and around the world, including hepatitis, shingles, cervical cancer and the H5N1 avian flu. However, viruses are nimble and clever. When faced with obstacles, they do what we do: evolve and mutate in order to survive. That’s why we get flu shots every year and why sometimes they are only 50% effective. And that’s why scientists are still looking for ways to eradicate tuberculosis, malaria, Lyme disease and hepatitis C. And HIV. And the common cold.
Keep in mind, too, that it takes a billion or more dollars and a decade of research and testing to develop a viable vaccine, and even with government funding, often private industry and investment funds determine what’s “worth” the time, money, effort and risk…and what’s not.
The arrival of COVID-19 on American shores in the winter of 2019–20 and our less-than-stellar response to the virus (including the politicizing of mask-wearing) will leave both a public-health legacy and a cultural one that is certain to be with us for a long time. What we will probably forget pretty quickly is how vaccines were going into arms within a year of the start of the pandemic. That is an insanely short period of time to develop a vaccine for a virus that seemed to have everyone baffled at first.
How did it happen so quickly? The short answer is computers. Once the COVID-19 genome was posted by Chinese researchers, the vaccine could essentially be created on a computer screen and then go right into manufacturing. Also, no one in the vaccine game starts from scratch or goes it alone. The COVID-19 vaccines out there now had a scientific running start, which was further accelerated by Operation Warp Speed. The running start in this case was the multiple safe and effective vaccine platforms created since Edward Jenner started tinkering with cowpox. COVID researchers looked at which ones were most likely to offer the highest level of immunity and produce the least side effects, and then got right into determining which proteins, or viral antigens, would generate an immune response that would protect people from the disease.
As it turned out, the quick-turnaround vaccines used mRNA and vector-based platforms. The vector-based strategy was developed in the fight against SARS. Since the viral enemy is technically SARS-CoV-2, that seems logical. The mRNA platform was used against Zika a few years ago. What both have in common is that they did not exist in the 20th century. They are relatively new but appear to be safe. What is still in question is how long vaccine protection will last—like the flu vaccine, we may need a new one every year—and whether the vaccines will be effective against the variants that will almost certainly develop in the coming months.
So is 2021 the Year of the Cat? Did we catch and kill the mouse? It may be too early to claim total victory. The mouse undoubtedly has a few more tricks up its sleeve. Also, human behavior is nothing if not unpredictable; even when Americans are “fully vaccinated,” there will be outliers who refuse the needle. And finally, just because we are vaccinated, that doesn’t mean they are. And by they I mean a billion or more people in developing countries who are unlikely to be offered, or avail themselves of, a vaccine—or who fall prey to supply-chain problems, as happened in India this past spring.
This is where the mouse loves to play. Hopefully the cat is watching. EDGE
Americans are hitting the road (and skies) in the second half of 2021 like nobody’s business. Here are some of the best cases and travel bags on the market…
What is Bitcoin, how does it work, and why am I not a billionaire?
A boy asked his Bitcoin-investing father for $10. His father said, “Twelve thousand dollars? Son, what do you need two dollars and fifty-seven cents for?” If you don’t get the joke, don’t worry—make it to the end of this article and I promise that you will. To their critics, Bitcoin and other cryptocurrencies—including Ethereum, Dogecoin and many others—are a pipe dream at best, a Ponzi scheme at worst. To their advocates, they’re the wave of the future, the ideal alternatives to precious metals in the rocky economic times and runaway inflation they are convinced lie ahead.
These aren’t just any critics and advocates. The most successful and credentialed economists and financial entrepreneurs on the planet are at complete loggerheads over the subject. So if you’re flummoxed by the idea of cryptocurrency as “digital gold,” don’t beat yourself up over it—you’re in excellent company.
A good first step to understanding cryptocurrency is to break down the word itself. Crypto = encoded. Currency = a proxy for money. That’s clear enough, right? Oh, sure. As Groucho Marx said in Duck Soup, “Clear? Hah! Why a four-year-old child could understand this…run out and find me a four-year-old child, I can’t make head or tail of it.”
Fair warning: I don’t profess to be an expert by any means; in fact, I approached this assignment in the most basic way, by asking: if I had $10,000 lying around in a shoebox, would I invest it in Bitcoin? Please ask yourself the same question, as we embark on a journey into the world of cryptocurrency and its alternatives. What follows may seem painfully obvious to those who make their living in the banking and financial sectors, but for the rest of us knuckle-draggers, it’s important that we devote a few paragraphs to picking this question apart.
What Is Currency, Anyway?
Okay, so let’s look at the second part first. What does a “proxy for money” mean? Very early on in human history, people realized that it was far more efficient to trade goods and services using a universally valued and quantifiable medium of exchange, rather than direct barter. For 3,000 or so years, the preferred commodities have been gold or silver, usually in the form of small bars or coins. The problem for everyday people was assuming the burden and risk of carrying around these precious metals. Six hundred years ago, in Renaissance Italy—thanks in no small part to the Medicis and double-entry bookkeeping—“modern” banks and paper currency were invented. A bank or government treasury would hold gold and silver as a safe deposit, either for a fee or as a loan in exchange for interest paid, and issue receipt certificates (we call them bills) in denominations small enough to be used in daily commerce. They were redeemable on demand at the bank for the corresponding amount of physical gold and silver.
During the 20th century—as world war led to depression, to another world war, to inflation, etc.—the bond between gold and silver and paper currency was stretched to the limit and finally severed. If you are in your 60s or 70s, you may remember that some dollar bills used to be Silver Certificates, and that until 1965, the dimes, quarters and half-dollars you received in change when you broke a dollar bill were made mostly of silver. In August 1971, the Nixon administration “temporarily” suspended the convertibility of dollars to gold. That gold window is still closed, which makes the U.S. dollar a fiat currency.
Well, there’s an interesting word. Fiat is Latin for “let it be done”—meaning that the bills in your wallet have value only because our government decrees it so. We call dollars “legal tender” because coins or bank notes, by law, must be accepted when offered as payment “for all debts, public and private.”
Fiat currency has two main drawbacks: First, a government-authorized central bank issues and controls it; second, it is not limited by quantity. Our central bank (the Federal Reserve) can create as much as it wants, whenever it wants, and inflate the supply. What’s the problem with this? The fundamental rule of supply and demand says the more dollars that are created, the less value each one has. For example, a U.S. dollar in 2021 has less purchasing power for goods and services than a nickel in 1913.
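That nickel-versus-dollar claim is easy to sanity-check with a little arithmetic. Here is a quick Python back-of-the-envelope calculation; the CPI index values below are approximations I am supplying for illustration, so treat the output as ballpark, not gospel:

```python
# Rough purchasing-power comparison using approximate CPI-U index values.
# (These index numbers are assumptions for illustration; consult official
# Bureau of Labor Statistics tables for exact figures.)
CPI_1913 = 9.9     # approximate annual average for 1913
CPI_2021 = 271.0   # approximate annual average for 2021

# What a 1913 nickel buys, expressed in 2021 dollars
nickel_1913_in_2021_dollars = 0.05 * (CPI_2021 / CPI_1913)

# What a 2021 dollar buys, expressed in 1913 dollars
dollar_2021_in_1913_dollars = 1.00 * (CPI_1913 / CPI_2021)

print(f"A 1913 nickel is worth about ${nickel_1913_in_2021_dollars:.2f} in 2021")
print(f"A 2021 dollar is worth about {dollar_2021_in_1913_dollars * 100:.1f} cents in 1913")
```

Whatever exact index values you plug in, the ordering holds: the 1913 nickel comes out worth more than a full 2021 dollar.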
Something else to understand about 2021 dollars: unlike that 1913 nickel, they are mostly digital. Today, we mainly use debit and credit cards, services like PayPal, Venmo, or various types of wire transfers. Once computers were available to keep track of how and where money moved, the transition away from paper and coins was fairly simple, since a centralized issuing authority and banking system already existed. The primary hurdle was not figuring out how to create a digital file that represents a dollar, but how to prevent someone from using that same dollar more than once (aka double-spending). Banks solved this problem by keeping centralized ledgers on their computer servers that record and confirm the transactions made by their account holders. We trust our banks. Our banks trust their servers. But digital transactions are still centralized, and predicated upon that fiat value of the U.S. dollar.
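To make the double-spending idea concrete, here is a toy sketch of a centralized bank ledger in Python. The names and balances are invented, and real banking systems are vastly more elaborate, but the core check is the same: one authority records every transaction and refuses to let the same dollars go out twice.

```python
# Toy centralized ledger: one server is the single source of truth.
class Ledger:
    def __init__(self, balances):
        self.balances = dict(balances)

    def transfer(self, sender, receiver, amount):
        # The double-spend check: reject any payment the sender
        # can no longer cover on the central record.
        if self.balances.get(sender, 0) < amount:
            return False
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        return True

bank = Ledger({"alice": 100})
print(bank.transfer("alice", "bob", 100))    # True: the first spend clears
print(bank.transfer("alice", "carol", 100))  # False: same dollars a second time
```

The catch, of course, is that everyone must trust the institution running that one server, which is exactly the dependency Bitcoin set out to remove.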
Now For the Crypto Part
Ask yourself this: What if you could enter into transactions with anyone, anywhere on earth, without using U.S. dollars—without worrying about the government and the Federal Reserve inflating away the value of your currency, and without a central authority keeping an eye on who is sending or receiving money? Many attempts had been made to realize this possibility, but none succeeded in eliminating the aforementioned risk of double-spending.
That all changed on October 31, 2008, when a document was published online—by an anonymous person (or persons) named Satoshi Nakamoto—entitled: Bitcoin: A Peer-to-Peer Electronic Cash System. It laid out the blueprint for creating an alternative digital currency using a transparent, decentralized ledger “…allowing any two willing parties to transact directly with each other without the need for a trusted third party.”
Why was this such a big deal? Bitcoin was the first digital “currency” not issued by government fiat or controlled by a central bank. Think about what the world was like before the internet, and how centralized information was. You could read newspapers like the New York Times or Washington Post, watch the evening news on the three major networks, pick up a copy of Time or Newsweek, or go to a bookstore or library and get “old information.” The internet decentralized news and information. Advocates of Bitcoin call it “the internet of currency.”
Bitcoin is called a cryptocurrency because of the encryption technology that secures its shared transaction ledger, called a blockchain. Transactions are broadcast to an enormous group (currently 80,000 or so) of individual server operators (called “miners”), who verify that each transaction is legitimate and that the same unit of Bitcoin is not being spent more than once. Verified transactions are grouped together in a “block,” and miners compete to solve a difficult computational puzzle—the “proof of work”—for the right to add that block to the “chain” of already existing blocks. Every block added makes the chain of transactions harder to tamper with and more trustworthy. The miners who verify and record the transactions in the blockchain earn new Bitcoins as compensation for their work, which are then introduced into circulation. The Bitcoin blockchain is programmed so that only 21 million Bitcoins will ever exist—a predictably scarce, finite supply. At present, 19 or so million have been created in the manner just described, hence the nickname “digital gold.”
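The block-and-chain structure can be sketched in a few lines of Python. This is a toy model, not the real Bitcoin protocol—actual mining hashes block headers at an astronomically higher difficulty, and the transactions here are made-up strings—but it shows the two essentials: each block commits to the hash of the block before it, and a “miner” must grind through guesses (nonces) to earn the right to add a block.

```python
import hashlib
import json

def block_hash(block):
    # Serialize the block deterministically, then hash it with SHA-256.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def mine(prev_hash, transactions, difficulty=3):
    # Grind nonces until the block's hash starts with `difficulty`
    # zeros: a miniature proof of work.
    block = {"prev": prev_hash, "txs": transactions, "nonce": 0}
    while not block_hash(block).startswith("0" * difficulty):
        block["nonce"] += 1
    return block

genesis = mine("0" * 64, ["reward -> alice: 50"])
block2 = mine(block_hash(genesis), ["alice -> bob: 10"])

# The link: tampering with genesis would change its hash, invalidating
# block2's "prev" field and every block after it.
print(block2["prev"] == block_hash(genesis))  # True
```

This chaining is why older blocks are effectively frozen: rewriting any one of them would force an attacker to redo the proof of work for every block that follows.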
As you might have heard, the “mining” computers consume a jaw-dropping amount of energy, which has some potentially dire implications for crypto’s future, but let’s leave that to another time because it is just as interesting as (and even more complex than) the subject matter at hand.
If you feel more than a little bit lost at this point, take a deep breath and hang in there. Elon Musk likes to say that he’s one of the smartest people on the planet, but admits that he doesn’t “really understand” Bitcoin. And yet, he’s reportedly dumped billions into it.
Musk certainly understands the security advantages of cryptocurrency, and appreciates its transparency. Banks are not transparent; they keep a proprietary ledger on a central server, and only they can see and control what’s going on. Bitcoin’s blockchain is both transparent and, for all practical purposes, anonymous. If you are a holder of Bitcoin, you can see all of the transactions and balances, but can’t identify any fellow Bitcoin holders. Every server in the Bitcoin network holds a duplicate, continually updated copy of the blockchain, which makes hacking or destroying the network virtually impossible, since “bad actors” would have to hack or destroy more than half (51 percent) of the servers simultaneously.
So What Exactly Is a Bitcoin?
Is Bitcoin a currency? An asset? Both? Neither? When I ask myself these questions, I think of Dr. McCoy in Star Trek: “Dammit, Jim, I’m a doctor, not a cryptocurrency guru.” For starters, there are no physical Bitcoins. When you own Bitcoin, it means that you own access to your specific account record in the blockchain and can send or receive Bitcoin from other accounts. The more crucial question is: What’s it worth? Simple answer: The value of Bitcoin is entirely reliant on what its holders and users collectively decide it is. In this way, it’s similar to gold. But remember—gold is a tangible hard asset, and Bitcoin is a digital asset.
Bitcoin has three major advantages over the current government fiat system. First, it gives you complete control over your currency. You and you alone have access to your account. No government or bank can freeze or confiscate your holdings. Second, Bitcoin eliminates third-party fiduciary intermediaries, so it’s potentially cheaper to use than traditional methods of transaction. Third, and this is perhaps the best advantage, it opens up global commerce to 2.5 billion people who don’t have access to the existing banking regime. The “underbanked”—the poorest, most disadvantaged and unfortunate of our fellow human beings—can buy, sell, or pay for goods and services because their bank is on a smartphone in their pocket. Stop and think for a moment about the potential of such a global blossoming of human capital.
More and more merchants are accepting Bitcoin and other cryptocurrency as payment, as are airlines, hotels and restaurants. Still, the world has taken only the first steps to global commercial acceptance.
Who’s on Board with Cryptocurrency?
Not everyone. At least not yet. Some financial professionals, including Peter Schiff and Michael Burry (portrayed by Christian Bale in The Big Short), see Bitcoin and other crypto assets as a “greater fool” game of hot potato or musical chairs. What if, suddenly, people all cash in their positions? Businesses might no longer accept it as payment, and its value could plummet to zero. Perhaps the most important hedge against this happening is that Bitcoin is not issued or controlled by a central authority or government. Governments can theoretically make their currencies worthless by fiat, but they can’t make Bitcoin worthless by fiat. Also, there is no danger that the value of Bitcoin can be inflated or deflated by the creation of more Bitcoins, as there is that fixed number I mentioned earlier. Their value is determined by their holders and users, not a central authority. Another way to think of it is that the value of dollars is set by the government and the Fed from the inside out. By contrast, the value of Bitcoin is set by its holders and users from the outside in.
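Incidentally, that fixed number is not an arbitrary constant stored somewhere; it falls out of Bitcoin’s published reward schedule, in which the miners’ block reward started at 50 BTC and halves every 210,000 blocks. Summing the schedule (a quick sketch using the protocol’s published parameters) lands almost exactly on the famous 21 million:

```python
# Total supply implied by the halving schedule: 50 BTC per block,
# halved every 210,000 blocks, until the reward rounds below one
# satoshi (10^-8 BTC, Bitcoin's smallest unit).
reward = 50.0
total = 0.0
while reward >= 1e-8:
    total += 210_000 * reward
    reward /= 2

print(round(total))  # 21000000
```

The geometric series (50 + 25 + 12.5 + …) converges to double its first term, which is why 210,000 blocks × 100 BTC works out to exactly 21 million.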
Where does the U.S. government (or any other government for that matter) stand on Bitcoin and other cryptocurrency? Let’s just say that Washington has serious concerns. Because there is a finite amount of Bitcoin, investors who traditionally put their money into Treasury bonds may come to see Bitcoin as the more appealing safe haven. Even before the COVID-19 pandemic, our economy and those of other nations were becoming more and more dependent on government spending of trillions in freshly printed fiat currency to prop up their gross domestic products, in traditional Keynesian fashion.
Do you believe that our government will be able to service its soaring debt in the future? A lot of people have their doubts. If investors who currently keep their money in dollars and treasury bonds turn to cryptocurrency like Bitcoin, it will make it harder for the government to issue more debt, because someone actually has to buy it. You probably don’t hear much about this problem unless you’re an Econ nerd. The problem that you do hear about is that since the government cannot monitor or verify cryptocurrency transactions, they enable illicit and criminal activities like money laundering, drug dealing and terrorism—unquestionably very real problems with no immediate solutions. However, it’s not as if dollars aren’t being laundered or used by drug cartels or terrorists, too. Of course, there are many reasons (other than nefarious ones) why people might not want the government insinuating itself in their business. Libertarians, for example, believe that the government by definition is badly motivated (Google “Welfare and Warfare State”) and hold strong moral objections to its knowing how or with whom they transact business.
The truth of the matter is that, any time that people act entrepreneurially and seemingly under the radar of government, it tends to raise suspicions. But institutional investors and advocates of Bitcoin aren’t outlaws. They are simply placing their bets on a rival to the government’s primary asset, which is the full faith and credit of its currency, because the value of Bitcoin goes up when confidence in the dollar as a store of value drops. Bottom line? If you sense that your government is going to continue inflating its currency, then cryptocurrency is a possible hedge.
As mentioned earlier, Elon Musk (actually, Tesla) has invested billions in Bitcoin. So has Michael Saylor, the longest-sitting tech CEO in the country. The takeaway is that they believe more in Bitcoin as a repository of value than the U.S. dollar or Treasury bonds. The more new Musks and Saylors there are who follow suit, the more confidence will be generated in Bitcoin’s value. Its success or failure depends entirely on public perception, which is why so many remain so skeptical. However, as long as people are uncomfortable with the way their governments are handling money, diversifying into crypto assets is likely to remain an appealing option.
Will I be dumping my liquid assets into Bitcoin anytime soon? The short answer is Not yet. I would be more inclined to invest in gold, silver and durable commodities like grains and farmland. Bitcoin is way too volatile for my taste and bank account. Any investment that doubles in a year, as Bitcoin has, is tempting to be sure. But I don’t think I could stomach a one-day drop in value of 30%, which has also happened. The prospect of becoming a high-flying crypto-millionaire may be appealing, but the prospect of living under a highway overpass, however remote, isn’t worth the risk. As the old adage goes: Only invest what you are prepared to lose. EDGE
Editor’s Note: To say that the cryptocurrency world changes every day is an understatement. As this story went to press, the Government of El Salvador declared Bitcoin to be legal tender. Five other Central American and Caribbean nations might follow their lead this summer. And, without revealing exactly how, the FBI announced that it managed to claw back most or all of the ransomware Bitcoins paid to hackers of the Colonial Pipeline, raising fascinating questions about just how secure and anonymous cryptocurrency may actually be.
History’s greatest ‘best-case’ scenarios.
Roughly 3,000 years ago, the Israelites and Philistines faced off in the Valley of Elah for what promised to be a costly, no-win bloodbath. The two sides did the sensible thing and agreed to settle their differences in single combat, but the Hebrew warrior-king Saul was none too keen on tangling with the giant Goliath. People were measured in cubits back then and Goliath had cubits to spare; most Biblical scholars believe that he stood nearly seven feet tall, so Saul’s reluctance is understandable. You know the rest of the story: A boy named David, tasked with bringing food to his older brothers on the front lines, volunteered to take on the towering Philistine by himself. He selected five smooth stones from a riverbank, eschewed Saul’s offer of armor and spear in favor of his trusty sling and—after some Old Testament trash-talking—took out Goliath with a rock to the forehead…and then decapitated his corpse (because that’s how they rolled back then) as the defeated Philistines hustled back to wherever it was they came from.
As “best-case scenarios” go, this one is arguably the great-grand-daddy of them all, especially considering the worst-case alternative: the near certainty that David would end up squirming in agony on the business end of Goliath’s javelin, his head soon to be a hood ornament on the giant’s chariot. Over the 30-plus centuries since, we have used this Biblical encounter to characterize the most extreme mismatches or impossible odds. When you say “David versus Goliath” everyone knows exactly what you mean. For what it’s worth, the same Biblical scholars who worked out Goliath’s height have also suggested that David might not have been the stone-slinging hero that day. Elhanan may have taken out Goliath; or perhaps David and Elhanan were one and the same. Regardless, it is David whose name we remember. It’s a testament to good press, no pun intended.
What makes an iconic, all-time great best-case scenario? You begin with a bleak set of circumstances, brought on by bad luck, poor decisions or misplaced courage. Then you need to consider the horrible (and highly likely) worst-case scenario. Finally, there has to be a magnificent, wow-factor plot twist—one that almost no one saw coming—that transforms a worst case into a best one. Given those ground rules, these are some of my favorites…
Battle of Salamis • 480 BC
The Persian Empire, under King Xerxes, invades Greece—then a loose confederation of city-states—and crushes Greek land forces in battle after battle. Athenian general Themistocles hatches a plan to lure the Persian fleet of 1,200 warships into a decisive battle against 180 Athenian vessels in the narrow strait of Salamis.
Worst Case Scenario: Persia destroys the Greek fleet, picks off the remaining city-states and snuffs out Western civilization before it begins.
How It Went Down: The Persian vessels press for a decisive victory, but their lines become jumbled as the battlefield narrows, just as Themistocles had planned. The Greek ships form a wedge and ram their way through the disorganized Persian navy, sending heavily armored Greek marines streaming onto enemy boats to overwhelm the lightly armored Persian fighters. Xerxes may have lost half or more of his fleet in the debacle and Greece was never threatened by Persia again.
The American Revolution • 1775
British colonists, unhappy with their lack of say in Parliament, decide to take on the most powerful global military force in history for an idea: Freedom.
Worst Case Scenario: The Redcoats send their best troops to America, crush the ill-equipped and poorly trained colonial rebels, hang the signers of the Declaration of Independence, and then return to business as usual.
How It Went Down: Fighting an “idea” turns out to be a losing battle, at least in this case. The Americans know they have home-field advantage: They don’t have to beat the British, only wear them out. With help from France, Washington’s army traps Cornwallis and his army at Yorktown and the rest is history. P.S. England saw New Jersey as a convenient highway between Philadelphia and New York City. How wrong they were!
Harland Sanders • 1930
An impulsive, self-righteous and occasionally violent 40-year-old ex-manual laborer and disgraced attorney is hired to run a Shell station in Depression-era Kentucky. He begins selling fried chicken out of the structure to make ends meet.
Worst Case Scenario: Sanders blows yet another employment opportunity or, worse, kills an unwary traveler with tainted chicken.
How It Went Down: Thanks to his “secret recipe” and pressure-cooking method, Sanders turns Kentucky Fried Chicken into one of the great franchise operations in the world during the 1950s and 1960s. Sanders sells to John Y. Brown for $2 million in 1964 and becomes KFC’s brand ambassador.
Harry Truman • 1945
A former haberdasher and Kansas City ward heeler chosen by President Franklin Roosevelt as his 1944 running mate, the lowly regarded and largely marginalized Truman ascends to the Oval Office after FDR’s death. Truman has never been told about the atomic bomb, believes that Russia and China might make good postwar allies, and has a habit of personally attacking anyone he feels has slighted him.
Worst Case Scenario: Truman’s inexperience, stubbornness and indecisive leadership prolong the war in the Pacific, leave Europe in ruins and choke off the American economy, leading to a humiliating defeat in the 1948 presidential election.
How It Went Down: Truman brings the war to a rapid conclusion, resurrects Europe with the Marshall Plan, keeps Greece and Turkey out of Communist hands, orchestrates the Berlin Airlift, supports the formation of NATO and the United Nations, champions landmark federal housing legislation and ends segregation in the armed forces by executive order. And still almost loses the 1948 election!
Polio Vaccine • 1953
While still in testing phase, Dr. Jonas Salk brings home samples of his polio vaccine and inoculates his three young sons. Prior to this, Salk had mostly administered the vaccine to monkeys.
Worst Case Scenario: His sons turn into monkeys. No, just kidding. A life-threatening adverse reaction and the bad publicity accompanying it set back the polio program for years.
How It Went Down: The Salk children experience no ill effects and America welcomes the news with unrestrained joy. “There was jubilation,” Peter Salk recalls. “There was such a sense of relief that this fear, which had been hanging over everyone’s heads for years and years and years, was finally lifted.”
Lasse Viren • 1972
A police officer makes Finland’s Olympic team and reaches the finals of the 10,000 meters at the Summer Games in Munich.
Worst Case Scenario: He literally falls flat on his face. Which he did, midway through the race.
How It Went Down: Viren gets to his feet, catches up with the pack and stuns the crowd by starting his “kick” with more than a lap to go, winning gold in world-record time. In the 5,000 meters a week later, Viren wins again, beating heralded American Steve Prefontaine and other top international stars.
Botulinum Toxin • 1977
Ophthalmologist Dr. Alan Scott begins injecting botulinum type A neurotoxin into patients to treat strabismus, a condition that causes eyes to cross or diverge.
Worst Case Scenario: The neurotoxic protein is considered nature’s most poisonous substance; so blindness, paralysis and death.
How It Went Down: The experiments are an unqualified success and during the 1980s, Dr. Scott trains hundreds of ophthalmologists how to inject the drug he names Oculinum. In 1991, he sells Oculinum for $9 million to the drug company Allergan, which renames it Botox. In 2002, Botox is approved by the FDA for cosmetic use.
The McRib • 1981
Following a promising test-marketing run, the McRib sandwich becomes a regular menu item at McDonald’s…and is a disastrous failure. It is removed from U.S. stores in 1985, then returns on a “limited” basis nine years later as a tie-in to The Flintstones live-action movie, with packaging that features Rosie O’Donnell as Betty Rubble.
Worst Case Scenario: The gray, “restructured” pork slab is a critical flop in its second incarnation, much as The Flintstones was.
How It Went Down: Over the next two decades, McDonald’s offers the McRib through special limited-time promotions before announcing a “Farewell Tour” for the sandwich in 2005. A grassroots consumer effort to save the McRib creates a groundswell of demand, which is further accelerated by social media posts that help fans “chase” the McRib wherever it is being offered. In 2020, during the COVID-19 pandemic, McDonald’s makes the McRib available nationwide for the first time.
Pabst Blue Ribbon • 1996
A beer brand in business for more than 150 years shutters its flagship brewery in Milwaukee following two decades of declining revenue. Five years later, the company hires a former Benetton exec to turn the company around.
Worst Case Scenario: Marketing a beer brand that has lost 90% of its customers proves a bit more difficult than selling sweaters…and “PBR” joins Jax, Falstaff, Schmidt’s, Grain Belt and Narragansett in the pantheon of extinct and zombie beer brands.
How It Went Down: One word: Hipsters. Pabst finds a new fan base with urban 20-somethings in search of the next dive bar and an ironic down-market beer. The company pours marketing dollars into sponsorships of indie rock, cool small businesses, post-college sports leagues and social media…and the brand returns to market prominence.
The Blair Witch Project • 1999
Wannabe filmmakers Daniel Myrick and Eduardo Sanchez write a loose, largely improvised script about a trio of hikers who disappear in the Maryland woods. They blow through their $25,000 budget in eight days and shoot 20 hours of footage to create a grainy 82-minute film they hope will go straight to cable.
Worst Case Scenario: A horror film with no witch and no special effects is of little interest to distributors, no network will air it, critics pan it, audiences hate it, and Myrick and Sanchez are broke and never find work in the movie industry again.
How It Went Down: Blair Witch generates buzz at the 1999 Sundance Festival by listing its actors as missing or dead and is marketed to filmgoers almost exclusively through the Internet, receiving more than 150 million hits. It grosses $250 million, returning ten thousand times its original cost and pioneering the genre of found-footage filmmaking.
Deadliest Catch • 2005
Ex-Turner Broadcasting honcho Thom Beers films a pair of one-hour documentaries about crab fishing in Alaskan waters during the 1990s and successfully shops the concept as a cable-TV reality series.
Worst Case Scenario: Audiences quickly lose interest in a show hinging on how many crustaceans tumble out of a wire trap…episode after episode after episode.
How It Went Down: The gritty, true-to-life documentary style of the show—backed by the narration of Dirty Jobs star Mike Rowe—transfixes audiences, who learn that commercial fishing is not only wildly unpredictable…it does indeed have the highest mortality rate of any profession in the world. Deadliest Catch aired its 250th episode in 2020 and is still going strong.
Medical breakthroughs you may have missed during the pandemic.
With the 2020–21 news cycle hyper-focused on COVID-19, these five advances in medicine went almost unnoticed…
We may look back at 2020 as the year that we quietly turned the corner on Alzheimer’s, thanks to a couple of important breakthroughs in identifying biomarkers, as well as the development of a new drug. In the early days of the COVID-19 pandemic, the FDA approved Flortaucipir, a radioactive diagnostic agent used to image tau neurofibrillary tangles in the brain. In October, doctors began using the new PrecivityAD blood test to determine whether a patient is likely to have the presence (or absence) of amyloid plaques in the brain, which is a pathological hallmark of Alzheimer’s disease. In early November, the human monoclonal antibody Aducanumab—the first new drug in a generation to treat Alzheimer’s—moved an important step closer to final FDA approval. In addition to these news items, dozens of major non-drug studies on the effects of supplements, diet, exercise and sleep on cognitive decline continued to generate mounds of useful data for the prediction, diagnosis and treatment of the disease.
A couple of years ago, fewer than one in 500 doctor visits in the U.S. were virtual. Although the technology was in place, the impetus for patients, health providers and insurers just wasn’t. That all changed when we began masking up and hunkering down. Around one in 10 interactions fell under the Telehealth heading during the early months of the pandemic. While the healthcare industry is still sorting through the Mt. Everest of new data this shift generated, it is safe to say that Telehealth is no longer a solution looking for a problem to solve. At Trinitas, Telehealth has enabled the adult Dialectical Behavior Therapy (DBT) program to serve a larger audience at a critical time.
“Distance from our location does not have to be a barrier anymore,” explains Essie Larson, Ph.D., co-director of the DBT Institute at the hospital. “The pandemic also allowed for large-scale research around the world to be conducted on the effectiveness of DBT in a virtual modality. The results are indicating not only great success, but also better attendance rates. While virtual services may not be a fit for everyone, having it as an option for either ongoing therapy or for occasional sessions—due to barriers such as bad weather, illness or car trouble, for example—is wonderful.”
It may have taken a pandemic to provide proof of concept, but Telehealth has become a tool ideally suited for a wide range of challenges to the traditional face-to-face, doctor-patient relationship, including continuity of care, triage and plain old capacity.
Last fall, the American Heart Association proclaimed SGLT2 inhibitors and GLP-1 receptor agonists—blood-sugar control medications that are prescribed primarily for Type 2 diabetes—game-changers for patients with higher risks for cardiovascular disease and chronic kidney disease. “There are many drugs available to treat diabetes that lower the patient’s blood sugar,” says Dr. Ari Eckman, an endocrinologist at Trinitas. “The class of SGLT2 inhibitors and GLP-1 agonists not only lowers the patient’s blood sugar—which is important in managing their diabetes—but it has also been shown to decrease the risk of complications of heart disease and kidney disease. This is an additional benefit to using these excellent medications for patients with diabetes.”
The drugs have been around for over a decade but have not been widely prescribed for diabetics with these conditions. Clinical trials completed earlier in 2020 found that SGLT2 inhibitors and GLP-1 RAs can safely and significantly reduce the risk of cardiovascular events and death, reduce hospitalization and slow the progression from chronic to end-stage kidney disease.
A large-scale Iowa State University study of the connection between specific foods and later-in-life cognitive acuity confirms what we all secretly suspected: that the benefits of wine and cheese keep paying dividends long after the wine and cheese party is over. Of all the foods evaluated, cheese was shown to be the most protective against age-related cognitive problems. And moderate regular consumption of red wine was associated with improvements in cognitive function. Other tidbits from the Iowa State results were that excessive salt intake is bad, particularly for individuals at risk for Alzheimer’s (we knew that already) and that, among the red meats, only lamb was shown to improve long-term cognitive ability (good news…unless you are a lamb).
Any parent who has dealt with the nightmare of head lice will appreciate the FDA’s approval of Abametapir, a lotion applied to dry hair and rinsed out with water after 10 to 15 minutes. In clinical trials it was 80% effective at ridding the hair of children 6 months and older of the little buggers. Although virtual learning dramatically curtailed the number of cases in the U.S. of kids with head lice—which in some past school years touched 10 million—lice aren’t going anywhere. Indeed, many have begun to develop resistance to tried-and-true treatments.
“This new treatment,” says Dr. William Farrer, an infectious disease specialist at Trinitas, “requires a prescription, but does not require a second application, as older treatments often do. When it becomes available, it may be a significant improvement in the treatment of head lice.”
An inside look at New Jersey’s unbelievable real estate boom.
Distressing unemployment numbers. A busted economy. Stay-at-home orders. A terrifying pandemic. In early 2020, it was time to cut your portfolio losses and forget about selling your house, right? Wrong. Against all predictions and, frankly, common sense, the stock market surged and the housing market went completely berserk.
Here in New Jersey, a pandemic-induced perfect storm saw home inventory evaporate by the summer of 2020, triggering bidding wars among panicked buyers fleeing New York City and Philadelphia. They were competing with first-timers hoping to take advantage of historically low fixed interest rates (as low as 2.1%) and voracious house-flippers looking at soaring demand for rental properties. The normally healthy and predictable housing market of 2019—which had only just recovered from the 2008 housing bubble—turned into a heart-pounding, high-stakes game of chicken between desperate buyers and stubborn sellers, which has continued into the summer of 2021. Who the winners are may take a while to sort out.
Pandemic pricing created a strange new normal, where a reasonable asking price was merely a starting point for bigger, better offers. If you were a realtor with a listing, you were sitting pretty. If you were chauffeuring couples from house to house, the property you showed in the morning had offers by afternoon. New Jersey was particularly appealing to unnerved city-dwellers who equated the garden in Garden State with a virus-free suburban/ex-urban/rural landing spot.
The decision to move has become especially popular among Millennials, who represent the hottest age group in today’s market. “These younger buyers are ready to buy into the boom,” says Frank Isoldi of Coldwell Banker in Westfield. They are well-educated and well-versed in the finances of home buying. As they enter their 30s and their earning power increases, they begin to plan ahead—for a first home, for starting a family, for more space. They appreciate the financial perks that come with investing rather than renting. They are comfortable working remotely for companies seeking to reduce in-office space requirements without compromising personnel performance. This change in the business culture eliminates the need to “live close to the office” because the office can be right at home. As commuting has become a non-issue for many younger buyers, the concept of “geographically attractive” has been dramatically redefined.
That being said, New Jersey has become a particular hotbed of relocation activity because of its proximity to New York. You may not need to pull a 9 to 5 in the city every day, but physical proximity to clients, customers and co-workers—not to mention top-notch entertainment and dining—will never completely lose its importance or appeal.
Perception has also contributed to a red-hot New Jersey real estate market in 2020–21. Indeed, we are experiencing one of the rare moments in history where buyers believe it is the ideal time to buy and sellers are convinced this is the perfect time to sell. Just ask anyone in real estate: If you don’t move quickly, someone else almost certainly will.
“My buyers’ only concern is getting a house,” says Isoldi, who cites a tangible move from rent to buy along with COVID-phobia and lock-up lethargy as significant motivators. The downside of this frenetic pace is “fatigue,” he points out, explaining that some buyers are so exhausted from missing out on homes that they resort to unrealistically padded offers. This has led to a noticeable upswing in the number of withdrawals during the requisite three-day attorney review period, when some buyers and even some sellers have been known to back out.
“This scenario is not for the weak of heart,” agrees Stephen Smith, a realtor with Berkshire Hathaway on the Rumson peninsula in Monmouth County, who saw selling prices there jump on average 17% to 20% from the shutdown in March 2020 to March 2021. “The challenge for buyers is getting traction for their offer when there are multiple offers above the asking price. In Monmouth County MLS, we have a new ‘Coming Soon’ category for buyers to be able to view a property, online only, ahead of it going live on MLS. There are strict regulations preventing any property ingress during this period, but it enables the buyers to architect their offer with not only the number, but also the seductive terms along with a personal letter. The challenge for an agent is having the energy, the experience and the temperament to help your buyers woo the sellers. It’s a wild-wild west environment for everyone, including the appraisers, who need to keep up with rising values in order to support the financing for the robust contract prices.”
The secret to success? A smart buyer must quickly formulate the best possible offer, one that the seller simply cannot refuse. A bidding tug-of-war among the bravest buyers often leads to a seller resorting to a master list of the “highest and best” offers from which to select a winner. Long gone are the days of emotional and drawn-out (yet sometimes exhilarating) face-to-face negotiations—antidiscrimination laws have made those risky. Gone too is the endless trekking through open houses; realtors have upped their game on virtual tours. It is no longer unheard-of for an out-of-town buyer to make a strong offer on a home he or she has never set foot in.
What makes an offer irresistible? As always, cash is king. A full-asking-price offer is also tempting. An above-asking offer even more so. In lieu of an all-cash offer, buyers have been sweetening deals with the waiving of contingencies that were once accepted as part of the transaction, including no appraisal, no inspection, no house-sale contingency and a convenient and flexible closing date. “Love letters” accompanying offers have become popular, but there is growing concern in the real estate profession that they expose sellers, agents and even buyers to possible legal repercussions down the road (see sidebar on facing page).
Ask a New Jersey realtor what they’ve seen over the past 18 months that they couldn’t have imagined a few years ago and you get some really interesting answers. Jaynie Wagner Carlucci of David Realty Group in Westfield has noticed a couple of aggressive new buyer strategies. One is offering all cash for a house and then refinancing after closing. This eliminates the mortgage contingency, which is appealing to sellers in a hot market. “I have also heard of people losing a bid and then ‘stalking’ sellers by going on social media and finding some way to reach them,” she says. “For instance, they find mutual friends of the seller to create an emotional connection…and sometimes actually change the seller’s mind to win the house.”
Like many realtors in the Garden State, Carlucci has written successful offers for buyers who didn’t set foot in the house until the home inspection. In another instance, she had a client offer $100,000 over asking price with appraisal waiver who still lost the bid. Recently, she posted a fast-forward video of the staging of a house, mostly for fun. She ended up getting calls from 10 realtors, triggering a bidding war before the property was even listed.
Are sellers experiencing remorse that they sold their homes too early, missing future appreciation? Smith answers with a resounding No. Sellers, he says, are appreciative of this opportunity to capitalize on a hot market and are cognizant that the stimulus of this surge in demand may not be sustainable. “And buyers express little or no regret at having paid top dollar, because by the time the deal closes, they have often seen additional appreciation. It’s a win-win scenario, and a win-win-win if you factor in the State of New Jersey, which is experiencing a transfer tax windfall at a time when it needs the money most.”
One change in the business, Isoldi points out, is that the relationship with buyers has become trickier, while sellers have become easier to deal with. Realtors are hesitant to make recommendations to buyers about making “best offers”—particularly waiving contingencies—but also when it comes to sweetening the deal in other ways. One buyer, he says, resorted to dangling expensive box seats at a critical Yankees game. Sellers, on the other hand, are more open to advice now, since the best agents come loaded with detailed metrics, a long list of “comps” and recent contract closings. And of course, proposing an attractive asking price is easier in a town such as Westfield, with an enviable mix of home sales, from brand new to historical vintage, in all price ranges.
Isoldi has been enjoying the energy and buzz in the marketplace and doesn’t mind the chaos. As he looks to the future, he is convinced that the market will stay strong, although it might not continue to rise at the same rate. In his opinion, there probably will come a point at which more buyers start to get cold feet and the seller boom levels off. Still, he points out, if interest rates stay low—and if the construction industry rebounds to feed the inventory, jobs increase and the economy stabilizes—the prognosis looks excellent. Despite so many “ifs,” according to Isoldi, “There’s still a lot of wind left in real estate’s sails.”
The real estate market shows no sign of cooling down thanks to the widespread availability of vaccinations and some positive signs of economic recovery. New Jersey’s current inventory shortage is projected to persist in the near term, keeping prices high through 2021 and into 2022. The consensus is that sellers will continue to maintain the edge in the tug of war with buyers, who will continue to find daring and creative ways to make their bid the best.
“Timing is everything when houses are selling as soon as they hit the market—or before,” Carlucci says. “Being able to act fast and in a compelling way is the key.”
Match the properties with the stars!
In the seller’s market of 2020–21, a lot of over-the-top homes in America sold for big money. Nowhere bigger than on that “other” coast, where some eye-popping California properties went up for sale during the pandemic. For eight figures—guess what?—you get a half-decent view!