Bad things tend to dominate commentary on the news, causing us to miss the continued reality of human progress. So I have promised to pause occasionally and put together a brief roundup of good news.
A lot of that good news is about science and technology—there is some mixed news from the “year of elections,” but that’s another roundup—but I can’t pass up mentioning one other big story: the US cricket team’s surprise win over Pakistan. I don’t know much about cricket, but I’ve read the British newspapers enough to know that Pakistan is one of the world’s top teams, so I have an idea how big this is.
CNN tries to explain it to Americans by comparing it to the Miracle on Ice, the surprise American win over the Soviets at the 1980 Winter Olympics.
Call it the Miracle on Grass.
A national cricket team that most Americans didn’t even know they had beat one of the sport’s global powers, Pakistan, for whom the game is a national obsession….
Cricket briefly flickered into US consciousness as the result popped up on news sites in a rare moment for a sport that lives in obscurity in the United States outside the South Asian and Caribbean communities.
“Beating Pakistan in the World Cup is going to open many doors for us,” said USA Cricket captain Monank Patel in Texas, where the game took place in a converted minor league baseball park.
Corey Anderson, who represented New Zealand internationally and now plays for the US team, told CNN he got hundreds of text messages after the win. “It’s probably shocked the cricketing world,” said the 33-year-old, who has an American wife and kids.
This report includes a brief explainer of how the game of cricket works—and, well, it almost made sense. I’ll keep working at it.
I just wrote a draft on this for Discourse, making the argument that if America can win at cricket, a game vanishingly few of us even understand, then we can win at anything. And the key to this, as the excerpt above indicates, is being willing to import talent—such as the Indian, Caribbean, and other Anglosphere immigrants who make up the US cricket team.
I’ll post that piece when it goes up.
Situation Normal: We Think It’s All Fouled Up
One of our biases toward disaster and apocalypse is that we tend to pay a lot of attention to problems while they are problems—but the moment something ceases to be a crisis, the moment it returns to normal, it drops out of the news and is forgotten. We remember the problem but never register the solution.
You see this in politics when crime spikes briefly for a year or so and everyone becomes convinced that crime is spiraling out of control. They go on thinking this, even as crime falls back down to previous levels. A Pew overview sums it up: “Americans tend to believe crime is up, even when official data shows it is down.” You could say maybe the people know better, and their own personal experience is telling them something that is not captured in the statistics. But: “While perceptions of rising crime at the national level are common, fewer Americans believe crime is up in their own communities.” In other words, everybody thinks crime is high somewhere else—probably New York City, or maybe San Francisco.
Meanwhile, crime is way down again.
The new fourth-quarter numbers showed a 13% decline in murder in 2023 from 2022, a 6% decline in reported violent crime and a 4% decline in reported property crime. That’s based on data from around 13,000 law enforcement agencies, policing about 82% of the US population, that provided the FBI with data through December.
“It suggests that when we get the final data in October, we will have seen likely the largest one-year decline in murder that has ever been recorded,” said Jeff Asher, a former CIA analyst who now studies crime trends.
The big picture here is that crime rates actually did rise sharply from the 1960s through the 1980s. Then in the 1990s, they began a steady long-term decline to the relatively safe levels of the last decade. There was a brief spike in 2020, as the pandemic hit, and there are various theories about why. Yes, the George Floyd protests and the “defund the police” movement in 2020 probably had an effect. But for the most part, the police were never actually defunded, and the increase and subsequent decrease in crime also happened in places that were not directly affected by those trends.
The explanation offered in the NBC News report I linked to above seems more plausible.
Asher and other experts say the biggest factor behind the drop in crime may simply be the resumption of anti-crime initiatives by local governments and courts that had stopped during the pandemic.
“After a terrible period of underfunding and understaffing caused by the pandemic, local governments have, by most measures, returned to pre-pandemic levels,” wrote John Roman, a criminologist at the University of Chicago. In an interview, Roman said, “The courts were closed, a lot of cops got sick, a lot of police agencies told their officers not to interact with the public. Teachers were not in schools, not working with kids.”
The pandemic caused a lot of things to grind to a halt, including various anti-crime measures. Once they got going again, things returned to normal, as they did in most other areas. But due to our biases toward gloom and doom, it takes a long time for the public’s feelings about crime to readjust to the reality.
To borrow from the family-friendly version I heard as a kid to explain the acronym SNAFU, the situation is normal—it’s not all fouled up, but we think it is.
Call Off the Beepocalypse
Something similar has happened in a rather different field. The Washington Post recently reported that honeybees are back.
After almost two decades of relentless colony collapse coverage and years of grieving suspiciously clean windshields, we were stunned to run the numbers on the new Census of Agriculture (otherwise known as that wonderful time every five years where the government counts all the llamas): America’s honeybee population has rocketed to an all-time high.
We’ve added almost a million bee colonies in the past five years. We now have 3.8 million, the census shows. Since 2007, the first census after alarming bee die-offs began in 2006, the honeybee has been the fastest-growing livestock segment in the country! And that doesn’t count feral honeybees, which may outnumber their captive cousins several times over.
The Post report tries to return to its normal beat by focusing on the gloom-and-doom aspects of the story. (We don’t know very much about the population of non-domesticated bees.) But the Competitive Enterprise Institute has an overview of how the crisis disappeared.
Researchers had long been split on what was causing the bees to die. Some claimed that infestations of tiny mites were killing the bees and causing what biologists termed “colony collapse disorder,” or CCD. Other bee experts insisted that agricultural pesticides were to blame, and that farmers who were trying to kill off other insect pests were unintentionally destroying their own livelihood and future by damaging their most important pollinators, the honeybees.
My former Competitive Enterprise Institute colleague Angela Logomasini wrote quite a bit about this debate when it was raging…. “Honeybees will not go extinct any time soon for the same reason we don’t fear the loss of cows or chickens,” she predicted. “All these species have important market value. Honeybees are largely a domesticated species, not completely different from cattle or even the family dog, and their very presence here in the United States has always been driven by the desire for honey or pollination services. In fact, when the colonists appeared in America, there were no honeybees. They had to be imported from Europe so that the settlers could have an affordable supply of honey.”
And that is basically the same reason reported by the Washington Post for why bee populations have improved…. That is the opposite of the criticism that we heard for years, which was that a greedy capitalist desire to maximize farm yields led large farm operators to overuse pesticides, which were supposedly pushing bees to the brink of extinction….
It’s certainly true that some hives in the mid-to-late 2000s were dying off, raising the cost of pollination services for farms. In response, though, people in the agriculture industry invested more time and money into the world of beekeeping, and now we have more bees than ever.
The folks at CEI are great fans of the economist Julian Simon, and this story is a good example of his argument that the “ultimate resource” is human ingenuity, which has proven capable of averting every claimed apocalypse.
Climate Resilience
Some improvements in human life come so gradually that we don’t notice them until someone finds a way to measure the difference. In this case, Our World in Data put together a review of long-term improvements in weather forecasting.
The Met Office says its four-day forecasts are now as accurate as its one-day forecasts were 30 years ago….
The European Centre for Medium-Range Weather Forecasts (ECMWF) produces global numerical weather models. While national weather agencies use much higher-resolution processing to get local forecasts, these global models provide a crucial input into these systems. The ECMWF publishes analyses of its errors over time. This is shown in the chart below. It shows the difference between the forecast and the actual weather outcome for forecasts 3, 5, 7, and 10 days in advance….
Three-day forecasts—shown in blue—have been pretty accurate since the 1980s, and have still gotten a lot better over time. Today the accuracy is around 97%. The biggest improvements we’ve seen are for longer timeframes. By the early 2000s, 5-day forecasts were “highly accurate” and 7-day forecasts are reaching that threshold today. 10-day forecasts aren’t quite there yet but are getting better.
This report also graphs out the improving accuracy of prediction for the paths of hurricanes. “Meteorologists can now make pretty accurate predictions of where a hurricane will hit three or four days in advance, which lets cities and communities prepare while preventing unnecessary evacuations that might have been implemented in the past.”
That leads us to the significance of weather forecasting, which is about a lot more than whether the rain will ruin your picnic.
Accurate forecasts can save lives by giving early warnings of storms, heat waves, and disasters. Farmers use them for agricultural management, which can make the difference between a lost harvest and a harvest of plenty. Grid operators rely on accurate forecasts of temperatures for heating and cooling demand, and how much energy they’ll get from wind and solar farms. Pilots and sailors need them to carry people across oceans safely. Accurate information about future weather is often absolutely vital.
This report also indicates that there is still a gap between developed nations (particularly the US and Europe) and less developed nations, both in the accuracy of their forecasts and in the networks that broadcast this information to the people who need it—which suggests a relatively inexpensive way to make further improvements in the lives of people in less developed nations.
The Paradox of mRNA
I’m going to have to write something in the near future about the big paradox of the pandemic years, which is that we produced a vaccine in record time that saved many millions of lives—the biggest demonstration in decades of the value of vaccines. Yet the result is that anti-vaccine sentiment has increased.
I think it’s a combination of three things. First, we are more culturally primed for anti-technology sentiment than we were when the polio vaccine was introduced in the 1950s. Second, thanks to vaccines, we are more culturally removed from the point at which infectious disease was a leading cause of death and a threat that continually loomed over human life, so we no longer appreciate what vaccines have saved us from. Third, a long period between major pandemics meant that nobody had to think about vaccines. They accepted them as a matter of course. But the pandemic suddenly required people to form an opinion about a new vaccine, and when people are required to think, a certain percentage of them will quite frankly be bad at it. Increased opposition to vaccines is a partial measure of how high a percentage this is.
At any rate, misplaced skepticism about vaccines has centered especially on the new technology of mRNA vaccines. But again, the paradox is that this targets a new technology that works. Specifically, mRNA vaccines offer tremendous speed and flexibility in creating new vaccines, which shows enormous promise for treating diseases that could never be treated before.
In this case, it’s a vaccine for brain cancer.
Researchers at the University of Florida report they have developed an mRNA cancer vaccine that quickly reprograms the immune system to attack glioblastoma in a first-ever human clinical trial of four adult patients….
The vaccine was personalized to each patient, with the aim of maximizing immune system response. To generate each vaccine, RNA was first extracted from each patient’s own surgically removed tumor, and then the messenger RNA was amplified and wrapped in the newly designed packaging of biocompatible lipid nanoparticles, to make tumor cells “look” like a dangerous virus when reinjected into the bloodstream and prompt an immune-system response.
The result:
“In less than 48 hours, we could see these tumors shifting from what we refer to as ‘cold’—immune cold, very few immune cells, very silenced immune response—to ‘hot,’ very active immune response. That was very surprising given how quick this happened, and what that told us is we were able to activate the early part of the immune system very rapidly against these cancers, and that’s critical to unlock the later effects of the immune response.”
The irony is that among the immune responses triggered is a cytokine response, the reaction that proved so deadly in covid infections. (In case you don’t recall, the body attempted to mobilize a cytokine response against covid, but the response required was so intense that it damaged the patient’s own tissue, perforating the lungs.) Now, by way of mRNA vaccines, that same response will provide a way to save lives and shorten or eliminate much more devastating chemotherapy and radiation treatments.
The Era of Genetic Therapy
We are actually on the cusp of an era of breakthroughs in genetic therapy. Here is just one.
A British toddler has had her hearing restored after becoming the first person in the world to take part in a pioneering gene therapy trial, in a development that doctors say marks a new era in treating deafness.
Opal Sandy was born unable to hear anything due to auditory neuropathy, a condition that disrupts nerve impulses travelling from the inner ear to the brain and can be caused by a faulty gene.
But after receiving an infusion containing a working copy of the gene during groundbreaking surgery that took just 16 minutes, the 18-month-old can hear almost perfectly and enjoys playing with toy drums….
A harmless virus is used to carry the working gene into the patient. The trial is “just the beginning of gene therapies”, Bance said.
It is interesting to contemplate the extent to which genetic engineering specifically has been a focus of apocalyptic attitudes toward technology. Even the famously optimistic Star Trek franchise regarded it as taboo (though it has partly walked that back in one of the better episodes of the recent “Strange New Worlds” series). But contrast the fear with the reality.
Augmented Humans
Stuart Hayashi brought my attention to an article that explores the power of technology on a more philosophical level, including the speculation that the use of tools actually augments the human body in a very literal sense.
When a tool in your hand “becomes part of you,” it’s not just a metaphor. And it’s not just a statistical description of the motions of your body and the motions of the tool. It’s real. Your brain makes it real.
Remarkably, neurons that respond specifically to objects that are within reach of your hand will also respond to objects that are close to a tool that’s in your hand. Cognitive psychologists Jessica Witt and Dennis Proffitt found that when they asked people to use a reaching tool (a 15-inch orchestra conductor’s baton) to reach targets that were just out of range, the targets looked closer than when they intended to reach without the tool….
Whether they are tools, toys, or mirror reflections, external objects temporarily become part of who we are all the time. When I put my eyeglasses on, I am a being with 20/20 vision, not because my body can do that—it can’t—but because my body-with-augmented-vision-hardware can. So that’s who I am when I wear my glasses: a hardware-enhanced human with 20/20 vision.
If you have thousands of hours of practice with a musical instrument, when you play music with that object, it feels like an extension of your body—because it is.
This explains to me why I enjoy performing on the piano and (to a lesser extent, since I have so many fewer hours of practice) the violin. It’s not just my love for the music. It’s the sensation of creating that music, in effect, out of my own person.
All modern humans are, in effect, augmented by our technology.
A Natural History of the Internet
The piece above goes on to speculate that smartphones and the Internet can even be experienced as extensions of our body. I think that’s taking it a bit far.
Then again, I was struck by a certain amount of wistfulness in reading about the final end of Usenet, which has achieved my own personal goal in life: to survive so long that by the time you die, most people have no idea you were still around.
A combination of the Web and social media from an era before either existed, Usenet was the first encounter many of us had with the internet.
In 1979, Duke University computer science graduate students Tom Truscott and Jim Ellis conceived of a network of shared messages under various topics. These messages, also known as articles or posts, were submitted to topic categories, which became known as newsgroups.
Within those groups, messages were bound together in threads and sub-threads….
Much of the vocabulary we use today to talk about using the net springs from Usenet. Frequently Asked Questions (FAQ) files, for example, started on Usenet as summaries of information about a newsgroup so the members wouldn’t need to repeat the basics for newcomers.
Other phrases aren’t as much fun. Flame and flame war, for instance, also started on Usenet. We’ve always been, I’m sorry to say, mean to each other. At the same time, we’ve also tried to be kinder to each other. The concept is called netiquette. Then, as now, it’s always been more honored in the breach than in practice.
That early experience is why I haven’t been too fazed by anything I’ve encountered on today’s social media. As this recollection puts it, “In many ways, Usenet is a warning about how social networks can go bad. All the same problems we see today on social networks appeared first on Usenet.”
As everyone has moved to other Internet platforms over the years, Usenet has been increasingly dominated by spam and other forms of noise, and Google has just stopped supporting it. Sic transit gloria mundi.
There are some lessons here that the proprietors of our contemporary platforms ought to heed.