In a 1998 episode of The Simpsons, "The Wizard of Evergreen Terrace," Homer Simpson fills a chalkboard with calculations that seem to have been well ahead of their time. Physicist and science writer Simon Singh has written a book, The Simpsons and Their Mathematical Secrets, that shows how Homer may be smarter than he lets on. The real reason that advanced mathematics features in parts of the show is simple: many mathematicians have contributed to the writing over the years.
The Higgs Boson on Evergreen Terrace
In the episode, Homer sketches an equation on a blackboard, and Dr. Singh notes that its result is a mass not far off the actual mass of the Higgs boson measured when the particle was discovered in 2012 at CERN's Large Hadron Collider. The equation Homer writes combines fundamental physical constants known to physics, including the speed of light, the gravitational constant, and others.
Homer Had It Pretty Close
Dr. Singh points out that plugging the known values of those constants into Homer's equation yields a mass of 775 giga-electron-volts (GeV) for the Higgs boson. The mass actually measured when the particle was observed at the Large Hadron Collider is about 125 GeV, within an order of magnitude of Homer's figure, calculated more than a dozen years before the particle was confirmed.
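The article does not reproduce Homer's exact blackboard equation, but as a rough illustration of how constants like these combine into a particle-scale mass, here is a minimal sketch: the only mass you can build from the reduced Planck constant, the speed of light, and the gravitational constant alone is the Planck mass. Homer's equation presumably rescales a combination like this with dimensionless factors to land near 775 GeV; those factors are an assumption here, not something the article supplies.

```python
from math import sqrt

# CODATA values for the fundamental constants mentioned above
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c    = 2.99792458e8      # speed of light, m/s
G    = 6.67430e-11       # gravitational constant, m^3 / (kg * s^2)
GeV  = 1.602176634e-10   # joules per GeV

# The unique mass built from hbar, c, and G: m_P = sqrt(hbar * c / G)
m_planck_kg  = sqrt(hbar * c / G)          # ~2.18e-8 kg
e_planck_gev = m_planck_kg * c**2 / GeV    # E = m*c^2, ~1.22e19 GeV

print(f"Planck mass:   {m_planck_kg:.3e} kg")
print(f"Planck energy: {e_planck_gev:.3e} GeV")
# Homer's chalkboard equation would multiply a combination like this by
# dimensionless numbers to arrive near 775 GeV; the exact factors are
# not given in this article.
```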
Stephen Hawking Describes Potential Means of Human Destruction
Noted physicist Stephen Hawking has spoken out in recent years about risks to humanity. In his preface to the book Starmus, he notes that, "the Higgs potential has the worrisome feature that it might become metastable at energies above 100bn GeV. This could mean that the universe could undergo catastrophic vacuum decay, with a bubble of the true vacuum expanding at the speed of light. This could happen in an instant and we wouldn't see it coming." After briefly alarming everyone about the universe-destroying potential of advanced Higgs field research, he calmed nerves by noting that the only way to accelerate particles above 100bn GeV would be with a particle accelerator larger than the planet Earth.
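For a sense of the scale Hawking is describing, here is a back-of-envelope sketch (my own illustrative assumptions, not figures from his preface): the LHC collides protons at roughly 1.3×10⁴ GeV, about seven orders of magnitude below the 10¹¹ GeV threshold he cites. Assuming, crudely, that a ring collider's circumference scales linearly with collision energy:

```python
# Back-of-envelope scale comparison (naive linear scaling assumed;
# real accelerator design does not scale this simply)
lhc_energy_gev       = 1.3e4      # LHC collision energy, ~13 TeV
lhc_circumference_km = 27.0       # LHC ring circumference
threshold_gev        = 1e11       # energy scale Hawking cites (100bn GeV)
earth_circumference_km = 4.0075e4

scale   = threshold_gev / lhc_energy_gev    # ~7.7 million times the LHC
ring_km = lhc_circumference_km * scale      # ~2.1e8 km around

print(f"Energy gap:       {scale:.1e}x the LHC")
print(f"Naive ring size:  {ring_km:.1e} km circumference")
print(f"Earth, for scale: {earth_circumference_km:.1e} km circumference")
```

Even under this crude scaling, the hypothetical ring comes out thousands of times larger than Earth, which is the point of Hawking's reassurance.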
Present Threats: Artificial Intelligence and Human Aggression
In other statements, Hawking has used his scientific standing to alert humanity to potential threats, whether from technology currently under development, weapons already in existence, or beyond Earth entirely. The top three threats he sees:
- Advanced or "strong" artificial intelligence (AI)
- Human aggression with advanced weapons
- Aliens who may colonize or overrun Earth
Joining a group of concerned scientists and industry leaders, he signed a recent open letter advising caution in the continued development of artificial intelligence. The letter, published online by the Future of Life Institute, advises that, "because of the great potential of AI, it is important to research how to reap its benefits while avoiding its pitfalls."
Current AI researchers, however, note that science is decades away from developing the type of "strong" AI that the concerned scientists discuss in the letter and other public statements. As one expert puts it, "it's good to start the conversation now" about how to control advanced AI that could threaten humanity.
Realizing Connections With Empathy
When asked what human trait he sees as most threatening to the future of humanity, Hawking pointed to aggression. While aggression may once have served a purpose in human development, helping people acquire resources or territory, he says it now puts all human life at risk: nuclear weapons, if used, could wipe out humanity.
Asked what human trait he would most like to encourage, he answered empathy, because it "brings us together in a peaceful, loving state." It does not take a brilliant scientist to see that all of humanity is connected on this earth, that all are children of the same universe, and that empathy is the human trait most needed in the world.