
What We Lose When We Can’t Stargaze

I once met a physics graduate student at a cosmology school (I’ll call him Max) who, until his late 20s, had believed that you could only see the stars with a telescope. Max had grown up in New York City, where the twilight of artificially lit nights dissolved the firmament. When he discovered the “perpetual presence of the sublime,” as poet Ralph Waldo Emerson described it in his 1836 essay “Nature,” patiently waiting on a clear, dark night, he was mesmerized.

What do we lose when our connection with our cosmic environment is broken?

The night sky is humankind’s only truly global commons, shared by all of us across civilizations and millennia. Yet today a majority of us live in cities, where growing light pollution compromises our view of the stars. Even worse, a new kind of threat is rapidly encroaching: thousands of low-Earth-orbit satellites have been launched in the last five years to deliver global internet connectivity, and they appear as fast-moving dots crossing the starry sky. If current trends continue, by 2030 artificial satellites will outnumber real stars, and no corner of the planet will be spared: the starry messengers shoved aside by instant messaging.

To lose the stars would be to sever ourselves from our past and perhaps threaten our future. Over the millennia, the sight of the heavens subtly and silently guided humankind’s steps: It influenced religion and spirituality, inspired great works of art, and enabled navigation on the open seas, which Polynesian masters accomplished thousands of years before Western sailors—and without the help of any chart or instrument. Indeed, astronomy is the midwife of science: It was the study of the motion of the heavenly bodies that ushered in the Scientific Revolution in the 17th century and, consequently, the advanced technology our lives depend on today—from electronic devices hinging on electromagnetism to planes relying on aerodynamics. It also led to sociology and experimental psychology, when the Swiss astronomer Adolphe Hirsch realized in 1864 that clocking the passage of stars overhead to the exacting standards of his watchmaker compatriots required understanding his own reaction time. It even paved the way to artificial intelligence by demonstrating, for the first time, the power of data-based prediction, when Carl Friedrich Gauss computed the orbit of the newly discovered asteroid Ceres from a handful of observations, allowing astronomers to recover it in 1801 after it had been lost in the Sun’s glare—not to mention the names and order of the days of the week (a legacy of astrology!), the star-based rating system we use everywhere online, and the Hollywood Walk of Fame.

There are hints that our neck-craning awe has been with us since the very beginning. For example, prehistoric decorated caves in the Dordogne, France, are preferentially oriented toward the rising and setting Sun at the solstices. The Pleiades, a stunning cluster of blue stars near Taurus, have been universally described as “seven sisters” (or seven women), despite only six being visible to the naked eye for all of recorded history. The myth of how the missing Pleiad was lost, chased by a mighty hunter, is uncannily similar among the ancient Greeks and Australian First Peoples—two cultures that have had no contact since Sapiens reached Australia 50,000 years ago. But 100,000 years ago, the seventh sister would have been easily visible to our ancestors. The identical myths may thus have a common origin, reaching back to before humans left their cradle.


From the moment Homo sapiens walked out of the plains of Africa, paying close attention to the stars and the phases of the Moon helped our ancestors predict the availability of food, stalk prey at full moon, and travel long distances. When the Earth’s climate underwent a period of rapid swings 45,000 years ago, the slightest advantage in locating resources and shelter would have made the difference between survival and extinction: the ultimate price paid by our less star-savvy cousins, the Neanderthals. Cooperation and exchange of knowledge between bands was likely key to our ancestors’ ability to adapt to changing conditions. And it was by the compass of the stars, and the calendar of the Moon’s phases, that they knew where and when to meet.

Certainly we know that the Moon’s cycle has governed calendars—and therefore, the economy—since Akkadian times, more than 4,000 years ago, and tracking the phases of the Moon as a marker of their fertility cycle made women not only the first astronomers, but likely the first mathematicians, too. The rising of Sirius, the brightest star in the sky, and its retinue of stars led the Egyptians to invent the 24-hour timekeeping system still used today. Even in our technological age, distant galaxies are needed to keep atomic clocks in sync with the slowing rotation of the Earth. GPS would be hopelessly inaccurate without corrections due to Einstein’s theory of general relativity, first tested in 1919 by observing the shift in the positions of stars during a total eclipse. Deep down, we are still led by the stars.
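How inaccurate is “hopelessly inaccurate”? A back-of-the-envelope sketch, using standard textbook figures rather than numbers from this essay, makes the point: clocks at GPS altitude run fast by roughly 45 microseconds a day because of the weaker gravity, and slow by roughly 7 microseconds a day because of their orbital speed. In LaTeX form, the arithmetic of the uncorrected error looks like this:

\[
\Delta t \;\approx\; \underbrace{+45\,\mu\text{s/day}}_{\text{gravitational blueshift}} \;-\; \underbrace{7\,\mu\text{s/day}}_{\text{orbital time dilation}} \;=\; +38\,\mu\text{s/day}
\]
\[
\text{position error} \;\approx\; c\,\Delta t \;=\; \left(3\times10^{8}\,\text{m/s}\right)\left(38\times10^{-6}\,\text{s}\right) \;\approx\; 11\,\text{km per day}
\]

An error that grows by kilometers every single day would render satellite navigation useless within hours of launch; these are approximate values, but the order of magnitude is why the relativistic correction is built into every GPS receiver.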

Just as the stars helped Sapiens overcome the climate challenges that doomed the Neanderthals so long ago, today they can once again show us the path forward as we face the combined mortal dangers of anthropogenic climate change and the loss of biodiversity. The “overview effect” describes the sense of awe and humility that grips astronauts when they behold our shining blue marble floating in the blackness of space. By looking up at night and contemplating the remote, unreachable suns scattered in the infinite, inhospitable darkness, we can all experience a “reverse overview effect”: the realization that our shared cosmic home is irreplaceable, and the urge to become better stewards of its, and our, destiny.
