A lot of us rely on weather reports to plan our days. Should you carry an umbrella today? While packing for that trip this weekend, do you need to include a coat?
These days, we can just turn on the television or check a weather app on our smartphones. In fact, most people routinely check these reports before they ever look out the window or walk out the front door.
But have you ever wondered what people did before modern technology? After all, weather forecasting has existed for much longer than our smartphones, or even TVs. Have humans always been so reliant on weather forecasts?
Weather Forecasting Goes Back to Ancient Times
Plenty of ancient civilizations attempted to predict the weather.
We have evidence that as far back as 650 BCE, the ancient Babylonians were trying to predict short-term weather changes using cloud patterns and astrology. Of course, we now know that cloud patterns can be a logical basis for predicting the weather, but relying on astrology probably wasn't the best way to guess whether it might rain.
Similarly, the Chinese observed the relationship between clouds and changes in the weather. Forecasting has a long history in China, where meteorological records have been kept almost continuously for the past 3,000 years.
Around 350 BCE, Aristotle wrote a description of weather patterns in Meteorologica. He included theories about the formation of rain, clouds, hail, wind, thunder, lightning, and even hurricanes. The work spans four books, the first three of which cover what we now consider meteorology.
Around the same time, in about 300 BCE, Chinese astronomers had developed a calendar that included 24 festivals associated with different types of weather.
As you can tell, most ancient methods of weather prediction were based on patterns recognized over years of observation. While some of those observations have proven accurate, many have failed statistical testing in modern times.
Although Aristotle’s texts were considered by many to be the authority on weather theory for almost 2,000 years, people started to understand that speculations from philosophers weren’t really enough.
By the end of the Renaissance, people understood that accurate weather prediction required knowledge of the atmosphere itself. We finally start to see instruments emerge: thermometers for measuring temperature, barometers for measuring atmospheric pressure, and hygrometers for measuring humidity.
Modern Methods Developed in the Nineteenth Century
In 1835, the electric telegraph was invented. Telegraph networks allowed the routine transmission of weather observations, so people were finally able to draw weather maps and to identify and study surface wind patterns and storm systems.
This was also the first time communication was able to travel fast enough across geographic regions to make accurate weather forecasting possible.
If a storm over the central U.S., for example, was observed to be traveling east, a weather forecaster could quickly telegraph the information to towns farther east to let them know a storm was coming.
In 1922, English scientist Lewis Fry Richardson published Weather Prediction by Numerical Process.
His work described how numerical forecasts could be produced through rigorous calculation. For practical use, however, his method would have required an auditorium full of thousands of people performing calculations on current weather conditions by hand.
It wasn't until the 1950s, when sufficiently powerful computers came along, that Richardson's calculations could actually be processed.
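To get a feel for what "numerical prediction" means, here is a toy sketch, not Richardson's actual method. It assumes a made-up one-dimensional row of weather stations and uses a simple upwind finite-difference update, the general flavor of calculation a room full of human computers would have repeated over and over:

```python
def step(temps, wind_speed, dx, dt):
    """Advance a 1-D temperature field one time step.

    Each station's new value is its current value, adjusted by the
    temperature blowing in from the upwind (western) neighbor.
    """
    new = temps[:]
    for i in range(1, len(temps)):
        # Upwind difference: air moves west to east (index i-1 to i).
        new[i] = temps[i] - wind_speed * dt / dx * (temps[i] - temps[i - 1])
    return new

# Hypothetical current observations at stations 100 km apart (degrees C),
# with a 10 m/s westerly wind, advanced forward by one hour.
field = [10.0, 12.0, 15.0, 14.0, 11.0]
forecast = step(field, wind_speed=10.0, dx=100_000.0, dt=3600.0)
```

Repeating such a step thousands of times over a full grid of the atmosphere, by hand, is roughly the workload Richardson envisioned; it only became feasible once computers could do it.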
Today, computer modeling is the primary basis for weather forecasting. Humans are still needed, however, to judge which models to trust, drawing on pattern recognition skills, teleconnections, and knowledge of how each model performs.