Wind chill sounds pretty self-explanatory: if it’s windy, it feels cooler. But the concept isn’t as simple as you might think.
First, let’s start with the history. The idea of wind chill, and the name itself, was developed in Antarctica. While stationed on the coldest place on Earth, explorer Paul Siple and polar scientist Charles Passel conducted an experiment, observing how quickly 250 grams of water froze under different temperature and wind conditions, with an anemometer mounted at the same height to measure wind speed. They found that the faster the wind blew, the quicker the water turned to ice. When winds increase, heat is carried away from an object or body at a faster rate, cooling it toward the air temperature more quickly.
After working out a formula, Siple and Passel originally expressed wind chill as a rate of heat loss in watts per square meter. The formula was altered along the way, and Canadian meteorologists, who began using the wind chill value in the 1970s, converted it into a more usable “feels like” number. It caught on. Over time, however, scientists realized it was somewhat inaccurate.
So, the Joint Action Group on Temperature Indices was formed in 2001. Yes, it’s a real thing. Essentially, it was a group of Canadian and American scientists who hoped to create a more accurate measure of wind chill. In doing so, some volunteers were asked to “take one for the team.” Twelve volunteers, six men and six women aged 18 to 42, walked on a treadmill in a climate chamber for 90 minutes at a time, in temperatures as low as 14 degrees Fahrenheit and winds as high as 18 mph. Electrodes attached to their faces and inside their mouths recorded heat loss. That data was then fed into a computer, and a new wind chill formula was born. If you were curious, here’s what it looks like:
WCT (°F) = 35.74 + 0.6215T − 35.75(V^0.16) + 0.4275T(V^0.16)

where T = air temperature (°F) and V = wind speed (mph); valid only for wind speeds above 4 mph.
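To make the arithmetic concrete, here is a minimal Python sketch of that formula. The function name wind_chill and the sample inputs are illustrative, not from an official library; it simply assumes temperature in °F and wind speed in mph, as in the formula above.

```python
def wind_chill(temp_f: float, wind_mph: float) -> float:
    """Wind chill (°F) from air temperature (°F) and wind speed (mph),
    using the formula printed above. Only meaningful for cold air and
    wind speeds above the formula's lower limit (about 4 mph, per the note)."""
    if wind_mph <= 4:
        raise ValueError("formula is not valid for light winds")
    v = wind_mph ** 0.16  # wind speed raised to the 0.16 power
    return 35.74 + 0.6215 * temp_f - 35.75 * v + 0.4275 * temp_f * v

# Example: the coldest treadmill conditions mentioned above,
# 14 °F with an 18 mph wind, comes out to roughly -3 °F.
print(round(wind_chill(14, 18)))
```

Running the example shows how a modest wind turns an already cold 14 °F into a “feels like” value several degrees below zero.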
Here’s the NWS Wind Chill Chart, along with frostbite times.