Welcome 2014! And quite a start to the year, with a cold snap that rivals anything we’ve seen in two decades. I don’t remember cold like this since the horrid winter of 1994, when the Northeastern U.S. saw snowstorms and extreme cold that alternated back and forth for weeks. Of course, when I was a child in the 1970s, such chills happened a lot more often; I remember a number of New England mornings where I awoke to a thermometer reading of -20ºF (-29ºC), or 244 Kelvin.
The scariest negative temperature numbers that one hears about from the media are associated with the “wind chill”, which is a number that is supposed to measure how cold the air “feels” to your skin. But “wind chill” is a rather subjective and controversial measure — there’s no unique way to define it, since you’ll feel differently depending on how much exposed skin you have, on your body weight, on your age and conditioning, etc. By contrast, the temperature measured by a thermometer is defined independently of how humans feel, and experts agree on what it is and what it means. Oh sure, people use different scales to measure it: Fahrenheit (F), Centigrade or Celsius (C), and Kelvin (K). But the differences are no more than the distinction between meters and feet, or between kilograms and pounds; it’s straightforward, if a bit annoying, to convert from one to the other.
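In case you want to check these conversions yourself, here’s a little sketch in Python (the function names are mine, just for illustration) using the standard formulas:

```python
def f_to_c(f):
    """Convert a Fahrenheit reading to Celsius."""
    return (f - 32) * 5 / 9

def c_to_k(c):
    """Convert a Celsius reading to Kelvin, the absolute scale."""
    return c + 273.15

# The -20 F New England mornings mentioned above:
print(round(f_to_c(-20)))            # -29 (Celsius)
print(round(c_to_k(f_to_c(-20))))    # 244 (Kelvin)
```

Notice that -40 is the one temperature where Fahrenheit and Celsius agree: `f_to_c(-40)` gives exactly -40.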
So everyone agrees the temperature is, and feels, extremely cold. But is it, from the point of view of nature, really that much colder than usual? To put it another way: it was 84ºF (29ºC) in southern Florida yesterday. How much warmer is that than the -40ºF (-40ºC) registered on a cold Minnesota morning?
Well, you might first think: wow, a difference of 124ºF (69ºC) — that sounds enormous. But is it really so huge?
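The arithmetic behind those numbers is easy to verify. One small subtlety, sketched below in Python (variable names are mine): a temperature *difference* converts between Fahrenheit and Celsius by the factor 5/9 alone, with no offset of 32, since the offset cancels in the subtraction.

```python
florida_f = 84     # southern Florida, in Fahrenheit
minnesota_f = -40  # cold Minnesota morning, in Fahrenheit

diff_f = florida_f - minnesota_f   # difference in Fahrenheit degrees
diff_c = diff_f * 5 / 9            # same difference in Celsius degrees (no "-32" needed)

print(diff_f)           # 124
print(round(diff_c))    # 69
```

A difference in Celsius degrees is also exactly the same number in Kelvin, since the two scales differ only by a constant shift.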