The scourge of “terrorism” — for today’s purposes, let’s take the word to mean attacks on civilians perpetrated by individuals or by small, stateless groups — is a part of human existence going back as far as you want to look. If a person has what he or she views as a grievance, then attacking people who are loosely connected to that grievance, in order to kill and maim some of them and frighten the rest, is obviously one of the options, immoral and hideous as it may be. There’s nothing modern about the strategy of terror.
What’s new about terrorism in the modern world is science. Science, via the technology that it makes possible, is a great multiplier. It allows an individual, or a small group, to exploit power inherent in nature, turning a task that no human could perform, or that would take a cast of thousands, into something that can be done with ease by a few people, or even just one. Of course this multiplied power has many benefits for us as individuals and for society as a whole; think of trains, tunnel-boring machines, skyscraper cranes, snow-blowers, pneumatic drills, aircraft engines, power plants, and on and on. But it also poses many risks and challenges that we have to face, as individuals and as a global civilization.
The greater New York region, having been broken into disconnected and damaged pieces by Hurricane/Nor’Easter Sandy, is still reassembling itself. Every day sees improvements to electrical grids and mass transit and delivery of goods, though there have been many hard lessons about insufficient preparations. Here’s an impressive challenge: over a million people and thousands of businesses lack electrical power; therefore many of them are running generators, to stay warm, keep food cold, and so forth; but the generators require fuel, typically diesel or gasoline; and so there is a greater need for fuel than usual; but a significant fraction of the petrol stations can’t pump fuel for their customers… because they lack electrical power and don’t have their own generators. These and other nasty surprises of post-storm recovery should be widely noted by policy makers and the public everywhere, especially in places that, like New York when I was a child, rarely experience disasters.
Unfortunately, another storm (a simple nor’easter) is now forecast for mid-week. While much weaker than the last, it is still potentially dangerous for a region whose defenses are still being repaired. As was the case with Sandy, the new storm was signaled a week in advance by the ECMWF (European Centre for Medium-Range Weather Forecasts), the leading European weather-forecasting computer program or “model”. Confidence in the prediction has been growing, but predictions that far in advance do change. One must also keep in mind that a shift of one or two hundred miles in the storm’s track could greatly change its impact, so the consequences of this storm, even if it occurs, remain very uncertain. But again we are reminded, as we were last week, that weather forecasting has improved dramatically over the past thirty years; the possibility of a significant storm can now often be flagged a week in advance.
What is this European ECMWF model? What is its competitor, the US-based GFS (Global Forecast System) model? And what about the other models that also get used? All of these are computer programs for forecasting the weather; all of them start from the same basic weather data, and all have the same basic physics of weather built into them. So what makes them different from, and more or less reliable than, one another? I asked one of my commenters, Dan D., about this after my last post. Here’s what he said, along with my best (and hopefully accurate) attempts at translation for less experienced readers:
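To make the idea of a weather “model” a bit more concrete, here is a deliberately tiny sketch — it assumes nothing about the actual ECMWF or GFS code, which is vastly more sophisticated. It illustrates only the common core of all such programs: start from observed initial conditions, and repeatedly apply a physics equation to step the state forward in time. Real models solve the full three-dimensional fluid dynamics of the atmosphere; this toy solves one-dimensional advection, du/dt = -c du/dx, with a simple upwind finite-difference scheme.

```python
# Toy illustration of numerical forecasting (NOT the ECMWF/GFS code):
# advance an initial "weather" field forward in time using a physics equation.

def step(u, c, dx, dt):
    """One forward-in-time, upwind-in-space step of du/dt = -c du/dx
    on a periodic domain (assumes c > 0 and c*dt/dx <= 1 for stability)."""
    n = len(u)
    return [u[i] - c * dt / dx * (u[i] - u[(i - 1) % n]) for i in range(n)]

def forecast(u0, c=1.0, dx=1.0, dt=0.5, nsteps=10):
    """Start from observed data u0 and integrate the physics forward."""
    u = list(u0)
    for _ in range(nsteps):
        u = step(u, c, dx, dt)
    return u

initial = [0.0] * 10
initial[2] = 1.0          # a localized "disturbance" in the initial conditions
print(forecast(initial))  # after 10 steps the disturbance has drifted downwind
```

In a sketch like this, the places where real models differ from one another are exactly the knobs shown: how finely space and time are divided (dx, dt), how the physics is discretized (the `step` function), and how the initial conditions are assembled from observations (`u0`).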
I know Anand Gnanadesikan, professor in Johns Hopkins University’s department of Earth and Planetary Sciences, from our days studying physics together as undergraduates. He wrote something today that speaks with more authority than I could in my post earlier this morning, and it is a pleasure to make it available to you.
As Sandy approaches the coast I am very thankful for my former colleagues at the Geophysical Fluid Dynamics Lab, who have spent decades working to make better predictions of tropical cyclones, and for the colleagues around the world who have spent decades developing the techniques for observing and modeling the physics of cyclones. Without that work, both the recurving of the storm and the high storm surge (currently already at major flood levels at a number of points between NY and DE) would have taken tens of thousands of people, at a minimum, by surprise. Kurihara’s first paper in the line of research that led to today’s prediction (http://www.gfdl.noaa.gov/bibliography/results.php?author=1061) was in 1965. It took him almost a decade to reach the first three-dimensional model of a hurricane, and another couple of decades to improve the models to the point where they showed useful skill.
Just a little plug — it is important to remember that this kind of event, low-probability and high-impact, is what well-run government is for. Putting together a storm-surge warning system for NY Harbor is not something a private company is likely to do — the chances of its being used in any given 10-year period are so small as to make it a worthless investment. And the research that goes into making a forecast like this involves understanding small-scale turbulence, understanding the transfer of radiation through the atmosphere and its interaction with multiple scatterers and absorbers, figuring out how to put weather satellites into orbit and keep them running, figuring out how to incorporate this information into computer codes, and getting these codes to run reproducibly on large numbers of processors… almost all of this was accomplished by people working on the government dime.
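The economic point can be made concrete with a back-of-the-envelope expected-value calculation. Every number below is invented purely for illustration — none appear in the post — but the structure of the argument is general: when the chance of a rare-event system being needed is small, the expected private return falls far short of the cost of building and running it.

```python
# Hypothetical break-even sketch for the private-investment argument above.
# ALL numbers are invented for illustration; none come from the post.
p_use_per_decade = 0.05    # assumed chance the system earns revenue in 10 years
revenue_if_used = 10e6     # assumed one-time payoff if the warning is ever sold
build_and_run_cost = 50e6  # assumed cost to build and operate for a decade

expected_revenue = p_use_per_decade * revenue_if_used
print(f"expected return: ${expected_revenue:,.0f} vs. cost: ${build_and_run_cost:,.0f}")
```

With numbers anywhere in this ballpark, the expected return is a small fraction of the cost — which is the sense in which a private firm would call the investment “worthless,” even though the social value of the warnings, when the rare event arrives, is enormous.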
The big storm of 2012 (at least, we hope it’s the biggest we’ll see this year) is approaching the New York City area, and though no one can predict in detail how bad it will be and for whom, there’s no question that with so much energy to play with, post-tropical quasi-hurricane quasi-nor’easter Sandy (also called “Frankenstorm” in honor of the Halloween holiday) is going to hit some of us very hard in the northeastern United States. Not that it will be a disaster everywhere in the region. With hurricane Irene last year, some areas just had a bit of wind and rain, while others had tremendous flooding that wiped out towns and roads and houses and history… and a few dozen lives, too. It will likely be the same this time.
How unusual is this storm? Several weather forecasters have been quoted as saying that their supercomputer-based forecasting tools, which predicted Sandy would strengthen and become a monster in size, were doing things they’d never previously seen them do. Right now, all you have to do is look at the weather map — at the fact that tropical-storm-force winds extend over several hundred miles, and at the fact that the pressure of the atmosphere at the core of this storm is around 946 millibars and falling — to know there’s a lot of energy in this system that has to go somewhere, and is going to be taken out on somebody. Although this is a Category 1 hurricane in terms of its fastest winds, 946 millibars is what one expects for a strong Category 3 hurricane; 1000 millibars is average atmospheric pressure, and around 870 millibars is the lowest ever recorded. By comparison, the great blizzard of 1993 had a central pressure of about 960 millibars. The Perfect Storm of 1991 (also a nor’easter-hurricane hybrid, like Sandy) had a central pressure of 972 millibars. Anyone who thinks Sandy isn’t a dangerous storm hasn’t read enough history.
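The comparison above can be summarized by each storm’s central-pressure deficit — how far its core pressure fell below the roughly 1000-millibar sea-level average. The deficit is only a crude proxy for the energy available to a storm, but it makes the ranking plain. The figures here are just the ones quoted above:

```python
# Crude comparison of the storms quoted above by central-pressure deficit:
# how far each storm's core pressure fell below the ~1000 millibar average.
AVERAGE_MB = 1000
central_pressure_mb = {
    "Sandy (2012)": 946,
    "Blizzard of 1993": 960,
    "Perfect Storm (1991)": 972,
}
deficits = {name: AVERAGE_MB - p for name, p in central_pressure_mb.items()}
for name, d in sorted(deficits.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {d} mb below average")
```

By this crude measure Sandy’s deficit (54 mb) is about a third larger than the great blizzard of 1993’s and nearly double the Perfect Storm’s.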