Access options:
- gift link - registration required
- archive.today
- ghostarchive.org - many images are broken
This isn’t surprising or even something we can address. You can’t really plan for the unprecedented.
Sure you can. It’s a matter of using modeling to estimate its probability and then planning around it. Californians have done exactly that: the USGS ARkStorm planning exercise modeled a storm somewhat worse than the 1861-1862 storm sequence for exactly this kind of reason.
But that’s the thing: we keep finding out our models are wrong or inadequate. You can model anything you want, but the model is only going to be as good as your base data and your hypothesis about how to project it forward. We keep seeing “once a century” storms happen years apart, multiple “once every 500 years” storms happen within the same decade, and so on (see the quick arithmetic sketch below).
That’s what those flood maps involve - models based on previous patterns, but they fail when the patterns no longer apply.
I’m not saying there’s no point to modeling, just that we shouldn’t be surprised when the models underperform or turn out to be wrong. As the saying goes, “we don’t know what we don’t know”.
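To put rough numbers on that: the return-period labels themselves assume a stationary climate, i.e. a constant annual exceedance probability and independent years. A minimal sketch of what the labels actually imply (no real flood data, just the binomial arithmetic):

```python
from math import comb

# Return-period labels encode an annual exceedance probability under a
# stationarity assumption: a "100-year" event has a 1% chance each year.
def prob_at_least_once(return_period: float, horizon: int) -> float:
    """P(at least one event in `horizon` years), assuming independent
    years and a constant annual probability of 1/return_period."""
    return 1.0 - (1.0 - 1.0 / return_period) ** horizon

# A "100-year" flood over a 30-year mortgage: about a 26% chance.
print(f"{prob_at_least_once(100, 30):.0%}")   # -> 26%

# Two or more "500-year" floods in a single decade, if the label is right:
p = 1 / 500
p_ge2 = 1 - (1 - p) ** 10 - comb(10, 1) * p * (1 - p) ** 9
print(f"{p_ge2:.3%}")   # -> ~0.018%
```

So a single “100-year” flood in a lifetime is entirely expected, while repeated “500-year” floods in a decade are what tell you the label, not bad luck, is the likelier problem.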
The problem here isn’t that models are wrong or inadequate, but that FEMA, for political reasons, has based its maps and risk estimates on historical averages, and those don’t adequately capture the change we’ve had, or relatively low-probability events.
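Concretely, the gap looks something like this. A minimal sketch with hypothetical numbers (not FEMA’s actual methodology): fit a Gumbel distribution to a simulated annual-maximum flow record and read off the level with a 1% annual exceedance probability (the “100-year” flood), first assuming stationarity, then after detrending so the estimate reflects today’s conditions:

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)

# Hypothetical 50-year annual-maximum flow record (made-up units) with a
# gentle upward trend standing in for a changing climate.
years = np.arange(50)
annual_max = rng.gumbel(loc=100.0 + 0.5 * years, scale=20.0)

# Stationary estimate (the "historical average" approach): one fit to the
# whole record, then the level exceeded with 1%/yr probability.
loc, scale = gumbel_r.fit(annual_max)
q100_stationary = gumbel_r.ppf(0.99, loc, scale)

# Crude nonstationary estimate: detrend, fit the residuals, then shift the
# fitted distribution up to today's point on the trend line.
coeffs = np.polyfit(years, annual_max, 1)
resid = annual_max - np.polyval(coeffs, years)
loc_r, scale_r = gumbel_r.fit(resid)
q100_today = np.polyval(coeffs, years[-1]) + gumbel_r.ppf(0.99, loc_r, scale_r)

print(f"stationary 100-yr level:     {q100_stationary:.0f}")
print(f"trend-adjusted 100-yr level: {q100_today:.0f}")  # higher
```

The stationary number is effectively an estimate for the middle of the record, so the longer a trend runs, the further behind the map falls.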
Ahh, you’re talking about the risk assessments and the associated floodplain maps. So yes, many of the maps are decades out of date, despite the legal requirement to keep them updated.
Those maps, by nature, are based on historical data but then use projections to assign future risk. My point was that while we may have consensus on rising global temperatures, what that means for specific areas is almost impossible to forecast, in a way that wasn’t true before. For example, we know that major ocean currents may be in danger because of global warming, but we don’t know when or whether that will happen, or what exact effect it will have on a specific parcel of land, and that effect may be quite acute.
Between extreme weather events, changes in land use, and so on, the unprecedented nature of the changes we’re facing and the complexities involved mean that no matter how accurate the previous data and no matter how bleeding-edge the climate modeling we use, any new maps are going to be far less reliable than maps have been previously.
And because those maps are expressly tied to the National Flood Insurance Program, a wrong map potentially means billions in losses. Of course the maps are wrong now, but that’s unfortunately how bureaucracies think: better to be wrong through inaction than to stick your neck out and be wrong.
The real problem going forward is that the fundamental idea of being able to map weather risk is a fiction. It would be far better to assume most areas are going to see extreme flooding, judge how resilient each area is to that flooding, and make policy decisions based on that. Even with perfect maps and models, people over the next 50 years need to understand that there will be no climate havens, and that it’s not a question of if you’ll experience an extreme weather event but when.
Flood maps are used by insurance companies to set their rates. If the risk estimates are artificially low to keep insurance rates down in Florida, that smells like corruption.
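The mechanism is simple enough to show: a risk-based premium is roughly the annual exceedance probability times the expected loss, plus loading, so an understated probability flows straight through to an understated rate. A toy sketch with hypothetical numbers (not the NFIP’s actual rating formula):

```python
def risk_based_premium(annual_flood_prob: float,
                       expected_loss_if_flooded: float,
                       load_factor: float = 1.4) -> float:
    """Crude actuarial premium: expected annual loss times a loading
    factor for expenses and uncertainty. Hypothetical, not NFIP's formula."""
    return annual_flood_prob * expected_loss_if_flooded * load_factor

# A home with $250k of exposure, mapped as a 1%-per-year (100-yr) zone:
print(risk_based_premium(0.01, 250_000))   # -> $3,500/yr

# The same home if the true annual probability is 4% (a "25-yr" zone):
print(risk_based_premium(0.04, 250_000))   # -> $14,000/yr
# A map that understates the probability 4x understates the premium 4x;
# the gap is an implicit subsidy that someone eventually has to eat.
```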
The article is about North Carolina; that’s where the devastating flooding outside of mapped flood zones was. Obviously there are wider implications nationwide, but it’s not a Florida-specific problem.