Covering The San Diego Fires – Sources And Methods

October 30, 2007

First off, I would like to thank everyone for the mountains of praise and compliments on our coverage of the fires in San Diego last week. It was a large amount of work, but I think it really helped everyone, including myself, understand what was going on. Several folks have expressed interest in how and why my efforts to map and visualize the fire came about. A discussion of “Sources and Methods” after the jump.


One of the founding concepts behind my new software venture is that the current trend in IT is reaching a point of diminishing returns. One of the wonders of the past 20 years has been the power of the US economy, largely thanks to insane levels of worker productivity in this country. Much of that is directly attributable to the adoption of information technology by American companies to improve the efficiency of almost any task they do more than twice. The concept behind Boomerang is that the next big jump will come from making people smarter, rather than making it easier to order paper for the printer.

This means being able to discover and extract information from many different sources automatically, and to have back-end machines and programs that can shape this information into a form where it can be combined and re-combined based on topics of interest. For those readers with a background in the Intelligence Community, it’s the classic All-Source problem: how do you let everyone know what is going on without swamping them with a million details so that the message is lost?
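To make that concrete, here is a minimal sketch of the all-source idea in Python. Everything in it is illustrative only: the feed URLs and topic keywords are hypothetical placeholders, not anything from Boomerang’s actual code.

```python
# Minimal all-source sketch: pull items from several independent feeds
# and bucket them under the topics they mention. URLs and topics below
# are hypothetical placeholders.
import feedparser  # widely used RSS/Atom parsing library

SOURCES = [
    "http://example.com/fire-updates.rss",   # placeholder feed
    "http://example.org/county-alerts.rss",  # placeholder feed
]
TOPICS = ["evacuation", "road closure", "containment"]

def collect_by_topic(sources, topics):
    """Group headlines from many feeds under the topics they mention."""
    buckets = {topic: [] for topic in topics}
    for url in sources:
        for entry in feedparser.parse(url).entries:
            text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
            for topic in topics:
                if topic in text:
                    buckets[topic].append(entry.get("title", ""))
    return buckets

if __name__ == "__main__":
    for topic, headlines in collect_by_topic(SOURCES, TOPICS).items():
        print(topic, "->", headlines)
```

The real problem is of course much harder (deduplication, source trust, ranking), but the shape is the same: many noisy sources in, a handful of topical views out.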

We have built several really interesting systems using this approach, known more widely as “Enterprise Mashups”, and many of the ideas about how it works and why it matters stem directly from earlier work on DoD projects such as DCGS (opens a PDF).

On Sunday, October 21st, our first clue that something was going on came on the wind blowing strongly from the east: the slight but unmistakable smell of a brush fire. The first question on everyone’s mind was – where is it and where is it going? Flipping between the local stations yielded a lot of football and no information.

We needed to decide quickly whether the horses in the eastern hills near Valley Center would be threatened by this fire. The local newspaper’s web site had little or no information, and the California Department of Forestry’s offered scant to none. Waiting to be told there was a problem was asking for trouble. What was needed was “Situational Awareness” of the threat now and the projected threat a few hours out, and we were low on data to feed that decision.

We had heard the fire was in Ramona, and we knew how the wind was blowing. We also knew some information about the terrain. Using a combination of Google Earth, physical maps and our knowledge of the burn pattern in 2003, we started laying out what we knew, what we suspected and what we thought the fire would do next. We made the decision that there was a very real threat to the stables and mobilized people to evacuate the horses, several hours ahead of the official word from the County to move out.

Throughout the first 2 days of this fire, information resources could not handle the stress of people trying to access them over the web. The entire information structure started to buckle. It was at this point I decided that, based on what we knew from listening to AM radio, scanners and what news we could get, it should be possible to display where the fire was and where it was headed. I did what any intel guy would do: I started making maps. On a whim I published one to the web site, and people were drawn to it – so much so that we got our web site temporarily knocked offline.

The importance of getting accurate, condensed information into everyone’s hands as quickly and directly as possible cannot be stressed enough. The web is an outstanding resource for doing that, but the tools to do it quickly and easily are lacking.

Some of the best sources of data that were used:

Pacific Southwest Research Station – These are the guys who were supplying the infra-red data that I was overlaying on Google Earth with such great results. If I were President or Governor I would buy them a pair of Predator UAVs to boost their capability. They proved they could deliver results in a time of need, and they are the best bet (in my book) to move forward.

San Diego Union Tribune – They were a bit slow on the uptake, but quickly got in gear putting out information. There were some false starts with their Fireblog, which they eventually moved off their network and onto a high-capacity server.

Cal Fire – This web site folded like a lawn chair under the traffic for the first 2 days, and did not have much useful detailed information until the 2nd or 3rd day. I suggest California think about whether this site is fulfilling its intended mission. Its saving grace is that it gave me the rough parameters of the Witch Creek Fire late Sunday night, after I spent over an hour trying to load it.

San Diego County Emergency – Another web site that quickly imploded under the network traffic, though they did start publishing detailed PDF maps towards the end of the second day. I am hoping they are still smarting from their failure during the first 36 hours, and that someone in power will build that site out to what it should be in terms of content generation and capacity.

Google Maps – Three cheers for Google Maps and the community around it. Early on, all kinds of people got just as frustrated as I was with the lack of information about what was going on. We started making maps on Google and sharing them. The software allows you to combine multiple maps in one view, which made it easy for people like me to combine knowledge from many people at once and create a common view (a rough sketch of that kind of merging follows this list). The only downside is that you sort of had to know what you were doing to get really good results out of it.

KPBS – Knocked off the air early when the fire took out their broadcast antenna, they made some of the more detailed Google Maps of this event. Well done.
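For the curious, here is roughly what that merging amounts to under the hood. Every shared map boils down to a KML file, and folding several of them into one common view is mechanical. This is a sketch of my own for illustration, assuming hypothetical file names – not a tool I actually ran during the fires:

```python
# Fold every Placemark from a set of KML files into one merged document.
# File names are hypothetical placeholders.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"
ET.register_namespace("", KML_NS)  # serialize KML without a namespace prefix

def merge_kml(paths, out_path):
    """Collect all Placemarks from the input KML files into one document."""
    root = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(root, f"{{{KML_NS}}}Document")
    for path in paths:
        tree = ET.parse(path)
        for placemark in tree.iter(f"{{{KML_NS}}}Placemark"):
            doc.append(placemark)
    ET.ElementTree(root).write(out_path, xml_declaration=True, encoding="UTF-8")

merge_kml(["witch_creek.kml", "harris.kml", "poomacha.kml"], "combined.kml")
```

Google’s map viewer did the equivalent of this for you in the browser; the point is that an open file format is what made pooling everyone’s knowledge possible.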

Tools that handled the load:

Google Earth – Decent visualization tool for 3D terrain and geo-spatial data. The KML file format was very helpful for swapping data with the other people watching the fire (a short example follows this list).

MarsEdit – This Mac blog-posting tool was my front end to the web site. It proved a huge help in quickly creating posts with a large amount of graphical data.

WordPress – The blog software that runs this site. A few strange quirks under load, but it did OK.

OmniGraffle – Mac-based drawing tool that I used to annotate the maps and visualizations. Good stuff all around.
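Since a few people asked what those KML files actually look like: the infra-red imagery rides in a GroundOverlay, which simply drapes an image over a bounding box on the terrain. The sketch below writes one out; the image name and coordinates are made-up placeholders, not the actual Pacific Southwest Research Station data:

```python
# Write a KML GroundOverlay that drapes an image over the terrain in
# Google Earth. Image name and bounding box are hypothetical placeholders.
overlay_kml = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>Witch Creek IR pass (example)</name>
    <Icon><href>ir_pass_example.png</href></Icon>
    <LatLonBox>
      <north>33.10</north>
      <south>32.95</south>
      <east>-116.90</east>
      <west>-117.10</west>
    </LatLonBox>
  </GroundOverlay>
</kml>
"""

with open("ir_overlay.kml", "w") as f:
    f.write(overlay_kml)
```

Open the resulting file in Google Earth and the image is positioned over the terrain; swapping in each new IR pass as it arrives is just a matter of updating the image reference.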

Thanks again for reading this blog and supporting us.

Bruce

Some of the maps and visualizations we produced…

October 22nd – Overview 0900, 1230 Map, 1315 Map, 1545 Map, Burn Map 1730, 1953 Map

October 23rd – Poomacha 0730, Hidden Meadows Evacuation, South Escondido 0830, RSF 0900, Horse Evacuation 1330, Lake Hodges Funnel, Encinitas Witch, Poomacha Map, Poomacha (Google Earth), Harris Fire (Google Earth)



About the Author

Bruce Henderson is a former Marine who focuses custom data mining and visualization technologies on the economy and other disasters.
