Building America’s Anti-Terror Machine

Source: FORTUNE Magazine | July 22nd, 2002 | by Stuart F. Brown

It’s a very bad day in Galveston, Texas, home to one of the world’s densest concentrations of petrochemical plants. An airborne plume of hydrofluoric acid–stuff so nasty it can dissolve glass–is spreading from a railroad tank car blown up in a terrorist attack. Public-safety officials are scrambling to understand the scope of the disaster and how to protect the population. Fortunately they’ve got a geographic information system, or GIS, to get a handle on the crisis and respond to it–fast.

First, they call up a county map on a computer screen and pinpoint the source of the plume. Then they grab current weather data from the Internet and superimpose them on the map. From a hazardous-materials database they fetch information on the properties of the gas; plugging that into a dispersion model, they get a prediction of where the plume will drift. They consult highway department logs, gather the phone numbers of the residents downwind, and create an automated “reverse 911” telephone message telling people to get the hell out, and by what route.
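
The dispersion step is the mathematical heart of that workflow. The article doesn’t say which model ESRI’s scenario uses; a standard textbook choice is the Gaussian plume equation, sketched below in Python with illustrative, not site-calibrated, dispersion coefficients.

```python
import math

def gaussian_plume(q, u, x, y, z=0.0, h=0.0):
    """Concentration (g/m^3) of a continuous point release.

    q : emission rate (g/s)   -- from the hazmat database in the scenario
    u : wind speed (m/s)      -- from the live weather feed
    x : downwind distance (m), y : crosswind offset (m)
    z : receptor height (m),  h : release height (m)

    Uses Briggs-style rural fits for neutral stability (class D);
    the coefficients are illustrative only.
    """
    if x <= 0:
        return 0.0
    sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)
    sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                math.exp(-(z + h)**2 / (2 * sigma_z**2)))  # ground reflection
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

def cells_to_evacuate(q, u, threshold, step=250, extent=10_000):
    """Flag every downwind grid cell whose predicted level exceeds a
    (hypothetical) exposure threshold -- the set that would feed the
    reverse-911 call list."""
    hits = []
    for x in range(step, extent + step, step):
        for y in range(-extent // 2, extent // 2 + step, step):
            if gaussian_plume(q, u, x, y) > threshold:
                hits.append((x, y))
    return hits
```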

Although this is only a what-if scenario created by ESRI, a leading supplier of GIS software in Redlands, Calif., it vividly demonstrates how the power of infotech can cut through the fog of confusion accompanying disastrous events. Experts in emergency planning and disaster response say their jobs can be divided into three main tasks: (1) anticipating bad events, (2) developing strategies to prevent them, and (3) planning how to save lives and preserve property in the event something awful occurs. Police and fire departments are supposed to function along those lines and, writ large, this three-pronged strategy will provide the underpinning for whatever anti-terrorism system the U.S. evolves.

Congress and the White House can wrangle all they like about the bureaucratic structure of the proposed Department of Homeland Security, but one conclusion is foregone: To get smart about combating terrorism, the government will need a central nervous system implant. That will mean doing two big jobs quickly. First, the U.S. must assemble what may be the largest computer network ever built, a decentralized system for tapping troves of information stored in all sorts of government and private-sector databases. And second, to turn all that information into knowledge, it must learn to sift and collaborate on a scale no government has been asked to before.

Tom Ridge’s Office of Homeland Security has already begun work on what it’s calling the Enterprise Architectures for Homeland Security. To build this giant knowledge machine, the Homeland Security people are using as a model corporations that make savvy use of infotech. “The enterprise approach is newer to the federal government, but very well understood in the private sector,” says Homeland Security’s recently appointed chief information officer, Steven Cooper, who comes from a background in information management at companies including Eli Lilly and, most recently, Corning.

It’s too soon to know what the government’s anti-terrorism network will look like, of course. But a tour of the infotech frontier provides glimpses of the tools it is likely to incorporate.

Methods for coping with natural disasters like hurricanes, floods, and earthquakes–especially GIS–are already being adapted to respond to terrorism. Ever heard of NIMA? It’s an intelligence outfit, the National Imagery and Mapping Agency, that usually provides specialized maps for the armed forces. When the FBI was preparing to secure last winter’s Olympic Games in Utah against terrorism, it asked NIMA to lend its GIS mapping expertise. Drawing on commercial and government databases as well as highly classified spy satellite information, NIMA set up a mapping center in Utah that cranked out maps for teams ranging from policemen on street corners to fighter pilots patrolling overhead. The mappers tailored data to make the security situation clearest to each group of users. If, say, a gunman had appeared on the steps of the Mormon Tabernacle, NIMA agents could have instantly identified, with the click of a mouse, his potential field of fire. How? By using software developed by the cellphone industry, which needs to calculate lines of sight when choosing where to put antennas. The agents also had the ability to create “fly-through” computer animations of many building interiors–useful in hostage situations–derived from architectural drawings. “Five years ago, we would have just given them all the same maps,” an intelligence official says.
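
That field-of-fire calculation is, at bottom, a line-of-sight test repeated over a terrain grid–the same geometry cellphone planners run when siting antennas. Below is a minimal sketch of the test on a digital elevation model; the grid walk and heights are simplifying assumptions, not NIMA’s actual software.

```python
import numpy as np

def line_of_sight(dem, a, b, observer_h=1.8, target_h=1.8):
    """True if grid cell b is visible from grid cell a.

    dem  : 2-D array of terrain elevations (m)
    a, b : (row, col) indices of observer and target

    Walks sample points along the sight line and checks that the terrain
    never rises above the straight ray between the two endpoints.
    """
    (r0, c0), (r1, c1) = a, b
    z0 = dem[r0, c0] + observer_h
    z1 = dem[r1, c1] + target_h
    steps = int(max(abs(r1 - r0), abs(c1 - c0)))
    for i in range(1, steps):
        t = i / steps
        ground = dem[int(round(r0 + t * (r1 - r0))),
                     int(round(c0 + t * (c1 - c0)))]
        ray = z0 + t * (z1 - z0)   # height of the sight line here
        if ground > ray:
            return False           # terrain blocks the view
    return True

def field_of_fire(dem, gunman):
    """Boolean mask of every cell visible from the gunman's position --
    a brute-force viewshed, the simplified core of antenna-siting tools."""
    visible = np.zeros(dem.shape, dtype=bool)
    for r in range(dem.shape[0]):
        for c in range(dem.shape[1]):
            visible[r, c] = line_of_sight(dem, gunman, (r, c))
    return visible
```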

Having access to multiple databases and GIS mapping expertise was immensely helpful to New York City in the days after Sept. 11, when its new, high-tech emergency operations center was destroyed along with the World Trade Center towers. Working nonstop over the next 72 hours, managers and technicians from a host of agencies and contractor firms regrouped on Pier 92 at 52nd Street along the Hudson River–which usually hosts cruise ships and trade shows–and patched together an operations and mapping center. The center’s big color plotters cranked out fresh maps every day, superimposing on an aerial photo of the rubble at ground zero all kinds of features that rescue workers needed to know about–hazards like fuel tanks and elevator shafts, hot spots where buried fires burned most intensely, locations of electric cables, gas pipes, water mains, and sewers, and routes cleared for trucks hauling heavy loads of twisted steel beams from the area.

Layering data and looking for the patterns within them is hard to beat as a means for connecting the dots during complex, confusing events. The strategy dates back at least as far as 1854, when central London was in the grip of a lethal cholera outbreak. By plotting both the locations of deaths and the locations of public drinking-water pumps, pioneering epidemiologist Dr. John Snow observed that the cholera cases clustered around one of the 11 water pumps in the area. Officials removed the pump’s handle to protect the population from the cholera-tainted well.
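
In modern terms, Snow’s map amounts to assigning each death to its nearest pump and noticing that one pump dominates the tally. A toy version, with made-up coordinates standing in for the 1854 data:

```python
from collections import Counter
import math

# Hypothetical coordinates -- stand-ins, not Snow's actual survey data.
pumps = {"Broad St": (0.0, 0.0), "Rupert St": (3.0, -1.5), "Bridle St": (-2.5, 2.0)}
deaths = [(0.2, -0.1), (0.4, 0.3), (-0.3, 0.2), (2.8, -1.2), (0.1, 0.0)]

def nearest_pump(point):
    """Name of the pump closest to a death location (straight-line distance)."""
    return min(pumps, key=lambda name: math.dist(pumps[name], point))

# Count deaths by nearest pump; a heavy skew toward one pump is the
# pattern Snow read off his map before the handle came off.
tally = Counter(nearest_pump(d) for d in deaths)
print(tally.most_common())   # [('Broad St', 4), ('Rupert St', 1)]
```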

“We’ve always known that images or maps are an extremely efficient and effective way to convey information,” says a senior U.S. intelligence official. “This is really about data integration. GIS, for example, is really an expression for organizing data around nodes or positions in space. And when you add in the 3-D visualization techniques and do fly-throughs, it becomes very powerful.”

Whopping amounts of useful information exist within the government for this kind of purpose. The United States Geological Survey, for instance, which has been mapping the nation and its natural resources for more than a century, is now embarking on a program called the National Map. The aim is to capture the information on the 55,000 paper topographic maps familiar to hikers and keep it continuously updated on the Internet. By clicking on areas of interest, users will be able to link to layers of information, like census data, compiled by other agencies. “In the case of homeland security, the National Map will let local first responders share the same information as people who have regional or national response authority, so all levels of government are working off the same sheet of music when an event happens suddenly,” says Michael Domaratz, a USGS cartographer in Reston, Va. “We’re working with the intelligence community to make sure this National Map is available to them for their threat-assessment activities, to ask what-ifs: Where are things? Where are they in juxtaposition to others? Who might be affected downstream by an event?” Of course, terrorists could make use of such an open system too. “Unfortunately, the information I need to protect myself can also be the information someone else needs to attack me,” says Domaratz. “So we have to hope that we have smarter people out there doing protection, prediction, and prevention, and have more of them than there are bad guys.”

The government has so much information at its disposal, in fact, that the Homeland Security Office’s first step is to review the current state of its knowledge. Such an inventory of information is called metadata, or “data about data.” “We’re now surveying federal agencies that have part of their mission relevant to homeland security,” says Cooper. “The fact that some agencies have many different databases is an artifact of the way the federal government grew up and operates, which is around specific programs, rather than the agencies themselves. Some of them are going to have to work on their internal information architectures.”
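
One way to picture the inventory Cooper describes is as a catalog of records about each agency database, searchable without touching the underlying data. The schema below is hypothetical; the article gives no details of the actual survey format.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    """One 'data about data' entry in a hypothetical cross-agency catalog."""
    agency: str
    name: str
    description: str
    classification: str                          # e.g. "public", "sensitive"
    keywords: list[str] = field(default_factory=list)

catalog = [
    DatasetRecord("USGS", "national_map", "Topographic base layers",
                  "public", ["terrain", "hydrography"]),
    DatasetRecord("EPA", "tri_facilities", "Toxic-release inventory sites",
                  "public", ["hazmat", "chemicals"]),
]

def find_datasets(keyword):
    """Survey the catalog instead of the databases themselves."""
    return [r for r in catalog if keyword in r.keywords]

print([r.name for r in find_datasets("hazmat")])   # ['tri_facilities']
```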

A big corporation might address that kind of problem by knocking over its information silos and merging the data into a new, integrated system. But that’s not really an option for the government, which has lots of big, expensive “legacy” computer systems it can’t afford to replace in a hurry. So it is likely to turn to what are called enterprise decision-management systems to tap its many sources of data.

An executive of Teradata, an NCR division that makes such systems, explains how they work with an example from the cellphone industry. When a cellphone customer dials a call-center operator to discuss an account, Teradata makes sure the operator’s computer shows a raft of information about that person–billing records, service history, how profitable a customer the caller is, etc.–even though the data are stored in many different places. By keying in a customer’s name, the operator automatically launches a “holistic query” that reaches across boundaries to provide a complete picture. Teradata does similar work for government agencies that collect and analyze gigantic amounts of classified information.
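
The pattern behind that “holistic query” is a fan-out: take one key, query several independently owned stores, and merge the answers into a single picture. A minimal sketch, using in-memory SQLite tables as stand-ins for the separate billing, service, and profitability systems (the names and fields are invented for illustration):

```python
import sqlite3

def make_store(schema, rows):
    """Build one self-contained stand-in for a separately owned system."""
    db = sqlite3.connect(":memory:")
    db.execute(schema)
    db.executemany(f"INSERT INTO t VALUES ({','.join('?' * len(rows[0]))})", rows)
    return db

billing = make_store("CREATE TABLE t (cust TEXT, balance REAL)",
                     [("alice", 42.50)])
service = make_store("CREATE TABLE t (cust TEXT, ticket TEXT)",
                     [("alice", "dropped calls"), ("alice", "billing dispute")])
value = make_store("CREATE TABLE t (cust TEXT, tier TEXT)",
                   [("alice", "high-margin")])

def holistic_query(cust):
    """Reach across every store and return one merged customer picture --
    the fan-out-and-merge step, not Teradata's actual machinery."""
    return {
        "balance": billing.execute(
            "SELECT balance FROM t WHERE cust=?", (cust,)).fetchone()[0],
        "tickets": [r[0] for r in service.execute(
            "SELECT ticket FROM t WHERE cust=?", (cust,))],
        "tier": value.execute(
            "SELECT tier FROM t WHERE cust=?", (cust,)).fetchone()[0],
    }

print(holistic_query("alice"))
```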

A big challenge in sharing information broadly, of course, is not inadvertently giving away secrets or trampling on privacy. In its classified government work, Teradata’s systems provide for individualized “views,” or access privileges, that let different users browsing secured databases see different sets of information according to their “need to know.”
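
Conceptually, such a view is a filter sitting between the user and the shared store: every record carries a sensitivity label, and a query returns only what the requester is cleared to see. A toy model, with hypothetical labels and rules rather than Teradata’s actual mechanism:

```python
# Each record carries a sensitivity label; each user's view filters
# query results down to that user's clearance. Labels are invented.
LEVELS = {"public": 0, "sensitive": 1, "secret": 2}

records = [
    {"id": 1, "text": "water-main map",         "label": "public"},
    {"id": 2, "text": "pipeline shutoff codes", "label": "sensitive"},
    {"id": 3, "text": "threat assessment",      "label": "secret"},
]

class View:
    """What one user sees when browsing the shared store."""
    def __init__(self, user, clearance):
        self.user = user
        self.clearance = LEVELS[clearance]

    def query(self, predicate=lambda r: True):
        # Clearance check first, then the user's own search condition.
        return [r for r in records
                if LEVELS[r["label"]] <= self.clearance and predicate(r)]

fire_chief = View("chief", "sensitive")
analyst = View("analyst", "secret")
print(len(fire_chief.query()), len(analyst.query()))   # 2 3
```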

The touchy task for Homeland Security’s information architects will be to avoid constipating the interagency flow of data and insights by cordoning off too many things, while still protecting genuinely sensitive information. Done right, this could help the spooks to noodle threat scenarios and ways of protecting against them, just as the emergency-services people imagine and plan for fighting fires and aiding injured people.

So we’re lucky. There’s an abundance of breathtakingly versatile technology available to counter the menace of terrorist attacks at home. Now for the bad news: Computers are only as smart as the bureaucrats who use them. It may require a Manhattan Project of social engineering to induce agencies that have traditionally viewed each other mostly as rivals for budget dollars to reach out and hold hands. “The greatest barrier to this solution is probably not a technology one, but an institutional one,” says an executive at a software supplier to the government. Teradata President Mark Hurd says, “This is as much about leadership and process as it is about technology. Commercial companies are already doing this stuff.”