Open government may exceed prudent boundaries

Security experts warn that as Obama makes government data more accessible, it is imperative to improve data classification and information management so that sensitive but unclassified documents will not get into the wrong hands.

Computerworld
Analysis
June 8, 2009
By Jaikumar Vijayan

The Obama Administration’s push to make government data more open and easily accessible is elevating the need for standardized data classification and information management approaches across federal agencies, security experts say.

Without such standards, federal agencies run a higher risk of accidentally exposing protected data in their rush to comply with the President’s mandate for greater transparency and open government, they said.

The caution comes in the wake of a recent incident in which the U.S. Government Printing Office accidentally published on its Web site a document containing sensitive but unclassified information on dozens of U.S. civilian nuclear sites.

The 267-page document had been compiled as part of a report being prepared by the federal government for the International Atomic Energy Agency, or IAEA. The report, which President Obama had transmitted to Congress, was marked “Highly Confidential Safeguards Sensitive.”

The document was probably published because the GPO had a different process for handling sensitive but unclassified (SBU) documents than the agency that handed it over, said Karen Evans, the former de facto CIO of the federal government.

SBU documents are those considered sensitive enough to merit some level of protection from disclosure but not so sensitive as to merit classified status. A large number of government documents currently fall into this category.

Because each agency has its own process for defining, labeling and protecting SBU information, there is little consistency in the way federal agencies handle such data, sometimes even internally, Evans said. Currently, there are 107 unique markings and more than 130 different handling processes and procedures for SBU information. That can create unexpected problems given the current push towards greater information sharing among agencies and the open-government movement, she said.

As part of his promise to reverse some of the secrecy and over-classification of data by the previous administration, the President has been pushing agencies to make data more accessible to the public. His administration has also been urging agencies to use Web 2.0 tools such as blogs and social networking services like Facebook and Twitter to push unfiltered information out to the public.

Just last week, federal CIO Vivek Kundra announced plans to make more than 100,000 data sources available to the public on the government’s data.gov Web site by the end of this week.

Virtually every federal agency these days has a new media representative to help figure out how to use such tools to engage with the public at large, said Gartner Inc. analyst John Pescatore, who works with several large government agencies. The problem is that many agencies have not yet implemented ways to keep data not meant for public consumption private.

“The federal government is trying to push out more data, but they need to make sure they have a process in place to ensure that [sensitive] data isn’t pushed out to places where it shouldn’t be,” Pescatore said. “There still is such a thing as ‘need to know,’” he said.

“Openness is a wonderful thing so long as you have checks and balances to see it doesn’t become too open,” said Ken Silva, chief technology officer at VeriSign Inc. When data previously available from a few hundred government sources suddenly starts becoming available from thousands of sources—including sites such as Facebook and MySpace—controls need to be in place to protect against inadvertent leaks, he said.

This is especially true when dealing with sensitive but unclassified data of the sort leaked last week by the GPO, he said. In that case, the individual document itself may not have revealed any national security secrets. But if enough such documents are pushed out, the composite data could reveal classified information, said Silva, a former executive technical director at the National Security Agency. “A sufficient amount of unclassified data can become classified,” Silva said.

A 90-day review of the rules for classifying, declassifying and maintaining national security information at federal agencies, currently under way, could help mitigate some of these issues. The review was ordered by President Obama in a May 27 memorandum to federal agency and department heads that reiterates the government’s commitment to “operating with an unprecedented level of openness.”

In the memo, President Obama seeks recommendations on whether it might make sense for federal agencies to treat SBU data in the same way as a separate category of data known as Controlled Unclassified Information (CUI). The term was coined during the Bush Administration and refers to sensitive unclassified data that is specifically terrorism-related. Unlike SBU data, CUI data is handled under a standardized procedure across federal agencies.

If that change is adopted, it would address some of the problems stemming from the nonstandardized approach to SBU data, Evans said. But in addition to such standardization and policy controls, technology controls are needed to protect against accidental data leaks, Pescatore said.

Government agencies with a presence on social networking sites run a higher risk of having their data compromised from the outside, so they will need to implement filtering tools that block malicious executables coming in from Web sites, Pescatore said.
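
To make the idea concrete, here is a minimal sketch of the kind of inbound filter Pescatore describes. It is an illustration under assumed rules, not a description of any particular product; real Web filters rely on signatures, reputation data and sandboxing rather than extensions and header bytes alone.

    import os

    # Illustrative assumption: treat common Windows-executable extensions and
    # the DOS/PE "MZ" header as grounds for blocking an inbound download.
    EXECUTABLE_EXTENSIONS = {".exe", ".dll", ".scr", ".msi", ".bat", ".cmd"}
    PE_MAGIC = b"MZ"  # first two bytes of a Windows executable

    def block_inbound_file(filename, first_bytes):
        """Return True if the downloaded file looks like an executable and should be blocked."""
        ext = os.path.splitext(filename.lower())[1]
        return ext in EXECUTABLE_EXTENSIONS or first_bytes[:2] == PE_MAGIC

    # A file renamed to disguise its type is still caught by its header bytes.
    print(block_inbound_file("report.pdf", b"MZ\x90\x00"))  # True

Checking header bytes as well as extensions matters because attackers routinely rename executables to look like harmless documents.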

Data leak prevention tools will also be crucial for monitoring outbound traffic to detect whether personally identifiable information or other sensitive data is accidentally leaving the network, he said.
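
As an illustration of what such outbound monitoring involves, consider this minimal sketch; the two patterns below are assumptions chosen for the example, and real data leak prevention products combine many detectors with context and document fingerprinting to limit false positives.

    import re

    # Hypothetical detectors for two common kinds of PII.
    SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")    # e.g. 123-45-6789
    CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){15}\d\b")   # 16-digit card-like runs

    def scan_outbound(text):
        """Return labels for PII-like patterns found in outbound text."""
        hits = []
        if SSN_PATTERN.search(text):
            hits.append("possible Social Security number")
        if CARD_PATTERN.search(text):
            hits.append("possible card number")
        return hits

    # A mail or Web gateway would quarantine or flag a message that trips a rule.
    print(scan_outbound("Attached: employee record, SSN 123-45-6789"))

A gateway built this way flags traffic for review rather than proving a leak; the point is that enforcement requires an automated check at the network boundary, not just a written policy.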

Gartner recommends that government agencies add “brand monitoring services to continually monitor social networking sites to see what information shows up,” Pescatore said. “If an agency is using Facebook or Twitter, others can pretend to be that agency and set up spoofed sites.” Brand monitoring is useful in this regard as well, he said.

“Just having a policy doesn’t make sense if you have no means of enforcing the policy,” Silva said. Best practice is to first decide what you are willing to share, then communicate that policy to all employees, and then implement controls for enforcing it, he said.

Copyright 2009 Computerworld Inc.