Disinformation: The Real Cyber Security Challenge

U.S. intelligence experts and analysts are in general agreement that the protection of highly classified information is not only a “gentlemanly” goal, but also vital to the nation’s survival. However, the quality and accuracy of that information also need to be protected, particularly in an age when there is a massive daily flow of data and content to cope with … and a tidal wave of communication that is false, malicious, and misleading as well.

In the early 1930s, when intelligence information was collected primarily by intercepting foreign embassy cables, U.S. Secretary of State Henry L. Stimson expressed his disapproval by stating that the act was an “ungentlemanly business” and a “travesty to diplomacy.” Despite Stimson’s disdain for such shocking behavior, the art of cyber exploitation and the countering of other nations’ cyber security measures were just beginning.

Today, most cyber security challenges and initiatives focus primarily on protecting the information itself and the highly sophisticated networks used to transmit, store, and manipulate that information. Although this focus is certainly an important aspect of cyber security, it does not address several new and emerging challenges that the emergency management community is beginning to face.

In recent years, social media such as Twitter, YouTube, and various “Rich Site Summary” (RSS) feeds have grown both in use and in capability. Moreover, these same outlets have become the new norm for sharing information, rapidly replacing the more traditional ways that government agencies and everyday citizens use to learn about and communicate with one another – and with other agencies, both public and private.

Beyond & Sometimes Because of the Worst-Case Scenario

Because of the mostly positive implications for the emergency management community, many companies are developing the technologies needed to enable that community, and others, to use the “new media” as effectively as possible. Individual citizens already can: (a) follow and post feedback to local, state, and federal agencies and organizations on Facebook and Twitter; (b) subscribe to local alerts; and (c) perhaps most important of all, use many of the other services provided to inform and build confidence in preparedness and response efforts during sudden times of crisis.

To ensure the continuity of information, the primary cyber security focus typically addresses the worst-case scenario – content no longer being available (as a result, perhaps, of denial-of-service attacks and/or other malicious behaviors designed to destroy or disable valuable networks). Other important planning initiatives focus on preventing the hacking of power grids, attacks on nuclear facilities, and the destruction of other high-value/high-consequence infrastructures.

Such protection is of course both necessary and extremely important, but fails to address the most vulnerable area of cyber security – namely, the information itself. An even greater threat than not being able to communicate at all would be the communication of erroneous, inaccurate, or misleading information, thereby creating widespread doubt and eroding community confidence in the ability of government (state and local as well as federal) to provide the leadership needed in times of disaster.

Misinformation Problems Increase as Societal Buffers Deteriorate

Before the use of social media became so prevalent, the societal buffers were much stronger and usually able to quickly dispel inaccurate rumors – typically passed either by word of mouth or by handwritten and/or typed documents. However, as information began moving not only much faster but also more freely and in massive volume, the situation started to change significantly.

Two factors in particular have emerged to reduce the traditional effectiveness of the societal buffer: (1) the buffer is constantly being bombarded with information, transmitted in large quantities every millisecond at rates that are already astounding – and that problem may be only just beginning; and (2) the buffer now works in close proximity to nontraditional media that operate outside it. Reports by the so-called fringe media, the growing dissemination of unverified (and sometimes even manufactured) “facts,” and even the unintentional negative consequences caused by simple typographical errors: (a) are difficult to control; (b) can lead to the ruin of careers and companies; and (c) are rapidly leading to a “crisis in confidence.” To cite but one example: On 12 December 2011, some residents in New Jersey received an alarming text message stating “Civil Emergency in this area until 1:24 p.m. EST Take Shelter Now.”

“Within about 90 minutes,” according to CBS News, “the state homeland security and emergency management offices posted on Twitter that no emergency existed, but by then people had called a variety of local, county, and state agencies to express their concerns.”

A later investigation determined that what was originally thought to be a malicious “spamming” type of text was in fact an error by Verizon, which had failed to label the alert as a “TEST.” Whatever the reason (or excuse), some unquantifiable social damage was done, and citizens’ reactions in that state may be considerably different the next time an emergency alert is issued.

Greater Costs & Higher Consequences

The emerging threat that emergency managers now must consider is coping with those who want to do harm by exploiting a disaster and turning it into a higher-consequence event. Last summer, emergency management agencies across many states and municipalities took appropriate actions as Hurricane Irene roared its way up the East Coast. Evacuations were timely and orderly, and information to the public was available on, among other outlets: municipal websites; Facebook accounts; RSS feeds; and email/text alerting subscriptions.

However, in a matter of seconds, these and other efforts might just as easily have been “hijacked.” Using any of a growing number of relatively unsophisticated techniques, an organized group that wanted to disrupt or otherwise harm preparedness and response efforts might easily have: (a) rerouted websites to “mirror” sites providing erroneous information; (b) created other error-crammed websites appearing to be credible; (c) posted bogus Facebook comments from “expert sources”; and/or (d) “spammed” alerting mechanisms into the cellular network. All of these and other harmful actions could be coordinated in a mutually reinforcing way to misguide literally millions of private citizens, many of whom would probably behave in ways with potentially very harmful consequences.

Protecting the privacy of information – and the security as well as the continuity of operations – is and must continue to be a very high priority for successful emergency management. However, any failure to protect the quality and accuracy of the information itself poses yet another dangerous threat that might sometimes be overlooked by those responsible for creating hazard-mitigation and continuity-of-operations plans for the communities they serve.

For additional information on the CBS News report, visit http://www.cbsnews.com/8301-201_162-57341882/mistaken-verizon-emergency-alert-scares-n.j/

W. Ross Ashley

W. Ross Ashley is the Executive Director of the National Fusion Center Association (NFCA). He also serves on the Board of Advisors to numerous corporate clients. He was confirmed by the U.S. Senate in December 2007 and served as Assistant Administrator of the Grant Programs Directorate until August 2009. Previous roles include: Chief Executive Officer of the National Children’s Center (NCC), founder of the Templar Corporation, Director of Law Enforcement Technologies at ISX Corporation, and other private-sector positions. He is a retired Air Force Intelligence Officer who served in both the Virginia Air National Guard and the U.S. Air Force Reserve.
