
Breach of Faith

Over the course of twenty-two years, from 1979 to 2001, Robert Hanssen participated in what is possibly the most severe breach of national intelligence in the United States' history. Through a combination of skill and sheer luck, Hanssen was able to pass critical information from his job at the FBI to Soviet and later Russian intelligence agencies, information that may have contributed to the capture and execution of a number of individuals. Hanssen's case is particularly interesting because it takes place over the course of two decades that included the end of the Cold War and the beginning of the internet age, and as such examining the various means by which Hanssen was able to breach security offers extra insight into the security threats, new and old, that face those tasked with protecting sensitive government information. Ultimately, the Hanssen case reveals a number of ongoing vulnerabilities concerning the safeguarding of sensitive government information, because although the FBI adapted its security functions in response to developments of the last thirty years, this response has been characterized by the same underlying structural and cultural problems that created these intelligence and security failures in the first place.

1. Identify specific vulnerabilities that allowed Robert Hanssen to access highly sensitive personal and secret information, disclosure of which could prove harmful to U.S. security and endanger American lives, to breach FBI security rules and requirements, and to pass this sensitive information to the Soviet Union, a government inimical to the U.S. at the time.

Hanssen's breach was so successful that in its aftermath, the Justice Department ordered a review of the FBI's security programs in order to identify which policies and procedures should be changed or updated to confront the kind of threats embodied by Robert Hanssen and others like him. Many of the specific vulnerabilities that allowed Hanssen to access highly sensitive information and pass it along to the Soviet Union and eventually Russia were outlined in the report compiled by the Commission for the Review of FBI Security Programs, also known as the Webster Commission after its chairman, William H. Webster. The Webster Commission underscores the severity of Hanssen's breach in the introduction to its report, noting that "the Commission for the Review of FBI Security Programs was established in response to possibly the worst intelligence disaster in U.S. history: the treason of Robert Hanssen, an FBI Supervisory Special Agent, who over twenty-two years gave the Soviet Union and Russia vast quantities of documents and computer diskettes filled with national security information of incalculable value" (Webster, 2002, p. 1).

First and foremost, this represents a security failure at the basic level of physical security. Although some of the information Hanssen sold was acquired through his skill with computers, a major reason he was able to go undetected for so long was that he could simply "walk into Bureau units in which he had worked some time before, log on to stand-alone data systems, and retrieve, for example, the identities of foreign agents whom U.S. intelligence services had compromised, information vital to American interests and even more immediately vital to those whose identities Hanssen betrayed" (Webster, 2002, p. 1). Considering "who needs to be able to physically access each machine, and structur[ing] your site layout in order to allow the minimum necessary access" is one of the most basic steps that can be taken to help ensure the security of sensitive information, but according to the Webster Commission report, "Bureau personnel routinely upload classified information into widely accessed databases, a form of electronic open storage that allows essentially unregulated downloading and printing" (Beale, 2007, p. 22; Webster, 2002, p. 4).
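The "minimum necessary access" principle described above can be illustrated with a brief sketch. The following Python fragment is purely hypothetical (the user names, unit names, and data structure are invented for illustration and do not correspond to any actual FBI system); it shows the deny-by-default logic under which an agent who had long since left a unit, as Hanssen had, would be refused access to that unit's stand-alone systems:

```python
# Hypothetical illustration of least-privilege access control:
# each user may reach only the systems tied to their *current*
# assignment, and everything else is denied by default.
CURRENT_ASSIGNMENTS = {
    "agent_a": {"unit_counterintelligence"},
    "agent_b": {"unit_criminal"},
}

def may_access(user: str, system_unit: str) -> bool:
    """Return True only if the system belongs to the user's current
    assignment; unknown users and stale assignments are denied."""
    return system_unit in CURRENT_ASSIGNMENTS.get(user, set())

# An agent reassigned away from a unit years ago is denied,
# even though they once worked there.
print(may_access("agent_a", "unit_counterintelligence"))  # allowed
print(may_access("agent_a", "unit_criminal"))             # denied
print(may_access("former_agent", "unit_criminal"))        # denied
```

The design point is that access derives from a continuously maintained assignment record rather than from a one-time grant, which is precisely the control whose absence let Hanssen walk into units "in which he had worked some time before."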

Hanssen also exploited the FBI's lack of any kind of behavioral analysis and anomaly detection program. As the Webster Report notes, "an information-system auditing program" -- which works by detecting behavioral anomalies "as current host activity is gauged and compared to a statistical baseline, or threshold, of known activity for that host" -- "would surely have flagged Hanssen's frequent use of FBI computer systems to determine whether he was the subject of a counterintelligence investigation" (Webster, 2002, p. 4; Trost, 2010, "Network Flows and Anomaly Detection"). Because he had developed many of the information systems used by the Bureau, Hanssen was ideally placed to exploit any weaknesses or shortcomings in those systems.
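The baseline-comparison mechanism Trost describes can be sketched in a few lines. The example below is a minimal illustration, not an implementation of any actual FBI or commercial auditing system; the activity figures and the three-standard-deviation threshold are assumptions chosen for clarity:

```python
# Minimal sketch of statistical anomaly detection: current host
# activity is compared against a baseline (mean and standard
# deviation) of known activity for that host, and flagged when it
# deviates by more than a chosen number of standard deviations.
from statistics import mean, stdev

def build_baseline(history):
    """Summarize known activity (e.g., weekly query counts) as a
    (mean, standard deviation) pair."""
    return mean(history), stdev(history)

def is_anomalous(current, baseline, z_threshold=3.0):
    """Flag activity deviating from the baseline by more than
    z_threshold standard deviations."""
    mu, sigma = baseline
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > z_threshold

# Hypothetical host that normally issues 10-14 queries per week.
history = [10, 12, 11, 13, 12, 14, 11, 12]
baseline = build_baseline(history)

print(is_anomalous(13, baseline))  # within the normal range
print(is_anomalous(60, baseline))  # flagged, e.g., repeated self-searches
```

An auditing program built on this principle would have treated Hanssen's repeated searches of FBI systems for his own name as exactly the kind of statistical outlier the second call flags.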

2. Considering the serious consequences of security breaches at the highest levels of our government, as evidenced in the example of Robert Hanssen and others, what organizational, management, technology and procedural approaches would you employ to prevent any future recurrence of such breaches?

One of the most damning conclusions of the Webster Commission's report was the claim that there are "significant deficiencies in Bureau policy and practice," and furthermore, that "those deficiencies flow from a pervasive inattention to security, which has been at best a low priority," at least until something dramatic, like the arrest of Robert Hanssen, takes place (Webster, 2002, p. 1). This is an important element of the report's conclusions because it demonstrates how the blame for Hanssen's breach must lie not only with Hanssen and specific vulnerabilities he exploited, but also the organization, management, technology, and procedures that contribute to this culture of "insecurity" wherein warning signs and useful hints can be ignored for so long.

The Webster Commission's report locates the source of this pervasive insecurity in the inherent tension between the FBI's law-enforcement and intelligence operations, because intelligence and law enforcement sometimes pursue entirely different goals, standards, and procedures, especially when it comes to the sharing of information. For example, the report notes that "until the terrorist attacks in September 2001, the FBI focused on detecting and prosecuting traditional crime, and FBI culture emphasized the priorities and morale of criminal components within the Bureau" (Webster, 2002, p. 1). Problems arose because "this culture was based on cooperation and the free flow of information inside the Bureau, a work ethic wholly at odds with the compartmentation characteristic of intelligence investigations involving highly sensitive, classified information" (Webster, 2002, pp. 1-2). As a result, "operational imperatives will normally and without reflection trump security needs," such that, for example, "senior Bureau management recently removed certain security based access restrictions from the FBI's automated system of records, the principal computer system Hanssen exploited, because the restrictions had hindered the investigation of the terrorist attacks" (Webster, 2002, p. 2).

In a way, then, the FBI's security problems revealed by the Hanssen investigation were indicative of the issues facing more or less the entire Intelligence Community prior to the attacks of September 11, 2001, because they represent an imbalance between the necessary and sometimes mutually exclusive goals of sharing important intelligence while keeping strict control over the flow of that intelligence. When the latter goal takes precedence over the former, crucial information may not be shared between offices and agencies, leading to unnecessary and costly gaps in intelligence (as in the run-up to 9/11). Or, as in the FBI's case, the goal of information flow may be prioritized in the name of law enforcement efficiency, precipitating decisions that may help operational efficiency but ultimately undermine security; for example, Bureau management's decision to remove certain access restrictions to the FBI's automated system of records unintentionally granted "general access within the Bureau to information obtained through warrants under the Foreign Intelligence Surveillance Act [] the use of [which] in criminal investigations is tightly restricted by Constitutional considerations and Department of Justice guidelines" (Webster, 2002, p. 2). Thus, when considering the organizational, management, technological, and procedural approaches one might take toward preventing the future recurrence of such breaches, one must bear in mind that these breaches are ultimately the result of a conflict between necessary but occasionally oppositional duties, and that the solution must therefore focus specifically on minimizing this tension.

The first substantial action that could be taken to help ensure future breaches do not occur is a reorganization of the FBI's security and intelligence functions. The Webster Commission compared the FBI's organization of its security functions with the rest of the Intelligence Community and found that, "in sharp contrast to other agencies," the FBI's security and intelligence functions "are fragmented, with security responsibilities spread across eight Headquarters divisions and fifty-six field offices" (Webster, 2002, p. 4). This fragmentation dramatically increases the likelihood of a breach because it renders the overall security apparatus that much more porous, with oversight that is adequate, lacking, or inconsistent depending on the particular Headquarters division or field office.

To combat this phenomenon, the Webster Commission recommended that the Bureau establish an Office of Security tasked with, among other things, "consolidating security functions under a senior executive" in order to "prompt management to focus on security, resolve conflicts between operational and security objectives, and foster Headquarters and field coordination" (Webster, 2002, p. 4). The FBI did not establish an Office of Security, which would have meant a high-level office reporting directly to the…
