Friday, June 28, 2013

How The NSA Still Harvests Online Data - Business Insider


REUTERS/Jim Urquhart

A cyber security analyst works in a watch and warning center at a Department of Homeland Security cyber security defense lab at the Idaho National Laboratory, September 30, 2011, in Idaho Falls, Idaho.

A review of top-secret NSA documents suggests that the surveillance agency still collects and sifts through large quantities of Americans' online data, despite the Obama administration's insistence that the program that began under Bush ended in 2011.

Shawn Turner, the Obama administration's director of communications for National Intelligence, told the Guardian that "the internet metadata collection program authorized by the Fisa court was discontinued in 2011 for operational and resource reasons and has not been restarted."

But the documents indicate that the amount of internet metadata harvested, viewed, processed and overseen by the Special Source Operations (SSO) directorate inside the NSA is extensive.

While there is no reference to any specific program currently collecting purely domestic internet metadata in bulk, it is clear that the agency collects and analyzes significant amounts of data from US communications systems in the course of monitoring foreign targets.

On December 26, 2012, SSO announced what it described as a new capability to allow it to collect far more internet traffic and data than ever before. With this new system, the NSA is able to direct more than half of the internet traffic it intercepts from its collection points into its own repositories. One end of the communications collected is inside the United States.

The NSA called it the "One-End Foreign (1EF) solution". The program, codenamed EvilOlive, was intended for "broadening the scope" of what the agency is able to collect. It relied, legally, on "FAA Authority", a reference to the 2008 Fisa Amendments Act, which relaxed surveillance restrictions.
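In the abstract, the "one-end foreign" idea described in the documents amounts to a retention rule over intercepted metadata: a record is kept when at least one endpoint of the communication appears to be outside the United States, which is why communications with one domestic end can still be swept in. The sketch below is purely illustrative; the field names, the is_foreign() helper, and the US-centric test are assumptions used to show the filtering logic, not a description of any actual NSA system.

```python
# Illustrative sketch only: a toy "one-end foreign" filter over metadata records.
# All field names (src_country, dst_country) and the is_foreign() helper are
# hypothetical; nothing here is drawn from actual NSA tooling.

from dataclasses import dataclass


@dataclass
class MetadataRecord:
    src_country: str   # country code inferred for the sending endpoint
    dst_country: str   # country code inferred for the receiving endpoint
    timestamp: float


def is_foreign(country_code: str) -> bool:
    """Treat any endpoint not geolocated to the US as foreign (an assumption)."""
    return country_code != "US"


def one_end_foreign(record: MetadataRecord) -> bool:
    """Keep a record if at least one endpoint appears to be outside the US."""
    return is_foreign(record.src_country) or is_foreign(record.dst_country)


# Example: a US-to-Germany record passes the filter even though one end is domestic.
records = [
    MetadataRecord("US", "DE", 1356480000.0),
    MetadataRecord("US", "US", 1356480001.0),
]
kept = [r for r in records if one_end_foreign(r)]
print(len(kept))  # 1
```

Under this kind of rule, the practical effect the article describes follows directly: filtering on "one end foreign" does not exclude traffic that has a US endpoint on the other side.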

This new system, SSO stated in December, enables vastly increased collection by the NSA of internet traffic. "The 1EF solution is allowing more than 75% of the traffic to pass through the filter," the SSO December document reads. "This milestone not only opened the aperture of the access but allowed the possibility for more traffic to be identified, selected and forwarded to NSA repositories."

It continued: "After the EvilOlive deployment, traffic has literally doubled."

The scale of the NSA's metadata collection is highlighted by references in the documents to another NSA program, codenamed ShellTrumpet.

On December 31, 2012, an SSO official wrote that ShellTrumpet had just "processed its One Trillionth metadata record".

It is not clear how much of this collection concerns foreigners' online records and how much concerns those of Americans. Also unclear is the claimed legal authority for this collection.

Explaining that the five-year-old program "began as a near-real-time metadata analyzer – for a classic collection system", the SSO official noted: "In its five year history, numerous other systems from across the Agency have come to use ShellTrumpet's processing capabilities for performance monitoring" and other tasks, such as "direct email tip alerting."

Almost half of those trillion pieces of internet metadata were processed in 2012, the document detailed: "though it took five years to get to the one trillion mark, almost half of this volume was processed in this calendar year".

Another SSO entry, dated February 6, 2013, described ongoing plans to expand metadata collection. A joint surveillance collection operation with an unnamed partner agency yielded a new program "to query metadata" that was "turned on in the Fall 2012". Two others, called MoonLightPath and Spinneret, "are planned to be added by September 2013."

A substantial portion of the internet metadata still collected and analyzed by the NSA comes from allied governments, including its British counterpart, GCHQ.

An SSO entry dated September 21, 2012, announced that "Transient Thurible, a new Government Communications Head Quarters (GCHQ) managed XKeyScore (XKS) Deep Dive was declared operational." The entry states that GCHQ "modified" an existing program so the NSA could "benefit" from what GCHQ harvested.

"Transient Thurible metadata [has been] flowing into NSA repositories since 13 August 2012," the entry states.

This article originally appeared on guardian.co.uk

Source: http://www.businessinsider.com/how-the-nsa-still-harvests-online-data-2013-6

