• Vast amounts of data kept in repository codenamed Marina
• Data retained regardless of whether person is NSA target
• Material used to build ‘pattern-of-life’ profiles of individuals
The National Security Agency is storing the online metadata of millions of internet users for up to a year, regardless of whether or not they are persons of interest to the agency, top secret documents reveal.
Metadata provides a record of almost anything a user does online, from browsing history – such as map searches and websites visited – to account details, email activity, and even some account passwords. This can be used to build a detailed picture of an individual’s life.
The Obama administration has repeatedly stated that the NSA keeps only the content of messages and communications of people it is intentionally targeting – but internal documents reveal the agency retains vast amounts of metadata.
An introductory guide to digital network intelligence for NSA field agents, included in documents disclosed by former contractor Edward Snowden, describes the agency’s metadata repository, codenamed Marina. Any computer metadata picked up by NSA collection systems is routed to the Marina database, the guide explains. Phone metadata is sent to a separate system.
“The Marina metadata application tracks a user’s browser experience, gathers contact information/content and develops summaries of target,” the analysts’ guide explains. “This tool offers the ability to export the data in a variety of formats, as well as create various charts to assist in pattern-of-life development.”
The guide goes on to explain Marina’s unique capability: “Of the more distinguishing features, Marina has the ability to look back on the last 365 days’ worth of DNI metadata seen by the Sigint collection system, regardless whether or not it was tasked for collection.” [Emphasis in original.]
On Saturday, the New York Times reported that the NSA was using its metadata troves to build profiles of US citizens’ social connections, associations and in some cases location, augmenting the material the agency collects with additional information bought in from the commercial sector, which is not subject to the same legal restrictions as other data.
The ability to look back on a full year’s history for any individual whose data was collected – either deliberately or incidentally – offers the NSA the potential to find information on people who have later become targets. But it relies on storing the personal data of large numbers of internet users who are not, and never will be, of interest to the US intelligence community.
Marina aggregates NSA metadata from an array of sources, some targeted, others on a large scale. Programs such as Prism – which operates through legally compelled “partnerships” with major internet companies – allow the NSA to obtain content and metadata on thousands of targets without individual warrants.
The NSA also collects enormous quantities of metadata from the fibre-optic cables that make up the backbone of the internet. The agency has placed taps on undersea cables, and is given access to internet data through partnerships with American telecoms companies.
About 90% of the world’s online communications cross the US, giving the NSA what it calls in classified documents a “home-field advantage” when it comes to intercepting information.
By confirming that all metadata “seen” by NSA collection systems is stored, the Marina document suggests those systems are used not merely to filter out information on targets, but also to store data at scale.
A sign of how much information could be contained within the repository comes from a document voluntarily disclosed by the NSA in August, in the wake of the first tranche of revelations from the Snowden documents.
The seven-page document, titled “The National Security Agency: Missions, Authorities, Oversight and Partnerships”, says the agency “touches” 1.6% of daily internet traffic – an estimate which is not believed to include large-scale internet taps operated by GCHQ, the NSA’s UK counterpart.
The document cites figures from a major tech provider that the internet carries 1,826 petabytes of information per day. One petabyte, according to tech website Gizmodo, is equivalent to over 13 years of HDTV video.
“In its foreign intelligence mission, NSA touches about 1.6% of that,” the document states. “However, of the 1.6% of the data, only 0.025% is actually selected for review.
“The net effect is that NSA analysts look at 0.00004% of the world’s traffic in conducting their mission – that’s less than one part in a million.”
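The figures in the NSA’s document can be turned into rough daily volumes with some back-of-envelope arithmetic. (The HDTV bitrate used below, roughly 19.4 Mbit/s, is an assumption based on the standard ATSC broadcast rate, not a figure from the documents.)

```python
# Back-of-envelope arithmetic using the figures quoted from the NSA document:
# 1,826 petabytes of internet traffic per day, 1.6% "touched", and 0.025%
# of that touched data selected for review.

DAILY_TRAFFIC_PB = 1826        # petabytes per day, per the NSA document
TOUCHED_FRACTION = 0.016       # 1.6% "touched"
SELECTED_FRACTION = 0.00025    # 0.025% of touched data selected for review

touched_pb = DAILY_TRAFFIC_PB * TOUCHED_FRACTION    # petabytes touched per day
selected_tb = touched_pb * SELECTED_FRACTION * 1000 # terabytes reviewed per day

# Sanity-check Gizmodo's "over 13 years of HDTV" figure for one petabyte,
# assuming a ~19.4 Mbit/s HDTV stream (the ATSC broadcast rate).
petabyte_bits = 1e15 * 8
hdtv_years = petabyte_bits / 19.4e6 / (365 * 24 * 3600)

print(f"touched: {touched_pb:.1f} PB/day")          # about 29.2 PB/day
print(f"selected for review: {selected_tb:.1f} TB/day")  # about 7.3 TB/day
print(f"1 PB is roughly {hdtv_years:.1f} years of HDTV")
```

Even on the agency’s own percentages, the “touched” traffic works out to tens of petabytes a day, and a petabyte of roughly 13 years of HDTV video is consistent with the Gizmodo comparison cited above.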
However, critics were skeptical of the reassurances, because a large proportion of internet traffic consists of music and video sharing, or large file transfers – content that is easy to identify and discard without entering it into the agency’s systems. The NSA could therefore be picking up a much larger share of the internet traffic that contains communications and browsing activity.
Journalism professor and internet commentator Jeff Jarvis noted: “[By] very rough, beer-soaked-napkin numbers, the NSA’s 1.6% of net traffic would be half of the communication on the net. That’s one helluva lot of ‘touching’.”
Much of the NSA’s data collection is carried out under section 702 of the Fisa Amendments Act. This provision allows for the collection of communications data without individual warrants, where at least one end of the conversation, or data exchange, involves a non-American located outside the US at the time of collection.
The NSA is required to “minimize” the data of US persons, but is permitted to keep US communications where it is not technically possible to remove them, and also to keep and use any “inadvertently” obtained US communications if they contain intelligence material, evidence of a crime, or if they are encrypted.
The Guardian has also revealed the existence of a so-called “backdoor search loophole”, a 2011 rule change that allows NSA analysts to search for the names of US citizens, under certain circumstances, in mass-data repositories collected under section 702.
According to the New York Times, NSA analysts were told that metadata could be used “without regard to the nationality or location of the communicants”, and that Americans’ social contacts could be traced by the agency, providing there was some foreign intelligence justification for doing so.
The Guardian approached the NSA with four specific questions about the use of metadata, including a request for the rationale behind storing 365 days’ worth of untargeted data, and an estimate of the quantity of US citizens’ metadata stored in its repositories.
But the NSA did not address any of these questions in its response, providing instead a statement focusing on its foreign intelligence activities.
“NSA is a foreign intelligence agency,” the statement said. “NSA’s foreign intelligence activities are conducted pursuant to procedures approved by the US attorney general and the secretary of defense, and, where applicable, the foreign intelligence surveillance (Fisa) court, to protect the privacy interests of Americans.
“These interests must be addressed in the collection, retention, and dissemination of any information. Moreover, all queries of lawfully collected data must be conducted for a foreign intelligence purpose.”
It continued: “We know there is a false perception out there that NSA listens to the phone calls and reads the email of everyday Americans, aiming to unlawfully monitor or profile US citizens. It’s just not the case.
“NSA’s activities are directed against foreign intelligence targets in response to requirements from US leaders in order to protect the nation and its interests from threats such as terrorism and the proliferation of weapons of mass destruction.”