February 2, 2016 09:36 am PST

Doxxing Sherlock

"I'd tell you, but I'd have to kill you." This is what I shout at the TV (or the YouTube window) whenever I see a surveillance boss explain why none of his methods, or his mission, can be subjected to scrutiny. I write about surveillance, counter-surveillance, and civil liberties, and have spent a fair bit of time in company with both the grunts and the generals of the surveillance industry, and I can always tell when one of these moments is coming up: the flinty-eyed look of someone about to play Jason Bourne.

The stories we tell ourselves are the secret pivots on which our lives turn. So when Laura Poitras approached me to write a piece for the Astro Noise book -- to accompany her show at the Whitney -- and offered me access to the Snowden archive for the purpose, I jumped at the opportunity.

Fortuitously, the Astro Noise offer coincided perfectly with another offer, from Laurie King and Leslie Klinger. Laurie is a bestselling Holmes writer; Les is the lawyer who won the lawsuit that put Sherlock Holmes in the public domain, firmly and unequivocally. Since their legal victory, they've been putting together unauthorized Sherlock anthologies, and did I want to write one for "Echoes of Holmes," the next one in line?

The two projects dovetailed perfectly. Holmes, after all, is the master of HUMINT (human intelligence), the business of following people around, getting information from snitches, dressing up in putty noses and fake beards... Meanwhile, his smarter brother Mycroft is a corpulent, sedentary presence in the stories, the master of SIGINT (signals intelligence), a node through which all the intelligence of the nation flows, waiting to be pieced together by Mycroft and his enormous intellect. The Mycroft-Sherlock dynamic perfectly embodies the fraternal rivalry between SIGINT and HUMINT: Sherlock chases all around town dressed as an old beggar woman or in some similar ruse, catches his man and hands him over to Scotland Yard, and then reports in to Mycroft, who interrupts him before he can get a word out, arching an eyebrow and saying, "I expect you found that it was the Bohemian stable-hand all along, working for those American Freemasons who were after the Sultan's pearls, was it not?"

In 2014, I watched Jennifer Gibson from the eminent prisoners' rights group Reprieve talking about her group's project to conduct a census of those killed by US drone strikes in Yemen and Pakistan. The CIA conducts these strikes, using SIGINT to identify mobile phones belonging to likely targets and dispatch killer drones to annihilate anything in their vicinity. As former NSA and CIA director Michael Hayden once confessed: "We kill people based on metadata."

But the CIA does not specialize in SIGINT (that's the NSA's job). For most of its existence, the CIA was known as a HUMINT agency, the masters of disguise and infiltration.

That was the old CIA. The new CIA is just another SIGINT agency. Signals intelligence isn't just an intelligence methodology, it's a great business. SIGINT means huge procurements -- servers, administrators, electricity, data-centers, cooling -- while HUMINT involves sending a lot of your friends into harm's way, potentially never to return.

We are indeed in the golden age of SIGINT. Despite the security services' claims that terrorists are "going dark" with unbreakable encryption, the spooks have gone a long way toward wiretapping the whole Internet.

The UK spy agency GCHQ really tipped their hand when they called their flagship surveillance program "Mastering the Internet." Not "Mastering Cybercrime," not "Mastering Our Enemies." Mastering the *Internet* -- the very same Internet that everyone uses, from the UK's allies in the Five Eyes nations to the UK Parliament to Britons themselves. Similarly, a cursory glance at the logo for the NSA's Special Source Operations -- the fiber-tapping specialists at the NSA -- tells the whole story.

These mass surveillance programs would likely not have withstood public scrutiny. If the NSA's decision to launch SSO had been attended by a nightly news broadcast featuring that logo, it would have been laughed out of the room. The program depended on the NSA telling its story only to itself, and not to the rest of us. The dotcom boom would have been a very different affair if the major legislative debate of the day had been over whether to allow the surveillance agencies of Western governments to monitor all the fiber cables, harvest every click and keystroke they could legally lay claim to, parcel it into arbitrary categories like "metadata" and "content" to decide what to retain indefinitely, and run unaccountable algorithms on that data to ascribe secret guilt.

As a result, the entire surveillance project has been undertaken in secrecy, within the bubble of people who already think that surveillance is the answer to virtually any question. The surveillance industry is a mushroom, grown in dark places, and it has sent out spores into every corner of the Internet, which have sprouted their own surveillance regimes. While this was happening, something important was happening to the Internet: as William Gibson wrote in 2007's "Spook Country," "cyberspace is everting" -- turning inside out. Computers aren't just the things in our bags or in the trunks of our cars. Today, our cars are computers. This is why Volkswagen was able to design a car that sensed when it was undergoing regulatory inspection and changed its behavior to sneak through tests. Our implanted defibrillators are computers, which is why Dick Cheney had the wireless interface turned off on his defibrillator prior to its implantation. Everything is a networked computer.

Those networked devices are an attack surface that is available to the NSA and GCHQ's adversaries -- primarily other governments, as well as non-government actors with political ambitions -- and to garden-variety criminals. Blackmailers, voyeurs, identity thieves and antisocial trolls routinely seize control over innocents' computers and attack them in every conceivable way. Like the CIA and its drones, they often don't know who their victims are: they find an exploit, write a script to find as many potential victims as possible, and harvest them.

For those who are high-value targets, this lurking insecurity is even more of a risk -- witness the recent takeover of the personal email accounts of US Director of National Intelligence James Clapper by a group of self-described teenagers who previously took over CIA Director John Brennan's email account.

This is the moment when the security services could shine. We need cyber defense and we need it badly. But for the security services to shine, they'd have to spend all their time patching up the leaky boat of networked security, while their major project for a decade and more has been to discover weaknesses in the network and its end-points and expand them, adding vulnerabilities that they can weaponize against their adversaries -- leaving these vulnerabilities wide open for their adversaries to use in attacking *us*.

The NSA and GCHQ have weaponized flaws in router operating systems, rather than telling the vendors about these flaws, leaving the world's electronic infrastructure vulnerable to attack by the NSA and GCHQ's adversaries. Our spies hack core routers and their adversaries' infrastructure, but they have made themselves reliant upon the continuing fragility and insecurity of the architectures common to enemy and ally alike, when they could have been making us all more secure by figuring out how to harden them.

The mission of making it as hard as possible for the enemy to attack us is in irreconcilable tension with the mission of making it as easy as possible for our security services to attack their adversaries.

There isn't a Bad Guy Internet and a Good Guy Internet. There's no Bad Guy Operating System and Good Guy Operating System. When GCHQ discovers something breakable in a computer system that Iranians depend upon, they've also discovered a flaw in a system that Britons rely upon. GCHQ can't keep that gap in Iran's armor intact without leaving an equally large gap open in our own armor.

For my Sherlock story, I wanted to explore what it means to have a security methodology that was all attack and precious little defense, particularly one that proceeded in secret, without any accountability or even argument from people who thought you were doing it all wrong.


The Documents

Though I reviewed dozens of unpublished documents from the Snowden archive in writing my story, I relied upon three documents, two of which we are releasing today.

First, there's the crux of my Sherlock story, drawn from a March 2010 GCHQ document titled "What's the worst that could happen?" marked "TOP SECRET STRAP 1." This is a kind of checklist for spies who are seeking permission to infect their adversaries' computers or networks with malicious software.

It's a surprising document in many regards. The first thing that caught my eye about it is the quality of the prose. Most of the GCHQ documents I've reviewed read like they were written by management consultants, dry and anodyne in a way that makes even the famously tortured prose of the military seem juicy by comparison. The story the authors of those documents are telling themselves is called something like "Serious grownups, doing serious work, seriously."

"What's the worst..." reads like the transcript of a lecture by afascinating and seasoned mentor, someone who's seen all the pitfalls andwants to help you, their protege, navigate this tricky piece of theintel business without shooting yourself in the foot.

It even tells a kind of story: we have partners who help us with our malware implantation. Are they going to help us with that business in the future if their names get splashed all over the papers? Remember, there are clever people like you working for foreign governments -- they're going to try and catch us out! Imagine what might happen if one of our good friends got blamed for what we did -- or blamed us for it! Let's not forget the exploits themselves: our brilliant researchers quietly beaver away, finding the defects that the best and the brightest programmers at, say, Apple and Microsoft have left behind in their code: if you get caught, the companies will patch the vulnerabilities and we will lose the use of them forever.

On it goes in this vein, for three pages, until the very last point:

Who will have direct access to the data resulting from the operation and do we have any control over this? Could anyone take action on it without our agreement, eg could we be enabling the US to conduct a detention op which we would not consider permissible?

That's where the whole thing comes to something of a screeching halt. We're not talking about Tom Clancy net-wars fantasies anymore -- now we're into the realm of something that must haunt every man and woman of good will and integrity who works in the spy agencies: the possibility that a colleague or ally, operating without oversight or consequence, might descend into barbarism based on something you did.

Reading this, I thought of the Canadian officials who incorrectly told US authorities that Maher Arar, a Canadian citizen of Syrian origin, was suspected of being connected to Al Qaeda.

Arar was detained by the United States Immigration and Naturalization Service (INS) during a stopover in New York on his way home from a family vacation in Tunis. The Americans, acting on incomplete intelligence from the Royal Canadian Mounted Police (RCMP), deported Arar to Syria, a country he had not visited since his move to Canada, and which does not permit the renunciation of citizenship.

Arar claims he was tortured during his imprisonment, which lasted almost a year, and bombarded with questions from his torturers that seemed to originate with the US security services. Finally, the Syrian government decided that Arar was innocent of any terrorist connections and let him go home to Canada. The US authorities refused to participate in the hearings on the Arar affair, and the DHS has kept his family on the no-fly list.


Why did Syrian officials let him go? "Why shouldn't we leave him to go? We thought that would be a gesture of good will towards Canada, which is a friendly nation. For Syria, second, we could not substantiate any of the allegations against him," says Imad Moustapha, Syria's ambassador to the United States. He added that the Syrian government now considers Arar completely innocent.

Is this what the unnamed author of this good-natured GCHQ document meant by "a detention op which we would not consider permissible?" The Canadian intelligence services apparently told their US counterparts early on that they'd been mistaken about Arar, but when a service operates with impunity, in secret, it gets to steamroller on, without letting facts get in the way, refusing to acknowledge its errors.

The security services are a system with a powerful accelerator and inadequate brakes. They've rebranded terrorism as an existential risk to civilization (rather than a lurid type of crime). The War on Terror is a lock that opens all doors. As innumerable DEA agents have discovered, the hint that the drug-runner you're chasing may be funding terror is a talisman that clears away red tape, checks and balances, and oversight.

The story of terrorism is that it must be stopped at all costs, that there are no limits when it comes to the capture and punishment of terrorists. The story of people under suspicion of terrorism, therefore, is the story of people to whom no mercy is due, and of whom all cunning must be assumed.

Within the security apparatus, identification as a potential terrorist is a life sentence, a FAIR GAME sign taped to the back of your shirt, until you successfully negotiate a Kafkaesque thicket of secretive procedures and kangaroo courts. What story must the author of this document have been telling themselves when they wrote that final clause, thinking of someone telling himself the DIE HARD story, using GCHQ's data to assign someone FAIR GAME status for the rest of their life?

Holmes stories are perfectly suited to this kind of problem. From "A Scandal in Bohemia" to "A Study in Scarlet" to "The Man With the Twisted Lip," Holmes's clients often present at his doorstep wracked with guilt or anxiety about the consequences of their actions. As often as not, Holmes's solution to their problems involves not just unraveling the mystery, but presenting a clever way for the moral question to be resolved as well.

The next document is the "HIMR Data Mining Research Problem Book," a fascinating scholarly paper on the methods by which the massive data-streams from the deep fiber taps can be parsed out into identifiable, individual parcels, combining data from home computers, phones, and work computers.

It was written by researchers from the Heilbronn Institute for Mathematical Research in Bristol, a partnership between the UK's Government Communications Headquarters (GCHQ) and the University of Bristol. Staff spend half their time working on public research; the other half is given over to secret projects for the government.

The Problem Book is a foundational document in the Snowden archive, written in clear prose that makes few assumptions about the reader's existing knowledge. It likewise makes few ethical assertions about its work, striking a kind of academic posture in which something is good if it does some task efficiently, regardless of the task. It spells out the boundaries on what is and is not metadata without critical scrutiny, and dryly observes that "cyber" is a talisman -- reminiscent of "terrorist" -- that can be used to conjure up operating capital, even when all the other government agencies are having their budgets cut.

The UK government has recognized the critical importance of cyber to our strategic position: in the Comprehensive Spending Review of 2010, it allocated a significant amount of new money to cyber, at a time when almost everything else was cut. Much of this investment will be entrusted to GCHQ, and in return it is imperative for us to use that money for the UK's advantage.

Some of the problems in this book look at ways of leveraging GCHQ's passive SIGINT capabilities to give us a cyber edge, but researchers should always be on the look-out for opportunities to advance the cyber agenda.

The story the Problem Book tells is of scholars who've been tasked with a chewy problem: sieving usable intelligence out of the firehoses that GCHQ has arrogated to itself with its fiber-optic taps.

Somewhere in that data, they are told, must be signatures that uniquely identify terrorists. It's a Big Data problem, and the Problem Book, dating to 2010, is very much a creature of the first rush of Big Data hype.

For the researchers, the problem is that their adversaries are no longer identifiable by their national affiliation. The UK government can't keep on top of its enemies by identifying the bad countries and then spying on their officials, spies and military. Now the bad guys could be anyone. The nation-state problem was figuring out how to spy on your enemies. The new problem is figuring out which people to spy on.

"It is important to bear in mind that other states (..) are not bound by the same legal framework and ideas of necessity and proportionality that we impose on ourselves. Moreover, there are many other malicious actors in cyberspace, including criminals and hackers (sometimes motivated by ideology, sometimes just doing it for fun, and sometimes tied more or less closely to a nation state). We certainly cannot ignore these non-state actors".

The problem with this is that once you accept this framing, and note the happy coincidence that your paymasters just happen to have found a way to spy on everyone, the conclusion is obvious: just mine all of the data, from everyone to everyone, and use an algorithm to figure out who's guilty.

The bad guys have a Modus Operandi, as anyone who's watched a cop show knows. Find the MO, turn it into a data fingerprint, and you can just sort the firehose's output into terrorist-ish and unterrorist-ish.

Once you accept this premise, then it's equally obvious that the whole methodology has to be kept from scrutiny. If you're depending on three tells as indicators of terrorist planning, the terrorists will figure out how to plan their attacks without doing those three things.
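To make that brittleness concrete, here's a deliberately naive sketch of what matching an "MO fingerprint" against a stream of behavioral data could look like. The three "tells" are invented placeholders for illustration, not anything drawn from the GCHQ documents.

```python
# A toy "MO fingerprint" matcher. The tells are hypothetical examples,
# not indicators from any real program.
SUSPICIOUS_TELLS = {"bought_burner_phone", "paid_cash", "used_vpn"}

def flag_account(observed_events):
    """Flag an account whose observed behavior matches every tell."""
    return SUSPICIOUS_TELLS.issubset(observed_events)

# Matching all three tells trips the flag...
print(flag_account({"bought_burner_phone", "paid_cash", "used_vpn"}))  # True
# ...but skipping any one of them evades it entirely, while plenty of
# innocent people will match all three by coincidence.
print(flag_account({"bought_burner_phone", "paid_cash"}))              # False
```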

This even has a name: Goodhart's law. "When a measure becomes a target, it ceases to be a good measure." Google started out by gauging a web page's importance by counting the number of links it could find to it. This worked well before people knew that was what Google was doing. Once getting a page ranked by Google became important, unscrupulous people set up dummy sites (link-farms) with lots of links pointing at their pages.
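A toy version of that early link-counting scheme shows how quickly the measure collapses once it becomes a target; the domain names below are made up for illustration.

```python
from collections import Counter

def rank_by_inbound_links(links):
    """Naive ranking: a page's score is simply how many pages link to it."""
    return Counter(dst for _src, dst in links).most_common()

# An "organic" web: independent pages linking to something genuinely useful.
web = [("blogA", "useful.example"), ("blogB", "useful.example"),
       ("newsC", "useful.example")]
print(rank_by_inbound_links(web))      # useful.example leads, as intended

# Once the measure becomes a target, a link-farm games it for pennies.
web += [(f"farm{i}.example", "spam.example") for i in range(100)]
print(rank_by_inbound_links(web)[0])   # ('spam.example', 100) now tops the list
```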

The San Bernardino shootings re-opened the discussion on this problem. When small groups of people independently plan atrocities that don't require complicated or unusual steps to plan and set up, what kind of data massaging will surface them before it's too late?

Much of the paper deals with supervised machine learning, a significant area of research and dispute today. Machine learning is used in "predictive policing" systems to send cops to neighborhoods where crime is predicted to be ripening, allegedly without bias. In reality, of course, the training data for these systems comes from the human-directed activity of the police before the system was set up. If the police stop-and-frisk all the brown people they find in poor neighborhoods, then that's where they'll find most of the crime. Feed those arrest records to a supervised machine-learning algorithm, ask it where the crime will be, and it will send your officers back to the places where they're already focusing their efforts: in other words, "predictive policing" is great at predicting what the police will do, but has dubious utility in predicting crime itself.
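Here's a minimal simulation of that feedback loop, with made-up numbers. A simple arrest-count "predictor" stands in for the supervised learner, since either way the model only ever sees where arrests were recorded, not where crime actually happened.

```python
import random
from collections import Counter

random.seed(0)
neighborhoods = ["A", "B", "C", "D"]
true_crime_rate = dict.fromkeys(neighborhoods, 0.05)      # identical everywhere
patrol_share = {"A": 0.7, "B": 0.1, "C": 0.1, "D": 0.1}   # where police already go

# "Training data" = arrest records, which can only occur where officers patrol.
arrests = Counter()
for _ in range(10_000):
    n = random.choices(neighborhoods,
                       weights=[patrol_share[x] for x in neighborhoods])[0]
    if random.random() < true_crime_rate[n]:
        arrests[n] += 1

print(arrests)                       # A dominates despite identical crime rates
print(arrests.most_common(1)[0][0])  # the "prediction": send more patrols to A
```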

The part of the document I was most interested in was the section on reading and making sense of network graphs. They are the kind of thing you'd use in a PowerPoint slide when you want to represent an abstraction like "the Internet." Network graphs tell you a lot about the structures of organizations, about the relative power relationships between them. If the boss usually communicates to their top lieutenants after being contacted by a trusted advisor, then getting to that advisor is a great way to move the whole organization, whether you're a spy or a sales rep.

The ability of data-miners to walk the social and network graphs of their targets, to trace the "information cascades" (that is, to watch who takes orders from whom) and to spot anomalies in the network and zero in on them is an important piece of the debate on "going dark." If spies can look at who talks to whom, and when, and deduce organizational structure and upcoming actions, then the ability to read the content of messages -- which may be masked by cryptography -- is hardly the make-or-break for fighting their adversaries.
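As a rough illustration of how much structure leaks from metadata alone, here's a short sketch that infers a hierarchy from nothing but who-contacted-whom records. The records and names are invented, and the fan-out/fan-in heuristics are generic, not anything described in the Problem Book.

```python
from collections import Counter, defaultdict

# Hypothetical call-detail records: (caller, callee) pairs, no content at all.
metadata = [
    ("advisor", "boss"), ("boss", "lt1"), ("boss", "lt2"), ("boss", "lt3"),
    ("advisor", "boss"), ("boss", "lt1"), ("boss", "lt2"),
    ("lt1", "crew1"), ("lt1", "crew2"), ("lt2", "crew3"), ("lt3", "crew4"),
]

contacts = defaultdict(set)
for caller, callee in metadata:
    contacts[caller].add(callee)

# Crude proxy for "the boss": the node that fans out to the most distinct people.
hub = max(contacts, key=lambda node: len(contacts[node]))

# Crude proxy for "the trusted advisor": whoever most often contacts the hub.
upstream = Counter(caller for caller, callee in metadata if callee == hub)

print(hub)                      # 'boss'
print(upstream.most_common(1))  # [('advisor', 2)]
```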

This is crucial to the debate on surveillance. In the 1990s, there was a seminal debate over whether to prohibit civilian access to working cryptography, a debate that was won decisively for the side of unfettered access to privacy tools. Today, that debate has been renewed. David Cameron was re-elected to the UK Prime Minister's office after promising to ban strong crypto, and the UK government has just introduced a proposed cryptographic standard designed to be broken by spies.

The rubric for these measures is that spies have lost the ability to listen in on their targets, and with it, their ability to thwart attacks. But as the Problem Book demonstrates, a spy's-eye view on the Internet affords enormous insight into the activities of whole populations -- including high-value terrorism suspects.

The Problem Book sets up the Mycroftian counterpoint to Sherlock's human intelligence -- human and humane, focused on the particulars of each person in his stories.

Sherlock describes Mycroft as an all-knowing savant:

The conclusions of every department are passed to him, and he is the central exchange, the clearinghouse, which makes out the balance. All other men are specialists, but his specialism is omniscience.

While Sherlock is energized by his intellectual curiosity, his final actions are governed by moral consequences and empathy. Mycroft functions with the moral vacuum of software: tell him to identify anomalies and he'll do it, regardless of why he's been asked or what happens next. Mycroft is a Big Data algorithm in human form.

The final document I relied upon in the story is one we won't be publishing today: an intercepted transcript of a jihadi chat room. This document isn't being released because there were many people in that chat room, having what they thought was an off-the-record conversation with their friends. Though some of them were espousing extreme ideology, mostly they were doing exactly what my friends and I did when I was a teenager: mouthing off, talking about our love lives, telling dirty jokes, talking big.

These kids were funny, rude, silly, and sweet -- they were lovelorn and fighting with their parents. I went to school with kids like these. I was one of them. If you were to judge me and my friends based on conversations like these, it would be difficult to tell us apart from these children. We all talked a big game, we all fretted about military adventurism, we all cursed the generals who decided that civilian losses are acceptable in the pursuit of their personal goals. I still curse those generals, for whatever it's worth.

I read reams of these chat transcripts and I am mystified at their value to national security. These children hold some foolish beliefs, but they're not engaged in anything more sinister than big talk and trash talk.

Most people -- including most people like these kids -- are not terrorists. You can tell, because we're not all dead. An indiscriminate surveillance dragnet will harvest far more big talkers than bad guys. Mass surveillance is a recipe for creating an endless stream of Arars, and each Arar serves as inspiration for more junior jihadis.

In my fiction, I've always tried to link together real-world subjects of social and technological interest with storytelling that tries to get at the way the coming changes will make us feel. Many readers have accused me of predicting the future because I've written stories about mass surveillance and whistleblowers.

But the truth is that before Snowden, there was Wikileaks and Chelsea Manning, and Bill Binney and Thomas Drake before them, and Mark Klein before them. Mass surveillance has been an open secret since the first GW Bush administration, and informed speculation about where it was going was more a matter of paying attention to the newspaper than peering into a crystal ball.

Writing a Sherlock Holmes story from unpublished leaks was a novel experience, though, one that tied together my activist, journalist and fiction writing practices in a way that was both challenging and invigorating. In some ways, it represented a constraint, because once I had the nitty-gritty details of surveillance to hand, I couldn't make up new ones to suit the story. But it was also tremendous freedom, because the mass surveillance regimes of the NSA and GCHQ are so obviously ill-considered and prone to disastrous error that the story practically writes itself.

I worry about "cybersecurity," I really do. I know that kids can do crazy things. But in the absence of accountability and independent scrutiny, the security services have turned cyberspace into a battleground where they lob weapons at one another over our heads, and we don't get a say in the matter. Long after this round of the war on terror is behind us, we'll still be contending with increasingly small computers woven into our lives in increasingly intimate, life-or-death ways. The parochial needs of spies and the corporations that supply them mustn't trump the need for a resilient electronic nervous system for the twenty-first century.

Astro Noise: A Survival Guide for Living Under Total Surveillance, edited by Laura Poitras, features my story "Sherlock Holmes and the Adventure of the Extraordinary Rendition," as well as contributions from Dave Eggers, Ai Weiwei, former Guantanamo Bay detainee Lakhdar Boumediene, Kate Crawford, and Edward Snowden.

The Astro Noise exhibition is on at New York City's Whitney Museum from February 5 to May 1, 2016.

Henrik Moltke contributed research to this story.

