Privacy

Is the US Ready for the Next Cyber Terror Attack?

Ian Blodger, MJLST Staff Member

The US’s military intervention against ISIL carries with it a high risk of cyber-terror attacks. The FBI has reported that ISIL and other terrorist organizations may turn to cyber attacks against the US in response to that military engagement. While no specific targets have been confirmed, likely attacks could range from website defacement to denial of service attacks. Luckily, recent cyber terror attacks attempting to destabilize the US power grid failed, but next time we may not be so lucky. Susan Brenner’s recent article, Cyber-threats and the Limits of Bureaucratic Control, published in volume 14, issue 1 of the Minnesota Journal of Law, Science & Technology, describes the structural reasons for the US’s vulnerability to cyber attacks and offers one possible solution to the problem.

Brenner argues that traditional methods of investigation do not work well when it comes to cyber attacks. This ineffectiveness results from the obscured origin and often hidden underlying purpose of an attack, both of which are crucial in determining whether a law enforcement or military response is appropriate. That uncertainty, in turn, makes it difficult to decide which agency should control the investigation and response. A nation’s security from external attackers depends, in part, on its ability to present an effective deterrent to would-be attackers. In the case of cyber attacks, however, the US’s confusion over which agency should respond often precludes an efficient response.

Brenner argues that these problems are not transitory, but will increase in direct proportion to our reliance on complex technology. The current steps taken by the US are unlikely to solve the issue, since they do not address the underlying problem and instead continue to approach cyber terrorists as conventional attackers. Concluding that top-down command structures are unable to respond effectively to the threat of cyber attacks, Brenner suggests a return to a more primitive mode of defense. Rather than trusting the government to ensure the safety of the populace, Brenner suggests citizens should work with the government to ensure their own safety. This decentralized approach, modeled on British town defenses after the fall of the Roman Empire, may avoid the pitfalls of the bureaucratic approach to cyber security.

There are some issues with this proposed model for cyber security, however. Small British towns during the early middle ages may have been able to ward off attackers through an active citizen-based defense, but the anonymity of the internet makes this approach challenging when applied to a digitized battlefield. Small British towns could easily identify threats because they knew who lived in the area. The internet, as Brenner concedes, makes it difficult to determine to whom any given person owes allegiance. Presumably, Brenner theorizes that individuals would simply respond to attacks on their own information, or enlist the help of others to fend off attacks. However, the anonymity of the internet would make organizing a collective defense chaotic. For example, an ISIL cyber terrorist could likely provoke a collective US citizen response against a passive target by falsely claiming to have been attacked by it. Likewise, groups launching pre-emptive attacks against cyber terrorist organizations could be disrupted by other US groups that do not recognize the pre-emptive cyber strike as a defensive measure. The analogy between the defenses of a primitive British town and the internet, in short, is incomplete.

Brenner may argue that her alternative simply calls for individuals, corporations, and groups to build up their own defenses and protect themselves from impending cyber threats. While this approach would avoid the problems inherent in a bureaucratic approach, it ignores the fact that these groups are unable to protect themselves even now. Shifting these groups’ understanding of their responsibility for self-defense may spur innovation and increase investment in cyber protection, but that will likely be insufficient to stop a determined cyber attack. Large corporations like Apple, JPMorgan, Target, and others often hemorrhage confidential information as a result of cyber attacks, even though they have large financial incentives to protect that information. This suggests that an individualized approach to cyber protection would also likely fail.

With the threat of ISIL increasing, it is time for the United States to take additional steps to reduce the threat of a cyber terror attack. At this initial stage, the inefficiencies of bureaucratic action will result in a delayed response to large-scale cyber terror attacks. While allowing private citizens to band together for their own protection may have some advantages over government inefficiency, this too likely would not solve all cyber security problems.


Apple’s Bark Is Worse Than Its Bite

Jessica Ford, MJLST Staff

Apple’s iPhone tends to garner a great deal of excitement from its aficionados for its streamlined design and much resentment from users craving customization on their devices. Apple’s newest smartphone model, the iPhone 6, is no exception. At Apple’s September 9, 2014 iPhone 6 unveiling, Apple announced that the new iOS 8 operating system encrypts emails, photos, and contacts when a user assigns a passcode to the phone. Apple is unable to bypass a user’s passcode under the new operating system and accordingly cannot comply with government warrants demanding physical data extraction from iOS 8 devices.

The director of the FBI, James Comey, has already voiced concerns that this lack of access to iOS 8 devices could prevent the government from gathering information on a terror attack or child kidnappings.

Comey is not the only one to criticize Apple’s apparent attempt to sidestep legal court orders and warrants. Orin Kerr, a criminal procedure and computer crime law professor at The George Washington University Law School, worries that this could essentially nullify the Supreme Court’s holding in Riley v. California this year, which requires the police to have a warrant before searching the contents of an arrested individual’s cell phone.

However, phone calls and text messages are not encrypted, and law enforcement can gain access to that data by serving a warrant on wireless carriers. Law enforcement can also tap and monitor cellphones by going through the same process. Any data backed up to iCloud, including iMessages and photos, can be accessed under a warrant. The only data that law enforcement would not be able to access without a passcode is data that has not yet been backed up to iCloud and remains only on the device.

While security agencies argue otherwise, iOS 8 seems far from rendering Riley’s warrant requirement useless. Law enforcement still has several viable options for gaining information with a warrant. Furthermore, the Supreme Court has already made clear that the public’s interest in solving or preventing crimes does not outweigh the public’s interest in the privacy of phone data, even when there is a chance that the data on a cell phone at issue will be encrypted once the passcode locks the phone:

“[I]n situations in which . . . an officer discovers an unlocked phone, it is not clear that the ability to conduct a warrantless search would make much of a difference. The need to effect the arrest, secure the scene, and tend to other pressing matters means that law enforcement officers may well not be able to turn their attention to a cell phone right away . . . . If ‘the police are truly confronted with a ‘now or never’ situation,’ . . . they may be able to rely on exigent circumstances to search the phone immediately . . . . Or, if officers happen to seize a phone in an unlocked state, they may be able to disable a phone’s automatic-lock feature in order to prevent the phone from locking and encrypting data . . . . Such a preventive measure could be analyzed under the principles set forth in our decision in McArthur, 531 U.S. 326, 121 S.Ct. 946, which approved officers’ reasonable steps to secure a scene to preserve evidence while they awaited a warrant.” (citations omitted) Riley v. California, 134 S. Ct. 2473, 2487-88 (2014).

With all the legal recourse that remains open, it appears somewhat hasty for the paragon-of-virtue FBI to be crying “big bad wolf.”


E-Discovery Costs: Quick Peek and Clawback

Joe McCartin, Managing Editor

E-Discovery costs can be quite prohibitive. The problem was detailed by David Degnan in Volume 12, Issue 1 of the Minnesota Journal of Law, Science, and Technology. In his article, Accounting for the Costs of Electronic Discovery, Degnan discussed four methods for controlling costs – sampling, gap testing, crawl systems, and cooperation. Recently, FDIC litigation against former directors of failed banks has created a new trend in E-Discovery cost containment – the quick peek and clawback. However, this new cost-control mechanism may not control costs at all; it merely shifts a significant amount of cost onto the requesting party, upending traditional discovery procedures.

In FDIC v. Hayden, et al. and FDIC v. Copenhaver, et al., the court required the party requesting electronically stored information (ESI) to submit search terms to the FDIC, which would then produce all documents relevant to those terms in a Relativity database. The requesting party would then have access to all hosted documents, but would be responsible for conducting the initial document review itself. After the requesting party conducted a “quick peek” and selected relevant documents, the FDIC would have the opportunity to “clawback” any privileged documents. The FDIC would not have to review any documents not selected by the requesting party.

It is entirely appropriate for courts to shift costs to a requesting party at times. Zubulake v. UBS Warburg LLC detailed a number of factors that can warrant cost shifting from the producing to the requesting party, and in FDIC v. Hayden, et al. the court engaged in an extensive analysis of the Zubulake factors. However, courts need to bear in mind that review is not just a portion of the production cost; it is the overwhelming bulk of it, and it should not be shifted between parties without compelling reasons. Degnan showed in his article that the primary cost associated with e-discovery comes from review, which accounts for roughly 58% of the total. Even in the presence of a number of compelling Zubulake factors, courts should make an attempt to split, not just shift, the cost of review.

While some requesting parties have found the arrangement to their liking, courts have also foisted it on others. Notably, this practice doesn’t reduce the overall amount of review; it merely shifts the cost of initial review from the producing to the requesting party. Requesting parties need to be aware of the potential costs they will bear under this arrangement. If they want to avoid the imposition of quick peek and clawback by courts, they should follow the guidance of Degnan and the Sedona Conference and cooperate extensively with the opposing party in crafting an acceptable discovery process. Failure to work on a discovery plan cooperatively leaves the requesting party more vulnerable to having a plan foisted upon it, one that may shift the bulk of the costs its way.


Drones Raise Fourth Amendment Issues

Alex Vlisides, Symposium Editor

Law professors love to tweak hypotheticals until students become uncomfortable with the result. It is the classic law school trap. As soon as you agree a premeditated, unprovoked killing is never justified, you are swept away to a desperate life raft in which the only way for the innocents to survive is for one of them to be thrown overboard. This is how we test which of the competing values will break first. And how law professors entertain themselves.

The developments in drone and camera technology are bringing Fourth Amendment privacy rules, particularly the public observation doctrine, to their breaking point. Public observation is the idea that, generally, what one exposes to the public may be observed or even recorded without violating privacy. But fundamental changes in what can be observed alter this balance. The development of technologies that sound made up for law school hypotheticals will challenge constitutional doctrine. Surveillance technology capable of tracking the movements of every individual across an area of several square miles. Drones which can stay stationed in the air for years at a time. Cameras capable of surveilling private land and spaces from so high above that they are effectively invisible. These technologies exist, and each challenges the notion that observation in and from public spaces does not violate privacy.

These technologies are not exactly new: both aerial craft and surveillance technologies have improved steadily for decades. What is new is that the rapid development of the last decade has brought the doctrine near to the breaking point, the point that law professors love. The point at which the designed rule, the sturdy absolute, cracks under changing facts. The point at which we have to decide which principle gives: the general autonomy to observe and record in public spaces, or the right to privacy. The public observation doctrine was developed to navigate this balance. The challenge for courts, and perhaps law students, is that the breaking point approaching in Fourth Amendment law is no longer hypothetical.



Target Data Security Breach: It’s Lawsuit Time!

by Jenny Warfield, UMN Law Student, MJLST Staff

On December 19, 2013, Target announced that it had fallen victim to the second-largest security breach in US retail history. While initial reports showed the hack compromised only the credit and debit card information (including PIN numbers and CVV codes) of 40 million customers, more recent findings revealed that the names, phone numbers, mailing addresses, and email addresses of 70 million shoppers between November 27 and December 15 had also been stolen.

As history has proved time and again, massive data security breaches lead to lawsuits. When Heartland Payment Systems (a payment card processing service for small and mid-sized businesses) had its information on 130 million credit and debit card holders exposed in a 2009 cyber-attack, it faced lawsuits by banks and credit card companies for the costs of replacing cards, extending branch hours, and refunding consumers for fraudulent transactions. These lawsuits have so far cost the company $140 million in settlements (with litigation ongoing). Similarly, when the TJX Companies (parent of T.J. Maxx) had its accounts hacked in 2007, the breach cost the company $256 million in settlements.

Target currently faces at least 15 lawsuits in state and federal court seeking class action status, as well as several other lawsuits by individuals across the country. Common claims among the plaintiffs are that 1) Target failed to properly secure customer data (more specifically, that Target did not abide by the Payment Card Industry Security Standards Council’s Data Security Standards, “PCI DSS”); 2) Target failed to promptly notify customers of the security breach in violation of state notification statutes, preventing customers from taking steps to protect against fraud; 3) Target violated the federal Stored Communications Act; and 4) Target breached its implied contracts with its customers.

A quick review of past data breach cases reveals that these plaintiffs face an uphill battle, especially in the class-action context. While financial institutions and credit card companies can point to pecuniary damages in the form of costs associated with card replacements and customer refunds for fraudulent transactions (as in the TJX and Heartland cases), the damages suffered by individual plaintiffs in these cases are usually speculative. Not only are customers almost always refunded for transactions they did not make, but it is also unclear how to value the loss of information like home addresses and phone numbers in the absence of evidence that such information has been used to the customer’s detriment. As a result, almost all of the class action suits brought against companies over cyber-attacks have failed.

However, the causes of the cyber-attack on Target are still unclear, and it may be too early to speculate on Target’s liability. Target is currently being investigated by the DOJ (and potentially the FTC) for its role in the data breach while also conducting its own investigation in partnership with the U.S. Secret Service. In any event, affected customers should take advantage of Target’s year-long free credit monitoring while waiting for more facts to unfold.


Can I Keep It Private? Privacy Laws in Various Contexts

by Ude Lu, UMN Law Student, MJLST Articles Editor

Target Corp., the second-largest retailer in the nation, announced to its customers on Dec. 20, 2013, that its payment card data had been breached. About 40 million customers who shopped at Target between Nov. 27 and Dec. 15, 2013, using credit or debit cards are affected. The stolen information includes customers’ names, credit or debit card numbers, and the cards’ expiration dates. [Update: The breach may have affected over 100 million customers, and additional kinds of information may have been disclosed.]

This data breach stirred public discussion about data security and privacy protections. Federal Trade Commission (FTC) Commissioner Maureen Ohlhausen said on Jan. 6, during a Twitter chat, that the event highlights the need for consumer and business education on data security.

In the US, the FTC’s privacy protection enforcement runs on a “broken promise” framework: the FTC will enforce privacy protection according to what a business entity promised its customers. Privacy laws have taken on increasing importance in the information age.

Readers of this blog are encouraged to explore the following four articles published in MJLST, discussing privacy laws in various contexts:

  1. Constitutionalizing E-mail Privacy by Informational Access, by Manish Kumar. This article highlights the legal analyses of email privacy under the Fourth Amendment.
  2. It’s the Autonomy, Stupid: Political Data-Mining and Voter Privacy in the Information Age, by Chris Evans. This article explores the unique threats to privacy protection posed by political data-mining.
  3. Privacy and Public Health in the Information Age: Electronic Health Records and the Minnesota Health Records Act, by Kari Bomash. This article examines the adequacy of the Minnesota Health Records Act (MHRA) that the state passed to meet then-Governor Pawlenty’s 2015 mandate requiring every health care provider in Minnesota to have electronic health records.
  4. An End to Privacy Theater: Exposing and Discouraging Corporate Disclosure of User Data to the Government, by Christopher Soghoian. This article explores how businesses vary in disclosing privacy information of their clients to governmental agencies.


Supreme Court Denies Request to Review FISC Order

by Erin Fleury, UMN Law Student, MJLST Staff

Last week, the Supreme Court denied a petition requesting a writ of mandamus to review a decision that ordered Verizon to turn over domestic phone records to the National Security Agency (“NSA”) (denial available here). The petition alleged that the Foreign Intelligence Surveillance Court (“FISC”) exceeded its authority because the production of these types of records was not “relevant to an authorized investigation . . . to obtain foreign intelligence information not concerning a United States person.” 50 U.S.C. § 1861(b)(2)(A).

The Justice Department filed a brief with the Court that challenged the standing of a third party to request a writ of mandamus from the Supreme Court for a FISC decision. The concern, however, is that telecommunication companies do not adequately fight to protect their users’ privacy interests. This apprehension certainly seems justified considering that no telecom provider has yet challenged the legality of an order to produce user data. Any motivation to fight these orders for data is further reduced by the fact that telecommunication companies can obtain statutory immunity from lawsuits by their customers based on turning over data to the NSA. 50 U.S.C. § 1885a. If third parties cannot ask a higher court to review a decision made by the FISC, then the users whose information is being given to the NSA may have their rights limited without any recourse short of legislative overhaul.

Unfortunately, as with most refusals to hear a case, the Supreme Court did not provide its reasoning for denying the request. The question remains, though: if the end users cannot object to these orders (and may not even be aware that their data was turned over in the first place), and the telecommunication companies have no reason to, is the system adequately protecting the privacy interests of individual citizens? Or can the FISC operate with impunity as long as the telecom carriers do not object?


Censorship Remains Viable in China – but for How Long?

by Greg Singer, UMN Law Student, MJLST Managing Editor

In the west, perhaps no right is held in higher regard than the freedom of speech. It is almost universally agreed that a person has the inherent right to speak their mind as he or she pleases, without fear of censorship or reprisal by the state. Yet for the more than 1.3 billion people currently residing in what is one of the oldest civilizations on the planet, such a concept is either unknown or wholly unreflective of the reality they live in.

Despite the exploding number of internet users in China (from 200 million users in 2007 to over 530 million by the end of the first half of 2012, more than the entire population of North America), the Chinese government has remained remarkably effective at banishing almost all traces of dissenting thought from the wires. A recent New York Times article detailing the fabulous wealth of Chinese Premier Wen Jiabao and his family members (at least $2.7 billion) resulted in the almost immediate censorship of the newspaper’s English and Chinese web presence in China. Not stopping there, the censorship apparatus went on to scrub almost all links, reproductions, and blog posts based on the article, leaving little trace of its existence for the average Chinese citizen. Earlier this year, Bloomberg News suffered a similar fate, as it too published an unacceptable report regarding the unusual wealth of Xi Jinping, the Chinese Vice President and expected successor of current President Hu Jintao.

In “Forbidden City Enclosed by the Great Firewall: The Law and Power of Internet Filtering in China,” published in the Winter 2012 issue of the Minnesota Journal of Law, Science & Technology, Jyh-An Lee and Ching-Yi Liu explain that it is not mere tenacity that permits such effective censorship – the structure of the Chinese internet itself has been designed to allow the centralized authority to control and filter the flow of all communications over the network. Even with the decentralizing force of content creation on the web, it appears as though censorship will remain technically possible in China for the foreseeable future.

Yet still, technical capability is not synonymous with political permissibility. A powerful middle class is emerging in the country, with particular strength in the large urban areas, where ideas and sentiments are prone to spread quickly, even in the face of government censorship. At the same time, GDP growth is steadily declining from its tremendous peak in the mid-2000s. These two factors may combine to produce a population that has the time, education, and wherewithal to challenge a status quo that will perhaps look somewhat less like marvelous prosperity in the coming years. If China wishes to enter the developed world as a peer to the west (with an economy based on skilled and educated individuals, rather than mass labor), addressing its ongoing civil rights issues seems like an almost unavoidable prerequisite.


Political Data-Mining and Election 2012

by Chris Evans, UMN Law Student, MJLST Managing Editor

In “It’s the Autonomy, Stupid: Political Data-Mining and Voter Privacy in the Information Age,” I wrote about the compilation and aggregation of voter data by political campaigns and how data-mining can upset the balance of power between voters and politicians. The Democratic and Republican data operations have evolved rapidly and quietly since my Note went to press, so I’d like to point out a couple of recent articles on data-mining in the 2012 campaign.

In August, the AP ran this exclusive: “Romney uses secretive data-mining.” Romney has hired an analytics firm, Buxton Co., to help his fundraising by identifying untapped wealthy donors. The AP reports:

“The effort by Romney appears to be the first example of a political campaign using such extensive data analysis. President Barack Obama’s re-election campaign has long been known as data-savvy, but Romney’s project appears to take a page from the Fortune 500 business world and dig deeper into available consumer data.”

I’m not sure it’s true Buxton is digging any deeper than the Democrats’ Catalist or Obama’s fundraising operation. Campaigns from both parties have been scouring consumer data for years. As for labeling Romney’s operation “secretive,” the Obama campaign wouldn’t even comment on its fundraising practices for the article, which strikes me as equally if not more secretive. Political data-mining has always been nonpartisanly covert; that’s part of the problem. When voters don’t know they’re being monitored by campaigns, they are at a disadvantage to candidates. (And when they do know they’re being monitored, they may alter their behavior.) This is why I argued in my Note for greater transparency of data-mining practices by candidates.

A more positive spin on political data-mining appeared last week, also by way of the AP: “Voter registration drives using data mining to target their efforts, avoid restrictive laws.” Better, cheaper technology and Republican efforts to restrict voting around the country are inducing interest groups to change how they register voters, swapping their clipboards for motherboards. This is the bright side of political data-mining: being able to identify non-voters, speak to them on the issues they care about, and bring them into the political process.

The amount of personal voter data available to campaigns this fall is remarkable, and the ways data-miners aggregate and sort that data are fascinating. Individuals ought to be let in on the process, though, so they know which candidates and groups are collecting which types of personal information, and so they can opt out of the data-mining.