Privacy

Should You Worry That ISPs Can Sell Your Browsing Data?

Joshua Wold, Article Editor

Congress recently voted, through the Congressional Review Act, to overturn the FCC’s October 2016 rules, “Protecting the Privacy of Customers of Broadband and Other Telecommunications Services.” As a result, those rules will likely never go into effect. Had they been implemented, the rules would have required Internet Service Providers (ISPs) to get customer permission before making certain uses of customer data.

Some commentators, looking at the scope of the rules relative to the internet ecosystem as a whole, and at the fact that the rules had not yet taken effect, thought the repeal would probably not have a huge impact on privacy. Orin Kerr suggested that overturning the privacy regulations was unlikely to change what ISPs do with data, because other laws constrain them. Others, however, were less sanguine. The Verge quoted Jeff Chester of the Center for Digital Democracy as saying, “For the foreseeable future, we’re going to be living in a commercial surveillance state.”

While the specific context of these privacy regulations is new (the FCC couldn’t regulate ISPs until 2015, when it defined them as telecommunications providers instead of information services), debates over privacy are not. In 2013, MJLST published Adam Thierer’s Technopanics, Threat Inflation, and the Danger of an Information Technology Precautionary Principle. In it, the author argues that privacy threats (as well as many other threats from technological advancement) are generally exaggerated. Thierer then lays out a four-part analytic framework for weighing regulation, calling on regulators and politicians to identify clear harms, engage in cost-benefit analysis, consider more permissive regulation, and then evaluate and measure the outcomes of their choices.

Given Minnesota’s response to Congress’s action, the debate over privacy and regulation of ISPs is unlikely to end soon. Other states may consider similar restrictions, or future political changes could swing the pendulum back toward regulation. Or the current movement toward less privacy regulation could continue. In any event, Thierer’s piece, and particularly his framework, may be useful to those wishing to evaluate regulatory policy as ISP regulation progresses.

For a different perspective on ISP regulation, see Paul Gaus’s student note, upcoming in Volume 19, Issue 1. That article will focus on presenting several arguments in favor of regulating ISPs’ privacy practices, and will be a thoughtful contribution to the discussion about privacy in today’s internet.


Broadening the Ethical Concerns of Unauthorized Copyright and Rights of Publicity Usage: Do We Need More Acronyms?

Travis Waller, MJLST Managing Editor

In 2013, Prof. Michael Murray of Valparaiso University School of Law published an article with MJLST entitled “DIOS MIO—The KISS Principle of the Ethical Approach to Copyright and Right of Publicity Law.” (For those of you unfamiliar with the acronyms, as I was prior to reviewing this article, DIOS MIO stands for “Don’t Include Other’s Stuff or Modify It Obviously,” just as KISS stands for “Keep It Simple, Stupid.”) The article explored an ethical approach to using copyrighted material or celebrity likeness that has developed over the last decade as several court cases merged certain qualities of the two regimes.

The general principle embodied here is that current case law tends to allow transformative uses of either a celebrity’s likeness or a copyrighted work – that is, a use of the image or work in a way that essentially provides a new or “transformative” take on the original. At the other extreme, the law generally allows individuals to use a celebrity’s likeness if the usage is not similar enough to the actual celebrity to be identifiable, or a copyrighted work if the element used is scènes à faire or a de minimis usage. Ergo, prudent advice to a would-be user of such material may, theoretically, be summed up as “seek first to create and not to copy or exploit, and create new expression by obvious modification of the old expression and content,” or DIOS MIO/KISS for the acronym-savvy.

The reason I revisit this issue is not to advocate for this framework, but rather to illustrate what unusual bedfellows the regimes of copyright and “rights of publicity” make. As a matter of policy, in the United States, copyright is a federal regime dedicated to the utilitarian goal of “[p]romot[ing] the progress of science,” while rights of publicity are state-level protections with roots going back to Warren and Brandeis’s Victorian-era publication “The Right to Privacy” (and perhaps even further back). That is to say, the “right of publicity” is not typically thought of as a strictly utilitarian regime at all, but rather as one dedicated either to the protection of an individual’s economic interests in their likeness (a labor argument) or to the protection of that individual’s privacy (a privacy tort argument).

My point is this: if, in theory, copyright is meant to “promote science,” while the right of publicity is intended either to protect an individual’s privacy or their right to profit from their own image, is it appropriate to consider both regimes under the age-old lens of “thou shalt not appropriate”? I tend to think not.

Perhaps a more nuanced resolution to the ethical quandary would be for a would-be user of an image or work to consider the purpose of each regime and ask whether the use would offend the policy goals enshrined therein. That is, to determine, for copyright, whether the use of a work will add to the collective library of human understanding and progress; and, for publicity rights, whether the use of a celebrity’s likeness will infringe upon that individual’s right to privacy or unjustly deprive the individual of the ability to profit from their own well-cultivated image.

Or maybe just ask permission.


Confusion Continues After Spokeo

Paul Gaus, MJLST Staffer

Many observers hoped the Supreme Court’s decision in Spokeo v. Robins would clarify whether plaintiffs can establish Article III standing for claims based on future harm from data breaches. John Biglow explored the issue prior to the Supreme Court’s decision in his note It Stands to Reason: An Argument for Article III Standing Based on the Threat of Future Harm in Data Breach Litigation. Those optimistic that the Supreme Court would expand access for individuals seeking to litigate their privacy interests were disappointed.

Spokeo is a people search engine that generates publicly accessible online profiles on individuals (it has also been the subject of previous FTC data privacy enforcement actions). The plaintiff claimed Spokeo disseminated a false report on him, hampering his ability to find employment. Although the Ninth Circuit held the plaintiff suffered “concrete” and “particularized” harm, the Supreme Court disagreed, concluding that the Ninth Circuit’s analysis addressed only the particularization requirement. The Supreme Court remanded the matter to the Ninth Circuit, casting doubt on whether the plaintiff suffered concrete harm. Spokeo allegedly violated the Fair Credit Reporting Act, but the Supreme Court characterized the false report as a bare procedural harm, insufficient for Article III standing.

Already, the Circuits are split on how Spokeo affects consumer data protection lawsuits. In Braitberg v. Charter Communications, the Eighth Circuit held that a cable company’s failure to destroy a former customer’s personally identifiable information was a bare procedural harm akin to that in Spokeo, despite the defendant’s clear violation of the Cable Act. By contrast, in Church v. Accretive Health, the Eleventh Circuit held that a plaintiff did have standing when she failed to receive disclosures about her debt from her creditor under the Fair Debt Collection Practices Act.

Many observers consider Spokeo an adverse result for consumers seeking to litigate their privacy interests. By punting on the issue, the Supreme Court allowed the Circuits’ divergent application of Article III standing in class action privacy suits to continue.


Did the Warriors Commit a Flagrant Privacy Foul?

Paul Gaus, MJLST Staffer

Fans of the National Basketball Association (NBA) know the Golden State Warriors for the team’s offensive exploits on the hardwood. The Warriors boast the NBA’s top offense at nearly 120 points per game. However, earlier this year, events in a different type of court prompted the Warriors to play some defense. On August 29, 2016, a class action suit filed in the Northern District of California alleged that the Warriors, along with co-defendants Sonic Notify Inc. and Yinzcam, Inc., violated the Electronic Communications Privacy Act (18 U.S.C. §§ 2510 et seq.).

Satchell v. Sonic Notify, Inc., focuses on the team’s mobile app. The Warriors partnered with the two co-defendants to create an app based on beacon technology. The problem, as put forth in the complaint, is that the beacon technology the co-defendants employed used the microphone embedded in the plaintiff’s smartphone to listen for nearby beacons. The complaint alleges this enabled the Warriors to access the plaintiff’s conversations without her consent.

Beacon technology is heralded in the business world as a revolutionary way to connect consumers with the products they seek; retailers, major sports organizations, and airlines regularly use beacons to reach consumers. Traditional beacon technology, however, is based on Bluetooth. According to the InfoSec Institute, beacons broadcast Bluetooth signals, and mobile apps gather data on the basis of the signals they receive, enabling targeted advertising on smartphones.

The complaint in Satchell, however, maintains the defendants relied on a different kind of beacon technology: audio beacons. In contrast to Bluetooth beacons, audio beacons rely on sound, and to function they must continuously listen for audio signals through the smartphone user’s microphone. The complaint therefore alleges that the Warriors app permitted the co-defendants to listen to the plaintiff’s private conversations on her smartphone, violating her reasonable expectation of privacy.
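The complaint’s description can be made concrete with a toy example. The sketch below is purely illustrative Python and assumes nothing about the defendants’ actual code, which is not public: an app listening for audio beacons might run a narrowband tone detector, such as the classic Goertzel algorithm, over each block of microphone samples to check for near-ultrasonic beacon frequencies.

```python
import math

def goertzel_power(samples, sample_rate, target_freq):
    """Power of target_freq in samples, via the Goertzel algorithm."""
    n = len(samples)
    k = int(0.5 + n * target_freq / sample_rate)  # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

def detect_beacons(samples, sample_rate, candidate_freqs, threshold):
    """Return the candidate beacon tones whose power exceeds threshold."""
    return [f for f in candidate_freqs
            if goertzel_power(samples, sample_rate, f) > threshold]

# One second of a synthetic 18 kHz "beacon" tone (near-ultrasonic,
# inaudible to most adults), standing in for a microphone capture.
RATE = 44100
capture = [math.sin(2 * math.pi * 18000 * t / RATE) for t in range(RATE)]
print(detect_beacons(capture, RATE, [17500, 18000, 18500], threshold=1e6))
```

The point for privacy purposes is that a detector like this must continuously examine raw microphone samples, so any speech in the same capture necessarily passes through the app, which is the crux of the objection raised in the complaint.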

While the Warriors continue to rack up wins on the court, Satchell has yet to tip off. As of December 5, 2016, the matter remains in the summary judgment phase.


The GIF That Keeps on Giving: The Problem of Dealing with Incidental Findings in Genetic Research

Angela Fralish, MJLST Invited Blogger

The ability to sequence a whole genome offers a tremendous opportunity to improve medical care in modern society. We can now prepare for, and may soon be able to circumvent, genes carrying risks such as Alzheimer’s, breast cancer, and embryonic abnormalities. These advancements hold great promise and suggest many new ways of looking at relationships in human subject research.

A 2008 National Institutes of Health article, The Law of Incidental Findings in Human Subjects Research, discussed how modern technology has outpaced the capacity of human subject researchers to receive and interpret data responsibly. Disclosure of incidental findings, “data [results] gleaned from medical procedures or laboratory tests that were beyond the aims or goals of the particular laboratory test or medical procedure,” is particularly challenging with new genetic testing. Non-paternity, for example, which has been found in up to 30% of participants in some studies, forces researchers to decide how to tell participants that they are not biologically related to their parent or child. Such a finding could not only impact inheritance, custody, and adoption rights, but can also cause lifelong emotional harm. Modern researchers must be equipped to handle many new psychosocial and emotional variables. So where should a researcher look to determine the proper way to manage these “incidentalomas”?

The perspectives, expectations, and interests dictating policies on incidental finding management are diverse and inconsistent. Some researchers advocate an absolute ban on disclosing findings of non-paternity because of the potential harm. Others argue that not revealing misattributed paternity leaves participants with a lifetime of inaccurate family health history. These scenarios can be difficult for all involved.

The legal responsibility of disclosure was indirectly addressed in Ande v. Rock in 2001, when the court held that parents did not have property rights in research results that identified spina bifida in their child. In 2016, a genetic incidental finding led a family to the Mayo Clinic for a second opinion. The family had initially been told that a gene mutation related to sudden cardiac death caused their 13-year-old son to die in his sleep, and the mutation was also identified in 20 family members. Mayo Clinic revealed the gene had been misdiagnosed, but by then the decedent’s brother had already had a defibrillator implanted and had received two inappropriate shocks to his otherwise normal and healthy heart. Establishing guidance for the scope and limits of disclosure of incidental findings is a complex process.

Under 45 C.F.R. §§ 46.111 and 46.116, also known as the Common Rule, researchers in all human subject research must discuss any risks or benefits to participants during informed consent. However, there is debate over classifying incidental findings as a risk or benefit, because liability can attach. Certainly the parents in Ande v. Rock would have viewed the researchers’ decision not to disclose positive test results for spina bifida as a risk or benefit that should have been discussed at the onset of their four-year involvement. On the other hand, as in the Mayo Clinic example above, is a misdiagnosed cardiac gene mutation a benefit or a risk? The answers to these questions are very subjective.

The Presidential Commission for the Study of Bioethical Issues has suggested 17 ethical guidelines, which include discussing the risks and benefits of incidental finding disclosures with research participants. The Commission’s principles are currently the only guidelines addressing incidental findings, and there is a desperate need for solid legal guidance. It is not an easy task, but the law needs to quickly firm up a foundation for appropriate disclosure of incidental findings.


Digital Tracking: Same Concept, Different Era

Meibo Chen, MJLST Staffer

The term “paper trail” grows more anachronistic by the year. While some people still prefer the traditional pen and paper, the modern world has endowed us with technologies like computers and smartphones. Whether we like it or not, this digital explosion is steadily taking over the life of the average American (73% of US adults own a desktop or laptop computer, and 68% own a smartphone).

These new technologies have raised many novel legal issues as they integrate into our daily lives. Recent Supreme Court decisions such as Riley v. California (2014) recognized the immense data storage capacity of a modern cell phone and required a warrant for its search in the context of a criminal prosecution. In the civil context, many consumers are concerned with internet tracking. Indeed, MJLST published an article in 2012 addressing this issue.

We have grown accustomed to seeing “suggestions” that eerily match our interests. Internet tracking technology has become far more sophisticated than traditional cookies: it can now utilize “fingerprinting” techniques that examine signals such as battery status or window size to identify a user’s presence or interests. This leads many to fear for their data privacy. But isn’t digital tracking just the modern adaptation of the “physical” tracking we have long accepted?
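To see why such signals worry privacy advocates, consider a minimal, hypothetical Python sketch of the core move in fingerprinting: hashing several individually innocuous attributes into one stable identifier. The attribute names and hashing scheme here are illustrative assumptions, not any real tracker’s method.

```python
import hashlib
import json

def fingerprint(attributes):
    """Hash a canonical (sorted-key) serialization of observed attributes."""
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical attributes a tracking script might observe; no cookie is stored.
visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
    "window_size": "1440x900",
    "battery_level": 0.83,
    "timezone": "America/Chicago",
}

print(fingerprint(visitor))  # a stable ID for this combination of attributes
resized = dict(visitor, window_size="1280x800")
print(fingerprint(visitor) == fingerprint(resized))  # one change alters the ID
```

The same browser reproduces the same identifier on every visit, which is how a tracker can follow a user even when cookies are disabled; each attribute is harmless alone, but the combination is surprisingly distinctive.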

When we physically go to a grocery store, don’t we subject ourselves to the prying eyes of those around us? Why should it be any different in cyberspace? While scarily accurate at times, “suggestions” or “recommended pages” based on one’s browsing history can actually benefit both the tracked and the tracker: the tracked gets more personalized results, while the tracker uses that information to do better business with the consumer. Many browsers already sport an “incognito” function to limit such tracking, giving consumers a measure of privacy when they want it. Of course, tracking technology can be misused, but malicious use of beneficial technology is nothing new.


6th Circuit Aligns With 7th Circuit on Data Breach Standing Issue

John Biglow, MJLST Managing Editor

To bring a suit in any judicial court in the United States, an individual or group of individuals must satisfy Article III’s standing requirement. As recently clarified by the Supreme Court in Spokeo, Inc. v. Robins, 136 S. Ct. 1540 (2016), to meet this requirement a “plaintiff must have (1) suffered an injury in fact, (2) that is fairly traceable to the challenged conduct of the defendant, and (3) that is likely to be redressed by a favorable judicial decision.” Id. at 1547. When cases involving data breaches have reached the federal courts of appeals, there has been some disagreement as to whether the risk of future harm from data breaches, and the costs spent to mitigate that risk, qualify as “injuries in fact” under Article III’s first prong.

Last spring, I wrote a note concerning Article III standing in data breach litigation in which I highlighted the circuit split on the issue and argued that the reasoning of the 7th Circuit in Remijas v. Neiman Marcus Group, LLC, 794 F.3d 688 (7th Cir. 2015), was superior to that of its sister courts and made for better law. In Remijas, the plaintiffs were a class of individuals whose credit and debit card information had been stolen when Neiman Marcus Group, LLC experienced a data breach. A portion of the class had not yet experienced any fraudulent charges on their accounts and asserted Article III standing based upon the risk of future harm and the time and money spent mitigating this risk. In holding that these plaintiffs had satisfied Article III’s injury in fact requirement, the court made a critical inference that when a hacker steals a consumer’s private information, “[p]resumably, the purpose of the hack is, sooner or later, to make fraudulent charges or assume [the] consumers’ identit[y].” Id. at 693.

This inference is in stark contrast to the reasoning of the 3rd Circuit in Reilly v. Ceridian Corp., 664 F.3d 38 (3d Cir. 2011). The facts of Reilly were similar to Remijas, except that in Reilly, Ceridian Corp., the company that had experienced the data breach, stated only that its firewall had been breached and that its customers’ information may have been stolen. In my note, mentioned supra, I argued that this difference in facts was not enough to wholly distinguish the two cases and overcome a circuit split, in part due to the Reilly court’s characterization of the risk of future harm. The Reilly court found the risk of misuse of information highly attenuated, reasoning that whether the plaintiffs experience an injury depends on a series of “if’s,” including “if the hacker read, copied, and understood the hacked information, and if the hacker attempts to use the information, and if he does so successfully.” Id. at 43 (emphasis in original).

Often in the law, we are faced with an imperfect or incomplete set of facts. Any time an individual’s intent is an issue in a case, this is a certainty. When faced with these situations, lawyers have long utilized inferences to differentiate between more likely and less likely scenarios for what the missing facts are. In the case of a data breach, it is almost always the case that both parties will have little to no knowledge of the intent, capabilities, or plans of the hacker. However, it seems to me that there is room for reasonable inferences to be made about these facts. When a hacker is sophisticated enough to breach a company’s defenses and access data, it makes sense to assume they are sophisticated enough to utilize that data. Further, because there is risk involved in executing a data breach, because it is illegal, it makes sense to assume that the hacker seeks to gain from this act. Thus, as between the Reilly and Remijas courts’ characterizations of the likelihood of misuse of data, it seemed to me that the better rule is to assume that the hacker is able to utilize the data and plans to do so in the future. Further, if there are facts tending to show that this inference is wrong, it is much more likely at the pleading stage that the Defendant Corporation would be in possession of this information than the Plaintiff(s).

Since Remijas, two data breach cases have reached the federal courts of appeals on the issue of Article III standing. In Lewert v. P.F. Chang’s China Bistro, Inc., 819 F.3d 963, 965 (7th Cir. 2016), the court unsurprisingly followed the precedent set forth in Remijas in finding that Article III standing was properly alleged. In Galaria v. Nationwide Mut. Ins. Co., a recent 6th Circuit case, the court had to make an Article III ruling without the constraint of an earlier ruling in its Circuit, leaving it open to choose what rule and reasoning to apply. Galaria v. Nationwide Mut. Ins. Co., No. 15-3386, 2016 WL 4728027 (6th Cir. Sept. 12, 2016). In the case, the plaintiffs alleged, among other claims, negligence and bailment; these claims were dismissed by the district court for lack of Article III standing. In alleging that they had suffered an injury in fact, the plaintiffs alleged “a substantial risk of harm, coupled with reasonably incurred mitigation costs.” Id. at *3. In holding that this was sufficient to establish Article III standing at the pleading stage, the Galaria court found the inference made by the Remijas court persuasive, stating that “[w]here a data breach targets personal information, a reasonable inference can be drawn that the hackers will use the victims’ data for the fraudulent purposes alleged in Plaintiffs’ complaints.” Moving forward, it will be intriguing to watch how circuits that have not yet faced this issue rule on it and whether, if the 3rd Circuit keeps its current reasoning, the issue will eventually make its way to the Supreme Court of the United States.


A Comment on the Note “Best Practices for Establishing Georgia’s Alzheimer’s Disease Registry” from Volume 17, Issue 1

Jing Han, MJLST Staffer

Alzheimer’s disease (AD), also known simply as Alzheimer’s, accounts for 60% to 70% of cases of dementia. It is a chronic neurodegenerative disease that usually starts slowly and worsens over time. The cause of Alzheimer’s disease is poorly understood. No treatment stops or reverses its progression, though some may temporarily improve symptoms. Affected people increasingly rely on others for assistance, often placing a burden on the caregiver; the pressures can include social, psychological, physical, and economic elements. The disease was first described by, and later named after, the German psychiatrist and pathologist Alois Alzheimer in 1906. In 2015, there were approximately 48 million people worldwide with AD, and in developed countries it is one of the most financially costly diseases. Before many states, including Georgia and South Carolina, passed legislation establishing registries, many private institutions across the country had already made tremendous efforts to establish their own Alzheimer’s disease registries. The country has experienced an exponential increase in people diagnosed with Alzheimer’s disease, and more and more states have begun to establish their own registries.

As the Note describes, the Registry in Georgia has emphasized from the outset the importance of protecting the confidentiality of patient data from secondary uses. The Note explores many legal and ethical issues raised by the Registry. An Alzheimer’s disease patient’s diagnosis history, medication history, and personal lifestyle are generally confidential information, known only to the physician and the patient himself. Reporting such information to the Registry, however, may lead to wider disclosure of what was previously private information and consequently may raise constitutional concerns. While the vast majority of public health registries in the past focused on collecting infectious disease data, registries for non-infectious diseases, such as Alzheimer’s disease, diabetes, and cancer, have been created only recently. There is a delicate balance between the public interest and personal privacy. Registration is not mandatory, because Alzheimer’s is not infectious. After all, people suffering from Alzheimer’s often face violations of their human rights, abuse and neglect, and widespread discrimination. When a patient is diagnosed with AD, the healthcare provider should encourage, rather than compel, the patient to join the registry. Keeping all patient information confidential, enacting procedural rules governing use of the information, and providing some incentives are good ways to encourage more patients to join the registry.

Given the privacy concerns under federal and state law, the Note recommends slightly broader data sharing with the Georgia Registry: for example, with a physician or other health care provider for the purpose of medical evaluation or treatment of the individual, or with any individual or entity that provides the Registry with an order from a court of competent jurisdiction compelling disclosure of confidential information. The Note also describes the procedural rules designed to administer the registry in Georgia. These rules address who the end-users of the registry are; what type of information should be collected; how and from whom the information should be collected; how the information should be shared or disclosed for policy planning and research purposes; and how legal representatives obtain authority from patients.

From this Note, we gain a deeper understanding of Alzheimer’s disease registries in this country through one state’s experience. The registry process raises many legal and moral issues. The Note compares the Georgia registry with those of other states and points out the importance of protecting the confidentiality of patient data. Emphasizing the protection of personal privacy could encourage more people, and more states, to get involved in this plan.


Requiring Backdoors into Encrypted Cellphones

Steven Groschen, MJLST Managing Editor

The New York State Senate is considering a bill that requires manufacturers and operating system designers to create backdoors into encrypted cellphones. Under the current draft, failure to comply with the law would result in a $2,500 fine, per offending device. This bill highlights the larger national debate concerning privacy rights and encryption.

In November of 2015, the Manhattan District Attorney’s Office (MDAO) published a report advocating for a federal statute requiring backdoors into encrypted devices. One of MDAO’s primary reasons in support of the statute is the lack of alternatives available to law enforcement for accessing encrypted devices. The MDAO notes that traditional investigative techniques have largely been ineffective. Additionally, the MDAO argues that certain types of data residing on encrypted devices often cannot be found elsewhere, such as on a cloud service. Naturally, the inaccessibility of this data is a significant hindrance to law enforcement. The report offers an excellent summary of the law enforcement perspective; however, as with all debates, there is another perspective.

The American Civil Liberties Union (ACLU) has stated it opposes using warrants to force device manufacturers to unlock their customers’ encrypted devices. A recent ACLU blog post presented arguments against this practice. First, the ACLU argued that the government should not require “extraordinary assistance from a third party that does not actually possess the information.” The ACLU perceives these warrants as conscripting Apple (and other manufacturers) to conduct surveillance on behalf of the government. Second, the ACLU argued using search warrants bypasses a “vigorous public debate” regarding the appropriateness of the government having backdoors into cellphones. Presumably, the ACLU is less opposed to laws such as that proposed in the New York Senate, because that process involves an open public debate rather than warrants.

Irrespective of whether the New York Senate bill passes, the debate over government access to its citizens’ encrypted devices is sure to continue. Citizens will have to balance public safety considerations against individual privacy rights—a tradeoff as old as government itself.


Digital Millennium Copyright Act Exemptions Announced

Zach Berger, MJLST Staffer

The Digital Millennium Copyright Act (DMCA), first enacted in 1998, prevents owners of digital devices from making use of those devices in any way that the copyright holder does not explicitly permit. Codified in part in 17 U.S.C. § 1201, the DMCA makes it illegal to circumvent digital security measures that prevent unauthorized access to copyrighted works such as movies, video games, and computer programs. The law prevents users from breaking what are known as access controls, even if the purpose would fall under lawful fair use. According to Kit Walsh, a staff attorney with the Electronic Frontier Foundation (a nonprofit digital rights organization), “This ‘access control’ rule is supposed to protect against unlawful copying. But as we’ve seen in the recent Volkswagen scandal . . . it can be used instead to hide wrongdoing hidden in computer code.” Essentially, everything not explicitly permitted is forbidden.

However, these restrictions are not ironclad. Every three years, users may request exemptions to the law from the Library of Congress (LOC) for lawful fair uses, but these exemptions are not easy to receive. Activists must not only propose new exemptions but also plead for already-granted exemptions to be continued. The system is flawed, as users often need a way to circumvent access controls on their devices to make full use of the products. Still, the LOC recently released its new list of exemptions, and the expanded list represents a small victory for digital rights activists.

The exemptions granted will go into effect in 2016, and cover 22 types of uses affecting movies, e-books, smart phones, tablets, video games and even cars. Some of the highlights of the exemptions are as follows:

  • Movies, where circumvention is used in order to make use of short portions of the motion pictures:
    • For educational uses by university and grade-school instructors and students
    • For e-books offering film analysis
    • For uses in noncommercial videos
  • Smart devices:
    • Users can “jailbreak” these devices to allow them to interoperate with or remove software applications, and phones can be unlocked from their carriers
    • Such devices include smartphones, televisions, and tablets or other mobile computing devices
      • In 2012, jailbreaking smartphones was allowed, but not tablets; this distinction has been removed
  • Video games:
    • Fan-operated online servers are now allowed to support video games once the publishers shut down official servers
      • However, this only applies to games that would be made nearly unplayable without the servers
    • Museums, libraries, and archives can go a step further by jailbreaking games as needed to get them functioning properly again
  • Computer programs that operate devices designed primarily for use by individual consumers, for purposes of diagnosis, repair, and modification
    • This includes voting machines, automobiles, and implanted medical devices
  • Computer programs that control automobiles, for purposes of diagnosis, repair, and modification of the vehicle

These new exemptions are a small but significant victory for consumers under the DMCA. The ability to analyze your automobile’s software is especially relevant in the wake of the aforementioned Volkswagen emissions scandal. However, the exemptions come with important caveats. For example, only video games that are rendered almost completely unplayable may have user-made servers; for games where only an online multiplayer feature is lost, such servers are not allowed. A better long-term solution is clearly needed, as this burdensome process is flawed and has led to what the EFF has called “unintended consequences.” Regardless, as long as we still have this draconian law, exemptions will be welcomed. To read the final rule, the register’s recommendation, and the introduction (which provides a general overview), click here.