December 2013

Making It Personal: The Key to Climate Change Action

by Brandon Palmen, UMN Law Student, MJLST Executive Editor

Climate change is the ultimate global governance challenge, right? It’s an intractable problem, demanding a masterfully coordinated international response and a delicate political solution, balancing entrenched economic interests against deeply discounted, diffuse future harms that are still highly uncertain. But what if that approach to the problem were turned on its head? We often hear that the earth will likely warm 3-5 degrees centigrade (+/- 2 degrees), on average, over the next hundred years, and we may wonder whether that prospect is really more painful than higher utility bills and the fear of losing business and jobs to free-riding overseas competitors. What if, instead, Americans asking “what’s in it for me?” could just go online and look up their hometowns, the lakes where they vacation, the mountains where they ski, and the fields where their crops are grown, and obtain predictions of how climate change is likely to impact the places where they actually live and work?

A new climate change viewing tool from the U.S. Geological Survey is a first step toward changing that paradigm. The tool consolidates and averages temperature change predictions based on numerous climate change models and displays them on a map. The result is beautiful in its simplicity; like a weather map, it allows everyday information consumers to begin to understand how climate change will affect their lives on a daily basis, making what had been an abstract concept of “harm” more tangible and actionable. So far, the tool appears to use pre-calculated, regional values and static images (to support high-volume delivery over the internet, no doubt), and switching between models reveals fascinatingly wide predictive discrepancies. But it effectively communicates the central trend of climate change research, and suggests the possibility of developing a similar tool that could provide more granular data, either by incorporating the models and crunching numbers in real time, or by extrapolating missing values from neighboring data points. Google Earth also allows users to view climate change predictions geographically, but the accessibility of the USGS tool may give it greater impact with the general public.
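As a rough sketch of what “extrapolating missing values from neighboring data points” could look like under the hood, consider inverse-distance weighting, one of the simplest spatial interpolation methods. This is purely illustrative and not a description of the USGS tool’s actual methodology; all coordinates and values below are hypothetical.

    # Illustrative only: inverse-distance weighting (IDW), a simple way a map
    # tool might estimate a warming value at a point between model grid points.
    # All coordinates and values are hypothetical.
    import math

    def idw_estimate(point, samples, power=2):
        """Estimate a value at (lat, lon) `point` from (lat, lon, value) samples."""
        num = den = 0.0
        for lat, lon, value in samples:
            dist = math.hypot(point[0] - lat, point[1] - lon)
            if dist == 0:
                return value  # query point falls exactly on a known grid point
            weight = 1.0 / dist ** power  # nearer neighbors count for more
            num += weight * value
            den += weight
        return num / den

    # Hypothetical grid: (latitude, longitude, predicted warming in degrees C)
    grid = [(44.0, -94.0, 3.1), (45.0, -93.0, 3.4), (46.0, -94.0, 3.8)]
    print(round(idw_estimate((44.97, -93.26), grid), 2))  # rough value near Minneapolis

A real tool would of course work from thousands of grid points and account for terrain and coastlines, but the basic idea is the same: a location-specific prediction blended from the nearest model outputs.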

There are still challenging bridges to be crossed — translating the effects that “N-degree” temperature changes will likely have on particular species, and “tagging,” “fencing,” or “painting” specific tracts of land with those species — but it is plausible that within a few years, we will be able to obtain tailored predictions of climate change’s impact on the environments that actually matter to us — the ones in which we live. Of course, those predictions will be imprecise or even wholly incorrect, but if they’re based on the best-available climate models, coupled with discoverable information about local geographic features, they’ll be no worse than many other prognostications that shape everyday decisions, like stock market analysis and diet/nutrition advice. Maybe the problem with the public climate change debate is that it’s too scientific, in the sense that scientists know the limitations of their knowledge and models, and are wary of “defrauding” the public by drawing inductive conclusions that aren’t directly confirmed by evidence. Or maybe there’s just no good way to integrate the best climate models with local environmental and economic knowledge … yet.

Well, so what? Isn’t tackling climate change still an intractable global political problem? Maybe not. The more people understand about the impacts climate change will have on them personally, the more likely they are to take action to ameliorate it, even absent meaningful top-down climate policy. And while global governance may be beyond the reach of most individuals, local and state programs are not so far removed from private participation. In her recent article, Localizing Climate Change Action, Myanna Dellinger examines several such “home-grown” programs and concludes that they may be an important component of climate change mitigation. Minnesotans are probably most worried about climate change’s impact on snowstorms, lake health, and crop yields, while Arizonans might worry more about drought and fragile desert ecosystems, and Floridians about hurricanes and beach tourism. If all of these local groups are motivated by the same fundamental problem, their actions may be self-coordinating in effect, even if they are not coordinated by design.


Worldwide Canned Precooked Meat Product: The Legal Challenges of Combating International Spam

by Nathan Peske, UMN Law Student, MJLST Staff

On May 1, 1978, Gary Thuerk sent the first unsolicited mass e-mail on ARPANET, the predecessor of today’s Internet. Thuerk, a marketing manager for Digital Equipment Corporation (DEC), sent information about DEC’s new line of computers to all 400 users of the ARPANET. Since ARPANET was still run by the government and subject to rules prohibiting commercial use, Thuerk received a stern tongue-lashing from an ARPANET representative. Unfortunately, this failed to deter future senders of unsolicited e-mail, or spam, which has been a growing problem ever since.

From a single, moderately annoying but legitimate advertisement sent by a lone individual in 1978, spam has exploded into a malicious, hydra-headed juggernaut. Trillions of spam e-mails are sent every year, making up as much as 90% of all e-mail sent. Most spam e-mails are false ads for adult devices or for health, IT, finance, or education products. These e-mails routinely harm recipients through schemes to scam money, like the famous Nigerian scam; phishing attacks that steal the recipient’s credentials; or the distribution of malware, either directly or through linked websites. It is estimated that spammers cost the global economy $20 billion a year in everything from lost productivity to the additional network equipment required to carry the massive increase in e-mail traffic that spam creates.

While spam is clearly a major problem, legal steps to combat it are complicated by a number of identification and jurisdictional issues. Gone are the Gary Thuerk days, when the sender’s address could simply be read off the spam e-mail. Spam today is typically distributed through large networks of malware-infected computers. These networks, or botnets, are controlled by botmasters who send out spam without the infected users’ knowledge, often on behalf of another party. Spam may be created in one jurisdiction, transmitted by a botmaster in a second, distributed by bots somewhere else, and received by recipients all over the world.

Anti-spam laws generally share several provisions. They usually include one or more of the following: OPT-IN policies prohibiting bulk e-mails to users who have not subscribed to them; OPT-OUT policies requiring that users be able to unsubscribe at any time; clear and accurate indication of the sender’s identity and the advertising nature of the message; and a prohibition on e-mail address harvesting. While effective against spammers who can be found within the enacting jurisdiction, these laws cannot touch other members of the spam chain beyond its borders. There is also a lack of laws penalizing the legitimate companies, often more easily identified and prosecuted, that pay for spamming services; only the spammers themselves are prosecuted.

Effectively reducing spam will require an international legal framework that mirrors the international nature of spam networks. Increased international cooperation will help identify and prosecute members throughout the spam chain. Changes in the law, such as penalizing those who use spamming services to advertise, will help reduce the demand for spam.

Efforts to reduce spam cannot be limited to legal action against spammers and their patrons. Much like the international drug trade, as long as spam remains a lucrative market, it will attract participants. Technical and educational efforts must also be made to reduce the profit in spam. IT companies and industry groups are working to develop anti-spam techniques; these range from blocking IP addresses and domains at the network level to analyzing and filtering individual messages (a simple sketch of the latter appears below), along with a host of other approaches. Spam experts are also experimenting with techniques like spamming the spammers with false responses to reduce their profit margins. Efforts to educate users on proper e-mail security and simple habits like “if you don’t know the sender, don’t open the attachment” will also help cut into spammers’ profits by decreasing the number of responses they receive.
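To make message-level filtering concrete, here is a toy sketch of content-based scoring. Production filters learn their weights from large mail corpora using statistical classifiers (naive Bayes and beyond); every token, weight, and threshold below is invented purely for illustration.

    # Toy content-based spam filter: sum hand-picked token weights and compare
    # the total to a cutoff. Real filters learn weights from training data;
    # all tokens, weights, and the threshold here are invented for this sketch.
    SPAM_WEIGHTS = {"winner": 2.0, "inheritance": 2.5, "urgent": 1.5,
                    "pills": 3.0, "meeting": -2.0, "invoice": -1.0}
    THRESHOLD = 3.0  # arbitrary cutoff for this example

    def looks_like_spam(message: str) -> bool:
        """Return True if the summed token score crosses the threshold."""
        score = sum(SPAM_WEIGHTS.get(word, 0.0) for word in message.lower().split())
        return score >= THRESHOLD

    print(looks_like_spam("URGENT winner claim your inheritance now"))  # True
    print(looks_like_spam("Agenda attached for the budget meeting"))    # False

Even this crude approach hints at why spammers constantly mutate their wording, and why network-level measures like IP and domain blocking remain a necessary complement to content analysis.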

Like many issues facing society today, e-mail spam requires a response at all levels of society. National governments must work individually and cooperatively to pass effective anti-spam laws and prosecute spammers. Industry groups must develop ways to detect and destroy spam and the botnets that distribute it. And individual users must be educated in techniques to defend themselves against spammers. Only with a combined, multi-level effort can the battle against international e-mail spam be won.


Supreme Court Denies Request to Review FISC Order

by Erin Fleury, UMN Law Student, MJLST Staff

Last week, the Supreme Court denied a petition requesting a writ of mandamus to review a decision that ordered Verizon to turn over domestic phone records to the National Security Agency (“NSA”). The petition alleged that the Foreign Intelligence Surveillance Court (“FISC”) exceeded its authority because the production of these types of records was not “relevant to an authorized investigation . . . to obtain foreign intelligence information not concerning a United States person.” 50 U.S.C. § 1861(b)(2)(A).

The Justice Department filed a brief with the Court challenging the standing of a third party to request a writ of mandamus from the Supreme Court to review a FISC decision. The concern, however, is that telecommunications companies do not adequately fight to protect their users’ privacy interests. This apprehension seems justified considering that no telecom provider has yet challenged the legality of an order to produce user data. Any motivation to fight these orders is further reduced by the fact that telecommunications companies can obtain statutory immunity from lawsuits by their customers for turning over data to the NSA. 50 U.S.C. § 1885a. If third parties cannot ask a higher court to review a decision made by the FISC, then the users whose information is being given to the NSA may have their rights limited without any recourse short of legislative overhaul.

Unfortunately, as with most decisions declining review, the Supreme Court did not provide its reasoning for the denial. The question remains, though: if end users cannot object to these orders (and may not even be aware that their data was turned over in the first place), and the telecommunications companies have no reason to, is the system adequately protecting the privacy interests of individual citizens? Or can the FISC operate with impunity as long as the telecom carriers do not object?


Problems With Forensic Expert Testimony in Arson Cases

by Becky Huting, UMN Law Student, MJLST Staff

In MJLST Volume 14, Issue 2, Rachel Diasco-Villa explored the evidentiary standard for arson investigation. Ms. Diasco-Villa, a lecturer at the School of Criminology and Criminal Justice at Griffith University, examined the history of arson-investigation knowledge and how the manner in which it is conveyed in court can mislead jurors, possibly leading to faulty conclusions and wrongful convictions. The article discussed the case of Todd Willingham, who was convicted and sentenced to death for setting fire to his home and killing his three children. Willingham filed numerous unsuccessful appeals and petitions for clemency, and several years after his execution, a commission’s investigation concluded that there were several alternative explanations for the cause of the fire, and that neither the investigation nor the expert testimony was compliant with existing standards.

During the trial, the prosecution’s fire expert, a Deputy Fire Marshal from the State Fire Marshal’s Office, testified as to why he believed the fire was arson. Little science was used in his explanation:

Heat rises. In the winter time when you are going to the bathroom and you don’t have any carpet on the rug. . .the floor is colder than the ceiling. It always is. . . So when I found that floor is hotter than the ceiling, that’s backwards, upside down. . .The only reason that the floor is hotter is because there was an accelerant. That’s the difference. Man made it hotter or woman or whatever.

The expert went on to explain that fire investigation and fire dynamics are matters of logic and common sense, such that jurors could evaluate the evidence with their own senses and experiences and arrive at the same conclusions. Yet all samples taken from “suspicious” areas of the house tested negative for any traces of an accelerant. The expert explained away the chemical results: “And so there won’t be any — anything left; it will burn up.”

Fire and arson investigation has traditionally relied on experiential knowledge, passed down from mentors to apprentices without experimental or scientific testing to validate its claims. Fire investigators do not necessarily have scientific training, nor must they hold any educational degree beyond a high school diploma. The National Academy of Sciences released a report in 2009 stating that the forensic sciences needed standardized reporting of their findings and testimony, and fire and arson investigation was no exception. The International Association of Arson Investigators has pushed back on such guidance, filing an amicus brief arguing that arson investigation is experience-based rather than novel or scientific, and so should not be subjected to heightened evidentiary standards. This argument failed to convince the court, which ruled that fire-investigation expertise should be subject to scrutiny under the Daubert standard, which calls for exacting measures of reliability.

Ms. Diasco-Villa’s article also considers the risk of contextual bias and overreach should these experts’ testimony be admitted. In the Willingham case, the expert was given wide latitude to opine on the defendant’s guilt or innocence. He was allowed to testify to his belief that the suspect’s intent “was to kill the little girls” and to identify the defendant by name as the individual who started the fire. Under Federal Rule of Evidence 702, expert witnesses are given a certain degree of latitude in stating their opinions, but the author is concerned that jurors may give extra weight to such arguably overreaching testimony simply because it comes from an expert.

She concludes by presenting statistics on the vast number of fires in the United States each year (1.6 million) and the significant number classified as intentionally set (43,000). There is a very real potential that thousands of arrests and convictions each year have relied on overreaching testimony or on evidence collected and interpreted using discredited methodology. This state of affairs warrants continued improvements in forensic science techniques and in the evidentiary standards applied to arson investigations.


My Body, My Tattoo, My Copyright?

by Jenny Nomura, UMN Law Student, MJLST Managing Editor

A celebrity goes into a tattoo shop and gets an elaborate tattoo on her arm. The celebrity and her tattoo appear on TV and in magazines, and as a result, the tattoo becomes well-known. A director decides he wants to copy that tattoo for his new movie and has an actress appear in the film with a copy of the signature tattoo. Not long after, the film company receives notice of a copyright infringement lawsuit filed by the original tattoo artist. Similar situations are actually happening. The artist who created Mike Tyson’s face tattoo sued Warner Bros. for copying the tattoo in “The Hangover Part II,” and Warner Bros. settled. Another tattoo artist, Christopher Escobedo, designed a large tattoo for a mixed martial arts fighter, Carlos Condit; both the tattoo and the fighter appeared in a video game, and now Escobedo wants thousands of dollars for copyright infringement. Most people who get a tattoo never think about potential copyright issues, but these recent events might change that.

These situations leave us with a lot of uncertainties and questions. First of all, is there a copyright in a tattoo? It seems to meet the basic requirements for copyright, though perhaps only a thin copyright (most tattoos don’t display much originality). Assuming there is a copyright, who owns it: the wearer or the tattoo artist? Whom can the owner, whoever that is, sue for copyright infringement? Can he or she sue other tattoo artists for violating the right to prepare derivative works? Can he or she sue for violation of the reproduction right if another tattoo artist copies the original onto someone else? What about bringing a lawsuit against a film company for publicly displaying the tattoo? And since there are plenty of tattoos of copyrighted and trademarked materials, could tattoo artists and wearers themselves be sued for infringement?

What can be done to avoid copyright infringement lawsuits? Assuming the tattoo artist owns the copyright, the prospective wearer could have the artist sign a release. The tattoo might cost more, but there would be no threat of a lawsuit. It has also been argued that the best outcome would be for a court to find an implied license. Sooner or later, someone is going to refuse to settle; then we will have a full tattoo copyright infringement lawsuit and, hopefully, some answers.


Uh-Oh Oreo? The Food and Drug Administration Takes Aim at Trans Fats

by Paul Overbee, UMN Law Student, MJLST Staff

In the near future, foods that are currently part of your everyday diet may undergo some fundamental changes. From cakes and cookies to french fries and bread, a recent action by the Food and Drug Administration puts these types of products in the spotlight. On November 8, 2013, the FDA filed a notice requesting comments and scientific data on partially hydrogenated oils. The notice states that partially hydrogenated oils, the primary dietary source of artificial trans fat, are no longer considered “generally recognized as safe” by the FDA.

Partially hydrogenated oils are created during food processing to make vegetable oil more solid. This process contributes to a more pleasing texture, a longer shelf life, and stronger flavor stability. Some trans fat also occurs naturally in animal-based foods, including some milk and meat; the FDA’s proposal is meant only to restrict the use of artificial partially hydrogenated oils. According to the FDA’s findings, consuming partially hydrogenated oils raises levels of “bad” (LDL) cholesterol, which has been linked to a higher risk of coronary heart disease.

Some companies have already positioned their products so that they should not have to react to these changes. The FDA incentivized companies in 2006 by putting labeling rules in place to promote trans fat awareness. Those rules allowed companies to label their products as trans fat free if they lowered the level of partially hydrogenated oils to near zero. Kraft Foods decided to change the recipe of its then 94-year-old product, the Oreo. It took Kraft 2½ years to reformulate the cookie, after which the trans-fat-free Oreo was introduced to the market. The Washington Post invited two pastry chefs to taste-test the new trans-fat-free Oreo against the original; their conclusion was that the two products were virtually the same. That should reassure consumers worried that their favorite snacks will be pulled off the shelves.

Returning to the FDA’s notice, a few items are worth highlighting. At this stage, the FDA is still formulating its position on how to regulate partially hydrogenated oils, and actual implementation may take years. Once a rule comes into effect, companies seeking to continue using partially hydrogenated oils will still be able to seek FDA approval on a case-by-case basis. The FDA is seeking comment on the following issues: the correctness of its determination that partially hydrogenated oils are no longer generally recognized as safe, ways to approach limited uses of partially hydrogenated oils, and whether any prior sanctions exist for the use of partially hydrogenated oils.

Those interested in weighing in on the FDA’s next steps regarding partially hydrogenated oils can submit comments at http://www.regulations.gov.