NP Privacy Partner
SCOTUS decision protects digital privacy

Last week, the U.S. Supreme Court ruled in favor of digital privacy. In a 5-4 decision, the justices concluded that law enforcement must obtain a warrant to acquire cell phone location data for use as evidence at trial.

Carpenter v. United States is the first case about phone location data that the Supreme Court has ruled on. In ruling on the side of Timothy Carpenter, who was convicted of robbery in 2013, the court determined that police officers illegally obtained location data from his cell phone carrier. From this data, the government pulled almost 13,000 location points tracking Carpenter’s whereabouts for 127 days. Four of these location points put Carpenter near the scenes of the robberies.

A Sixth Circuit Court of Appeals judge previously ruled that cell site location information (CSLI) is not protected by the Fourth Amendment, which protects against unreasonable search and seizure, and thus, did not require a warrant. The Supreme Court disagreed. Chief Justice John Roberts, writing for the majority, said “modern cell phones generate increasingly vast amounts of increasingly precise CSLI.”

The government argued that phone companies own the customers’ data, not the customer. Chief Justice Roberts responded that “[t]he government’s position fails to contend with the seismic shifts in digital technology…”

Prior to the trial, major tech companies filed amicus curiae briefs encouraging the Supreme Court to make it harder for law enforcement to obtain this data without a warrant.

The decision could lead to an influx of litigation by defendants seeking to challenge the use of this information in pending cases. It could also open the gates to other privacy-related issues the courts will need to address, such as whether real-time GPS information should be treated differently. The Carpenter decision is limited to historical cell site location information; it does not apply to security cameras, business records or real-time location tracking.

The decision also has implications for cases involving encryption. Courts will likely soon have to consider whether an accused person can be required by the government to provide a fingerprint or facial scan to unlock an encrypted device, which could implicate the Fifth Amendment’s protection against self-incrimination. The Minnesota Supreme Court recently ruled that law enforcement could compel a burglary suspect to provide a fingerprint to open an encrypted device, reasoning that the fingerprint was only physical evidence and not the contents of his mind.

For now, privacy advocates are celebrating the victory in the Carpenter case, but understand that this is only the beginning.

The full text of the case can be found here.



Sixth Circuit addresses the unmasking of an anonymous blogger who engaged in copyright infringement

We have posted pieces on several recent cases in which courts have addressed whether and how an anonymous blogger should be unmasked. Courts have reached conflicting results when balancing the alleged harms caused by anonymous posts against the speaker’s First Amendment rights. On November 28, 2017, the United States Court of Appeals for the Sixth Circuit became the first appellate court to weigh in on the issue. The Sixth Circuit addressed whether a plaintiff that prevailed in a copyright infringement lawsuit is entitled to injunctive relief that would include the unmasking of the John Doe defendant, who posted the company’s copyrighted materials on his blog. Signature Management Team LLC v. John Doe, No. 16-2188 (6th Cir. Nov. 28, 2017).

Signature Management Team LLC (“Team”) sells materials designed to help individuals profit in multi-level marketing. John Doe anonymously runs a blog that criticizes multi-level marketing companies. Doe posted a hyperlink to an edition of a book copyrighted by Team, which led Team to sue for infringement. Team sought judicial relief that included disclosure of Doe’s identity, destruction of all copies of the book in Doe’s possession and a permanent injunction barring Doe’s infringing use of the book. Doe responded by raising a fair use defense against the infringement claims and asserting a First Amendment right to speak anonymously. During discovery, Team moved to compel disclosure of Doe’s identity. The trial court concluded that unmasking the anonymous speaker to Team could impair Doe’s defenses in the litigation, but it did order Doe to reveal his identity to the court and to Team’s lawyers, subject to a protective order preventing Team from learning Doe’s identity. When the trial court reached the merits, it found for Team and had to determine the appropriate relief. The trial court found that unmasking Doe was unnecessary because Doe represented that he would not commit infringement again and had destroyed all copies of the book in his possession. Team appealed the trial court’s refusal to unmask Doe.

On appeal, the Sixth Circuit issued a split 2–1 ruling. Writing for the majority, Judge Helene M. White noted that “no case has considered the issue presented here—whether and under what circumstances a court can properly protect a party’s anonymity after judgment.” The fact that liability was established “is an important distinction. The prejudgment cases often deal with a plaintiff’s need to unmask a defendant to effectuate service of process . . . .” Regarding the issues before the court at this stage, Judge White wrote that the entry of a final judgment negates concerns that unmasking could impair a defendant’s ability to defend itself in the litigation. Even so, there may be no practical need for the post-judgment unmasking of an anonymous defendant who has voluntarily complied with the relief ordered to prevent further harm.

The majority ruled that the trial court applied too protective a standard in declining to unmask Doe, balancing factors developed in connection with pre-judgment proceedings. The majority stressed that the trial court failed to recognize that “very different considerations apply” after the entry of a final judgment on the merits, particularly the presumption in favor of open judicial proceedings. Nonetheless, the majority concluded that some factors still suggest Doe may retain the right to remain anonymous, especially if an unmasking order would expose him in connection with both protected and unprotected speech and might hinder his ability to engage in anonymous speech in the future. The Sixth Circuit remanded the case to the trial court for reconsideration of the unmasking issue in light of the concerns and factors identified in the majority’s opinion.

In a sharply worded and succinct dissent, Judge Richard F. Suhrheinrich criticized the majority for acting like “an overprotective parent.” The dissent stated that Doe should not be shielded from the consequences of his infringing actions, which are not protected speech under the First Amendment. Doe could have preserved his right to speak freely and anonymously by doing so without committing copyright infringement. The dissent contended that no balancing is necessary and that the proper course is to remand the case to the trial court with an instruction to order that Doe’s identity be revealed.

We will monitor the proceedings on remand. This is not the last word in this case, and we expect to see similar issues continuing to arise in other cases with the proliferation of Internet speech.

FBI issues warning on internet-connected toy privacy risks
The Internet of Things has been a hot topic in recent years, with internet-connected toys forming a growing piece of our ever-connected world. And as with all such devices, “smart” toys have privacy and security risks that consumers need to be aware of. On July 17, 2017, the FBI issued a consumer notice warning that internet-connected toys could present such risks for children. In particular, the FBI noted the opportunities for child identity fraud and exploitation that these toys create. Containing sensors, microphones, cameras, data storage components, speech recognition software and GPS, internet-connected toys have the capability to collect detailed personal information about a child—such as names, schools, location, likes and dislikes.  Combined with the information consumers provide as part of creating user accounts as well as the more conventional photos and videos that may be taken while interacting with the toy, internet-connected toys could become a rich source of information for hackers.

Accordingly, the FBI is urging consumers to be proactive in investigating a company’s privacy and data security policies before purchasing and using an internet-connected toy. Understanding how your data is collected, transmitted and stored is a vital part of protecting your personal information. For example, all of the communication connections involved with an internet-connected toy (connections from the toy to the internet, between partner applications and to servers where data may be stored) present vulnerabilities that could be exploited, potentially allowing eavesdropping on conversations.
The FBI recommends that parents research how and where data collected by the toy is stored and used, research any security issues the company has previously faced and carefully read disclosures and privacy policies. Further, parents should monitor their children’s activity with “smart” toys, ensure that toys are turned off when not in use and follow basic cybersecurity precautions such as using strong passwords and providing only minimal personal information.

Finally, the FBI notes that the Children’s Online Privacy Protection Act is in place to provide certain protections to children under the age of 13 and that the FTC has recently updated its guidance with respect to internet-connected toys and associated services. Parents should understand what protections are in place and be vigilant in ensuring they are being followed as well as the limits of these protections.
FTC provides comments on measures to enhance Internet of Things device security
The Federal Trade Commission has submitted public comments to a working group convened by the U.S. Commerce Department’s National Telecommunications and Information Administration (NTIA), which is developing guidance about methods for Internet of Things (IoT) manufacturers to inform consumers about security upgrades for devices. In late April, the NTIA sought comments on its  draft of Communicating IoT Device Security Update Capability to Improve Transparency for Consumers (“Elements of Updatability”).
The FTC bases its comments upon its law enforcement experience, policy initiatives and consumer and business education efforts. The FTC noted that the burgeoning IoT marketplace offers enormous benefits to consumers, such as health tracking data, real-time notifications in connected cars and efficiencies in home utilities. Yet the FTC warns that when “consumers do not trust IoT devices,” attackers can find opportunities to steal data or assume device control through ransomware or botnets. In deciding whether and how to patch devices, manufacturers must balance the benefits of the enhanced safeguards against the costs of developing, testing and deploying software updates. Providing consumers with clear information about whether, for how long and at what cost their IoT devices will receive security support will benefit consumers, enhance competition and promote innovation.
The FTC commended NTIA for creating a multi-stakeholder process in which industry, government and consumer representatives have shared recommendations on enhancing IoT security. The Elements of Updatability divides its guidance into two categories: (1) “key elements” that IoT manufacturers should convey to consumers before sale to them to facilitate informed purchasing decisions and (2) “additional elements” manufacturers should communicate to consumers either pre- or post-purchase.
The “Key Elements” suggest that companies should make the following three disclosures before sale: (1) whether the device can receive security updates, (2) how the device receives security updates and (3) the anticipated timeline for the end of security support. The FTC recommends adjusting the third disclosure (timeline) and adding a fourth one (key use limitations). The FTC suggests that manufacturers should consider whether they can disclose a minimum support period in addition to, or instead of, an anticipated timeline. Disclosure of a guaranteed minimum period would provide clear information to consumers as they compare devices. Also, the FTC recommends adding to the “Key Elements” a disclosure of whether a device will stop functioning or become highly vulnerable when security support ends.
Regarding the “Additional Elements,” the FTC offers suggestions on information that manufacturers should convey to consumers before or after purchase: (1) adopting a uniform notification method (e.g., a standard position on the device’s screen or in the notification center of the device-related app); (2) enabling consumers to sign up, either at the point of sale or after, for affirmative notifications about security support; and (3) providing consumers with real-time notifications when support is about to end. Finally, the FTC recommends omitting the “additional element” describing how the manufacturer secures updates and the update process: explaining those safeguards to consumers imposes significant communication costs on industry while providing little, if any, benefit to consumers.
We will continue to review guidance documents and recommendations on the evolving security and consumer issues arising with the proliferation of IoT devices.
House Committee on Small Business provides cyber security guidance
This month, the United States House of Representatives Committee on Small Business held a hearing on cyber risks facing small businesses and issued guidance to assist in addressing the challenges. The hearing included testimony from Maureen Ohlhausen, Acting Chairperson of the Federal Trade Commission, who warned that, in the case of small businesses, a data breach can be devastating. In fact, Chairperson Ohlhausen noted that the majority of cyberattacks target small- and mid-sized businesses, and, according to the National Cyber Security Alliance, approximately 60% of small businesses go out of business within six months of a breach.
As a result of its hearing and work, the Committee has issued three publications: Data Breach Response, Protecting Personal Information and Building Security in the Internet of Things.
Regarding data breach responses, the Committee suggests that small businesses follow the measures recommended by the FTC’s Protecting Personal Information: A Guide for Business and Start with Security: A Guide for Business publications. The Committee stresses the importance of immediate steps, including assembling the best cross-functional team reasonably available to a small business, securing physical areas and taking affected equipment offline to stop additional data loss. Next steps include interacting with service providers and forensic examiners, as well as having a communications plan in place. Careful consideration must also be given to all notification requirements.
Regarding the challenges posed by increased connectivity, the Committee suggests that small businesses should start with the fundamentals to understand the evolving Internet of Things and take advantage of what experts have already learned about security. Businesses should design their products with proper authentication, which is a must in the Internet of Things. Careful consideration must be given to protecting the interfaces between a company’s product and other devices or services.
Regarding best practices to protect personal information, the Committee stresses the following basic but often ignored key principles:
• Take stock of the business’ personal information in its files and computers
• Scale down information by keeping only what must be maintained
• Lock down and properly protect maintained information
• Properly dispose of information that is no longer necessary
• Create a plan in advance to respond to security incidents.
The cyber risks and challenges facing small businesses can be daunting. Sole proprietorships and companies with only a few employees typically lack full-time information technology or human resources staff. They can fall easy prey to attacks on their networks or phishing schemes. Also, in addition to preventative challenges, small companies may not be fully prepared for the typically rapid response required once a breach is discovered, including getting a full grasp of any notification requirements under the law. As the House Small Business Committee stresses, small businesses must remain vigilant and make full use of available resources to ensure cybersecurity.
California Supreme Court rules that public employees’ use of private e-mails may be subject to public records request
In a recent ruling, the California Supreme Court addressed how laws such as the California Public Records Act (CPRA), originally designed to cover paper documents, apply to evolving methods of electronic communication. The court recognized that, in today’s environment, not all employment activity occurs during a conventional workday or in an employer-maintained workplace. The court adjudicated a public records request concerning how the public’s right to know under the CPRA is balanced against an employee’s personal privacy interests. City of San Jose v. The Superior Court of Santa Clara County, Cal. Sup. Ct. No. S218066 (filed 3/2/17).
The suit arose out of a public records act request targeting documents relating to redevelopment efforts in downtown San Jose, including e-mails and text messages “sent or received on private electronic devices used by” certain municipal officials. The City disclosed communications made using City telephone numbers and e-mail accounts but did not disclose communications made using the individuals’ personal accounts. In the ensuing litigation, the trial court ordered disclosure, but an intermediate appellate court ruled otherwise. The California Supreme Court reversed and ordered disclosure.
Specifically, the California Supreme Court faced the following question: Are writings concerning the conduct of public business beyond the CPRA’s reach merely because they were sent or received using a nongovernmental account? Considering the CPRA’s language and the important policy interests it serves, the court answered no. Employees’ communications about official agency business may be subject to CPRA disclosure regardless of the type of account used for their preparation or transmission.
The City argued that a document concerning official business is only a public record if located on a government agency’s computer servers or in its offices. Under this argument, indirect access would not be sufficient to compel disclosure under the CPRA. The court rejected this argument, finding that a document’s status as public or confidential does not turn on the arbitrary circumstance of where it is located.
The California Supreme Court remained cognizant of employees’ privacy rights in personal devices and accounts. It noted that agencies may develop appropriate internal policies regarding the scope and conduct of document searches. An agency can reasonably delegate employees to search their own devices, provided that there will be no circumvention of the CPRA’s intent. Affidavits can give the requesting party and a reviewing court sufficient factual basis to determine the adequacy of the search and whether withheld material is indeed nonresponsive.
This ruling is important given the evolving realities of the workplace, and we expect to see similar challenges in other states under public records laws. We will report on significant developments in the always delicate balance between the public’s right to know and the privacy interests of public employees who use their own devices and accounts in the workplace.
FTC announces contest to enhance Internet of Things security
The Federal Trade Commission (FTC) has challenged the public to create a tool to enhance security protections in the software of home devices connected to the Internet of Things. The FTC is offering a cash prize of up to $25,000 for the best technical solution, with up to $3,000 available for each of up to three honorable mention winners.
In describing the IoT Home Inspector Challenge, the FTC advises contestants that “[a]n ideal tool might be a physical device that the consumer can add to his or her home network, or it might be an app or cloud-based service, or a dashboard or other user interface. Contestants also have the option of adding features such as those that would address hard-coded, factory default or easy-to-guess passwords.”
The FTC notes that the Internet of Things, “an array of billions of everyday objects sending and receiving data over the internet, is expanding rapidly with the adoption of applications such as health and fitness monitors, home security devices, connected cars and household appliances. It holds many potential benefits for consumers, but also raises numerous privacy and security concerns that could undermine consumer confidence.”
Submissions will be accepted from March 1, 2017, through May 22, 2017, at noon EST. Winners will be announced on July 27, 2017. The rules for the contest, including instructions and requirements for the registration and submission process, are published in the Federal Register. Contest information will also be posted on an online challenge platform administered by the U.S. General Services Administration.
The FTC has taken an active role in addressing evolving privacy and security issues relating to the Internet of Things. A useful resource for review is the FTC’s Staff Report issued in January 2015 titled Internet of Things: Privacy & Security in a Connected World.
Bipartisan bill seeks to enact Email Privacy Act
On January 9, Congressmen Kevin Yoder (R-KS) and Jared Polis (D-CO) reintroduced the Email Privacy Act, which seeks to update the Electronic Communications Privacy Act (ECPA) enacted in 1986. Under the ECPA, the government must obtain a probable cause warrant to access electronic communications stored with third-party service providers that are less than 180 days old or unopened. The government may obtain, through a subpoena, electronic communications that are opened or more than 180 days old. The Email Privacy Act would require all governmental agencies to obtain a warrant to search Americans’ online communications, regardless of when the e-mail was created. This legislation is the latest initiative among the many calls to update the ECPA.
The Email Privacy Act received bipartisan support last year and passed the House of Representatives in April 2016 by a 419–0 vote. The Senate, however, failed to act on the bill before the 114th Congress came to a close. In his press statement upon the legislation’s reintroduction last week, Congressman Yoder wrote: “Let’s give the Senate ample time to act, because more than 30 years has been long enough for Congress to wait on this. It’s simple, in 2017, if the federal government wants to access Americans’ digital content, it must get a warrant.”
The Email Privacy Act seeks to affirm that Americans have a reasonable expectation of privacy in their e-mail accounts and content stored online. The government will be expected to show probable cause to compel service providers to disclose any communications or their means of storage. The sponsors of the legislation stress that it still preserves the legal tools necessary to conduct criminal investigations and protect the public, and nothing in the bill alters the current warrant requirements under the Wiretap Act, Foreign Intelligence Surveillance Act or any other law.
We will monitor the Email Privacy Act in Congress this term and report back on any significant developments as it is considered for passage in both houses.
The Cato Institute provides detailed policy analysis regarding “Privacy in the Age of Police Drones”
On December 13, 2016, the Cato Institute issued Policy Analysis No. 807, entitled “Surveillance Takes Wing: Privacy in the Age of Police Drones,” which provides an overview of current and near-future drone technology used by the military and police and of the United States Supreme Court’s Fourth Amendment jurisprudence as it relates to privacy. The author, Matthew Feeney, ends his analysis by recommending policies that would allow law enforcement to take advantage of drone technology while also protecting privacy.

Drone technology
Drones serve a wide variety of purposes with clear benefits, and they come in a wide range of sizes, flight ranges and flight times. The best known (or most notorious) is the Reaper, which the U.S. military has used to surveil and/or eliminate (with Hellfire missiles) targets. Of course, most domestic law enforcement agencies do not have the military’s financial resources for purchasing the highest-end drones, but they do have more funds than the average drone hobbyist. Accordingly, examples of drone purchases range from Reapers used for border surveillance by Customs and Border Protection (CBP) to an 11-pound camera-equipped drone used by the Arlington, Texas, Police Department. Most worrisome are the types of surveillance equipment attached to these drones (e.g., thermal scanners and biometric tools) that make them so much more potentially intrusive than the average hobbyist drone. For instance, the Autonomous Real-Time Ground Ubiquitous Surveillance Imaging System (ARGUS-IS) collects data from hundreds of high-end cameras, allowing an observer to see six-inch details from 20,000 feet across a 25-square-kilometer area. Furthermore, companies are currently developing drones that can fly for months or even years at a time, or that are the size of a bird.

Supreme Court’s Fourth Amendment jurisprudence

With the ever-expanding technology and use of drones by police, one obvious “casualty” could be Americans’ right to be free from unreasonable searches and seizures. The Cato policy analysis opines that while the “Supreme Court has addressed Fourth Amendment privacy questions raised by new technologies such as GPS locators, thermal scanners and smartphones, . . . the Court has yet to tackle the Fourth Amendment questions raised by the emergence of drones.”

The first case discussed is Katz v. United States, 389 U.S. 347 (1967), where the Court first articulated the “reasonable expectation of privacy” test, ruling that the defendant’s Fourth Amendment rights had been violated when FBI agents attached, without a warrant, an eavesdropping device to the outside of a public telephone booth he used to communicate illegal bets. The author describes the two-part reasonable expectation of privacy test (“an actual (subjective) expectation of privacy and [. . .] that the expectation be one that society is prepared to recognize as ‘reasonable’”) as “circular,” because “after all, citizens’ expectations of privacy are determined by court rulings, which are based on citizens’ expectations, which in turn are determined by court rulings.” Additionally, he believes this test “has been used by the Court to reach decisions not conducive to strong privacy protections.”

The case that the author sees as most relevant to drones, California v. Ciraolo, 476 U.S. 207 (1986), “sets a worrying precedent for privacy advocates amid the proliferation of drones.” There the Court ruled that, after receiving an anonymous tip that marijuana plants were growing in the defendant’s backyard, Santa Clara police officers did not need a warrant to use an airplane flying at 1,000 feet to locate the plants. After spotting the marijuana, the officers secured a search warrant and arrested Ciraolo, who pled guilty to growing marijuana. Despite the fact that the marijuana was in a backyard surrounded by a six- to ten-foot fence, the Court held that the search did not violate Ciraolo’s expectation of privacy.

Similarly, in Florida v. Riley, 488 U.S. 445 (1989), the Supreme Court found that a police officer did not need a warrant to surveil a suspected marijuana grower’s property from a helicopter flying at 400 feet. Justice Brennan’s dissent provides a prophetic hypothetical:

Imagine a helicopter capable of hovering . . . without generating any noise, wind[] or dust at all . . . Suppose the police employed this miraculous tool to discover not only what crops people were growing in their greenhouses, but also what books they were reading and who their dinner guests were. Suppose, finally, that the FAA regulations remained unchanged, so that the police were undeniably “where they had a right to be.” Would today’s plurality continue to assert that ‘[t]he right of the people to be secure in their persons, houses, papers[] and effects, against unreasonable searches and seizures’ was not infringed by such surveillance?

Mr. Feeney’s takeaway from this is not surprising: the police currently have access to these “miraculous tools,” which “pose privacy risks more intrusive than the thought-police helicopters from” Orwell’s 1984.

The more recent Supreme Court case United States v. Jones, 565 U.S. 400 (2012), may indicate some change in thinking in this modern era of increased police use of high-end technology. In Jones, police officers attached, pursuant to a warrant, a GPS locator to a suspected drug dealer’s wife’s car and monitored its position 24 hours a day for four weeks, producing more than 2,000 pages of data. However, the locator was installed outside the scope of the warrant: it was attached a day after the warrant expired and in the wrong geographic location.

The Supreme Court unanimously ruled that fixing this locator to the car and using it to track the car’s public movements constituted a Fourth Amendment search, but the justices arrived at that conclusion for different reasons. For instance, the concurring justices asked “whether respondent’s reasonable expectations of privacy were violated by the long-term monitoring of the movements of the vehicle he drove.” Jones primarily traveled on public roads and could not have had any reasonable expectation of privacy while doing so; however, prolonged surveillance of public activities can reveal private details.

Legal scholars have described this thinking as the “mosaic theory” of the Fourth Amendment:

“Before Jones, Fourth Amendment decisions had always evaluated each step of an investigation individually. Jones introduced what we might call a “mosaic theory” of the Fourth Amendment, by which courts evaluate a collective sequence of government activity as an aggregated whole to consider whether the sequence amounts to a search.”

(Orin Kerr, “The Mosaic Theory of the Fourth Amendment,” Michigan Law Review 111 (2012)).

The author argues that adoption of this theory “would have a significant impact on when police would need a warrant before using a drone,” because “[u]nder that approach, police would have to request a warrant to observe an individual for days, weeks or perhaps months with a drone, even if that individual was tracked only in areas where he or she had no reasonable expectation of privacy.” A judge using this theory would have to “analyze the metaphysics of events, determining when one event finishes and another begins.”

Mr. Feeney concludes his discussion of Supreme Court cases by stating, “[c]learly, current Fourth Amendment Supreme Court doctrine is underdeveloped and inadequate to the challenges presented by drones and other emerging technologies.” He continues by espousing the position that “the reasonable expectation of privacy test needs to be revised at a time when the kind of bulk collection used by the NSA is possible.” Failure to do so in this era of significant technological advances could lead to the reasonable expectation of privacy test “leaving us with no privacy.”

Recommended legislation
Instead of waiting for the Supreme Court to reconsider aerial surveillance, the author urges legislatures to pass laws that protect privacy as police drone usage grows. Legislatures are equipped to address concerns regarding surveillance, data retention, warrant requirements and weaponization. Regarding surveillance, the suggestion is to pass laws that allow police drone footage to be analyzed with biometric software “only if two conditions are met: 1) that biometric software is used exclusively in violent crime investigations, and 2) that biometric databases only include information related to citizens with a violent crime conviction.” Additionally, thermal scanners on drones should only be used for searches “that are either pursuant to a warrant or for suspects and missing persons.” Finally, legislation should allow the public to access drone footage under a narrow set of circumstances (e.g., arrests, use of force incidents).

As for warrants, several states already require them for police drone surveillance, which prevents indiscriminate surveillance of entire regions. Recommended legislation would also require police departments to release to the public information about the types and number of drones they operate, total flight hours, types of missions, etc.

The analysis also details the increased willingness of police to use military technology in their operations (e.g., Mine-Resistant Ambush Protected vehicles, flash-bang grenades). The most prominent recent national incident is the Dallas Police Department’s use of a bomb disposal robot armed with explosives to kill a barricaded suspect accused of killing five officers. The author recommends that police drones not be equipped with lethal or nonlethal weapons, but notes, “[f]ortunately, the law enforcement community has not strongly pushed for armed drones.”

Drones certainly benefit the lives of hobbyists, and they effectively assist both the military and police in their duties. However, the increased usage and effectiveness of drone surveillance should not come at the cost of Americans’ Fourth Amendment rights. The Cato Institute’s policy analysis urges the Supreme Court to revisit aerial surveillance should the right case arise; until then, state legislatures must regulate police drone usage while protecting privacy rights.
Journalist seeks to unmask Twitter user
A senior Newsweek writer, Kurt Eichenwald, has turned to the Texas state courts to learn the identity of a Twitter user against whom he seeks to bring an assault and battery claim. Eichenwald, who has epilepsy, is a commentator and writer on American politics and President-elect Trump’s business affairs. Recently, Eichenwald appeared in a television interview during which he and a Fox News host engaged in a heated discussion. Shortly thereafter, a Twitter user tweeted Eichenwald an image known to trigger seizures in people with epilepsy: a strobe light flashing at a rapid speed. The strobe included the words “You deserve a seizure for your posts.” Eichenwald suffered a seizure, and he intends to sue the Twitter user for assault and other intentional torts.
Eichenwald filed a petition in Texas state court seeking to discover from Twitter the identity of the individual, referenced as John Doe in the action. Eichenwald does not intend to sue Twitter, which has suspended the user’s account. He wants to depose a Twitter representative in order to identify John Doe and any persons who acted in concert with him. As noted in the court petition, Twitter’s registration process requires a user to provide a name and address, and records account information and IP addresses. Twitter’s privacy policy requires a court order prior to releasing personal information about its users.
The Texas state court promptly entered an order allowing Eichenwald to take the requested deposition of Twitter, and initiate other discovery as may be appropriate, to determine the identities of John Doe and anyone who assisted him in his tweet. Twitter, notified of Eichenwald’s court filing, agreed to the expedited discovery.
This case is the latest of several we have recently followed in which litigants seek to unmask social media users. The results have been mixed, as the courts weigh the underlying conduct and claims, as well as any First Amendment or privacy interests.

This website contains attorney advertising. Prior results do not guarantee a similar outcome. © 2018 Nixon Peabody LLP