NP Privacy Partner
Adverts or badverts: The difference, though important, may be difficult to discern

A “badvert” is a false advertisement that has been coded to redirect the user to malicious content. Known as malvertising in the information security community, badverts generate revenue for the attacker by redirecting the user to a page that serves advertisements the coders behind the original, legitimate advertisement never intended the user to see. It is also quite common for the destination page to contain malicious software (malware), a general term covering computer viruses and software that covertly transmits data from the victim’s hard drive to reveal the victim’s computer activities.

One particularly prolific badvertising attacker is eGobbler, which has undertaken several wildly successful badvertising campaigns. The first truly newsworthy badvertising campaign by eGobbler resulted in roughly 500 million legitimate advertisements being compromised on the iPhone in only ten days in April 2019. The attacker, or more likely attackers, found a vulnerability in the Google Chrome application for iOS that allowed them to bypass pop-up blockers and redirect unsuspecting users to the badvert sites. Security researchers later concluded that eGobbler had been behind a campaign that resulted in the corruption of over 1.1 billion advertisements. Security researchers believe that eGobbler may be an organized criminal venture, as the attacker has been able to locate software vulnerabilities specific only to certain applications on certain devices and quickly exploit those vulnerabilities with expert efficiency. Researchers are attempting to run test environments on various devices to spot eGobbler campaigns in the early stages. This is an increasingly difficult task as the attackers have begun exploiting software loopholes that render “sandboxing”[1] measures useless as a defense against badvert campaigns.

How can you protect yourself?

Security research teams constantly monitor applications and devices for potential malvertising threats. Once a vulnerability is discovered, these teams report it to in-house security teams at companies such as Google and Apple, which then develop fixes and release them in patches.[2] You should therefore ensure that your operating systems and browsers are fully up to date and receiving the latest patches released by the development teams. For example, the eGobbler loophole discussed above was corrected in the iOS 13 release on September 19.

[1] Sandboxing refers to a security technique that executes untrusted or potentially malicious code in an isolated environment so that the code cannot harm the user’s device or network.

[2] A patch is an update to computer software that is designed to fix specific issues with that software.

AOL to pay largest settlement in COPPA history

Oath Inc., the owner of AOL, will pay $4.95 million in penalties for violations of the Children’s Online Privacy Protection Act (COPPA).  New York Attorney General Barbara Underwood brought charges against AOL, claiming that the company collected, used and disclosed the personal information of users of its websites, allowing targeted display advertisements to be placed on websites directed to children under 13.

COPPA is one of the few federal regulations related to online privacy.  It was passed by the U.S. Congress in 1998 and took effect in April 2000.  It requires websites to obtain parental consent for the collection or use of personal information from children under 13 and to include specific provisions in their privacy policies.  It also prohibits targeting children under 13 with ads connected to their online behavior.  Under COPPA, personal information also includes geolocation data and cookies that track a user across websites.

At issue was AOL’s use of its ad exchange, which operates like a “virtual auction” that connects websites and potential advertisers through real-time bidding as webpages load.  When a user visits a webpage, information stored on the user’s browser, such as cookies or geolocation data, is simultaneously sent to entities that can place a “bid” on the ad space.  The exchange, which completes in less than a second, collects several bids and selects a winner, who can then place its ad on that webpage.  Ad exchanges must comply with COPPA.
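The auction mechanics described above can be sketched in a few lines of code. This is a simplified, hypothetical model for illustration only (real exchanges typically communicate via the OpenRTB protocol, and nothing below describes AOL’s actual system); the bidder functions, timeout value and request fields are invented for the example:

```python
import time

def run_auction(ad_request, bidders, timeout_ms=100):
    """Simplified second-price ad auction: broadcast the request,
    collect bids before the deadline, and pick the highest bidder.
    Illustrative sketch only, not a real exchange implementation."""
    bids = []
    deadline = time.monotonic() + timeout_ms / 1000
    for bidder in bidders:
        if time.monotonic() > deadline:
            break  # late bidders are skipped, keeping the auction fast
        # Each bidder sees the browser data (cookies, geolocation)
        # forwarded in the ad request and returns a price, or None.
        price = bidder(ad_request)
        if price is not None and price > 0:
            bids.append((price, bidder))
    if not bids:
        return None
    bids.sort(key=lambda b: b[0], reverse=True)
    winner_price, winner = bids[0]
    # Second-price rule: the winner pays the runner-up's bid
    # (or its own bid if there was no competition).
    clearing_price = bids[1][0] if len(bids) > 1 else winner_price
    return winner, clearing_price
```

The timeout is what keeps the whole exchange under a second: bidders that respond too slowly are simply excluded from the auction.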

According to Underwood, AOL used its ad exchange to conduct at least 2 billion ad display auctions on websites it knew were covered by COPPA between October 2015 and February 2017.  It is also alleged that AOL violated COPPA when it participated in auctions conducted by other ad exchanges. 

As part of the settlement, Oath will create a COPPA compliance program, which includes providing annual training on COPPA compliance to employees who work with ads on children’s sites.  Oath has also agreed to destroy any personal information it has collected from children. 

This is the largest settlement amount in COPPA’s history.  It is further evidence of the intense scrutiny internet companies are facing in their collection and use of personal data. 

New suits allege that capturing “persistent identifiers” on devices used by children violates COPPA
The plaintiffs’ bar is working hard to expand the categories of “personal information” that will support a claim for invasion of privacy. Those efforts are reflected in recent class action cases complaining about Internet apps aimed at children that obtain user data and link it to “persistent identifiers” associated with specific devices. In these cases, plaintiffs’ counsel are trying to use the expansive definition of personal information in the Children’s Online Privacy Protection Act (COPPA) to expand the right to privacy into new terrain.

Most statutes define “personal information” as information that would allow one to identify a specific human being, e.g., name, address, social security number and medical information. However, a great deal of online data gathering is linked to codes that identify a particular device, not a human being. Nevertheless, those “persistent identifiers” can be used by data aggregators to create a profile of the online behavior engaged in by the user of the particular device and, as more and more people are the sole users of devices such as smart phones, that amounts to a profile of the device’s owner.
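As a hedged illustration of the profiling described above, the sketch below groups hypothetical browsing events by a persistent device identifier. The identifiers and page names are invented for the example and do not come from any real data set:

```python
from collections import defaultdict

# Hypothetical event log: (persistent device identifier, page visited).
# The identifier names a device, not a person, but when one person is
# the sole user of the device, the history amounts to a personal profile.
events = [
    ("device-a1b2", "sports-news"),
    ("device-a1b2", "running-shoes-store"),
    ("device-9f3c", "cartoon-videos"),
    ("device-a1b2", "marathon-training"),
]

def build_profiles(events):
    """Group browsing events by persistent identifier into
    per-device behavioral profiles (illustrative sketch)."""
    profiles = defaultdict(list)
    for device_id, page in events:
        profiles[device_id].append(page)
    return dict(profiles)
```

Even though no name appears anywhere in the data, the per-device history for `device-a1b2` reads like a profile of a single person, which is exactly the concern driving these suits.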

In 2012, the FTC amended its regulations implementing COPPA to include persistent identifiers within the definition of personal information. In 2015, it began bringing enforcement actions against children-oriented websites that collect user data linked to persistent identifiers without parental consent. Now the plaintiffs’ bar is getting in on the action. On August 3, 2017, plaintiffs brought a case against The Walt Disney Company claiming that online game apps such as “Beauty and the Beast” collect data about app users and link it to the persistent identifiers in the devices the children are using. (Rushing v. The Walt Disney Company, N.D. Cal. No. 3:17-cv-4419.) A similar suit was filed against Viacom on August 7 (Rushing v. Viacom Inc., N.D. Cal. No. 3:17-cv-4492).

The hitch is that COPPA does not provide a remedy in damages, so the complaints allege the common law claim of “intrusion upon seclusion” (for the nationwide class) and violation of the right to privacy contained in the California Constitution (for the California subclass). Both of those theories will require plaintiffs to show that the intrusion into privacy was “highly offensive,” a hurdle they clearly intend to surmount by referencing the fact that the FTC has issued a rule that defines “persistent identifiers” as personal information, at least when applied to children.

One interesting question will be whether the existence of the FTC definition in the COPPA regulation is enough to support the “highly offensive” test necessary to prevail on the common law tort, despite the fact that Congress chose not to put a private right of action in COPPA. The second question is the extent to which identifiers that tie to a device, not a human being, become generally recognized as a category of “sensitive personal information” for adults. That battle is already being fought in the context of other statutes. In April 2016, the U.S. Court of Appeals for the First Circuit ruled that a USA Today website that collected a unique device ID and the geolocation of the device being used, along with the name of the video being watched by the device user, was collecting “personal information” under the Video Privacy Protection Act. (Yershov v. Gannett Satellite Info. Network, Inc., 820 F.3d 482 (1st Cir. 2016).) On April 21, 2016, Jessica Rich, Director of the FTC Bureau of Consumer Protection, warned businesses that if their website captures persistent identifiers, they should not represent to their customers that no “personal information” is being captured.
The place of device identifiers in the law of privacy continues to be an interesting and evolving question.
FTC updates COPPA compliance plan for business

The Federal Trade Commission (FTC) has updated its compliance plan for companies subject to the federal Children’s Online Privacy Protection Act, 15 U.S.C. §§ 6501-6506. Passed in 1998, COPPA required the FTC to issue and enforce regulations concerning children’s online privacy. The Commission’s amended COPPA Rule took effect in July 2013.

The FTC’s COPPA Rule applies to websites, online services, mobile apps and other connected devices or internet-enabled platforms (such as gaming and social networking) that operate for commercial purposes and collect, use or disclose personal information from children. It also covers operators of general audience websites or online services with actual knowledge that they are collecting, using or disclosing personal information from children, as well as websites or online services that have actual knowledge they are collecting personal information directly from users of another website or online service directed to children. The Rule applies even if a company is only passively tracking a child online. Generally, the law requires direct notice to parents and verifiable parental consent before information is collected about a child, and it specifies approved methods for notice and consent. COPPA also contains requirements for published privacy policies and restricts marketing to kids. Under the law, businesses are required to establish and maintain reasonable procedures to protect the confidentiality, security and integrity of personal information collected from children.

According to the FTC, the “six-step compliance plan for your business” is intended “to reflect developments in the marketplace—for example, the introduction of internet-connected toys and other devices for kids.” The compliance plan helps businesses determine if they are subject to COPPA and, if so, how to comply. The updated plan discusses staying COPPA compliant as companies adopt new business models to collect personal information with evolving technologies (e.g., through voice-activated devices), the expansion of products covered by the Act (e.g., “the law can also apply to the growing list of connected devices that make up the Internet of Things”) and additional methods for obtaining parental permission (asking knowledge-based authentication questions and using facial recognition to match a verified photo ID). The updated guidance is available on the FTC’s website.


Court rejects claims under Illinois Biometric Act relating to video game facial scans
As we have previously reported, several lawsuits have been filed recently in federal courts under Illinois’ Biometric Information Privacy Act (Act) raising challenges to the storage or use of biometric data. In the latest ruling addressing the Act, Judge John G. Koeltl of the United States District Court for the Southern District of New York dismissed a putative class action filed by two plaintiffs against the videogame maker of the NBA 2k videogame series. Vigil v. Take-Two Interactive Software, Inc., No. 15-8211 (S.D.N.Y. Jan. 27, 2017).
The plaintiffs’ lawsuit concerned the videogame’s feature allowing users to scan their faces to create personalized virtual basketball avatars for in-game play. Although the plaintiffs did not contend that their face scans were disseminated or used for any purpose other than playing the video game, for which they gave consent, they contended that Take-Two failed to comply with various provisions of the Act. They claimed that the videogame maker failed to give sufficient notice regarding its data retention policies regarding the collection and use of facial prints and failed to use adequate security when transmitting the prints.
In a lengthy opinion, Judge Koeltl analyzed whether the purported technical violations of the Act created legally sufficient alleged injuries to the plaintiffs to confer standing to sue. His ruling focused particularly on the standing analysis under the Supreme Court’s recent ruling in Spokeo and what constituted the requisite injuries to allow the plaintiffs to litigate under the Illinois Act. Judge Koeltl rejected the plaintiffs’ arguments that Take-Two’s storage and transmission practices posed an “enhanced risk of harm” of hacking or misappropriation. “[T]he hypothetical magnitude of a highly speculative and abstract injury that is not certainly impending does not make the injury any less speculative and abstract.” The plaintiffs also based their claims upon the notice and consent associated with their creation of the personalized avatar, claiming that the MyPlayer terms and conditions did not disclose the specific purpose of the scanning or did not publish its data retention schedule. The plaintiffs characterized this harm, and their basis for standing, as their “right to information.” Judge Koeltl concluded that the Act’s disclosure and consent requirements allowed parties to set the parameters of the use of their biometric data, which occurred here in the plaintiffs’ creation of their personalized avatars for gameplay. The court concluded that, at most, the plaintiffs had alleged bare procedural violations of the Act, which alone were insufficient to confer standing on them under Spokeo’s requirements.
As more states follow Illinois’ lead and enact laws relating to the increasing use of biometric data, we expect to see more litigation relating to the precise scope of the statutory protections and the private causes of action allowed under them. We will continue to monitor and report on the several cases filed under the Illinois Act. Plaintiffs in the Take-Two litigation have filed an appeal to the Second Circuit challenging the dismissal of their lawsuit.
Connecticut student data law takes effect on October 1st
On June 9, Connecticut Governor Dannel Malloy signed into law Public Act 16-189, An Act Concerning Student Data Privacy, which enacts significant privacy protections regarding the compilation and use of student data. The Act, which takes effect on October 1, is significant as schools increasingly utilize web-based educational programs, cloud computing, mobile applications and other electronic means in daily educational settings. The Act imposes minimum privacy and contractual obligations for parties that create, use or handle student data.
The Act includes the following provisions to protect student data:
• Restricting the use of student data by entities that contract to provide educational software and electronic storage of student records and by operators of websites, online services or mobile applications;
• Confirming that student data collected for school purposes is not owned by third-party contractors;
• Requiring local education boards to notify parents of contracts with a software, data storage or internet service provider;
• Imposing data security and privacy provisions in all contracts between local school districts and software, data storage and internet service providers;
• Requiring school districts to withhold the release of student directory information if a local or regional education board determines that the request for the directory information is unrelated to school purposes;
• Describing the procedures and timing that a contractor must follow upon the unauthorized release, disclosure or acquisition of student information or records; and
• Requiring that the local or regional education board and a contractor ensure compliance with the Family Educational Rights and Privacy Act of 1974 (FERPA).
Connecticut joins several other states that have enacted similar student privacy laws, and we expect to see more state legislative activity over the next year as schools expand technologies in education that involve the sharing of student data.
Children's Online Privacy Protection Act might be next for changes

Democratic Senator Mark Warner of Virginia is urging the Federal Trade Commission to consider updating the Children’s Online Privacy Protection Act (COPPA) to reflect the changes in technology. Congress enacted COPPA to protect the privacy and safety of children online by making the unauthorized collection, storage and use of children’s personal information illegal. According to the FTC’s website, the “primary goal of COPPA is to place parents in control over what information is collected from their young children online.” The rule aims to protect children under the age of 13 and “applies to operators of commercial websites and online services (including mobile apps)” that “collect, use[] or disclose personal information from children.”

Senator Warner, as the co-chair of the Senate Cybersecurity Caucus, now fears that COPPA’s rules are out of date. COPPA was enacted in 1998, and new technologies available and present in children’s toys have dramatically increased the ability and prevalence of data collection. Senator Warner cited the ability of hackers to exploit vulnerabilities in talking dolls and change their responses. Other data breaches involving toys have included the unauthorized collection of names, genders and birthdays of millions of children.

While children are the primary concern, parents are also at risk through these toys—hackers could use the digital toys as a weak link in a family’s home network.

Senator Warner's call to the FTC to update COPPA resembles recent calls to update other laws regulating internet use and data privacy. 

Dear Colleague Letter addresses privacy rights of transgender students
On May 13, 2016, the United States Departments of Justice (“DOJ”) and Education (“DOE”) issued a joint “Dear Colleague Letter” (“DCL”) offering “significant guidance” on a federally assisted school’s Title IX obligations regarding transgender students and explaining how the Departments evaluate a school’s compliance with these obligations. Since the DCL’s issuance, considerable public commentary has focused on the DOJ and DOE’s positions on a transgender student’s use of facilities such as restrooms, locker rooms and residence halls that correspond to the student’s gender identity. The DCL also provides important guidance on privacy and educational records for transgender students under the Family Educational Rights and Privacy Act (“FERPA”).
The DCL states that the “Departments may find a Title IX violation when a school limits students’ educational rights or opportunities by failing to take reasonable steps to protect students’ privacy related to their transgender status, including their birth name or sex assigned at birth.” Also, nondisclosure of personally identifiable information (“PII”), such as a student’s birth name or sex assigned at birth, could violate FERPA. Schools may maintain records with this information, but such records should be kept confidential.
The DCL offers the following specific guidance regarding the privacy rights of transgender students:
Disclosure of PII from Education Records: FERPA contains an exception allowing for the nonconsensual disclosure of a student’s records with PII when made to individual school personnel who have been determined to have a legitimate educational interest in the information. The Departments warn that this exception should be narrowly applied if a school has disclosed a student’s transgender status to some members of the school community.
Disclosure of Directory Information: FERPA’s implementing regulations allow schools to disclose appropriately designated “directory information” from a student’s record. Directory information is generally considered to be limited to information such as “a student’s name, address, telephone number, date and place of birth, honors and awards, and dates of service.” Schools cannot designate students’ sex, including transgender status, as directory information. Also, a school must allow eligible students (age 18 and over or attending a postsecondary institution) or parents, as appropriate, a reasonable amount of time to request that the school not disclose the student’s directory information.
Amendment or Correction of Education Records: A school may receive a request to amend a student’s education records to make them consistent with the student’s gender identity. “Updating a transgender student’s records to reflect the student’s gender identity and new name will help protect privacy and ensure personnel consistently use appropriate names and pronouns.” FERPA requires a school to consider a request to amend information in an education record that is misleading, incorrect or in violation of the student’s privacy rights. A school must respond to a request relating to the student’s transgender status consistent with its general practices for amending other students’ records. If the school declines, it must inform the requesting eligible student or parent of its decision and afford the requestor a right to a hearing. If the request is likewise denied after the hearing, the requestor has the right to insert into the record a statement expressing his or her disagreement with the result. This statement must be disclosed whenever the student’s record is disclosed.
Colorado student data privacy bill on its way to passage

Districts and states collect a long list of data on students, including personal information, test scores, special education information, disciplinary records and more. But, as personal data becomes increasingly monetized and as people begin to create a digital footprint earlier in life, parents are concerned that their children’s early education data may create unwanted targeted advertising or come back to haunt them later in life if such data is not adequately protected.

Heeding this concern, Colorado lawmakers are attempting to pass a bill with tough student data privacy laws. The bill imposes duties on commercial entities that provide school technological services by formal contract with Colorado education entities, such as the Department of Education and charter schools. Educational institutions must ensure that the contract or agreement with commercial vendors includes the restrictions and requirements pertaining to student personally identifiable information (“PII”) and must terminate the contract or agreement if the contract provider commits a material breach of the contract involving misuse or unauthorized release of a student’s PII. The bill also requires the Department to enter into a contract with a person or entity conducting research that includes the same requirements and restrictions that are included in a contract with a commercial entity provider. It further sets controls over classroom apps and software used by individual teachers, among other things.

For example, a contract provider cannot sell student PII, use or share student PII for use in targeted advertising or use student PII to create a profile, except for purposes authorized by the contracting public education entity or with parental consent. Each contractor must also maintain a comprehensive information security program and must destroy student PII at the request of a contracting public education entity. It would also require more transparency, like posting information about such contracts on district websites and delineating the type of PII collected.

Because technology has outpaced federal student privacy law, prompting over 20 states, including New York and California, to enact their own student privacy laws, commercial vendors and educational institutions should research the student privacy laws, if any, in their state to ensure that their contracts are legally compliant.

NY's highest court issues a split ruling addressing parents’ rights to record an overheard conversation
On April 4, 2016, New York’s highest court ruled that parents can legally eavesdrop on and record an overheard conversation if they believe in good faith that their minor child is in danger. The ruling creates a judicial exception to New York’s wiretapping law.
Under New York’s wiretapping law, “[a] person is guilty of eavesdropping when he unlawfully engages in wiretapping, mechanical overhearing of a conversation or intercepting or accessing of an electronic communication” without the consent of at least one party to the conversation. The court’s ruling allows a parent to give consent on behalf of their minor child.
The 4–3 decision by the court of appeals affirmed that a recording made by the father of a five-year-old son was admissible evidence in a criminal trial against the mother’s live-in boyfriend, Anthony Badalamenti. The May 6, 2008 recording captured Badalamenti threatening to punch the child in the face. The phone had been inadvertently answered, and although no one spoke on the other end, the father could hear the threats. Listening to them, he used an application on his phone to record the conversation. He saved the recording but did not immediately turn it over to the police.
Approximately five months later, the mother’s landlord called the police after hearing screams and crying coming from the apartment. The police arrested Badalamenti and charged him with assault. At this point, the father turned over the recording, which was played during the criminal trial. The jury found Badalamenti guilty and he was sentenced to seven years in prison.
On appeal, Badalamenti argued that the phone recording was inadmissible because there was no party consent. An appellate panel affirmed the trial court’s ruling, adopting the “vicarious consent doctrine” from the Sixth Circuit and a prior New York decision, which held that parents are not subject to penalties if they record a child’s conversation in the best interests of the child.
The court of appeals affirmed, stating that the boy’s father had a good faith, objectively reasonable basis to believe that recording this conversation was necessary for the welfare of his son. The court was careful to note that this exception does not extend to a parent or guardian acting in bad faith or who is merely curious about their child’s conversations. Writing for the majority, Judge Eugene M. Fahey stated that this ruling was consistent with “the long-established principle that the law protects the rights of a parent or guardian to take actions he or she considers to be in his or her child’s best interests.”
The dissent written by Judge Leslie Stein raises policy concerns that this issue should be left to the legislature given its far-reaching implications into divorce and custody disputes as well as juvenile delinquency.

This website contains attorney advertising. Prior results do not guarantee a similar outcome. © 2018 Nixon Peabody LLP