Author: William C. Martinez

New Jersey’s Consumer Data Privacy Statute – What You Need to Know

On January 16, 2024, Governor Murphy signed S332 into law, making New Jersey the 13th state to enact legislation designed to protect the personal data of its residents. The law will become effective next year, on January 15, 2025, and imposes various obligations on a person or entity (designated as either a “controller” or a “processor”) that collects, discloses, processes, or sells the personal data of New Jersey consumers. The statute establishes a number of rights for New Jersey residents with respect to their own personal data and also provides consumers with the ability to opt out of the disclosure and sale of their personal data in certain circumstances. Finally, the Division of Consumer Affairs has the authority to develop rules and regulations necessary to effectuate the purposes of the statute, and the Attorney General has sole and exclusive enforcement authority. The scope of S332 as signed by the Governor was expanded significantly from prior versions. As late as December 17, 2023, the bill applied only to a person or entity that operated “any service provided over the Internet that collects and maintains personally identifiable information from a consumer.” The law enacted less than one month later, however, is not limited to collection of data over the internet; it applies to all “personal data” regardless of how it is...

District Court Affirms United States Copyright Office’s Denial of Copyright Registration for AI-Generated Visual Art

Pursuant to the Copyright Act of 1976, “original works of authorship fixed in any tangible medium of expression, now known or later developed, from which they can be perceived, reproduced, or otherwise communicated, either directly or with the aid of a machine or device” are eligible for immediate copyright protection, provided certain requirements are met. Against this backdrop, Stephen Thaler applied to the United States Copyright Office (USCO) for copyright registration of a piece of visual art produced by a generative artificial intelligence system he created – the “Creativity Machine.” The USCO subsequently denied the application, reasoning that Thaler’s work “‘lack[ed] the human authorship necessary to support a copyright claim,’” as “copyright law only extends to works created by human beings.” After Thaler filed suit against the USCO, both parties moved for summary judgment on the sole issue of whether a work generated entirely by an artificial system should be eligible for copyright protection. On August 18, 2023, in Thaler v. Perlmutter, the United States District Court for the District of Columbia granted the USCO’s motion for summary judgment, concluding that “human authorship is an essential part of a valid copyright claim.” The court rejected as contrary to the Copyright Act’s plain language Thaler’s contention that because he created the AI system that “autonomously” produced...

“Say Cheese!” CVS Passport Photo Practices Subject to BIPA Suit

In May 2022, a group of plaintiffs brought a putative class action against CVS Pharmacy, Inc. (CVS) alleging the company violated several provisions of the Illinois Biometric Information Privacy Act (BIPA) through its practices for taking passport photos. On May 4, 2023, in Daichendt and Odell v. CVS Pharmacy, Inc., the United States District Court for the Northern District of Illinois denied CVS’s motion to dismiss, holding that the plaintiffs sufficiently stated a claim under Section 15(b) of BIPA. Section 15(b) of BIPA prohibits private entities from collecting “or otherwise obtain[ing] a person’s or a customer’s biometric identifier or biometric information, unless it first”: (1) provides notice of collection; (2) provides notice of the specific purpose of collection; and (3) obtains affirmative written consent. Here, the plaintiffs alleged that CVS required them to “enter[] their names, email addresses, and phone numbers into a computer terminal inside defendant’s stores prior to scanning their biometric identifiers.” Thereafter, CVS’s system would “check” and “verify” an individual’s facial features (i.e., whether the individual is smiling) to comply with government requirements. Against this backdrop, the plaintiffs argued this system violated Section 15(b) because it “collected and stored their personal contact data (‘real-world identifying information’), such as their names and email addresses,” thus giving CVS the ability to identify the plaintiffs “when...

Are You Hallucinating? Attorneys Sanctioned for the “Unprecedented” Act of Submitting Nonexistent Case Law Provided by ChatGPT

On June 22, 2023, Judge P. Kevin Castel of the United States District Court for the Southern District of New York sanctioned a law firm after it submitted fabricated judicial citations and opinions provided by the popular artificial intelligence (AI) engine, ChatGPT. After plaintiff’s counsel filed an affirmation with the court, which was drafted by one attorney but signed by another at the same firm, defense counsel advised that he had “‘been unable to locate most of the case law cited in [the Affirmation], and the few cases which the undersigned has been able to locate do not stand for the propositions for which they are cited.’” The court “conducted its own search for the cited cases but was unable to locate multiple authorities cited in the Affirmation.” Accordingly, Judge Castel issued an order to show cause for sanctions, emphasizing the “unprecedented circumstance” presented to the court. The court required a hearing as to whether sanctions ought to be imposed. Following submissions, it made several findings and ultimately imposed sanctions on plaintiff’s counsel. First, Judge Castel found that the attorney who signed the Affirmation “violated Rule 11 in not reading a single case cited in his … Affirmation and taking no other steps on his own to check whether any aspect of...

I’m Sorry, Motion Denied: Washington District Court Rejects Second Try at Class Action Suit Over Amazon Alexa’s Collection of Voice Data

In June 2022, a group of plaintiffs brought a putative class action against Amazon.com (“Amazon”) alleging the company violated several statutory and common law rights through its use of voice data collected through Alexa, its digital assistant software. After the court granted Amazon’s motion to dismiss, the named plaintiffs moved for leave to file an amended complaint. On March 29, 2023, in James Gray and Scott Horton v. Amazon.com, et al., the United States District Court for the Western District of Washington denied the motion, concluding the plaintiffs’ proposed amended complaint (PAC) failed to allege new material facts. The PAC alleged that Amazon failed to disclose to its consumers that it would use the data collected from the voice recordings made by Alexa devices for the purposes of targeted advertising. Accordingly, the plaintiffs asserted, as they had done previously, that Amazon: (1) breached the implied covenant of good faith and fair dealing; (2) violated Washington’s Consumer Protection Act (CPA) and Personality Rights Act (PRA); and (3) violated common law privacy rights. The court dismissed the plaintiffs’ implied covenant claim because the PAC “merely repeat[ed] the same arguments the Court ha[d] already rejected.” For example, the court previously rejected the plaintiffs’ argument that Amazon’s terms and conditions failed to inform them of Amazon’s use of their...

Delaware District Court Allows for Single Claim to Proceed Against Amazon in Illinois Biometric Information Privacy Act Class Action Suit

The Illinois Biometric Information Privacy Act (BIPA) is designed to protect and regulate the use of both “biometric identifiers” and “biometric information” of Illinois residents. “Biometric identifiers,” for instance, include “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.” In contrast, “biometric information” means “any information … based on an individual’s biometric identifier used to identify an individual.” On March 29, 2023, in McGoveran v. Amazon Web Servs., Inc., the United States District Court for the District of Delaware granted in part the motion of Amazon Web Services (“Amazon”) and Pindrop Security (“Pindrop”) to dismiss a proposed class action brought pursuant to BIPA for lack of standing, basing its ruling on a strict interpretation of the definitions of “biometric identifiers” and “biometric information” and on the plaintiffs’ failure to adequately allege that they suffered any concrete, actual, or imminent injury as a result of the defendants’ conduct. In McGoveran, a group of Illinois residents alleged that Amazon and Pindrop violated BIPA by extracting their biometric information for authentication purposes when the plaintiffs called John Hancock to discuss their retirement accounts. At the outset, the court held that the plaintiffs lacked Article III standing to bring a claim under BIPA Section 15(a) and Section 15(c) or to otherwise obtain injunctive relief. Under Section 15(a), a company is...

GoodRx Fined $1.5 Million for Disclosure of Users’ Personal Information to Third Parties Without Notice or Consent

On February 1, 2023, the Federal Trade Commission (FTC) filed a “first of its kind” enforcement action under its Health Breach Notification Rule, 16 CFR Part 318, which offers several useful takeaways for all companies that collect and process consumers’ personal information – not just companies that handle health-related data. The FTC’s proposed order seeks to impose a $1.5 million civil penalty against GoodRx, a digital health platform, for sharing the sensitive personal health and other information of millions of GoodRx users with various advertising platforms, including Facebook and Google, and failing to report these disclosures to consumers. According to the FTC complaint, GoodRx collects sensitive personal information from users and represents that it will treat users’ information in accordance with its privacy policies. Since at least 2017, the GoodRx privacy policy specifically stated that GoodRx “would never disclose personal health information to advertisers or any third parties.” Yet for several years, GoodRx allegedly violated these promises “by sharing information with Advertising Platforms, including Facebook, Google and Criteo, about users’ prescription medications or personal health conditions” and “did so without notice to users, and without obtaining consent.” In addition, GoodRx monetized the personal health information it collected through the creation of advertising campaigns on Facebook and Instagram that targeted GoodRx users. In August...