Privacy litigation risk growing


Privacy and data breach litigation remains a significant risk for businesses. In 2024, 2,529 data privacy lawsuits were filed across the U.S., a 77% increase from the number of suits filed in 2020, according to Thomson Reuters/Westlaw data analyzed by the International Association of Privacy Professionals. (See Figure 4.)

In 2024, plaintiffs continued to file suits following data breaches, relying on state statutes, such as the California Consumer Privacy Act (CCPA), and common law theories, including negligence, invasion of privacy, unjust enrichment, and breach of express or implied contract. Common law breach of contract claims often seek to leverage a company's own privacy notice, terms of service, advertisements or other public statements as contractual commitments or assurances that the company will keep data secure.

With respect to negligence claims, some federal courts have recognized a duty to protect personal information where a defendant allegedly created a situation that they knew or should have known would pose a substantial risk to a plaintiff, such as intentional collection and storage of plaintiffs’ information. Contract and negligence claims are more likely to survive dismissal at the pleading stage because of the factual issues that arise with the merits of these claims.

The most frequently contested issue in data breach litigation continues to be standing — whether the plaintiff has sufficiently alleged “actual or imminent” harm traceable to the defendant’s conduct. In 2021, the Supreme Court held, in TransUnion v. Ramirez, that a risk of future harm stemming from disclosure of a data breach plaintiff’s personal information does not, by itself, support standing to sue for damages. Instead, plaintiffs must identify an actual, concrete injury.

In 2024, courts continued to grapple with the types of concrete harm sufficient to confer standing. The 9th U.S. Circuit Court of Appeals held in Greenstein v. Noblr that a general notice to a plaintiff that their personal information may have been exposed, without confirmation that the plaintiff’s information had been stolen, was not sufficient to establish a risk of future harm. The 9th Circuit left open the possibility that mitigation costs, such as identity theft monitoring services or time spent monitoring financial accounts for potential fraud, could constitute concrete injury in conjunction with an appropriately pled risk of future harm, such as confirmation that a plaintiff’s personal information was, in fact, accessed during a data breach.

The past year also saw case law developments regarding whether forensic communications and analyses qualify for attorney-client privilege following a data breach. In In re Samsung Customer Data Security Breach Litigation, the U.S. District Court for the District of New Jersey acknowledged that attorney-client privilege must be assessed on a case-by-case basis and narrowly construed. The court established a list of factors to be used to evaluate whether attorney-client privilege should be found in the data breach litigation context:

  • The type of services rendered by the third-party consulting firm to outside counsel.
  • The purpose and scope of the investigation as evidenced by the investigative materials or the services contract between outside counsel and a third-party consulting firm.
  • The existence of a two-track investigation commissioned by the impacted company.
  • The extent of a preexisting relationship between the impacted company and the third-party consulting firm.
  • The extent to which the third-party consulting firm’s investigative materials were shared with members of the impacted company and/or any other outside entities, including the government.
  • Whether the third-party consulting firm’s investigative services assisted the law firm in providing legal advice to the impacted company.

Corporate data breach victims should be aware of these factors as they engage in forensic investigations post-breach.

States stepping up

In 2025, the Trump administration has prioritized deregulation, including rolling back several privacy-related regulations established under the Biden administration. In May, for example, the Consumer Financial Protection Bureau announced it would not move forward with a Biden administration proposal to limit the sale of Americans' private information by data brokers. Several agencies, such as the Federal Trade Commission, are expected to focus more on enforcement of existing rules rather than additional rulemaking.

With the federal government generally taking a more hands-off approach, states are increasing enforcement. In January, new comprehensive privacy laws took effect in Delaware, Iowa, Nebraska, New Hampshire, and New Jersey. Similar laws will come into effect in Maryland, Minnesota, and Tennessee later this year.

Each of these new laws provides consumers with the rights to access and delete their personal data and to opt out of its sale. Each law, except Iowa’s, also requires opt-in consent for sensitive data processing and provides a right to opt out of certain automated decision-making. Notably, none of the state laws taking effect in 2025 includes a private right of action.

Several changes to existing state privacy laws are also taking place in 2025. Both the Colorado Privacy Act and the CCPA will include neural data as a type of sensitive personal data. The Colorado Privacy Act will expand biometric protections. Both Colorado and Virginia will also have new privacy protections for children’s data.

On April 16, 2025, the California Privacy Protection Agency (CPPA) and attorneys general from seven states — California, Colorado, Connecticut, Delaware, Indiana, New Jersey, and Oregon — announced they were forming the Consortium of Privacy Regulators, a new effort to better protect consumers’ privacy. The stated purpose of the consortium is to share expertise and resources and coordinate efforts to investigate potential violations of applicable laws.

Among the states, California is widely regarded as the leader in enforcement:

  • In February 2024, the California attorney general fined DoorDash $375,000 for alleged violations of the CCPA and California Online Privacy Protection Act (CalOPPA). The alleged violations included failing to comply with the CCPA’s opt-out requirements for businesses that sell personal information and neglecting to disclose that the company provided personal information to marketing co-ops.
  • In May 2025, the CPPA fined a clothing retailer almost $350,000 for alleged CCPA violations, including an improperly configured opt-out tool and requiring consumers to submit more information than necessary to process their privacy rights requests. Notably, when the CPPA announced the settlement, the agency pointed out that the company had “deferred to third-party privacy management tools without knowing their limitations or validating their operation.” The head of the CPPA noted that companies should not rely solely on third-party privacy compliance tools, stating that “the buck stops with the businesses that use them,” and that “using a consent management platform doesn’t get you off the hook for compliance.”
  • In March 2025, the California attorney general announced an investigative sweep targeting the location data industry, specifically focusing on mobile apps, advertising networks, and data brokers. The sweep focuses on ensuring compliance with CCPA’s provisions regarding the handling of consumers' geolocation data.

Although not part of the consortium, the Texas attorney general is also taking an active role in enforcement. In June 2024, the attorney general’s office announced an initiative focused on enforcing privacy laws in the state. Since then, the office has completed a significant settlement with a social media company relating to biometric data, launched investigations into the automotive industry for alleged surveillance and data sharing practices, issued notices to over 100 companies for failing to comply with the Texas data broker law, and initiated a lawsuit and a large-scale probe into companies for suspected violations of the Securing Children Online through Parental Empowerment Act and the Texas Data Privacy and Security Act.

Even as the Trump administration takes a step back from privacy regulation, several bills under consideration by Congress warrant monitoring. These include proposals to update decades-old laws, such as the 1974 Privacy Act and the 1986 Electronic Communications Privacy Act.

One potential point of contention between the federal government and the states is regulation on AI. As of early August 2025, 23 states had passed specific AI legislation, and several others are considering proposed bills, according to law firm Bryan Cave Leighton Paisner LLP. (See Figure 5.) Congress considered including a 10-year moratorium on state and local AI laws in the recently passed budget and spending bill, but it was ultimately removed from the final bill.

Plaintiffs leveraging old laws

Lawsuits against consumer-facing businesses related to their use of digital tracking technologies, such as pixels and session replay tools, increased in 2024. Tracking technologies enable businesses to collect data about user interactions to refine marketing strategies and improve engagement. Plaintiffs now allege that these tracking technologies "record" or "intercept" user interactions without appropriate consent, in violation of state and federal laws originally intended to address wiretapping, pen registers, and trap-and-trace devices.

Filing suit under the federal Video Privacy Protection Act (VPPA) has proven particularly lucrative for plaintiffs’ attorneys. The VPPA, which Congress passed in 1988 in response to concerns about the privacy of consumers’ video rental histories, provides for statutory damages of $2,500 per violation. Plaintiffs have brought a surge of class-action suits against owners of websites with video functionality, contending that tracking pixels embedded on those websites constitute an unlawful disclosure of video viewing history.

Complicating matters for businesses is a developing circuit split involving cases alleging VPPA violations. In April, the U.S. Court of Appeals for the Sixth Circuit, in Salazar v. Paramount Global, affirmed a lower court’s dismissal of a plaintiff’s case, narrowly interpreting the VPPA as only protecting individuals who subscribe to or purchase audio-visual materials, rather than those who consume digital content, such as newsletters. The court also held that individuals must show a direct relationship with audio-visual content (for example, subscribing to a video service) to qualify as a “consumer” and thus have standing to file suit under the VPPA.

This follows an October decision by the U.S. Court of Appeals for the Second Circuit — in Salazar v. NBA, involving the same plaintiff — to reverse a lower court ruling, holding that the VPPA protects consumers regardless of the specific goods or services they subscribe to, as long as video content is involved. The Second Circuit held that subscribing to a newsletter that includes video links may qualify someone as a VPPA “consumer.”

These contrasting rulings could lead to forum shopping by plaintiffs and inconsistent outcomes depending on jurisdiction. This also sets up the potential for Supreme Court intervention to resolve the discrepancy and/or action by Congress to clarify the VPPA’s scope in the context of modern technologies.

Beyond the VPPA, plaintiffs have filed hundreds of lawsuits alleging violations of the California Invasion of Privacy Act (CIPA), a 1967 wiretapping statute providing statutory damages of $5,000 per violation. Decades after the law’s enactment, plaintiffs are using CIPA as the basis for class actions alleging that businesses violated the statute through their use of website tracking tools, chatbots, and session replay software. Plaintiffs allege that tracking technologies are akin to wiretapping, allowing third parties to “eavesdrop” on users’ online activities without obtaining proper consent.

Plaintiffs have also alleged that web beacons and pixels violate the pen register provisions of CIPA by collecting data from users such as IP addresses and potential geolocation data, and such data can be used to “fingerprint” them. These allegations have had mixed success.

Other state laws that plaintiffs have used to file tracking technology claims include the Massachusetts Wiretap Act and Arizona’s Telephone, Utility and Communications Service Records Act. Inconsistent judicial rulings across the country continue to encourage plaintiffs’ attorneys to test novel legal theories regarding tracking technologies.

Global privacy regulations intensifying

International privacy laws have seen significant development since the start of 2024, reflecting a global emphasis on data protection and regulatory enforcement.

The European Union continues to set a high bar for privacy governance. In May 2025, Ireland’s Data Protection Commission fined TikTok 530 million euros for inadequately protecting EU users’ personal data, particularly concerning remote access by staff in China.

The rapid evolution of AI has also led the EU to establish the AI Act, which introduces a risk-based framework for AI systems, categorizing them into unacceptable, high, limited, and minimal risk levels. The act, which took effect in August 2024, subjects high-risk AI systems, such as those used in critical infrastructure and employment, to strict requirements, including data transparency, human oversight, and data governance.

The EU’s stringent data protection standards have influenced global privacy practices, prompting companies to adopt GDPR-aligned measures to maintain access to the EU market. There is also a growing trend toward harmonizing privacy regulations internationally, aiming to facilitate data flows while safeguarding individual rights.

Australia, meanwhile, has long been perceived as the second-most litigious country in the world, after the U.S. Class-action litigation against businesses contributed to the more difficult directors and officers liability insurance market of the early 2020s and could now drive an uptick in cyber privacy claims. In late 2024, the Australian government passed updates to the Privacy Act 1988, which gave Australians — effective June 10, 2025 — the right to file litigation against businesses alleging serious invasions of privacy, such as the misuse of personal data.

This could lead to more litigation against businesses at the same time they face greater enforcement activity by federal regulators. The Office of the Australian Information Commissioner is more aggressively pursuing action against companies for privacy violations, including those connected to their use of AI and other emerging technologies.

Elsewhere in Asia-Pacific, new data protection laws and regulations are taking effect in China, India, Indonesia, Saudi Arabia, and Vietnam. Regulations in the region are also evolving, covering more than personal data, which makes compliance more difficult for global organizations.

Closer to home, Mexico's Law on the Protection of Personal Data Held by Private Parties (LFPDPPP) came into effect on March 21. The LFPDPPP replaces the previous 2010 federal data protection law and introduces several important changes, including broader definitions of personal data, databases, and data controllers; stricter privacy notice requirements; and enhanced individual rights over automated processing. The LFPDPPP also creates a new Secretariat of Anti-Corruption and Good Governance (SABG), which will oversee compliance with the law, conduct investigations, and impose sanctions.


© 2025 Lockton Companies. All rights reserved.
