27 March 2026

Your data isn’t private, and it never really was: what recent U.S. cases and breaches show


## Brief summary



A steady drumbeat of large data breaches and tougher state privacy enforcement is underscoring a hard truth for consumers: personal data routinely moves far beyond the place it was first collected.
In early 2026, California reached a record CCPA settlement tied to failures in opt-out controls, while Congress again advanced proposals focused on children and teens.
Meanwhile, breach notifications continue to reveal how widely sensitive data can spread across healthcare, entertainment, finance, and government contractors.


For years, Americans have been told to read privacy policies, click consent pop-ups, and use account settings to control their information. But the past year of breaches, enforcement actions, and new state rules has highlighted a simpler reality: much of what people do online and off ends up copied, shared, and stored across a long chain of companies.

That chain can include app makers, advertisers, analytics firms, data brokers, cloud providers, payment processors, and industry-specific vendors. When one link breaks—through a cyberattack, a misconfiguration, or weak compliance—millions of records can be exposed.

## A patchwork of rules is getting stricter, not simpler
In the United States, privacy rights largely depend on where a person lives and what type of data is involved. There is still no single, comprehensive federal privacy law. Instead, a growing number of state laws now set baseline consumer rights such as access, deletion, and opt-out of certain data uses.

One practical shift in 2026 is the widening requirement for businesses to recognize universal opt-out signals, including the Global Privacy Control (GPC) setting available in some browsers and extensions. In states that require it, GPC is designed to let a user send a one-step signal to opt out of certain forms of data “sale” or “sharing,” rather than clicking through site-by-site controls.
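On the wire, the GPC proposal expresses this one-step signal as a `Sec-GPC: 1` request header (and, in the browser, a `navigator.globalPrivacyControl` property). A minimal server-side sketch of honoring it might look like the following; the handler, profile structure, and `share_with_ad_partners` flag are hypothetical illustrations, not any specific company's implementation:

```python
# Illustrative sketch: honoring a Global Privacy Control (GPC) opt-out
# server-side. Per the GPC proposal, browsers that enable the setting
# send a "Sec-GPC: 1" request header. The profile flag is hypothetical.

def wants_opt_out(headers: dict) -> bool:
    """Return True if the request carries the GPC opt-out signal."""
    # The proposal defines the header value as the string "1" when set.
    return headers.get("Sec-GPC") == "1"

def handle_request(headers: dict, user_profile: dict) -> dict:
    """Apply the opt-out before any 'sale' or 'sharing' would occur."""
    if wants_opt_out(headers):
        user_profile["share_with_ad_partners"] = False  # hypothetical flag
    return user_profile
```

The point of the design is that the signal arrives with every request, so a covered business can apply it without the user filling out a form on each site.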

States have also been narrowing exemptions, lowering thresholds that determine which businesses are covered, and phasing out “right to cure” periods that once gave companies time to fix violations before penalties.

## Enforcement is focusing on whether opt-outs actually work
A key theme in recent enforcement is operational follow-through. Regulators are not only asking whether a company offers an opt-out button, but whether the choice is honored across devices, services, and account-linked identifiers.

In February 2026, California announced a $2.75 million settlement with The Walt Disney Company over allegations that consumers’ opt-out requests were not fully effectuated across devices and streaming services tied to Disney accounts. The agreement also required changes aimed at ensuring opt-outs stop the sale or sharing of personal information as required by state law.

The case is a reminder that privacy compliance is often a systems problem. Modern identity graphs can connect a person’s activity across phones, tablets, smart TVs, browsers, and apps. If a company’s internal plumbing does not propagate a choice across that graph, the consumer-facing control can become more cosmetic than real.
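One way to picture the "plumbing" problem is as a graph traversal: an opt-out recorded against one identifier has to reach every identifier linked to the same person. The sketch below is a toy model under that assumption; the graph contents and identifier names are invented for illustration:

```python
from collections import deque

# Hypothetical identity graph: nodes are identifiers (account IDs, device
# IDs, cookies); an edge links identifiers believed to be the same person.
graph = {
    "account:alice": {"phone:123", "tv:456"},
    "phone:123": {"account:alice", "cookie:789"},
    "tv:456": {"account:alice"},
    "cookie:789": {"phone:123"},
}

def propagate_opt_out(graph: dict, start: str) -> set:
    """Breadth-first walk applying an opt-out to every identifier linked,
    directly or transitively, to the one where the choice was made."""
    opted_out = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, ()):
            if neighbor not in opted_out:
                opted_out.add(neighbor)
                queue.append(neighbor)
    return opted_out
```

A system that only flags the starting node, rather than walking the whole connected component, is exactly the "cosmetic control" failure mode described above.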

## Breach notices keep showing how far sensitive data spreads
Even when a company tries to comply with privacy rules, the security risks of storing data at scale remain. Recent breach reports and filings show how widely sensitive information can be distributed across vendors—especially in healthcare, where billing and claims ecosystems concentrate data.

One of the largest U.S. healthcare breaches disclosed in recent years involved Change Healthcare, which reported that about 192.7 million people were affected by the 2024 cyberattack, with impacts and notifications continuing into 2025.

Breach notifications in early 2026 also continued to involve large healthcare-related service providers. For example, filings and notices tied to health technology and administrative platforms described incidents affecting millions of people, including cases involving personal identifiers and health-related details.

These events are not limited to hospitals or insurers. They often involve the less visible companies that sit behind the scenes—firms that run portals, manage scheduling and billing, provide IT services, or process claims.

## Children and teens are a renewed focus in Washington
While broad federal privacy legislation remains uncertain, lawmakers have kept working on narrower bills. In March 2026, the U.S. Senate unanimously passed COPPA 2.0 (the Children and Teens’ Online Privacy Protection Act), which would expand protections for teenagers and add new limits tied to data collection and use involving minors.

Even if the final form changes during House consideration, the direction of travel is clear. Policymakers are increasingly targeting the kinds of data practices that have become common in youth-facing services, such as engagement-driven design paired with extensive profiling.

## What “not private” looks like in daily life
For many consumers, the loss of privacy is not a single dramatic event. It is the accumulation of routine data flows:

- A streaming account connects device identifiers and viewing behavior.
- A healthcare visit produces administrative records that may be handled by multiple vendors.
- A browser’s ad-tech infrastructure can broadcast signals and identifiers to many parties in milliseconds.
- A breach at any one of those parties can expose data that originated somewhere else.

The result is that “my data” is rarely stored in one place, and “delete” often means removing one copy while others persist elsewhere for legal, technical, or business reasons.
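That fan-out can be sketched as a toy model: one record collected by an app is copied downstream, and a deletion at the point of collection leaves the other copies untouched. Store names and fields here are hypothetical:

```python
# Illustrative model of data fan-out: one record, copied to several
# downstream parties. All store names and fields are hypothetical.
stores = {
    "app": {"user42": {"email": "a@example.com"}},
    "analytics_vendor": {"user42": {"email": "a@example.com"}},
    "data_broker": {"user42": {"email": "a@example.com"}},
}

def delete_at_origin(stores: dict, record_id: str) -> int:
    """Delete only the collector's copy; return how many copies remain."""
    stores["app"].pop(record_id, None)
    return sum(record_id in store for store in stores.values())
```

Real deletion rights under state laws often require passing the request on to service providers, but the model shows why "delete" is an orchestration task rather than a single operation.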

## AI Perspective

Privacy in 2026 is increasingly about execution, not promises. The biggest gaps appear where consumer choices do not reliably travel through complex systems and vendor networks. For most people, practical protection comes from a mix of stronger rules, real enforcement, and better security—because the data economy is built on copying and sharing by default.


The content, including articles, medical topics, and photographs, has been created exclusively using artificial intelligence (AI). While efforts are made for accuracy and relevance, we do not guarantee the completeness, timeliness, or validity of the content and assume no responsibility for any inaccuracies or omissions. Use of the content is at the user's own risk and is intended exclusively for informational purposes.
