Restoring Privacy, Security, and Consumer Protection Under Existing Law
The Electronic Bill of Rights framework for clean data business practices is available upon request.
The Electronic Bill of Rights (EBOR) is not a theoretical framework—it is grounded in existing federal law.
Before individuals can understand how they are harmed, they must first understand how the surveillance capitalism business model operates through essential telecom and technology products—such as smartphones—that are necessary for modern life and come at a cost.
It is also critical to understand which devices and platforms—including those supported by Android, iOS, and Windows—serve as the primary vectors for surveillance, as well as the scale and scope of data extraction from those devices and services.
The following section explains how this process works.
It is not just privacy that is at stake. What is at stake is the systematic exploitation of individuals for profit at the expense of their privacy, security, and safety; their data and financial sovereignty; their biological autonomy (including biometric and genetic information); their civil liberties; and their fundamental human rights, all through products and services they pay for.
Modern connected devices—including smartphones, tablet PCs, connected home products, personal computers, servers, wearable technologies, security and environmental systems, connected vehicles, AI-enabled toys, and other digital services—form a pervasive and continuous data-collection environment.
These devices and platforms operate as always-on data collection nodes, enabled by operating systems such as Android, iOS, and Windows, as well as AI-infused applications, social media platforms, and chatbots.
Within this ecosystem:
This model is primarily supported by data-driven business practices centered on targeted advertising, often described as surveillance-based monetization.
Participation in this ecosystem is frequently governed by non-negotiable terms of service (contracts of adhesion), where access to essential products and services is conditioned on acceptance of broad data collection and use policies.
The collection of data from individuals—including teens, children, and business professionals—is often facilitated through what can be described as “leaky operating systems” (Android, iOS, and Windows). These systems support a broad ecosystem of AI-infused applications, social media platforms, and AI chatbots designed to drive engagement and data collection at scale.
Within this framework, many of these applications and platforms function in ways that may be characterized as “legal malware”—software that operates within the bounds of the predatory contract of adhesion (terms of service) while enabling extensive data extraction, behavioral tracking, and monetization.
These operating systems and their associated ecosystems underpin essential connected telecom and technology products—including smartphones, computers, and other digital services—that are now required for participation in modern life.
The surveillance and data mining conducted by Alphabet (Google), Apple, Microsoft, Meta, TikTok USDS JV, ByteDance (China), Wildberries (Russia), and others is "indiscriminate," meaning these companies collect the following categories of highly confidential information (over 5,000 data points) from individuals, businesses, and government agencies:
All data collected from individuals—including teens and children—is aggregated into identifiable digital profiles, often referred to as Digital DNA. These profiles represent detailed behavioral, personal, and usage patterns tied to a specific individual.
Operating system providers and developers—including those behind Android, iOS, and Windows—enable the creation of these identifiable user profiles through system-level data access, application activity, and integrated services. These profiles are then leveraged for monetization, primarily through targeted advertising, as well as other commercial uses, including data sharing and resale.
In addition to data collected directly from device owners and end users, operating system providers, app developers, and AI platforms may also acquire and combine user profile data from third-party sources. This results in a layered aggregation model, where multiple entities contribute to and exploit increasingly comprehensive user profiles.
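The layered aggregation model described above can be made concrete with a short sketch. The code below is illustrative only: every source name, identifier, and field is hypothetical, and real aggregation pipelines are vastly larger and more opaque. The point it demonstrates is the structural one made in this section—once fragments from separate entities share a common identifier, they collapse into a single comprehensive profile.

```python
# Illustrative sketch of layered profile aggregation.
# All sources, identifiers, and field names are hypothetical.

def merge_profiles(fragments):
    """Combine profile fragments keyed by a shared device/user ID."""
    profiles = {}
    for source, records in fragments.items():
        for record in records:
            uid = record["device_id"]
            profile = profiles.setdefault(uid, {"sources": []})
            profile["sources"].append(source)
            # Each contributor layers its own attributes onto the profile.
            for key, value in record.items():
                if key != "device_id":
                    profile[key] = value
    return profiles

# Hypothetical fragments contributed by three separate entities.
fragments = {
    "operating_system": [{"device_id": "abc-123", "location": "city-level GPS"}],
    "social_app":       [{"device_id": "abc-123", "interests": ["fitness", "news"]}],
    "data_broker":      [{"device_id": "abc-123", "purchase_history": ["electronics"]}],
}

merged = merge_profiles(fragments)
# One identifier now links location, interests, and purchase history
# contributed by three independent entities.
```

No single contributor holds the full profile; the comprehensiveness emerges only from the shared identifier, which is why identifier-level linkage is central to the harms this section describes.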
The outcome is a data ecosystem in which:
The widespread use of highly addictive and manipulative technologies embedded in AI-infused applications, social media platforms, and AI chatbots has contributed to a growing global technology addiction crisis. This crisis affects adults, teens, and children alike and has been associated with increases in anxiety, depression, violence, online bullying, self-harm, public discord, political polarization, and suicide.
At some point, the question needs to be asked:
"If addiction, harm, mental health decline, exploitation, and the loss of life among teens and children are not the line at which government, advertisers, and Big Tech change their business models, then what is?"
Individuals who access the internet through devices supported by the Android, iOS, or Windows operating systems are subject to large-scale data collection and profiling within Google's global advertising ecosystem, supported by the AdTech AI-Quantum Control Platform (Core), which distributes targeted ads to billions of people around the world 24/7, 365 days per year.
Google's ecosystem—often described as the AdTech Core and its ecosystem of micro cores—aggregates user data to enable the delivery of targeted advertising across international markets. Through a network of over 35,000 data brokers, developers, advertisers, PR agencies, and platform providers, user information can be distributed and utilized across multiple jurisdictions worldwide, including China and Russia.
This includes regions with varying regulatory standards, raising important questions about:
These platforms operate through interconnected advertising infrastructures—sometimes referred to as “micro AdTech ecosystems”—that support real-time bidding, audience targeting, and behavioral profiling at planetary scale.
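The real-time bidding mechanism at the heart of these interconnected advertising infrastructures can be sketched in a few lines. This is a deliberately simplified, hypothetical model—real exchanges follow industry specifications such as OpenRTB and complete auctions in milliseconds—but it illustrates how a single bid request carrying profile signals is broadcast to competing bidders, with the impression sold to the highest bid.

```python
# Simplified sketch of a real-time bidding (RTB) auction.
# Bidder names, bid logic, and dollar values are hypothetical.

def run_auction(bid_request, bidders):
    """Send one bid request to all bidders; the highest bid wins."""
    bids = [(bidder(bid_request), name) for name, bidder in bidders.items()]
    price, winner = max(bids)
    return winner, price

# A bid request carries profile signals used for audience targeting.
bid_request = {"interests": ["travel"], "location": "US"}

# Demand-side platforms price the impression based on profile match.
bidders = {
    "dsp_travel":  lambda req: 2.50 if "travel" in req["interests"] else 0.10,
    "dsp_generic": lambda req: 0.75,
}

winner, price = run_auction(bid_request, bidders)
# The bidder whose targeting matches the profile wins the impression.
```

Because every bidder receives the profile signals in the request—win or lose—the auction itself functions as a data-distribution mechanism, which is why the section above describes these infrastructures as enabling behavioral profiling at planetary scale.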
Through biometric data, individuals can potentially be identified and located—even without carrying their personal device—using technologies such as facial recognition and voice pattern analysis. These capabilities can operate through nearby connected devices, cameras, or other networked products in close proximity via Google's AdTech Core and global ecosystem. For advertisers, this is among the most powerful systems ever created; for military use, it is among the most powerful targeting systems on the planet.
Global advertising technology (AdTech) infrastructures—originally developed for commercial targeting—are increasingly being examined in the context of national security and information operations.
Large-scale AdTech platforms and their surrounding ecosystems enable capabilities such as:
These same capabilities may be leveraged beyond commercial use cases. Governments and state-aligned actors have been documented utilizing data-driven platforms and digital ecosystems for purposes including:
This convergence of commercial data infrastructure and government use cases is often described as civil–military fusion, where technologies developed in the private sector are adapted for national security and strategic operations.
Chinese and Russian civil–military programs, as well as U.S. programs involving Palantir Technologies' Gotham intelligence and military core, pose massive threats to the privacy, security, and safety of everyone, including teens and children, connected to the internet through any device or service supported by Android, iOS, or Windows.
For decades, telecommunications, consumer protection, child safety, and privacy laws have established clear legal obligations regarding the protection of user data, confidentiality, and the lawful use of communications infrastructure.
Yet today, consumers—including minors—are routinely subjected to pervasive surveillance and data mining through smartphones, applications, operating systems, and connected products, despite the existence of these established legal safeguards.
These practices are often justified through terms of service and privacy policies. However, EBOR asserts that contractual consent does not override statutory law, particularly when services are essential to modern life.
“You cannot contract your way out of federal law.”
Telecommunications infrastructure—including wireless, landline, fiber, and satellite—is regulated under a public interest framework, not a purely commercial one.
The use of this infrastructure for surveillance-based business models raises serious legal and constitutional questions.
47 U.S.C. § 222
Telecommunications carriers are legally required to protect the confidentiality of customer data, including:
Carriers do not own this data—they are custodians with a duty to protect it.
47 C.F.R. §§ 64.2001–64.2011
These rules implement Section 222 and require:
These are enforceable regulatory obligations, not optional guidelines.
3. Unjust and Unreasonable Practices
47 U.S.C. § 201(b)
It is unlawful for telecommunications providers to engage in practices that are unjust or unreasonable.
The misuse of customer data—especially at scale—may fall within this prohibition.
47 U.S.C. § 202(a)
This statute prohibits discriminatory or coercive practices in telecommunications services, including conditions that unfairly burden consumers.
Telecommunications carriers operate under a legal requirement to serve the:
“Public interest, convenience, and necessity.”
This establishes a public trust obligation—meaning infrastructure cannot be used solely for exploitative commercial purposes at the expense of consumers.
FTC Act – 15 U.S.C. § 45
Prohibits:
This applies directly to digital platforms, apps, and data-driven services.
Federal law also addresses national security risks posed by applications and platforms that are owned, controlled, or influenced by foreign adversaries.
Key Provisions:
Consumers are told: “You agreed to it.”
But in reality:
This raises a fundamental question:
Is consent valid when participation in modern society requires acceptance?
At both the state and federal level, a consistent pattern has emerged:
The result:
Incomplete legislation + weak enforcement = systemic backdoors for exploitation
This issue extends far beyond privacy.
It impacts:
Key concerns include:
The Electronic Bill of Rights is based on three fundamental legal principles:
Terms of service cannot waive federally protected rights.
Coerced or conditional consent is not valid consent.
Licensed spectrum and regulated networks are not private data-extraction systems.
The issue is not the absence of law.
The issue is:
EBOR exists to:
The Electronic Bill of Rights calls on:
to recognize and act on the following reality:
Privacy, security, and safety are not optional features. They are legal rights.
This material is provided for informational and educational purposes only and does not constitute legal advice. Readers should consult qualified legal counsel for advice regarding specific legal matters.

Documented Categories of Harm: Targeted Advertising
There are many documented harms—including loss of life—associated with highly addictive and manipulative AI-infused apps, social media platforms, and emerging AI chatbots that can induce the ELIZA effect: the human tendency to attribute human emotion, trust, and authority to AI-driven apps, chatbots, and evolving connected products and services.
Outlined below are the most urgent categories of harm addressed by the Electronic Bill of Rights (EBOR) framework:
Ironically, those who design, protect, and profit from the system may also be affected by its consequences—along with their own families—as future generations are increasingly born into oppressive, intrusive, exploitative, and harmful AI- and quantum-driven digital environments supported by essential consumer products and services. In this context, individuals may unintentionally fund their own oppression and exploitation, at the expense of their privacy, security, and safety, through the very products and services they pay for.
There is a better way forward for everyone.
Together, these harms underscore the urgent need for ethical, clean-data standards and practices by way of the EBOR framework.