Electronic Bill of Rights (EBOR)

Clean Data Alliance Electronic Bill of Rights Framework


EBOR: Framework & Action

 

The Electronic Bill of Rights (EBOR)


Restoring Privacy, Security, and Consumer Protection Under Existing Law


The Electronic Bill of Rights Framework for Clean Data Business Practices is available upon request.

 

Executive Summary


The Electronic Bill of Rights (EBOR) is not a theoretical framework—it is grounded in existing federal law.

 

Before individuals can understand how they are harmed, they must first understand how the surveillance capitalism business model operates through essential telecom and technology products—such as smartphones—that are necessary for modern life and that consumers pay for.


It is also critical to understand which devices and platforms—including those supported by Android, iOS, and Windows—serve as the primary vectors for surveillance, as well as the scale and scope of data extraction from those devices and services.


The following section explains how this process works, including:


  • How data is collected, aggregated, and analyzed 
  • The volume and types of highly confidential information involved 
  • How that data is used, shared, and monetized 
  • The legal mechanisms enabling these practices, including contracts of adhesion (terms of service) that condition access to essential products and services on acceptance of broad data collection policies


It is not just privacy that is at stake—it is the systematic exploitation of individuals for profit at the expense of their privacy, security, and safety, as well as their data and financial sovereignty, biological autonomy (including biometric and genetic information), civil liberties, and fundamental human rights by way of products and services that cost money.


Always-On Surveillance and Data Mining Nodes: Connected Telecom and Technology Products


Modern connected devices—including smartphones, tablet PCs, connected home products, personal computers, servers, wearable technologies, security and environmental systems, connected vehicles, AI-enabled toys, and other digital services—form a pervasive and continuous data-collection environment.


These devices and platforms operate as always-on data collection nodes, enabled by operating systems such as Android, iOS, and Windows, as well as AI-infused applications, social media platforms, and chatbots.


Within this ecosystem:


  • Data is continuously collected, transmitted, and analyzed 
  • User behavior, location, interactions, and preferences are monitored 
  • Multiple layers of software and services contribute to ongoing data aggregation 


This model is primarily supported by data-driven business practices centered on targeted advertising, often described as surveillance-based monetization.


Participation in this ecosystem is frequently governed by non-negotiable terms of service (contracts of adhesion), where access to essential products and services is conditioned on acceptance of broad data collection and use policies.

 

Indiscriminate Data Collection Across Global Technology Platforms — Leaky Operating Systems and “Legal Malware”


Data collected from individuals—including teens, children, and business professionals—is often facilitated through what can be described as “leaky operating systems” (Android, iOS, and Windows). These systems support a broad ecosystem of AI-infused applications, social media platforms, and AI chatbots that are designed to drive engagement and data collection at scale.


Within this framework, many of these applications and platforms function in ways that may be characterized as “legal malware”—software that operates within the bounds of the predatory contract of adhesion (terms of service) while enabling extensive data extraction, behavioral tracking, and monetization.


These operating systems and their associated ecosystems underpin essential connected telecom and technology products—including smartphones, computers, and other digital services—that are now required for participation in modern life.


The surveillance and data mining conducted by Alphabet (Google), Apple, Microsoft, Meta, TikTok USDS JV, ByteDance (China), Wildberries (Russia), and others is "indiscriminate," meaning they collect the following buckets of highly confidential information (over 5,000 data points) from individuals, businesses, and government agencies:


  • Personal, Business, Employment, Financial, and Legal
  • Medical, Health, and Biometric
  • Location, Geofence, and Motion
  • Behavioral 
  • Sensitive User Data (Texts, Emails, Attachments, Calendar, Contacts, Phone/Messaging Logs, Accounts, Audio Recordings of the End User, Photos, Videos, Files, and Other Specific User Data)
  • External Information from Sources Connected to the Host Device: Thumb Drives, External Hard Drives, Computers, TVs, Connected Vehicles, and Any Other Connected Source
  • Indiscriminate Audio, Video, and Physical Surveillance of Personal and Business Activities, 24/7/365
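As a concrete illustration of how the sensitive-data and location buckets above map onto device-level access controls, the following hypothetical Python sketch groups real Android runtime permission strings under this document's bucket names. The permission constants are genuine Android identifiers; the grouping itself is illustrative, not an official Google taxonomy.

```python
# Hypothetical mapping of this document's data buckets to real Android
# runtime permissions. The permission strings are actual Android constants;
# the bucket grouping is illustrative only.
DATA_BUCKETS = {
    "Sensitive User Data": [
        "android.permission.READ_SMS",
        "android.permission.READ_CONTACTS",
        "android.permission.READ_CALENDAR",
        "android.permission.READ_CALL_LOG",
        "android.permission.RECORD_AUDIO",
        "android.permission.READ_EXTERNAL_STORAGE",  # photos, videos, files
    ],
    "Location, Geofence, and Motion": [
        "android.permission.ACCESS_FINE_LOCATION",
        "android.permission.ACCESS_BACKGROUND_LOCATION",
        "android.permission.ACTIVITY_RECOGNITION",
    ],
    "Medical, Health, and Biometric": [
        "android.permission.BODY_SENSORS",
        "android.permission.CAMERA",  # imaging can carry face data
    ],
}

def buckets_reachable(manifest_permissions, buckets=DATA_BUCKETS):
    """Return which data buckets an app's requested permissions can reach."""
    granted = set(manifest_permissions)
    return sorted(b for b, perms in buckets.items() if granted & set(perms))

# Example: a simple utility app requesting far more than its function needs
reachable = buckets_reachable([
    "android.permission.CAMERA",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.READ_CONTACTS",
])
```

A reviewer auditing an app manifest could use a mapping like this to see, at a glance, which confidential-data categories a given permission set opens up.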

 

Digital DNA Profiling and Data Exploitation


All data collected from individuals—including teens and children—is aggregated into identifiable digital profiles, often referred to as Digital DNA. These profiles represent detailed behavioral, personal, and usage patterns tied to a specific individual.


Operating system providers and developers—including those behind Android, iOS, and Windows—enable the creation of these identifiable user profiles through system-level data access, application activity, and integrated services. These profiles are then leveraged for monetization, primarily through targeted advertising, as well as other commercial uses, including data sharing and resale.


In addition to data collected directly from device owners and end users, operating system providers, app developers, and AI platforms may also acquire and combine user profile data from third-party sources. This results in a layered aggregation model, where multiple entities contribute to and exploit increasingly comprehensive user profiles.
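The layered aggregation model described above can be sketched as a simple merge of first-party and broker-supplied records keyed to one persistent identifier. All names, fields, and values below are hypothetical.

```python
# Hypothetical sketch of layered profile aggregation: a first-party profile
# is enriched with records acquired from third-party brokers, all keyed to
# the same persistent user identifier. Every name and value is invented.
first_party = {"user_id": "u-789", "os": "Android", "apps_used": ["maps", "mail"]}

third_party_feeds = [
    {"user_id": "u-789", "source": "broker-A", "income_band": "50-75k"},
    {"user_id": "u-789", "source": "broker-B", "home_zip": "78201"},
]

def enrich(profile, feeds):
    """Merge matching third-party records into the profile; later feeds layer on top."""
    merged = dict(profile)
    for feed in feeds:
        if feed["user_id"] == profile["user_id"]:
            merged.update(
                {k: v for k, v in feed.items() if k not in ("user_id", "source")}
            )
    return merged

digital_dna = enrich(first_party, third_party_feeds)
```

Each broker contributes a few fields, but because every record shares one identifier, the merged result is more revealing than anything any single party collected on its own.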


The outcome is a data ecosystem in which:


  • Individuals are continuously profiled across devices and platforms 
  • Data is combined, enriched, and redistributed among multiple parties 
  • Both users and paying customers may be subject to profiling and monetization 
  • Sensitive behavioral and personal data becomes a commercial asset
  • Highly addictive AI-infused apps, social media platforms, and AI chatbots programmed to induce the ELIZA effect (a form of AI indoctrination) are used to ensure maximum engagement

 

The widespread use of highly addictive and manipulative technologies embedded in AI-infused applications, social media platforms, and AI chatbots has contributed to a growing global technology addiction crisis. This crisis affects adults, teens, and children alike and has been associated with increases in anxiety, depression, violence, online bullying, self-harm, public discord, political polarization, and suicide.


At some point, the question needs to be asked:


"If addiction, harm, mental health decline, exploitation, and the loss of life among teens and children are not the line at which government, advertisers, and Big Tech change their business models, then what is?" 

 

Google's Global AdTech Infrastructure and User Profiling


Individuals who access the internet through devices supported by the Android, iOS, or Windows operating systems are subject to large-scale data collection and profiling within Google's global advertising ecosystem, supported by the AdTech AI-Quantum Control Platform (Core), which distributes targeted ads to billions of people around the world, 24/7/365.


Google's ecosystem—often described as the AdTech Core and its ecosystem of micro cores—aggregates user data to enable the delivery of targeted advertising across international markets. Through a network of over 35,000 data brokers, developers, advertisers, PR agencies, and platform providers, user information can be distributed and utilized across multiple jurisdictions worldwide, including China and Russia.


This includes regions with varying regulatory standards, raising important questions about:


  • Where user data is processed and stored 
  • Who has access to that data across global markets 
  • How data flows between commercial ecosystems and international entities 
  • The extent to which user profiles are shared, licensed, or monetized globally
  • The digital signatures used to deliver targeted ads to a user anywhere in the world, including GPS, geofencing, near-field communication (NFC) tags, Bluetooth, Wi-Fi access points, cellular tower triangulation, and biometric data such as facial recognition and voice prints


These platforms operate through interconnected advertising infrastructures—sometimes referred to as “micro AdTech ecosystems”—that support real-time bidding, audience targeting, and behavioral profiling at planetary scale.
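The real-time bidding process described above can be sketched in miniature. The structure below loosely follows the OpenRTB 2.x bid-request schema used across programmatic ad exchanges; every value is invented, and the point is simply which categories of signals (device advertising IDs, GPS coordinates, synced user IDs) are broadcast to every bidder in an auction.

```python
import json

# A simplified, hypothetical bid request modeled loosely on the OpenRTB 2.x
# schema used in programmatic ad exchanges. All values are invented; the
# point is the kinds of signals shared with bidders in real time.
bid_request = {
    "id": "auction-0001",                      # hypothetical auction ID
    "imp": [{"id": "1", "banner": {"w": 320, "h": 50}}],
    "device": {
        "os": "Android",
        "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",   # advertising ID
        "geo": {"lat": 29.42, "lon": -98.49, "type": 1},  # GPS-derived fix
        "carrier": "example-carrier",
    },
    "user": {
        "id": "exchange-user-123",             # exchange-side profile key
        "buyeruid": "dsp-user-456",            # cookie-synced buyer-side ID
    },
}

# Every bidder in the auction receives this payload and can log it,
# enriching its own profile of the user whether or not it wins the bid.
payload = json.dumps(bid_request)
```

Note that the losing bidders still see the request, which is why a single page view or app open can feed location and identity signals to many parties at once.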


Through biometric data, individuals can potentially be identified and located—even without carrying their personal device—using technologies such as facial recognition and voice pattern analysis. These capabilities can operate through nearby connected devices, cameras, or other networked products within close proximity via Google's AdTech Core and global ecosystem. For advertisers, this is the greatest delivery system ever created; for military use, it is the greatest targeting system on the planet.


Weaponized AdTech Technologies: The Rise of Global Civil–Military Fusion


Global advertising technology (AdTech) infrastructures—originally developed for commercial targeting—are increasingly being examined in the context of national security and information operations.


Large-scale AdTech platforms and their surrounding ecosystems enable capabilities such as:


  • Behavioral profiling at population scale 
  • Real-time data aggregation and analysis 
  • Precision targeting of individuals and groups 
  • Content amplification and distribution across digital channels 


These same capabilities may be leveraged beyond commercial use cases. Governments and state-aligned actors have been documented utilizing data-driven platforms and digital ecosystems for purposes including:


  • Intelligence gathering and surveillance 
  • Information operations, including misinformation and propaganda 
  • Influence and behavioral shaping at scale 
  • Strategic espionage and data mining and analytics 
  • Targeted digital campaigns across populations 
  • Blackmail


This convergence of commercial data infrastructure and government use cases is often described as civil–military fusion, where technologies developed in the private sector are adapted for national security and strategic operations.


Chinese and Russian civil–military programs, as well as U.S. programs involving Palantir Technologies' Gotham intelligence and military platform, pose massive threats to the privacy, security, and safety of everyone connected to the internet by way of any device or service supported by Android, iOS, or Windows, including teens and children.

 

Non-Enforcement of Existing Consumer, Child Protection, Privacy, and National Security Laws Beyond the Scope of Section 230


For decades, telecommunications, consumer protection, child safety, and privacy laws have established clear legal obligations regarding the protection of user data, confidentiality, and the lawful use of communications infrastructure.


Yet today, consumers—including minors—are routinely subjected to pervasive surveillance and data mining through smartphones, applications, operating systems, and connected products, despite the existence of these established legal safeguards.


These practices are often justified through terms of service and privacy policies. However, EBOR asserts that contractual consent does not override statutory law, particularly when services are essential to modern life.

 

Key Insight


“You cannot contract your way out of federal law.”
 

Telecommunications infrastructure—including wireless, landline, fiber, and satellite—is regulated under a public interest framework, not a purely commercial one.
The use of this infrastructure for surveillance-based business models raises serious legal and constitutional questions.


 

The Legal Foundation Already Exists


1. Customer Proprietary Network Information (CPNI)


47 U.S.C. § 222


Telecommunications carriers are legally required to protect the confidentiality of customer data, including:


  • Call records 
  • Usage data 
  • Location information 
  • Service-related metadata 


Carriers do not own this data—they are custodians with a duty to protect it.

 

2. FCC CPNI Regulations


47 C.F.R. §§ 64.2001–64.2011


These rules implement Section 222 and require:


  • Explicit customer approval for data use in many cases 
  • Safeguards against unauthorized access 
  • Mandatory breach notification 
  • Restrictions on disclosure of sensitive data 

These are enforceable regulatory obligations, not optional guidelines.


3. Unjust and Unreasonable Practices


47 U.S.C. § 201(b)


It is unlawful for telecommunications providers to engage in practices that are unjust or unreasonable.


The misuse of customer data—especially at scale—may fall within this prohibition.


 

4. Non-Discrimination in Service


47 U.S.C. § 202(a)


This statute prohibits discriminatory or coercive practices in telecommunications services, including conditions that unfairly burden consumers.


 

5. Public Interest Obligation (Communications Act Standard)


Telecommunications carriers operate under a legal requirement to serve the:

“Public interest, convenience, and necessity.”
This establishes a public trust obligation—meaning infrastructure cannot be used solely for exploitative commercial purposes at the expense of consumers.

 

6. Consumer Protection Law


FTC Act – 15 U.S.C. § 45


Prohibits:


  • Unfair business practices 
  • Deceptive representations 
  • Misleading privacy disclosures 


This applies directly to digital platforms, apps, and data-driven services.


 

Additional Federal Law: Foreign-Controlled Applications

Protecting Americans from Foreign Adversary Controlled Applications Act


This law is designed to address national security risks posed by applications and platforms that are owned, controlled, or influenced by foreign adversaries.


Key Provisions:


  • Targets Foreign Adversary Control
    Applies to applications owned or controlled by entities in countries designated as foreign adversaries (e.g., China, Russia). 


  • Divestment or Ban Requirement
    Requires companies to divest foreign ownership/control or face removal (ban) from U.S. app stores and infrastructure. 


  • Focus on Data Security & Influence Operations
    Addresses risks related to: 
    • Mass data collection on U.S. citizens 
    • Surveillance and profiling 
    • Algorithmic manipulation and propaganda 


  • Applies to App Stores and Distribution Platforms
    Prohibits companies like Apple and Google from distributing non-compliant applications in the U.S. 


  • National Security Enforcement Mechanism
    Gives the federal government authority to act when foreign-controlled platforms pose a threat to U.S. national security. 


The Core Problem


Consumers are told: “You agreed to it.”
 

But in reality:


  • Refusing terms often means losing access to essential telecommunication and internet services regulated by the FCC
  • Devices are purchased, yet functionality is restricted without consent 
  • Data collection is bundled into non-negotiable agreements 
  • Federal law protects against discrimination, meaning no company can bar a person from access to telecommunications and internet infrastructure by way of a contract of any kind, especially through coercion tied to products and services that cost money
  • No contract can supersede federal law regarding unauthorized surveillance and data mining through any essential telecommunications or technology product or service, such as a smartphone, supported by telecom and internet infrastructure regulated by the FCC under existing public-trust obligations


This raises a fundamental question:


Is consent valid when participation in modern society requires acceptance? 

Systemic Gaps in Enforcement


At both the state and federal level, a consistent pattern has emerged:


  • Meaningful enforcement is limited 
  • Legislative efforts often stall during escalation 
  • Critical provisions are frequently removed during the legislative process 
  • Industry lobbying results in incomplete or weakened laws 


The result:


Incomplete legislation + weak enforcement = systemic backdoors for exploitation


 

National Security and Public Safety Implications


This issue extends far beyond privacy.


It impacts:


  • Government systems 
  • Critical infrastructure (energy, utilities, telecom) 
  • Enterprise networks 
  • Schools and homes 
  • Devices used by minors 


Key concerns include:


  • Foreign access to sensitive data 
  • Surveillance through consumer devices 
  • Exposure of behavioral, biometric, and location data 
  • Risks to children through always-on digital ecosystems


 

Public-Interest Questions That Must Be Answered


  • How much sensitive data is being collected and where is it stored? 
  • Who has access to that data—domestically and globally? 
  • Are existing confidentiality laws being violated through modern platforms? 
  • What protections exist for minors inside connected environments? 
  • How much surveillance is occurring through devices already paid for by consumers? 


 

The EBOR Position


The Electronic Bill of Rights is based on three fundamental legal principles:


1. Statutory Law Overrides Contracts


Terms of service cannot waive federally protected rights.


2. Consent Must Be Meaningful


Coerced or conditional consent is not valid consent.


3. Telecom Infrastructure Carries Public Obligations


Licensed spectrum and regulated networks are not private data-extraction systems.


 

Why EBOR Is Necessary


The issue is not the absence of law.


The issue is:


  • Lack of enforcement 
  • Incomplete legislation 
  • Business models built on surveillance and data extraction 


EBOR exists to:


  • Clarify rights 
  • Enforce existing law 
  • Close legal loopholes 
  • Protect consumers, families, and national security


 

Call to Action


The Electronic Bill of Rights calls on:


  • Lawmakers 
  • Regulators 
  • Businesses 
  • Critical infrastructure operators 
  • Consumers 


to recognize and act on the following reality:


Privacy, security, and safety are not optional features. They are legal rights.

 

Legal Disclaimer


This material is provided for informational and educational purposes only and does not constitute legal advice. Readers should consult qualified legal counsel for advice regarding specific legal matters.

Learn More by Contacting Rex M. Lee

Documented Categories of Harm: Targeted Advertising


There are many documented harms—including loss of life—associated with highly addictive and manipulative AI-infused apps, social media platforms, and emerging AI chatbots that can induce the ELIZA effect: the human tendency to attribute human emotion, trust, and authority to AI-driven apps, chatbots, and evolving connected products and services.


Outlined below are the most urgent categories of harm addressed by the Electronic Bill of Rights (EBOR) framework:

 

  • Tech Addiction – Intentional design patterns engineered to maximize engagement can create compulsive use, dependency, and loss of healthy behavioral control, resulting in harm.
     
  • Psychological Harm – Prolonged exposure to addictive and manipulative digital environments is linked to online bullying, violence, anxiety, depression, self-harm, loneliness, reduced emotional well-being, and suicidal thoughts and, in some cases, suicidal actions, a dynamic explored in documentaries such as The Social Dilemma.
     
  • Cognitive Manipulation – Algorithmic targeting and persuasive design can influence perception, decision-making, and behavior without meaningful awareness or consent, enabling consumer, political, and ideological indoctrination by multinational corporations, including those that collude with governments, among them governments of adversarial nations.
     
  • Tech-Based Hybrid Warfare – Governments, political actors, militaries, and intelligence agencies have infiltrated consumer-grade social media platforms, using addictive AI-infused apps, platforms, and chatbots to wage psychological and cognitive warfare against everyone connected to the internet, including teens and children.
     
  • Government Surveillance by Proxy – Concerns have been raised that close alignment between government institutions and Alphabet/Google, Microsoft, Apple, Amazon, Meta, and other developers has contributed to the erosion of privacy, security, safety, data and financial agency, and biological agency (biometric and physical DNA), while weakening civil liberties and human rights at global scale.


  • Weaponized Consumer Products and Services – Smartphones, tablets, connected products, and PCs supported by Android, iOS, and Windows—now essential to modern life—have evolved into continuous surveillance and data-collection nodes capable of always-on audio, video, and physical surveillance, extracting data for profit without compensating either the product owner or the end user. Never before in the history of consumerism have consumer product manufacturers and developers weaponized their products and services against their own paying customers and end users, including teens and children, whether directly or through governments acting by proxy. Governments around the world are conducting tech-based hybrid warfare, in real time, against their adversaries, including their own populations, seeking to eliminate privacy, security, safety, civil liberties, and human rights by way of essential consumer telecom and tech products and services paid for by the product owner and subscriber.
     
  • Forced Participation by Way of Predatory Agreements – Essential consumer products such as smartphones and connected products supported by AI-infused apps, social media platforms, and chatbots require acceptance of predatory, non-negotiable contracts of adhesion (terms of service) as a condition of use, resulting in forced participation in data-driven advertising ecosystems in ways that compromise privacy, security, and safety due to the addictive technologies embedded in AI-infused app and social media ecosystems globally. Adults, business professionals, corporations, government agencies, and teens ages 13–17 must accept these predatory terms of service; if they refuse, they cannot use the products and services they paid for. This is the definition of consumer oppression and cyber-enslavement, as product owners and users are turned into uncompensated information producers exploited for profit 24/7/365.


  • Deceptive Trade Practices – Illegal Consumer Agreements – Ultimately, neither the product owner nor the end user may be able to provide fully informed and legally meaningful consent, particularly when the agreements they are asked to accept are difficult to read, digest, or reasonably understand prior to the purchase or use of a product or service, as required by law. The collective terms and conditions, privacy policies, end-user license agreements, and in-device legal disclosures can total more than 3,000 pages of complex legal language at the moment the product owner clicks “I Agree.” Developers and manufacturers often provide limited transparency regarding their business practices, coding, algorithms, operating-system behaviors, and the functioning of potentially intrusive or addictive AI-infused apps and platforms, while participation is effectively required through non-negotiable terms of service. By splitting the collective terms of service into numerous categories—including online terms and conditions, privacy policies, and end-user license agreements—while disclosing other portions of the agreement within the device itself, such as application permissions, app-specific warnings, and product notices, meaningful transparency is undermined and confusion is created. This structure functions as an intentional legal strategy that makes the agreement tortuous, complex, and exhaustive to read by design, constituting a deceptive trade practice.


  • Covert Piggyback Coding and Cross Platform Surveillance and Data Mining-  It is common in the technology industry for developers to gain access to paying customers and end users through preinstalled app agreements, enabling the marketing and distribution of intrusive, AI-infused apps and platforms across the Android, iOS, and Windows ecosystems—an arrangement that is transparent to the product owner.  However, what is not transparent is that multinational corporations, including those based in adversarial nations, can enter into piggybacking and cross-platform surveillance and data-mining agreements by embedding code within each other’s branded applications through a methodology known as “piggybacking.”  Under this structure, two or more companies or entities—including governments—can monitor, track, and data-mine an individual through a single intrusive AI-infused app without meaningful transparency to the product owner, constituting a deceptive trade practice.   This dynamic was highlighted in the 2024 Google antitrust litigation, which exposed agreements between Apple and Alphabet/Google in which Apple’s Safari browser defaulted to a Google-developed search engine that was not overtly branded as Google—effectively misleading Apple device owners who had moved away from Android-based smartphones in an effort to separate from Google’s surveillance- and data-extraction-driven business practices.
     
  • Financial Exploitation & The Rise of Cyber-Enslavement – Behavioral targeting, digital DNA profiling, surveillance-based pricing models, device-level data extraction, and downstream data monetization practices can generate disproportionate economic value from users without meaningful transparency, compensation, or direct benefit. In most cases, neither the product owner nor the end user is compensated for the valuable information produced through their use of digital devices—data that may contribute to substantial revenues within the global data-brokerage and advertising ecosystem.   Compounding these concerns, contractual terms governing many AI-infused apps and digital services place the costs of connectivity, data transmission, and platform access on the user, even as those same interactions enable ongoing data collection and monetization tied to voice, data, and internet usage plans—meaning that product owners and subscribers ultimately pay for the data being extracted from their own products and services that is later exploited for billions in profits.
     
  • Digital Discrimination and Identifiable Digital DNA Profiles – In some data-driven ecosystems, developers and data brokers may assemble highly detailed, identifiable Digital DNA profiles (user profiles) derived from over 5,000 highly confidential data points extracted from the product owner's devices and PCs. Identifiable Digital DNA profiles include personal, business, medical, legal, biometric, health and fitness, employment, location, and sensitive user data. Sensitive user data (personal and business) includes ID, messaging, emails, attachments (PDFs, photos, documents, etc.), calendar events, contacts, account information, and other confidential and protected information, as the surveillance and data mining are indiscriminate.


  • Weaponized Identifiable Digital DNA Profiles – Digital DNA profiles can be weaponized against the product owner or end user by developers, manufacturers, multinational corporations, and governments by proxy. These identifiable profiles can be used in ways that influence social scoring, limiting financial or career opportunities. The profiles can also be used to cause financial harm through higher insurance pricing, higher interest on loans, and other surveillance-based pricing models in which individuals are offered different terms or costs for the same products and services. Concerns have also been raised that large-scale data analytics could enable population-level segmentation by socioeconomic status or other demographic factors, potentially exposing vulnerable groups to heightened targeting, marginalization, or virtual genocide through economic warfare. In some cases, real-world violence—including acts described as genocide—has been documented, such as events in Myanmar involving the role of Meta’s Facebook and Instagram platforms and the Myanmar government, as reported by Amnesty International. More broadly, algorithmic profiling has been shown in some cases to reinforce bias or unequal treatment across areas such as employment, finance, housing, and access to essential services.
     
  • Loss of Privacy, Civil Liberties, and Human Rights by Proxy – Concerns have been raised that close alignment between government institutions and major technology platforms has contributed to a significant erosion of privacy, civil liberties, and human rights through the widespread use of essential consumer technologies. Continuous monitoring, location tracking, behavioral profiling, and large-scale data extraction can blur personal boundaries and confidentiality across both online and offline environments—at home, in vehicles, workplaces, medical settings, legal consultations, social spaces, and everyday public life. Particular concern has been expressed regarding children, who increasingly enter a fully surveilled, data-extractive digital environment from birth and may experience limited practical privacy throughout their lifetime, as they are born into digital servitude. For these reasons, some policymakers, advocates, and industry leaders argue that updated legal and ethical safeguards—such as those proposed in an Electronic Bill of Rights—are necessary to protect individual autonomy, dignity, civil liberties, and fundamental human rights in a society where daily life is deeply integrated with digital systems.
     
  • Loss of Data and Financial Agency Due to Internet Centralization – Alphabet/Google, Microsoft, and Apple control access to global internet trade and commerce by monopolizing the operating-system (Android, iOS, Windows) market while controlling the global development and distribution of AI-infused apps, chatbots, social media, streaming services, cloud storage platforms, medical platforms, banking platforms, retail purchasing platforms, digital currency platforms, and other essential platforms. AI-infused apps and platforms—and future AI- and quantum-driven applications—must be developed for Android, iOS, or Windows in order to achieve global distribution through Alphabet/Google, Apple, and Microsoft, either via their app stores or through multibillion-dollar preinstalled app agreements. These distribution controls are used to limit business competition and innovation while eliminating much of the user’s data and financial agency, and they may also violate antitrust and existing business-competition fairness laws in numerous ways.
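The covert piggybacking pattern described above, in which a single branded app fans one user event out to multiple embedded partners, can be sketched as follows. All class, company, and event names here are invented for illustration.

```python
# Hypothetical sketch of "piggybacking": one branded app embeds other
# parties' SDKs, so a single user event reaches multiple collectors the
# product owner never knowingly agreed to. All names are invented.
class EmbeddedSDK:
    def __init__(self, owner):
        self.owner = owner
        self.collected = []

    def on_event(self, event):
        self.collected.append(event)   # data also leaves via this partner

class BrandedApp:
    def __init__(self, name):
        self.name = name
        self.sdks = []                 # piggybacked partners, invisible in the UI

    def embed(self, sdk):
        self.sdks.append(sdk)

    def track(self, event):
        for sdk in self.sdks:          # one tap, many recipients
            sdk.on_event(event)

app = BrandedApp("ExampleWeatherApp")
partner_a = EmbeddedSDK("analytics-co")
partner_b = EmbeddedSDK("adtech-co")
app.embed(partner_a)
app.embed(partner_b)
app.track({"type": "location", "lat": 29.42, "lon": -98.49})
```

The user sees only "ExampleWeatherApp," yet both embedded partners receive an identical copy of the location event, which is the transparency gap the piggybacking harm describes.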

 

Ironically, those who design, protect, and profit from the system may also be affected by its consequences—along with their own families—as future generations are increasingly born into oppressive, intrusive, exploitative, and harmful AI- and quantum-driven digital environments supported by essential consumer products and services. In this context, individuals may unintentionally fund their own oppression and exploitation, at the expense of their privacy, security, and safety, by way of essential consumer products and services they pay for.


There is a better way forward for everyone.


Together, these harms underscore the urgent need for ethical, clean-data standards and practices by way of the EBOR framework.



Electronic Bill of Rights

Copyright © 2026 Electronic Bill of Rights - All Rights Reserved.
