This article looks at the Big Tech regulations reshaping the technology industry worldwide. As technology giants expand, legislation has been put in place to safeguard user privacy, promote healthy competition, and improve safety on the web.
These regulations show how Google, Amazon, Facebook, and other digital platforms are being scrutinized internationally.
Big Tech Regulations List & Key Points
| Regulation | Key Point |
| --- | --- |
| 1. GDPR (General Data Protection Regulation) | Sets strict rules on data privacy and grants EU users more control over their data. |
| 2. Digital Markets Act (DMA) | Aims to curb the power of Big Tech gatekeepers in digital markets. |
| 3. Digital Services Act (DSA) | Regulates online platforms to remove illegal content and improve transparency. |
| 4. CCPA (California Consumer Privacy Act) | Gives California residents rights to access, delete, and opt out of data sales. |
| 5. COPPA (Children’s Online Privacy Protection Act) | Protects children’s data by restricting collection from users under 13. |
| 6. Section 230 of the CDA | Shields platforms from liability for user content, with ongoing reform debates. |
| 7. China’s PIPL | China’s comprehensive privacy law, mirroring the GDPR, for handling personal data. |
| 8. India’s Digital Personal Data Protection Act | Establishes data rights and compliance duties for entities processing Indian data. |
| 9. UK Online Safety Act | Holds platforms accountable for harmful online content and user safety. |
| 10. Sherman Antitrust Act | Used to challenge monopolistic practices by major tech firms in the U.S. |
1. GDPR (General Data Protection Regulation)
The European Union enacted this law in 2018 to protect the personal data of EU residents.

It places a heavy burden on organizations, including technology companies, to obtain permission before collecting data and to provide mechanisms for data access, correction, and deletion.
GDPR has shaped data privacy worldwide, requiring companies such as Google and Meta to drastically change their data practices and put user privacy first.
GDPR (General Data Protection Regulation)
User Rights: Empowers individuals to access, amend, delete, or limit the use of their personal data.
Consent Requirements: Collection and processing of personal data must be preceded by explicit informed consent.
Data Protection Officers: Certain entities are required to assign a Data Protection Officer who ensures compliance.
Cross-Border Data Transfers: Transfers of data outside the EU region are regulated and protected by Standard Contractual Clauses.
Fines and Penalties: Non-compliance may result in fines of up to €20 million or 4% of annual global turnover, whichever is higher.
Breach Notification: Supervisory authorities must be notified of a data breach within 72 hours (both thresholds are illustrated in the sketch below).
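To make the 72-hour notification window and the fine ceiling concrete, here is a minimal Python sketch. The function names are illustrative, and the logic reflects only the two thresholds cited above, not the full legal test.

```python
from datetime import datetime, timedelta

# 72-hour breach notification window described above
BREACH_NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(breach_detected_at: datetime) -> datetime:
    """Latest moment by which the supervisory authority must be notified."""
    return breach_detected_at + BREACH_NOTIFICATION_WINDOW

def max_gdpr_fine(annual_global_turnover_eur: float) -> float:
    """Fine ceiling: the greater of EUR 20 million or 4% of annual global turnover."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# Example: a company with EUR 1.5 billion turnover detects a breach on 1 March at 09:00
detected = datetime(2024, 3, 1, 9, 0)
print(notification_deadline(detected))  # 2024-03-04 09:00:00 (72 hours later)
print(max_gdpr_fine(1_500_000_000))     # 60000000.0 -- 4% of turnover exceeds the EUR 20M floor
```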
2. Digital Markets Act (DMA)
The Digital Markets Act (DMA), which took effect in the EU in 2023, defines the primary obligations of gatekeeper platforms such as Apple, Amazon, and Microsoft.
The DMA promotes competition by banning self-preferencing of a gatekeeper's own services and by requiring interoperability and open access for third parties, including third-party app stores.

Further, the DMA seeks to curb monopolistic behavior, foster innovation, and level the playing field for smaller competitors.
The law has angered industry leaders that long profited from controlling and dominating markets and business ecosystems, as violations can incur fines of up to 10% of global turnover, rising to 20% for repeated offenses.
Digital Markets Act (DMA)
Gatekeeper Designation: Identifies major players in the industry as “gatekeepers” with disproportionate market influence.
Anti-Competitive Restrictions: Self-preferencing, exclusive agreements, and discriminatory access terms are banned.
Interoperability: Gatekeepers must open up to third parties, including alternative app stores and messaging platforms.
Data Access: Participants classified as gatekeepers must grant relevant business clients access to data produced on their platforms.
Fines: Up to 10% of global turnover, rising to 20% for repeated infringement (see the sketch after this list).
Compliance Enforcement: A European Commission task force is assigned to ensure compliance.
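The self-preferencing ban and the escalating fine ceiling can be sketched in a few lines of Python. This is a simplified illustration under assumed inputs (relevance scores and a set of the gatekeeper's own services), not how regulators actually audit rankings.

```python
from typing import Dict, List, Set

def violates_self_preferencing(displayed_order: List[str],
                               relevance: Dict[str, float],
                               own_services: Set[str]) -> bool:
    """Flag a ranking in which the gatekeeper's own service sits above a more relevant rival."""
    for i, item in enumerate(displayed_order):
        if item in own_services:
            for rival in displayed_order[i + 1:]:
                if rival not in own_services and relevance[rival] > relevance[item]:
                    return True
    return False

def dma_fine_ceiling(annual_global_turnover_eur: float, repeat_offender: bool = False) -> float:
    """Fine ceiling from the list above: 10% of global turnover, 20% for repeated infringement."""
    return (0.20 if repeat_offender else 0.10) * annual_global_turnover_eur

# The gatekeeper's own shopping service is ranked above a more relevant rival -> flagged
order = ["own_shopping", "rival_shopping"]
scores = {"own_shopping": 0.6, "rival_shopping": 0.9}
print(violates_self_preferencing(order, scores, {"own_shopping"}))  # True
print(dma_fine_ceiling(200_000_000_000, repeat_offender=True))      # 40000000000.0
```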
3. Digital Services Act (DSA)
Also effective in the EU in 2023, the Digital Services Act aims to create a safer digital environment by regulating how online platforms handle illegal content, misinformation, and other harmful material.

The DSA places transparency requirements on social media and e-commerce platforms regarding content moderation, mandates risk assessments for large platforms, and requires user complaint mechanisms. Fines for non-compliance can reach 6% of global annual turnover.
Its major consequence is placing accountability on platforms like X and YouTube, which previously exercised largely unchecked control over content.
Digital Services Act (DSA) Features
Content Moderation: Platforms must actively take down illegal content and act against misinformation.
Risk Assessments: Large platforms like Facebook must assess and mitigate systemic risks, including the spread of hate speech.
Moderation Transparency: Platforms such as Meta must publish clear content moderation policies and regular reports on how moderation decisions are made.
Advertising Restrictions: Targeted advertising aimed at minors is prohibited.
Penalties: Non-compliant platforms face fines of up to 6% of average annual global turnover (see the sketch after this list).
Data Access: Very large platforms must give regulators and vetted researchers access to platform data.
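As a rough illustration of the moderation and transparency duties listed above, the sketch below logs each moderation decision and aggregates it into the kind of summary a transparency report requires. The data structures and the fine calculation are illustrative assumptions, not the DSA's prescribed format.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ModerationRecord:
    content_id: str
    reason: str   # e.g. "illegal content", "disinformation"
    action: str   # e.g. "removed", "restricted", "no action"

@dataclass
class TransparencyLog:
    records: List[ModerationRecord] = field(default_factory=list)

    def log(self, record: ModerationRecord) -> None:
        self.records.append(record)

    def summary(self) -> Dict[str, int]:
        """Aggregate actions taken -- the kind of figure a transparency report publishes."""
        counts: Dict[str, int] = {}
        for r in self.records:
            counts[r.action] = counts.get(r.action, 0) + 1
        return counts

def max_dsa_fine(annual_global_turnover_eur: float) -> float:
    """Fine ceiling cited above: up to 6% of annual global turnover."""
    return 0.06 * annual_global_turnover_eur

log = TransparencyLog()
log.log(ModerationRecord("post-123", "illegal content", "removed"))
log.log(ModerationRecord("post-456", "disinformation", "restricted"))
print(log.summary())                 # {'removed': 1, 'restricted': 1}
print(max_dsa_fine(2_000_000_000))   # 120000000.0
```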
4. CCPA (California Consumer Privacy Act)
The California Consumer Privacy Act, which took effect in 2020, gives California residents control over their personal information, including the right to know what data is collected, opt out of its sale, and request its deletion.

It applies to businesses with considerable revenue and data processing, including technology corporations like Facebook and Salesforce. The CCPA mandates clearly defined public privacy policies and carries fines of up to $7,500 for each intentional violation.
Its robust privacy protections have served as a catalyst for other states to adopt similar legislation.
Unlike earlier U.S. privacy laws, the CCPA has forced companies to redefine their data practices to be more consumer-friendly.
CCPA (California Consumer Privacy Act)
Consumer Rights: Allows consumers to know what personal data is collected, request its deletion, and opt out of its sale.
Business Scope: Applies to companies with over $25 million in annual revenue or large-scale data processing.
Non-Discrimination: Penalizing users for exercising their privacy rights is prohibited.
Privacy Notices: Data collection and sharing policies must be disclosed without ambiguity.
Enforcement: Intentional violations incur fines of up to $7,500 each, and data breaches can trigger lawsuits (see the sketch below).
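Here is a minimal sketch of how the applicability threshold and the opt-out right might translate into code. The large-processing threshold and field names are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Consumer:
    consumer_id: str
    opted_out_of_sale: bool = False

def ccpa_applies(annual_revenue_usd: float, consumers_processed: int,
                 large_processing_threshold: int = 100_000) -> bool:
    """Simplified applicability test: over USD 25 million in revenue, or large-scale
    processing (the threshold used here is an illustrative assumption)."""
    return (annual_revenue_usd > 25_000_000
            or consumers_processed >= large_processing_threshold)

def may_sell_data(consumer: Consumer) -> bool:
    """A sale must be blocked once the consumer has exercised the opt-out right."""
    return not consumer.opted_out_of_sale

alice = Consumer("alice")
alice.opted_out_of_sale = True            # "Do Not Sell" request received
print(ccpa_applies(30_000_000, 5_000))    # True: revenue threshold exceeded
print(may_sell_data(alice))               # False: the opt-out must be honored
```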
5. COPPA (Children’s Online Privacy Protection Act)
The Children’s Online Privacy Protection Act was passed in the U.S. in 1998 to safeguard children under 13 by regulating how websites and mobile applications collect a child’s personal information.
Companies like TikTok and YouTube must obtain verifiable parental consent prior to data collection, and that data may not be used for targeted advertising without consent.

COPPA has been instrumental in holding companies to account, including Google’s $170 million settlement in 2019 over YouTube’s violations of children’s data privacy.
COPPA protects children by ensuring their safety is prioritized in the design of services and platforms intended for them.
COPPA (Children’s Online Privacy Protection Act) Features
Parental Consent: Verifiable consent from a parent or guardian is required before collecting data from children under 13 (see the sketch after this list).
Data Minimization: Collection of children’s data must pertain to the minimum necessary for the service.
Privacy Notices: Requires accessible privacy policies to be drafted for parents.
Advertising Restrictions: Targeted advertisements aimed at children can only be used with prior consent.
Self-Regulation Compliance: Allows compliance through FTC-approved self-regulatory (safe harbor) programs.
Civil Penalties: Violations carry civil penalties, as in the notable $170 million settlement.
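The parental-consent gate described above boils down to a simple check before any data collection. This is a sketch with assumed field names, not a full verification flow.

```python
from dataclasses import dataclass

COPPA_AGE_THRESHOLD = 13  # the under-13 cutoff described above

@dataclass
class User:
    age: int
    verified_parental_consent: bool = False

def may_collect_data(user: User) -> bool:
    """Collection gate: users under 13 require verifiable parental consent."""
    if user.age < COPPA_AGE_THRESHOLD:
        return user.verified_parental_consent
    return True

print(may_collect_data(User(age=10)))                                  # False
print(may_collect_data(User(age=10, verified_parental_consent=True)))  # True
print(may_collect_data(User(age=16)))                                  # True
```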
6. Section 230 of the Communications Decency Act
Section 230, part of the U.S. Communications Decency Act of 1996, relieves tech platforms of liability for user-generated content, treating them as intermediaries rather than publishers.
This immunity means companies such as X and Reddit cannot be sued over content posted by their users, and lets them moderate content without losing that protection.

Nevertheless, it has been criticized for enabling harmful content, fueling ongoing debates about reform to make platforms more accountable.
Section 230 remains the centerpiece of U.S. internet governance, shaping how much freedom users have on social media platforms.
Section 230 of the Communications Decency Act
Protection From Legal Action: Protects websites and other platforms from being sued for content posted by users.
Content Moderation: Allows platforms to prohibit illegal content without losing protections.
Good Faith Requirement: Shields platforms that act in good faith to remove harmful content from adverse legal consequences.
Limited Scope: Does not cover federal criminal law or intellectual property claims.
Interactive Services: Covers providers and users of interactive computer services, not traditional publishers such as newspapers.
7. China’s Personal Information Protection Law (PIPL)
China’s Personal Information Protection Law (PIPL), in effect since 2021, is a comprehensive privacy law regulating how corporations, including technology giants such as Tencent and Alibaba, handle personal information.
It makes user consent a prerequisite for data collection, mandates data minimization, and has strict guidelines for the transfer of information across borders.

Failure to comply can lead to sanctions of up to 5% of annual revenue or the suspension of business activities.
PIPL, partially modeled on the GDPR, reflects China’s focus on data sovereignty and control, forcing tech companies to localize data and comply with state surveillance requirements.
China’s Personal Information Protection Law (PIPL)
Consent Framework: Requires voluntary and informed consent before personal data is processed.
Data Localization: Requires certain operators to store personal data within China.
Cross-Border Rules: Makes transferring data out of the country subject to stringent approval processes.
Data Security: Companies bear the responsibility of safeguarding the personal data they handle.
Fines: Violations can incur fines of up to 5% of annual revenue or a suspension of operations (see the sketch after this list).
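The consent, localization, and cross-border rules above can be read as a gate applied before any data export. The category set and field names below are illustrative assumptions; the actual approval mechanisms are defined by Chinese regulators.

```python
from dataclasses import dataclass

# Illustrative assumption: categories that must stay within China
LOCALIZED_CATEGORIES = {"critical_infrastructure"}

@dataclass
class TransferRequest:
    data_category: str        # e.g. "user_profile", "critical_infrastructure"
    user_consented: bool
    approval_completed: bool  # stands in for the approval process cited above

def may_transfer_abroad(req: TransferRequest) -> bool:
    """Cross-border gate: localized categories never leave China; everything else
    needs user consent and a completed approval process."""
    if req.data_category in LOCALIZED_CATEGORIES:
        return False
    return req.user_consented and req.approval_completed

print(may_transfer_abroad(TransferRequest("user_profile", True, True)))             # True
print(may_transfer_abroad(TransferRequest("user_profile", True, False)))            # False
print(may_transfer_abroad(TransferRequest("critical_infrastructure", True, True)))  # False
```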
8. India’s Digital Personal Data Protection Act
The 2023 Digital Personal Data Protection Act (DPDP) of India provides regulations to safeguard sensitive information in one of the largest global digital economies.

The DPDP requires user consent for data processing, gives users the right to access and erase their data, and compels data fiduciaries such as Reliance Jio and Paytm to ensure data security. Breaches can result in fines of up to ₹250 crore (roughly 30 million USD).
The DPDP seeks to balance innovation and privacy, meaning tech companies must comply with India’s distinct rules while building confidence in digital services.
India’s Digital Personal Data Protection Act
Consent-Based Processing: Personal data may be processed only with the individual’s freely given and informed consent.
User Rights: Gives users the right to access, correct, erase, and restrict the processing of their data.
Data Fiduciaries: Assigns responsibilities to organizations that determine the purpose and means of data processing.
Child Protection: Requires that parental consent is obtained before processing the data of children.
Penalties: Imposes penalties of up to ₹250 crore (around 30 million USD) for breaches in data.
Data Localization: Encourages local storage of sensitive data, with a few stated exceptions.
9. UK Online Safety Act
The UK Online Safety Act, passed in 2023, imposes duties on tech platforms to protect people against certain types of illegal and harmful content including but not limited to hate speech and child sexual exploitation.
The Act requires companies such as Snap and Google to conduct risk assessments, implement safety measures, and report on their progress towards fulfilling these obligations.

Companies that do not comply can be fined up to 10% of their global annual turnover, and company executives can be held liable as well.
The principal aim of the Act is to make the UK the ‘safest place to be online’ while forcing platforms to actively monitor and regulate the content presented to users.
UK Online Safety Act
Duty of Care: Requires platforms to protect users from illegal and harmful content.
Risk Assessments: Requires the preemptive identification of risks such as child grooming.
Content Removal: Requires the prompt removal of illegal content identified on the platforms.
Transparency Reports: Requires publication of reports on compliance and content moderation.
Fines and Liability: Sets fines of up to 10% of global turnover for contraventions, with possible executive accountability.
10. Sherman Antitrust Act (as applied to tech companies)
The Sherman Antitrust Act of 1890 is a United States law that prohibits monopolistic practices and restraints of trade.

It has recently been applied to powerful companies such as Google and Amazon. Recent lawsuits, such as the U.S. Department of Justice case against Google over its dominance of search, use the Act to challenge anti-competitive practices such as exclusive contracts and market foreclosure.
Proposed remedies include divestitures and fines, either of which would reshape how these companies operate. Turning this 19th-century act against digital monopolies shows how determined regulators are to rein in Big Tech.
Sherman Antitrust Act (as applied to tech companies)
Anti-Monopoly Rules: Contracts or agreements that restrain, limit, or control trading and monopolistic tendencies are prohibited.
Market Dominance: Targets companies like Google that dominate markets such as online search.
Enforcement Actions: Provides avenues to sue for exclusive contracts and predatory pricing.
Remedies: Permits structural remedies such as divestitures, as well as behavioral constraints.
Conclusion
The GDPR, DMA, DSA, CCPA, COPPA, Section 230, PIPL, India’s DPDP Act, the UK Online Safety Act, and the Sherman Antitrust Act together represent the global attempt to manage the unparalleled power of technology companies.
Combined, these regulations address data privacy, market competition, content moderation, child protection, and restraint of trade. By imposing strict requirements for user consent, transparency, interoperability, and accountability, they protect consumers, keep competition fair, and make the digital environment safer.
Nonetheless, regulating a fast-evolving digital realm invites many different approaches to drafting and enforcement. While these laws rein in the unchecked power of Big Tech, they also raise challenges: fostering responsible innovation, balancing compliance with user trust, and scaling rules across jurisdictions.
As new technologies emerge, these regulations will need to be harmonized into a more coherent approach that counteracts harms while promoting advancement.