The EU’s Digital Markets Act and Digital Services Act
In July 2022, the EU adopted two landmark digital regulations: the Digital Markets Act (DMA) and the Digital Services Act (DSA). Both pieces of legislation codify principles and create processes for digital service providers to follow, with the goal of ensuring a wide choice of safe services and derivative products. The laws also aim to allow businesses operating in Europe to compete freely and fairly online, as they do offline.
The two acts are a product of the European Commission’s strategy to create a fully integrated Digital Single Market (DSM) out of 27 national markets. When they were passed, European Commissioner for the Internal Market Thierry Breton stated, “We are finally building a single digital market, the most important one in the ‘free world’. The same predictable rules will apply, everywhere in the EU, for our 450 million citizens, bringing everyone a safer and fairer digital space.”
The DMA and the DSA aim to protect online users by allowing businesses to challenge “digital gatekeepers” in industries dominated by large companies that enjoy significant network effects. The laws are also meant to foster greater accountability for illegal content. Together with the General Data Protection Regulation (GDPR) and the EU Artificial Intelligence (AI) Act, they form the backbone of the EU’s digital regulatory strategy and are part of a broader effort to advance European digital sovereignty and values. The “Brussels effect”—the idea that EU regulations can set global standards—is another rationale for the legislation. Scholars such as Anu Bradford argue that the bloc’s digital rules may influence technology business practices and national legislation worldwide. The extent to which the “Brussels effect” will play out in practice, however, remains to be seen, especially in the fast-evolving digital domain.
The DMA and the DSA are increasingly a source of tension in the transatlantic trade and technology relationship. Associations representing the largest US technology companies argue that both laws stifle innovation while creating technical challenges for implementers and negative externalities for user experience and security. These associations and many of the companies they represent see the laws as targeting US firms specifically, even unfairly. The second Trump administration has recently embraced elements of this view: President Donald Trump has described digital regulations and taxes as “designed to harm, or discriminate against, American Technology”. Enforcement and the future of the DMA and the DSA were an irritant in recent US-EU trade talks, leading the two parties’ framework agreement to include a commitment “to address unjustified digital trade barriers”. Still, the rhetoric on both sides of the Atlantic points to vastly divergent views held by parties that are dug in.
DIGITAL MARKETS ACT
What is the DMA?
The DMA regulates gatekeepers, large online platforms that control access to digital markets. These gatekeepers provide core platform services (CPS) such as online search engines, app stores, and social networks. CPS can benefit from economies of scale and network effects, connecting users at little additional marginal cost while also locking in customers through the scale of their online ecosystems. The act’s goal is to allow large digital platforms that hold significant or majority market share in a digital product to innovate while ensuring fair practices for customers and other businesses that depend on them.
Prohibited practices include using a gatekeeper’s dominant position to favor its own businesses at the expense of competitors. Violations can result in sizable fines. The DMA does not replace existing competition law but creates a regulatory framework that applies before (ex ante) potential harm can occur.
What entities does the DMA cover?
The DMA covers companies that have a significant impact on the European internal market, provide a CPS, and enjoy an entrenched and durable position. Companies with a market capitalization of at least €75 billion or an annual turnover of at least €7.5 billion may be designated as gatekeepers. For that to happen, they must also provide a CPS in at least three EU countries, have at least 45 million monthly active EU users, and serve more than 10,000 active EU business users per year. The EU currently designates seven companies as gatekeepers: Alphabet, Amazon, Apple, ByteDance, Meta, Microsoft, and Booking. Each provides core platform services, such as social networks, video sharing, browsers, intermediation, and search.
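To make the quantitative thresholds above concrete, the following is a minimal, hypothetical Python sketch that encodes them as a simple check. The `Platform` fields and the `meets_gatekeeper_thresholds` helper are illustrative names only, not part of any official tooling, and the sketch omits the qualitative assessment and rebuttal process that actual designation by the Commission involves.

```python
# Illustrative sketch of the DMA's quantitative designation thresholds,
# as summarized in the text above. Not an official or complete test.
from dataclasses import dataclass


@dataclass
class Platform:
    market_cap_eur_bn: float            # market capitalization, billions of euros
    annual_turnover_eur_bn: float       # annual EU turnover, billions of euros
    eu_countries_with_cps: int          # EU countries where the same CPS is offered
    monthly_active_eu_users_m: float    # monthly active EU end users, millions
    yearly_active_eu_business_users: int


def meets_gatekeeper_thresholds(p: Platform) -> bool:
    """Return True if the platform meets the quantitative thresholds that
    create a presumption of gatekeeper status (qualitative review still applies)."""
    financial_size = p.market_cap_eur_bn >= 75 or p.annual_turnover_eur_bn >= 7.5
    geographic_footprint = p.eu_countries_with_cps >= 3
    user_scale = (p.monthly_active_eu_users_m >= 45
                  and p.yearly_active_eu_business_users > 10_000)
    return financial_size and geographic_footprint and user_scale
```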
What are some of the DMA’s key provisions?
- Transparency and data access: Gatekeepers must give advertisers and publishers the information needed to verify ads, including pricing and performance data. They must also enable real-time data portability for users, give business users access to their data, and provide “fair, reasonable, and non-discriminatory” (FRAND) access to search data for rivals.
- Interoperability: Messaging services must allow interoperability with other messaging apps and clearly explain technical and legal terms. Gatekeepers must provide FRAND access to app stores, search engines, and social networks. They must also give access to hardware and software features such as near-field communication.
- User choice and control: Gatekeepers must allow users to uninstall pre-installed apps, change default settings, and install third-party apps or app stores (sideloading). Gatekeepers must make unsubscribing as easy as subscribing.
- Fair competition: Gatekeepers must not use private data from business users or their customers to compete with those users. They must also not rank their own services higher than those of their rivals (no self-preferencing). Gatekeepers must not restrict switching between apps or services.
- User and business freedom (anti-steering, anti-tying, and redress): Gatekeepers must not combine or cross-use personal data from different services without user consent. They must not prevent business users from offering better prices or conditions elsewhere, or stop businesses from promoting offers or concluding contracts outside the platform. Gatekeepers must not stop users from accessing content or services acquired outside the platform, or prevent users or businesses from filing complaints with authorities. Additionally, gatekeepers must not require use of their own payment systems, ID services, or browser engines, and they must not force users to subscribe to one service to access another.
Who enforces the DMA?
The European Commission enforces the DMA through a joint team from the Directorate-General for Competition (DG COMP) and the Directorate-General for Communications Networks, Content and Technology (DG CONNECT). The Commission, supported by the High Level Group on the DMA and national competition authorities, conducts market investigations and has the power to adopt noncompliance decisions. When violations occur, the Commission can fine companies up to 10% of their global annual revenue, rising to 20% for repeat offenses.
DIGITAL SERVICES ACT
What is the DSA?
The DSA regulates online intermediaries and platforms such as marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms to prevent illegal and harmful activities online and the spread of disinformation. Legislators aimed to protect users from illegal content, ensure transparency in online advertising, and hold platforms accountable for algorithmic decisions. At the same time, the new rules also require the largest platforms to simplify certain processes, such as online terms and conditions or channels to address user complaints.
What entities does the DSA cover?
Article 3 of the DSA defines three types of intermediary services—mere conduit, caching, and hosting—and the act sets rules for providers according to their size and role. Micro and small enterprises—those with fewer than 50 employees and an annual turnover or balance sheet total not exceeding €10 million—are generally exempt from most DSA obligations.
Mere conduit services facilitate transmission of information over a communication channel and do not store information. Examples include virtual private networks (VPNs) and certain messaging services. Caching services temporarily store information to improve the efficiency of data transmission, such as adjusting images from one size or format to another. Hosting services store third-party data at the user’s request. They include photo and file sharing services and cloud computing.
A hosting service can also qualify as an online platform if it stores and disseminates information to the public. The DSA sets its strictest rules for the biggest services on which users can post and share content: so-called Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), which attain that designation once they have at least 45 million monthly active EU users.
Designated VLOPs and VLOSEs include services in categories such as cloud computing and website hosting, online gaming, adult entertainment, e-commerce and online shopping, and travel.
What are some of the DSA’s key provisions?
- Transparency: All in-scope intermediary services must publicly explain how they handle content moderation and publish reports on their moderation activities. They must also notify law enforcement of known illegal activity, increase transparency around targeted advertising, and disclose how algorithms recommend content. Users must be given the ability to opt out of profiling-based recommendations. The DSA prohibits targeted ads based on sensitive data and bans profiling-based advertising to minors. The law also includes provisions for providing researchers with access to data.
- Flagging mechanisms: Platforms must allow individuals or entities to report illegal content. They must act swiftly, once notified, to remove it and provide detail about the action taken. Users may request information regarding the removal of their content and appeal the action.
- Safety of goods and services: Covered platforms, notably online marketplaces, must ensure the safety of the goods and services listed by third-party traders and provide consumers with redress mechanisms.
- Risk assessments: VLOPs and VLOSEs must assess and mitigate systemic risks such as disinformation, election manipulation, and negative impacts on children’s well-being. Additionally, major platforms are required to undergo regular independent audits to ensure compliance with the DSA and file annual transparency reports detailing their content moderation processes.
Who enforces the DSA?
The European Commission enforces the DSA with national authorities. The Commission’s primary responsibility is to supervise VLOPs and VLOSEs, while national authorities oversee compliance by smaller platforms and services. The European Board for Digital Services, chaired by the European Commission and composed of members from each national authority, coordinates European-wide policy and enforcement. The board also ensures that national digital and consumer protection laws are working with the DSA and not at cross-purposes. Companies can face fines of up to 6% of their global annual turnover for noncompliance.
NEXT STEPS
Why did the EU pass the DMA and the DSA?
In 2019, Ursula von der Leyen, then a candidate for European Commission president, proposed a “new Digital Services Act”. The EU had long relied on antitrust enforcement, but those tools were seen as reactive and slow. New legislation would allow a proactive regulatory shift, aiming to prevent harm before it occurs and to address the dominant market power of online platforms. At the same time, illegal content online had been a subject of discussion and review within Europe for more than a decade, and national authorities had passed local laws to restrict it, creating a fragmented regulatory environment.
The EU had also recently implemented the GDPR, which the Commission widely touted as a global benchmark for privacy regulation. Politically, proposing and passing the DMA and the DSA reflected a broader European ambition to assert digital sovereignty, with von der Leyen stating, “This is [the EU’s] opportunity to make change happen by design—not by disaster or by diktat from others in the world. … [I]t is about Europe’s digital sovereignty on a small and large scale.” The legislation was intended to ensure that EU values such as privacy, competition, and democratic accountability are upheld equally in the physical and digital realms.
Have companies been fined?
The European Commission has intensified DMA enforcement against major gatekeepers through formal proceedings. On March 25, 2024, investigations were launched into Alphabet, Apple, and Meta for suspected noncompliance. Alphabet and Apple came under scrutiny for restricting app developers from steering users to alternative offers. Alphabet was also investigated for self-preferencing in Google Search. Apple faced additional probes over iPhone browser defaults and contractual terms limiting third-party app stores. Meta’s “pay or consent” model, which forces users to accept data processing or pay for an ad-free service, was also challenged.
The Commission subsequently issued preliminary findings of potential breaches in three cases: Apple’s App Store steering restrictions (June 2024), Meta’s advertising model (July 2024), and Alphabet’s practices in Google Play and Search (March 2025). On April 23, 2025, the Commission issued its first noncompliance decisions, fining Apple €500 million for breaching App Store steering rules (i.e., restricting app developers from informing users about alternative payment options outside the App Store) and Meta €200 million for failing to offer users the choice of a service that requires the collection of less personal data. Investigations into Alphabet and Apple’s contractual terms remain ongoing.
Separately, the Commission has launched a series of investigations under the DSA. TikTok faced scrutiny over its “Rewards” program, which was deemed potentially harmful to minors and subsequently withdrawn. AliExpress remains under investigation for failing to assess and curb the spread of harmful and illegal products, while Temu is being examined for similar shortcomings, alongside concerns about algorithmic design that may encourage compulsive use. Meta’s Facebook and Instagram are also under formal inquiry for their impact on child safety, particularly whether their features and algorithms foster addictive behaviors and “rabbit-hole” effects. National authorities have also undertaken enforcement actions against several companies.
What have technology companies said about this legislation?
The DMA and the DSA have faced considerable criticism from technology companies. They claim that the laws are overly prescriptive, stifle innovation, impose unworkable interoperability mandates, and require data-sharing with competitors. They also argue that the regulations create burdensome obligations and requirements on how technology companies develop, operate, and moderate their products.
Google, for example, asserts that the DMA creates unintended consequences that harm consumers and businesses, favor intermediaries over direct sellers, weaken security, and introduce regulatory uncertainty that undermines competitiveness. Similarly, Apple argues that interoperability and alternative app distribution requirements delay feature launches, increase security and privacy risks, and undermine the user experience. Meta likewise describes the EU’s approach as imposing a “multi-billion-dollar tariff” that undermines its advertising model.
The Information Technology and Innovation Foundation, a US technology policy think tank, also states that the laws unfairly target US firms, as five of the seven gatekeepers under the DMA and 14 of the 24 VLOPs and VLOSEs are American.
What have consumer protection advocates said about this legislation?
Consumer protection advocates welcome the DMA and the DSA but argue that compliance is patchy and enforcement is slow. The European Consumer Organization (BEUC), for example, reports that gatekeepers are failing to comply fully with the DMA to the detriment of consumers. The Center for Democracy and Technology Europe warns that risk assessments and mitigation measures in the DSA need stronger clarity and enforcement.
Some groups call for a more structured and sustained approach to stakeholder engagement with the DSA, including through a body mirroring the advisory board piloted by the German Digital Services Coordinator.
The Transatlantic Consumer Dialogue, which comprises leading consumer and digital rights groups in the United States and the EU, has called for the bloc to resist Washington’s demands to weaken the laws. This position was echoed by over 50 civil society groups in a September 2 open letter to von der Leyen.
The European Commission recently launched a public consultation on a new Digital Fairness Act to close perceived regulatory gaps on deceptive design practices and child protection. The consultation will close on October 24.
Do the United States or other countries have similar legislation?
At the federal level, the United States has taken small steps toward mitigating negative impacts of digital technologies, but few efforts have made it into national law. None have featured the comprehensiveness, scope, or impact on US technology firms of the DMA and the DSA. The bipartisan “Take It Down” Act, signed into law in May, criminalizes the nonconsensual sharing of intimate images, including deepfakes. There has also been a renewed bipartisan push for the Kids Online Safety Act.
At the subnational level, some states (most notably California) have continued to pass legislation giving citizens more control over their data, while others (Florida and Texas) have passed laws restricting platforms’ ability to moderate content.
At the same time, however, digital competition policy has been taken up by the courts through recent antitrust cases such as those against Google’s search and ad businesses, Apple’s smartphone apps, and Meta’s acquisitions of WhatsApp and Instagram. Akin in spirit to the DMA’s gatekeeper rules, these cases take aim at monopolistic practices in digital markets, with a landmark ruling requiring Google to share some search data with competitors. Critics have argued that the remedies won so far have not significantly enhanced competition.
In short, US digital regulation is fragmented, reflecting deeply divergent views on the goals of such legislation and the government’s role in overseeing digital commercial activities. As a result, the United States is unlikely to emulate aspects of the EU approach to digital regulation at the national level in the near future.
Outside the United States, some countries have adopted or are considering adopting legislation similar to the DMA and the DSA.
The United Kingdom’s Digital Markets, Competition and Consumers Act gives the Competition and Markets Authority new powers to regulate unfair practices and promote competition in digital markets. The country’s Online Safety Act echoes elements of the DSA by requiring online platforms to take steps to protect users from illegal and harmful content.
Japan enacted its Smartphone Software Competition Promotion Act in June 2024, specifically targeting app store and interoperability rules for large mobile operating system providers.
Kazakhstan, Uzbekistan, and Nigeria have adopted rules emulating aspects of the DMA. Discussions and proposals on digital competition regulation have also emerged in Australia, Brazil, Canada, India, Indonesia, Kenya, Malaysia, Mexico, Morocco, New Zealand, South Africa, South Korea, Thailand, and Türkiye.
Content regulation regimes have been introduced or expanded in Australia, Canada, India, Brazil, South Korea, Singapore, and the United Arab Emirates.
How has the US reacted to the DMA and the DSA?
The US government identified both acts as unfair trade barriers in the US Trade Representative’s 2025 National Trade Estimate (NTE) report.
In early August, the State Department directed its diplomats to condemn “undue” restrictions imposed by the DSA. The memo in particular tasked diplomats with advocating for a narrower definition of “illegal content” in the DSA, revising or removing the Code of Conduct on Disinformation, reducing fines, and ending the requirement that platforms respond to “trusted flaggers”, entities formally designated by EU member state authorities to flag illegal content. US officials, including Vice President JD Vance, have also stated that European technology rules reflect a far more restrictive approach to online expression and innovation, one that amounts to censorship.
On August 25, Trump threatened to penalize “all countries with Digital Taxes, Legislation, Rules, or Regulations”. Digital regulations under the DMA and the DSA are pan-European, but digital taxation is a member-state responsibility and a policy domain distinct from platform regulation. Implementation of measures in this area depends on ongoing multilateral negotiations in forums such as the Organization for Economic Co-operation and Development and the UN.
About a week after Trump’s threat, the House of Representatives hosted a hearing, “Europe’s Threat to American Speech and Innovation”, with speakers discussing relevant British and EU legislation.
On September 5, Trump again warned of a Section 301 (unfair trade) investigation after the European Commission fined Google €2.95 billion for breaching EU antitrust rules by distorting competition in the advertising technology industry.
Is transatlantic convergence or cooperation possible?
Regarding the DMA and the DSA, the United States and the EU are unlikely in the near term to see eye to eye on the role of digital regulation in innovation, governance, and society. This is in part due to the dominant market position of US technology platforms that the DMA and the DSA regulate. Negotiation and a test of resolve—not convergence—are likely to remain the modus operandi in the transatlantic trade and tech relationship.
During summer trade negotiations, the United States floated the idea of a new DMA advisory body that could allow companies to provide input into decisions relevant to enforcing the act. Legislation introduced in Congress has aimed to exempt US firms from foreign digital regulations. Federal Trade Commission Chairman Andrew Ferguson has warned US companies not to weaken data protection or censor Americans at the behest of foreign powers, indicating that compliance with laws such as the DSA could risk violating their obligations under American law.
Key EU actors have made it clear that the DMA and the DSA are not up for negotiation. According to Teresa Ribera, the Commission’s executive vice-president for a Clean, Just and Competitive Transition, “We are going to defend our sovereignty. We will defend the way we implement our rules, we will defend a well-functioning market and we will not allow anyone to tell us what to do.”
At the same time, European officials have acknowledged that some changes are needed and have called for simplifying EU digital regulations in a drive to update outdated rules and increase competitiveness. The forthcoming EU omnibus package is intended to streamline digital legislation and may offer an opportunity for further US-EU discussion and collaboration, including on modifying some procedural elements of the DMA and the DSA.
For this reason, the transatlantic technology relationship should not start and end with DMA and DSA enforcement. Getting concrete on risks such as AI safety, threats to children, copyright law in the age of large language models, and the effects of chatbots on mental health may yield a more fruitful agenda for transatlantic digital governance. Shared interests in competitiveness and security in frontier technologies can also provide a basis for future cooperation. While EU digital laws are likely to remain irritants in US-EU technology negotiations, the transatlantic partners can cooperate to identify potential solutions to common risks facing their societies.