
How the EU is fighting tech giants with Margrethe Vestager

The European Commission’s competition watchdog is thinking bigger than lawsuits


Margrethe Vestager. Photo Illustration by Grayson Blackmon / The Verge

Margrethe Vestager is one of the driving forces behind tech regulation worldwide. Appointed the European Commission’s commissioner for competition in 2014 and an executive vice president in 2019, she has pursued antitrust cases against Apple, Google, Meta (formerly Facebook), and Amazon, among others. Now, with the EU on the verge of implementing a new antitrust law called the Digital Markets Act, Vestager is planning her next moves.

The DMA designates a category of tech “gatekeepers,” then outlines rules against behaviors like giving preference to their own services or gaining an unfair advantage with analytics data. Vestager hopes it can address systemic problems that individual enforcement actions can’t, and that it will streamline cases the EU does bring against companies. But when I caught up with her at the SXSW conference in Austin, where she was speaking about disinformation and democracy, she said there’s still work left to do before it passes.

There’s also a wide range of emerging technologies that could pose new problems, like cryptocurrency and the “metaverse” — which Vestager has said probably needs regulation once she figures out exactly what it is. And we discussed the questions that antitrust rules can’t answer, like how companies should respond to illegal content on their platforms.

OK, Margrethe Vestager. Here we go.

This transcript has been lightly edited for clarity.

Executive Vice President Margrethe Vestager, welcome to Decoder.

Well, thank you very much.

Let’s start by laying out the field of antitrust enforcement right now for the EU. Could you go through the big cases that are on the table?

On Friday we just opened a new case with Google and Facebook, now Meta. It’s called Jedi Blue, named after the codename for an agreement that they seem to have entered back in 2018, with the aim, seemingly, of killing off Google’s competitors in the advertising ecosystem. We also have another Google case focusing exclusively on the ad-tech stack, looking at some of the behaviors that seem to be anti-competitive.

Then we have three Apple cases: one concerning music streaming services and the 30 percent fee, then we have a more general Apple App Store case, and then we have an Apple Pay case about access to the payment infrastructure or technology on your phone. We have two Amazon cases: one concerns access to data — it seems as if Amazon Retail has had access to all the data from the smaller retailers on the Amazon marketplace, enabling Amazon Retail to have a head start on numerous products and prices in the marketplace — and the second is about their fulfillment system. We also have a case concerning advertising in the Facebook environment. So our to-do list is quite full.

It is. You also have new regulations that are going to, in theory, come into force reasonably soon.

“We want the market to be open and contestable.”

Yes. One of the things that I have learned over these seven years [as a European Commissioner] is that some of these behaviors are systemic, and then you need a systemic answer. Also, that we need to gain speed. Because if illegal behavior is allowed to continue even for a short amount of time, the risk that competitors will suffer — and because of that, consumers will suffer — is really big. So with the Digital Markets Act, we want a very simple, fundamental thing: we want the market to be open and contestable. So it depends on your ideas, your work ethic, your ability to attract capital, whether you’ll be successful with your customers or not. And unfortunately, because of the systemic nature of behavior, that’s not necessarily the case today.

Would you be able to go through some of the specifics of the Digital Markets Act?

The idea is to say that you’re more than welcome to be successful in the European market, but if you grow in market power, you should also grow in responsibility. So we have developed a set of objective criteria, and if you fall into those, we will designate you as a gatekeeper. That will give you certain prohibitions and obligations. A prohibition could be that you’re not allowed to self-preference. Or in neighboring markets to the market where you are the gatekeeper, an obligation could be to share data — if you are in the gatekeeper’s marketplace, you would actually get data that your own business is generating, for yourself and for the development of your business.

You’ve talked about enforceability being the last piece of the puzzle for the Digital Markets Act. Could you talk a little bit more about what that means and what it’s going to require?

Sometimes it makes me feel very old, but having worked with legislation for the majority of my life, I have realized that legislation is only as good as its enforcement. So it’s really important that we get the enforcement set up in a way that it will work on the ground. Because we have a lot of ambitions, but they should become real in everyday life, for businesses that want to be in an open, contestable market.

These are not trivial things — it’s not trivial to designate a gatekeeper. It’s not trivial to have the regulatory dialogue so that gatekeepers know what they’re supposed to do. So that is the last piece of the puzzle that we are pushing in the late phase of the negotiations of the Digital Markets Act.

You talked a little bit about this, but I want to expand on it more. What are the key elements of making sure this is enforceable?

One of the things we need to maintain is that the commission is responsible for the enforcement. We have very good experience working with national competition authorities, and we need all resources to be mobilized in order to enforce our rules. But we should also maintain that it’s the commission who has the last word — who would take the decisions and enforce this piece of legislation. And in the last phases of a negotiation, sometimes you need to take a step back to make sure that you get the fundamentals right, so the legislation is not challengeable. So it’s really about staying focused in the last part of a negotiation, so that we know that what is in the regulation are things that we actually have a very clear idea of how they will work in real life.

What are some examples of cases we might see with the Digital Markets Act that haven’t been possible so far?

Hopefully a lot of things will be solved before it becomes a real issue. Because the entire idea is that we do not want more markets to tip. We do not want things to happen because of illegal behavior in the marketplace.

And deciding beforehand, based on objective criteria, that [companies] do have these obligations, things they can and cannot do, will speed things up. When we open a competition case today, the first thing we do is to assess whether the business is dominant in the relevant market, and that in itself can take a surprisingly long time. And only if we can prove dominance do we have a case, because a smaller business that’s not dominant can do a huge number of things that the dominant company cannot.

Saying beforehand “You are now designated a gatekeeper” should give us that speed — in making sure that the marketplace is open, that there is no self-preferencing, that businesses get their data, that app stores are open; that a second app store can be on your phone if you would want it to. And that kind of speed, I think, mirrors the nature of digital markets.

I think there’s often a sense of cynicism around tech company enforcement — that the companies will just opt to pay a fine and get a monetary slap on the wrist. How does the Digital Markets Act address that?

It’s not just a theory. The Dutch competition authorities had a case where they asked Apple to change a certain behavior in the App Store, and so far Apple has not implemented those changes and they pay a weekly fine — I think it’s five million euros. And that is really thought-provoking, because the idea, of course, from the Dutch authority is that in implementing those changes, you’d have a more fair market situation.

This is why, in the Digital Markets Act, there is a full toolbox where the sanctions become more and more severe. The fines will increase if you do not implement changes. Eventually, in the toolbox, there’s also the tool that you can actually break up a company if no change is happening, or if you are a repeat offender.

Listen to Decoder, a show hosted by The Verge’s Nilay Patel about big ideas — and other problems. Subscribe here!

What is the interplay right now between EU and US regulation and enforcement? The Jedi Blue case you mentioned began as a state lawsuit in the US.

Yes. We took inspiration from the state attorney general in Texas who filed this suit, and we opened our own with the CMA [Competition and Markets Authority] of the UK. I think it shows that there is a sense of alignment. We do not have a global competition authority. We have many different competition authorities. But there is a new sense of common purpose here for the market to stay open. I keep a map where I can sort of put pins in things, like: now the Australians, they are looking into this behavior; now the Indians are into that behavior; now the South Americans, they are looking into this. And there is a pattern showing that competition authorities all over the planet are zooming in on the digital economy in order to make sure that markets are competitive.

We also have very close cooperation with our US colleagues. In parallel to the [EU-US] Trade and Technology Council, we have a policy dialogue between the US and EU on competition in tech-driven markets; we had the first interaction of that and that was, I think, very successful. It may take some time before we do common cases, depending on the market situation, but I think the alignment as to how we see these markets is increasing more and more.

You mentioned Australia, and there are other countries like Korea that are working on antitrust actions. Where do you think the most interesting things are happening outside the US and Europe?

I can’t really pick. I think there are a lot of interesting things ongoing. In some jurisdictions they have the same idea, that law enforcement in specific cases should be complemented with regulation. We do not act as one competition authority because we have different traditions, differences in legislation, different tools that we can use, but I find it really encouraging that there is a sense of community when it comes to enforcing tech cases.

For your agency specifically — because there is this absolutely vast swath of potential cases — how do you pick where you’re going to focus enforcement actions?

Obviously it’s really important for us when people complain, because that gives us access to how they see the market situation. Very often, they come with data that allows us to ask more qualified questions, to find evidence if evidence is to be found.

We’re also really careful to scope the cases so that we focus on what may be the most harmful illegal behavior, and that allows us to make the most efficient use of resources. We hope that the Digital Markets Act will take care of some of the cases we had before, allowing us to focus on some of the issues that the Digital Markets Act will not deal with — one of the things would be something like the Jedi Blue case.

You’ve talked about how you theoretically want to regulate “the metaverse,” but that it’s still very early days and you’re trying to figure out what it is. What would it take for your agency to develop the expertise that you would need to go after potential violations there?

Well, metaverse or no metaverse, we are in the process of changing the composition [of our group] so we can hire more people who have technical capabilities. We use more technical digital tools than we used to — otherwise our work is simply not possible. If we send a request for information, for example, you may get millions of documents, and you can only get through that volume with the help of digital tools. So we are already in a period of change and have been for some time. Where that would lead us, I think, still remains to be seen, but it’s something that needs to happen before we say “now there is an established thing that we could call the metaverse.”

I’m also curious how you’re looking into the cryptocurrency space.

From a competition perspective, it’s early days. My colleague Mairead McGuinness, who is responsible for financial matters, is of course watching this very carefully. The European Central Bank and the different national central banks are also considering making a digital euro — basically to take back part of the essential monetary policy that allows you to actually issue means of payment. It’s not a cryptocurrency, it’s a real official currency, so to speak.

But it’s something that we follow very closely, because when it comes to things like payment and financing, it’s really complex and very often it’s hidden from people what they actually pay.

Do you see there being potential competition issues in the crypto space? Decentralization is supposed to be a big element of it, but there are currency exchanges and NFT marketplaces that are turning out to be very close to gatekeepers.

Well, we haven’t had complaints about it yet. So it still remains to be seen what specific cases may develop in the future.

I wanted to talk about the Digital Services Act, which is the counterpart to the Digital Markets Act — I’m specifically interested in the liability of platforms for illegal content. To start, could you lay out how that works?

The Digital Services Act addresses both content and products. When it comes to products, platforms would have an obligation to know their business customers — to make sure that your products are safe, that people can come back and complain about them, that they have their consumers’ rights.

When it comes to content, there are two obligations. First would be to take down things that are considered to be illegal if flagged, while at the same time making sure that you can complain if content of yours is taken down. Second, that companies do a horizontal risk assessment as to whether or not their services can be misused to undermine democracy or put people’s mental health at risk. If these risks are being found, you need to mitigate them.

When it comes to liabilities for the platform themselves, that would only kick in if people can make the assumption that it’s actually the platform that was behind this. I think it’s easier to understand when it’s physical products — if you don’t realize that you’re dealing with a business that is on a platform, but think that it’s actually the platform that you’re dealing with, then the platform may assume liability.

The question about removing illegal content is a very hot topic in the US. There’s a fear that adding this requirement is going to lead platforms to over-moderate and take down content indiscriminately — to say, look, we’d rather be safe than sorry — and that this might particularly affect vulnerable groups. And people might not fully understand the process of trying to get that content put back up. How do you see the Digital Services Act addressing that?

“There is a gray zone here where things are not illegal, but they may be hurtful for people.”

This is exactly why people should get notified and have a chance of having their post put back up again. It is a difficult thing to do, because there is a gray zone here where things are not illegal, but they may be hurtful for people. There may be conflicts coming from it, but it’s your right to say whatever you want, as long as it’s not illegal. So this is why we have crafted this balance. We think this will work — because of course over-removal is something we are trying to avoid.

If something like the Digital Markets Act successfully reduces the power of big gatekeepers and big tech platforms, what effect do you see that having on the other tech issues like disinformation?

I think different tools are needed. My colleague Věra Jourová has been responsible for working with platforms based on the EU code of conduct. We’ve just strengthened the code of conduct and all of the big platforms have signed up for it — TikTok, Google, all the big ones. I think the cooperation is quite good, and it has been essential during the pandemic. It’s really important for us not to think that there is one silver bullet. To make the digital world a fully integrated part of our world, many things need to happen at the same time. And I think the code of conduct combined with the regulation creates a balance and that is probably the most effective way to go about it.

I think there’s a broad understanding that people want to lessen the power of gatekeepers and have fewer central choke points for the internet, but at the same time, as we’ve seen with the war in Ukraine, the same people rely on these big central gatekeepers to cut off things they want to see taken offline, like Russian propaganda. It seems theoretically possible that if you lessen the power of these companies, you suddenly don’t have these points where you can exercise control over something like disinformation. Is that something that you’ve thought about?

It’s quite a special situation that we’re in, because we see the actions here as part of the sanctions — because Russia Today and Sputnik, they are completely state-controlled media. So we see them as part of the war machine. And the sanctions are why it works all over the European Union at the same time. But I don’t see that there is a need just to have a few gatekeepers for that to be effective, because even if there were many more outlets for these media, they would also be on board to do this.

What are the broader lessons that we can draw from the digital response to the war in Ukraine?

I think it’s a very old truth that the first victim in a war is truth, because it becomes increasingly difficult to trust what you’re being told. Having social media, of course, amplifies propaganda and misinformation to a completely different degree than just 10 years ago. As horrible as it is on the ground for the people who are being shelled, for people being bombed, it is also a war about how you see the war. This is why it’s so important that we have good cooperation with the platforms to make sure the propaganda is not allowed to remain on them.

For the last few years, we’ve been going through what people are dubbing the “techlash.” I’m curious if you think this is a temporary moment, or if there’s been a permanent shift that’s put public opinion on a different trajectory.

I think it’s a permanent shift, because from the very early days of digitization, maybe we didn’t really notice that it grew to have such an importance in our lives. But the thing was that as our digital world grew, it pushed back on where our democracy has a say — because the physical world became of less and less importance in many people’s lives, and what they do online takes up a lot of hours every day.

What is happening right now is that democracy is coming back in to say, well, no. Democracy counts in the digital world as well as the physical one. It must be that what we agree is illegal actually is illegal and is treated as such. And what is legal is legal and treated as such. And that, I think, is a permanent thing — that democracy is coming back to be able to govern our society when it’s digital.

Thank you, it’s been wonderful talking to you.

Thank you very much for doing this. I appreciate it.
