The world’s largest tech companies must comply with a sweeping new European law starting Friday that affects everything from social media moderation to targeted advertising and counterfeit goods in e-commerce — with possible ripple effects for the rest of the world.

The unprecedented EU measures for online platforms will apply to companies including Amazon, Apple, Google, Meta, Microsoft, Snapchat and TikTok, among many others, reflecting one of the most comprehensive and ambitious efforts by policymakers anywhere to regulate tech giants through legislation. It could lead to fines for some companies and to changes in software affecting consumers.

The rules seek to address some of the most serious concerns that critics of large tech platforms have raised in recent years, including the spread of misinformation and disinformation; possible harms to mental health, particularly for young people; rabbit holes of algorithmically recommended content and a lack of transparency; and the spread of illegal or fake products on virtual marketplaces.

Although the European Union’s Digital Services Act (DSA) passed last year, companies have had until now to prepare for its enforcement. Friday marks the arrival of a key compliance deadline — after which tech platforms with more than 45 million EU users will have to meet the obligations laid out in the law.

The EU also says the law intends “to establish a level playing field to foster innovation, growth and competitiveness both in the European Single Market and globally.” The action reinforces Europe’s position as a leader in checking the power of large US tech companies.

‘Dark patterns’

For all platforms, not just the largest ones, the DSA bans data-driven targeted advertising aimed at children, as well as targeted ads to all internet users based on protected characteristics such as political affiliation, sexual orientation and ethnicity. The restrictions apply to all kinds of online ads, including commercial advertising, political advertising and issue advertising. (In recent years, some platforms had already rolled out restrictions on targeted advertising based on protected characteristics.)

The law bans so-called “dark patterns,” or the use of subtle design cues that may be intended to nudge consumers toward giving up their personal data or making other decisions that a company might prefer. An example of a dark pattern commonly cited by consumer groups is when a company tries to persuade a user to opt into tracking by highlighting an acceptance button with bright colors, while simultaneously downplaying the option to opt out by minimizing that choice’s font size or placement.

The law also requires all online platforms to offer ways for users to report illegal content and products and for them to appeal content moderation decisions. And it requires companies to spell out their terms of service in an accessible manner.

For the largest platforms, the law goes further. Companies designated as Very Large Online Platforms or Very Large Online Search Engines will be required to undertake independent risk assessments focused on, for example, how bad actors might try to manipulate their platforms, or use them to interfere with elections or to violate human rights — and companies must act to mitigate those risks. And they will have to set up repositories of the ads they’ve run and allow the public to inspect them.

Just a handful of companies are considered very large platforms under the law. But the list finalized in April includes the most powerful tech companies in the world, and, for those firms, violations can be expensive. The DSA permits EU officials to issue fines worth up to 6% of a very large platform’s global annual revenue. That could mean billions in fines for a company as large as Meta, which last year reported more than $116 billion in revenue.

Companies have spent months preparing for the deadline. As recently as this month, TikTok rolled out a tool for reporting illegal content and said it would give EU users specific explanations when their content is removed. It also said it would stop showing ads to teens in Europe based on the data the company has collected on them, all to comply with the DSA rules.

“We’ve introduced new processes and features to provide greater transparency around our approach to advertising, content moderation and recommendation systems — and ultimately, give users more control over their TikTok experience,” TikTok spokesperson Morgan Evans said in a statement, adding that the company would continue investing in these efforts “for our global community too.”

Meta President of Global Affairs Nick Clegg said in a statement Tuesday that the company has “been supportive of the objectives of the DSA and the creation of a regulatory regime in Europe that minimizes harm.” Clegg, also a former deputy prime minister of the UK, said Meta assembled a 1,000-person team to prepare for DSA requirements.

He outlined several efforts by the company including limits on what data advertisers can see on teens ages 13 to 17 who use Facebook and Instagram. He said advertisers can no longer target the teens based on their activity on those platforms. “Age and location is now the only information about teens that advertisers can use to show them ads,” he said.

In a statement, a Microsoft spokesperson told CNN the DSA deadline “is an important milestone in the fight against illegal content online. We are mindful of our heightened responsibilities in the EU as a major technology company and continue to work with the European Commission on meeting the requirements of the DSA.”

In a Friday blog post, Microsoft Chief Digital Safety Officer Courtney Gregoire detailed how the company is strengthening safety on its Bing search engine to comply with the law.

Snapchat parent Snap told CNN that it is working closely with the European Commission to ensure the company is compliant with the new law. Snap has appointed several dedicated compliance employees to monitor whether it is living up to its obligations, the company said, and has already implemented several safeguards.

And Apple said in a statement that the DSA’s goals “align with Apple’s goals to protect consumers from illegal and harmful content. We are working to implement the requirements of the DSA with user privacy and security as our continued North Star.”

Google and Pinterest told CNN they have also been working closely with the European Commission.

“We share the DSA’s goals of making the internet even more safe, transparent and accountable, while making sure that European users, creators and businesses continue to enjoy the benefits of the web,” a Google spokesperson said.

A Pinterest spokesperson said the company would “continue to engage with the European Commission on the implementation of the DSA to ensure a smooth transition into the new legal framework.” The spokesperson added: “The wellbeing, safety and privacy of our users is a priority and we will continue to build on our efforts.”

Many companies should be able to comply with the law, given their existing policies, teams and monitoring tools, according to Robert Grosvenor, a London-based managing director at the consulting firm Alvarez & Marsal. “Europe’s largest online service providers are not starting from ground zero,” Grosvenor said. But, he added: “Whether they are ready to become a highly regulated sector is another matter.”
