Clifford Chance

Talking Tech

Digital Services Act: considerations for influencer marketing

Will algorithmic transparency give brand owners greater comfort?

Influencers | Intellectual Property | Artificial Intelligence

29 March 2021

In December 2020, the European Commission (EC) unveiled a proposed package of measures on the governance and operation of digital services in the EU: the Digital Markets Act (DMA) and the Digital Services Act (DSA). The legislative initiatives have two main goals: (i) to create a safe online space in which the fundamental rights of all users are protected; and (ii) to establish a level playing field amongst tech businesses to foster innovation, growth and competitiveness in the EU and beyond. The proposals for each initiative set out how the EC intends to reshape digital business practices within the EU and increase accountability and fairness online.

As covered in our previous article on the Digital Markets Act and Digital Services Act, the proposed initiatives will have significant legal and commercial implications for digital businesses operating in the EU. In this article, we assess what impact the DSA may have on businesses and influencers engaging in influencer marketing, given the obligations it proposes to place on social media platforms.

How will the DSA affect social media platforms?

The DSA aims to improve the mechanisms for the removal of illegal content and to give internet users greater protection through increased transparency and greater public control online. The proposed DSA would apply to all online intermediary services that offer their services within the single market, including social media platforms.

Social media platforms (e.g. Facebook, Instagram, Twitter, TikTok) would be considered "Online Platforms" under the proposed DSA. Accordingly, they would be subject to certain obligations under the DSA, including transparency reporting, providing a point of contact, creating trusted flaggers, transparency of online advertising and reporting criminal offences. Social media platforms should therefore be taking practical measures now to ensure that they have the appropriate policies and processes in place to comply with such obligations.

With regard to social media platforms, the DSA will address three key issues: (i) the use of algorithms; (ii) content moderation; and (iii) targeted advertising. All three are interlinked and may affect how businesses and influencers use social media platforms to market goods and services in the future.

How will the DSA affect influencer marketing?
Algorithm transparency

At their inception, social media feeds were simply chronological. However, as the number of users and the volume of content increased, social media platforms made their feeds algorithmic in order to: (i) show users the content the platform providers think they will want to see, based on their personal data and previous engagements; and (ii) facilitate content moderation. Whilst algorithmic content recognition can be useful for the moderation of harmful content, algorithms can also introduce discrimination, leading to censorship of expression and restriction of access to information. This affects both influencers and brand owners using social media platforms for business purposes.

Specific groups may be vulnerable to unconscious (or even conscious) biases embedded in the notification systems and algorithms currently employed by social media platforms. For example, in the summer of 2020, at the height of the Black Lives Matter movement, a number of black influencers complained that their content was being wrongly censored in cases where white influencers posting similar content were not.

Influencers have also raised the issue that they have no control over the algorithm and therefore no control over who views their content. For example, there are rumours of "shadow bans" where content is limited for an unexplained reason and influencers have complained that sponsored posts reach fewer people as a result. This can be a concern for brand owners engaging in influencer marketing as there is no guarantee as to how many people will see an advertisement.

The DSA proposes that very large online platforms with more than 45 million active users in the EU, which will include the majority of the top social media platforms, will have to provide transparency on the main parameters of the decision-making algorithms used to offer content on their platforms, and provide options for users to modify those parameters. Clarity as to how each social media platform's algorithms operate and what factors determine which content is displayed will benefit all social media users: individuals, businesses and influencers.

Greater transparency about the algorithms may force social media platforms to address any issues of discrimination or unfairness, which may in turn lead to a greater variety of content being disseminated and viewed. It may also reveal what types of content the algorithms are pre-disposed to hide, forcing social media platforms to justify their reasoning for doing so. More widely, this could benefit society as a whole: algorithms that prevent users from receiving alternative information can increase polarisation and create social media "echo chambers".

For those engaging in influencer marketing, this could lead to more accurate forecasts as to how a particular piece of content may perform. There may be greater certainty as to what posts will appear where and why, which may assist users on the platform in their content creation. Influencers may be able to better tailor their content and target their audiences which could in turn lead to greater engagement on both organic and sponsored content. Influencers may find that their accounts reach more people and they can then reap greater rewards by attracting more brand partnerships.

In relation to the latter, the price a brand owner negotiates with an influencer is often based on the amount of content they will produce and that influencer's average rate of engagement. However, due to the unpredictable nature of the algorithm, it is difficult to predict how many people a certain piece of content will reach. As such, brand owners may end up paying the agreed amount to the influencer regardless of whether certain KPIs or metrics are achieved (e.g. number of likes/upvotes; number of views; number of shares; number of follow-through purchases etc.). Consequently, algorithmic transparency may give brand owners greater comfort and confidence in launching influencer marketing campaigns as content placement becomes less elusive.
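
By way of illustration only, the short sketch below (in Python) shows how an influencer's engagement rate is typically derived and why a flat fee agreed up front leaves the brand owner exposed to unpredictable reach. All of the figures, and the flat-fee model itself, are hypothetical assumptions for this example; they are not industry standards and are not prescribed by the DSA.

# Hypothetical illustration: the figures and the flat-fee model are assumptions,
# not industry standards or anything prescribed by the DSA.

followers = 100_000
avg_engagements_per_post = 4_500               # likes + comments on recent posts
engagement_rate = avg_engagements_per_post / followers

agreed_fee = 2_000.00                          # flat fee per sponsored post, paid regardless of KPIs
expected_reach = 40_000                        # forecast based on past performance
actual_reach = 15_000                          # reach actually delivered by the algorithm

print(f"Engagement rate: {engagement_rate:.1%}")                       # 4.5%
print(f"Cost per expected view: {agreed_fee / expected_reach:.3f}")     # 0.050
print(f"Cost per actual view: {agreed_fee / actual_reach:.3f}")         # 0.133
# The influencer receives the same fee in both scenarios; greater algorithmic
# transparency would narrow the gap between expected and actual reach.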

Greater content moderation

In addition to algorithmic content moderation, the majority of social media platforms allow users to flag content they deem inappropriate for review by the platform. A number of sector-specific EU regulations already broadly cover content moderation in relation to (i) child sexual abuse material, (ii) racist and xenophobic hate speech, (iii) terrorist content, and (iv) content infringing intellectual property rights. However, the DSA aims to empower users to report illegal content in an easy and effective manner, which should further help counter illegal content available on social media platforms.

Given the large amount of user-generated content available on social media platforms, the DSA proposes mechanisms to alleviate the burden of content moderation by the platforms alone. This includes the creation of so-called "trusted flaggers" who may be vetted by public authorities. To balance this increase in moderation, safeguards will also be introduced to allow users to monitor and challenge a platform's content moderation decisions.

As the content creators, influencers will also have to accept greater responsibility and accountability for the content they post online. As content moderation becomes more stringent and platform providers are obliged to take remedial action (e.g. removing posts; suspending accounts), influencers will need to ensure more than ever that their content is appropriate and not misleading or illegal. Moreover, brand owners and influencers will need to continue working together to carry out due diligence on the content uploaded to social media platforms, ensuring that they have the appropriate rights and licences to use such content and mitigating the risk of intellectual property infringement claims by third parties. Failure to do so could lead to adverse consequences for the influencer's reputation and monetisation model (e.g. rescinded sponsorships).

It is also worth noting that the EU mechanism will remain limited to "illegal" content only, and not extend expressly to "harmful" content. Most social media platforms already have their own content moderation mechanisms to flag "harmful content", however, this is an undefined term and varies between platform providers. In addition, flagging mechanisms can be problematic and erroneously remove legitimate and lawful content.

The DSA seeks to resolve the wrongful removal of content by implementing new mechanisms that seek to mitigate the risks of unjustified blocking of speech (e.g. the ability to contest any moderation decision). Such new mechanisms will be welcomed by influencers who have repeatedly had legitimate content removed unfairly (e.g. malicious followers flagging harmless content; algorithms mistakenly identifying content to be harmful).

Targeted advertising

The use of personal data to tailor advertising to the user is becoming increasingly controversial. It is generally viewed as yielding better returns for businesses, as they can target customers based on their internet activity. The DSA proposes a number of transparency obligations that will likely have a significant impact on how digital businesses use targeted advertising moving forwards.

The DSA proposes that it will need to be clear to users when they are viewing an advert and on whose behalf the advert is displayed. Users will also have the right to know why a certain advert was shown to them, including when the logic was based on profiling. Further, users will have the right to object to advertising based on profiling. This will likely have significant consequences for social media platforms, who rely on advertising for revenue, and businesses seeking to target specific audiences online.

If data-rich social media platforms are restricted by the DSA in how they use and share consumer data for the purposes of targeted advertising then businesses may choose to divert their advertising spend to other forms of digital marketing, such as influencer marketing. Influencers are usually asked to share their account metrics with brand owners before partnering with them, which includes providing statistics on their own engagement with followers (e.g. average number of likes and comments; aggregated statistics in relation to age, gender and location of followers). Whilst this would be a less precise method of targeting specific consumers, influencers who know their communities well should be able to provide insights into the demographics of their audience as well as their likes and dislikes based on previous interactions. For example, a childrenswear brand will have some certainty that a parenting influencer’s audience will include some of their target customers.

It is not clear how effective the current proposals under the DSA will be in practice, and the DSA does not specify whether access to this demographic information will be affected. If it is not, this could remain a way for businesses to access some statistics about the people they are targeting. However, this approach is less effective than targeted advertising: it provides only an overview of the audience and, given the unpredictability of the algorithms, there is no guarantee as to how many of those followers will even see the advert.

Implications of Brexit

The Brexit transition period ended on 31 December 2020, meaning that the UK is no longer bound by EU treaties and laws. Therefore, if the DSA comes to pass, the UK will not be obliged to implement it into national law. However, due to the inherently international nature of all things digital and social media, UK-based digital businesses will still need to carefully assess how their activities will be affected by the DSA if they have customers in the EU.

Further, it may be that the UK Government enacts similar legislation. In December 2020, the UK Government published a report on the Online Harms white paper which provided certain details regarding the UK Government's approach to digital regulation in the future. For instance, certain forms of online advertising (including influencer marketing) will potentially be subject to additional regulatory measures. One key point where this diverges from the DSA is that whilst the DSA will only legislate against illegal content, the Online Safety Bill proposes to legislate against both illegal and harmful content. For now, we will continue to watch this space and report on any further developments in the future.

Conclusion

It remains to be seen whether the DSA will be brought into force and whether all of its proposals will be adopted in their current form. Its aims of creating greater transparency online and giving users more control over their data will place a number of obligations on digital businesses and platform providers, who will need to take preparatory action now to ensure compliance in the future. This will include developing the necessary internal policies and processes to moderate content and to give greater transparency over the algorithms and targeted advertising practices used on their platforms.