Deceptive design patterns and where to find them – a guide provided by the European Data Protection Board
On 14 February 2023, the European Data Protection Board (EDPB) adopted guidelines on deceptive design patterns in social media platform interfaces. The Guidelines are addressed to social media providers, designers and users of social media platforms, and aim to give guidance on how to identify and avoid so-called “deceptive design patterns” in social media interfaces that infringe on the requirements of the European Union's General Data Protection Regulation (GDPR). That being said, such patterns are not unique to social media platforms, so these Guidelines may prove useful for all players in the digital industry.
In this article, we take a closer look at the Guidelines, exploring what deceptive design patterns are, how you can recognise them, and how you can avoid them by following the EDPB's best practices.
What are deceptive design patterns?
In general, "deceptive design patterns” or "dark patterns" are interfaces and user journeys designed to influence users into making unintended, unwilling and/or potentially harmful decisions. They aim to exploit known cognitive biases through, for example, colour choices and content placement, and can be used to achieve a multitude of outcomes, ranging from nudging a user to sign up for a newsletter to making the user buy goods or services they did not intend to buy.
Addressing the use of deceptive design patterns generally falls within the ambit of consumer protection legislation. However, patterns affecting user decisions concerning the use of their personal data may (also) amount to an infringement of data protection legislation, opening those patterns up to scrutiny by data protection authorities.
Whilst the use of dark patterns is not limited to social media platforms, the EDPB Guidelines focus on those patterns that affect user decisions concerning the processing of their personal data within the context of social media platforms.
The Guidelines include an annex listing design patterns that the EDPB considers deceptive, grouped into six main categories:
- Overloading: when users are buried under large amounts of requests, information, options or possibilities in order to discourage them from going further and to push them to keep or accept a certain data practice
- Skipping: designing the interface or user journeys so that users forget or do not think about all or some of the data protection aspects
- Stirring: when a platform is influencing the decision-making of users by tapping into their emotions or using visual nudges
- Obstructing: when users are hindered or blocked from accessing information or managing their data by making the action difficult or unattainable
- Fickle: when the interface is inconsistent and erratic, making it difficult for users to understand the nature of the processing, to make informed decisions concerning their data, or to locate the different controls
- Left in the dark: when the interface is designed in such a way that information or controls regarding data protection are hidden, leaving users unaware of how their data is processed and what controls they have over it.
It is important to note that this list of patterns is not exhaustive, and data protection supervisory authorities are free to consider other design patterns as deceptive.
Where might you encounter deceptive design patterns on a social media platform?
Deceptive design patterns may be present when users:
- open a social media account
- are provided with information on their rights and events affecting them
- manage their consent and data protection settings
- exercise data subjects’ rights
- leave a social media account.
The deceptive design patterns can be divided into content-based and interface-based patterns. The content-based patterns are applicable to the content, the wording and the context of the sentences and other information displayed. The interface-based patterns relate to how the content is displayed, browsed through, or interacted with.
Interplay between deceptive design patterns and the GDPR
Deceptive design patterns may enter the ambit of data protection legislation if they are incompatible with certain requirements of the GDPR. Key GDPR principles which may be of relevance include the principle of accountability, data protection by design, data minimisation, transparency, easy access to rights, informed consent and purpose limitation.
Further, the fundamental principle of fairness of processing laid down in Article 5(1)(a) GDPR has an umbrella function, ensuring that any deceptive design pattern which is not compatible with the overarching principle of fairness would be caught even if the pattern complies with other, more specific data protection principles. The fairness principle requires that personal data be processed lawfully, fairly and in a transparent manner, and that it not be processed in a way that is detrimental, discriminatory, unexpected or misleading to the individual. Therefore, if the social media interface provides insufficient or misleading information to users and fulfils the characteristics of a deceptive design pattern, it could be classified as unfair in terms of the GDPR.
The principle of accountability requires the social media provider to comply with the principles of the GDPR as well as to be able to demonstrate such compliance. In the context of designing a social media platform, accountability may be demonstrated by certain elements of the social media interface, which in turn may help prevent the design of the platform from being qualified as deceptive from a GDPR perspective. For example, the user interface and user journey can be used as a documentation tool to demonstrate that users, during their actions on the social media platform, have read and taken into account data privacy information, have freely given their consent, or have easily exercised their rights.
One of the core provisions of the GDPR that may be assessed against the design patterns used in a social media interface is the principle of transparency laid down in Articles 5(1) and 12(1) GDPR. In the context of a social media platform, the principle of transparency requires the provider to inform the users of the processing of their personal data in a “concise, transparent, intelligible and easily accessible form, using clear and plain language”.
The Guidelines on Transparency provide more guidance on the specific elements of transparency. Deceptive design patterns that, for example, hinder or block users from receiving appropriate information, or deliberately make it difficult for users to understand the nature of the processing, may infringe this particular principle of the GDPR. Furthermore, deceptive design patterns may be particularly impactful for vulnerable persons, such as minors and the elderly. For this reason, the transparency rules of the GDPR require additional safeguards in the case of, for example, children, requiring any information addressed to a child to be in clear and plain language that children can easily understand.
Data protection by design and default
Article 25 GDPR requires data controllers to implement appropriate technical and organisational measures designed to comply with data protection principles. The Guidelines on Data Protection by Design and by Default provide some key considerations which social media platform operators should consider when designing a social media interface and which could also be highly relevant in respect of deceptive design patterns. These considerations include providing the users with the highest degree of autonomy, and ensuring that the processing corresponds to the users' reasonable expectations. Overall, complying with the principle of data protection by design may, in itself, prevent the design patterns of a social media platform from being qualified as deceptive.
Examples of dark patterns used during the lifetime of a social media account
Opening of a social media account
- Continuous prompting. For example, when a user is being pushed to provide more personal data than necessary for the purposes of processing by being repeatedly asked to provide additional data or to consent to a particular processing.
- Emotional steering. For example, when wordings or visual elements are used to make users feel good, safe or rewarded, or, on the contrary, to make users feel anxious, guilty or punished. An example of Emotional steering is when a social media platform asks users to share their geolocation by stating something like: "We see that you are all alone? Let us share your location and help us connect you with others to make the world a better place." Such motivational language could encourage users to subsequently provide more data because they might feel that what is proposed by the social media platform is what most users will do and thus the “correct way” to proceed.
- Hidden in plain sight. Where a visual style is used that nudges users away from data protection friendly options towards more invasive ones, for example by presenting the data protection friendly option in a small font size or in a colour which does not contrast sufficiently with its background to be easily readable.
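Whether an option is "hidden in plain sight" often comes down to measurable presentation choices such as text contrast. As a purely illustrative sketch (not part of the Guidelines), the WCAG 2.1 contrast-ratio formula can be used in a design review to check whether an option's styling offers enough readability; the colour values below are hypothetical examples:

```python
# Illustrative sketch: flagging low-contrast interface options using the
# WCAG 2.1 contrast-ratio formula. Colour values are hypothetical examples,
# not taken from the EDPB Guidelines.

def _linear(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light (WCAG 2.1 definition)."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of a colour given as 'rrggbb'."""
    r, g, b = (int(hex_colour[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linear(r) + 0.7152 * _linear(g) + 0.0722 * _linear(b)

def contrast_ratio(foreground: str, background: str) -> float:
    """Contrast ratio between two colours, from 1:1 (worst) to 21:1 (best)."""
    l1, l2 = relative_luminance(foreground), relative_luminance(background)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# WCAG requires at least 4.5:1 for normal-sized text.
print(contrast_ratio("000000", "ffffff"))  # black on white: ~21:1, readable
print(contrast_ratio("aaaaaa", "ffffff"))  # pale grey on white: fails 4.5:1
```

A check like this will not catch every "hidden in plain sight" pattern (placement and sizing matter too), but it makes one dimension of the pattern objectively testable.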
Staying informed on social media
- Language discontinuity. When the information on the data processing is not available in the user's language, even though the online services are offered in that language and are addressed to residents in the user's country. This may cause uncertainty amongst users not speaking the particular language on how their data is being processed. Language discontinuity could also emerge when the relevant information webpage switches to a default language that is different than the one the user has previously selected.
Staying protected on social media
- Conflicting information. For example, when conflicting, unclear or unintelligible information is given by the platform, leaving users unsure of what they should do and of the consequences of their actions. As a result, users are likely to take no action and simply keep their default settings.
- Look over there. For example, this pattern presents a data protection related action together with another, competing element, which will likely make users forget about their primary intent concerning the data protection related action. In particular, humour should not be used to downplay the risks or obscure the substance of the information.
- Decontextualizing. For example, when information regarding data protection is located on a page that is out of context. As a result, users will find it highly difficult to find the information on how to choose and save their settings as it is not intuitive to look for it on that specific page.
Exercising data subjects’ rights
- Dead end. For example, this deceptive design pattern occurs when a clear explanation of how to exercise a right is missing, or when a link that is supposed to redirect to, for example, a privacy notice is broken.
- Inconsistent interfaces. For example, an interface is inconsistent when it offers users data protection friendly choices but does so in an unclear and unusual manner. When this specific interface differs from the others and is not what users expect, they may end up not knowing how to control their data protection settings. Users may also be confused by the use of different symbols or icons across different devices, and are likely to take more time finding the controls they know from another device.
- Longer than necessary. For example, this deceptive design pattern attempts to make the exercise of a right harder due to the number of steps or clicks required. Users should not be discouraged by additional questions such as: "Do you really want to do so? Why do you want to do this?"
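The "Longer than necessary" pattern is, at its core, an asymmetry between user journeys: exercising a data protection right takes more steps than a comparable data-creating action. As a minimal, hypothetical sketch of how a design review might flag such asymmetry (the journeys and step counts below are invented for illustration):

```python
# Hypothetical design-review check: flag a rights journey that takes more
# steps than a comparable "data-creating" journey on the same platform.
# The journeys and step counts below are invented for illustration only.

SIGNUP_STEPS = ["enter email", "choose password", "confirm email"]

DELETE_ACCOUNT_STEPS = [
    "open settings",
    "find the 'account' tab",
    "answer 'Do you really want to do so?'",
    "answer 'Why do you want to do this?'",
    "re-enter password",
    "confirm deletion",
]

def longer_than_necessary(rights_journey: list, baseline: list) -> bool:
    """True when exercising a right takes more steps than the baseline journey."""
    return len(rights_journey) > len(baseline)

print(longer_than_necessary(DELETE_ACCOUNT_STEPS, SIGNUP_STEPS))  # True: worth reviewing
```

Counting steps is obviously a crude proxy, but it gives design teams a concrete, reviewable signal rather than a purely subjective judgement.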
Leaving a social media account
When users choose to leave a social media account, they often have the choice to either leave the platform permanently or disable their account temporarily. Users have the right to permanently leave a social media platform pursuant to Article 17(1) GDPR.
Whether the platform is left permanently or temporarily, different deceptive design patterns may occur. An example of a content-based pattern is Ambiguous wording or information. In this situation, users can, for example, only delete their account through links named "See you” or "Deactivate”. Such wording makes it unclear that the link leads to the account deletion process; instead, users are likely to think of other functionalities, such as logging off until the next use or temporarily deactivating their account.
Best practices suggested by the EDPB to avoid dark patterns
The Guidelines include an annex in which the EDPB provides an overview of best practices that can be used to design an interface which is GDPR compliant and free of dark patterns. Examples include:
- Shortcuts: Links to information, actions or settings that can be of practical help to users in managing their data protection settings should be made available.
- Bulk options: Putting options that have the same processing purpose together, while still leaving users the possibility to make more granular changes.
- Privacy policy overview: At the start / top of the privacy policy, include a table of contents with headings and sub-headings that shows the different passages the privacy notice contains.
- Cross-device consistency: When the social media platform is available through different devices (e.g. computers, smartphones, etc.), settings and information related to data protection should be located in the same places across the different versions and should be accessible through the same journey and interface elements.
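The "Bulk options" best practice can be pictured as a settings model in which options sharing a processing purpose sit behind one switch, while granular per-setting overrides remain possible. The purpose and setting names below are hypothetical, for illustration only:

```python
# Hypothetical sketch of the "bulk options" best practice: settings that
# share a processing purpose are grouped under one bulk switch, and a
# granular per-setting override always wins over the bulk value.
# Purpose and setting names are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class PurposeGroup:
    purpose: str
    bulk_enabled: bool                  # one switch for the whole purpose
    overrides: dict = field(default_factory=dict)  # granular per-setting changes

    def is_enabled(self, setting: str) -> bool:
        """A granular override takes precedence over the bulk switch."""
        return self.overrides.get(setting, self.bulk_enabled)

ads = PurposeGroup("personalised advertising", bulk_enabled=False)
ads.overrides["ads based on pages you follow"] = True  # granular opt-in

print(ads.is_enabled("ads based on your location"))     # False (bulk value)
print(ads.is_enabled("ads based on pages you follow"))  # True (override)
```

The design choice here is that the bulk switch is only a default: a single action covers the whole purpose, yet users who want finer control are never locked out of it.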
How can this impact your business?
Using deceptive design patterns in social media platform interfaces, even unintentionally, may lead to infringements of various areas of law, including data protection legislation. Businesses should therefore be equipped with appropriate tools and resources to be able to recognise and avoid the use of such patterns when designing their social media interface and its content. For this purpose, it is recommended that businesses familiarise themselves with the categories of dark patterns identified by the EDPB, as well as with the various best practices recommended by the EDPB to avoid the use of such patterns.
It is also important to keep in mind that deceptive design patterns are not unique to social media platforms. The EDPB notes that the Guidelines focus on social media platforms because the influence of these platforms on people's daily lives is constantly growing. However, the EDPB also expressly states that strong opinions on this issue were voiced during the public consultation on the Guidelines, pointing out that dark patterns are present in many other contexts where users interact with products and services, for example online shops, video games and mobile applications. Therefore, businesses that are not providers of social media platforms but are active in industries exposed to dark patterns should keep a close eye on the application of these Guidelines, as it could provide a general indication of European data protection authorities' approach to dark patterns in the future.