“Dark patterns” and personal data
“Dark patterns” are elements and mechanisms built into interfaces to steer user decisions in directions users would not necessarily take if they were faced with a fair and transparent interface.
The manipulation of individuals is a recurring feature of digital products and services, especially where personal data is collected. These practices often rely on deceptive designs (dark patterns) aimed at influencing individuals’ consent, creating friction in use, or even pushing individuals to share more data than necessary.
The problem of deceptive designs is not exclusive to data protection and privacy. These practices originally come from the advertising world, which devised them to convince individuals to consume more. The advent of the internet made it possible to adapt them to data. In the context of data protection, the use of manipulative designs by digital players first raises the question of the fairness of the service toward its users. It also calls into question the validity of consent: can consent be considered free, specific, informed and unambiguous when it has been obtained through a potentially abusive or misleading design?
What elements demonstrate the manipulative nature of a design?
Some answers can be found in a recent Princeton University study on dark patterns.
The observed patterns in question have been categorized as follows:
- Fooling the user (sneaking): patterns that misrepresent the user’s action, or hide or delay the display of information the user would object to if it were presented transparently;
- Establishing a sense of urgency (urgency): patterns imposing a deadline on a sale or promotion in order to create a sense of urgency in users and accelerate their decision-making and move to purchase;
- Misdirecting the user (misdirection): patterns relying on visuals, wording or emotional reactions to push users toward one choice over another;
- Exercising social pressure (social proof): patterns exploiting individuals’ conformist tendencies to accelerate decision-making and the move to purchase;
- Giving an impression of scarcity (scarcity): patterns signaling the limited quantity of, or high demand for, a product in order to increase its perceived value and desirability;
- Obstructing (obstruction): patterns designed to make certain actions more difficult than necessary in order to discourage the user from carrying them out;
- Forcing an action (forced action): patterns forcing the user to perform an additional action in order to complete the task initially undertaken.
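As a minimal sketch of the obstruction category above, consider the cost asymmetry a consent interface can impose. All names and step counts here are hypothetical, chosen only to illustrate the idea that refusal is made more laborious than acceptance:

```python
def steps_to_complete(choice: str) -> int:
    """Count the interface steps a user must perform for each choice
    in a hypothetical consent banner using an obstruction pattern."""
    if choice == "accept_all":
        return 1  # a single, prominent button on the first screen
    if choice == "refuse_all":
        # open the settings panel, uncheck each of 5 purposes, then confirm
        return 1 + 5 + 1
    raise ValueError(f"unknown choice: {choice}")

print(steps_to_complete("accept_all"))  # 1
print(steps_to_complete("refuse_all"))  # 7
```

The pattern lies entirely in the asymmetry: both choices are formally available, but one is deliberately made several times more costly to express.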
From consumer protection to personal data protection
First of all, “dark patterns” based on a coercive logic are common to both domains: they aim to discourage, or even prevent, users from carrying out actions that run directly against the economic interests of the service, such as cancelling a subscription or adjusting their privacy settings. Techniques based on FOMO (fear of missing out) are also shared, mostly in the social pressure and scarcity categories. In the digital field, these practices aim in particular to capture users’ attention in order to make them use a service as much as possible, and therefore to generate data. Another tactic, user “seduction”, is also used by these services to encourage data sharing. A service may present users with the advantages of sharing their data, for example an improved user experience through content personalization, or promise total confidentiality about what is shared (“it stays between us”), a technique used especially on social networks to get users to complete their profile.