The mobile apps we use every day are surprisingly manipulative. Subtle design tricks known as “dark patterns” nudge us into doing what the app maker wants—be that buying products or continuing to scroll. But now, researchers are fighting back with a new tool that strips these unwanted features out of Android apps.
The term dark pattern was coined by user experience (UX) designer Harry Brignull in 2010 to highlight the way developers often tweak the layout or operation of websites and apps to control users’ behavior. They can include things like automatic opt-ins for services the user hasn’t asked for, deliberately confusing or time-consuming processes for unsubscribing or changing privacy settings, and endless feeds and incessant notifications designed to keep people clicking.
These practices can do significant harm, says Konrad Kollnig, a graduate student at the University of Oxford: they can encourage children to spend large sums of money in mobile games, make social media more addictive, and dupe people into giving up their data. So Kollnig and colleagues decided to build a tool that provides a user-friendly way to remove these manipulative design features from popular apps.
“We currently see many examples of apps posing potential risks to individuals, including their autonomy, privacy, and well-being. Worse, user choice over these harms is often limited,” says Kollnig. “Individuals currently struggle to exercise their freedom to change apps in the same way they fix their car or microwave. Why shouldn’t it be similarly simple?”
The tool, dubbed GreaseDroid, is due to be presented at the ACM Conference on Human Factors in Computing Systems in May. It allows users to apply a variety of ready-made modifications called patches to their apps via a simple web portal. These patches are instructions written in common programming languages that tell the GreaseDroid software how to edit an app’s code to remove or alter design features that support dark patterns.
To use the tool, the user selects the app they want to alter and then browses a library of patches each targeted at different dark patterns. Once they’ve selected the patches they want, the GreaseDroid software applies the modifications and provides a link to download a bespoke version of the app.
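A patch of this kind might, in principle, look something like the following sketch: a short script that scans an app's decompiled code and strips out calls to a feature the user wants gone. The function name, file layout, and the `NotificationHelper;->notifyUser` target are illustrative assumptions for this example, not the actual GreaseDroid patch format.

```python
# Hypothetical sketch of a GreaseDroid-style patch. It assumes the app
# has already been decompiled (e.g., with a tool like Apktool) into a
# directory of .smali source files, and removes every line that invokes
# an assumed notification helper before the app is rebuilt.

from pathlib import Path

def remove_notification_calls(decompiled_dir: str) -> int:
    """Strip lines invoking the (hypothetical) notification helper from
    every decompiled file; return how many lines were removed."""
    removed = 0
    for path in Path(decompiled_dir).rglob("*.smali"):
        lines = path.read_text().splitlines(keepends=True)
        # Keep only lines that don't call the targeted method.
        kept = [ln for ln in lines
                if "NotificationHelper;->notifyUser" not in ln]
        removed += len(lines) - len(kept)
        path.write_text("".join(kept))
    return removed
```

In a real pipeline, the modified sources would then be recompiled and repackaged into the bespoke app the user downloads; the sketch above covers only the editing step.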
Some patches will be app-agnostic, says Kollnig, while others will work only on specific apps. The tool is not yet ready for public release, but to demonstrate its potential, the team showed how it could be used to remove two features of the Twitter app that make it more addictive: notifications about other users and disappearing tweets called “Fleets.”
Colin Gray, an assistant professor at Purdue University who studies dark patterns, says his research has found consumers are often aware they’re being manipulated and coerced by their apps, but rarely know how to respond.
“I am very excited by the work that GreaseDroid brings to the foreground—namely, the use of what might be considered ethical ‘hacking’ to allow consumers to respond to addictive and manipulative threats that are present in apps on their smart devices,” he says.
“This proposed system, even as a rhetorical device, is useful to unpack what kinds of rights consumers should have, and how these rights might intersect or conflict with the rights of app developers or platforms.”
There’s still work to do before the tool can be made available to everyday users, though, Kollnig admits. For a start, the legality of these kinds of modifications is not entirely clear. And Kollnig says it may be difficult to extend the approach to iPhone apps, because users can install them only through Apple’s tightly regulated App Store.
Creating patches that effectively target dark patterns without disrupting an app’s functionality also requires significant development experience. This means the approach will rely on building an active community of patch developers, though Kollnig points to the large community of developers who build web-browser extensions as evidence that this should be feasible.
While the research contains some good ideas, Jason Hong, a professor at Carnegie Mellon University, says the community-driven approach the team is relying on presents a lot of security issues. “A user is essentially installing arbitrary code onto their device, which can be highly risky,” he says. “That patch can make the app do anything, and you don’t have the protection of the Google Play store anymore.”
Kollnig agrees this is a concern and his group is currently working on ways to mitigate the risk. One option is to institute a review mechanism, similar to other community-driven projects like Wikipedia, where patches are scrutinized by other developers before being included in the library.