Uber’s business model is incredibly simple: It’s a platform that facilitates exchanges between people. And Uber’s been remarkably successful at it, in everything from shuttling people around town to delivering food.
Yet its expected IPO valuation may pale in comparison to the wealth of data it collects. If you use Uber, it holds a treasure trove of data about you, including your location, gender, spending history and contacts. It may soon even know whether you’re drunk or not.
Uber’s hardly alone. The biggest digital platforms – Airbnb, Facebook, eBay and others – are collecting so much data on how we live that they already have the capability to manipulate their users on a grand scale. They can predict behavior and influence our decisions on where to click, share and spend.
While most platforms aren’t using all these capabilities yet, manipulation through behavioral psychology techniques can occur quietly and leave little trace. If we don’t establish rules of the road now, it’ll be much harder to detect and stop later.
A platform can be any space that facilitates transactions between buyers and sellers. Traditional examples include flea markets and trading floors.
A digital platform serves the same purpose but gives the owner the ability to “mediate” its users while they’re using it – and often when they’re not. By that we mean it can observe and learn an incredible amount of information about user behavior in order to perfect what behavioral scientists call “choice architecture,” inconspicuous design elements intended to influence human behavior through how decisions are presented.
For example, Uber has experimented on its drivers to determine the most effective strategies for keeping them on the road as long as possible. These strategies include playing into cognitive biases such as loss aversion and the tendency to overestimate low-probability events, even if a driver is barely earning enough money to make it worth her while. Drivers end up like gamblers at a casino, urged to play just a little longer despite the odds.
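The mechanics of such a nudge are disarmingly simple. The sketch below is purely hypothetical – it is not Uber’s code, and the function name and numbers are invented for illustration – but it shows how an earnings-goal prompt can exploit loss aversion by framing logging off as losing progress:

```python
def earnings_nudge(current_earnings: float, goal: float) -> str:
    """Hypothetical loss-aversion prompt. Framing the gap to an
    arbitrary goal as a near-miss urges the driver to keep going,
    regardless of whether the extra time is actually worth it."""
    remaining = goal - current_earnings
    if remaining <= 0:
        # Goals can always be reset upward, so the prompt never ends.
        return "Goal reached. A new goal awaits!"
    return f"You're only ${remaining:.2f} away from ${goal:.0f}. Keep driving?"

print(earnings_nudge(54.00, 60))
```

Note that nothing in the logic depends on whether continuing to drive is in the driver’s interest; the choice architecture works purely on how the decision is framed.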
Uber didn’t immediately respond to a request for comment.
Airbnb also experiments with its users. It has nudged hosts to lower their rates and accept bookings without screening guests – which creates real risks for hosts, particularly when they are sharing their own apartment.
While these examples seem relatively benign, they demonstrate how digital platforms are able to quietly design systems to direct users’ actions in potentially manipulative ways.
And as platforms grow, they only become better choice architects. With its IPO’s huge influx of investor money to fund more data collection and analysis, Uber could move into dangerously unethical territory – and it’s easy to imagine how.
For example, if the app recognizes that you are drunk or in a neighborhood you rarely travel to – and one that its data show is high in crime – it could charge you a higher rate, knowing you’re unlikely to refuse.
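In code, a rule like that would be trivial to implement. The following sketch is invented for illustration – no platform is known to use it – but it shows how readily inferred user state could feed a surcharge:

```python
def quote_multiplier(is_impaired: bool, in_unfamiliar_area: bool,
                     area_crime_rate: float) -> float:
    """Hypothetical exploitative pricing rule: raise the fare precisely
    when the rider is least able or willing to refuse. The thresholds
    and surcharges are arbitrary, for illustration only."""
    multiplier = 1.0
    if is_impaired:
        multiplier += 0.25  # rider unlikely to comparison-shop
    if in_unfamiliar_area and area_crime_rate > 0.7:
        multiplier += 0.35  # rider motivated to leave quickly
    return multiplier

print(quote_multiplier(is_impaired=True, in_unfamiliar_area=True,
                       area_crime_rate=0.9))
```

The point is not that this exact code exists, but that the data and the pricing lever are already in the same hands, and nothing technical prevents combining them.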
And it’s not all speculation.
That’s one reason lawmakers and regulators have begun scrutinizing the difficult, interrelated roles of behavioral science and technology. And some companies have been investigated for a host of bad business practices.
But most of the manipulation we’ve identified and worry about is not expressly illegal. And because regulators are often unable to keep pace with the ever-evolving use of technology and choice architecture, that’s likely to remain so.
Given the absence of well-defined and enforceable legal guardrails, platform companies’ propensity to exploit behavioral science at users’ expense will remain largely unchecked.
An ethical code
One solution, in our view, is establishing an ethical code for platform companies to follow. And if they don’t adopt it willingly, investors, employees and users could demand it.
We reviewed hundreds of ethical codes, including ones targeted at tech and computing companies. Based on our research, we urge digital platforms to adopt the following ethical guidelines:
All choice architecture employed on a platform should be fully transparent. Platforms should disclose when they are using the tools of behavioral science to influence user behavior.
Users should be able to make choices on the platform freely and easily, and choice architects should limit behavioral interventions to reminders or prompts that are the least harmful to user autonomy.
Platforms should recognize the power they possess and take care not to exploit the markets they’ve created, including by abusing information asymmetries between themselves and users or opposing reasonable regulations.
Platforms should avoid using choice architecture that discourages users from acting in their own best interests. As Nobel Prize-winning behavioral economist Richard Thaler has put it, we should only “nudge for good.”
While the combination of behavioral science and technology can significantly enhance our lives, it also makes it easier than ever for companies to manipulate users to enhance their bottom lines.
This article was originally published at The Conversation.