Consumers, and likely employees too, are getting wise to your nudges. That’s the conclusion from researchers studying nudge design and behavioral science. Simon Shaw, a contributing columnist at Behavioral Scientist, recently undertook research to investigate how nudging works in the wild. The work matters for AI designers because it hints at how humans respond when their mental model of a machine’s intent changes.
The question the researchers wanted to answer was whether marketers are chasing a moving target: the initial conditions for a nudge, and therefore its initial effectiveness, may no longer hold because of the oversaturation of behavioral interventions.
Using a representative sample of more than 2,000 British people, the researchers tested a variety of real-world scarcity nudges (for example, “only 2 rooms left!”) and social proof nudges (“16 other people viewed this room”) used by a hotel booking site.
Two-thirds of the British public (65%) interpreted the scarcity and social proof claims as sales pressure, and about half (49%) said they were likely to distrust the company as a result of seeing them. The researchers were surprised by how many people simply did not believe the claims, and by how many said the nudges reduced their trust in the company.
But what really surprised the researchers was that a third (34%) described their reaction as contempt or disgust. The messages were, in their view, designed to induce anxiety. The result was negative attitudes toward the company, showing that nudging carries real risks and is not a neutral strategy.
This is an important finding for AI product designers because, as the researchers hypothesize, it stems from a phenomenon called psychological reactance: people push back when they feel they are being coerced.
This leads to an important insight: heuristics are dynamic, not static. In human-centered AI design this is particularly relevant because the machine’s response is itself dynamic, which adds flexibility in delivering nudges but also raises the risk of getting them wrong.
Designing for feedback and co-learning between human and machine is complex, experimental, and often counterintuitive. We work with clients to co-create human-machine relationships that thrive in an ecosystem where human behavior is context-dependent and remains fabulously unpredictable.