Let’s play a game—thought experiment. Imagine it’s the near future. You’re walking along a city street crowded with storefronts. As you walk past boutiques, cafes, and the Apple Store, your visage follows you. Thanks to advances in facial recognition and other technologies, behavioral marketers have developed the capacity to take your Facebook profile, transform it into a 3-D image, and insert it into ads. That sweater you’re eyeing? In the display, the mannequin wearing it takes on your face and shape. The screen showing a car commercial depicts you behind the wheel. At a travel agency (let’s pretend they still exist—after all, this is a thought experiment!), you see yourself sunning on a beach, while the real you is bundled up against the cold. The ads might show you with an attractive stranger or a lost love (after all, Facebook knows whom you used to date). Or they could contain scenes of you and your happy family. No longer do you have to picture yourself in the ad—technology has that covered.
Although the technology in our thought experiment doesn’t yet exist, many of the necessary components already do. There is Autodesk 123D Catch, a program that uses computer vision technology to transform simple photographs into 3-D objects. Facebook has its own facial recognition tools that help users identify and tag people in photos. Video games generate avatars using sophisticated motion capture techniques.
It seems that commercial actors and advertising models are well on their way to losing their jobs as the consumer becomes the star. Online ads already allow browsed items, like a pair of shoes, to follow one across the Internet like a “persistent salesman,” a practice called personalized retargeting. H&M uses “completely virtual models” to market its clothes. These practices can seem creepy at first, but customers generally adapt to them, and such advances are shifting what users are open to, and even expect, in the ads and content they see. While the highly developed personalized advertising featured in Minority Report remains science fiction, less sophisticated but still promising versions have been rolled out.
If the technology described in the thought experiment gets developed, how might it be used?
The future of behavioral advertising is especially hard to predict, given the headline-grabbing controversies that Facebook and other companies inching toward avatar advertising have faced. There are serious questions about consent, privacy, and data security. Debate rages between industry advocates who want behavioral advertising to remain self-regulating and those endorsing government regulation. Even the seemingly simple solution of opting out is contentious. Some are satisfied with the Network Advertising Initiative’s tool, which allows users to opt out of behavioral advertising from “member companies” that have placed cookies on their computers, while others advocate the stronger Internet equivalent of the “Do Not Call” registry, i.e., a “Do Not Track” option.
This conflict has serious economic consequences. Forbes notes that in order to prevent government regulation, “Ad technology companies will have to spend as much on privacy issues this year and next as they will on developing their new technologies and figuring out when to sell out to Google, Yahoo, or Facebook.” Lobbying efforts will intensify accordingly.
Given all this industry uncertainty, it would be foolish to attempt to predict what will happen. Nevertheless, we can consider several potential scenarios, each with its own ethical trouble spots.
For instance, consumers themselves could become split on the technology. Perhaps the most desirable consumers—affluent, young early adopters—will see the new advertising technology as cool and fun. Accordingly, a company, if permitted by law, may launch its avatar campaign by opting users into the advertising. But unless a massive cultural shift takes place, smart money says the privacy bell will ring. Companies tend to announce data ownership and management policies in fine print, buried deep in long user agreements that hardly anyone reads. Consequently, people often develop their own expectations about how their data will be used. “I didn’t imagine my image would appear here!” some users might exclaim when first encountering a targeted avatar ad at their favorite grocery store, having expected it to appear only in, say, the clothing store where they first opted in to the technology. This shock could prompt vocal concern about images being transferred to undesirable locations.
Advertisers may also have to make hard decisions about how to present these avatars. Right now, despite all the advances in computational power, it remains impossible to digitally duplicate reality. What’s a company to do when it can’t quite make images walk like us, talk like us, or gesture like us? It certainly won’t alienate customers by making them uglier … unless it wants to entice users to feel bad about themselves and invest in beauty, fitness, and sartorial products to change things. Or, perhaps companies will use avatars plucked from the “uncanny valley” to entice consumers to purchase upgrades to adorn their virtual selves. If such ads become ubiquitous, this may feel like extortion.
When Jean-Paul Sartre famously said “hell is other people,” he meant life can suck when others don’t affirm our idealized self-conceptions. Advertisers know this. Translating vanity to the visual, they could go with idealized avatars—slimmer, smoother-skinned versions of their real customers. While flattery isn’t inherently a problem, too much distortion can be dangerous. If idealization sucks the viewer in too deeply, the problem of deceptive practice, already at the center of the controversy over “magic mirrors,” could come to a head. Shirts simply look much better on the Brad Pitt and Angelina Jolie avatar versions of us—perhaps too good.