Saturday, December 28, 2024

AI ‘Surveillance Pricing’ Practices Under Federal Probe

In 2006 mathematician Clive Humby called data “the new oil”—a raw commodity that, once refined, would fuel the digital economy. Since then big tech companies have spent vast sums of money honing algorithms that gather their users’ data and scour it for patterns. One result has been a boom in precision-targeted online advertisements. Another is a practice some experts call “algorithmic personalized pricing,” which uses artificial intelligence to tailor prices to individual consumers.

The Federal Trade Commission uses a more Orwellian term for this: “surveillance pricing.”

In July the FTC sent information-seeking orders to eight companies that “have publicly touted their use of AI and machine learning to engage in data-driven targeting,” says the agency’s chief technologist Stephanie Nguyen. The orders are part of an effort to understand the scale of the practice, the kinds of user data that are being gathered, the ways algorithmic price adjustments might affect consumers and the question of whether collusion or other anticompetitive practices could be involved. “The use of surveillance technology and private data to determine prices is a new frontier,” Nguyen says. “We want to bring more information about this practice to light.”

The orders, which can be enforced similarly to subpoenas, required the eight companies to submit reports outlining their surveillance pricing practices by September 2. One of the companies to receive an order, Revionics, builds AI-powered systems that its website calls “price optimization solutions.” Revionics “does not, in any way, conduct operations related to the surveillance of consumers,” says Kristen Miller, the company’s vice president of global communications. The other companies being scrutinized are Mastercard, JPMorgan Chase, e-commerce platform Bloomreach, consulting firms Accenture and McKinsey & Company, and software companies TASK Software and PROS. None of the eight companies have been accused of anything illegal by the FTC.

The agency’s ongoing investigation was sparked by a growing awareness that companies are using AI and machine learning to track certain categories of user data—such as age, location, credit score or browsing history—which many people probably wouldn’t deliberately share.

“What’s frightening is that a company could know something about me that I had no idea they could find out and I would have never authorized,” says Jean-Pierre Dubé, a professor of marketing at the University of Chicago Booth School of Business. “These are the kinds of things where the FTC might really be onto something.” If companies are deploying AI to gather consumer information that hasn’t been knowingly shared, he says, prices might be getting personalized along “dimensions that aren’t acceptable.”

Nguyen adds that consumer surveillance extends beyond online shopping. “Companies are investing in infrastructure to monitor customers in real time in brick-and-mortar stores,” she says. Some price tags, for example, have become digitized, designed to be updated automatically in response to factors such as expiration dates and customer demand. Retail giant Walmart—which is not being probed by the FTC—says its new digital price tags can be remotely updated within minutes. And e-commerce platform Instacart offers AI-powered “smart carts”: physical carts that can be used to scan items and that are equipped with screens that display personalized ads and coupons.

Movie Tickets and Mortgages

Surveillance pricing is a modern iteration of a much older practice called “personalized pricing”—adjusting prices based on an estimation of a customer’s willingness to pay. A vendor selling fruit in a bazaar in 2000 B.C.E. would probably charge a wealthy landowner more than they would a peasant, just as a modern car salesperson likely wouldn’t offer the same deal to someone who arrives in a Porsche as they would to someone who pulls up on a rusty bike. Flexibly adapting prices to individual customers’ budgets maximizes profits, and it can open doors for lower-income consumers who might otherwise be priced out of the market. Education is an illustrative example: universities sometimes offer more robust aid packages to students from diverse backgrounds or with lower socioeconomic status to maximize fairness and diversity. Further evidence of potential consumer benefits comes from a study Dubé conducted involving two movie theaters, both of which were offering discounts to customers located closer to the competition. (Theater A would offer cheap tickets to moviegoers who lived near theater B, and vice versa.) The result was win-win: both theaters ended up attracting more customers, who in turn saved money by spending less on tickets.

This approach doesn’t always work out so well in other markets, however. When personalized pricing is applied to home mortgages, lower-income people tend to pay more—and algorithms can sometimes make things even worse by hiking up interest rates based on an inadvertently discriminatory automated estimate of a borrower’s risk rating, for example.

Algorithms are now taking personalized pricing from the observable realm to a more shadowy domain. Historically, the practice was based largely on observable traits and information gleaned through face-to-face interactions. Customers could therefore try to game the system: seeking a better deal, a wealthier person buying a new car might leave the Porsche and the Armani suit at home. But in a world of mostly unregulated data collection (at least in the U.S.) and AI processing power, brands may be tweaking prices in more surreptitious ways that are much harder to get around.

Imagine, for example, that you’re shopping online for a new coffee machine on a site that leverages AI to personalize the prices customers see. The algorithm could be factoring in your browsing history (you’ve been searching quite a bit for new coffee makers), your recent purchases (you’re a regular coffee drinker and bought an espresso machine two years ago), the time of day (it’s late in the evening, when your history shows you’re more prone to impulse buys) and your location (there aren’t many brick-and-mortar stores in your area selling coffee makers). Combined, these factors suggest a higher willingness to pay at this moment—and as a result, you see a slightly higher price. Meanwhile another person who is shopping for their very first coffee maker and is perhaps a little thriftier with their late-night spending could see a lower price for the same machine.
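The logic of that scenario can be sketched in a few lines of code. The following is a purely hypothetical toy model: every feature, weight and threshold is invented for illustration, and real systems would use trained machine-learning models on far richer data rather than hand-written rules.

```python
# Toy illustration (hypothetical): a rule-based "willingness to pay" score
# nudges a base price up or down. All features and weights are invented.

def personalized_price(base_price, shopper):
    score = 0.0
    # Repeated searches in the product category hint at strong intent.
    if shopper["category_searches"] >= 5:
        score += 0.03
    # A related past purchase suggests an established habit.
    if shopper["owns_related_product"]:
        score += 0.02
    # Late-night browsing: history shows more impulse purchases.
    if shopper["hour"] >= 22:
        score += 0.02
    # No nearby brick-and-mortar alternatives: less price pressure.
    if shopper["nearby_stores"] == 0:
        score += 0.03
    # A first-time, price-sensitive shopper might instead see a discount.
    if shopper["first_time_buyer"]:
        score -= 0.05
    return round(base_price * (1 + score), 2)

returning_shopper = {"category_searches": 8, "owns_related_product": True,
                     "hour": 23, "nearby_stores": 0, "first_time_buyer": False}
new_shopper = {"category_searches": 1, "owns_related_product": False,
               "hour": 23, "nearby_stores": 2, "first_time_buyer": True}

print(personalized_price(100.00, returning_shopper))  # 110.0
print(personalized_price(100.00, new_shopper))        # 97.0
```

Two shoppers looking at the same $100 machine at the same moment see different prices, and neither can tell which signals produced the difference.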

Your willingness to pay, in other words, could be gauged according to fine-grained demographic information and subtle patterns of behavior, some of which you might not realize are publicly available. “Even though the phenomenon itself is very old, algorithms allow sellers to reach a level of differentiation that they’ve never been able to before,” says Harvard Law School professor Oren Bar-Gill, who has studied the impact of algorithms within consumer markets.

But Is It Dystopian?

Some experts object to the FTC’s use of the word “surveillance,” arguing that it could imply a dystopian disregard for privacy. And even though the agency’s orders were passed with a rare unanimous vote among its five bipartisan commissioners, some had reservations about this phrasing. “This term’s negative connotations may suggest that personalized pricing is necessarily a nefarious practice,” wrote FTC commissioner Melissa Holyoak in an official statement. “In my view, we should be careful to use neutral terminology that does not suggest any prejudgment of difficult issues.”

That note of caution was echoed by some experts interviewed for this story, who agreed that the FTC should keep an open mind in its approach to surveillance pricing. Perhaps the situation calls for more emphasis on transparency rather than a blanket crackdown on all use of algorithms to personalize prices. The practice may even have advantages that the agency doesn’t yet fully understand. Through Miller, Revionics contends that it uses AI to find prices that benefit consumers as well as retailers.

“The fact that data is used in a certain way might mean that we should inform consumers about it so they will know, and they [can] decide whether they want to consent … and engage with that seller,” says Haggai Porat, a teaching fellow at Harvard Law School, who has studied the effects of algorithmic personalized pricing. “But that shouldn’t lead us to a conclusion that the practice itself is necessarily bad for consumers.”
