Future Tense

Amazon Is Suggesting “Frequently Bought Together” Items That Can Make a Bomb

Amazon suggests you buy a lot of things, but it’s probably best not to follow the advice.

David Ryder/Getty Images

When you go to Amazon to buy something, Amazon will suggest other products that are “frequently bought together.” This feature can be practical—if you’re buying a Swiffer Wet Jet, it’s probably wise to pick up the cleaning fluid it uses, too—or just strange. But as an investigation from a British television station pointed out Monday, sometimes Amazon’s suggestions can amount to a deadly combination.

A team from Channel 4 News found that Amazon has been prompting customers looking to buy “a common chemical used in food production” to also purchase other ingredients that, together, could be used to produce black powder. (The report did not specify exactly which ingredients, for obvious reasons.) Further down the page, according to the report, Amazon also nudged the customer to buy ball bearings, which can be used as shrapnel in homemade explosives.

Amazon has responded by saying it’s reviewing its website to make sure that products are “presented in an appropriate manner.” Still, the report comes at a time of heightened fear of terrorist attacks in the U.K. On Friday, a homemade bomb left in a plastic bucket detonated on a crowded train car in London, injuring 30 people. It was the fifth terrorist attack in Britain this year.

Though the “frequently bought together” items on Amazon aren’t illegal on their own, the Channel 4 News report noted that there have been successful prosecutions in the U.K. against people who buy chemicals that can be combined to make a bomb.

Amazon’s “frequently bought together” suggestions are generated algorithmically, which puts Amazon on a growing list of major tech companies currently under fire for relying on algorithms that surface troubling results. On Sept. 14, ProPublica released a report detailing how Facebook lets advertisers target ads directly to people who describe themselves as “Jew haters” or are interested in the topic of “how to burn Jews.” Slate took the investigation further, finding Facebook suggested we send ads to users interested in topics like “killing bitches,” “threesome rape,” and “killing Haji.” BuzzFeed also found that Google’s ad targeting tool suggested ad buyers consider targeting users with topics like “black people ruin neighborhoods” and “Jewish control of banks.” Google and Facebook have both said that they are working to change how they let advertisers target users.
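To make the underlying mechanics concrete: the simplest form of a “frequently bought together” recommender just counts how often items co-occur in past orders. This is only a minimal illustrative sketch, not Amazon’s actual system (which is unpublished and far more sophisticated); the item names and order data are invented.

```python
from collections import Counter

def frequently_bought_together(orders, item, top_n=3):
    """Return the items most often purchased in the same order as `item`.

    orders: a list of orders, each a list of item names.
    This naive co-occurrence count is the core idea behind
    "frequently bought together" features; real systems add
    filtering, scoring, and safety/policy checks on top.
    """
    co_counts = Counter()
    for order in orders:
        if item in order:
            # Count every other item that appeared alongside `item`.
            co_counts.update(i for i in order if i != item)
    return [i for i, _ in co_counts.most_common(top_n)]

# Toy purchase history (hypothetical data):
orders = [
    ["wet jet", "cleaning fluid", "pads"],
    ["wet jet", "cleaning fluid"],
    ["wet jet", "pads"],
    ["broom", "dustpan"],
]

print(frequently_bought_together(orders, "wet jet", 2))
# → ['cleaning fluid', 'pads']
```

The point the article makes follows directly from this design: the counting step has no notion of what the items *are*, so if customers repeatedly buy a dangerous combination together, the algorithm will happily recommend it unless a human-designed filter intervenes.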

The whole point of algorithms is that they work on their own, so that Amazon doesn’t need to hire a person to sit there and think of items that might go well together. But that hands-free design doesn’t excuse Amazon, or any other tech company, from keeping a close eye on how its automated systems work in practice.