ABikerSailor
Diamond Member
Okay, yeah, I get it: computer brains can build algorithms that work on their own, and they're often useful because they can recommend additional products the consumer might buy. But there has to be a limit, or at least some kind of human supervision checking on these programs, to make sure crap like this isn't happening.
It seems that if you buy something on Amazon, there is an algorithm on their website that will recommend bomb-making materials if you buy stuff as innocent as cooking supplies.
Amazon ‘Reviewing’ Its Website After It Suggested Bomb-Making Items
LONDON — Amazon said on Wednesday that it was reviewing its website after a British television report said the online retail giant’s algorithms were automatically suggesting bomb-making ingredients that were “Frequently bought together.”
The news is particularly timely in Britain, where the authorities are investigating a terrorist attack last week on London’s Underground subway system. The attack involved a crude explosive in a bucket inside a plastic bag, which detonated on a train during the morning rush.
The news report is the latest example of a technology company drawing criticism for an apparently faulty algorithm. Google and Facebook have come under fire for allowing advertisers to direct ads to users who searched for, or expressed interest in, racist sentiments and hate speech. Growing awareness of these automated systems has been accompanied by calls for tech firms to take more responsibility for the contents on their sites.
Amazon customers buying products that were innocent enough on their own, like cooking ingredients, received “Frequently bought together” prompts for other items that would help them produce explosives, according to the Channel 4 News.
Considering the way the world is right now, do we REALLY need companies writing algorithms that help people build homemade explosives? Even if they don't buy the stuff from the company, they still get a really good idea of how to build them, because they get a list of the ingredients.
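For what it's worth, a "Frequently bought together" feature is usually just unsupervised co-purchase statistics: count which items land in the same cart and surface the top pairings, with no human judging whether a pairing is appropriate. A minimal sketch of that idea (toy data and made-up item names; Amazon's real system is obviously far more sophisticated) shows why nobody has to deliberately "write an algorithm for explosives" for a bad pairing to surface:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical order history: each order is a set of product names.
# A real retailer would mine millions of orders; this is a toy example.
orders = [
    {"flour", "sugar", "yeast"},
    {"flour", "sugar", "butter"},
    {"flour", "yeast", "salt"},
    {"sugar", "butter", "eggs"},
]

# Count how often each pair of items appears in the same order.
pair_counts = defaultdict(int)
for order in orders:
    for a, b in combinations(sorted(order), 2):
        pair_counts[(a, b)] += 1

def frequently_bought_together(item, top_n=3):
    """Rank items co-purchased with `item` by raw co-occurrence count."""
    scores = defaultdict(int)
    for (a, b), count in pair_counts.items():
        if a == item:
            scores[b] += count
        elif b == item:
            scores[a] += count
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(frequently_bought_together("flour"))  # e.g. ['sugar', 'yeast', 'butter']
```

The point is that the ranking comes purely from what past buyers put in the same cart. If enough people co-purchase two innocent-looking chemicals, the system recommends them together automatically, which is exactly why some human review layer on top of these suggestions matters.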