Whatever you do, don’t say yes when this chatbot asks, ‘Can you hear me?’
Earlier I blogged a section from this LA Times article by David Lazarus, but the whole thing is at the link. From the article:
. . . As the scam plays out, the recorded voice will raise the possibility of a vacation or cruise package, or maybe a product warranty. She’ll ask if you could answer a few questions. Or she’ll make it sound like her headset is still giving her trouble and say, “Can you hear me?”
Don’t say yes.
Police departments nationwide have warned recently that a recording of an affirmative response can be edited to make it seem you’ve given permission for a purchase or some other transaction. There haven’t been many reports of losses, but a Washington state man reportedly was bilked out of about $100.
A recorded “yes” could also be used to deny refunds to any consumer who complains.
“If someone calls and asks, ‘Can you hear me?’, do not answer yes,” advised the Better Business Bureau. “Just hang up. Scammers change their tactics as the public catches on, so be alert for other questions designed to solicit a simple yes answer.”
Walker, the UC Santa Cruz computer whiz, has been teaching computers how to speak since the 1980s, when she worked as a researcher for the Natural Language Project at Hewlett Packard Laboratories in Palo Alto. She’s also done stints at Mitsubishi Electric Research Laboratories in Cambridge, Mass., and AT&T Labs in New Jersey.
Talking machines have been epitomized for years by the automated switchboards that drive most consumers crazy. But Walker said we’re seeing the next iteration of speech technology in the likes of Apple’s Siri and Amazon’s Alexa — devices that can respond to users’ requests and, to a limited extent, give the impression of conversation.
The next step, she said, will be computers that respond to voice commands to perform multiple tasks across multiple websites or platforms. For example, booking airline seats, a hotel and a rental car without a human having to look at a screen or touch a keyboard. . . .