News

Security Stop-Press: Warning Over Amazon’s Human Voice Mimicking Plans For Alexa

A Global Cybersecurity Advisor at ESET has warned that Amazon’s plans to enable the Alexa voice assistant to mimic human voices (dead or alive) could be used to launch deepfake audio attacks on some voice authentication security systems. Some security experts advise that, if Amazon goes ahead with voice mimicking for Alexa, it may be wise to switch from voice authentication (e.g., for bank accounts) to another verification method, such as online banking via a smartphone.


Don’t take our word for it, see what our clients say

“Many thanks again for all of your work getting us up and running in the cloud and getting us organised, another happy customer.”

- Karen Cotton -