News

Security Stop-Press: Warning Over Amazon’s Human Voice Mimicking Plans For Alexa

A Global Cybersecurity Advisor at ESET has warned that Amazon’s plans to enable the Alexa voice assistant to mimic human voices (dead or alive) could be used to launch deepfake audio attacks on some voice authentication security systems. The advice from some security experts is that, if Amazon goes ahead with voice mimicking for Alexa, it may be wise to switch from voice authentication, e.g. for bank accounts, to another verification method, such as online banking via a smartphone.


Don’t take our word for it, see what our clients say

As key project members in many of our IT projects, whether actively assisting or simply advising, what they do not know about IT, in our opinion, isn’t worth knowing!

- Nadia Mullins-Hills -