News

Security Stop-Press: Warning Over Amazon’s Human Voice Mimicking Plans For Alexa

A Global Cybersecurity Advisor at ESET has warned that Amazon’s plans to enable the Alexa voice assistant to mimic human voices (living or dead) could be used to launch deepfake audio attacks on some voice authentication security systems. The advice from some security experts is that, if Amazon goes ahead with voice mimicking for Alexa, it may be wise to switch from voice authentication, e.g. for bank accounts, to another verification method such as online banking via a smartphone app.


Don’t take our word for it, see what our clients say

It has been a refreshing experience. We received trusted advice throughout our cloud, security and IT services overhaul. Always on the end of the phone, resolving issues quickly and without fuss.

- Karen Cotton -