In our articles on deepfakes we have seen how these platforms, based on artificial intelligence, can imitate reality almost to perfection, in both video and images. On this occasion we present a new use of the same technology, unfortunately employed to commit illegal acts.

In March, scammers used artificial-intelligence-based software to impersonate the chief executive of the German parent company of an unnamed energy firm headquartered in the United Kingdom, deceiving his subordinate, the CEO of a subsidiary, into making a supposedly urgent money transfer requested over the phone. The CEO made the requested transfer to a Hungarian supplier and was then contacted again with assurances that the amount would be reimbursed immediately. That too seemed credible. However, the reimbursement had still not appeared in the accounts when a third call came, this time from Austria, in which the caller again claimed that the parent company's chief executive needed another urgent transfer. At that point the CEO grew suspicious. Despite hearing what sounded like his boss's voice, he refused to make the transfer, realizing that something was wrong. Although he recognized the chief executive's familiar accent and intonation, it turned out that his boss was not the one making the call. The funds transferred to Hungary were later moved to Mexico and elsewhere, and the authorities have yet to identify any suspects.

By impersonating someone else on the phone, a voice fraudster can gain access to private information that would otherwise be unavailable and use it for nefarious purposes. Faking another person's identity by voice is easier than ever, thanks to new audio tools and a growing reliance on call centers for services (rather than going to the bank and speaking to a teller face to face, for example). As the tools for creating these forgeries improve, so do the chances that criminals will use voice-based AI to mimic our voices and turn them against us.

These techniques can also affect the world of cryptocurrencies, since some platforms handle customer service by phone; if attackers manage to fake the voices of support staff, or even of a project's founder, users' assets could be put at risk.