🗣️ Automatic speech-to-text transcription.
Many of us have received dodgy emails asking for money, promising things too good to be true. Now artificial intelligence could be used to try and dupe us. A deepfake is an AI-produced copy of a human being, and this technology makes fraud of a new kind very, very possible. One bank is fighting deepfakes with deepfakes. For scammers, it is a powerful tool they can use to steal your money.

Welcome to this edition of SciTech from London, where we're delving into the rise of deepfakes: video, audio and pictures that appear to be real but are not. Just like that introduction there. That wasn't me; in fact, it was my avatar, created using artificial intelligence. Would you be able to tell the difference?

My avatar was made with my consent through a platform run by Synthesia. It produces AI-generated content for legitimate purposes like training videos, marketing and customer service, with strict safeguards in place.

"We will never generate an avatar of someone without their explicit consent. The other one is control: we apply content moderation on the videos that are created. And the third is around collaboration, so working with industry groups, working with governments and policymakers to make sure that this technology is developed and used responsibly."

Deepfakes that have malicious intent copy the image of real people from photos, videos and audio to misrepresent them or an organisation. Professor Michael Wooldridge, an AI expert at the University of Oxford, says the technology is rapidly improving.

"If you can take it to the point where you can be in a call with somebody that you think is your boss, looking at somebody's video, and it's an AI-produced video on the other end, then it starts to become very difficult to know what is real and what is fake."

In a survey published in July, just over 40% of respondents aged 16-plus said they'd seen a deepfake in the last six months, but fewer than one in ten were confident in their ability to spot one. One bank has now launched a campaign deploying its own deepfake content to raise awareness of the dangers.

"As you might have guessed, this isn't me. This is a deepfake, created to warn you about deepfakes."

Technology companies are also tackling deepfakes. This new phone feature pits AI against AI, helping users to spot what's real and what's not. The on-device AI carries out a facial scan of a person in an image or a video.

"The algorithm analyses things like the composition, whether or not there are any pixel imperfections, or whether or not there are any anomalies to the facial features of the person in that image or video. And crucially, it allows the user to see in real time whether that person is real."

"Let's take a look at an example. Specifically, if we have a look at this lady, what are your thoughts here? Real or fake?"

"That's difficult to tell, isn't it? I'm not sure I would know whether that was a deepfake or not."

"Let's have a look. If I click on the magic capsule here, you can see right away that the on-device AI is doing its work. It's carrying out a facial scan to determine if this person is real, and within seconds we have an answer. Here we can see that a suspected face swap has been detected, and the likelihood of this person not being real is 93%. This alerts the user that, firstly, the person is not real, but secondly, and crucially, they could be at risk of being caught up in some sort of scam or fraud."

"So basically now you're a technology company fighting technology?"

"Well, ultimately, for us this is about ensuring the privacy and security of our users. I think we all know that technology has been developing at quite a pace for some years now, thanks in part to AI, and inevitably there will be people who are determined to exploit that through the creation of so-called deepfakes. On an industry level, I think there's an onus on all of the key players to play their part in stemming the rise of deepfakes moving forward."

Fraud is not the only concern. Deepfakes are also fuelling intimate image abuse online. One victim, whose real identity is protected by this avatar, was deepfaked by a former friend.

"I felt confused and taken aback."

A documentary about this kind of deepfake abuse spawned a campaign group, now demanding tougher regulations to block publishing sites.

"I have spoken to over 30 survivors of this, and in every single case their lives have shrunk. It has had an impact on their job, on their career, on their sense of their body, on their sense of their identity. The stakes could not be higher."
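The segment mentions an on-device scan that looks for "pixel imperfections" as one signal of a face swap. Purely as an illustrative sketch, and not the phone maker's actual method, the idea of spotting a pasted-in region by its mismatched noise signature can be mimicked with a toy heuristic: compare high-frequency noise levels across blocks of the frame. All function names and the scoring heuristic below are invented for illustration; real detectors use trained neural networks.

```python
# Toy "pixel-anomaly" check: a spliced (face-swapped) region often carries
# a different sensor-noise signature from the rest of the frame.
import numpy as np

def noise_map(gray: np.ndarray) -> np.ndarray:
    """High-frequency residual: each pixel minus the mean of its 4 neighbours."""
    padded = np.pad(gray, 1, mode="edge")
    neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
             padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    return gray - neigh

def fake_likelihood(gray: np.ndarray, block: int = 8) -> float:
    """Crude score in [0, 1]: how uneven block-wise noise levels are.
    Higher means the frame's noise is inconsistent, a weak splicing cue."""
    resid = np.abs(noise_map(gray.astype(np.float64)))
    h, w = resid.shape
    stds = np.array([resid[i:i + block, j:j + block].std()
                     for i in range(0, h - block + 1, block)
                     for j in range(0, w - block + 1, block)])
    if stds.mean() == 0:
        return 0.0
    spread = stds.std() / stds.mean()   # coefficient of variation
    return float(min(1.0, spread))      # crude normalisation to [0, 1]

# Usage: a synthetic frame with a noisier "pasted" centre scores higher
# than a uniformly noisy one.
rng = np.random.default_rng(0)
clean = rng.normal(128, 2.0, (64, 64))
spliced = clean.copy()
spliced[16:48, 16:48] += rng.normal(0, 10.0, (32, 32))  # "pasted" region
print(fake_likelihood(clean) < fake_likelihood(spliced))
```

This is only a stand-in for intuition: it shows why a swapped face can betray itself at the pixel level, while production systems combine many learned features and report a calibrated probability, like the 93% figure in the demo.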