Deepfakes That Look Perfectly Real Are on Their Way, Says Hao Li

Deepfake technology pioneer Hao Li told CNBC on Friday that perfectly real manipulated videos and images will become accessible to everyone within six months to a year.

Li is an associate professor of computer science at the University of Southern California. He said that most deepfakes can currently be spotted with the naked eye, but added that some are already very convincing and take considerable effort to identify. A deepfake is a video or other digital representation that has been manipulated using computers and machine-learning software so that it appears real.

The technology has raised many concerns about the confusion such creations can cause and about the spread of misinformation, particularly in global politics; misinformation has already been spread through social media apps. Asked by CNBC about recent developments, Li said he had recalibrated his timeline because of the emergence of the Chinese app Zao and the growing research focus on the field. He added that what remains is training with more data, followed by implementation.

Li said the technology will reach a point where detecting deepfakes is no longer possible and other kinds of solutions will be needed. He explained that detecting deepfakes requires knowing their limitations first: to build an AI framework capable of detecting perfectly real deepfakes, the same technology must be used to train it. It is therefore almost impossible to detect them without knowing how they work.

Li said the entertainment and fashion industries stand to benefit from deepfake technology, and that it could also make video conferencing more effective. The main problem lies in detecting videos made with bad intent, whether to deceive people or to cause other harm.
