Samsung's dangerous new technology can create fake porn videos from just one image

Deepfake videos are normally created with artificial intelligence from many images of a person. Such a fake video can show someone doing things they never actually did. But Samsung has made this technology far more dangerous: a deepfake video can now be created from just one image.
That photo could simply be taken from a profile picture on social media. The technology was developed at Samsung's AI lab in Russia. The lab turned the Mona Lisa painting into a realistic video in which Mona Lisa shakes her head, blinks, and moves her neck. Samsung has also created videos of several celebrities from a single photo each.
Samsung has named this technology "realistic neural talking heads". Previously, deepfake videos of this kind were mostly made of celebrities: because countless images of celebrities are available, producing fake videos of them was easy. The new technology, however, can produce a fake video of anyone from just one picture. With it, blackmailers and criminal minds could create fake videos of anyone using nothing more than a social media profile photo.

The technology still has many flaws. After watching a video made with it, viewers can tell it is counterfeit, but these defects will likely be fixed soon, as the system gradually eliminates its errors through learning. After that, there could be a flood of fake videos on the internet.
