TV & Film

Sunny Leone launches AI avatar, warns against falling prey to deepfakes

Photos: Collected

Popular Bollywood actress Sunny Leone has expressed concern over the misuse of artificial intelligence (AI) technology, saying she has fallen prey to deepfakes for years. The actress recently became the first Bollywood star to have her own AI avatar, which she launched at a grand event in Mumbai on Wednesday.

Sunny called her decision to have her AI avatar a 'progressive risk'. "Today, celebrities already have their AI avatars and clones made on the internet by unauthorised individuals. The cloning is anyway going to happen, so I thought, why don't I make my avatar before anyone else does it so that I can have control over it," she shared.

The actress also voiced concern over the growing trend of using AI technology to create questionable videos of celebrities and women, and urged victims to report such incidents to the proper authorities.

The deepfake issue gained attention in India when actor Rashmika Mandanna expressed her concern about a deepfake video featuring her in November 2023. She called for action, stating that the technology had been misused, as reported by various Indian media channels.

The viral video showed a British-Indian woman dressed in black inside an elevator with her face edited using AI to resemble Rashmika.

In an interview with India Today, commenting on actor Anushka Sen's morphed photos being shared online with derogatory comments, Sunny said the trend has been going on for years, stating, "These things have happened to me, but honestly, I don't think about it much. I don't let it affect me psychologically or mentally."

"But there are young girls who sometimes have to face the stigma, but they should understand it's not their fault. They did nothing wrong. If something like this happens, they can always go to the cyber cell and brief the officers about the case," the actor said.

"Tell them your identity and likeness has been misused. The police will take action. And even on social media, technical help is available to report these issues. The system is with you, you just need to do it," she added.

Commenting on the growing trend of celebrity deepfakes, Sunny said celebrities cannot take precautions, as it all depends on the mindset of the person creating this malicious content. The actor emphasised that everyone is still trying to figure out how to work with AI.

"It is the shiny new penny at the moment, and everyone is trying new things out. More than the fear of being replaced, celebrities are worried about their likeness being misused," the actor shared during the interview.

"We saw that happening a lot last year where fake photos, videos and even voices were put out, leading to a menace... It's a menace that's been going on for a long time. It's not a recent issue as many believe it to be," Sunny pointed out.


