
Social media influencer and TikToker Alina Amir recently found herself at the centre of an online storm after an alleged MMS video, falsely attributed to her, began circulating across social media platforms. The four-minute clip was widely shared, accompanied by misleading claims, prompting speculation, gossip, and a surge of suspicious ‘watch full video’ links. Alina has since made it clear that the video is a deepfake, created using artificial intelligence to impersonate her.
Alina is a popular online content creator with a huge following. The trouble began when rumours of a leaked video started doing the rounds. Within days, the clip went viral, with messages and links promising access to the ‘original MMS.’ Many of these links lead to dangerous websites and increase the risk of online fraud.
After a week of silence, Alina finally addressed the controversy with a video message on Instagram. She condemned the fake clips and urged people to stop sharing unverified videos. In a moving appeal, she requested Punjab Chief Minister Maryam Nawaz to take strong action against those posting deepfake videos.
Alina also thanked the Punjab Cyber Crime Department for intervening in the matter and pointed out that such attacks are not limited to influencers; they can target anyone. She added that the fake videos are being sent to victims’ families, causing them severe emotional trauma.
Deepfake technology has heightened the risks of online harassment. Scammers need only a few images to create highly convincing videos intended to humiliate, blackmail, or silence women. Amir has a platform to speak out, but countless women face the same kind of harassment without one.
Cybersecurity professionals have long warned against clicking on suspicious links, especially those claiming to contain leaked or viral videos. These links are often used to deliver malware, steal users’ personal information, or give hackers access to banking apps and social media accounts. In most cases, users end up losing money chasing something that never existed.

Links promoted with blurry thumbnails, oddly specific dates, clickbait headlines, or pressure to act quickly are usually a trap. Typos in URLs, unfamiliar domains, and requests to download files or log in should raise red flags immediately.
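For readers who want a concrete sense of what these red flags look like in practice, here is a minimal, illustrative Python sketch that checks a link against a few of the warning signs listed above. The keyword list and domain endings are assumptions chosen for illustration only; a link that passes these checks is not necessarily safe, and no simple filter replaces basic caution.

```python
from urllib.parse import urlparse

# Illustrative red-flag heuristics only; real phishing detection is far more involved.
SUSPICIOUS_KEYWORDS = ["mms", "leaked", "viral-video", "full-video", "watch-now"]  # assumed examples
SUSPICIOUS_TLDS = [".xyz", ".top", ".click", ".zip"]  # commonly abused endings (assumption)

def red_flags(url: str) -> list[str]:
    """Return a list of red flags found in the given URL."""
    flags = []
    parsed = urlparse(url)

    # Red flag: not using HTTPS
    if parsed.scheme != "https":
        flags.append("no HTTPS")

    # Red flag: clickbait keywords baked into the address
    lowered = url.lower()
    if any(word in lowered for word in SUSPICIOUS_KEYWORDS):
        flags.append("clickbait keyword in URL")

    # Red flag: unusual or frequently abused domain ending
    if any(parsed.netloc.endswith(tld) for tld in SUSPICIOUS_TLDS):
        flags.append("unusual domain ending")

    # Red flag: the link points straight at a downloadable file
    if parsed.path.endswith((".apk", ".exe", ".scr")):
        flags.append("direct file download")

    return flags

if __name__ == "__main__":
    for link in ["https://example.com/article",
                 "http://watch-now-leaked-mms.xyz/full-video.apk"]:
        print(link, "->", red_flags(link) or "no obvious red flags")
```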
The alleged MMS video linked to Alina Amir has been confirmed as fake by the influencer herself. She stated that the clip circulating online was created using AI deepfake technology and does not feature her. There is no authentic or original MMS video, and all links claiming to show one are misleading.
Deepfake technology allows scammers to create highly realistic fake videos using only a few images. These videos are often used to harass, blackmail, or humiliate women, causing severe emotional trauma.
There is no genuine MMS video of Alina Amir. All circulating clips are deepfakes, and the links promoting them are misleading, as the influencer herself has confirmed.
Alina Amir publicly denied that the video features her and urged the authorities to take strict action against those creating and circulating deepfake content. She also raised awareness about cyber harassment and encouraged victims to report such incidents immediately.
Sharing, forwarding, or promoting deepfake or morphed videos can attract legal action under cybercrime, defamation, and IT laws. Even unknowingly sharing such content can lead to serious consequences.
Keep reading Herzindagi for more such stories.
Image Courtesy: Instagram