When digital space becomes a paradise for harassers
In the heart of Digital Bangladesh, where technology promises progress and connectivity, a darker reality lurks in the shadows of social media platforms and messaging apps. Women across the country often face digital violence with devastating consequences that ripple through their lives, families, and society at large.
Farzana Sithi, who became a familiar face during the July-August uprising for raising her voice against the authoritarian Awami League regime, faced severe cyberbullying in what appears to be a targeted attempt to discredit her advocacy. "They posted trimmed clips of me and turned them into weapons against me. While the officer present at the scene received a medal for 'handling the situation well,' I became the target of relentless online attacks, trolling and bullying," she recounted. The scale of the assault was staggering. "I discovered 117 fake accounts using my name and photos on Facebook. They used AI to create deepfake videos of me that gained millions of views," she shared. The most disturbing part of it was that the majority of those attacking her—questioning her character, clothes, and her very existence—were other women.
Her experience demonstrates a disturbing pattern where online violence is deliberately used to intimidate women activists and distort public narratives around their contributions to social movements.
Recent statistics paint a disturbing picture: gender-disaggregated case data since 2016 show that women account for around 70 percent of cyberbullying victims, compared to 30 percent for men. A 2019 study published in the Asian Journal of Psychiatry found that cyberbullying victimisation affects up to 55 percent of users globally, with Bangladesh showing a concerning 32 percent prevalence rate among youth aged 14-17 years. More troubling still, 27.3 percent of these victims developed psychiatric disorders, with rates of major depressive disorder significantly higher among victims than non-victims.
Moreover, fear of victim-blaming often prevents women from seeking justice in cybercrime cases. Dhaka University student Promiti Sankar Atri recently discovered that an unknown person had accessed her private social media photos and created a fake account to distribute these personal images without her consent. When encouraged to pursue legal action, she gave a response that revealed a deeper systemic issue: "I am going to get blamed for it and harassed with inappropriate questions instead. It would be of no help." Her hesitation reflects a common predicament faced by female victims of digital harassment, who must weigh the trauma of their privacy violation against the potential for further victimisation within the legal system. Recent data from the Police Cyber Support for Women unit, formed in 2020, reveals the scope of this institutional failure: between 2020 and April 2023, the unit received 34,605 complaints, 26,592 of them from women. Alarmingly, 8,947 victims explicitly refused to pursue legal action, indicating a deep distrust of the system.
Even when victims overcome the fear and the hassle and do pursue legal help, it often leads to dead ends. Farjana Akter, a Dhaka-based writer, shared her experience from 2022: "When I received explicit content from someone using their real account, I decided to take legal action. But the system seemed designed to discourage victims." Police initially refused to file her complaint due to jurisdictional issues, and even after she managed to register the case, she was caught in an endless loop of hollow assurances. After three months of pursuing justice, the system's sluggishness essentially helped the perpetrator escape accountability.
Digital violence against women is increasingly amplified by coordinated campaigns of misinformation and disinformation. Initial harassment often escalates with the creation and spread of false narratives, manipulation of content, mass distribution through fake accounts, and the use of deepfake technology to create compromising content. Deepfakes pose significant security risks across international, national, and personal domains, potentially destabilising political environments, manipulating elections, and disproportionately harming women through non-consensual explicit content.
According to ActionAid's 2022 report, the consequences are severe: 65.07 percent of victims suffered psychological trauma, including depression and anxiety; 42.79 percent lost confidence in online expression; and nearly a quarter experienced a devastating loss of self-dignity.
The existing legal framework reveals critical gaps in addressing digital violence against women. There are no clear definitions for various forms of online harassment, making it difficult for victims to prove their cases. Additionally, the requirement for victims to file complaints in person, coupled with the absence of anonymous reporting mechanisms, creates significant barriers for women who fear social stigma. The absence of specific timelines for investigation and prosecution means cases can drag on indefinitely, leading many victims to abandon their pursuit of justice. Perhaps most critically, there are no provisions for emergency protection orders or immediate content removal, leaving victims vulnerable to ongoing harassment while their cases slowly move through the system.
Addressing digital violence and creating safer digital spaces require a comprehensive, innovative approach that goes beyond traditional legal frameworks. Legal reform must begin by reclassifying cyberbullying as a cognisable offence and establishing specialised cybercrime units with gender-sensitive training. Fast-track courts for digital violence cases and stricter penalties for creating and spreading deepfake content are essential steps forward.
A mandatory AI ethics training programme should be established, requiring social media and tech platform employees to undergo rigorous certification in digital consent and gender-based harassment prevention. Simultaneously, the government should develop a unified, encrypted national reporting platform powered by advanced AI tools capable of detecting and flagging potential harassment patterns, with blockchain technology ensuring evidence preservation.
Education must be a cornerstone of this strategy, integrating comprehensive digital safety and consent modules into the curricula from primary to tertiary levels. These programmes should teach responsible digital citizenship, focusing on understanding consent, recognising harassment, and promoting respectful online interactions. Complementing this, a corporate accountability framework would impose legal penalties on social media platforms that fail to respond to harassment reports promptly, with mandatory quarterly transparency reports and the establishment of a national digital ombudsman office.
To support victims, the government should create a tech-enabled support network featuring counselling resources, anonymous support groups, and real-time legal consultation channels. Free digital security audits and protection services would provide immediate assistance to those experiencing online harassment. Additionally, international collaboration will be crucial, with cross-border mechanisms for tracking and reporting digital violence, enabling the sharing of technological solutions and best practices across different jurisdictions.
The promise of Digital Bangladesh cannot be fully realised until digital spaces are safe for all citizens, regardless of gender. As we advance technologically, we must ensure that progress doesn't come at the cost of women's safety and dignity in the digital landscape. This requires not just institutional change, but a fundamental shift in how society views and responds to digital violence.
Mahiya Tabassum is a writer and journalist, and is a sub-editor at The Daily Star. She can be reached at mahiya.t16@gmail.com.
Views expressed in this article are the author's own.