Tag: WhatsApp groups

Liabilities of WhatsApp group admins: A critical legal analysis from Indian legal perspectives, by Dr. Debarati Halder

Picture courtesy: Internet

Over the years we have witnessed the gradual development of internet and digital communication technology, and a rapid growth in users who may or may not know the digital socio-legal culture. The internet and digital communication technology that I refer to here primarily includes WhatsApp. When this platform became popular in India from 2014–15 onwards, it also became a popular platform for forming opinions, disseminating news including fake news, and circulating harassing remarks about group members and other individuals who may not be group members but may be known to one or more group members. Soon Indian users could connect with users from other jurisdictions through WhatsApp, and the groups formed on WhatsApp became better connected than offline networks. Consider groups like law teachers' groups, or groups formed on the basis of common interest such as terrace gardeners, animal lovers, theological groups, chartered public vehicle commuters' groups, health service providers' groups etc.: members did not necessarily stay in the same locality, work in the same organization or speak the same vernacular language. What bonded them was their common interest. This was something more popular than Facebook, which ruled the internet during the 2012–18 era. Slowly WhatsApp became more popular with specific services, such as IPS or IAS association groups (non-official) and judicial officers' groups. The popularity grew because individuals could actually control who would view their opinions and the images that may have been 'consensually' shared by the members. It must not, however, be forgotten that WhatsApp has also notoriously become a platform for several online crimes, including crimes against the State, crimes against individuals, cybercrimes against women[1] and children,[2] economic crimes,[3] cyber terrorism[4] etc.

Recognising the stronger confidentiality setup of WhatsApp, workplaces and schools soon started their own WhatsApp groups. Presently almost all organizations, schools and educational institutes have their respective division/unit/team-based WhatsApp groups. Some of these groups are moderated and monitored by senior members of the organization, HR department members, the creator of the group or teachers (where they are the creators/members of the said groups). The bright side of the story is that people can get the necessary information on their phones and need not necessarily look into their email unless immediate verification is required. Email has now become more official, while WhatsApp groups are more personal. The negative aspect is the quick circulation of offensive, harassing and unwanted contents.

Here comes the question of the liabilities of three groups, especially regarding the creation, publication and circulation of offensive and unwanted contents. These liabilities may vary according to the age and position of the creators/publishers/circulators. 'Position' here necessarily means the website which is hosting the communication, the admin who is moderating or who may have created the group, and the general members who may be the creators/publishers/circulators of the content. These three groups are as follows:

Let me first start with the website. WhatsApp as the web platform of the communications, or Facebook as the parent company facilitating WhatsApp, may seek exemption from any legal tangle in cases of creation, circulation or publication of offensive contents by virtue of the due diligence clause, which they invoke in almost all such cases. For this purpose, we need to understand the Indian version of the due diligence law, which can be found in S.79 of the Information Technology Act, 2000 (amended in 2008); the first two sub-clauses address the points which may be used by the websites. To summarise:

Websites or intermediaries which provide services including web hosting services, search engines etc. (as per S.2(1)(w) of the Information Technology Act, 2000 (amended in 2008)) may not be liable for any third-party activities carried out on their web platforms if such activity (which includes creation/publication/circulation etc. of any offensive, harassing etc. contents) is not initiated by the website, the website does not select the receiver of the transmission, and the website does not select or modify the information contained in the transmission. The website or intermediary will also be excused from third-party liability if it has practised due diligence as per the laws, rules and guidance mandated by the Indian government. These Rules are set out in the Information Technology (Intermediaries Guidelines) Rules, 2011, for which further amendments have been proposed.[5] These Rules include the following responsibilities of the intermediary or the web platform:

  • Publishing rules, regulations, privacy policies and user agreements which clearly make the user understand not to post, transmit, upload or modify etc. any content which is grossly harmful, harassing, blasphemous, defamatory, obscene, pornographic, paedophilic, libellous, invasive of another's privacy, hateful, or racially or ethnically objectionable, disparaging, relating to or encouraging money laundering or gambling, or otherwise unlawful in any manner whatever; harms minors in any way; infringes any patent, trademark, copyright or other proprietary rights; violates any law for the time being in force; deceives or misleads the addressee about the origin of such messages or communicates any information which is grossly offensive or menacing in nature; impersonates another person; contains software viruses or any other computer code, files or programs designed to interrupt, destroy or limit the functionality of any computer resource; threatens the unity, integrity, defence, security or sovereignty of India, friendly relations with foreign states, or public order, or causes incitement to the commission of any cognisable offence, or prevents investigation of any offence, or insults any other nation; threatens public health or safety; promotes cigarettes or any other tobacco products or the consumption of intoxicants including alcohol and Electronic Nicotine Delivery Systems (ENDS) and like products that enable nicotine delivery, except for the purpose, in the manner and to the extent as may be approved under the Drugs and Cosmetics Act, 1940 and the Rules made thereunder; or threatens critical information infrastructure.
  • Provide all support to the criminal justice machinery by disclosing cyber security incidents and the identity and all other relevant details of the harasser/originator of the offensive content.
  • Not host/transmit/publish etc. any information which the website management knows to be illegal or offensive.
  • Provide periodic updates on the web company's policies relating to users' liabilities, rights, duties etc.
  • Take down reported content within a reasonable time, subject to a maximum of 24 hours (as the draft Intermediary Guidelines (Amendment) Rules, 2018 indicate).[6]

In short, web companies and intermediaries may not be directly liable for WhatsApp mess-ups by individual users.

The second party which may attract liability for publication/creation/circulation of any offensive content on the platform is the group admin. Let us first understand who is called a 'group admin': WhatsApp provides certain features especially for group chats, and this includes monitoring of the group by designated persons who are known as admins. Admins may not necessarily be the creators of the group; however, the creator always remains an admin even after he/she designates multiple other admins. Admins may have the power and authority to include and exclude members, block members, restrict the publication of comments[7] and create group policies which may be used to restrict particular member/s in case of violation of the same. Indian courts have on a number of occasions held that group admins may not be held liable for the activities of the members of the group if the said admin had shown due diligence to restrict the publication/circulation/creation of offensive comments.[8] This due diligence is, however, derived from an understanding of criminal law sanctions mixed with tortious liability. For example, consider the following:

  • If the group admin has not been made a group admin consensually and does not know the subject of discussion of the group, he may have a very narrow defence of having been misled by the other admins/creator who forced him to join them in criminal activities such as the creation/publication/circulation of offensive contents which violate the existing laws of the land.
  • If the group admin has not himself created/circulated any offensive content and has warned users not to share/post any offensive content, he may not be made liable where a member posted/circulated some offensive content during a period when the admin could not reasonably be expected to watch/monitor the group. But in such a case, if the content falls within the category of child sexual abuse material under S.67B of the Information Technology Act, 2000 (amended in 2008) or the POCSO Act, the admin may not avail of any excuse.
  • In case the group admin is a child, the question becomes tricky. If the group has been made specifically by minors, the police, the prosecution and the court have to see who may have provided the basic assistance in accessing the web platform and the contents (including the offensive contents). Necessarily in such cases, courts may have to use the principles of vicarious liability, because a child may not be eligible to own a SIM card unless an adult provides him the same. Here, the basic understanding of contract law and the age of majority may be applied.[9] Consider the case of the WhatsApp group of students of an elite school in Mumbai where minor students were discussing sexual abuse of their own female classmates:[10] parents could have been made vicariously liable in such a case. That did not actually happen, perhaps because the parents of the accused children themselves alerted the school and prevented further violation of the rights of the children who were targeted for the sexual fantasies of the adolescent boys. But here one needs to check whether personal information, including images of the 'victim children', was disseminated without authorisation, or whether the discussion was restricted to the use of names only. In both cases the POCSO Act may be applied (in the latter case, Ss.11 (sexual harassment), 13 (use of children for pornographic purposes) and 14 (punishment for using children for pornographic purposes) of the POCSO Act may be narrowly applied).

However, if the group admin/s knowingly allow the creation/circulation/publication of posts which may be offensive in nature, they may not escape the clutches of the laws specifically made to punish the commission of such acts: creation/circulation/dissemination of obscene images (S.67) and sexually explicit contents (S.67A), voyeurism and sharing of non-consensual images (S.66E of the Information Technology Act, 2000 (amended in 2008) and S.354C of the Indian Penal Code), defamation (Ss.499 and 500 of the Indian Penal Code), and sharing of information restricted as seditious material under S.124A IPC or any other law which may restrict freedom of speech in line with Article 19(2) of the Constitution of India, all of which may be read together with Ss.107 and 108 of the Indian Penal Code and S.84B of the Information Technology Act, 2000 (amended in 2008) (provisions relating to abetment of offences).

Coming to the liability of the third group of users of WhatsApp, if a user creates/publishes/circulates any content which is offensive in nature, he/she may be liable as per the respective legal sanctions. The act of forwarding any content has also been considered to fall within the scope of defamation laws (under Ss.499/500 IPC) or, in cases of online harassment of women and children, within the meaning of different kinds of offences recognised by law, including voyeurism, stalking, non-consensual image sharing, indecent representation of women, child sexual abuse, grooming etc.

But the question lurks around the issue of machine and artificial intelligence, which may make admins responsible for things they may not be aware of. For example, if the admin is a new user or is not accustomed to the privacy and security features of WhatsApp, he may not be able to restrict certain 'posts' which get published because of machine intelligence: this may include words which the phone suggests on the basis of the first few letters typed. He may also be unable to restrict a member who may have been suggested by the computer system of the platform and the device. Further, he might not be able to remove certain posts which have surfaced in the group through resharing or forwarding by other members. Here, the group admin's liability must be assessed separately. Websites or intermediaries, however, would not be liable by virtue of the proviso to Rule 3 of the Intermediary Guidelines Rules, 2011 (and the amended draft version of 2018), which says: "…the following actions by an intermediary shall not amount to hosting, publishing, editing or storing of any such information as specified in sub-rule (2): (a) temporary or transient or intermediate storage of information automatically within the computer resource as an intrinsic feature of such computer resource, involving no exercise of any human editorial control, for onward transmission or communication to another computer resource; (b) removal of access to any information, data or communication link by an intermediary after such information, data or communication link comes to the actual knowledge of a person authorised by the intermediary pursuant to any order or direction as per the provisions of the Act."

As may be understood from the above, WhatsApp group admins may not always claim to be immune, especially when they were aware of the group's activities, had not practised due diligence from their side, or had themselves published or forwarded offensive contents for wider circulation.


*Prof. (Dr.) Debarati Halder, LL.B., M.L., Ph.D. (Law) (NLSIU), is the Managing Director (Hon.) of the Centre for Cyber Victim Counselling (www.cybervictims.org). She can be reached at debaratihalder@gmail.com

[1] See Halder, D., & Jaishankar, K. (2016). Cyber Crimes against Women in India. New Delhi: SAGE Publications. ISBN: 9789385985775, for an understanding of the types of cyber crimes against women and the relevant laws.

[2] See Halder, D. (2018). Child Sexual Abuse and Protection Laws in India. New Delhi: SAGE Publications. ISBN: 9789352806843; and Halder, D., & Jaishankar, K. (2014). Patterns of Sexual Victimization of Children and Women in the Multipurpose Social Networking Sites. In C. Marcum & G. Higgins (Eds.), Social Networking as a Criminal Enterprise (pp. 129-143). Boca Raton, FL, USA: CRC Press, Taylor and Francis Group. ISBN: 978-1-466-589797, for more on the types of cyber crimes against children.

[3] See for example, Kurowski, S. (2014). Using a whatsapp vulnerability for profiling individuals. In: Hühnlein, D. & Roßnagel, H. (Hrsg.), Open Identity Summit 2014. Bonn: Gesellschaft für Informatik e.V. (S. 140-146). Available at https://dl.gi.de/handle/20.500.12116/2633 Accessed on 21.01.2020

[4] See for example, Broadhurst, Roderic and Woodford-Smith, Hannah and Maxim, Donald and Sabol, Bianca and Orlando, Stephanie and Chapman-Schmidt, Ben and Alazab, Mamoun, Cyber Terrorism: Research Review: Research Report of the Australian National University Cybercrime Observatory for the Korean Institute of Criminology (June 30, 2017). Available at SSRN: https://ssrn.com/abstract=2984101 or http://dx.doi.org/10.2139/ssrn.2984101 Accessed on 20.01.2020

[5] By way of Intermediary Guidelines (Amendment) Rules, 2018

[6] The Intermediary Guidelines (Amendment) Rules, 2018 also mention that if the intermediary has more than fifty lakh users in India or is on the list of intermediaries specifically notified by the Government of India, it shall:

(i) be a company incorporated under the Companies Act, 1956 or the Companies Act, 2013;

(ii) have a permanent registered office in India with a physical address; and

(iii) appoint in India a nodal person of contact and an alternate senior designated functionary, for 24×7 coordination with law enforcement agencies and officers, to ensure compliance with their orders/requisitions made in accordance with the relevant provisions.

[7] For more understanding, see https://faq.whatsapp.com/en/android/26000118/?category=5245251 Accessed on 12.01.2020

[8] For example, see Ashish Bhalla vs Suresh Chawdhary & others, 2016. Accessed from http://delhihighcourt.nic.in/dhcqrydisp_o.asp?pn=242183&yr=2016 on 21.01.2020

[9] For an understanding of this, we need to see S.11 of the Indian Contract Act, 1872, under which minors, persons of unsound mind and persons disqualified by law are not competent to enter into a contract.

[10] See India Today Webdesk. Schoolboys at posh Mumbai school talk about raping classmates, 'gang bang' in horrific WhatsApp chats. Available at https://www.indiatoday.in/india/story/mumbai-ib-school-students-whatsapp-chat-horror-1629343-2019-12-18 . Accessed on 21.01.2020

WhatsApp reporting of women and child abuse videos: The common understanding versus the reality

By Dr. Debarati Halder

Image Source: WhatsApp

A couple of days ago a friend shared an alarming piece of news with me on Facebook about WhatsApp. It said that several cyber security think tanks, including the Cyber Peace Foundation, are now examining how WhatsApp groups are circulating child sexual abuse videos and how these contents are going viral.[1] This is not an uncommon incident now. In 2015, at the Centre for Cyber Victim Counselling, we had conducted an empirical research study titled "Harassment via WhatsApp in Urban and Rural India: A Baseline Survey Report (2015)".[2] This research was conducted in three cities, namely Tirunelveli, Kolkata and Delhi, with respondents in the age group of 19-40. Even though this research did not include a survey on WhatsApp groups, it did emphasize personal harassment and the receiving of sexually explicit images, harassing videos of others etc. Some of the findings of this report are as follows:

  • 32.8% stated that they are aware of the safety tools in WhatsApp, and 42.7% said they feel it is safer than other internet communication services. 41.2% stated that they were not aware of the safety tools, and 13.7% stated that they do not feel WhatsApp is safer than other internet communication services. 1.5% did not want to disclose their awareness of the safety tools in WhatsApp, and 11.5% did not want to say whether they feel WhatsApp is safer than other internet communication services. 24.4% stated that they have heard about the safety tools in WhatsApp but have no direct knowledge about them. 32.1% stated that they have heard about other internet communication services but do not have direct knowledge of whether WhatsApp is safer, because they do not use other services.
  • In answer to the question whether they had received any sexually explicit or obscene images, including videos/images of rape or sexual abuse of women, children, men or LGBT people etc., among the 131 respondents, 11.5% stated that they had received such images, 51.9% stated they had not, and 2.3% did not want to answer. 34.4% stated that they are not aware of being targeted with such images because they do not use WhatsApp or have stopped using the service.[3]

This suggests that WhatsApp has long been a "chosen platform" for predators.

But why has WhatsApp become dearer to predators, including paedophiles and persons who create and circulate abusive videos such as sexual abuse videos of women, than other social media websites like Facebook or Instagram? Let us have a reality check about WhatsApp here:

  • What is WhatsApp and how it works: As we had mentioned in the research report, WhatsApp Messenger was started in approximately 2009 in the US by Jan Koum and Brian Acton as a "better SMS alternative" (WhatsApp, 2014), and it is available for iPhone, Blackberry, Android, Windows Phone, Nokia etc. The app uses the user's phone number as the basic verification mode and does not support calls via VoIP (Schrittwieser, Frühwirt, Kieseberg, Leithner, Mulazzani, Huber, & Weippl, 2014). Some of the basic features of WhatsApp include status updates, profile picture updates, uploading of the address book (Schrittwieser, et al., 2014), options to create/join groups (Terpstra, 2013), updates about location, and uploading and circulating photos, videos and voice recordings. Typically, WhatsApp verification involves a three-stage procedure: (i) logging on to the download page of WhatsApp at https://www.whatsapp.com/download/, clicking on the chosen device icon and starting the download; (ii) the server then sends a 4-digit PIN by SMS to the prospective user's phone for verification and authentication (Schrittwieser, et al., 2014); (iii) the user copies the code into WhatsApp's application graphical user interface (GUI), and after cross-checking by the WhatsApp server the app gets activated on the user's phone (Schrittwieser, et al., 2014). A simplified sketch of this PIN-based verification flow is given below. Once connected with WhatsApp, the user can get information about other WhatsApp users by simply checking his/her phone address book, call log history or Gmail address book. This is because WhatsApp may access the user's contact list or address book to keep track of other mobile phone numbers that use the WhatsApp services and may store this information on its server (WhatsApp, 2014, see sub-para B in Para 3), so that people can get connected instantly, see profile pictures of other users, and so that one WhatsApp user may get instantly connected to others through the server.[4]
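For readers who prefer a concrete illustration of the verification flow described above, the following is a minimal, purely illustrative Python sketch of an SMS-PIN style activation check. The function names, the placeholder phone number and the random PIN generation are assumptions made for illustration only; they do not correspond to WhatsApp's actual code, servers or API.

```python
# Hypothetical illustration only: a toy SMS-PIN activation flow.
# None of these names, numbers or messages belong to WhatsApp's real systems.
import random


def send_verification_pin(phone_number: str) -> str:
    """'Server side': generate a 4-digit PIN and deliver it to the phone by SMS."""
    pin = f"{random.randint(0, 9999):04d}"
    print(f"SMS to {phone_number}: your verification code is {pin}")
    return pin


def activate_app(phone_number: str, pin_sent: str, pin_entered: str) -> bool:
    """'Client side': the user copies the PIN into the app; the server cross-checks it."""
    if pin_entered == pin_sent:
        print(f"App activated for {phone_number}.")
        return True
    print("Verification failed: the PIN does not match.")
    return False


if __name__ == "__main__":
    number = "+91XXXXXXXXXX"                  # placeholder for the prospective user's number
    sent_pin = send_verification_pin(number)
    activate_app(number, sent_pin, sent_pin)  # the user enters the PIN they received
```

The point of the sketch is simply that the phone number, rather than an email ID or username, anchors the account, which is also why the contact-book matching described above works.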
  • How do users create networks on WhatsApp, and how may groups be busted?

After downloading the app and activating it, the user may get connected to friends or like-minded people by doing a simple search in the phone address book. Other numbers with the WhatsApp application installed may show up. Users may choose to circulate their messages in several ways through WhatsApp:

  • By using the broadcast feature, whereby a single text/audio-visual message may be conveyed to a batch of people. The broadcast list may be created as shown below:
Image Source: WhatsApp

  • By forwarding the message to a maximum of five recipients at a time. Now, this "forwarding" may reach a wider recipient list if it is done in a group. A WhatsApp group can be created by any individual by going to the chat tab and creating a new group. The image below may explain how groups can be created on WhatsApp:
Image source : WhatsApp

Interestingly, WhatsApp groups can be private or public. Most of the groups that circulate images/contents of sexual abuse, including for self-gratification or group gratification, may keep the group private so that it is not disturbed by any third-party monitoring authority, including the police. These group members generally have a mutual understanding and trust whereby the contents shared by them will not be reported outside. The members may download/save the sexual abuse/harassment videos/contents on their own devices for individual gratification or for unethical gain through further circulation as well. The end-to-end encryption offered by WhatsApp may make it more favourable for such group members to widely discuss and circulate such contents.

Public groups, on the other hand, are more open groups which people may join for discussions, and membership is not necessarily limited to those whom the admin/s have invited or added. Unlike private groups, public groups may be monitored if a third-party monitoring authority joins the discussion in disguise, or if any group member decides to bring in the police or other monitoring stakeholders. In both these cases, admins' responsibilities have been scrutinised by courts in India. A recent report suggests that courts have held admins responsible for allowing seditious, inciting messages to spread.[1] WhatsApp group members and admins have also been booked for creating/circulating child sexual abuse materials for sexual gratification.[2]

  • What if the group admin is an underage user?

It is important to know the age barrier for WhatsApp users. There are in fact not two but three options given by WhatsApp. Let us check them:

  1. The minimum age criterion for the European region, including European Union countries, is 16.
  2. For other countries, the minimum age criterion is 13, unless the domestic laws of the said countries have fixed a higher age for using WhatsApp.[3]
  3. Irrespective of both, a child can use the WhatsApp services of a parent if the parent allows the child to use the services under his/her monitoring.

This in fact shows that a child may use WhatsApp, may create his/her own profile and may create contents him/herself for private or public sharing on WhatsApp with whoever he/she wants. 

  • What happens to the producer/distributor of the offensive contents?

In a broader understanding, the child is legally permitted to create content which he/she thinks can be circulated. Now, this has been a question for several courts: when a child creates sexting content and circulates it among fellow children (including his/her boyfriend/girlfriend), how would the courts (and the laws) treat him/her? Is he the perpetrator? Is he the victim? Or is he a 'child' with no liabilities?[4] S.67B of the Information Technology Act, 2000 (amended in 2008) and Ss.13 and 14 of the Protection of Children from Sexual Offences Act, 2012 clearly mention that "whoever" creates, circulates, produces etc. contents depicting children in sexually explicit conduct may be penalised. These offences can be considered non-bailable, which would suggest that the punishment can be heavier. Similarly, Ss.67 and 67A of the Information Technology Act, 2000 (amended in 2008) also penalise 'anyone' who creates, distributes etc. sexually explicit and obscene materials. S.354C of the Indian Penal Code also touches upon penalising men who share private images of a woman who has not consented to sharing such contents with third parties. Ss.375 and 376 of the Indian Penal Code also touch upon capturing rape videos and storing or circulating the same. These offences can also be non-bailable and can carry heavier punishments.

The content that children create also carries significance: if a child creates a sexting video, a sexual abuse video, a non-consensual porn image/content or even revenge porn content and sends it to his friend/s, the recipient may decide not to open the content if, from a look at the content or the text attached to it, the recipient feels that it should not be opened or further circulated because it contains 'bad stuff'. WhatsApp has been smart enough to create a limited policy guideline and security feature whereby a parent can report his/her child who may be using WhatsApp without parental guidance, where the parent feels that the child may be engaging in, or being victimised through, illegal or risky contents and connections. It says:

“If your underage child created a WhatsApp account, you can show them how to delete their account. You can learn how to delete an account in our Help Center. If you’d like to report an account belonging to someone underage, please send us an email. In your email, please provide the following documentation and redact or hide any unrelated personal information:

Proof of ownership of the WhatsApp number (e.g., copy of government-issued identification card and phone bill with the same name)

Proof of parental authority (e.g., copy of birth or adoption certificate for the underage child)

Proof of child’s date of birth (e.g., copy of birth or adoption certificate for the underage child)

We’ll promptly disable the WhatsApp account if it’s reasonably verifiable that the account belongs to your underage child. You won’t receive confirmation of this action. Our ability to review and take appropriate action on a report significantly improves with the completeness of the information requested above.”[5]

Removal/deactivation of the said account is, however, at the discretion of WhatsApp, especially when it is not reasonably convinced.

But in case the reporting individual is not the parent of the child who may be doing illegal things or who may be a potential victim, WhatsApp suggests contacting the parents of the child.

For adult wrongdoers, WhatsApp has a typical formula followed by almost all social media companies: it suggests blocking the number so that the user of that particular number is not able to contact the blocker unless the former is unblocked. Here is what WhatsApp suggests regarding how to block a number:

Image source : WhatsApp

  • The producer/distributor of the offensive content has been arrested. What about the offensive image?

The above information would not serve much purpose for blocking/reporting the content unless the content is treated as offending material through a police report. In such a case, the said content may be disabled on WhatsApp's own server; but the service works rather like email or SMS and would not access individual devices to dig out the offensive content in order to block and disable it. Hence, even if the persons owning the WhatsApp numbers and profiles are blocked, the contents may keep on circulating unless they have been 'ordered' to be disabled from the server. This is how objectionable contents float from one device to another and reach millions even after the original sender has deleted them from his device to save himself, or has been arrested by the police.

A police report about the said content therefore remains the best answer for blocking the content from further circulation. But a few things cannot be ignored when this is suggested: the police must act accordingly to make WhatsApp delete the content from its server and block its circulation whenever it appears on WhatsApp, from whichever device. This may become a herculean task, especially when the police and the courts feel challenged due to a lack of infrastructure and proper laws. Until this happens, WhatsApp users have to be responsible enough not to circulate such contents even if they receive them from known or unknown numbers. Not to be forgotten, the police may also arrest individuals who store child sexual abuse videos/images unknowingly. Unfortunately, the same may not be true for adult sexual abuse cases. But if users use WhatsApp responsibly, the problem may definitely be addressed.

Please note: Do not violate the copyright of this blog. If you would like to use the information provided in this blog for your own assignment/write-up/project/blog/article, please cite it as: Halder, D. (2019). "WhatsApp reporting of women and child abuse videos: The common understanding vs the reality", 29 April 2019, published in http://debaraticyberspace.blogspot.com


[1] See WhatsApp ‘admin’ spends five months in an Indian jail. Published in https://www.bbc.com/news/technology-44925166 Accessed on 22.04.2019

[2] See Sandhya Nair (2018). WhatsApp group sharing child porn busted, 5 held. Published in http://timesofindia.indiatimes.com/articleshow/65263327.cms?utm_source=contentofinterest&utm_medium=text&utm_campaign=cppst Accessed on 22.04.2019

[3] For more information see https://faq.whatsapp.com/en/general/26000151/?category=5245250

[4] Halder, D., & Jaishankar, K. (2013). Revenge Porn by Teens in the United States and India: A Socio-legal Analysis. International Annals of Criminology, 51(1-2), 85-111. ISSN: 00034452 (UGC Listed Journal)

[5] See https://faq.whatsapp.com/en/general/26000151/?category=5245250


[1] Cuthbertson, Anthony (2019). WhatsApp is hotbed for child sex abuse videos in India, study finds. Published in https://www.independent.co.uk/life-style/gadgets-and-tech/news/whatsapp-child-sex-abuse-videos-groups-india-a8885811.html?fbclid=IwAR251ajPe20Y7zcXtD2o1s0w–86-Pr5UrKHVgv7IF_7swAH_dvEGQTzcZQ on 26 April 2019. Retrieved on 26 April 2019

[2] Halder, D., & Jaishankar, K. (2015). Harassment via WhatsApp in Urban and Rural India: A Baseline Survey Report (2015). Tirunelveli, India: Centre for Cyber Victim Counselling. Available at https://www.cybervictims.org/CCVCresearchreport2015.pdf Retrieved on 27.04.2019

[3] Ibid

[4] See p. 2 in ibid.