Meta Searching For Human Exploitation Counsel

Meta’s stock soared today, posting a record single-day gain of 17.59%, its largest rebound in the past nine years. Although Meta’s quarterly earnings beat expectations, the company missed revenue estimates, citing international relations and Apple’s updated iOS privacy policy. Facebook’s daily active users rose after a long slump and now total 1.96 billion. And although Meta’s metaverse division lost $3 billion in the first quarter, the company remains committed to moving forward in the virtual reality space.

Within the past month, Meta posted a job listing seeking an Associate General Counsel for Human Exploitation and Emerging Risks Compliance. This individual will assist the legal team in advising Facebook’s safety compliance program as it pertains to human exploitation risks (trafficking) and child safety. Platforms must find ways to prevent vulnerable individuals from being groomed or exploited by predators and traffickers in the virtual world. The need to educate users about the dangers of human trafficking is greater than ever, and companies can no longer feign ignorance of the fact that their products can become playgrounds for heinous crimes.

Digital spaces give predators easy access to potential victims, especially those seeking to groom children, as reported in instances across platforms like Instagram and Discord. While users of the metaverse must be 18 or older, all a child needs is a parent’s VR headset and Facebook account to participate. The video-game-like atmosphere of the VR world is attractive to children, who are often unable to grasp that a virtual avatar represents an actual human stranger. According to the FBI, 50% of victims of online sexual exploitation are between the ages of 12 and 15, and 89% of sexual advances toward children occur in chat rooms and instant messages.

There are a number of ways children can be targeted in the metaverse. A predator may disguise their voice and pretend to be the same age as the child. They may buy the child gifts in the metaverse to flatter them and gain their interest. They can pressure the child to send explicit photos or videos and insist that they meet in person. If the child refuses, the predator can resort to blackmail or threats against the child’s family to compel compliance.

Child grooming is not the only concern. Grooming can target any vulnerable person at any age, leading to coercive and exploitative situations in the real world. And with the metaverse’s virtual avatars, users also run the risk of virtual sexual harassment, as experienced by Nina Patel, who in December 2021 described her avatar being gang raped within 60 seconds of entering the metaverse.

Companies have an obligation to keep users informed and safe from human rights violations in the virtual world, especially when online spaces intended to increase human connection risk opening the virtual door to real-life harm.
