
By India McCarty
Instagram is beginning to test the use of AI to crack down on teens posing as adults on the social media app.
“Providing age-appropriate experiences for the billions of people who use our services around the world is an important element of what we do,” a statement from Meta, Instagram’s parent company, reads. “Understanding how old someone is underpins these efforts, but it’s not an easy task. Finding new and better ways to understand people’s ages online is an industry wide challenge.”
The statement added that AI is “one of the best tools” the company has to take on these challenges, which include young people who have lied about their ages to access the adult version of Instagram.
Meta said its AI program is trained to look at signals such as the type of content an account interacts with, its profile information and when the account was created. Details like these can help determine whether an account belongs to someone underage.
If a user is found to be misrepresenting their age, Instagram will automatically convert their account into a teen account, which is private by default and restricts private messaging.
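Meta has not published technical details beyond the signals it names, so the sketch below is purely illustrative: a rough idea of how account signals like interests, profile text and account age could be combined into a single score that triggers teen-account defaults. The feature names, weights, thresholds and the apply_teen_defaults step are all hypothetical, not Meta's actual system.

```python
# Hypothetical sketch only -- Meta has not disclosed its model, features or thresholds.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class AccountSignals:
    stated_birth_year: int           # age the user claimed at signup
    created_at: datetime             # when the account was made
    bio_text: str                    # profile information
    interest_categories: list[str]   # types of content the account interacts with


# Assumed, illustrative interest labels -- not real Meta categories.
TEEN_INTEREST_HINTS = {"high_school", "teen_fashion", "homework_help"}


def underage_likelihood(signals: AccountSignals) -> float:
    """Combine simple heuristics on the three signal types into a 0-1 score."""
    score = 0.0

    # Content signal: interests commonly associated with younger users.
    overlap = TEEN_INTEREST_HINTS & set(signals.interest_categories)
    score += min(len(overlap) * 0.2, 0.5)

    # Profile signal: bio phrases that suggest a school-age user.
    if any(p in signals.bio_text.lower() for p in ("class of 20", "8th grade", "freshman")):
        score += 0.3

    # Account-age signal: a very new account claiming to be just over 18.
    now = datetime.now(timezone.utc)
    account_age_days = (now - signals.created_at).days
    claimed_age = now.year - signals.stated_birth_year
    if account_age_days < 30 and claimed_age in (18, 19):
        score += 0.2

    return min(score, 1.0)


def apply_teen_defaults(account_id: str) -> None:
    """Placeholder for the enforcement step: private by default, restricted messaging."""
    print(f"{account_id}: set profile private, restrict messages to followed accounts")


if __name__ == "__main__":
    signals = AccountSignals(
        stated_birth_year=2006,
        created_at=datetime(2025, 3, 1, tzinfo=timezone.utc),
        bio_text="class of 2028 | volleyball",
        interest_categories=["teen_fashion", "homework_help"],
    )
    if underage_likelihood(signals) >= 0.5:
        apply_teen_defaults("example_account")
```

In practice a system like this would rely on a trained model rather than hand-set weights; the point of the sketch is only to show how the three kinds of signals Meta describes could feed a single decision.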
“We know parents want to feel confident that their teens can use social media to connect with their friends and explore their interests, without having to worry about unsafe or inappropriate experiences,” Instagram said in a press release about the teen account feature. “This new experience is designed to better support parents, and give them peace of mind that their teens are safe with the right protections in place.”
In its statement about the new crackdown on underage accounts, Meta continued, “We know that the more we do to solve these challenges, the more we’ll be able to help protect the people using our services. We hope that sharing our efforts to better understand age encourages others to help build on these solutions, and we can all improve together.”
While Meta has spoken publicly about its desire to help young people stay safe on its platforms, others have criticized the company for trying to shift responsibility onto app stores like Apple's and Google's.
“The simplest way to protect teens online is to put parents in charge,” Meta spokesperson Jamie Radice said in a statement. “That’s why legislation should require app stores to obtain parental consent before allowing children to download apps.”
However, Google argued that putting the onus on app stores is “concerning,” calling it a way for Meta to “avoid that responsibility despite the fact that apps are just one of many ways that kids can access these platforms.”
While public opinion is mixed over whether Meta is truly working to ensure children’s online safety, this new move to keep kids from posing as adults on its platforms is a step in the right direction.
Read Next: Meta Supports Legislation to Require Parental Approval for Teen App Downloads