SAN FRANCISCO – Social media was not designed for children. Facebook was originally created for Ivy League students, Instagram grew out of its founder’s love for bourbon, and YouTube started out as a video dating site.
But teens are active on social media, and many children under 13 already socialize online. They build worlds together in Minecraft, FaceTime with friends, and send texts and emoji through tools like Facebook Messenger Kids. But they also use apps and browse parts of the internet that were not designed for them.
Before letting kids fall in love with TikTok, tumble down YouTube rabbit holes, or start their own Instagram accounts, parents need to figure out which social media is right for their family. That is a complicated question, especially when other forms of socialization for children are still limited in many parts of the country.
Adding to the conundrum, companies are increasingly making tools specifically for younger internet users, who are old enough to type words on a smartphone or computer but too young for existing social media applications. There are already YouTube Kids and Facebook Messenger Kids. Now Facebook is working on a version of Instagram specifically for kids under 13.
In the United States, federal laws limit the tracking and targeting of people under the age of 13, which companies have often sidestepped with lax age verification. To access popular sites and apps, kids can borrow an adult’s account, have their parents create one for them, or lie about their age and create their own. Or, in the case of YouTube, just open it in a browser, maybe even on a school-provided Chromebook.
With privacy regulation looming, growing competition for younger users, and a desire to hook kids into an online ecosystem before they enter college, social media companies are diversifying. Here are some questions parents should ask themselves before signing up their children:
What are the biggest worries about letting kids on social media?
According to Titania Jordan, chief parent officer at online monitoring company Bark, parents’ main concern about allowing young children on social media is exposure to sexual content and predators.
She worries that giving children an on-screen alternative to in-person interaction is a bad idea, regardless of the precautions included. Screen time issues, however, have been put on the back burner by many during the pandemic as parents and children have more urgent things to worry about and fewer options for in-person socialization.
Not all online interactions are the same. While some parents may be fine with text communication, something like Instagram would raise different issues. A social experience built around photos might affect self-esteem and mental health more than one-on-one texting does.
Experts we spoke to are especially worried about the companies behind these apps and, in the case of Instagram’s plans, bristle at Facebook’s track record. “Facebook’s priority is not to protect children; it’s a for-profit business that seeks to monetize time spent,” Jordan said.
Common Sense Media CEO Jim Steyer agrees: “It’s basically Facebook reaching back into its old bag of tricks to get young children addicted when they are most vulnerable.”
Why are tech companies creating apps for young children?
Kids are one of the next big untapped online markets, and big tech companies are interested in attracting users before they turn 13. Doing so requires making a product that parents approve of, so they don’t worry about issues like predators or radicalization.
“At Disney, we called it Cradle to Cane. If you have a kid who’s excited about the Disney brand – excited about princesses at 3, 4 and 5 – and you can keep that engagement… you’ve created a lifelong attachment,” said KC Estenson, a former Disney executive and current CEO of GoNoodle, an app that makes videos, music and games for young children.
Lawmakers are under increasing pressure to regulate how big tech companies track and manage young users. By preemptively creating apps that claim to be safer, companies like Facebook may be trying to head off additional laws that would force them to be even stricter about things like data collection.
What features should I look for before letting my kids sign up for a social media app?
If you are considering a social network for your child, there are some features and policies you should check first. Jordan recommends looking for any ephemeral features that make it harder to monitor communications, such as a disappearing-message mode or, in the case of Instagram, its Stories feature, which removes posts after 24 hours.
Review direct messaging features and make sure only approved contacts can communicate with your child. Look for options that allow a parent to approve contacts, like in Facebook’s Messenger Kids. Check out the parental monitoring features and see how much control you would really have – and if your child can turn them off without notifying you.
“Ask: Is the app specifically designed for children? If not, you should be totally on alert,” said Steyer, whose nonprofit advocacy group Common Sense Media reviews children’s content.
He recommends looking at an app’s business model to avoid anything based on targeted ads, and being wary of companies that make their money through in-app purchases. See if there is an associated adult app, as with Messenger Kids, and ask yourself if the kids’ version is just a way to sign up users and funnel them to the main site when they are old enough, Steyer said.
Look beyond the safety promises to see how much data an app collects on your child. Does it track or share the geolocation of a device? If so, take a look at how many of these settings you can turn off and avoid anything that won’t allow you to opt out.
What if I decide to let my kids use adult or kid versions of social media, then what?
Have an honest conversation with your kids about what to watch out for online, including bullying, predatory behavior, and inappropriate content. Also, closely monitor their mental health. Bark, which monitors the online activity of 5.4 million children, says an annual survey showed Instagram was frequently flagged for suicidal ideation, depression, and body image issues.
Don’t make social media education a one-off conversation, either. “Get to know the platforms they are on, the games they play, the people they follow, and revisit those things regularly. Kids can handle a structured adult conversation about what’s right and wrong,” Estenson said.
Why not just keep them offline?
Eliminating phones and computers, banning screen time and video games, and forbidding social media are also options. But the pandemic has shown us that children, with the right guidance and a little space, can find fulfillment and friendships online. If you are okay with them being on the internet in one way or another, the next step is to prepare that world for them.
“We need to build an online world for them; it takes a lot of people to do it,” said Estenson. “We have to try to do it with noble intention, not just to make money. We need it because the children are already here.”
What is Instagram for Kids and does it even exist?
Last week, BuzzFeed News reported that Facebook was working on a children’s version of Instagram, the popular photo-sharing app it purchased in 2012. The new app, announced internally by the company, would be targeted specifically at users 12 and under. Officially, Instagram is only for people 13 and over, but there is no strict age verification, and many younger children have their own accounts, often with their parents’ permission.
There is no release date for the Instagram app for kids. In a statement, the company said children are asking to “follow” their friends, which is why it is working on additional social media apps to be “kid friendly, run by parents.” Instagram recently hired Pavni Diwanji, a Google executive who oversaw the development of YouTube Kids.
How would that be different from classic Instagram?
There aren’t many details on what a kids’ Instagram would look like or what would make it different or safer than Instagram for adults. But some clues can be found in a recent blog post from the company.
Instagram last week described some ways it was trying to make its main app safer for teenage users, including using artificial intelligence to make age checks more accurate and harder to fool. It added a restriction that prevents adults from sending messages to users who have said they are under 18, unless the younger user already follows them. The company is also adding safety advisories for teens when it detects that an adult is acting suspiciously, such as sending mass messages to younger users.
Instagram has also experimented with hiding likes on photos. Limiting this kind of feedback could be key to an Instagram offering for under-13s, helping them avoid the FOMO and the pressure to look good that are common on the main app.
We can also look at Facebook’s existing product for kids, Facebook Messenger Kids. Released in 2017 to a flood of criticism, the app had a few early issues that the company addressed, and it is now widely used without much controversy. It does not require children to register on Facebook and is controlled through a parent’s account.
Do kids even want to use apps designed for their age group?
Well-designed children’s apps that prioritize privacy and have strict safeguards to protect young users from stalkers and predators may appeal to parents, but not always to their target users. Many children might prefer the less restricted adult versions and find ways to access them. YouTube Kids, for example, was that company’s attempt to create a safer space away from the problematic and wild world of regular YouTube, but kids of all ages still flock to the main site. And TikTok, which doesn’t have an option for under-13s in the US, is hugely popular with young users and creators.