Handy tools to protect your kids online during the festive season
Faced with strong opposition from the US government, lawmakers, and parents, Meta (formerly Facebook), Instagram’s parent company, paused development of Instagram for Kids, its planned app for children under 13, in September 2021.
The opposition stemmed from leaked internal research indicating that Instagram contributes to rising rates of anxiety and depression among adolescent girls.
However, the global technology company initially resisted, arguing that a dedicated photo-sharing platform for children would let parents supervise and control their children’s experiences better than the main Instagram app allows.
According to an official Instagram statement, parental permission would be required to sign up for the app, which would carry no advertisements and only age-appropriate content.
Since YouTube, Facebook Messenger, and TikTok already run similar children-only platforms, the public backlash might seem surprising. Apps that do not create separate versions offer safety and parental control tools instead.
This is no coincidence. Social media companies that were built without children in mind have realized that it is nearly impossible to retrofit fail-safe underage controls onto their platforms to give parents some oversight. Meanwhile, children are getting Internet-enabled phones at ever younger ages.
On a larger scale
The approach to addressing the global concern of children’s online safety differs across platforms. Almost all social media platforms require users to be at least 13 years old, a restriction primarily backed by national data protection laws.
These restrictions help keep inappropriate, age-restricted suggestions off children’s timelines and protect them from Internet predators. However, they do not necessarily shield young users from other issues that come with early exposure to social media, such as addiction, which can lead to anxiety, depression, and other mental health problems.
When we discussed the highly addictive nature of social media during one of Techpoint Africa’s team bonding sessions, the majority of the team agreed that, just as children under a certain age should be kept away from addictive substances, they should also be kept away from social media.
China’s regulatory authority, in a move that has divided opinion, has implemented regulations requiring social media and gaming companies to discourage young people from spending excessive time on their platforms.
Douyin, a video-sharing app similar to TikTok, does not let users under the age of 18 who open it in Youth Mode use the platform for more than 40 minutes at a time. In addition, the app becomes inaccessible between 10 p.m. and 6 a.m. and serves only curated, age-appropriate content.
Meanwhile, a new set of rules for gaming platforms was recently released, limiting young users to playing games between 8 p.m. and 9 p.m. on Fridays, Saturdays, Sundays, and official holidays.
While other countries may not go this route, they frequently find legal ways to monitor the activities of these platforms. In the case of Instagram, US lawmakers stood firm.
A cursory look at a report that surveyed nearly 3,000 parents of teenagers in the United States reveals that parents are aware of the dangers of having their children online, but they either don’t know how to deal with it or accept it as a necessary evil.
“How do you want to keep track of your child’s online activities without exhausting yourself or getting them riled up to the point of looking for loopholes to get back at you?” one perplexed parent asks.
Platforms and companies can impose strict rules, but parents and guardians must also play their part, and pleading ignorance will not suffice.
Don’t be a bumbling parent
Before delving into the safeguards put in place by various social media apps to keep children safe, it’s important to note that it’s best if parents open social media accounts for their children.
Because these platforms only activate certain security restrictions when a user’s age is specified, parents can then define settings that suit their children’s ages.
Parents should understand any app their children want to use. If necessary, they should download the app and become acquainted with its content.
Facebook and Facebook Messenger
Facebook is one of the world’s oldest surviving social networking platforms. With over 2.9 billion users worldwide as of October 2021, the platform ranked first on the list of the most popular social networks. Attempting to protect a teenager with a mobile phone from the allure of such a platform is almost certainly a futile exercise.
For starters, no one under the age of 13 can open a Facebook account. For teens aged 13 and up, parents should enable the following settings during account creation:
- Under the “Audience and visibility” option in the settings menu, change the default audience from “Public” to “Friends.” This way, only their friends can see their activities.
- Under the “How people find and contact you” option, change who can send friend requests from “Everyone” to “Friends of friends.” You can also remove them from public search here, which will prevent their profiles from appearing in a Google search or as a suggestion to anyone with their phone number or email address.
- Change the settings for who can follow, tag, comment on, or share their posts to “Friends” or “Friends of friends.”
Meanwhile, Messenger Kids, a child-friendly version of Facebook’s instant messaging platform introduced in 2017, allows parents to control whom their children chat with.
Because activation is only possible through an adult’s account, the child does not need a Facebook account or a phone number to use the Messenger Kids app.
When a parent installs the Messenger Kids app on their child’s phone, they must authenticate it with their Facebook account before creating a mini-profile for their child.
The parent decides who becomes friends and, as a result, with whom the child communicates. Even though parents have control over who sends and accepts friend requests, they cannot monitor their children’s chats.
Instagram
Instagram, a photo-sharing app owned by Meta, is another social network that has captured the attention of young people. Given the numerous reported cases of online abuse on the platform, online safety precautions, particularly for young people, are required.
The platform has several safety features, including ‘Tools to combat bullying,’ to keep all users, especially minors, safe.
To manage who interacts with your children, set their profile to ‘Private account’ in the ‘Privacy’ settings menu. This way, only approved followers can interact with them. Profiles of users under 16 are automatically set to ‘private.’
You can also manage comments, tags, mentions, and unwanted interactions as well as direct messages.
YouTube
YouTube, like other platforms, does not allow people under the age of 13 to create accounts, though this only holds if users report their real age.
Parents can supervise teen and tween usage, change video privacy settings, and disable autoplay. They can also walk their children through these safeguards.
Google launched YouTube Kids in 2015 to provide child-friendly content to younger users. YouTube’s algorithm directs the app to uploaded videos that are appropriate for children.
Because some offensive videos have reportedly slipped through YouTube’s filters, parents must continue to monitor app use and report offensive content.
Snapchat
Snapchat has been around since 2011, and it is popular for sharing short-lived videos and images.
Unlike posts on the other platforms mentioned, Snapchat posts, known as Snaps, are deleted after 24 hours. Direct messages also vanish once the recipient has viewed them. The ephemeral nature of its content, however, does not guarantee safety; if anything, it calls for extra caution.
One of Snapchat’s ground rules is that anyone under the age of 13 is not permitted to use the app.
Talking to teens about staying away from explicit content and unfollowing any account that makes them uncomfortable is an excellent way to help them use the app safely.
Parents can also set location sharing to ‘Ghost Mode’ and adjust the safety settings so that only friends can see their Snaps or send them direct messages.
Here’s how to activate Snapchat’s parental controls:
- From the settings icon on the profile page, change the ‘Contact me’ option to ‘My friends.’ Select the same option under the ‘Who can’ section to manage who can see their Snaps.
- Uncheck all of the boxes in the “See Me in Quick Add” section to make their account private, so their profile does not appear in other users’ friend suggestions.
- To disable location sharing, select ‘Ghost Mode’ from the ‘My location’ settings.
TikTok
TikTok is one of the most popular social media platforms in the world, with over 2.6 billion downloads. Like any other social media platform, it has security concerns.
The video-sharing app also caters to young users, though, as with other platforms, users must report their age in the account settings before their accounts are activated.
For users under the age of 16, the default account setting is ‘Private account,’ which means that videos are only visible to followers, and direct messaging is turned off. Older users can enable this option manually in the settings menu.
TikTok redirects children under the age of 13 to TikTok Kids, where they see only curated content; sharing, commenting, and messaging are disabled. Parents in some countries can link their child’s TikTok account to their own: once ‘Family Safety Mode’ is enabled, they can manage screen time, restrict content, and supervise messaging, following, commenting, and other activities.