Snapchat is today introducing new tools that parents can use to monitor the content minors see on the social media network, including filters for Stories and restrictions on the content surfaced to them by Snap’s algorithm-led recommendations.
These features are now accessible via the Snapchat Family Center, a tool launched in August last year that seeks to make the platform a safer place for teenagers and other sensitive audiences by allowing parents to have more control over what their kids see.
“Our new Content Controls in Family Center will allow parents to filter out Stories from publishers or creators that may have been identified as sensitive or suggestive. To enable Content Controls, parents will need to have an existing Family Center set up with their teen”, the company commented in a blog post published yesterday.
Even though all of these publishers must abide by the platform’s content and community guidelines, the new controls give parents more freedom to decide what their kids are exposed to, in line with the values and principles they embrace and foster.
This is What Snapchat Considers “Sensitive” Content
In addition, Snapchat is making its Content Guidelines for creators public for the first time. This lengthy document explains what kind of content the platform deems appropriate to be broadcast on its Stories feature – the one that showcases content created by media partners and top Snapchat influencers.
It will now be clear to creators which content is eligible to be displayed on Snap’s Stories and which is considered sensitive and may be automatically restricted to adult audiences.
Some of the content that is considered “sensitive” by Snapchat (SNAP) includes non-nude body imagery, suggestive sexual language, sexual health content, and content created by well-known figures within the adult entertainment industry.
Meanwhile, news that depicts or reports on violent acts, discussions of topics related to violence, suicide, and eating disorders, body modifications such as tattooing and piercing, and content that makes extensive use of profanity will be flagged as sensitive as well.
As for substance use and related topics, sensitive content includes moderate alcohol consumption, content about weight loss, gambling and betting, and fictional content that discusses or portrays the consumption of illegal substances.
TikTok is Also Helping Parents Manage their Kids’ Exposure to Social Media
Earlier this month, TikTok, another global success in the social media space, introduced new features that allow parents to limit the time their kids spend on the platform.
In the case of the Chinese app, parents can link their children’s TikTok account to their own, unilaterally set a cap on how much time they can spend on the social media network, and filter the type of content they are exposed to.
In addition, TikTok added a dashboard that shows parents useful metrics about their child’s usage patterns, such as the number of times the app has been opened and when it was used during the day.
The platform has also enforced a maximum daily screen time of 60 minutes for teenagers under 18. Once that limit is reached, the user has to enter a specific code received from TikTok. This works more as a warning than a hard barrier, as entering the code removes the limit. However, TikTok believes that the prompt nudges teenagers to make conscious decisions about how they spend their time.
According to a recent survey from Pew Research, 54% of teenagers believe it would be hard to give up social media altogether. Meanwhile, 55% indicated that they think they spend “the right amount of time” on these apps.