A group of creators and TikTok officials gathered for a roundtable discussion in Phoenix on Thursday.
They met to talk about the app’s safety features, especially for minors.
The event came in the same week that the U.S. surgeon general called on Congress to require warning labels on social media platforms about their effects on young people, similar to those now mandatory on cigarettes.
Suzy Loftus, TikTok's USDS Head of Trust and Safety, wouldn’t comment on that specifically, but said the app has gotten ahead of any government requirements by creating safety features like Family Pairing, which allows parents to limit screen time and control the type of content that reaches their children.
“The goal behind our entire safety approach to teens is recognizing 1. That teens have a different brain. They’re in the process of developing,” Loftus said, “and 2. Recognizing that our experience is very different than the vast majority of social media experiences.”
Loftus said her own teens use TikTok, and she likes that the app has built-in, age-based features, such as defaulting users under 18 into a STEM feed that serves educational content.
Arizona resident and content creator Sarah Babiarz said she filters comments on her account for her own mental health, using a feature that blocks comments containing certain words.
“Some of the comment filters that I have set up are ‘obese, ugly, fat,’ and I was getting some of those comments before,” Babiarz said.
She said her relationship with the app has become much more positive since creating those filters.