Thu. Oct 24th, 2024

TikTok visits Utah to demonstrate security features. Lawmakers say it’s not enough

TikTok head of trust and safety Suzy Loftus made a stop in Salt Lake City on Tuesday for a private press event where she spoke about the app’s safety features. But Utah lawmakers said the available features don’t go far enough to protect children.

Salt Lake City is one of 17 cities that Loftus and her team have visited or will visit before the end of the year. When asked how Utah came to be on the list, Loftus said, “I think it’s about great communities of creators who have stories to tell, and this is really driven by the creators in the community.”

The stop comes as Utah has filed two lawsuits against the social media giant. The state also passed first-of-its-kind legislation aimed at protecting children from addictive features in apps and giving parents tools to monitor their children’s accounts. The law also allows children and their parents to bring a private right of action against social media companies under certain conditions.

In addition to speaking with Loftus and two TikTok creators, Briel Adams-Wheatley and JT Laybourne, at Tuesday’s event, the Deseret News spoke about the app’s security with Rep. Jordan Teuscher, Sen. Mike McKell and Margaret Busse, executive director of Utah’s Department of Commerce.

Is TikTok safe for children?

During the press event, Loftus said 1.1 million Utahns are on TikTok. The core of TikTok’s safety policy starts with its community guidelines, she said. “We are a community and we have rules against bullying, harassment and dangerous behavior.”

Some technology can identify content that violates community guidelines before it is posted, she said. In other cases, such as disinformation or bullying, Loftus said if the technology doesn’t know, the content goes to a human moderator (and for disinformation, to fact-checkers).

For younger users, she said, TikTok has more restrictions.

“A lot of times people will think that their experience on TikTok is the same as what a 13-year-old or a 15-year-old has on TikTok,” Loftus said. “And it’s a much more restrictive experience for teens.”

Loftus said that for account users under the age of 16, the accounts are private by default, their TikToks cannot be downloaded and there is no direct messaging. For those under 18, daily screen time is set at 60 minutes, and Loftus said there will be no push notifications at night.

These features are available to all accounts. Adams-Wheatley said they were “game-changing for me. I didn’t have to see the comments I didn’t want to see.” Laybourne said when his 13-year-old signed up for the app, the features made it a positive experience.

As for how the app determines how old users are, Loftus said that when people sign up for TikTok, they must include their date of birth. She said TikTok then also looks for clues to users’ ages based on what they’ve put in their bios or videos.

Busse said there’s a difference between age checking (asking for an age) and age verification (verifying a user’s age) — and pointed to Utah’s law requiring accurate age verification.

“There are many different types of age verification technologies that these platforms could use today if they wanted to, but they don’t want to,” she said.

To ensure the app is safe, Busse said, meaningful age verification is needed, and certain settings must also be put in place to protect children’s data and “get rid of their addictive qualities – period, period. And those include push notifications, infinite scrolling, autoplay, things like that.”

“Just being on the platform, having these addictive features, just being on the app a lot, you know that can lead to mental health issues,” Busse later said.

When asked if Loftus could foresee a future where parents might be able to disable the algorithm, she responded by referring to the default screen limit. She said it is an ongoing discussion and that the screen limit can be used by anyone. She said the public policy team is working with lawmakers.

Teuscher, R-South Jordan, and McKell, R-Spanish Fork, are two of the lawmakers who have worked the most on social media legislation in Utah.

Sharing a story, Teuscher said students one day found out their teacher had cancer and wanted to show support. The students came together on social media to do so, which he said is an example of how social media can work for good.

When looking at social media, Teuscher said the biggest problems he and others found were related to the design elements intended to keep children on the platform — which is what’s causing the harm, he said.

“I would like to see social media companies like TikTok actually recognize that this is a problem, especially for minors, and put up guardrails to stop these things — and that’s what we’re not seeing,” Teuscher said.

Removing the addictive design elements would make it easier for teens to stay off the platform, he said, because they could choose to do so. When speaking to his constituents, Teuscher said he hears from parents that the algorithms are too powerful and that children are finding ways around the restrictions.

“They needed the government to step in and put up guardrails to protect children,” Teuscher said. These restrictions, in turn, can help parents be proactive with their children’s use of social media.

When the legislation was passed with these restrictions, McKell said the state showed its compelling interest.

“The biggest concern is that kids are on these platforms for far too long and because they stay on the platform for far too long, it affects performance at school and sleep at night,” he said.

Tackling this problem requires all hands on deck, McKell said. Although it’s called social media, he said, the companies are mining the data of child users. The government regulates harmful products such as tobacco and alcohol, he said, and social media should be no different.

“When we look at the data, the mental health of a child who is online for more than three hours starts to decline,” he said.

As for whether or not these social media companies are doing enough, McKell said, “Let me be clear on this. The social media companies are not negotiating in good faith. They haven’t done that from day one, so I don’t expect them to do that in the future.”

While McKell is optimistic about seeing social media companies bring out tools that make their product less addictive, he said, “At the same time, we need to hold them accountable.” Referring specifically to TikTok, McKell said that unless he and others “see a really good solution, we will continue to move forward.”

Utah won’t sit back and just watch what the courts do, Teuscher said. The state will work side by side with other states to take the lead in protecting children. Both Teuscher and McKell pointed to Utah’s website socialharms.utah.gov for more information.

What is the status of Utah’s lawsuits against TikTok?

The Utah Department of Commerce houses the Utah Division of Consumer Protection, which sued TikTok; the Utah Attorney General’s Office filed on the division’s behalf.

Busse said they are waiting for two legal decisions: In the first suit, the state is waiting for the judge to determine whether or not the lawsuit will be dismissed. She said she is confident the suit will survive.

“In fact, we’re very confident, in part because Meta survived the motion to dismiss,” Busse said, adding that other states have seen their lawsuits survive. In the second lawsuit, which focuses on the TikTok Live feature, a judge is currently weighing which parts of the suit should be dismissed. Portions of that suit were redacted so the public cannot read them.

By Sheisoe
