Children Give Their Input on How to Make TikTok a Safer Platform for Youth

By: Zenaida Gonzalez Kotala

Privacy laws meant to protect young users often deter social media platforms from allowing children younger than 13 to have accounts. But often kids find ways to use them anyway.

TikTok, which many kids still know as Musical.ly, is one such platform. The Federal Trade Commission recently fined the company more than $5 million for failing to protect children’s privacy.

Because of TikTok’s popularity among school-age children, researchers from the University of Central Florida and the University of Maryland teamed up to see what kids had to say about “stranger danger” and how it might be handled more effectively through design.

The children, ages 8 to 11, were asked to redesign TikTok in a way that would help keep them safe online while still being appealing to their age group.

The study found that young children often do not recognize dangers when engaging with others on social media, such as sexual predators or the risks of sharing too much personal information, which could make them targets for luring or human trafficking. And when children do encounter a sticky situation, they would rather ask an artificial intelligence system for advice than give their parents direct control over their social media interactions.

While the children acknowledged there are some bad people online, they were more concerned with privacy and independence. Under two scenarios, each group opted for a design that offered AI assistance rather than a design that gave parents the ability to outright ban or control a child’s activity.

These are among the findings from a study that gave children the opportunity to redesign TikTok as a way to gain insight into their online behavior.

“Kids did seem to have a sense that an online stranger wasn’t something that was physically harmful to them, so they didn’t seem to view strangers online as they do offline,” says Karla Badillo-Urquiola ’14 ’15MS, a doctoral scholar at the University of Central Florida who led the study. “It’s commonplace to interact with strangers online, so they don’t see this as abnormal. The lack of physicality lowers their sense of risk.”

“It’s not that they trust more in AI necessarily,” said Pamela Wisniewski, a UCF assistant professor of computer science who co-authored the study’s findings. “Rather AI promises more privacy. Children don’t want to be constantly monitored by their parents, and AI could be an alternative to helping them balance their privacy and safety.”

The study will be published as part of the Association for Computing Machinery (ACM) Interaction Design and Children Conference, held June 12-15 in Boise, Idaho.

“We are definitely not suggesting that we remove parents from the equation,” Badillo-Urquiola said. “Rather, parents need to be more thoughtful about how they are monitoring technology use in ways that empower the child and teach them to use it effectively instead of doing it through surveillance, restriction, and punishment. If the parent has all the control, there is no room for the child to learn on their own.”

Wisniewski has been researching teens’ online behavior for years. She has found that teens want more independence and privacy, while parental-control technologies promise safety but primarily deliver privacy-invasive levels of restriction and monitoring.

She suggests that just using technology to avoid bad situations isn’t the best approach. Rather, parents should find ways to help teach their teens how to navigate these situations to build resiliency.

The researchers wanted to examine what younger children thought about the balance between safety and autonomy, which led them to KidsTeam at the University of Maryland’s Human-Computer Interaction Lab (HCIL). KidsTeam co-designs technologies that support children’s learning and play. The group’s research examines how intergenerational design techniques shape technology use; by working with children directly, the resulting designs are more likely to be relevant to children’s interests and needs. All parents consented to their children’s participation in the study.

The researchers agree that some platforms aren’t appropriate for children. Both have young children of their own, so their familiarity is personal as well as academic, and their goal is to make the internet a safer place for their kids. TikTok, for example, has been banned in several countries, and neither researcher allows their children to use it.

“However, we want to think about designing for child social media use more broadly, because designing to restrict and limit use isn’t designing for a positive online experience,” Wisniewski said.

Badillo-Urquiola earned her bachelor’s and master’s degrees at UCF and is pursuing her doctorate in modeling and simulation. She expects to graduate in 2021 and hopes to one day become a lead scholar in human-computer interaction and a tenured professor, a status held by only 4 percent of Hispanic women.

Wisniewski is an expert on adolescent online safety and is the first computer scientist to receive the Scholars Award from the William T. Grant Foundation, as well as multiple awards from the National Science Foundation for her work promoting adolescent online safety.
