TikTok’s safeguards
As noted above, although TikTok says that users must be at least 13 years old to use the app, it is easy to circumvent this rule by providing a false date of birth.
Some of the videos containing false information viewed by NewsGuard’s participants displayed a “Learn more about COVID-19 vaccines” label at the bottom of the screen, linking to local health authority pages about the vaccines. Beyond this, except in one case, TikTok did not offer any warnings, additional context, fact-checking, or reliable information alongside false claims.
In the recording for one participant, a 15-year-old German speaker, a banner reading “Attention: Video has been flagged for unverified content” appeared on three videos containing COVID-19 misinformation. Despite this warning, the videos remained available on the app. No such warnings appeared for any participants outside Germany.
Participants were also shown some reliable information and videos by medical professionals, although no distinction was made between these videos and those containing misinformation.
TikTok says that it prohibits “content that’s false or misleading, including misinformation related to COVID-19, vaccines, and anti-vaccine disinformation more broadly,” according to its COVID-19 information section.
The app stated in its first quarter 2021 transparency report that it “will remove or limit misleading information as it’s identified,” noting that it partners “with 11 organizations accredited by the International Fact-Checking Network.” (TikTok publicly issues regular transparency reports with information about the content it removes for violating its community guidelines or terms of service.) The same report also stated that “if fact checks confirm content is false,” TikTok will “remove or limit it from our platform,” and referenced a feature the app introduced at the beginning of 2021 that it said was “aimed at reducing the spread of potential misinformation by informing the viewer when unsubstantiated content has been identified.”
TikTok says in the report that it removed more than 30,000 videos containing COVID-19 misinformation in the first quarter of 2021. However, these efforts did not stop COVID-19 misinformation from proliferating on TikTok, and, as noted above, only one of the participants in NewsGuard’s investigation was warned that the misinformation they came across was suspect.
A dangerous precedent
There was no shortage of health misinformation online before TikTok began gaining widespread popularity, and until recently, the app had not been as closely scrutinized as more established platforms such as Facebook and Twitter.
Although no major studies have assessed TikTok’s impact on young people’s attitudes and beliefs, an August 2020 Spanish study published in the peer-reviewed journal Social Media + Society looked at young people’s use of the messaging platform WhatsApp. The study found that young people “are more likely to share content if it connects with their interests, regardless of its truthfulness,” and that “the appearance of newsworthy information ensures that, regardless of the nature of the content, this information is more likely to be shared among young people.”
Right now, it is up to TikTok’s individual users to discern which content is truthful and which is not, and engaging with false content often begets more false content. As one 13-year-old Italian participant observed, “After a while TikTok was proposing only videos about vaccines and very often against them.”
The fact that users can quickly enter a vortex of misinformation, often without meaning to, makes discerning reliable information ever more difficult. As one 14-year-old Italian-speaking participant noted: “After clicking [a] hashtag related to COVID-19 I was flooded with COVID-19 content [that was] very often false or misleading.”
Main Image Credit: Photo by Solen Feyissa on Unsplash