How TikTok Brings War Home to Your Child
The popular app can feed young users a stream of intense, polarized and hard-to-verify videos about the Israel-Hamas war. Learn more in our story here.
When TikTok users start to feel that the videos being shown to them are harmful to their well-being, how easy is it to disengage? And what responsibility do authorities, from parents to governments to TikTok itself, have in keeping the platform safe? Read the full story here.
When it comes to TikTok videos that discuss mental health, eating disorders and self-harm, deciphering what is helpful and what is harmful can be hard for TikTok’s algorithm. And even when users share videos of their recovery, their posts can send others into spirals. So how do TikTokers try to help each other in this gray area? Read the full story here.
TikTok’s powerful algorithm is exceptionally good at engaging users. But what happens when the endless scroll on the app turns into a stream of potentially harmful content, including on starvation diets, self-harm and suicide? And why do users who say they didn’t go looking for this type of content still see so much of it? Read the full story here.
The app’s algorithm can send users down rabbit holes of narrow interest, resulting in potentially dangerous content such as emaciated images, purging techniques, hazardous diets and body shaming. Read the full story here.
The popular app can quickly drive young users into endless spools of adult content, including videos touting drug use and promoting pornography sites, a Wall Street Journal investigation finds. Read the full story here.
The Wall Street Journal created dozens of automated accounts that watched hundreds of thousands of videos to reveal how the social network knows you so well. Watch the video here.