TikTok’s For You feed offers up a never-ending stream of suggested videos. Its algorithm bases its recommendations on lots of different factors, including the accounts users follow, the videos they like, interact with, or watch, and the kind of content they create. But TikTok has largely been cagey about the specifics of its secret formula. Documents leaked to The New York Times suggest that the app might also be taking into account what users send each other in private messages.

Once the algorithm learns what a given user likes (and doesn’t like), it gets remarkably good at keeping users engaged with the app. Its success is why Meta is trying, and largely failing, to cram as many TikTok-like features into Instagram. Whatever the algorithm is doing under the hood, its recommendations seem to resonate with users in a way that suggested posts on other social networks just don’t. Experts have previously told PopSci that part of this is because TikTok pulls its inventory of videos from everyone on the platform, instead of just from a user’s friends and follows. And ByteDance engineers have published a fairly technical preprint paper on the app’s recommendation system.

But being uncannily good at divining user interests can have a downside. An investigation by The Wall Street Journal showed how the app can steer users down a rabbit hole of potentially toxic content, although TikTok has since refuted this, saying that the WSJ’s experiment “isn’t representative of real user behavior because humans have a diverse set of interests.”

Now, though, TikTok is going to give users some information about why exactly a video has appeared in their feed. To see it, you tap the Share icon and then the question mark icon labeled “Why This Video?” While it won’t reveal any major details about how TikTok’s algorithm works (sorry, Meta), it does give users a hint as to why a particular video has been shown to them. In a blog post announcing the feature, TikTok says it will offer explanations like: the post is similar to content a user has interacted with or searched for, it was posted by an account they follow, or it is simply popular or was recently posted in their geographic region (a ‘Nearby’ feed was rumored to be in the works earlier this year).

TikTok already provides tools for users to stop certain content from being recommended. You can tap the Share icon and then “Not Interested” on any video. If you tap “Details” after that, you can also permanently filter out specific #hashtags. TikTok also maintains a list of content, like dangerous stunts, overtly sexualized content, and content promoting alcohol or tobacco misuse, that will never be shown in the For You feed.

The new feature comes just as the app is under fire from US regulators over how it handles the privacy and security of its users. Last year, its chief operating officer was grilled at a Senate hearing about what kind of data the app collects and where that data goes. This month, several states have already moved to ban the app from being downloaded or opened on government devices, and TikTok is also part of an ongoing national security review by the Biden administration.

Regulatory drama aside, TikTok says the “Why This Video?” feature will be rolling out to everyone over the next few weeks. We didn’t yet have access to it at PopSci, so we haven’t been able to test just how detailed the explanations currently are, though the company says it will “continue to expand this feature to bring more granularity and transparency to content recommendations.”