YouTube is looking to keep conspiracy theories and misinformation out of recommended video lists. The platform is changing its recommendation algorithm so that content that might mislead viewers is less likely to pop up.

YouTube said that the changes will affect less than 1 percent of videos and that all of the videos will still be available on the platform; they just won't be recommended.

"We’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat or making blatantly false claims about historic events like 9/11," the company wrote in a blog post announcing the move.

Even with the limited scope of the changes, YouTube thinks users can expect a better experience.

"While this shift will apply to less than one percent of the content on YouTube, we believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community," they said. 

The move is similar to ones YouTube has made in the past to curb negative trends like clickbait-y reaction videos. The platform said that the process of keeping these videos out of recommendations "relies on a combination of machine learning and real people."

While the videos don't violate YouTube's community guidelines and therefore won't be removed, the algorithm changes will keep them from being recommended alongside legitimate news.

The decision comes just days after BuzzFeed published an investigation into YouTube's recommendations, finding that news videos were frequently followed by hateful, hyper-partisan or conspiracy-filled clips.