Twitter CEO Jack Dorsey has big plans for remaking Twitter into a place that’s more conducive to forming communities and less a source of harassment and outrage. In a TED Talk on April 16, Dorsey explained that he’s looking to downplay likes and retweets, features he says lead to a lot of negative behavior, including the spread of disinformation and incentivized outrage.

"In the past it’s incented a lot of outrage," he said, per the BBC. "It’s incented a lot of mob behavior. It’s incented a lot of group harassment."

Dorsey said that Twitter in its current incarnation needs some tweaking before it can feel like a service that is a net good. The current system, he said, can emphasize the wrong things, leaving people with little to gain from logging on. "You don’t necessarily walk away feeling you have learned something," he said. "It takes a lot of time and a lot of work to build up to that."

One way he thinks Twitter can be fixed is by deemphasizing individual accounts and instead connecting like-minded people around common interests, much like the Usenet groups and chatrooms of the early internet. "It may be best if it becomes an interest-based network," he said, noting that Twitter might soon encourage following conversations and hashtags rather than other users.

Dorsey also addressed the issue of abuse, which has plagued the platform nearly from the beginning. He noted that over the past year the company began using automated processes to flag potentially abusive tweets before users see and report them. According to Dorsey, 38 percent of abusive tweets on Twitter are now flagged and reviewed without other users needing to report them.

"We've seen harassment, manipulation, misinformation, which are dynamics we did not expect 13 years ago when we founded the company," he said. "What worries me is how we address them in a systematic way."

Twitter shared some of the ways it is trying to build a “healthier Twitter” in a recent blog post, which also highlighted the success of its abuse-detecting algorithms.
