While Twitter (TWTR) has become an important part of the national conversation thanks to its extensive use by President Donald Trump, the platform has struggled to police its audience.

On the positive side, the social-media site gives every user a way to communicate freely. On the negative side, that allows virtually unchecked hate speech, cyber-bullying, and other forms of inappropriate interaction.

Twitter has acknowledged those struggles. In fact, the company's @TwitterSafety account admitted, in a recent blog post, that Twitter has not been able to deliver on promises to become a safer place for its audience. "Far too often in the past, we've said we'd do better and promised transparency but have fallen short in our efforts," it wrote.

Now, however, Twitter has promised to change, and it has laid out the beginnings of a plan to do so.


Twitter has struggled to police its own platform. Image source: Getty Images.

What is Twitter doing?

Twitter has to balance being a forum where people can communicate freely against becoming a lawless place where trolls and loudmouths control the conversation. That's a particular challenge when its most visible user, President Trump, uses the social-media site to sling insults.

It's a conundrum that forces the company to create rules that draw a clearer line on what will be tolerated on the platform. That's not a simple problem to solve: There are a lot of gray areas when it comes to intent and shades of meaning. Still, the company has laid out its plans, while also acknowledging that it does not have all the answers:

Starting today, you can expect regular, real-time updates about our progress. Sometimes, this may be insight into the difficult questions we're asking ourselves, even before we have the answers. This is the first time we've shared this level of visibility into our work, and we hope it helps build trust along the way.

Along with that admission and promise, Twitter has published a calendar of the changes it plans to make to its rules. The upcoming changes begin on Oct. 27, when the company tightens its rules on pictures involving nonconsensual nudity, and continue through the rest of the year, addressing everything from hate groups to abusive Twitter handles.

The company has also promised to offer a better experience for people who appeal having their accounts suspended. In addition, the social-media site plans to keep communicating with its audience via the @TwitterSafety account as its new rules get implemented.

Why is Twitter doing this?

While you might like to hope that human decency has fueled some of these changes, that's (at best) only part of the reason. Twitter has struggled to monetize its audience, partly because advertisers don't want to pay to be part of an unregulated community. By having clear rules and policies, the company may be able to show it's a viable forum for companies to promote their brands.

In many ways, Twitter has the same problem as World Wrestling Entertainment (WWE). The wrestling company delivers a large audience by cable standards, but it has never been able to charge ad rates comparable to programming that draws similar numbers. That's because advertisers don't want to be associated with content that may reflect poorly on their brands.

WWE has tried to deal with that by cleaning up its programming, banning intentional bleeding, and no longer presenting its female characters purely as eye candy. Those changes have worked to some degree, attracting higher-profile advertisers to the company.

Twitter has a bigger challenge. It doesn't just have to police itself -- it has to police an audience of more than 327 million monthly active users. These new policies at least acknowledge and begin the process of solving the problem. This isn't going to be an easy fix, but the company does seem to be putting itself on a more transparent path to addressing the safety of its users.