Jack Dorsey is reflecting on how things turned out at the social media platform he was once in charge of.

In a thread and newsletter post, Dorsey addressed the internal company documents being reported on by Matt Taibbi and Bari Weiss, the writers Musk chose for the task. Dorsey's name and email have come up a few times in what has been released so far.

The releases so far consist of internal communications between employees at the company, in which they debate specific pieces of content, whether that content violated the company's rules, and what action to take against those users.

In his post, Dorsey sounds regretful about how actively moderation was carried out. He seems to wish the company could have simply let content be.

This gave the company too much power and opened us to pressure from outside. The suspension of Trump's account made me realize that companies have become too powerful.


His proposed solution is based on three principles.

  1. Social media must be resilient to corporate and government control.

  2. Only the original author may remove their content.

  3. Moderation is best implemented by algorithmic choice.

Some of these principles seem reasonable, but they're not easy to carry out in practice because you're dealing with humans. How would he deal with death threats, the publishing of a user's private data, or child sex abuse material if only the original poster could take content down? His principles seem to assume everyone on the internet is acting in good faith, which is not the case.

Dorsey somewhat addressed these concerns by arguing that takedowns and suspensions remove important context, learning, and the enforcement of illegal activity. But this conflates several distinct issues. If there is broader context or a lesson to be drawn, moderation policies should take that into account.

The for-profit company made moderation choices so that advertisers wouldn't stop spending money on the platform. Many of those decisions were also driven by users of the platform who did not want to encounter racism.

One such instance of bad faith is the recent targeting of the former head of trust and safety, Yoel Roth, by Musk.

"The current attacks on my former colleagues aren't going to solve anything," Dorsey wrote. "Direct the blame at me and my actions."

What Dorsey didn't mention is that Roth had to flee his home after Musk implied, based on Roth's college thesis, that he was a pedophile.

How would Dorsey's principles help someone like Roth? At best, "algorithmic choice" would let Roth filter the threats and harassment out of his own feed. It would not stop other social media users from upending his life.

"The biggest mistake I made was investing in tools for us to manage the public conversation, instead of building tools for people to manage it for themselves," Dorsey wrote.

Twitter should have done both. Users should be given more control over their experience on social media, but platforms have responsibilities as well. The visibility filters placed on certain accounts allowed those users to share posts with their followers but kept the posts out of the trends feed. Users should have been told when their accounts were filtered, and what they could do to fix the problem.

Dorsey seems to want a system that shifts responsibility from the corporation to its users. That is not the same as taking responsibility.