A person walks down the sidewalk near the U.S. Supreme Court building in Washington, D.C., February 16, 2022.

The Supreme Court put the Texas social media law on hold after the tech industry and other opponents warned it could force platforms to host hate speech.

The decision does not rule on the merits of the law, but it imposes an injunction blocking the law from taking effect while federal courts decide whether it can be enforced. The Supreme Court is likely to be asked to weigh the law's constitutionality in the future.

Five justices voted to block the law. Justice Samuel Alito was one of the justices who dissented from the decision. The law will remain blocked while the challenge to it is pending.

The law prohibits online platforms from removing content based on viewpoint. It stems from a common charge on the right that major California-based social media platforms like Facebook and Twitter are biased in their moderation practices, even though right-leaning users rank among the most engaged on those platforms.

Two industry groups representing tech companies, including Amazon and Facebook, filed an emergency application with the court, warning that the law would force platforms to distribute all sorts of objectionable viewpoints, such as Russian propaganda.

The law does not prohibit platforms from removing entire categories of content, according to the attorney general of Texas. Under that reading, platforms could eliminate pornography without violating the law, and they would not be required to host Russian propaganda about Ukraine because they could ban all foreign government speech.

Alito acknowledged the significance of the case for social media companies and for states that would regulate how those companies can control the content on their platforms.

In his dissent, Alito wrote that the application concerns issues of great importance that will plainly merit the court's review, describing the measure as a ground-breaking Texas law that addresses the power of dominant social media corporations to shape public discussion of the important issues of the day.

Alito said he would have allowed the law to remain in effect as the case proceeds through the federal courts, though he said he had not formed a definitive view on the novel legal questions arising from Texas's decision to address changing social and economic conditions.

He wrote that he was not comfortable intervening at this point in the proceedings.

A federal district court had granted a preliminary injunction preventing the law from taking effect, but the Fifth Circuit stayed that injunction, which would have allowed the law to be enforced while the courts deliberated on the broader case.

NetChoice and the Computer and Communications Industry Association (CCIA) filed the emergency application with Alito, who is assigned to handle such requests from the Fifth Circuit.

NetChoice and CCIA asked the court to keep the law from taking effect, arguing that it would strip social media companies of the discretion they have in deciding what content to distribute and display. They urged the court to keep the law blocked while the appeals court reviews the important First Amendment issues central to the case.

The district court described the Texas law, HB 20, as a trainwreck and an example of burning the house to roast the pig.

The CCIA President said that no online platform, website, or newspaper should be directed by government officials to carry certain speech.

The Supreme Court's decision has implications for other states. Florida's legislature passed a similar social media law, but it has been blocked by the courts.

The Eleventh Circuit upheld an injunction against the similar law in Florida, concluding that content moderation is protected by the Constitution. Florida's attorney general filed an amicus brief on behalf of her state and several others urging the court to let the Texas law remain in effect, arguing that the industry had misinterpreted the law and that states are within their rights to regulate businesses in this way.

For years, tech platforms have relied on a legal liability shield, Section 230 of the Communications Decency Act, to moderate their services, and Congress is considering changing it. Section 230 keeps online platforms from being held responsible for the content users post to their services and gives them the ability to moderate or remove posts in good faith.

The law has been criticized by both Democrats and Republicans, though for different reasons. Democrats want to reform it to give tech platforms more responsibility to moderate what they see as dangerous content. Republicans generally agree that certain types of content should be removed, but some want to make it harder for platforms to engage in other forms of moderation, which they view as ideological censorship.

Chris Cox, a co-author of Section 230, filed a brief supporting the industry groups' request that the Supreme Court reverse the stay, arguing that the state law is in irreconcilable conflict with Section 230.

At least one Justice on the Supreme Court is interested in reviewing Section 230.

In 2020, Justice Clarence Thomas wrote that, in an appropriate case, the court should consider whether the text of the statute is compatible with the current state of immunity enjoyed by internet platforms.

He suggested in a concurrence last year that online platforms could be regulated like common carriers or places of public accommodation.

Dan Mangan contributed to the report.

