The first shots have been fired in the Supreme Court battle over internet platforms: two lawsuits accusing platforms of facilitating Islamic State attacks. How the court ultimately rules will help determine whether web services can be held liable for hosting illegal activity.

One of the cases was taken up by the Supreme Court at the request of a family suing the two companies. Many lawsuits have accused websites of failing to remove terrorist propaganda, but Section 230 of the Communications Decency Act shields companies from liability for hosting illegal content. The Ninth Circuit Court of Appeals threw out two terrorism-related suits but allowed a third to go ahead.

The Gonzalez suit claims that Google hosted the Islamic State propaganda that led to the Paris attack. The core question of the case is whether companies are responsible for illegal posts when their algorithms amplify them. The suit was brought by the estate of a woman who died in the attack, faulting YouTube for hosting and promoting Islamic State videos.

The law's boundaries here are undecided, but the plaintiffs argue that algorithmic recommendations fall outside Section 230's protection. In yesterday's legal filing, they said Section 230 doesn't provide a legal standard governing recommendations. They are asking the Supreme Court to find that some recommendation systems amount to a kind of direct publication, which would make services liable for the content they promote.

Should companies be on the hook for what they promote via algorithm?

This raises a lot of tricky questions. Could it make websites liable for delivering search results that include objectionable material? The suit argues that search results are different because they deliver information directly in response to a user's request. Even so, it's an attempt to police an almost ubiquitous feature of present-day social media, not just on giant sites like YouTube and not just for terrorism-related content.

The second case will test Twitter's legal footing under its new owner. The suit stems from a separate Islamic State attack, in Turkey, and turns on whether the social network gave material aid to terrorists. Twitter filed its petition before Musk bought the platform, seeking to shore up its legal defense in case the court took up Gonzalez and ruled against Google.

According to the petition, merely failing to ban terrorists who use a platform's general-purpose services is not a violation of anti-terrorism law. Under the lower court's framework, the petition argues, it's not clear what a provider of ordinary services could do to avoid terrorism liability.

There is no timetable for the cases yet, but new details will emerge over the coming months; Google has until January 12th to respond. The Supreme Court is expected to take up a number of Section 230 cases in the next few years.