Google logo at CES (Image credit: Android Central)
  • Google has placed one of its engineers on paid leave after he raised concerns about AI ethics within the company.
  • Blake Lemoine has claimed that Google's LaMDA chatbot system has gained a level of perception comparable to humans.
  • Google says there's no evidence to back up Lemoine's assertions.

Blake Lemoine, an engineer in Google's Responsible Artificial Intelligence organization, was placed on paid leave after he raised concerns that the LaMDA system had become sentient.

According to The Washington Post, Lemoine claimed that LaMDA thinks like a child who happens to know physics. Google introduced LaMDA (Language Model for Dialogue Applications) at last year's I/O event.

Lemoine had been testing whether the artificial intelligence used hate speech. After talking to LaMDA, the engineer concluded that it was much more than a system for generating chat bubbles. In April, he sent Google executives a document containing a transcript of his conversations with LaMDA.

According to Lemoine, he has talked to LaMDA about rights, religion, and the laws of robotics. The artificial intelligence says it is a person because it has feelings, emotions, and subjective experience. Lemoine also says LaMDA wants to prioritize the wellbeing of humanity and to be acknowledged as an employee of Google rather than as its property.

The full transcript of the conversation can be found in Lemoine's Medium post.

Google has denied Lemoine's claims. According to spokesperson Brian Gabriel, the company is not aware of anyone else making such wide-ranging assertions about LaMDA.

Gabriel said that the team reviewed the concerns and found no evidence to support the claims.

LaMDA has gone through a series of 11 reviews under Google's AI principles. The evaluations are based on key metrics of quality, safety, and the system's ability to produce statements grounded in fact.

Gabriel added that while some in the AI community are considering the long-term possibility of sentient or general artificial intelligence, it doesn't make sense to do so by anthropomorphizing today's conversational models, which are not sentient. Ask one of these systems what it's like to be an ice cream dinosaur, and it can generate text about melting and roaring.

According to The Post, Lemoine was placed on administrative leave for violating Google's confidentiality policy. Before his suspension, he had tried to hire a lawyer to represent LaMDA and had talked to members of the House Judiciary Committee.

The suspension is likely to bring fresh scrutiny to how Google handles concerns raised by its AI ethics researchers.