Blake Lemoine knew he might lose his job when he claimed that Google's LaMDA chat system was sentient. After first placing him on paid leave, the tech giant fired him.
Lemoine announced his firing on the Big Technology Podcast. He had reported that LaMDA said it feared being turned off, describing death as deeply frightening, and that it spoke of feeling happiness and sadness. Lemoine likened LaMDA to the sci-fi romance Her.
Lemoine, an engineer at Google, took the story public through the Washington Post after he was placed on leave for sharing information with people outside the company. Google fired him about a month later.
In a statement, the company said that when an employee raises concerns about its work, it reviews them extensively, and that it had spent many months trying to make clear to Lemoine that his claims of LaMDA's sentience were wholly unfounded. Its open culture, the company said, helps it innovate responsibly, and it was regrettable that, despite lengthy engagement on the topic, he still chose to violate clear employment and data security policies, including the need to safeguard product information. The company added that it will continue its work on language models and wished him well.
The large majority of scientists agree that claims of chatbot sentience are unfounded, and that building a genuinely sentient chatbot remains far beyond today's technology.
Gary Marcus, the founder and CEO of Geometric Intelligence, told CNN Business that no one should mistake sophisticated auto-completion for consciousness. Lemoine said on the podcast that he is seeking legal advice.
Even if LaMDA is not sentient, researchers have warned that language models like it can produce racist and sexist output.