Sony shows off a robot grabber, 4K OLED panels for VR, and more

Screenshot: Sony

Sony held its Technology Day event this year to show off what it has been working on in its R&D labs, and we got some great visuals of that tech. We saw a demo reel of Sony's impressive displays for virtual movie sets, as well as a robot hand that Sony says can figure out how tightly to grip based on what it's picking up.

The most interesting thing Sony showed off was a headset with 4K OLED displays. The headset in Sony's presentation was clearly intended for lab and prototype use, but the panel specifications Sony laid out were reminiscent of the rumors surrounding the PSVR 2.


The next PlayStation VR headset will probably not look like this.

To be clear, Sony said the headset it showed off was 8K overall (4K per eye), while the PS VR 2 will be roughly 4K overall, with 2000 x 2040 pixels per eye. Still, it's exciting that Sony is working on panels focused specifically on virtual reality. In a Q&A session for journalists, Sony wouldn't say when the displays would show up in an actual product, but it did say that various divisions were already looking into how they could be integrated into products.
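To put those resolution figures in context, here's a quick back-of-the-envelope comparison of per-eye pixel counts. The 3840 x 2160 figure is an assumption based on the standard UHD "4K" resolution; Sony didn't specify its prototype panels' exact dimensions.

```python
# Per-eye pixel counts: PS VR2's stated resolution vs. an assumed
# standard UHD (3840 x 2160) panel per eye.

psvr2_per_eye = 2000 * 2040   # PS VR2: 4,080,000 pixels per eye
uhd_per_eye = 3840 * 2160     # assumed UHD panel: 8,294,400 pixels per eye

print(f"PS VR2 per eye:   {psvr2_per_eye:,}")
print(f"UHD panel per eye: {uhd_per_eye:,}")
print(f"Ratio: {uhd_per_eye / psvr2_per_eye:.2f}x")
```

On those assumptions, a per-eye 4K panel would pack roughly twice as many pixels per eye as the PS VR2.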

Sony also showed off a robotic grabber that can pick up objects. Sony said it can adjust its grip strength depending on what it's holding, so it can hold things securely enough not to drop them without crushing delicate items.

I would love to have a robot hand me a rose and hold my hand without crushing it.

GIF: Sony

Sony says the grabber could be used to cook or to line up items in a shop window, though it would have to be paired with a way to move around and an AI that could determine which objects to pick up. It's hard to imagine that sort of thing showing up in public anytime soon, but it does make for an impressive demo. I've had my fill of eerily human-looking robot parts this month, and it's nice that Sony made its grabber look robotic rather than like a hand.

The presentation also had some neat visuals for other projects, as well as another look at some devices we've seen before. Sony showed off machine-learning supersampling tech, which it said it could use to improve resolution and performance for ray-traced rendering. You can see Sony's comparison shots in the video below, though video compression may not do them justice.

Sony showed off its depth sensor, which it says can be used in Lidar systems in cars.

Image: Sony

Sony talked about its "Mimamori" system, which it said is designed to watch over the planet. When I saw this slide, I was a little worried.



Sony explained that it wants to use satellites to gather data from sensors placed all over the Earth, collecting information about soil temperature and more. Its pitch is that the system could help scientists track how the climate is changing and help farmers adapt to those changes. Mimamori seemed closer to a pitch than a prototype, but it shows that Sony is at least looking into how it can leverage some of its tech to help deal with climate change.

While some of the ideas Sony showed off seem like moonshots, it's interesting to get a peek into what the company is doing in its labs. Even if we don't get consumer devices out of it, at least we got some cool futuristic visuals.