Hyundai sends Boston Dynamics’ Spot robot into the metaverse – TechCrunch

One thing is for certain: Hyundai has big ambitions for its robotics development. The acquisition of Boston Dynamics, valued at north of $1 billion, was the most recent example of the automaker putting its money where its mouth is.

The company is presenting at the Consumer Electronics Show this week. Last month, Hyundai offered a sneak preview of the MobED (Mobile Eccentric Droid) robot, and it has now outlined its plans for the future with a new concept.

We will be speaking to some executives to get a better idea of what the strategy might look like in practice as the company reveals more. The broad idea, presented under the banner of "Expanding Human Reach," aims to find a role for mobility and robotics in a virtual reality metaverse. It is difficult to separate the marketing language from the practical implications, but hardware appears to be the central component.

Image Credits: Hyundai

There are a lot of big promises here, built around the lack of tangibility that has long been a root problem for virtual reality applications. Chang Song, president of Hyundai Motor Group, laid out the vision.

The idea behind Metamobility is that space, time and distance will no longer be relevant. With the help of robots, we will be able to move freely between the real world and virtual reality. The metaverse provides a proxy experience that allows us to be there, while robots act as an extension of our own physical senses.

A near-term use for such technology is remote teleoperation of a manufacturing robot, something Toyota has been exploring with its T-HR3 system. It is not hard to imagine a platform like Microsoft Cloud for Manufacturing serving some practical function here.

Image Credits: Hyundai

Other applications are a bit further out. A user accessing a digital twin of their home in the metaverse while away could, for example, feed and hug a pet back in Korea, enjoying real-world experiences remotely.

At the moment, these are mostly concepts, though Hyundai is offering in-person demos of what they might look like. Given the number of people currently attending the show virtually, it is easy to imagine how remote operations could be useful down the road.

Bringing robotic mobility to objects.

The company didn't spend all of its time in the metaverse, however. Its New Mobility of Things concept will use robotics to move inanimate objects autonomously.

The first product under the New Mobility of Things concept is PnD (Plug & Drive). This single-wheel unit carries enough hardware to navigate its surroundings and detect objects.

PnD modules could, for example, attach to tables in an office. A user could tell a table to move closer to them, or to move out of the way at a set time when more space is needed.

The PnD module can be adapted to match human needs. "In the world to come, you won't move your things, they will move around you," said the vice president and head of the company's robotics lab. Objects are normally immobile; giving them mobility makes it possible to reconfigure any space.

One application Hyundai showcased was a personal transport system that could ferry an individual to a waiting bus. A Pod fitted with four 5.5-inch PnD modules would attach to the "mother shuttle."

Image Credits: Hyundai

The bus would then stop, and the person sitting inside the Pod would continue on to their destination.

The idea, shown in a video depicting an elderly woman who has her cane delivered to her by a single PnD before climbing into the Pod and boarding the bus, is targeted at an aging population. If it ever becomes a reality, it could provide first- and last-mile public transit without adding a lot of full-size vehicles to the roads.

The DnL (Drive and Lift) module is designed to lift objects. Hyundai combined DnL with the MobED robot: a DnL unit is mounted on each wheel of the MobED, allowing each wheel to lift up and down independently and keeping the body level as the robot traverses low barriers such as steps or speed bumps.