The market for artificial intelligence inferencing chips is heating up, driven by growing demand for artificial intelligence. Inferencing chips accelerate the process of running trained models and generating outputs from data. Examples of systems they can speed up include Stable Diffusion, which generates images from text prompts, and OpenAI's GPT-3, which extends a few lines of prose into full-length poems, essays and more.

A number of vendors are developing and selling artificial intelligence inferencing chips, from upstarts like Flex Logix and Hailo to incumbents like Amazon, which is competing for dominance with its Inferentia chips. NeuReality, which occupies the same market but also offers a suite of software and services to support its hardware, has not been scared away by the fierce competition.

NeuReality today announced that it has raised $35 million in a Series A funding round with participation from a number of investors. NeuReality's flagship artificial intelligence inferencing chip will ship to customers in early 2023, according to the company's co-founder and CEO, Moshe Tanach.

According to Tanach, NeuReality was founded with the vision of building a new generation of artificial intelligence inferencing solutions that are unleashed from traditional architectures and deliver high performance and low latency. Most companies that could leverage artificial intelligence don't have the funds or the R&D resources that Amazon, Meta and other large companies investing in AI have; NeuReality aims to let anyone deploy easily and cheaply.

NeuReality was co-founded by Tanach, Tzvika Shmueli and Yossi Kasus. Tanach previously worked at Intel and Marvell, Shmueli was VP of engineering at Habana Labs, and Kasus was a senior director of engineering at Mellanox and head of integrations at EZchip.

NeuReality focused on bringing to market artificial intelligence hardware for cloud data centers and edge computers that do most of their data processing offline. The startup's current-generation product lineup, the Network Attached Processing Unit (NAPU), is designed for applications like computer vision and natural language processing.

NeuReality's NAPU is a mixture of different types of processors. Functions like inferencing load balancing, job scheduling and queue management have traditionally been handled in software, but the NAPU implements them in its own dedicated hardware.

Image Credits: NeuReality

NeuReality's NR1 is a network-attached "server on a chip" with an embedded artificial intelligence inferencing accelerator. NeuReality also offers a PCIe card containing an NR1, as well as a network-attached inference server that pairs several of those NR1 modules.

NeuReality has a software development kit for cloud and local workloads, a deployment manager, and a monitoring dashboard.

According to Tanach, the software is the magic that supports NeuReality's innovative hardware approach. The first beneficiaries of the NAPU technology are enterprises and cloud solution providers that need infrastructure to support chatbots, voice bots, automatic transcription and sentiment analysis, as well as computer vision use cases like document scanning and defect detection.

NeuReality has yet to back up some of its claims. The company said in a recent article that it estimates its hardware will deliver a 15x improvement in performance per dollar compared to available GPUs and ASICs. The startup has also claimed in the past that its proprietary networking protocol outperforms competing solutions.

It isn't easy to deliver hardware at large scale. According to Tanach, NeuReality has laid the groundwork by inking a partnership with IBM and partnering with a chip manufacturer. IBM is evaluating the startup's products for use in the IBM cloud. NeuReality has been shipping prototypes to partners.

According to Tanach, NeuReality is working with a number of different companies on deployment. Tanach wouldn't reveal how many customers the startup has or how much revenue it expects.

Broader economic headwinds are slowing companies down and pushing for consolidation among vendors. Still, Tanach said that the deployment of inference is expected to explode in the next year or two, and that NeuReality's technology will help drive that growth: the NAPU will bring artificial intelligence to more companies, while large-scale users such as hyperscalers and next-wave data center customers will be able to support their growing scale of AI usage.

"We see substantial and immediate need for higher efficiency and easy-to-deploy inference solutions for data centers, and this is why we are investing in NeuReality," one of the round's investors said, adding that the company's innovative disaggregation, data movement and processing technologies improve computation flows, compute-storage flows and in-storage compute.

NeuReality plans to hire 20 more people over the next two quarters. It has raised $48 million in venture capital so far.