NeuReality, an Israeli startup specializing in hardware and software solutions to accelerate artificial intelligence (AI) in data centers, raised $20 million in new funds from the European Innovation Council (EIC) Fund, Varana Capital, Cleveland Avenue, XT Hi-Tech and OurCrowd. NeuReality has now raised $70 million to date.
The company focuses on the inference stage of AI, which involves applying trained AI models to new data for tasks like image recognition or language translation.
NeuReality argues that traditional data center servers aren’t optimized for AI workloads. The company has developed a custom architecture with specialized hardware (including its own Network Addressable Processing Units, or NAPUs) and software to handle AI tasks more efficiently. NeuReality claims its solutions can deliver 10x faster inference than traditional systems, leading to quicker processing of AI tasks — something that can be crucial for real-time applications.
NeuReality was established in 2019 by CEO Moshe Tanach, VP of Operations Tzvika Shmueli, and VP of VLSI Yossi Kasus. The company says that its NR1 AI Inference Solution combines a holistic systems architecture, hardware technologies, and a software platform to make AI “easy to install, use, and manage.”
NeuReality boasts that its “novel” approach exponentially enhances ultra-scale inference for AI applications by doing away with the current CPU-centric model.
The company says enterprises are facing the limits of today’s solutions and experiencing spiraling costs with their growing deep learning demands on-premises and in the cloud.
“Our disruptive AI Inference technology is unbound by conventional CPUs, GPUs, and NICs. We didn’t try to just improve an already flawed system. Instead, we unpacked and redefined the ideal AI Inference system from top to bottom and end to end, to deliver breakthrough performance, cost savings, and energy efficiency,” said NeuReality’s CEO Moshe Tanach.
NeuReality says that its NR1 chip represents the world’s first NAPU (Network Addressable Processing Unit) and will be seen as an antidote to an outdated CPU-centric approach to AI inference.
“In order for inference-specific deep learning accelerators (DLAs) to perform at full capacity, free of existing system bottlenecks and high overheads, our solution stack, coupled with any DLA technology out there, enables AI service requests to be processed faster and more efficiently,” said Tanach.