December 7, 2022 — NeuReality, a semiconductor firm, has announced a USD 35 million Series A funding round. The round was co-led by Varana Capital, Samsung, and SK Hynix, alongside other strategic and financial investors, and brings the company’s total capital raised to USD 48 million. The funding will help NeuReality finalize the design and commercialization of its flagship AI inference chip, which is scheduled for release in early 2023.

NeuReality aims to accelerate AI adoption by reducing the overall complexity, cost, and power consumption of running inference. While other companies build Deep Learning Accelerators (DLAs) for deployment, none pair them with a software platform designed to manage the underlying hardware infrastructure. This system-level, AI-centric approach makes it easier to run AI inference at scale.

“NeuReality was founded with the vision of building a new generation of AI inferencing solutions unleashed from traditional CPU-centric architectures and delivering high performance and low latency, with the best possible efficiency in cost and power consumption,” said CEO Moshe Tanach.

According to Ezra Gardner, co-founder of Varana Capital and NeuReality Board Member, “Varana has supported NeuReality since the initial seed round, and it has been a joy to collaborate with Moshe and his world-class team. Our faith in NeuReality’s purpose is unwavering. The need for AI inference is increasing exponentially, and there is a global demand for the NeuReality solution, which reduces the cost and power consumption of scaling these applications. We can’t wait to start distributing these chips next year.”

About the investors:

Varana Capital, LLC, founded in 2012, invests in and collaborates with small-cap and early-stage public and private firms, primarily in the United States and Israel. The firm, which has offices in Colorado and Israel, offers a range of funds for accredited, long-term investors.

NeuReality has always focused on bringing AI hardware to market for cloud data centers and “edge” computers, that is, machines that run on-premises and execute most of their data processing offline. The startup’s current-generation product lineup, the Network Attached Processing Unit (NAPU), is, according to Tanach, optimized for AI inference applications such as computer vision (think algorithms that recognize objects in photos), natural language processing (text-generating and classifying systems), and recommendation engines (like the kind that suggest products on e-commerce sites).

NeuReality’s NAPU combines several types of processing engines in a single device. It can handle functions such as AI inference load balancing, job scheduling, and queue management, tasks that were previously performed in software, and not always efficiently.
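To make that concrete, the sketch below shows, in plain Python and purely as an illustration, the kind of host-side request queueing and load balancing the article says the NAPU moves into hardware. Every class and function name here is hypothetical and invented for this example; none of it is a NeuReality API.

```python
# Illustrative only: a host-CPU inference dispatcher of the kind the article says
# the NAPU offloads to hardware. All names below are hypothetical, not NeuReality APIs.
import heapq
import itertools
from dataclasses import dataclass, field
from typing import Any, List, Optional, Tuple


@dataclass(order=True)
class InferenceJob:
    priority: int                        # lower value = more urgent
    seq: int                             # submission order, breaks priority ties
    payload: Any = field(compare=False)  # e.g. an image tensor or a text batch


class SoftwareDispatcher:
    """Queue management plus least-loaded balancing, done entirely on the host CPU."""

    def __init__(self, num_accelerators: int) -> None:
        self.queue: List[InferenceJob] = []   # priority queue of pending jobs
        self.load = [0] * num_accelerators    # outstanding jobs per accelerator
        self._seq = itertools.count()         # monotonically increasing tie-breaker

    def submit(self, payload: Any, priority: int = 0) -> None:
        """Enqueue one inference request."""
        heapq.heappush(self.queue, InferenceJob(priority, next(self._seq), payload))

    def dispatch(self) -> Optional[Tuple[int, Any]]:
        """Pop the most urgent job and assign it to the least-loaded accelerator."""
        if not self.queue:
            return None
        job = heapq.heappop(self.queue)
        target = min(range(len(self.load)), key=self.load.__getitem__)
        self.load[target] += 1
        return target, job.payload

    def complete(self, accelerator: int) -> None:
        """Record that one job finished on the given accelerator (called when results return)."""
        self.load[accelerator] -= 1


if __name__ == "__main__":
    # Queue four requests and watch them alternate between two accelerators.
    dispatcher = SoftwareDispatcher(num_accelerators=2)
    for i in range(4):
        dispatcher.submit(payload=f"request-{i}")
    while (assignment := dispatcher.dispatch()) is not None:
        accel, payload = assignment
        print(f"accelerator {accel} <- {payload}")
```

The point of the sketch is the overhead itself: every submit and dispatch call burns general-purpose CPU cycles per request, which is the sort of work NeuReality says its hardware scheduler is meant to absorb.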


