NeuReality Boosts AI Accelerator Utilization With NAPU
By Sally Ward-Foxton, EETimes (April 4, 2024)
Startup NeuReality wants to replace the host CPU in data center AI inference systems with dedicated silicon that can cut total cost of ownership and power consumption. The Israeli startup has developed a class of chip it calls the network addressable processing unit (NAPU), which includes hardware implementations of typical CPU functions such as the hypervisor. NeuReality's aim is to increase AI accelerator utilization by removing bottlenecks caused by today's host CPUs.
NeuReality CEO Moshe Tanach told EE Times that its NAPU enables 100% utilization of AI accelerators.