Bangalore, October 19, 2022: IESA member d-Matrix announced the opening of its R&D center at the IESA Vision Summit, in the presence of IESA President Vivek Tyagi. Founded by Silicon Valley semiconductor veterans Sid Sheth and Sudeep Bhoja, d-Matrix is building a unique data center AI inference platform using a software-first approach paired with leading hardware innovations in in-memory computing (IMC) and chiplet-level interconnects. d-Matrix has tackled the physics of integrating memory and arithmetic using innovative ML tools, software, algorithms, and circuit techniques, addressing the ultimate frontier in AI compute efficiency.
If you’ve been following the development of AI powered by deep learning, the renaissance of generative AI, and the next upheaval in computer vision, you probably know that it’s all about transformer-based models. They power neural networks with billions to trillions of parameters, and current silicon architectures (including a large number of AI accelerators) struggle to varying degrees to keep up with explosive growth in model sizes and performance requirements.
“d-Matrix is addressing the growing need for AI compute head-on by developing a highly efficient, all-digital in-memory computing accelerator for AI inference, optimized for the compute patterns of transformers,” said Sid Sheth, Co-Founder, President and CEO of d-Matrix.
At d-Matrix, the team is actively working to build the world’s first inference-focused computing platform for the Transformer AI era. Transformer-based model architectures have created a whole new class of models, called generative models, that power services like language generation and code generation. d-Matrix has already completed development of the Nighthawk and Jayhawk platforms, which demonstrate the benefits of digital in-memory computing and chiplets for inference compute. The company is now developing the Corsair platform, which combines these elements with an open, mature, and comprehensive software stack that can be deployed frictionlessly across cloud and customer computing environments.
“The efficient Corsair chiplet platform provides high-bandwidth compute balanced with fabric, memory, and networking to serve generative Transformer AI models such as GPT-3 and large Stable Diffusion models,” said Sudeep Bhoja, Co-Founder and CTO of d-Matrix.
d-Matrix recently raised $44 million in Series A funding from distinguished investors: Microsoft’s M12, Playground Global, SK Hynix, and Marvell.
d-Matrix already has research and development centers in Santa Clara and Sydney. When asked about the India center, Mr. Sheth said, “We are very excited about our position in India. We already have a world-class core team here in design and validation. The talent in India is amazing. The India team will play an important role in our growth.”
d-Matrix recently hired Dr. Pradeep Thacker as Vice President and India Country Head to lead the India team. “We are delighted to have Pradeep join the India team. He brings decades of experience in system-on-chip development,” Sheth said. Prior to d-Matrix, Pradeep was Vice President of Engineering and Country Head at Marvell India, where he was responsible for a workforce of over 1,500 employees. When asked why he chose d-Matrix, Dr. Thacker said, “The opportunity to work on groundbreaking AI technology with the world-class engineering and leadership team at d-Matrix was too exciting to miss. The team has a proven track record in developing and commercializing silicon systems at scale and has attracted top-tier talent across the industry. We will see artificial intelligence transform lives in the next decade, and d-Matrix will play a major role in the adoption of this revolutionary technology.”