d-Matrix Unlocks New Potential with Reinforcement Learning based Compiler for at Scale Digital In-Memory Compute Platforms

by admin | November 25, 2022 | Big Data


d-Matrix, a leader in high-efficiency AI compute and inference, announced a collaboration with Microsoft that uses Microsoft’s low-code reinforcement learning (RL) platform, Project Bonsai, to enable an AI-trained compiler for d-Matrix’s unique digital in-memory compute (DIMC) products. The user-friendly Project Bonsai platform accelerates time to value with a product-ready solution that cuts development effort: an AI-based compiler that leverages d-Matrix’s ultra-efficient DIMC technology.

As large transformer models drive expanding demand for AI inference while memory and energy requirements hit practical limits, d-Matrix is bringing one of the first DIMC-based inference compute platforms to market. d-Matrix transforms the economics of complex transformers and generative AI with a scalable platform built to handle the immense data and power requirements of inference AI, making energy-hungry data centers more efficient. The d-Matrix AI compute platform combines intelligent ML tools with an integrated software architecture and uses chiplets arranged in a Lego-block-style grid, enabling multiple programming engines to be integrated in a common package.

Combining d-Matrix technology with Project Bonsai enables efficient creation of a compiler for the DIMC platform. Project Bonsai supports rapid prototyping, testing, and deployment of trained RL agents in the compiler stack, taking full advantage of d-Matrix’s low-power AI inference technology, which can deliver up to ten times the power efficiency of older architectures.
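
To make the idea of an RL agent in a compiler stack concrete, here is a minimal, self-contained Python sketch of the kind of scheduling decision such an agent could learn: assigning transformer layers to compute chiplets so that the load on the busiest chiplet (a crude latency proxy) is minimized. The environment, cost model, and all names are illustrative assumptions for this article only; they are not d-Matrix’s compiler, the Project Bonsai API, or the actual DIMC cost model.

import random

# Hypothetical toy problem: place transformer layers onto chiplets.
# Neither the cost model nor the layer sizes come from d-Matrix; they are
# illustrative stand-ins for whatever the real compiler would optimize.
LAYER_COSTS = [4, 8, 8, 2, 6, 6, 3, 5]   # relative compute cost per layer
NUM_CHIPLETS = 4

class PlacementEnv:
    """Tiny episodic environment: one action per layer, reward at the end."""
    def reset(self):
        self.next_layer = 0
        self.load = [0] * NUM_CHIPLETS          # accumulated cost per chiplet
        return tuple(self.load)

    def step(self, chiplet):
        self.load[chiplet] += LAYER_COSTS[self.next_layer]
        self.next_layer += 1
        done = self.next_layer == len(LAYER_COSTS)
        # Reward only at episode end: negative of the busiest chiplet's load,
        # so a balanced placement scores best (a simple latency proxy).
        reward = -max(self.load) if done else 0.0
        return tuple(self.load), reward, done

def random_policy_search(episodes=2000, seed=0):
    """Baseline 'agent': sample random placements and keep the best one.
    A real setup would instead train an RL policy (for example via a
    platform like Project Bonsai) against a far richer hardware simulator."""
    rng = random.Random(seed)
    env = PlacementEnv()
    best_reward, best_plan = float("-inf"), None
    for _ in range(episodes):
        env.reset()
        plan, done = [], False
        while not done:
            action = rng.randrange(NUM_CHIPLETS)
            plan.append(action)
            _, reward, done = env.step(action)
        if reward > best_reward:
            best_reward, best_plan = reward, plan
    return best_reward, best_plan

if __name__ == "__main__":
    reward, plan = random_policy_search()
    print("best (negative) max-load:", reward)
    print("layer -> chiplet plan:", plan)

The random-search baseline above only stands in for a trained agent; the point is the interface, in which a compiler pass exposes its decisions as environment steps that an RL policy can learn to make.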

“d-Matrix has built the world’s most efficient computing platform for AI inference at scale,” said Sudeep Bhoja, co-founder and CTO of d-Matrix. “What made us gravitate toward Project Bonsai is its product-first features and ease of use. Microsoft’s unique offering is built around machine teaching and the Inkling language, which makes RL constructs fully explainable.”

The RL-based compiler is expected to become a key differentiator of d-Matrix’s first-generation DIMC product, CORSAIR, which is on track to ship in late 2023.

“We have been working together on developing the RL-based compiler,” said Kingsuk Maitra, Principal Applied AI Engineer at Microsoft, who works with the Project Bonsai team. “We made it a point to have a product mindset from the get-go. Embodiments including the instruction set architecture have been vetted and validated on two d-Matrix test chips, NightHawk and JayHawk, and embedded into the RL training environment. Project Bonsai’s low-code attributes made early development work easy and simplified the integration of statistical control parameters and other real-life chip design constraints, with very promising results so far.”
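
The point about folding statistical control parameters and real-life chip design constraints into the training loop can be illustrated with a small, hypothetical reward-shaping function: hard hardware limits (for example, per-chiplet memory capacity or a power budget) become penalties the RL agent learns to avoid. The constraint names and numbers below are invented for illustration and are not taken from the d-Matrix or Project Bonsai implementation.

# Hypothetical reward shaping: fold chip design constraints into the signal
# the RL agent optimizes. Constraint names and limits are illustrative only.
SRAM_LIMIT_KB = 512      # assumed per-chiplet on-chip memory budget
POWER_LIMIT_W = 30.0     # assumed per-chiplet power budget

def shaped_reward(latency_ms, sram_used_kb, power_w):
    """Base objective (minimize latency) plus penalties for violating
    hardware constraints, so infeasible placements score poorly."""
    reward = -latency_ms
    if sram_used_kb > SRAM_LIMIT_KB:
        reward -= 10.0 * (sram_used_kb - SRAM_LIMIT_KB)   # memory overflow penalty
    if power_w > POWER_LIMIT_W:
        reward -= 5.0 * (power_w - POWER_LIMIT_W)          # power overrun penalty
    return reward

# Example: a fast but over-budget placement scores worse than a slower,
# feasible one, steering the agent toward schedules the hardware can run.
print(shaped_reward(latency_ms=3.0, sram_used_kb=600, power_w=28.0))  # infeasible
print(shaped_reward(latency_ms=4.5, sram_used_kb=480, power_w=27.0))  # feasible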

Sign up for the free insideBIGDATA newsletter.

Join us on Twitter: https://twitter.com/InsideBigData1

Join us on LinkedIn: https://www.linkedin.com/company/insidebigdata/

Join us on Facebook: https://www.facebook.com/insideBIGDATANOW