Databricks Launches Simplified Real-Time Machine Learning for the Lakehouse

March 7, 2023


Databricks, the lakehouse company, announced the launch of Databricks Model Serving to provide simplified production machine learning (ML) natively within the Databricks Lakehouse Platform. Model Serving removes the complexity of building and maintaining complicated infrastructure for intelligent applications. Now, organizations can leverage the Databricks Lakehouse Platform to integrate real-time machine learning systems across their business, from personalized recommendations to customer service chatbots, without the need to configure and manage the underlying infrastructure. Deep integration within the Lakehouse Platform offers data and model lineage, governance and monitoring throughout the ML lifecycle, from experimentation to training to production. Databricks Model Serving is now generally available on AWS and Azure. 

With the opportunities surrounding generative artificial intelligence (AI) taking center stage, businesses feel the urgency to prioritize AI investments across the board. Leveraging AI/ML enables organizations to uncover insights from their data, make accurate, instant predictions that deliver business value, and drive new AI-led experiences for their customers. For example, AI can enable a bank to quickly identify and combat fraudulent charges on a customer’s account or give a retailer the ability to instantly suggest complementary accessories based on a customer’s clothing purchases. Most of these experiences are integrated into real-time applications. However, implementing these real-time ML systems has remained a challenge for many organizations because of the burden placed on ML experts to design and maintain infrastructure that can dynamically scale to meet demand.

“Databricks Model Serving accelerates data science teams’ path to production by simplifying deployments, reducing overhead and delivering a fully integrated experience directly within the Databricks Lakehouse,” said Patrick Wendell, Co-Founder and VP of Engineering at Databricks. “This offering will let customers deploy far more models, with lower time to production, while also lowering the total cost of ownership and the burden of managing complex infrastructure.” 

Databricks Model Serving removes the complexity of building and operating these systems and offers native integrations across the lakehouse, including Databricks’ Unity Catalog, Feature Store and MLflow. It delivers a highly available, low-latency service for model serving, giving businesses the ability to easily integrate ML predictions into their production workloads. Fully managed by Databricks, Model Serving quickly scales up from zero and back down as demand changes, reducing operational costs and ensuring customers pay only for the compute they use.
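
To make the serving model concrete, here is a minimal sketch of calling a served model over its REST invocations endpoint; the workspace URL, access token, endpoint name, and feature fields are placeholders rather than details from this announcement.

import requests

# Placeholders: substitute a real workspace URL, endpoint name, and access token.
WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"
ENDPOINT_NAME = "recommender"   # hypothetical endpoint name
TOKEN = "<databricks-personal-access-token>"

def score(records):
    """Send feature records to the serving endpoint and return its predictions."""
    response = requests.post(
        f"{WORKSPACE_URL}/serving-endpoints/{ENDPOINT_NAME}/invocations",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"dataframe_records": records},
    )
    response.raise_for_status()
    return response.json()

# Example call with made-up features; per the announcement, the endpoint scales up
# from zero on demand and back down when traffic stops, so idle endpoints incur
# no compute cost.
predictions = score([{"customer_id": 42, "basket_total": 119.0}])
print(predictions)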

“As a leading global appliance company, Electrolux is committed to delivering the best experiences for our consumers at scale — we sell approximately 60 million household products in around 120 markets every year. Moving to Databricks Model Serving has supported our ambitions and enabled us to move quickly: we reduced our inference latency by 10x, helping us deliver relevant, accurate predictions even faster,” said Daniel Edsgärd, Head of Data Science at Electrolux. “By doing model serving on the same platform where our data lives and where we train models, we have been able to accelerate deployments and reduce maintenance, ultimately helping us deliver for our customers and drive more enjoyable and sustainable living around the world.”

Databricks’ unified, data-centric approach to machine learning from the lakehouse enables businesses to embed AI at scale and allows models to be served directly by the same platform used for data and ML training. The lakehouse provides a consistent view of data throughout the entire ML lifecycle, which accelerates deployments and reduces errors because teams no longer have to stitch together disparate services. With Databricks, organizations can manage the entire ML process – from data preparation and experimentation to model training, deployment and monitoring – all in one place. Databricks Model Serving integrates with Lakehouse Platform capabilities, including:

  • Feature Store: Provides automated online lookups to prevent online/offline skew. Define features once during model training, and Databricks will automatically retrieve and join the relevant features at serving time.
  • MLflow Integration: Natively connects to MLflow Model Registry, enabling fast and easy deployment of models. Once a model is provided, Databricks automatically prepares a production-ready container for model deployment (see the sketch after this list).
  • Unified Data Governance: Manage and govern all data and ML assets with Unity Catalog, including those consumed and produced by model serving.
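
As a rough illustration of the MLflow path above, the sketch below logs a scikit-learn model and registers it in the MLflow Model Registry; the toy model, data, and registry name ("clothing_recommender") are hypothetical, and turning the registered version into a serving endpoint then happens through Model Serving as described in this article.

import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Train a toy model (a hypothetical stand-in for a real recommender).
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
model = LogisticRegression().fit(X, y)

with mlflow.start_run():
    # Log the trained model and register it under a hypothetical registry name;
    # Model Serving can then prepare a production-ready container from the
    # registered version, as described in the bullet above.
    mlflow.sklearn.log_model(
        sk_model=model,
        artifact_path="model",
        registered_model_name="clothing_recommender",
    )

Querying the resulting endpoint then looks like the earlier REST sketch.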

Databricks is committed to driving innovation with its Lakehouse Platform and delivering more capabilities that make powerful, real-time machine learning accessible to any organization. This includes new quality and diagnostic features coming soon for Databricks Model Serving, which will automatically capture requests and responses in a Delta table to monitor and debug models and generate training data sets. Databricks is also enabling GPU-based inference support, which is available in preview.

Sign up for the free insideBIGDATA newsletter.

Join us on Twitter: https://twitter.com/InsideBigData1

Join us on LinkedIn: https://www.linkedin.com/company/insidebigdata/

Join us on Facebook: https://www.facebook.com/insideBIGDATANOW




