Databricks makes it easier to build real-time ML applications with the new service

San Francisco-based Databricks, which provides a data lakehouse platform for storing and mobilizing disparate data, today debuted serverless real-time inference capabilities. The company says the move will make it easier for enterprises that have struggled with the process to deploy and run real-time machine learning (ML) applications.

Today, real-time ML is key to product success. Companies are deploying it across a variety of application use cases, from recommendations to chat personalization, to take immediate action on streaming data and drive revenue. However, when it comes to end-to-end support for AI application systems, things can get complicated.

Teams must store their ML models in the cloud or on-premises, then expose their functionality via an API so they can work within the application system. The process is commonly referred to as ‘model serving’. It requires building fast, scalable infrastructure that supports not only the serving workload itself, but also feature lookups, monitoring, automated deployment, and model retraining. In practice, teams end up stitching together disparate tools, which increases operational complexity and maintenance overhead.
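For a concrete picture of the first step in that workflow, here is a minimal sketch, assuming Python, scikit-learn, and MLflow, of training a small model and registering it so a serving layer can later expose it. The model name "demo-churn-model" and the toy dataset are purely illustrative.

import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train a small example model on a toy dataset.
X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=1000).fit(X, y)

with mlflow.start_run():
    # Log the model artifact and register it in the MLflow Model Registry,
    # where a serving endpoint can later pick it up by name and version.
    mlflow.sklearn.log_model(
        sk_model=clf,
        artifact_path="model",
        registered_model_name="demo-churn-model",  # hypothetical name
    )

Everything after this step, standing up the REST endpoint, scaling it, and monitoring it, is the infrastructure work the article describes.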

In fact, data scientists who take on this task spend most of their time and resources simply piecing together and maintaining the data, the ML lifecycle, and the serving infrastructure.

Model serving with serverless real-time inference

To address this gap, Databricks has made serverless real-time inference generally available. It is an important step for a company that pioneered cloud-based data processing with Apache Spark.

According to Databricks, the new offering is a fully managed, production-grade service that exposes MLflow machine learning models as scalable REST API endpoints. It performs all the heavy lifting involved in the process, from configuring infrastructure and managing instances to maintaining version compatibility and applying patches. The service dynamically grows and shrinks resources, delivering cost-effectiveness and scalability along with high availability and low latency.
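As a rough illustration of what calling such an endpoint might look like from a client application, the sketch below sends a JSON payload over HTTPS and reads back predictions. The workspace URL, endpoint name, feature names, and token variable are placeholders, and the "dataframe_records" payload follows MLflow's standard scoring format.

import os
import requests

# Hypothetical values; substitute your own workspace URL, endpoint name, and token.
WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"
ENDPOINT_NAME = "demo-churn-model"
TOKEN = os.environ["DATABRICKS_TOKEN"]

# MLflow-served models accept JSON payloads such as "dataframe_records".
payload = {"dataframe_records": [{"feature_a": 1.2, "feature_b": 0.4}]}

response = requests.post(
    f"{WORKSPACE_URL}/serving-endpoints/{ENDPOINT_NAME}/invocations",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
response.raise_for_status()
print(response.json())  # e.g. a list of predictions

Because the endpoint is just a REST API, the calling application needs no Spark or MLflow dependencies of its own; it only needs network access and a token.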

Databricks notes that with this offering, companies can reduce infrastructure overhead and accelerate their teams’ time to production. In addition, its deep integration with other lakehouse services provides automatic lineage, governance, and monitoring across data, features, and model lifecycles. This means teams can manage the entire ML process, from data ingestion and training to deployment and monitoring, on a single platform, creating a unified view of the ML lifecycle that minimizes errors and speeds up troubleshooting.

Model serving with serverless real-time inference. Image source: Databricks.

The time and resources saved by model serving can instead be used to produce better-quality models faster, Databricks noted. Chris Sawtelle, engineering manager at Barracuda Networks, and Gyuhyeon Sim, CEO of Letsur AI, have pointed to similar benefits.

Sawtelle said Databricks’ model serving allows the Barracuda team to manage, deploy, and monitor ML models within a single workflow. This lets ML engineers focus solely on producing the most effective models, without spending time keeping those models available and functional.

Meanwhile, Sim noted that the service’s rapid auto-scaling keeps costs low while allowing capacity to grow as traffic increases. “Our team now spends more time building models to solve customer problems, rather than debugging infrastructure issues,” he added.

The general availability of the service is the latest move by Databricks to provide businesses with everything they need to quickly and easily build models using the data stored in their lakehouse. The company has also launched industry-specific versions of its platform to better serve customers in sectors such as healthcare and compete strongly with players such as Snowflake and MongoDB.

Databricks has raised $3.5 billion in nine funding rounds. Its clients include such giants as AT&T, Columbia, Nasdaq, Grammarly, Rivian and Adobe.
