Technology

Databricks Apps: a toolkit that simplifies AI development

Databricks on Tuesday unveiled Databricks Apps, a suite of features that aims to make it easier for users to build customized data and AI applications.

Databricks already provides Mosaic AI, an environment that enables customers to integrate systems such as large language models (LLMs) with their enterprise’s proprietary data. Missing, however, were the capabilities to develop the actual interactive applications, such as generative AI chatbots, that are powered by AI systems combined with proprietary data.

Databricks Apps adds the ability to develop applications on top of the tools previously available in Mosaic AI, enabling developers to execute the entire development and deployment process within the secure Databricks environment.

Because Databricks Apps extends what users could do with Mosaic AI and lets them develop applications without requiring third-party platforms, the new set of tools is significant, according to Donald Farmer, founder and principal of TreeHive Strategy.

“Very interesting news from Databricks,” he said. “The new Databricks Apps features take away some bothersome obstacles such as the need to spin up separate infrastructure for development and deployment. Because they can now deploy and manage apps directly in Databricks, this should be considerably easier.”

Kevin Petrie, an analyst at BARC U.S., likewise noted that Databricks Apps is an important addition for the vendor’s customers given that it adds to what they were previously able to develop with Mosaic AI.

“Companies cannot differentiate themselves competitively by simply implementing AI/ML models,” he said. “Rather, they must create differentiated applications that capitalize on their unique datasets. Databricks Apps help AI adopters take that critical step.”

Based in San Francisco, Databricks is a data platform vendor that, when it was founded in 2013, was one of the pioneers of the data lakehouse storage format, which combines the structured data storage capabilities of data warehouses with the unstructured data storage capabilities of data lakes.

Over the past two years, the vendor has made AI a focal point, expanding its platform to include an environment for deploying and managing traditional AI, generative AI and machine learning applications.

Databricks’ June 2023 acquisition of MosaicML for $1.3 billion was a key part of creating that environment, with MosaicML’s technology now serving as the foundation for Databricks’ AI and machine learning capabilities. Subsequent acquisitions and product development initiatives, including the launch of DBRX, Databricks’ own large language model, have added functionality.

Now, Databricks Apps — available in public preview on AWS and Azure — further advances the vendor’s AI development capabilities.

New capabilities

Fueled by the potential of generative AI to aid data management and analytics, enterprise interest in AI is surging.

One of generative AI’s promises is that it can enable true natural language processing (NLP), which lets non-technical workers use analytics to inform decisions. Another is that it can be used to generate code and automate processes, making data experts more efficient.

However, developing generative AI applications, including chatbots that let users query and analyze data and tools that use machine learning to take on tasks such as monitoring for data quality, is not easy.

Databricks Apps is designed to simplify application development by enabling developers to do all their work in the secure Databricks environment while still providing them with choices as they build data and AI applications.

Before Databricks Apps, Databricks customers had to use platforms from third-party vendors to complete the development of generative AI chatbots, AI-powered analytics applications and other intelligent capabilities.

However, mixing proprietary data, AI systems such as LLMs and third-party development platforms risked accidental data breaches. In addition, it was expensive.

Part of what makes developing data and AI applications difficult, risky and expensive is all the movement they require. Relevant data needs to be discovered and moved out of a data storage platform to train an application. The application needs to be developed in an integrated development environment or other data science platform. And then the application needs to be moved to its host environment for deployment and management.

Databricks Apps eliminates the need for that labor-intensive, expensive and risky movement.

Instead, it enables developers to build applications natively in Databricks using development frameworks including Dash, Flask, Gradio, Shiny and Streamlit. In addition, it comes with prebuilt Python templates designed to speed up the development process.
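
As a rough illustration of the kind of app these frameworks support, below is a minimal Streamlit sketch of a data-exploration page; the app, its sample data and its column names are hypothetical placeholders rather than anything Databricks has published.

```python
# Minimal, hypothetical Streamlit sketch of a data app of the sort Databricks Apps can host.
# The sample data and column names are placeholders, not a Databricks template.
import pandas as pd
import streamlit as st

st.title("Regional revenue explorer")

# Simple UI control rendered by the hosted app.
region = st.selectbox("Region", ["AMER", "EMEA", "APAC"])

# Placeholder data; a real app would query governed lakehouse tables instead.
data = pd.DataFrame(
    {"region": ["AMER", "EMEA", "APAC"], "revenue_musd": [3.4, 1.2, 2.1]}
)

st.dataframe(data[data["region"] == region])
```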

However, if developers prefer to work in an integrated development environment such as Visual Studio Code or PyCharm, Databricks Apps supports that as well.

Following development, Databricks Apps eliminates the need to build infrastructure for deploying and running applications by running them on automatically allotted serverless compute within Databricks, according to the vendor. Management, meanwhile, includes security measures and governance capabilities such as access control and data lineage, accessed through Unity Catalog.
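
For example, an app’s backend might read a Unity Catalog-governed table through the open source databricks-sql-connector package, with access control enforced by the catalog. A minimal sketch follows, in which the hostname, warehouse HTTP path, token and table name are hypothetical placeholders supplied via environment variables.

```python
# Hypothetical sketch: querying a Unity Catalog-governed table from an app backend
# using the databricks-sql-connector package. Connection details and the table name
# are placeholders, not values from the article.
import os

from databricks import sql

with sql.connect(
    server_hostname=os.environ["DATABRICKS_HOST"],   # e.g. a workspace hostname
    http_path=os.environ["DATABRICKS_HTTP_PATH"],    # SQL warehouse HTTP path
    access_token=os.environ["DATABRICKS_TOKEN"],     # or workspace-managed credentials
) as connection:
    with connection.cursor() as cursor:
        # Unity Catalog governs who can read this table and records lineage.
        cursor.execute("SELECT region, revenue_musd FROM main.sales.daily_revenue LIMIT 10")
        for row in cursor.fetchall():
            print(row)
```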

“There are some features here which are potentially very impactful,” Farmer said. “For example, the support for popular developer frameworks, which enables application developers to work with familiar tools of choice, expands the Databricks ecosystem to a new market of application developers.” 

In addition, eliminating the need to develop infrastructures for managing applications is noteworthy, he continued.

“The automatic provisioning of serverless compute will be significant because it enables developers to focus on building applications rather than the complex process of deploying a data architecture, which was a barrier to those who were not data specialists,” he said.

From a competitive standpoint, Databricks’ aggressive development of an environment for building, deploying and managing AI and machine learning tools over the past couple of years has differentiated it from other data platform vendors, Farmer said.

While AWS, Google Cloud, Microsoft and Snowflake have all similarly made AI a focal point of their product development, their tools for developing and managing AI models and applications are not as integrated as what Databricks has built, he continued. Databricks Apps furthers the separation between Databricks and its peers.

“Snowflake has been catching up or at least playing catch-up, but this continuous development from Databricks is very compelling,” Farmer said. “Microsoft Fabric, of course, is aiming to be a unified platform similar to Databricks, but it’s still an inferior product. Google Cloud Platform and AWS have a wide range of AI and ML services, but they’re not so deeply integrated across the platform.”

Despite the additive capabilities of Databricks Apps, Petrie cautioned that the applications customers will be able to develop, generative AI applications in particular, will not suddenly enable anyone within an organization to freely work with data.

While Databricks aims to help enterprises broaden the reach of data and AI beyond a small audience of users, training and expertise are still required to use data and AI to inform decisions and take actions based on those decisions.

“Like many vendors, Databricks aims to ‘democratize’ the consumption of data, analytics and AI,” Petrie said. “But I think users of these applications will still require significant expertise in data, AI and the business domain, depending on the use cases involved.”

While Databricks Apps extends what customers could do with Mosaic AI and demonstrates that Databricks is continuing to focus on improving its AI and machine learning development environment, the impetus for the new features came from customer feedback, according to Shanku Niyogi, the vendor’s vice president of product management.

Developing and deploying internal applications has always been complex, he noted. But with enterprise interest in AI rapidly increasing, there is greater need for vendors such as Databricks to simplify developing and deploying AI applications.

“Customers … have shared that building and deploying internal data apps has historically been a complex and time-consuming process,” Niyogi said. “They specifically asked for easier ways to test new features while maintaining a secure environment. With the explosion of AI, this need has only grown.”

Looking ahead

Databricks Apps does not end Databricks’ focus on enabling application development and deployment, according to Niyogi.

The vendor’s goal is to make data and AI available to a broad audience of users, he noted. Toward that end, Databricks plans to invest in developing new Mosaic AI features as well as adding other capabilities through partnerships.

“Databricks will continue to make AI more accessible for organizations,” Niyogi said. “This includes further ways to simplify the app development process, new Mosaic AI capabilities that help teams build, deploy and measure compound AI systems, and a continued investment in a collaborative AI partner ecosystem.”

Farmer, meanwhile, said that Databricks’ focus on improving AI and machine learning workflows is appropriate. In particular, he suggested that the vendor enhance its support for developing applications for non-technical users as well as emerging AI technologies such as multimodal models.

“Multimodal will become critical over the next couple of years,” Farmer said. “I think we should also see more development for non-technical users. This release includes a first attempt at that, and no doubt this is the start of a new direction for Databricks and a very welcome one at that.”

Eric Avidon is a senior news writer for TechTarget Editorial and a journalist with more than 25 years of experience. He covers analytics and data management.

Technology

Fei-Fei Li picks Google Cloud, where she led AI, as World Labs’ main compute provider

Cloud providers are chasing after AI unicorns, and the latest is Fei-Fei Li’s World Labs. The startup just tapped Google Cloud as its primary compute provider to train AI models, a deal that could be worth hundreds of millions of dollars. But Li’s tenure as chief scientist of AI at Google Cloud wasn’t a factor, the company says.

During the Google Cloud Startup Summit on Tuesday, the companies announced World Labs will use a large chunk of its funding to license GPU servers on the Google Cloud Platform, and ultimately train “spatially intelligent” AI models.

A handful of well-funded startups building AI foundation models are highly sought after in the cloud services world. Some of the largest deals include OpenAI, which exclusively trains and runs AI models on Microsoft Azure, and Anthropic, which uses AWS and Google Cloud. These companies regularly pay millions of dollars for computing services, and could one day need even more as their AI models scale. That makes them valuable customers for Google, Microsoft, and AWS to build relationships with early on.

World Labs is certainly building unique, multimodal AI models with significant compute needs. The startup just raised $230 million at more than a billion-dollar valuation, a deal led by A16Z, in order to build AI world models. James Lee, general manager of startups and AI at Google Cloud, tells TechCrunch that World Labs’ AI models will one day be able to process, generate, and interact with video and geospatial data. World Labs calls these AI models “spatial intelligence.”

Li has deep ties with Google Cloud, having led the company’s AI efforts in 2018. However, Google denies that this deal is a product of that relationship, and rejects the idea that cloud services are just commodities. Instead, Lee said services such as its High Performance Toolkit for scaling AI workloads, along with its deep supply of AI chips, were a larger factor.

“Fei-Fei is obviously a friend of GCP,” said Lee in an interview. “GCP wasn’t the only option they looked at. But for all the reasons we talked about – our AI optimized infrastructure and the ability to meet their scalability needs – ultimately they came to us.”

Google Cloud offers AI startups a choice between its proprietary AI chips, tensor processing units (TPUs), and Nvidia’s GPUs, which Google purchases and has a more limited supply of. Google Cloud is trying to get more startups to train AI models on TPUs, largely as a means to reduce its dependency on Nvidia. All cloud providers are limited today by the scarcity of Nvidia GPUs, so many are building their own AI chips to meet demand. Google Cloud says some startups are training and inferencing solely on TPUs; however, GPUs still remain the industry’s favorite AI training chip.

World Labs chose to train its AI models on GPUs in this deal. However, Google Cloud wouldn’t say what went into that decision.

“We worked with Fei-Fei and her product team, and at this stage of their product roadmap, it made more sense for them to work with us on the GPU platform,” said Lee in an interview. “But it doesn’t necessarily mean it’s a permanent decision… Sometimes [startups] move onto different platforms, such as TPUs.”

Lee would not disclose how large World Labs’ GPU cluster is, but cloud providers often dedicate massive supercomputers for startups training AI models. Google Cloud promised another startup training AI foundation models, Magic, a cluster with “tens of thousands of Blackwell GPUs,” each of which has more power than a high-end gaming PC.

These clusters are easier to promise than they are to fulfill. Google’s cloud services competitor Microsoft is reportedly struggling to meet the insane compute demands of OpenAI, forcing the startup to tap other options for computing power.

World Labs’ deal with Google Cloud is not exclusive, meaning the startup may still strike deals with other cloud providers. However, Google Cloud says it will get a majority of World Labs’ business moving forward.

Servers computers

Tripp Lite 8U/12U/22U Expandable Wall-Mount 2-Post Open Frame Rack, Adjustable Network Equ

Configures to 8U, 12U or 22U of rack space. Stores 19 in. rack equipment up to 18 in. deep. Maximum load capacity of 150 lb. Allows easy access to equipment and cabling. Simple to assemble and mount. 5-Year Limited Warranty

Get it here: https://amzn.to/2MFGKFU

Technology

The best October Prime Day deals you can get for under $100

If you’re looking to maximize your budget and score a bigger haul for your money during October’s Amazon Prime Day sale, you’ll be happy to know that there are many great deals you can find on tech and other gear for well under $100. Some of the inexpensive gadgets we like under that threshold are discounted even further, while others normally in the triple digits have snuck into double-digit territory, some for the first time.

Servers computers

SERVER RACK SETUP

Technology

Here come the ads in your Google AI Overviews

Google promised advertisers that its AI plans would not bar them from reaching potential customers, and the tech giant has delivered with the addition of advertising to the AI Overviews feature in Google Search. Whether that makes the Gemini AI-written summaries of your search queries more appealing is debatable. AI Overviews, which was hyped at Google I/O this year, means you may not need to click on a link to get an answer to a question. Companies that rely on Google’s search engine to promote their websites were leery of a tool that seemed to make sponsoring results worthless.

Google anticipated that reaction and promised ads would be an element of AI Overviews. After months of testing, the ads are rolling out. Essentially, you’ll see products mentioned and linked to in the text written by the AI. For now, it’s just going to be U.S. mobile users who see the ads, but they will likely expand quickly, assuming the tests have worked out most of the bugs. 

Technology

AIs can work together in much larger groups than humans ever could

Copies of the same artificial intelligence model can work together

Eugene Mymrin/Getty Images

We can struggle to maintain working relationships when our social group grows too large, but it seems that artificial intelligence models may not face the same limitation, suggesting thousands of AIs could work together to solve problems that humans can’t.

The idea that there is a fundamental limit on how many people we can interact with dates back to the 1990s, when anthropologist Robin Dunbar noticed a link between the size of a primate’s brain and the typical size of its social group. Extrapolating to humans,…
