
AI startup Hugging Face attracted funding valuing it at $4.5 bn

Open-source AI is key

According to a HubSpot poll, 43% of business leaders say they plan to increase their investment in AI and automation tools over the course of 2023.

Artificial intelligence (AI) startups are emerging everywhere as hopeful competitors to Microsoft-backed OpenAI.

But one in particular has been a magnet for huge investments: Hugging Face.

Salesforce Ventures has led a $235 million financing round in Hugging Face, the New York- and Paris-based open-source AI platform that helps businesses use AI, bringing the startup’s valuation to $4.5 billion. By comparison, OpenAI is valued at $11.3 billion. The round more than doubles the company’s previous share price, valuing it at over 100 times its annualized revenue.

Hugging Face’s total funding now stands at $395 million.


Who is Hugging Face?

Hugging Face runs a service that simplifies how companies store and use AI software, similar to the way GitHub lets developers store software code.

AWS, Google, Nvidia, IBM, Intel, AMD, Qualcomm, and Sound Ventures participated in the funding round and signed partnerships with the company.

Hugging Face provides MLOps tooling, the set of tools essential for taking AI models from development to production. It is an open-source and open-science artificial intelligence platform where developers share models, data, and code. The platform hosts 500,000 models, 250,000 datasets, and over 1 million code repositories.
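For a sense of what that sharing looks like in practice, here is a minimal sketch of pulling a community-hosted model and a public dataset from the Hub. It assumes the transformers and datasets libraries are installed, and the model and dataset IDs used are public examples chosen purely for illustration, not anything specific to this story.

# Minimal sketch: pulling shared assets from the Hugging Face Hub.
# Assumes `pip install transformers datasets`; the model and dataset IDs
# below are public examples used only for illustration.
from transformers import pipeline
from datasets import load_dataset

# Download a community-hosted sentiment-analysis model from the Hub.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# Load a small slice of a public dataset hosted on the same platform.
reviews = load_dataset("imdb", split="test[:3]")

for review in reviews:
    # Truncate to the first 512 characters to keep inputs short.
    print(classifier(review["text"][:512]))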

The MLOps market is estimated to reach $16.6 billion by 2030.

With 170 employees, the company boasts more than 10,000 customers and works with over 50,000 organizations in the AI domain.

Beyond MLOps

In 2021, Hugging Face launched the BigScience project behind BLOOM, an LLM aiming to rival OpenAI’s GPT-3.

Hugging Face has since collaborated with ServiceNow, the enterprise software company, to release a free code-generating AI model called StarCoder. It recently followed up with a companion offering called SafeCoder.

Hugging Face recently worked with Nvidia, now a $1 trillion company, to expand access to cloud computing via Nvidia’s DGX platform.

The company also partnered with Amazon to extend its products to AWS customers and to develop the next generation of BLOOM on AWS.

Finally, Hugging Face worked with Microsoft on a way to turn its AI models into scalable production solutions hosted on Azure.

Paid functionalities

The company’s paid functionalities include AutoTrain, which automates the work of training AI models; the Inference API, which lets developers host models without managing the underlying infrastructure; and Infinity, which increases the speed at which an in-production model processes data.
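As a rough illustration of how the hosted Inference API is typically called, the sketch below sends a text payload to a model endpoint over HTTPS. The access token and the sentiment-analysis model ID are placeholders standing in for a user’s own choices, not details drawn from the article.

# Minimal sketch: calling the hosted Inference API over HTTPS.
# The access token and model ID are placeholders; any public model
# hosted on the Hub can be queried the same way.
import requests

API_URL = (
    "https://api-inference.huggingface.co/models/"
    "distilbert-base-uncased-finetuned-sst-2-english"
)
HEADERS = {"Authorization": "Bearer hf_your_token_here"}

def query(payload: dict) -> list:
    """Send a JSON payload to the model endpoint and return the parsed response."""
    response = requests.post(API_URL, headers=HEADERS, json=payload)
    response.raise_for_status()
    return response.json()

print(query({"inputs": "Hugging Face just raised a $235 million round."}))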

Open-source AI

Open-source AI is an alternative development paradigm to closed-source commercial AI, which the market often describes as a black box.

While many companies keep their proprietary AI code under wraps to maximize profitability, backers of open-source AI are keen to promote transparency. They encourage community-driven enhancements, since anyone can inspect, modify, and distribute the underlying frameworks.

Meta is releasing open-source versions of AI models such as Llama 2, its 70-billion-parameter LLM trained on two trillion tokens of raw text, hosted by, who else, Hugging Face. Stability AI’s Stable Diffusion model is another example of an open-source model hosted on the platform.
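As one example of what hosting such openly released models enables, the short sketch below loads the public Stable Diffusion checkpoint straight from Hugging Face using the diffusers library. It assumes PyTorch, a CUDA-capable GPU, and the model ID shown, which is simply the publicly hosted checkpoint used for illustration.

# Minimal sketch: running an openly released model pulled from Hugging Face.
# Assumes `pip install diffusers torch` and a CUDA-capable GPU; the model ID
# is the public Stable Diffusion checkpoint, used here only as an example.
import torch
from diffusers import StableDiffusionPipeline

# Download the open-source weights from the Hub and load them into a pipeline.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# Generate an image from a text prompt and save it locally.
image = pipe("an astronaut riding a horse on the moon").images[0]
image.save("astronaut.png")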

IBM has also contributed over 200 open models and datasets to Hugging Face, including its recent geospatial foundation model built with NASA.

