FedML raises $11.5M to combine MLOps tools with a decentralized AI compute network | TechCrunch

News Summary
- Collaborators can add devices for AI model training, such as servers or even mobile devices, and track training progress in real time (see the illustrative sketch after this list). Recently, FedML rolled out FedLLM, a training pipeline for building “domain-specific” large language models (LLMs) à la OpenAI’s GPT-4 on proprietary data.
- “And thanks to its foundation of federated learning technology, its MLOps platform and collaborative AI tools that help developers train, serve and observe the custom models, building custom alternatives is an accessible best practice.”
- Avestimehr claims that the platform is being used by more than 3,000 users globally and has run over 8,500 training jobs across more than 10,000 devices. “For the data or technical decision-maker, FedML makes custom, affordable AI and large language models a reality,” Avestimehr said.
- Nevertheless, Avestimehr believes FedML can achieve greater reach and success by combining this compute paradigm with an MLOps suite. “FedML enables custom AI models by empowering developers and enterprises to build large-scale, proprietary and private LLMs at less cost,” Avestimehr said.
- “Unfortunately, custom AI models are prohibitively expensive to build and maintain due to high data, cloud infrastructure and engineering costs.”
- But FedML has ambitions beyond developing AI and machine learning model tooling. The way Avestimehr tells it, the goal is to build a “community” of CPU and GPU resources to host and serve models once they’re ready for deployment.
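
The “federated learning technology” and device collaboration mentioned above refer, in general terms, to schemes like federated averaging (FedAvg), in which each participating device trains on its own private data and shares only model updates with a coordinator. Below is a minimal, hypothetical Python sketch of that general idea; it is not FedML’s actual code or API, and the toy model, devices and data are invented purely for illustration.

```python
# Illustrative sketch only: a toy federated-averaging (FedAvg) loop, the general
# kind of collaborative training FedML's platform is built around. This is NOT
# FedML's actual API; the model, devices, and data here are made up.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, features, labels, lr=0.1, epochs=5):
    """Run a few gradient steps of linear regression on one device's private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = features.T @ (features @ w - labels) / len(labels)
        w -= lr * grad
    return w

# Each simulated "device" (e.g. a server or phone) holds data that never leaves it.
true_w = np.array([2.0, -1.0])
devices = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    devices.append((X, y))

global_w = np.zeros(2)
for round_idx in range(10):
    # Each device trains locally; only weight updates are shared, never raw data.
    local_weights = [local_update(global_w, X, y) for X, y in devices]
    # The coordinator averages the updates (equal weighting here) into a new global model.
    global_w = np.mean(local_weights, axis=0)
    print(f"round {round_idx}: global weights = {np.round(global_w, 3)}")
```

The property this toy loop illustrates, and the basis of the privacy and cost pitch in the quotes above, is that raw data stays on each device while only aggregated model weights move between participants.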
Interest in AI among the enterprise continues to rise, with one recent survey finding that nearly two-thirds of companies plan to increase or maintain their spending on AI and machine learning into t [+4415 chars]