
Bud Ecosystem’s Single GenAI Stack For CPUs, GPUs, And Edge Devices




Running GenAI models without the need for costly GPUs or vendor lock-in? A new platform says it can deliver just that.

Bud Ecosystem was founded in early 2023 by Jithin VG and Ditto PS as a research lab focused on artificial intelligence and open source model development.

While there is no direct hardware component to their work, Jithin notes that it remains closely linked to electronics. One of the key challenges, he explains, is enabling large language models (LLMs) and generative AI (GenAI) models to run on small electronic devices or standard commodity CPUs, whether on servers, clients, or edge systems. Bud Ecosystem's software is designed to do exactly that, ensuring these models operate efficiently while maintaining high performance across diverse hardware.

Jithin V.G., Founder & CEO, Bud Ecosystem

When discussing the key challenges of deploying GenAI across diverse hardware, Jithin says, “A key challenge is that there was no common software layer to support different hardware platforms like Nvidia’s CUDA, AMD’s ROCm, and Intel’s systems. OpenAI’s Triton worked primarily with Nvidia, so we built much of the required infrastructure ourselves—particularly for CPUs—by creating computing, communication, and networking libraries from scratch. Even with this foundation, we needed to ensure AI models run efficiently in real-world situations, so we developed a simulation system to help optimise performance for internal development and end-user deployments.”
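
Bud Ecosystem has not published the internal libraries Jithin describes, but the role of such a common software layer can be illustrated in a few lines of Python. The sketch below is an assumption-laden example, not Bud's actual API: the Backend interface, the CpuBackend and CudaBackend classes, and the pick_backend helper are invented names, used only to show how one entry point could route the same model call to whichever hardware backend happens to be available.

# Illustrative sketch only; Bud Ecosystem's real libraries are not public.
# It shows, in generic Python, how a common layer might route a model call
# to whichever backend (CUDA, ROCm, plain CPU) the machine actually has.
from abc import ABC, abstractmethod


class Backend(ABC):
    """Minimal interface a hardware backend would have to implement."""

    @abstractmethod
    def is_available(self) -> bool: ...

    @abstractmethod
    def run(self, model: str, prompt: str) -> str: ...


class CudaBackend(Backend):  # hypothetical name, stands in for an Nvidia path
    def is_available(self) -> bool:
        return False  # pretend this machine has no Nvidia GPU

    def run(self, model: str, prompt: str) -> str:
        return f"[cuda] {model} <- {prompt!r}"


class CpuBackend(Backend):  # hypothetical name, stands in for a CPU path
    def is_available(self) -> bool:
        return True  # a commodity CPU is always present

    def run(self, model: str, prompt: str) -> str:
        return f"[cpu] {model} <- {prompt!r}"


def pick_backend(backends: list[Backend]) -> Backend:
    """Return the first backend that reports itself usable."""
    for backend in backends:
        if backend.is_available():
            return backend
    raise RuntimeError("no usable backend found")


if __name__ == "__main__":
    chosen = pick_backend([CudaBackend(), CpuBackend()])
    print(chosen.run("llama-3-8b", "hello from a commodity CPU"))

The application code at the bottom never names a device; it only asks for the model to run, which is the portability property the quote is pointing at.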

The startup is building a platform to minimise hardware dependency for GenAI infrastructure, a challenge that existing systems have yet to overcome. Stacks built on Nvidia's CUDA are locked into Nvidia hardware, and even deploying across different Nvidia chips can be difficult because of varying CUDA versions. Mixing hardware from different vendors such as Nvidia, AMD, and Intel, or combining GPUs, HPUs, and CPUs, is currently infeasible.

The team claims to be the only company addressing this challenge through heterogeneous parallelism and clustering, along with SLO-based management that abstracts hardware details. This approach means users no longer need to worry about the underlying hardware. It is similar to how operating systems once addressed mainframe limitations by offering hardware abstraction and portability. Today, GenAI faces a comparable issue: models developed for Nvidia platforms often cannot be deployed on alternative systems. The startup aims to address this by enabling AI applications to operate seamlessly across diverse hardware environments.
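
Bud's scheduler itself is not public, so the following snippet is only a hedged illustration of what SLO-based management could look like in principle: the Device dataclass, its estimated latency and cost fields, and the place() function are hypothetical names introduced for this example. The point it makes is that the caller declares a service-level objective, here a latency target, and the platform rather than the user decides which of the heterogeneous devices should serve the request.

# Illustrative sketch only; this is not Bud Ecosystem's scheduler.
# A caller states a latency SLO, and the platform picks the cheapest
# device in a mixed (GPU/HPU/CPU) pool that can still meet it.
from dataclasses import dataclass


@dataclass
class Device:
    name: str               # e.g. "nvidia-a100", "amd-mi300", "intel-xeon"
    est_latency_ms: float   # assumed per-request latency for the target model
    cost_per_hour: float    # assumed running cost


def place(devices: list[Device], slo_latency_ms: float) -> Device:
    """Pick the cheapest device that still meets the latency SLO."""
    eligible = [d for d in devices if d.est_latency_ms <= slo_latency_ms]
    if not eligible:
        raise RuntimeError("no device in the pool can meet the requested SLO")
    return min(eligible, key=lambda d: d.cost_per_hour)


if __name__ == "__main__":
    pool = [
        Device("nvidia-a100", est_latency_ms=40.0, cost_per_hour=3.0),
        Device("amd-mi300", est_latency_ms=55.0, cost_per_hour=2.2),
        Device("intel-xeon", est_latency_ms=180.0, cost_per_hour=0.6),
    ]
    # The user only declares the SLO; the hardware choice is abstracted away.
    print(place(pool, slo_latency_ms=100.0).name)  # prints "amd-mi300"

A real system would also need latency estimates per model and per device, which is presumably where the simulation system mentioned earlier comes in.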

Open-sourcing GenAI brings its own challenges. The startup needed a large pool of contributors but found them hard to source: in India, many developers use open source tools, yet few take part in building large-scale open source systems. For Bud Ecosystem, the challenge was therefore both technical and cultural, building complex systems while nurturing a stronger open source community in India.

Bud Ecosystem focuses on core AI research and model architectures tailored for India and similar markets. Their work targets models that consume less energy, require less computational power, and can run on edge devices. Rather than following the large data centre approach, they believe India can leverage billions of devices as a decentralised computing network. They believe innovation should precede growth, which will follow naturally when meaningful problems are addressed.


