Gibran Raises $2.6M to Build Scale-Free AI That Evolves With Humans Rather Than Replacing Them

July 14, 2025

Gibran, a London-registered AI research company, raised $2.6 million in seed funding in July 2025 from Together Fund — the early-stage venture fund co-founded by Freshworks’ Girish Mathrubootham and Eka Software’s Manav Garg — and Mercuri. The capital will fund expansion of the research team, development of the core platform, and early applications in scientific domains, with initial R&D outputs targeted for the second half of 2025. Gibran was founded in 2025 by Govind Balakrishnan, Srikant Chakravarti, Suzanne Sadedin, and Edgar Duéñez-Guzmán.

The dominant paradigm in large language model development has been scale: bigger models, trained on more data, with more compute, producing progressively more capable systems. This approach has delivered remarkable results across a wide range of tasks. But it has inherent limitations in domains where large labelled datasets simply do not exist — drug discovery, theoretical science, creative generation, personalised education — and it produces systems that are fundamentally static: once trained, they do not continue to learn and adapt from ongoing interactions with users. The result is AI that requires periodic retraining to stay current, cannot incorporate the contextual learning that emerges from real-world use, and is designed to automate tasks rather than to genuinely collaborate with the humans using it.

Gibran’s founding thesis is that AI development needs a different model for these domains: one that draws on principles from evolutionary biology and complex adaptive systems rather than on the brute-force statistical learning that has defined the LLM era. The company is building what it calls “scale-free” AI — models that evolve and adapt through dialogue and contextual interaction with people, rather than through training on large static datasets. These systems are designed to generate genuinely novel combinations of existing knowledge — analogous to the way biological evolution combines and recombines genetic material to produce new adaptations — rather than simply interpolating between patterns observed in training data. This makes them potentially more useful in exactly the domains where conventional LLMs struggle: domains that require genuine scientific creativity, hypothesis generation from first principles, or personalised adaptation to an individual’s specific knowledge state and learning needs.

The founding team brings unusual breadth. Balakrishnan and Chakravarti previously co-founded Curio, an AI-powered audio journalism platform that operated for eight years before closing in January 2025. Sadedin is an evolutionary biologist who uses AI and simulation to study complex cognition. Duéñez-Guzmán is a generative systems researcher. This combination of practical product-building experience, deep evolutionary theory, and computational research is rare in the AI startup landscape and reflects the genuinely interdisciplinary foundations of Gibran’s approach.

Together Fund, led by founders who built multi-billion dollar enterprises in Freshworks and Eka Software, brings not just capital but strategic experience in building enterprise software platforms from zero to global scale. The initial focus on drug discovery and scientific research reflects both the magnitude of the opportunity — AI-accelerated drug discovery is one of the most significant commercial and social opportunities in the technology sector — and the structural advantage of Gibran’s approach in data-sparse research domains.