Concept
Defined the vision for an AI-native cloud stack: compute, storage, knowledge, and agent infrastructure.
Brought up core platform foundations on our controller node in London: compute + storage, multi-tenant networking, security baseline, and deployment automation.
Designed a serverless vector database layer with tenant isolation, fast provisioning, and scalable indexing/search paths.
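The tenant-isolation idea can be sketched with a toy in-memory index: each tenant gets its own namespace, and searches never cross tenant boundaries. This is an illustrative sketch only; the class and method names (`TenantVectorIndex`, `upsert`, `search`) are hypothetical, not the platform's API.

```python
import math
from collections import defaultdict

class TenantVectorIndex:
    """Toy in-memory vector index: each tenant gets an isolated namespace."""

    def __init__(self):
        # tenant_id -> {doc_id: vector}; provisioning a tenant is just a dict entry
        self._stores = defaultdict(dict)

    def upsert(self, tenant_id, doc_id, vector):
        self._stores[tenant_id][doc_id] = vector

    def search(self, tenant_id, query, top_k=3):
        def cosine(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(x * x for x in b))
            return dot / (na * nb) if na and nb else 0.0

        # .get() never creates a store, so an unknown tenant sees nothing
        store = self._stores.get(tenant_id, {})
        ranked = sorted(store.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
        return [doc_id for doc_id, _ in ranked[:top_k]]
```

A real implementation would back each namespace with a persistent ANN index rather than brute-force cosine scan, but the isolation boundary is the same shape.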
Added centralized IAM after VBase to unify authentication/authorization across the stack and simplify tenant access control.
Built an ultra-fast, isolated function-as-a-service layer for events and jobs, with cold starts in seconds and automatic scaling.
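The event-to-function model can be sketched as a tiny router: functions register for an event type and the runtime dispatches each incoming event to its handler. Names here (`FunctionRuntime`, `register`, `invoke`, the `object.created` event) are hypothetical, not the platform's actual API.

```python
class FunctionRuntime:
    """Toy FaaS router: functions registered by event type, invoked per event."""

    def __init__(self):
        self._handlers = {}

    def register(self, event_type):
        """Decorator that binds a function to an event type."""
        def decorator(fn):
            self._handlers[event_type] = fn
            return fn
        return decorator

    def invoke(self, event):
        handler = self._handlers.get(event["type"])
        if handler is None:
            raise KeyError(f"no handler for {event['type']}")
        return handler(event.get("payload"))

runtime = FunctionRuntime()

@runtime.register("object.created")
def on_object_created(payload):
    # A real handler would run in an isolated sandbox with its own scaling pool.
    return f"indexed {payload['key']}"
```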
Added an orchestration layer to connect data sources, run workflows, and coordinate agents/tools across the platform.
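The workflow side of orchestration can be sketched as ordered steps threading a shared context, so each step can read earlier results. A minimal sketch; `run_workflow` and the three-step fetch/embed/notify flow are hypothetical examples, not the platform's workflow engine.

```python
def run_workflow(steps, context):
    """Minimal orchestrator sketch: run named steps in order over a shared context."""
    for name, fn in steps:
        # Each step sees everything produced so far and stores its own result.
        context[name] = fn(context)
    return context

# Hypothetical three-step workflow: fetch a document, embed it, notify downstream.
steps = [
    ("fetch", lambda ctx: {"doc": "raw text"}),
    ("embed", lambda ctx: [0.1, 0.2]),
    ("notify", lambda ctx: f"embedded {ctx['fetch']['doc']}"),
]
```

A production orchestrator would add retries, branching, and durable state, but the data flow between steps is the same.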
Introduced the first developer experience: a CLI, MCP, and SDKs.
Opened early access with docs, SDK examples, and onboarding for selected teams to validate product-market fit.
Augmented S3: every bucket is a knowledge instance. Hybrid semantic search built in, with vector, analytical, and graph pillars.
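"Hybrid" search typically blends a lexical score with a semantic (vector) score. A minimal sketch, assuming a simple linear blend; the function name, the `alpha` weighting, and the document shape are illustrative, not the product's query API.

```python
import math

def hybrid_search(docs, query_terms, query_vec, alpha=0.5, top_k=2):
    """Blend lexical overlap with cosine similarity (weighting is illustrative)."""

    def keyword_score(tokens):
        # Fraction of query terms present in the document.
        return sum(1 for t in query_terms if t in tokens) / max(len(query_terms), 1)

    def vector_score(vec):
        dot = sum(a * b for a, b in zip(query_vec, vec))
        nq = math.sqrt(sum(a * a for a in query_vec))
        nv = math.sqrt(sum(b * b for b in vec))
        return dot / (nq * nv) if nq and nv else 0.0

    # docs: {doc_id: (tokens, embedding)}
    scored = {
        doc_id: alpha * keyword_score(tokens) + (1 - alpha) * vector_score(vec)
        for doc_id, (tokens, vec) in docs.items()
    }
    return sorted(scored, key=scored.get, reverse=True)[:top_k]
```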
Your DODIL specialist agent, working alongside other agents.
Orchestrate agents with planning, context awareness, and multi-step reasoning capabilities.
Model Context Protocol: a secure API layer for agents to access knowledge and tools.
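MCP messages ride on JSON-RPC 2.0; a tool invocation can be sketched as the envelope below. The `tools/call` method and `name`/`arguments` params follow the protocol's published shape, but treat this builder function and its usage as an illustrative sketch, not this platform's client library.

```python
import json

def mcp_tool_call(request_id, tool_name, arguments):
    """Build an illustrative JSON-RPC 2.0 envelope in the style MCP uses for tool calls."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })
```

On the wire, the server would answer with a matching-`id` response carrying the tool's result or an error object.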
Managed tool execution with budgets, rate limits, and blast radius constraints for safe agent operations.
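The budget idea can be sketched as a guard that every tool call must pass through: it tracks call count and accumulated cost, and refuses the call once either limit is hit. `ToolBudget` and its fields are hypothetical names for illustration.

```python
class ToolBudget:
    """Guardrail sketch: per-agent call and cost budgets for tool execution."""

    def __init__(self, max_calls, max_cost):
        self.max_calls = max_calls
        self.max_cost = max_cost
        self.calls = 0
        self.cost = 0.0

    def charge(self, cost):
        """Reserve budget for one tool call, or raise if either limit is exceeded."""
        if self.calls + 1 > self.max_calls:
            raise RuntimeError("call budget exhausted")
        if self.cost + cost > self.max_cost:
            raise RuntimeError("cost budget exhausted")
        self.calls += 1
        self.cost += cost
```

Rate limits and blast-radius constraints would sit alongside this in the same chokepoint, so a misbehaving agent fails closed rather than running unbounded.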
Planned the intelligence layer: anomaly analysis, drift detection, and clustering on top of vector data.
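Drift detection over vector data can be sketched as comparing the centroid of recent embeddings to a baseline centroid and flagging when the distance crosses a threshold. The function names and the threshold value are illustrative assumptions, not the planned product's algorithm.

```python
import math

def centroid(vectors):
    """Mean vector of a batch of embeddings."""
    dims = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dims)]

def drift_detected(baseline, recent, threshold=0.5):
    """Flag drift when the recent centroid moves past `threshold` from baseline."""
    c_base, c_recent = centroid(baseline), centroid(recent)
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(c_base, c_recent)))
    return dist > threshold
```

Anomaly analysis and clustering would build on the same primitives: distances between embeddings and their aggregates.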
Planned capacity and reliability scaling: larger pools, better isolation, more regions, and production-grade observability.
Planned transformation pipelines to normalize, extract, and enrich data before embedding and indexing.
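The normalize/extract/enrich sequence can be sketched as composable steps applied before a document reaches the embedding stage. The step functions here are hypothetical stand-ins for whatever real transforms the pipelines would run.

```python
def normalize(doc):
    """Collapse whitespace and lowercase the raw text."""
    doc["text"] = " ".join(doc["text"].split()).lower()
    return doc

def extract(doc):
    """Pull structure out of the normalized text (here: a token list)."""
    doc["tokens"] = doc["text"].split()
    return doc

def enrich(doc):
    """Attach derived metadata useful at indexing time."""
    doc["length"] = len(doc["tokens"])
    return doc

def run_pipeline(doc, steps=(normalize, extract, enrich)):
    """Apply transformation steps in order; the result is ready for embedding."""
    for step in steps:
        doc = step(doc)
    return doc
```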