Concept
Defined the vision for an AI-native cloud stack: compute, storage, knowledge, and agent infrastructure.
Brought up core platform foundations on our controller node in London: compute + storage, multi-tenant networking, security baseline, and deployment automation.
Designed a serverless vector database layer with tenant isolation, fast provisioning, and scalable indexing/search paths.
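The isolation and search path above can be sketched as a tiny in-memory index partitioned per tenant. This is illustrative only; the class and method names are assumptions, not the actual VBase API, and a real index would use ANN structures rather than a flat scan.

```python
import math

class TenantVectorIndex:
    """Sketch of tenant-isolated vector search: each tenant gets its own
    partition, and a query can never cross partitions."""

    def __init__(self):
        # One flat in-memory list per tenant; real deployments use ANN indexes.
        self._tenants: dict[str, list[tuple[str, list[float]]]] = {}

    def upsert(self, tenant: str, doc_id: str, vector: list[float]) -> None:
        self._tenants.setdefault(tenant, []).append((doc_id, vector))

    @staticmethod
    def _cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    def search(self, tenant: str, query: list[float], k: int = 3) -> list[str]:
        # Isolation boundary: only this tenant's partition is ever visible.
        rows = self._tenants.get(tenant, [])
        ranked = sorted(rows, key=lambda r: self._cosine(r[1], query), reverse=True)
        return [doc_id for doc_id, _ in ranked[:k]]
```

An unknown tenant simply gets an empty result set, which keeps the fast-provisioning path trivial: a new tenant's partition exists as soon as its first upsert lands.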
Added centralized IAM after VBase to unify authentication/authorization across the stack and simplify tenant access control.
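The tenant-scoped access control can be sketched as signed tokens that bind a tenant to a set of scopes. Everything here (the shared key, the scope names, the token format) is a hypothetical stand-in for the centralized IAM service, not its real interface.

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # assumption: a shared signing key, for the sketch only

def issue_token(tenant: str, scopes: list[str]) -> str:
    """Mint a signed token binding a tenant to its allowed scopes."""
    payload = json.dumps({"tenant": tenant, "scopes": scopes}, sort_keys=True).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def authorize(token: str, tenant: str, scope: str) -> bool:
    """Verify the signature first, then check the tenant and scope claims."""
    try:
        body, sig = token.rsplit(".", 1)
        payload = base64.urlsafe_b64decode(body.encode())
    except Exception:
        return False
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    claims = json.loads(payload)
    return claims["tenant"] == tenant and scope in claims["scopes"]
```

The point of centralizing this is that every service (VBase, storage, functions) verifies the same token the same way, instead of each keeping its own access lists.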
Built multimodal ingestion pipelines: parse → chunk → enrich → embed, streaming-friendly with progress and artifacts.
Built ultra-fast, isolated function-as-a-service for events and jobs. Cold starts in milliseconds with automatic scaling.
Kubernetes clusters as a service. Shared or dedicated, with autoscaling and managed control planes.
Add an orchestration layer to connect data sources, run workflows, and coordinate agents/tools across the platform.
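At its core, coordinating workflows and agents reduces to running steps in dependency order. The sketch below is a minimal topological-order executor under that assumption; the function and step names are illustrative, not the orchestration layer's actual API.

```python
from collections import deque
from typing import Callable

def run_workflow(steps: dict[str, Callable[[], None]],
                 deps: dict[str, list[str]]) -> list[str]:
    """Execute each step once all of its dependencies have run."""
    indegree = {name: len(deps.get(name, [])) for name in steps}
    dependents: dict[str, list[str]] = {name: [] for name in steps}
    for name, reqs in deps.items():
        for r in reqs:
            dependents[r].append(name)
    ready = deque(sorted(n for n, d in indegree.items() if d == 0))
    order: list[str] = []
    while ready:
        name = ready.popleft()
        steps[name]()  # run the step: a connector, workflow task, or agent call
        order.append(name)
        for nxt in dependents[name]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(steps):
        raise ValueError("cycle detected in workflow dependencies")
    return order
```

Steps with no path between them could equally run in parallel; the executor above serializes them only for simplicity.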
Introduced the first developer experience: a CLI, an MCP server, and SDKs.
Opened early access with docs, SDK examples, and onboarding for selected teams to validate product-market fit.
S3-compatible object storage for artifacts, datasets, and runtime caches. Durable and globally accessible.
Your DODIL specialist agent, collaborating with other agents.
Orchestrate agents with planning, context awareness, and multi-step reasoning capabilities.
Model Context Protocol — secure API layer for agents to access knowledge and tools.
Managed tool execution with budgets, rate limits, and blast radius constraints for safe agent operations.
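A budget-and-rate-limit guard around tool calls can be sketched as below. The class name, limits, and cost units are assumptions for illustration, not the platform's real interface.

```python
import time

class ToolBudget:
    """Gate tool execution behind a spend budget and a sliding-window
    rate limit, bounding an agent's blast radius."""

    def __init__(self, max_cost: float, max_calls_per_window: int,
                 window_s: float = 60.0):
        self.remaining = max_cost
        self.max_calls = max_calls_per_window
        self.window_s = window_s
        self._calls: list[float] = []

    def execute(self, tool, cost: float, *args, **kwargs):
        now = time.monotonic()
        # Rate limit: keep only timestamps inside the sliding window.
        self._calls = [t for t in self._calls if now - t < self.window_s]
        if len(self._calls) >= self.max_calls:
            raise RuntimeError("rate limit exceeded")
        # Budget: refuse any call that would overspend.
        if cost > self.remaining:
            raise RuntimeError("budget exhausted")
        self._calls.append(now)
        self.remaining -= cost
        return tool(*args, **kwargs)
```

Checking both constraints before invoking the tool means a denied call has no side effects, which is the property that makes the guard safe to enforce centrally.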
Planned the intelligence layer: anomaly analysis, drift detection, and clustering on top of vector data.
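Since this layer is still planned, here is only a minimal sketch of the two simplest primitives it implies: scoring anomalies by distance from a baseline centroid, and flagging drift when a new batch's centroid moves too far. Thresholds and function names are assumptions.

```python
import math

def centroid(vectors: list[list[float]]) -> list[float]:
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def dist(a: list[float], b: list[float]) -> float:
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def anomaly_scores(baseline: list[list[float]],
                   batch: list[list[float]]) -> list[float]:
    # Score each new vector by its distance from the baseline centroid;
    # far-out vectors are candidate anomalies.
    c = centroid(baseline)
    return [dist(v, c) for v in batch]

def drift_detected(baseline: list[list[float]], batch: list[list[float]],
                   threshold: float) -> bool:
    # Drift check: has the batch centroid moved beyond the threshold?
    return dist(centroid(baseline), centroid(batch)) > threshold
```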
Scale capacity and reliability: larger pools, better isolation, more regions, and production-grade observability.
Planned transformation pipelines to normalize, extract, and enrich data before embedding and indexing.