Tag Archives: Cloud Computing

Customer Centricity Shapes Your Platform Architecture

This week’s blog might be a little controversial, but bear with me and it will become clearer. When we discuss customer centricity, it often feels like the domain of marketing, sales, or support. In reality, though, customer centricity directly shapes software architecture, especially in a world where the cloud is the primary delivery model for software.

Too often, companies think of customer acquisition as a funnel: wide at the top, narrowing down to a sale. That’s a mistake. A better metaphor is an hourglass: acquiring a customer is just the midpoint. Retention, expansion, and deepening of customer value are just as critical.

Whether your customers are individuals or organizations, their needs always revolve around three key factors:

  1. Keep me safe (minimize risk)
  2. Save me money (minimize cost)
  3. Make me thrive (increase profits, stature, or viability)

To deliver on these goals, you cannot separate your platform’s architecture from customer obsession. Below, I’ll outline key architectural principles every product leader should consider, each anchored in customer value.

1. Serviceful, Loosely Coupled Platforms

Favor serviceful platforms over brittle monoliths. This does not imply pursuing microservices without a clear purpose. Instead, ensure domain boundaries are respected, APIs expose logic and data, and refactoring happens in manageable chunks. This improves gross margins while reducing future drag.

2. Feedback Early, Iteration Always

Big upfront designs often fail under real-world complexity. Instead, build the thinnest viable platform, simple and evolving in response to usage. Internal developer platforms reduce cognitive load and accelerate iteration, creating consistent, curated developer experiences.

3. Asynchronous > Synchronous

Humans expect instant feedback, but platforms need scalability. Asynchronous integrations allow systems to react to events at scale, often uncovering new proactive patterns along the way.
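The steps above can be sketched minimally: a producer publishes events and returns immediately, while a background consumer reacts at its own pace. This is a toy in-process illustration using Python’s standard library (a real platform would use a message broker), with hypothetical names like `order-1`:

```python
import queue
import threading

events = queue.Queue()
processed = []

def worker():
    # React to events as they arrive, decoupled from the producer.
    while True:
        event = events.get()
        if event is None:  # sentinel tells the worker to stop
            break
        processed.append(f"handled:{event}")
        events.task_done()

t = threading.Thread(target=worker)
t.start()

# The producer returns immediately after publishing; the actual
# work happens asynchronously in the background.
for order_id in ("order-1", "order-2", "order-3"):
    events.put(order_id)

events.put(None)
t.join()
print(processed)  # ['handled:order-1', 'handled:order-2', 'handled:order-3']
```

The same shape scales out: swap the in-memory queue for a durable broker and the event log itself becomes a place to discover new proactive patterns.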

4. Eliminate, Don’t Just Reengineer

As Elon Musk says, the first principle of design is elimination. Too many teams polish legacy components long past their expiration. Customer obsession means removing friction, even entire features, when they no longer serve a purpose.

5. Reengineer, Don’t Multiply

I know I just said to eliminate rather than reengineer, but the two principles work together: too often we add things just for the sake of it, which creates unnecessary noise. Look at Apple’s careful approach to AI: slow beginnings, but better user experiences. Complete what you begin; don’t add new services until you’ve streamlined the old ones.

6. Duplication > Premature Abstractions

Patterns emerge with real usage. Avoid abstracting too early; allow duplication until clear paths emerge, like city planners waiting to see where the grass is worn before paving sidewalks.
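A minimal sketch of the idea, with hypothetical validators: tolerate two near-duplicate functions, and only extract a shared helper once a third case confirms the pattern is real.

```python
# Two endpoints start out with near-duplicate validation. Resist
# abstracting on the first repeat; the shared shape may not hold.
def validate_order(payload):
    return "id" in payload and "amount" in payload

def validate_refund(payload):
    return "id" in payload and "amount" in payload

# Only once a third case confirms the pattern do we pave the path:
def require_fields(payload, fields):
    return all(field in payload for field in fields)

def validate_invoice(payload):
    return require_fields(payload, ("id", "amount", "due_date"))

print(validate_order({"id": 1, "amount": 5}))    # True
print(validate_invoice({"id": 1, "amount": 5}))  # False (missing due_date)
```

Had we abstracted after the first two, we might have baked in an interface that the invoice case immediately breaks.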

7. Reachability via APIs

Your business logic and data must be accessible through proper APIs. Proprietary protocols only create friction. APIs are the handshake of customer-centric platforms.

8. Everything as Code

Infrastructure, policies, security, and other elements should all be maintained in code. This ensures consistency and traceability, which accelerates evolution.
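To make this concrete, here is a hypothetical policy-as-code sketch: the policy lives in version control as data, and a check like this can run in CI against every environment. The policy keys and the `violations` helper are illustrative, not any particular tool’s API:

```python
# The security policy is data under version control, not tribal knowledge.
POLICY = {
    "encryption_at_rest": True,
    "public_access": False,
    "min_tls_version": "1.2",
}

def violations(resource: dict) -> list:
    """Return the policy keys a resource configuration violates."""
    return [key for key, required in POLICY.items()
            if resource.get(key) != required]

bucket = {"encryption_at_rest": True, "public_access": True,
          "min_tls_version": "1.2"}
print(violations(bucket))  # ['public_access']
```

Because both the policy and the check are code, every change is reviewed, diffed, and traceable, which is exactly what accelerates safe evolution.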

9. Secure by Default

Customer trust is non-negotiable. Zero trust and auditability for all human and non-human actors are a must. “Trust but verify” is outdated; today it’s “zero trust and verify.”
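One small illustration of the “verify everything” posture, using Python’s standard `hmac` module: every request must carry a valid signature, whether it comes from inside or outside the network. The secret and payload here are placeholders:

```python
import hashlib
import hmac

SECRET = b"per-service-shared-secret"  # illustrative only; use a real secret store

def sign(body: bytes) -> str:
    return hmac.new(SECRET, body, hashlib.sha256).hexdigest()

def verify(body: bytes, signature: str) -> bool:
    # Every caller, internal or external, must prove itself; nothing is
    # trusted just because it arrived on the "internal" network.
    return hmac.compare_digest(sign(body), signature)

body = b'{"action": "delete_account"}'
print(verify(body, sign(body)))          # True
print(verify(body, "forged-signature"))  # False
```

Note the constant-time comparison via `hmac.compare_digest`; even the verification step itself should not leak information.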

10. Build on Open Standards

Differentiate where customers care. Elsewhere, leverage open standards to reduce costs and innovate at the experience layer.

11. Explainability is Survival

A platform customers can’t understand is a platform they won’t trust. When failure occurs (and it will), systems must be explainable and observable to minimize downtime.

Closing Thought

Customer centricity isn’t just about GTM strategies or NPS scores; it’s about architecture. The way we build platforms directly reflects the way we value customers. Each principle above is both a technical choice and a customer promise: safety, savings, and growth.

As product leaders, our job is to make sure the platform hourglass doesn’t run out in the middle but continuously fills on both ends.

The future of AI looks a lot like the Cloud… And that is not a bad thing

When you look at where AI is headed, it is hard not to notice a familiar pattern. It looks a lot like cloud computing in its early and mid-stages. A few players dominate the market, racing to abstract complexity, while enterprises struggle to comprehend it all. The similarities are not superficial. The architecture, ecosystem dynamics, and even the blind spots we are beginning to see mirror the path we walked with cloud.

Just like cloud computing eventually became a utility, general-purpose AI will too.

From First-mover Advantage to Oligopoly

OpenAI had a distinct advantage, not only in terms of model performance but also in terms of brand affinity; even my non-technical mother was familiar with ChatGPT. That advantage, though, is shrinking, as we witnessed during the GPT-5 launch. We now see the rise of other foundation model providers: Anthropic, Google Gemini, Meta’s Llama, Mistral, Midjourney, Cohere, and Grok, plus the fine-tuning layer from players like Perplexity. This is the same trajectory that cloud followed: a few hyperscalers emerged (AWS, Azure, and GCP), and while niche providers still exist, compute became a utility over time.

Enter Domain-Specific, Hyper-Specialized Models

This abstraction will not be the end. It will be the beginning of a new class of value creation: domain-specific models. These models will be smaller, faster, and easier to interpret. Think of LLMs trained on manufacturing data, healthcare diagnostics, supply chain heuristics, or even risk-scoring for cybersecurity.

These models won’t need 175B parameters or $100 million training budgets: they will be laser-focused and context-aware and deployable with privacy and compliance in mind. Most importantly, they will produce tailored outcomes that align tightly with organizational goals.

The outcome is similar to containerized microservices: small, purpose-built components operating near the edge, orchestrated intelligently, and monitored comprehensively. It is a back-to-the-future moment.

All the Lessons from Distributed Computing… Again

Remember the CAP theorem? Service meshes? Sidecars? The elegance of Kubernetes versus the chaos of homegrown container orchestration? Those learnings are not just relevant; they are essential again.

In our race to AI products, we forgot a key principle: AI systems are distributed systems.

Orchestration, communication, and coordination: these core tenets of distributed computing will define the next wave of AI infrastructure. Agent-to-agent communication, memory systems, vector stores, and real-time feedback loops need the same rigor we once applied to pub/sub models, API gateways, and distributed consensus.
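The same pub/sub discipline we once applied to service integration carries over directly to agent-to-agent communication. A toy topic-based bus, with hypothetical agent names and topics, makes the parallel concrete:

```python
from collections import defaultdict

class Bus:
    """Minimal in-process topic bus; a real system would use a broker."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        # Fan the message out to every agent subscribed to the topic.
        for handler in self.subscribers[topic]:
            handler(message)

log = []
bus = Bus()
# Two downstream agents react to the same upstream event.
bus.subscribe("research.done", lambda m: log.append(f"writer got: {m}"))
bus.subscribe("research.done", lambda m: log.append(f"critic got: {m}"))
bus.publish("research.done", "summary-v1")
print(log)  # ['writer got: summary-v1', 'critic got: summary-v1']
```

Swap “handler” for “agent” and “topic” for “capability,” and the decades-old patterns (delivery guarantees, ordering, backpressure) become the open questions of multi-agent systems.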

Even non-functional requirements like security, latency, availability, and throughput have not disappeared. They’ve just been rebranded. Latency in LLMs is as much a performance metric as disk IOPS in a storage array. Prompt injection is the new SQL injection. Trust boundaries, zero-trust networks, and data provenance are the new compliance battlegrounds.

Why This Matters

Many of us, in our excitement to create generative experiences, often overlook the fact that AI didn’t emerge overnight. It was enabled by cloud computing: GPUs, abundant storage, and scalable compute. Cloud computing itself is built on decades of distributed systems theory. AI will need to relearn those lessons fast.

The next generation of AI-native products won’t just be prompt-driven interfaces. They will be multi-agent architectures, orchestrated workflows, self-healing pipelines, and secure data provenance.

To build them, we will need to remember everything we learned from the cloud and not treat AI as magic but as the next logical abstraction layer.

Final thought

AI isn’t breaking computing rules; it’s reminding us why we made them. If you were there when cloud transformed the enterprise, welcome back. We’re just getting started.