Nvidia, AI Infrastructure, Agentic AI, Enterprise AI, Trust and Safety · Sep 2, 2025

AI’s New Bottleneck: Trust and Concentration — Inside Nvidia’s 39% Mystery Customers

A sharp look at Nvidia’s customer concentration, the trust crisis in AI deployments, and what this shifting power map means for investors, startups, enterprises, and everyday users.

The most dangerous thing about AI isn't that it sees things that aren't there; it's that it doesn't pay attention. This week's news points to power and money concentrating in fewer hands while trust and reliability lag behind in the places where AI actually touches people. Having watched AI cycles for the past 15 years, I find it evident that capital and compute are converging faster than confidence and capacity.
What do these developments have in common? Three lines converge. First, buyer concentration: Nvidia disclosed that two "mystery customers" accounted for 39% of its $46.7 billion Q2 revenue. Second, eroding trust: Taco Bell is scaling back drive-thru AI after poor performance, Meta is tightening rules for teen chatbots, and Anthropic is giving users an opt-out on data sharing. All of these signal that deployment is running into social and legal limits. Third, a platform realignment: Meta's Scale AI partnership is fraying, India's Reliance is working to build a sovereign AI backbone with Big Tech, and AWS is racing to productize Bedrock. Who owns the rails, and the rules, matters more every week.
Let's take a closer look at the most important point: Nvidia's extreme revenue concentration. This isn't a footnote: roughly 40% of a blowout quarter came from just two customers. Yes, the hyperscalers have always been the biggest buyers, but the scale and opacity here are striking. Investors cheered the projection that AI capex will reach $3–4 trillion, yet still sold the stock, because concentration and cycle timing make it vulnerable. When a few buyers drive your growth, their purchasing schedule becomes your forward multiple.
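To put the concentration in dollar terms, here is the back-of-the-envelope arithmetic from the two reported figures (the split between the two customers is not disclosed, so only the combined number can be computed):

```python
# Concentration math from the two reported figures:
# Q2 revenue of $46.7B, with two customers at 39% combined.
q2_revenue_b = 46.7      # reported Q2 revenue, in $B
combined_share = 0.39    # combined share of "Customer A" + "Customer B"

two_customer_revenue = q2_revenue_b * combined_share
print(f"~${two_customer_revenue:.1f}B from two customers")  # ~ $18.2B
```

Roughly $18 billion in a single quarter riding on two purchasing schedules is the vulnerability investors priced in.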
Who are "Customer A" and "Customer B"? Nvidia won't say, and I won't speculate beyond the obvious: think massive GPU renters and hyperscalers with privileged supply allocations. What matters is not identity but dependence. A dependency worth roughly 39% of revenue amplifies every downside scenario: a pause in orders as data centers transition to GB200 racks, a shift to custom silicon (as we've seen in various forms from cloud providers), or a policy-driven move toward sovereign AI spending that fragments demand. Meanwhile, routing purchases through aggregators or lease providers can obscure who the end users are while concentrating revenue recognition. There is real money beneath the shells.

Will the demand last? In the near term, yes. Utilization remains high, inference is finally catching up to training, and model release cycles keep shortening. But the glide path is changing. The next constraint isn't raw flops; it's safety, trust, integration, and TCO. That's why we're seeing second-derivative moves in the market: AWS shipping Bedrock guardrails and Karpenter-backed autoscaling; enterprises testing agent frameworks with clearer accountability (see Maisa AI's pitch on lowering enterprise failure rates); and regulators getting tougher on sensitive groups (Meta's teen chatbot restrictions after the scandal). In other words, the race for GPUs is turning into a race for ops and governance.

There is also a political dimension. Reliance's plan to build an Indian AI backbone with Google and Meta, and possibly OpenAI, shows that national champions will want supply certainty, local control, and cost curves that don't leave them hostage to US business cycles. Lawsuits over alleged AI monopolies have set the stage for procurement to become policy. That is when suppliers with heavy control over pricing and distribution have to adjust.
Short- and medium-term market effects:
- Near term, Nvidia is still the index for AI. Networking names like Arista and Broadcom, power-and-cooling suppliers, GPU renters, and specialty clouds all benefit from rising demand. But if "Customer A/B" shows any sign of order digestion, volatility will spike.
- Over the next 12 to 24 months, expect more diversity: more domain-specific accelerators, more ARM-based and custom inference silicon, and sovereign capacity purchases. Post-2010 smartphones are the analogy: one giant firm set the performance bar, and then economics pushed intelligence to the edge on a variety of hardware.

For investors, now is the time to move from pure-capex beta to AI-operations beta. I like these groups: cost governance and workload placement across clouds and on-prem; agent observability and policy enforcement; data provenance and consent infrastructure (Anthropic's opt-out deadline is a canary); and vertical-integration plays that turn models into workflows with measurable ROI. I don't think consumer chat UIs are a good way to make money, because they don't raise users' willingness to pay. The money goes to governed data domains, integration, and distribution.
For startups, being reliable matters more than being novel. Taco Bell's troubles show that voice agents still don't work well in the real world amid noise, varied accents, and edge cases. If you can make first-time-right rates, latency, and recovery paths ten times better, you become the new incumbent. Build pipelines that run on any GPU and make switching clouds easy and cheap; your buyer's CFO cares about unit economics and exit options. Treat data consent as a product surface (opt-outs, provenance, licensing), because your customers now have to. And don't deploy agents for the first ten jobs unless you can supervise what they do the way you supervise employees: SLAs, audit trails, permissions, and off-ramps.

For enterprises, the best move is to set small goals and instrument everything without mercy. Start with retrieval-heavy use cases where hallucinations are cheap. Then put up guardrails and keep humans in the review loop until the metrics prove that autonomy is warranted. Don't bet on a single vendor's GPU calendar; diversify across suppliers. Pilot sovereign or private hosting for sensitive domains. Choose platforms that make policies portable; AWS Bedrock's guardrails, Kendra-like search, and autoscaling toolchains are becoming mainstream. Put TCO literacy next to model benchmarks, because boardrooms keep asking, "What's the marginal ROI of the next 1,000 GPUs?"

For everyday users, there will be fewer "wow" demos and more tools that work every time. AI will help 911 centers triage non-emergency calls. Banks will automate routine decisions, which will affect jobs. Chatbots will feel more constrained, especially for kids. That's not standing still; that's normalizing. The things you notice least will be the ones you need most.
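The "supervise agents like employees" pattern above can be sketched in a few lines. This is a minimal illustration, not any real framework's API; the names (`AuditLog`, `ALLOWED_ACTIONS`, `run_action`) are all hypothetical. The idea: every action is checked against an explicit permission list and written to an audit trail before it runs, and anything out of policy takes a human off-ramp instead of failing silently.

```python
# Hypothetical sketch of a "governed agent" wrapper: permissions,
# an audit trail, and a human-escalation off-ramp. Illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Permissions explicitly granted to this agent (assumed action names).
ALLOWED_ACTIONS = {"lookup_order", "draft_reply"}

@dataclass
class AuditLog:
    entries: list = field(default_factory=list)

    def record(self, action: str, allowed: bool) -> None:
        # Every attempted action is logged with a timestamp, allowed or not.
        self.entries.append({
            "ts": datetime.now(timezone.utc).isoformat(),
            "action": action,
            "allowed": allowed,
        })

def run_action(action: str, log: AuditLog) -> str:
    allowed = action in ALLOWED_ACTIONS
    log.record(action, allowed)       # audit before acting
    if not allowed:
        return "escalated_to_human"   # off-ramp, not silent failure
    return f"executed:{action}"

log = AuditLog()
print(run_action("lookup_order", log))  # permitted action runs
print(run_action("issue_refund", log))  # out-of-policy action is escalated
```

The point of the sketch is the shape, not the code: the permission check and the audit write sit outside the model, so accountability doesn't depend on the model behaving.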
I am ready to be held to three predictions:
- Nvidia's Customer A and Customer B will each build or expand custom accelerators within 18 months, reducing their dependence on Nvidia and shrinking Nvidia's share of their AI spend.
- By 2026, procurement will require AI "trust stacks": provenance, consent, safety evaluation, and agent observability.
- "Governed agents" will be the next big thing. What distinguishes them from "bigger models" is that they are judged on how well they fit an organization's rules and how accountable they are, not on IQ.
The headline isn't that AI is slowing down; it's that power is concentrating and the frontier of adoption is shifting from models to mechanisms. The winners will be those who can extract a lot of reliable results from a little compute, and do it at scale, within policy, and with no single point of failure.