Upstart Aria Networks Unveils AI‑Native Networking Platform For The AI Factory Era
Networking is no longer a background utility in the AI factory. It’s a critical differentiator that can make or break AI infrastructure performance at scale, CEO and Co‑Founder Mansour Karam tells CRN.
Aria Networks, a Palo Alto, Calif.-based startup working at the intersection of networking, distributed systems and AI, has announced the general availability of its AI-native networking platform, which the company said was designed for the AI factory era.
In the AI factory, networking is no longer “plumbing” in the background. It is a performance differentiator, and an important component of delivering competitive AI infrastructure at scale, CEO and Co-Founder Mansour Karam told CRN.
Karam was the founder and CEO of intent-based networking provider Apstra before it was acquired by Juniper Networks in 2021. During his time at Juniper Networks, he returned to owning the entire stack, both hardware and software. That's when he saw an "explosion of new demand" for data centers from customers because of AI.
“We really needed to start the company from scratch, and that’s where I left and started Aria,” he said.
[Related: 10 CEOs On How AI Is Changing The Networking Game]
The one-year-old company founded by technology veterans from the likes of Arista Networks, Cisco, Google, Meta and Pure Storage, to name a few, got its start in 2025 delivering on its promise of creating “networks that think,” Karam said.
Aria's platform has been built to optimize token efficiency, which the company calls a core metric for AI factories, tying network performance to revenue, cost per token, and model FLOPs utilization (MFU).
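To make these metrics concrete: the article does not describe Aria's formulas, but MFU and cost per token are commonly defined in the AI infrastructure industry roughly as sketched below. The figures used are purely illustrative, not Aria customer data.

```python
# Illustrative sketch of the AI-factory metrics named above.
# These are industry-standard definitions, NOT Aria's implementation;
# all numbers below are hypothetical.

def mfu(tokens_per_second: float, flops_per_token: float,
        peak_flops: float) -> float:
    """Model FLOPs Utilization: achieved FLOP/s divided by hardware peak FLOP/s."""
    return (tokens_per_second * flops_per_token) / peak_flops

def cost_per_token(cluster_cost_per_hour: float,
                   tokens_per_second: float) -> float:
    """Dollars per token at a given sustained throughput."""
    tokens_per_hour = tokens_per_second * 3600
    return cluster_cost_per_hour / tokens_per_hour

# Hypothetical cluster: 1e15 peak FLOP/s, 2e9 FLOPs per token,
# sustaining 200,000 tokens/s at $400/hour of cluster cost.
print(mfu(200_000, 2e9, 1e15))            # 0.4 (40% utilization)
print(cost_per_token(400, 200_000))       # dollars per token
```

The sketch shows why networking shows up in the bottom line: if network stalls cut sustained throughput, MFU falls and cost per token rises at the same hardware cost.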
At the center of the platform is deep networking, which Karam called “a fundamentally different approach” to how AI networks operate. Aria’s platform continuously collects fine‑grained, end‑to‑end telemetry—10 to 10,000 times finer resolution than traditional tools—across switches, transceivers, and hosts, the company said. Karam said that prior AIOps models have failed because they dumped telemetry into data lakes and left customers to interpret and act on it manually, which he called an “unworkable” approach at AI scale. Instead, Aria’s platform has been designed to close that loop by automatically extracting what matters from telemetry and taking real‑time, intelligent action to optimize AI cluster performance.
Aria Networks is working with AI‑focused system integrators, neocloud builders, and managed AI infrastructure providers to deliver full AI factories, with networking treated as a first‑class design element rather than a bolt‑on, Karam said.
“We’re working with the right type of partners for this market. In some cases, it could be someone that is delivering the entire factory — they’re delivering the compute, the storage, and our job is to make sure we deliver the network. Then, there’s the service integrators [whose] purpose is really to deliver on this AI opportunity,” he said.
White-glove deployment will be a core part of the partner model, with Aria embedding field deployment engineers directly into customer environments. It's an opportunity the company says partners can eventually take on themselves as a value-added service.
Karam said that partners don’t need decades of networking expertise to win AI factory deals, but they do need to recognize that network performance underpins AI performance. As enterprises move toward private AI data centers, the expectation is that partners who can architect, integrate, and optimize AI‑native networks will have a competitive advantage, he said.
Aria Networks said that it has customer orders in hand and is actively deploying its offering today.