The 10 Hottest Semiconductor Startups Of 2023

CRN rounds up the 10 hottest semiconductor startups of 2023, which range from startups challenging Nvidia in the AI computing space to those seeking to help the industry overcome memory and data transfer bottlenecks that are holding back demanding workloads.

While the semiconductor industry is estimated to end this year with a 10.9 percent decline in revenue, it’s expected to come roaring back in 2024 with a projected 16.8 percent increase for sales worldwide.

That’s according to research firm Gartner, which forecast earlier this month that while 2023 will end with $534 billion in global semiconductor revenue, the industry will more than make up the difference next year with $624 billion in sales.

[Related: AMD Launches Instinct MI300 AI Chips To Challenge Nvidia With Backing From Microsoft, Dell And HPE]

While that’s mostly good news for the world’s largest semiconductor companies, such as Intel, TSMC and Nvidia, it’s a sign that some of the industry’s startups may have a better chance at winning customer orders to outfit state-of-the-art systems.

In the past year, the industry’s attention has become increasingly fixated on enabling and accelerating generative AI workloads thanks to the late 2022 launch of ChatGPT and other applications that require powerful and expensive chips to operate.

Nvidia may dominate the AI computing space, having tripled its revenue in the past quarter alone due to high AI chip demand. But that isn’t stopping a slew of semiconductor startup companies, such as Cerebras Systems, d-Matrix, Lightmatter and Tenstorrent, from fighting for their own slice of market share with novel chip architectures, processing techniques and business models.

The AI gold rush has also created a need for different kinds of semiconductor startups, like Astera Labs, Ayar Labs and Celestial AI, to help companies in the industry overcome memory and data transfer bottlenecks that are holding back applications from running faster than what current systems allow.

Semiconductor startups see opportunities in other areas too. These include Ampere Computing, whose Arm-based, cloud-native processors are winning over OEMs and cloud service providers, as well as EdgeQ, which is gaining momentum in the telecom space with its “Base Station-on-a-Chip.” There’s also SiMa.ai, whose edge AI chip is entering mass production and going into systems from Supermicro.

What follows are CRN’s 10 hottest semiconductor startups of 2023, which consist of companies that have made a big splash this year for multiple reasons, including big funding rounds, product launches and momentum with customers and partners, among other milestones.

Ampere Computing

Top Executive: Renee James, Founder, CEO

While Microsoft Azure and Amazon Web Services build their own Arm-based CPUs for cloud computing, Ampere Computing is making sure the rest of the industry can seize the high-density and energy-efficiency benefits of Arm, the main alternative to the x86 chip architecture.

The Santa Clara, Calif.-based chip design startup—which confidentially filed for an initial public offering a year ago—is on to the third generation of its “cloud-native” server chips, called AmpereOne, which provide up to 192 cores, eight channels of DDR5 memory and 128 lanes of PCIe Gen5 connectivity in a custom Arm-compatible design. That core count exceeds that of any Arm- or x86-based server chip on the market today.

Ampere’s top supporters include Google Cloud, which introduced AmpereOne-based instances a few months ago; Supermicro, which sells a variety of Ampere-based servers; and Oracle, which is a major shareholder and plans to spend a significant amount of money on Ampere chips. The company also works with Hewlett Packard Enterprise, Microsoft Azure and several other OEMs and cloud service providers such as Gigabyte and Alibaba Cloud.

Astera Labs

Top Executive: Jitendra Mohan, Co-Founder, CEO

Astera Labs is arming server vendors and cloud service providers with chip-based products that take advantage of Compute Express Link, a new connectivity standard that enables faster and more efficient servers with greater flexibility and lower costs.

After raising a $150 million Series D round from investors more than a year ago, the Santa Clara, Calif.-based startup reportedly hired Morgan Stanley and JPMorgan Chase & Co. as the lead underwriters for an initial public offering that could happen in 2024. This came around the same time Astera expanded its leadership team, including former Meta hardware veteran Chris Peterson as fellow of technology and ecosystems, to “drive corporate growth, strategic development, and product innovation.”

Astera’s connectivity portfolio includes the Leo Memory Connectivity Platform, which it said is the “industry’s first” memory controller to support memory expansion, memory pooling and memory sharing features that can help increase performance and lower costs for servers.

Ayar Labs

Top Executive: Mark Wade, Co-Founder, CEO

Ayar Labs believes it can usher in a new era of high-performance chips with silicon photonics technology that moves data through chips using light instead of electricity.

The Santa Clara, Calif.-based startup said in December that it had appointed co-founder and CTO Mark Wade as its new CEO as the company gears up to “meet the high-volume opportunity we see with in-package optical I/O,” according to investor Will Graves.

Wade’s appointment was made after Ayar in May said it had raised an additional $25 million in funding for its Series C round, which brought the round’s total to $155 million. The startup’s investors include Nvidia, Intel Capital, Hewlett Packard Enterprise’s venture arm and GlobalFoundries.

The extra investment came after Ayar achieved multiple milestones. These included a demonstration of the industry’s first optical solution for delivering a connectivity speed of 4 Tbps, a partnership with the U.S. Department of Defense to prototype military applications using its optical I/O chiplets and lasers, and a collaboration with Nvidia to develop AI chips with optical I/O.

Celestial AI

Top Executive: Dave Lazovsky, Founder, CEO

Celestial AI thinks it’s better equipped than rivals to deliver on the promise of silicon photonics for AI and high-performance computing with its Photonic Fabric optical interconnect technology.

The Santa Clara, Calif.-based startup in June said that it had raised a $100 million Series B round led by IAG Capital Partners, Koch Disruptive Technologies and Temasek’s Xora Innovation fund. Other investors include Samsung Catalyst, Smart Global Holdings and Porsche Automobil Holding.

Like other silicon photonics designers, Celestial AI sees optical interconnects as the future of accelerated computing, giving systems the ability to send data far faster and in greater quantities with less power than electrical interconnects that are currently the norm.

But the company said its Photonic Fabric, which can deliver data to any location on the compute die of a processor, provides 25 times greater bandwidth at more than 10 times lower latency and power consumption than any optical interconnect alternative.

Cerebras Systems

Top Executive: Andrew Feldman, Co-Founder, CEO

Cerebras Systems says it’s finding major traction with customers for its giant wafer-scale AI chips, which are better equipped for massive AI models than traditional GPUs from companies like Nvidia.

The Sunnyvale, Calif.-based startup told MarketWatch in October that it has doubled revenue this year. One significant driver is a large deal it scored with investment firm G42 in the United Arab Emirates, which involves the build-out of nine interconnected supercomputers the startup will manage as a cloud service.

This marks an evolution in the company’s original business model, which involves selling systems equipped with Cerebras’ 2.6 trillion-transistor Wafer-Scale Engine processor.

The startup’s other customers include GlaxoSmithKline, AstraZeneca, Jasper and TotalEnergies.

D-Matrix

Top Executive: Sid Sheth, Co-Founder, CEO

D-Matrix seeks to make generative AI applications commercially viable with what it calls a “first-of-its-kind” compute platform that relies on chiplet-based processors with digital in-memory computing.

The Santa Clara, Calif.-based startup said in September that it had raised a $110 million Series B funding round led by Singapore-based investment firm Temasek, with participation from other investors, including Microsoft’s M12 venture fund, SK Hynix and Marvell Technology.

D-Matrix seeks to challenge Nvidia in the fast-growing AI inference market with a chip-based solution it said will enable a lower total cost of ownership than GPU-based alternatives. The startup’s chips make use of digital in-memory computing circuit techniques that embed compute cores alongside memory to significantly reduce memory bottlenecks common in traditional processors.

EdgeQ

Top Executive: Vinay Ravuri, Founder, CEO

EdgeQ is taking on semiconductor giants like Intel and the chip-making capabilities of telecom giants like Ericsson with its “Base Station-on-a-Chip” offering for a range of cellular network applications.

The Santa Clara, Calif.-based startup in April revealed that it had raised a $75 million Series B funding round to invest in the development of its next-generation chip technology, which can be used as the foundation for public and private 5G and 4G cellular networks.

This year, network and telecom vendors began unveiling new products that will use EdgeQ’s chip. These include Taiwan’s Wistron, which said it has developed the “first software-defined, multi-mode 4G/5G small cell” using EdgeQ’s chip. The startup has also unveiled a 4G/5G small cell reference design with MaxLinear as well as a 4G/5G small cell architecture platform with Actiontec.

Other vendors that have backed EdgeQ include Vodafone, which plans to use its chip for next-generation, software-programmable 5G ORAN platforms. Network software vendor Mavenir also plans to use EdgeQ’s chip to develop a 4G/5G small cell offering.

Lightmatter

Top Executive: Nicholas Harris, Founder, CEO

Lightmatter says it plans to “bring a new level of performance and energy savings to the most advanced AI and HPC workloads” with its photonic chip-based technologies.

The Boston-based startup in December said that it had raised $155 million as an extension from the separate $154 million it had received for its Series C round, which was announced back in May. The latest round brings the company’s total funding to more than $420 million and gives it a private valuation of more than $1.2 billion, according to Lightmatter.

The Series C-2 round was led by Alphabet’s venture arm, GV, alongside global investment firm Viking Global Investors. The company’s other investors include the venture arm of Hewlett Packard Enterprise, among several others.

Lightmatter’s portfolio includes Envise 4S, a 4U server that includes 16 of the company’s Envise chips and uses only 3 kilowatts of power. The company said a rack-scale Envise inference system can run the BERT-Base SQuAD large language model at three times greater inferences per second and eight times greater inferences per second per watt than Nvidia’s DGX A100 system.

SiMa.ai

Top Executive: Krishna Rangasayee, Founder, CEO

SiMa.ai said its edge AI chip technology combines “jaw-dropping performance” and “incredible power efficiency” with a “beautifully simple, push-button software experience.”

In June, the San Jose, Calif.-based startup said that its Machine Learning System on a Chip silicon for the embedded edge market entered mass production with the support of two PCIe-based production boards and Palette software for low-code ML development. The company also raised an additional $13 million in funding, bringing the total investment to $200 million.

In November, the startup said it entered into a strategic partnership with Supermicro to provide an edge machine learning server that uses its MLSoC chip for a variety of use cases, including smart vision applications like employee safety, theft management and access control.

SiMa.ai said its MLSoC beat Nvidia’s Jetson AGX Orin system-on-chip on power and latency with higher frames per second per watt for inference, based on results submitted to MLPerf, a suite of machine learning benchmark tests for training and inference.

Tenstorrent

Top Executive: Jim Keller, CEO

Tenstorrent is shaking up the chip design world for AI computing with a business model that involves selling specialized processors and licensing chip technologies for other companies to use.

The Toronto-based startup said in August that it had raised a $100 million funding round led by Hyundai Motor Group and Samsung Catalyst Fund, with participation from several other vendors.

The investment news came roughly three months after Tenstorrent unveiled a partnership with Korean electronics giant LG, which will potentially use the startup’s AI and RISC-V chiplet designs to power future premium TV and automotive products. The two will also collaborate to integrate LG’s video codec technology into Tenstorrent’s future data center products.

Earlier in the year, Tenstorrent made appointments for key roles in the company’s leadership team. These included former Google and SiFive executive Keith Witek as COO, former Meta infrastructure hardware lead Olof Johansson as vice president of operating systems and infrastructure software, and former Intel GPU head Raja Koduri as a board member.