Companies and governments are looking for tools to run AI locally in a bid to slash cloud infrastructure costs and build sovereign capability. Quadric, a chip-IP startup founded by veterans of early bitcoin mining firm 21E6, is trying to power that shift with its on-device inference technology, scaling beyond automotive into laptops and industrial devices.
That expansion is already paying off.
Quadric posted $15 million to $20 million in licensing revenue in 2025, up from around $4 million in 2024, CEO Veerbhan Kheterpal (pictured above, center) told TechCrunch in an interview. The company, which is based in San Francisco and has an office in Pune, India, is targeting up to $35 million this year as it builds a royalty-driven on-device AI business. That growth has buoyed the company, which now has a post-money valuation of between $270 million and $300 million, up from around $100 million in its 2022 Series B, Kheterpal said.
It has also helped attract investors to the company. Last week, Quadric announced a $30 million Series C round led by ACCELERATE Fund, managed by BEENEXT Capital Management, bringing its total funding to $72 million. The raise comes as investors and chipmakers look for ways to push more AI workloads from centralized cloud infrastructure onto devices and local servers, Kheterpal told TechCrunch.
From automotive to everything
Quadric began in automotive, where on-device AI can power real-time functions like driver assistance. Kheterpal said the spread of transformer-based models in 2023 pushed inference into “everything,” creating a sharp business inflection over the past 18 months as more companies try to run AI locally rather than rely on the cloud.
“Nvidia is a strong platform for data-center AI,” Kheterpal said. “We were looking to build a similar CUDA-like or programmable infrastructure for on-device AI.”
Unlike Nvidia, Quadric does not make chips itself. Instead, it licenses programmable AI processor IP, which Kheterpal described as a “blueprint” that customers can embed into their own silicon, along with a software stack and toolchain to run models, including vision and voice, on-device.
The startup’s customers span printers, cars, and AI laptops, and include Kyocera and Japanese auto supplier Denso, which builds chips for Toyota vehicles. The first products based on Quadric’s technology are expected to ship this year, beginning with laptops, Kheterpal told TechCrunch.
Beyond those commercial deployments, Quadric is also eyeing markets exploring “sovereign AI” strategies to reduce reliance on U.S.-based infrastructure, Kheterpal said. The startup is pursuing customers in India and Malaysia, he added, and counts Moglix CEO Rahul Garg as a strategic investor helping shape its India “sovereign” approach. Quadric employs nearly 70 people worldwide, including about 40 in the U.S. and around 10 in India.
The push is being driven by the rising cost of centralized AI infrastructure and the difficulty many countries face in building hyperscale data centers, Kheterpal said, prompting more interest in “distributed AI” setups where inference runs on laptops or small on-premise servers inside offices rather than relying on cloud-based services for every query.
The World Economic Forum pointed to this shift in a recent article, as AI inference moves closer to users and away from purely centralized architectures. Similarly, EY said in a November report that the sovereign AI approach has gained traction as policymakers and industry groups push for domestic AI capabilities spanning compute, models, and data, rather than relying entirely on foreign infrastructure.
For chipmakers, the challenge is that AI models are evolving faster than hardware design cycles, Kheterpal said. He argued that customers need programmable processor IP that can keep pace through software updates rather than requiring costly redesigns every time architectures shift from earlier vision-focused models to today’s transformer-based systems.
Quadric is pitching itself as an alternative to chip vendors such as Qualcomm, which typically uses its AI technology inside its own processors, as well as IP suppliers like Synopsys and Cadence, which sell neural processing engine blocks. Kheterpal said Qualcomm’s approach can lock customers into its own silicon, while traditional IP suppliers offer engine blocks that many customers find difficult to program.
Quadric’s programmable approach lets customers support new AI models through software updates rather than hardware redesigns, an advantage in an industry where chip development can take years while model architectures now shift in a matter of months.
Still, Quadric remains early in its buildout, with a handful of signed customers so far and much of its longer-term upside dependent on turning today’s licensing deals into high-volume shipments and recurring royalties.