Qualcomm (QCOM) shares have largely missed out on the AI-fueled semiconductor surge of recent years, but circumstances appear to have shifted: the stock has risen nearly 70% in the past month as Qualcomm prepares for the next stage of AI, one that extends beyond centralized computing to billions of interconnected devices. In our earlier assessment, we highlighted how Qualcomm’s stock could potentially double, spurred by edge AI. Here, we detail the specifics of this transition and how the composition of Qualcomm’s customer base is evolving.
What Makes Qualcomm Uniquely Suitable for Inference?
The current AI landscape relies heavily on centralized computing: workloads are concentrated in cloud data centers and processed en masse on chips from firms like Nvidia (NVDA). This model has limits. Routing every inference through the cloud incurs significant cost, introduces latency, and consumes substantial power, making it impractical to scale to billions of devices. The next evolution is local inference: AI that runs directly on the device, offering faster responses, greater privacy, and independence from connectivity. Model-compression techniques such as quantization, pruning, and distillation are enabling this shift by shrinking models without significantly degrading performance. Ironically, every efficiency gain produced by AI research simultaneously makes Qualcomm’s edge hardware more capable.
This aligns perfectly with Qualcomm’s strengths.
For many years, the company has focused on the two primary constraints defining edge AI: power efficiency and connectivity. Qualcomm’s Snapdragon and Dragonwing platforms distribute AI workloads across three dedicated engines. The NPU handles intensive matrix computations for AI at an impressive 80 TOPS (trillions of operations per second, a measure of how many AI calculations the chip can execute) while consuming far less power than a laptop CPU. The GPU handles the visual processing behind generative AI, while the CPU takes care of application logic. Each engine does the job it was built for.
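A rough back-of-envelope calculation puts that 80 TOPS figure in perspective. The model size and utilization numbers below are illustrative assumptions, not Qualcomm specifications:

```python
# Back-of-envelope: what 80 TOPS could mean for on-device inference.
# All model parameters here are illustrative assumptions.

npu_tops = 80                # trillions of int8 ops per second (peak)
params = 3e9                 # a hypothetical 3B-parameter on-device model
ops_per_token = 2 * params   # ~2 ops (multiply + add) per weight per token

peak_tokens_per_s = npu_tops * 1e12 / ops_per_token
print(f"theoretical peak: {peak_tokens_per_s:,.0f} tokens/s")

# Real throughput is typically memory-bound, far below the compute peak;
# even at 5% utilization the output is faster than a person can read:
print(f"at 5% utilization: {0.05 * peak_tokens_per_s:,.0f} tokens/s")
```

Even with heavy discounting for real-world memory bottlenecks, that class of compute is ample for a compressed model responding interactively on a phone or laptop, which is the point of running inference at the edge rather than in the cloud.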
Moreover, Qualcomm integrates the modem within the same silicon as the computation engines, enabling the device to manage connectivity and inference under a common power budget instead of requiring two distinct chips. This integration reflects years of collaborative design that a new competitor would be unable to replicate. Additionally, edge devices do not utilize HBM, the high-bandwidth memory that data center chips depend on for rapid model processing. Qualcomm’s platforms are designed with this limitation in mind from the beginning, whereas many rivals are attempting to adjust data center approaches for edge applications.
As Qualcomm’s hardware becomes more capable with every AI efficiency breakthrough, rule-based investing offers a systematic way to participate in this structural shift from cloud to edge while maintaining the discipline required to navigate execution risks.
Who Are Qualcomm’s Customers?
The “customer” landscape for Qualcomm is evolving from a limited number of smartphone original equipment manufacturers (OEMs) to a comprehensive ecosystem that includes large industrial and automotive corporations.
Mobile device manufacturers remain Qualcomm’s primary clients, representing 66.4% of chip revenue in the most recent quarter, though that share has been declining. Meanwhile, the automotive sector accounted for roughly 14.6%, and the Internet of Things — including PCs and various other chips — contributed about 19%, with both segments growing.
PC manufacturers such as Dell, Lenovo, and HP are opting for Qualcomm’s Snapdragon X Elite for their top-tier AI PCs because it is the only NPU that meets Microsoft’s (MSFT) exacting Copilot+ performance criteria while providing over 20 hours of battery life. Microsoft has essentially established the qualification standard, and Qualcomm is the chip that complies. This scenario is not just a feature achievement; it is a platform lock-in dictated by an external standard.
In the automotive industry, Volkswagen, BMW, and GM are basing their future fleets on Qualcomm’s digital chassis. The reason lies in the architecture: a vehicle identifying a pedestrian cannot afford the latency of a round trip to the cloud. Inference must occur in the vehicle, instantaneously, every time. Qualcomm’s automotive design-win pipeline now stands at $45 billion, built on contracts with lead times spanning years. Automotive design cycles run five to seven years, so every win today locks in revenue that competitors will not be able to access until the subsequent vehicle generation.
On the manufacturing floor, Qualcomm’s Dragonwing platform is powering robots that can navigate and interpret their surroundings without a live server connection. The industrial edge represents the tangible application of physical AI, and Qualcomm is positioning itself as the go-to silicon for that transformation, transitioning from Arduino prototypes to fully autonomous manufacturing systems.