Joining the AI craze, Qualcomm launches mobile phone and PC chips to challenge Apple and Intel
On Tuesday, October 25th, Qualcomm released two new chips aimed at running artificial intelligence software on smartphones and personal computers (PCs), including bringing the technology industry's large language models (LLMs) to devices without an internet connection.
Interest in artificial intelligence applications has surged since the release of the Stable Diffusion image generator and OpenAI's chatbot ChatGPT at the end of 2022. These so-called "generative AI" applications require enormous processing power, and so far they have mainly run on powerful, power-hungry Nvidia graphics processors.
The new chips include the Snapdragon X Elite for personal computers and laptops, and the Snapdragon 8 Gen 3 for high-end Android phones.
The speed at which smartphone chips process AI models may become a new front in the feature war between high-end Android phones from makers such as Asus and Sony and Apple's iPhone, which also gains new AI features every year.
A Qualcomm executive said in an interview that the latest Snapdragon chip runs AI tasks much faster than last year's processor, cutting the time to generate an image from 15 seconds to less than one second.
"If someone buys a phone today, they ask: How fast is the CPU? How big is the memory? What is the camera like? But in the next two or three years, people will ask what artificial intelligence features it has," the executive said.
The artificial intelligence boom has boosted Nvidia's stock price but has largely bypassed Qualcomm, even though Qualcomm ships smartphone chips in large volumes and has included an AI component called an NPU (neural processing unit) in them since 2018.
Qualcomm's NPU has been used to improve photos and other features. Now, Qualcomm says its smartphone chips can handle the larger models used in generative AI, some reaching up to 10 billion parameters. That is still far smaller than the largest AI models, such as OpenAI's GPT-3, which has approximately 175 billion parameters.
Qualcomm executives said that if chips are fast enough and equipped with sufficient memory, these kinds of AI models can run on devices. Running a large language model locally, they argued, makes more sense than running it in the cloud because it is faster and more private. Qualcomm said its chips can run a version of Meta's Llama 2 model and that it hopes its customers, the smartphone manufacturers, will also develop their own models. Qualcomm is developing its own AI models as well.
Qualcomm demonstrated a device running the freely available Stable Diffusion model, which generates images from a string of words. It also showed the chip using AI to extend or fill in parts of photos.
Last year, Qualcomm's previous-generation chip could run the same model, but it took 15 seconds to crunch the numbers and produce an image of a cat sitting on a beach. This year's chip can complete the task in about half a second. Katouzian said this can greatly improve the responsiveness of AI applications such as personal assistants.
Qualcomm said future applications, such as personal voice assistants, could use the device's own AI model running on the device's chip for simple queries and send harder problems to more powerful computers in the cloud. This, Qualcomm explained, is why it works closely with Microsoft to ensure its chips are optimized for AI software.
Katouzian said: "The more these devices are used to run AI functions, the less they will spend on Microsoft Azure, which typically runs super-expensive inference workloads. Now, all of that can be offloaded. In a hybrid scenario, moving cloud workloads onto edge client devices gives them a huge advantage."
Qualcomm also said its top smartphone chip this year, the Snapdragon 8 Gen 3, will begin appearing early next year in "high-end" Android devices priced above $500 from brands such as Asus and Sony. Features from high-end chips will eventually trickle down to other devices.
Qualcomm Snapdragon X Elite chip
Qualcomm's new personal computer chip, the Snapdragon X Elite, is based on the Arm architecture and will compete with Intel's x86 chips in laptops and desktops.
The X Elite uses technology Qualcomm gained when it acquired Nuvia, a startup founded by former Apple engineers that is currently at the center of a legal dispute with Arm. The chip uses what Qualcomm calls Oryon cores, and laptops based on it are expected to launch in the middle of next year. Qualcomm said it outperforms Apple's M2 Max chip while consuming less power.