Revolutionize Your Experience: Intel Core Systems Now Support Advanced AI Algorithms Like Stable Diffusion for Superior Efficiency

By George

The generative AI revolution has mostly focused on large, complex models running in server data centers. Some AI models are optimized enough to run on typical computers, though, and Intel is making some progress there.

Intel announced today that there are now over 500 AI models optimized for its Intel Core Ultra processors, which were revealed in December and have started to appear in new laptops. That list likely includes many experimental and test models that don’t serve a practical purpose for most applications, but there are a few big ones: Phi-2, Meta’s Llama, Mistral, BERT, Whisper, and Stable Diffusion 1.5.
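
As a rough illustration of what an “optimized” model looks like in practice, here is a minimal sketch that runs Stable Diffusion 1.5 through Intel’s OpenVINO backend using the Hugging Face Optimum Intel integration. The package and model ID are assumptions about one common setup, not something Intel’s announcement specifies.

```python
# Hypothetical sketch: Stable Diffusion 1.5 on an Intel machine via
# Optimum Intel's OpenVINO backend (pip install "optimum[openvino]").
from optimum.intel import OVStableDiffusionPipeline

# export=True converts the original PyTorch weights to OpenVINO's IR format
# on the fly; the model ID below is illustrative.
pipe = OVStableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", export=True
)

# Image generation happens entirely on the local machine.
image = pipe("a lighthouse on a rocky coast at sunset").images[0]
image.save("lighthouse.png")
```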

Intel said in a press release, “Models form the backbone of AI-enhanced software features like object removal, image super resolution or text summarization. There is a direct link between the number of enabled/optimized models and the breadth of user-facing AI features that can be brought to market. Without a model, the feature cannot be designed. Without runtime optimization, the feature cannot reach its best performance.”

Most (if not all) of those AI models can run on non-Intel hardware, but adding support for the newer hardware features specific to Intel’s latest chips makes them more practical for real-world use. For example, Intel said the optimization process with its OpenVINO toolkit included “load-balancing across all the compute units, compressing the models to run efficiently in an AI PC, and optimizing the runtime to take advantage of memory bandwidth and core architecture within Intel Core Ultra.”
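
To make the “compute units” part concrete, the hypothetical OpenVINO snippet below lists the devices the runtime can see on a given machine (on a Core Ultra laptop that is typically the CPU, the integrated GPU, and the NPU) and lets the AUTO plugin decide where to place a model. The "model.xml" path is a placeholder for any model already converted to OpenVINO’s IR format.

```python
import openvino as ov

core = ov.Core()
# Lists the compute devices the runtime can schedule work on,
# e.g. ['CPU', 'GPU', 'NPU'] on a Core Ultra machine.
print(core.available_devices)

# Placeholder path: an IR file produced by OpenVINO's model converter.
model = core.read_model("model.xml")

# "AUTO" lets OpenVINO choose among the available compute units
# instead of pinning inference to a single device.
compiled = core.compile_model(model, "AUTO")
```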

Machine learning and AI models that run locally on computers is nothing new, but running newer generative AI models locally on PCs has a few interesting use cases. You could have something like ChatGPT and Microsoft Copilot running entirely on your own PC, potentially eliminating the privacy concerns and network connectivity requirements that come with sending prompt data to external servers. NVIDIA’s ChatRTX local chatbot is a step in that direction, but it’s still experimental and requires a PC with a powerful RTX 30 or 40-series graphics card.
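
For the local-chatbot idea, one plausible route on Intel hardware is Optimum Intel’s OpenVINO-backed language-model class running a small model such as the Phi-2 mentioned above. This is a hedged sketch under those assumptions, not NVIDIA’s ChatRTX or Intel’s exact stack; the package and model ID are illustrative.

```python
# Hypothetical local text-generation sketch using Optimum Intel.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "microsoft/phi-2"  # small model from the article's list; ID is illustrative
model = OVModelForCausalLM.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# The prompt never leaves the machine, which is the privacy argument above.
inputs = tokenizer("Explain what an NPU is in one sentence.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```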

Intel is hoping that software using these optimized models might push people to buy newer computers with Core Ultra processors. For now, though, cloud-based AI tools like ChatGPT and Copilot aren’t going anywhere.

Source: Intel
