Canonical is making it easier for developers to deploy AI models on Ubuntu with the release of new ‘optimised inference snaps’ for both Intel and Ampere Arm devices.

Beta builds of DeepSeek R1 and Qwen 2.5 VL are the first large language models (LLMs) on the Snap Store. Each offers “automatic engines, quantizations and architectures based on the specific silicon of the device” it’s installed on.

What are these models?

DeepSeek R1 is an open-source reasoning model from Chinese AI firm DeepSeek, geared towards maths, coding, and other complex tasks.

Qwen 2.5 VL is Alibaba Cloud’s open-source vision model, designed for processing text, images, and video.

Both models install and run locally, free of cloud API calls or subscription costs.

“We are making silicon-optimized AI models available for everyone. When enabled by the user, they will be deeply integrated down to the silicon level,” Jon Seager, VP Engineering at Canonical says of the launch.

Jeff Wittich, Chief Product Officer at Ampere says “this brings Ampere’s high performance and efficiency to end users right out of the box […] enabling enterprises to rapidly deploy and scale their preferred AI models on Ampere systems with Ubuntu’s AI-ready ecosystem.”

Giving developers easier access to pre-tuned LLMs on Ubuntu, for use in applications and server workloads, means they no longer need to research, locate, and manually deploy quantised model variants (quantisation reduces model size while aiming to preserve accuracy).

They can just snap install and go.
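As a rough sketch of that workflow (the exact snap names and channels below are assumptions, not confirmed — check the Snap Store for the packages Canonical actually publishes):

```shell
# Search the Snap Store for the model (name 'deepseek-r1' is illustrative)
snap find deepseek

# Install from the beta channel, since these are beta builds
sudo snap install deepseek-r1 --beta

# Run the model locally — no cloud API calls or subscriptions required;
# the snap picks the engine/quantisation to match your silicon
deepseek-r1
```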

Architecture-optimised models will run faster and more efficiently on matching hardware, while quantised models are designed to use fewer resources. Users won’t need to worry about which to pick: thanks to ‘automatic’ selection, they get the variant best suited to their device.

Do these snaps mean Canonical is going all-in on AI in Ubuntu?

Not to any worrying extent, no. Intel’s NPU driver is already available as a snap, as are OpenVINO AI plugins. Here, Canonical is simply putting popular open-source LLMs on the Snap Store that are pre-optimised for certain chips.

Beyond that, the snaps are of large language models, and users will need software that can interface with them or put them to work (the same as any library you’d install). They aren’t preinstalled or integrated in the Ubuntu desktop UI¹.

Secondly, desktop users aren’t the main target here. Canonical and its silicon partners want to make optimised versions more readily available, so developers and enterprises can install them with less friction and get the most from their hardware.

I’m sure that is obvious, but I’m saying it lest anyone fear Ubuntu 26.04 LTS will be released with annoying ‘AI sparkle’ buttons peppered in its UI, hassling you to let it write your homework for you or acting like a dystopian precog like Windows Recall.

Ubuntu is, as Ubuntu ever was: a platform where the user is able to choose what they use.

More details (and a large dose of marketing puffery) in the Canonical blog post.

  1. The models being available on the Snap Store will make it easier for third-party developers to create desktop integrations that are powered by the LLM snaps. ↩︎