Alibaba’s Qwen3.5 brings native multimodal AI to Ollama, with 256K context windows, support for 201 languages, and models ranging from 0.8B to 397B parameters. Run it locally or in the cloud with built-in vision, thinking mode, and tool calling.
Category: Data Science & AI
Running Large Language Models Locally with Podman and Ollama
Learn how to set up and run powerful LLMs like Qwen2.5-14B on your local machine using Podman and Ollama. This comprehensive guide covers installation, model deployment, and troubleshooting for split GGUF files.
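The core workflow the guide covers can be sketched in a few commands. This is a minimal, hypothetical example assuming the official `ollama/ollama` container image and a `qwen2.5:14b` model tag; container name, port, and volume are placeholder choices you would adapt to your setup.

```shell
# Start the Ollama server in a Podman container (assumed image: ollama/ollama).
# -p exposes Ollama's default API port; -v persists downloaded models.
podman run -d --name ollama -p 11434:11434 -v ollama:/root/.ollama ollama/ollama

# Pull the model inside the running container (model tag is an assumption).
podman exec ollama ollama pull qwen2.5:14b

# Chat with the model interactively.
podman exec -it ollama ollama run qwen2.5:14b
```

Once the container is up, the API is also reachable at `http://localhost:11434` for any OpenAI-compatible client.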
Run Claude Code with a Local LLM Using Ollama
Want to use Claude Code without sending data to the cloud or paying API fees? You can point it at a locally running LLM via Ollama in just a few minutes. Here’s exactly how.