{"id":7548,"date":"2026-01-18T15:16:20","date_gmt":"2026-01-18T14:16:20","guid":{"rendered":"https:\/\/javiercachon.com\/?p=7548"},"modified":"2026-01-30T15:46:23","modified_gmt":"2026-01-30T14:46:23","slug":"guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13","status":"publish","type":"post","link":"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/","title":{"rendered":"Gu\u00eda Maestra: IA Privada de c\u00f3digo abierto con Ollama y Open WebUI en Debian 13"},"content":{"rendered":"\n<figure class=\"wp-block-image aligncenter size-large is-resized\"><img fetchpriority=\"high\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/ollama-1024x576.jpg\" alt=\"\" class=\"wp-image-7549\" style=\"width:1200px;height:auto\" srcset=\"https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/ollama-1024x576.jpg 1024w, https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/ollama-300x169.jpg 300w, https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/ollama-768x432.jpg 768w, https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/ollama-1536x864.jpg 1536w, https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/ollama.jpg 1920w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p><strong>Hardware de referencia:<\/strong> Laptop HP (16GB RAM + 16GB SWAP)<\/p>\n\n\n\n<p><strong>Repositorio oficial:<\/strong> <a href=\"https:\/\/github.com\/ollama\/ollama\">https:\/\/github.com\/ollama\/ollama<\/a><\/p>\n\n\n\n<p><strong>Web Ollama:<\/strong> <a href=\"https:\/\/ollama.com\/download\" target=\"_blank\" rel=\"noreferrer noopener sponsored nofollow\">https:\/\/ollama.com\/download<\/a><\/p>\n\n\n\n<p>Este art\u00edculo documenta el proceso completo para desplegar un entorno de Inteligencia Artificial local, optimizando el almacenamiento y superando las restricciones de seguridad de 
Debian.<\/p>\n\n\n\n<p>Durante d\u00e9cadas, la inform\u00e1tica personal ha girado en torno a una idea central: el control del usuario sobre su m\u00e1quina. Sin embargo, en los \u00faltimos a\u00f1os, ese principio ha sido erosionado por un modelo basado en la nube, la externalizaci\u00f3n del c\u00f3mputo y la concentraci\u00f3n del conocimiento en manos de grandes corporaciones.<\/p>\n\n\n\n<p>La inteligencia artificial moderna nace, parad\u00f3jicamente, como tecnolog\u00eda acad\u00e9mica abierta, pero r\u00e1pidamente es absorbida por modelos de negocio cerrados. Plataformas como ChatGPT, Gemini o Copilot ofrecen potencia a cambio de dependencia, y comodidad a cambio de soberan\u00eda.<\/p>\n\n\n\n<p>La ejecuci\u00f3n local de modelos de lenguaje supone, por tanto, <strong>un retorno a los principios cl\u00e1sicos de la inform\u00e1tica<\/strong>: control, previsibilidad, propiedad y responsabilidad.<\/p>\n\n\n\n<div id=\"ez-toc-container\" class=\"ez-toc-v2_0_82_2 ez-toc-wrap-center counter-hierarchy ez-toc-counter ez-toc-light-blue ez-toc-container-direction\">\n<div class=\"ez-toc-title-container\">\n<p class=\"ez-toc-title\" style=\"cursor:inherit\">Tabla de contenidos del articulo.<\/p>\n<span class=\"ez-toc-title-toggle\"><a href=\"#\" class=\"ez-toc-pull-right ez-toc-btn ez-toc-btn-xs ez-toc-btn-default ez-toc-toggle\" aria-label=\"Alternar tabla de contenidos\"><span class=\"ez-toc-js-icon-con\"><span class=\"\"><span class=\"eztoc-hide\" style=\"display:none;\">Toggle<\/span><span class=\"ez-toc-icon-toggle-span\"><svg style=\"fill: #999;color:#999\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" class=\"list-377408\" width=\"20px\" height=\"20px\" viewBox=\"0 0 24 24\" fill=\"none\"><path d=\"M6 6H4v2h2V6zm14 0H8v2h12V6zM4 11h2v2H4v-2zm16 0H8v2h12v-2zM4 16h2v2H4v-2zm16 0H8v2h12v-2z\" fill=\"currentColor\"><\/path><\/svg><svg style=\"fill: #999;color:#999\" class=\"arrow-unsorted-368013\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"10px\" 
height=\"10px\" viewBox=\"0 0 24 24\" version=\"1.2\" baseProfile=\"tiny\"><path d=\"M18.2 9.3l-6.2-6.3-6.2 6.3c-.2.2-.3.4-.3.7s.1.5.3.7c.2.2.4.3.7.3h11c.3 0 .5-.1.7-.3.2-.2.3-.5.3-.7s-.1-.5-.3-.7zM5.8 14.7l6.2 6.3 6.2-6.3c.2-.2.3-.5.3-.7s-.1-.5-.3-.7c-.2-.2-.4-.3-.7-.3h-11c-.3 0-.5.1-.7.3-.2.2-.3.5-.3.7s.1.5.3.7z\"\/><\/svg><\/span><\/span><\/span><\/a><\/span><\/div>\n<nav><ul class='ez-toc-list ez-toc-list-level-1 eztoc-toggle-hide-by-default' ><li class='ez-toc-page-1 ez-toc-heading-level-2'><a class=\"ez-toc-link ez-toc-heading-1\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#1_Introduccion\" >1. Introducci\u00f3n<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-2'><a class=\"ez-toc-link ez-toc-heading-2\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#2_%C2%BFQue_es_Ollama\" >2. \u00bfQu\u00e9 es Ollama?<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-2'><a class=\"ez-toc-link ez-toc-heading-3\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#3_Caracteristicas_principales\" >3. 
Caracter\u00edsticas principales<\/a><ul class='ez-toc-list-level-3' ><li class='ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-4\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#31_Simplicidad_operativa\" >3.1 Simplicidad operativa<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-5\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#32_Amplio_catalogo_de_modelos\" >3.2 Amplio cat\u00e1logo de modelos<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-6\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#33_Integracion_via_API\" >3.3 Integraci\u00f3n v\u00eda API<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-7\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#34_Optimizacion_de_recursos\" >3.4 Optimizaci\u00f3n de recursos<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-8\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#35_Software_libre\" >3.5 Software libre<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-9\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#36_Requisitos_previos\" >3.6 Requisitos previos<\/a><\/li><\/ul><\/li><li class='ez-toc-page-1 ez-toc-heading-level-2'><a class=\"ez-toc-link ez-toc-heading-10\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#4_Instalacion_de_Ollama_en_Debian_13\" >4. 
Instalaci\u00f3n de Ollama en Debian 13<\/a><ul class='ez-toc-list-level-3' ><li class='ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-11\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#41_Requisitos_previos\" >4.1 Requisitos previos<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-12\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#42_Instalacion_oficial\" >4.2 Instalaci\u00f3n oficial<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-13\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#44_Verificacion\" >4.4 Verificaci\u00f3n<\/a><\/li><\/ul><\/li><li class='ez-toc-page-1 ez-toc-heading-level-2'><a class=\"ez-toc-link ez-toc-heading-14\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#5_Problema_critico_Ollama_y_la_particion_raiz\" >5. 
Problema cr\u00edtico: Ollama y la partici\u00f3n ra\u00edz \/<\/a><ul class='ez-toc-list-level-3' ><li class='ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-15\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#51_Configuracion_avanzada_de_systemd_para_usar_home\" >5.1 Configuraci\u00f3n avanzada de systemd para usar \/home<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-16\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#6_Override_del_servicio_systemd\" >6. Override del servicio systemd<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-17\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#6_Descargar_y_ejecutar_un_modelo_LVM\" >6. Descargar y ejecutar un modelo LLM<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-18\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#7_Instalacion_de_Open_WebUI_con_Docker_interfaz_tipo_ChatGPT\" >7. Instalaci\u00f3n de Open WebUI con Docker (interfaz tipo ChatGPT)<\/a><\/li><\/ul><\/li><li class='ez-toc-page-1 ez-toc-heading-level-2'><a class=\"ez-toc-link ez-toc-heading-19\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#Resumen\" >Resumen<\/a><\/li><\/ul><\/nav><\/div>\n<h2 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"1_Introduccion\"><\/span>1. 
Introducci\u00f3n<span class=\"ez-toc-section-end\"><\/span><\/h2>\n\n\n\n<p>La inteligencia artificial generativa ha dejado de ser un experimento de laboratorio para convertirse en una herramienta estrat\u00e9gica en entornos profesionales, educativos y empresariales. Sin embargo, el uso de servicios cerrados en la nube plantea interrogantes leg\u00edtimos sobre <strong>privacidad, soberan\u00eda del dato, dependencia de terceros y costes recurrentes<\/strong>.<\/p>\n\n\n\n<p>En este contexto, el ecosistema de <strong>IA privada, local y de c\u00f3digo abierto<\/strong> cobra especial relevancia. Ejecutar modelos LLM (Large Language Models) en infraestructura propia \u2014un simple PC o workstation GNU\/Linux\u2014 es hoy una realidad t\u00e9cnica madura.<\/p>\n\n\n\n<p>Esta gu\u00eda constituye un <strong>manual t\u00e9cnico integral<\/strong>, orientado a administradores de sistemas, ingenieros, formadores y usuarios avanzados de GNU\/Linux, con especial foco en <strong>Debian 13<\/strong>, donde se describe paso a paso c\u00f3mo desplegar una soluci\u00f3n completa de IA privada basada en:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Ollama<\/strong>: motor de ejecuci\u00f3n de modelos LLM en local.<\/li>\n\n\n\n<li><strong>Open WebUI<\/strong>: interfaz web moderna, estilo ChatGPT, autoalojada mediante Docker.<\/li>\n<\/ul>\n\n\n\n<p>El objetivo es proporcionar una referencia <strong>rigurosa, reproducible y mantenible<\/strong>, alineada con los principios del software libre, la documentaci\u00f3n cl\u00e1sica y las buenas pr\u00e1cticas de administraci\u00f3n de sistemas.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"2_%C2%BFQue_es_Ollama\"><\/span>2. 
\u00bfQu\u00e9 es Ollama?<span class=\"ez-toc-section-end\"><\/span><\/h2>\n\n\n\n<p><strong>Ollama<\/strong> es una plataforma de c\u00f3digo abierto dise\u00f1ada para <strong>descargar, gestionar y ejecutar modelos de lenguaje (LLM) en local<\/strong>, de forma sencilla y eficiente.<\/p>\n\n\n\n<p>Desde el punto de vista t\u00e9cnico, Ollama act\u00faa como:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Un <strong>runtime<\/strong> optimizado para modelos LLM.<\/li>\n\n\n\n<li>Un <strong>gestor de modelos<\/strong> (descarga, versionado y eliminaci\u00f3n).<\/li>\n\n\n\n<li>Un <strong>servicio local<\/strong> accesible por CLI y API HTTP.<\/li>\n<\/ul>\n\n\n\n<p>Su filosof\u00eda recuerda a herramientas cl\u00e1sicas de Unix:<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><em>una sola cosa, bien hecha<\/em>.<\/p>\n<\/blockquote>\n\n\n\n<p>Ollama abstrae la complejidad del despliegue de modelos modernos (quantizaci\u00f3n, formatos, uso de CPU\/GPU, memoria, etc.) y ofrece una experiencia coherente tanto para uso personal como profesional.<\/p>\n\n\n\n<p>Es importante destacar que Ollama:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>No es un modelo<\/strong>: es el motor.<\/li>\n\n\n\n<li><strong>No impone interfaz gr\u00e1fica<\/strong>: se integra con terceros como Open WebUI.<\/li>\n\n\n\n<li><strong>Respeta la privacidad<\/strong>: todo se ejecuta en local.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"3_Caracteristicas_principales\"><\/span>3. 
Caracter\u00edsticas principales<span class=\"ez-toc-section-end\"><\/span><\/h2>\n\n\n\n<p>Las caracter\u00edsticas que han consolidado Ollama como est\u00e1ndar de facto en IA local son:<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"31_Simplicidad_operativa\"><\/span>3.1 Simplicidad operativa<span class=\"ez-toc-section-end\"><\/span><\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Instalaci\u00f3n mediante script oficial.<\/li>\n\n\n\n<li>CLI coherente y minimalista.<\/li>\n\n\n\n<li>Gesti\u00f3n autom\u00e1tica de dependencias.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"32_Amplio_catalogo_de_modelos\"><\/span>3.2 Amplio cat\u00e1logo de modelos<span class=\"ez-toc-section-end\"><\/span><\/h3>\n\n\n\n<p>Incluye soporte directo para:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Llama 3\/4<\/li>\n\n\n\n<li>Mistral \/ Mixtral<\/li>\n\n\n\n<li>Qwen<\/li>\n\n\n\n<li>Gemma<\/li>\n\n\n\n<li>Phi<\/li>\n\n\n\n<li>DeepSeek<\/li>\n\n\n\n<li>Modelos fine-tuned de la comunidad<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"33_Integracion_via_API\"><\/span>3.3 Integraci\u00f3n v\u00eda API<span class=\"ez-toc-section-end\"><\/span><\/h3>\n\n\n\n<p>Ollama expone una API REST local (<code class=\"\" data-line=\"\">localhost:11434<\/code>) que permite:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Integraci\u00f3n con aplicaciones propias.<\/li>\n\n\n\n<li>Uso desde IDEs.<\/li>\n\n\n\n<li>Conexi\u00f3n con interfaces web.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"34_Optimizacion_de_recursos\"><\/span>3.4 Optimizaci\u00f3n de recursos<span class=\"ez-toc-section-end\"><\/span><\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Quantizaci\u00f3n autom\u00e1tica.<\/li>\n\n\n\n<li>Uso eficiente de CPU.<\/li>\n\n\n\n<li>Soporte opcional de GPU.<\/li>\n\n\n\n<li>Posibilidad de swap para escenarios 
controlados.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"35_Software_libre\"><\/span>3.5 Software libre<span class=\"ez-toc-section-end\"><\/span><\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>C\u00f3digo abierto.<\/li>\n\n\n\n<li>Comunidad activa.<\/li>\n\n\n\n<li>Transparencia t\u00e9cnica.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"36_Requisitos_previos\"><\/span>3.6 Requisitos previos<span class=\"ez-toc-section-end\"><\/span><\/h3>\n\n\n\n<p>Antes de instalar, comprueba que tu sistema cumple con lo siguiente:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Sistema operativo: <strong>Debian 12 o superior<\/strong> (amd64 o ARM64).<\/li>\n\n\n\n<li>CPU moderna con soporte AVX2 (recomendado para LLMs complejos).<\/li>\n\n\n\n<li>8 GB de RAM m\u00ednimo (<em>16+ recomendado para modelos grandes<\/em>).<\/li>\n\n\n\n<li>Espacio de disco significativo (modelos pueden ocupar decenas de GB).<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"4_Instalacion_de_Ollama_en_Debian_13\"><\/span>4. 
Instalaci\u00f3n de Ollama en Debian 13<span class=\"ez-toc-section-end\"><\/span><\/h2>\n\n\n\n<h3 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"41_Requisitos_previos\"><\/span>4.1 Requisitos previos<span class=\"ez-toc-section-end\"><\/span><\/h3>\n\n\n\n<p>Sistema base recomendado:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Debian 13 (estable)<\/li>\n\n\n\n<li>Arquitectura x86_64<\/li>\n\n\n\n<li>16 GB de RAM (m\u00ednimo funcional)<\/li>\n\n\n\n<li>Swap de 16 GB<\/li>\n\n\n\n<li>Usuario sin privilegios + sudo<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"42_Instalacion_oficial\"><\/span>4.2 Instalaci\u00f3n oficial<span class=\"ez-toc-section-end\"><\/span><\/h3>\n\n\n\n<p>Ollama proporciona un m\u00e9todo de instalaci\u00f3n directo y mantenible:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ sudo apt install -y curl ca-certificates\n$ curl -fsSL https:\/\/ollama.com\/install.sh | sh<\/pre>\n\n\n\n<p>Este script:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Instala el binario en <code class=\"\" data-line=\"\">\/usr\/local\/bin\/ollama<\/code>.<\/li>\n\n\n\n<li>Crea el usuario y el servicio <code class=\"\" data-line=\"\">ollama<\/code>.<\/li>\n\n\n\n<li>Configura systemd autom\u00e1ticamente.<\/li>\n<\/ul>\n\n\n\n<p>Recarga la configuraci\u00f3n de systemd, habilita el servicio de Ollama y rein\u00edcialo:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ sudo systemctl daemon-reload\n$ sudo systemctl enable ollama\n$ sudo systemctl restart ollama<\/pre>\n\n\n\n<h3 class=\"wp-block-heading\"><span class=\"ez-toc-section\" 
id=\"44_Verificacion\"><\/span>4.4 Verificaci\u00f3n<span class=\"ez-toc-section-end\"><\/span><\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ ollama --version\nollama version is 0.14.2<\/pre>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ systemctl status ollama\n\u25cf ollama.service - Ollama Service\n     Loaded: loaded (\/etc\/systemd\/system\/ollama.service; enabled; preset: enabled)\n    Drop-In: \/etc\/systemd\/system\/ollama.service.d\n             \u2514\u2500override.conf\n     Active: active (running) since Sun 2026-01-18 12:56:15 CET; 33min ago\n Invocation: 311ee159fbef4d0dbcc1cd13015e7d62\n   Main PID: 2664 (ollama)\n      Tasks: 13 (limit: 18132)\n     Memory: 210.9M (peak: 504.1M)\n        CPU: 1.784s\n     CGroup: \/system.slice\/ollama.service\n             \u2514\u25002664 \/usr\/local\/bin\/ollama serve<\/pre>\n\n\n\n<p>Si el servicio est\u00e1 activo, Ollama est\u00e1 correctamente instalado.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"5_Problema_critico_Ollama_y_la_particion_raiz\"><\/span>5. 
Problema cr\u00edtico: Ollama y la partici\u00f3n ra\u00edz <code class=\"\" data-line=\"\">\/<\/code><span class=\"ez-toc-section-end\"><\/span><\/h2>\n\n\n\n<p>Por defecto, Ollama guarda los modelos en <code class=\"\" data-line=\"\">\/usr\/share\/ollama\/.ollama\/models<\/code>, dentro de la partici\u00f3n ra\u00edz, lo que en Debian es <strong>peligroso<\/strong>, ya que:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Los modelos ocupan desde menos de 1 GB hasta m\u00e1s de 400 GB<\/li>\n\n\n\n<li>La ra\u00edz suele ser peque\u00f1a<\/li>\n\n\n\n<li>Llenarla puede provocar bloqueos del sistema<\/li>\n<\/ul>\n\n\n\n<p>Soluci\u00f3n profesional: mover los modelos a <code class=\"\" data-line=\"\">\/home<\/code>.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"51_Configuracion_avanzada_de_systemd_para_usar_home\"><\/span>5.1 Configuraci\u00f3n avanzada de systemd para usar <code class=\"\" data-line=\"\">\/home<\/code><span class=\"ez-toc-section-end\"><\/span><\/h3>\n\n\n\n<h4 class=\"wp-block-heading\">5.1.1 Crear el directorio de modelos<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ mkdir -p \/home\/ojosdegato\/ollama_models<\/pre>\n\n\n\n<p><strong>Nota:<\/strong> Sustituye ojosdegato por tu usuario del sistema.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">5.1.2 Permisos correctos<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ sudo chown -R ollama:ollama \/home\/ojosdegato\/ollama_models\n$ sudo chmod +x \/home\/ojosdegato\n<\/pre>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>Sin este paso, el usuario <code class=\"\" data-line=\"\">ollama<\/code> no podr\u00e1 atravesar tu <code class=\"\" data-line=\"\">\/home<\/code> y el servicio <strong>no arrancar\u00e1<\/strong> en Debian.<\/p>\n<\/blockquote>\n\n\n\n<h3 class=\"wp-block-heading\"><span 
class=\"ez-toc-section\" id=\"6_Override_del_servicio_systemd\"><\/span>6  Override del servicio systemd<span class=\"ez-toc-section-end\"><\/span><\/h3>\n\n\n\n<p>Crear el archivo manualmente y edita con nano<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ sudo mkdir -p \/etc\/systemd\/system\/ollama.service.d<\/pre>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ sudo nano \/etc\/systemd\/system\/ollama.service.d\/override.conf\n<\/pre>\n\n\n\n<p>Contenido <strong>exacto<\/strong>:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">[Service]\nEnvironment=\"OLLAMA_MODELS=\/home\/ojosdegato\/ollama_models\"\nEnvironment=\"OLLAMA_HOST=0.0.0.0\"\nEnvironment=\"OLLAMA_ORIGINS=*\"\nReadWritePaths=\/home\/ojosdegato\/ollama_models\n<\/pre>\n\n\n\n<p><strong>Nota:<\/strong> Sustituye ojosdegato por tu usuario del sistema y Aplicar cambios:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ sudo systemctl daemon-reload\n$ sudo systemctl restart ollama<\/pre>\n\n\n\n<p>Verificaci\u00f3n:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" 
data-enlighter-title=\"\" data-enlighter-group=\"\">$ systemctl show ollama --property=Environment\nEnvironment=PATH=\/usr\/local\/bin:\/usr\/bin:\/bin:\/usr\/local\/games:\/usr\/games OLLAMA_MODELS=\/home\/ojosdegato\/ollama_models OLLAMA_HOST=0.0.0.0 \"OLLAMA_ORIGINS=*\"<\/pre>\n\n\n\n<h3 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"6_Descargar_y_ejecutar_un_modelo_LVM\"><\/span>6. Descargar y ejecutar un modelo LVM<span class=\"ez-toc-section-end\"><\/span><\/h3>\n\n\n\n<p>Esta es la tabla oficial sobre los modelos a fecha de 01\/2026: <\/p>\n\n\n<table style=\"font-size: 1rem; border-collapse: collapse; font-family: Arial, sans-serif; height: 635px; width: 100%;\">\n<thead>\n<tr style=\"background-color: #1e73be; color: #ffffff;\">\n<th style=\"border-style: solid; border-color: #1e73be; padding: 8px; height: 37px;\">Modelo<\/th>\n<th style=\"border-style: solid; border-color: #1e73be; padding: 8px; height: 37px;\">Par\u00e1metros<\/th>\n<th style=\"border-style: solid; border-color: #1e73be; padding: 8px; height: 37px;\">Tama\u00f1o<\/th>\n<th style=\"border-style: solid; border-color: #1e73be; padding: 8px; height: 37px;\">Comando de descarga<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Gemma 3<\/td>\n<td style=\"height: 23px;\">1B<\/td>\n<td style=\"height: 23px;\">815 MB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run gemma3:1b<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Gemma 3<\/td>\n<td style=\"height: 23px;\">4B<\/td>\n<td style=\"height: 23px;\">3.3 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run gemma3<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Gemma 3<\/td>\n<td style=\"height: 23px;\">12B<\/td>\n<td style=\"height: 23px;\">8.1 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run gemma3:12b<\/code><\/td>\n<\/tr>\n<tr 
style=\"height: 23px;\">\n<td style=\"height: 23px;\">Gemma 3<\/td>\n<td style=\"height: 23px;\">27B<\/td>\n<td style=\"height: 23px;\">17 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run gemma3:27b<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">QwQ<\/td>\n<td style=\"height: 23px;\">32B<\/td>\n<td style=\"height: 23px;\">20 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run qwq<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">DeepSeek-R1<\/td>\n<td style=\"height: 23px;\">7B<\/td>\n<td style=\"height: 23px;\">4.7 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run deepseek-r1<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">DeepSeek-R1<\/td>\n<td style=\"height: 23px;\">671B<\/td>\n<td style=\"height: 23px;\">404 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run deepseek-r1:671b<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Llama 4<\/td>\n<td style=\"height: 23px;\">109B<\/td>\n<td style=\"height: 23px;\">67 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run llama4:scout<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Llama 4<\/td>\n<td style=\"height: 23px;\">400B<\/td>\n<td style=\"height: 23px;\">245 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run llama4:maverick<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Llama 3.3<\/td>\n<td style=\"height: 23px;\">70B<\/td>\n<td style=\"height: 23px;\">43 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run llama3.3<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Llama 3.2<\/td>\n<td style=\"height: 23px;\">3B<\/td>\n<td style=\"height: 23px;\">2.0 GB<\/td>\n<td style=\"height: 
23px;\"><code class=\"\" data-line=\"\">ollama run llama3.2<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Llama 3.2<\/td>\n<td style=\"height: 23px;\">1B<\/td>\n<td style=\"height: 23px;\">1.3 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run llama3.2:1b<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Llama 3.2 Vision<\/td>\n<td style=\"height: 23px;\">11B<\/td>\n<td style=\"height: 23px;\">7.9 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run llama3.2-vision<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Llama 3.2 Vision<\/td>\n<td style=\"height: 23px;\">90B<\/td>\n<td style=\"height: 23px;\">55 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run llama3.2-vision:90b<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Llama 3.1<\/td>\n<td style=\"height: 23px;\">8B<\/td>\n<td style=\"height: 23px;\">4.7 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run llama3.1<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Llama 3.1<\/td>\n<td style=\"height: 23px;\">405B<\/td>\n<td style=\"height: 23px;\">231 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run llama3.1:405b<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Phi 4<\/td>\n<td style=\"height: 23px;\">14B<\/td>\n<td style=\"height: 23px;\">9.1 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run phi4<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Phi 4 Mini<\/td>\n<td style=\"height: 23px;\">3.8B<\/td>\n<td style=\"height: 23px;\">2.5 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run phi4-mini<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 
23px;\">Mistral<\/td>\n<td style=\"height: 23px;\">7B<\/td>\n<td style=\"height: 23px;\">4.1 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run mistral<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Moondream 2<\/td>\n<td style=\"height: 23px;\">1.4B<\/td>\n<td style=\"height: 23px;\">829 MB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run moondream<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Neural Chat<\/td>\n<td style=\"height: 23px;\">7B<\/td>\n<td style=\"height: 23px;\">4.1 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run neural-chat<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Starling<\/td>\n<td style=\"height: 23px;\">7B<\/td>\n<td style=\"height: 23px;\">4.1 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run starling-lm<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Code Llama<\/td>\n<td style=\"height: 23px;\">7B<\/td>\n<td style=\"height: 23px;\">3.8 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run codellama<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Llama 2 Uncensored<\/td>\n<td style=\"height: 23px;\">7B<\/td>\n<td style=\"height: 23px;\">3.8 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run llama2-uncensored<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">LLaVA<\/td>\n<td style=\"height: 23px;\">7B<\/td>\n<td style=\"height: 23px;\">4.5 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" data-line=\"\">ollama run llava<\/code><\/td>\n<\/tr>\n<tr style=\"height: 23px;\">\n<td style=\"height: 23px;\">Granite-3.3<\/td>\n<td style=\"height: 23px;\">8B<\/td>\n<td style=\"height: 23px;\">4.9 GB<\/td>\n<td style=\"height: 23px;\"><code class=\"\" 
data-line=\"\">ollama run granite3.3<\/code><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n\n\n<p><\/p>\n\n\n\n<p>On our reference laptop (16 GB of RAM + 16 GB of SSD swap = 32 GB), the models that run well are those whose size does not exceed the 16 GB of physical RAM; it is best to leave the swap untouched. For example, I opted for <strong>Llama 3.1 (8B, 4.7 GB)<\/strong>. A borderline and not recommended choice would be Gemma 27B (17 GB): it would spill into swap and run very slowly.<\/p>\n\n\n\n<p><strong>Llama 3.1<\/strong> is a family of large language models (LLMs) developed by Meta, originally released in mid-2024 and consolidated as an open-source standard in 2025 and 2026.<\/p>\n\n\n\n<p>Its main features include:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>The 405B model:<\/strong> the first open-source model to compete head-on with the most advanced proprietary models such as GPT-4o, making it well suited to model distillation, synthetic data generation, and complex reasoning tasks.<\/li>\n\n\n\n<li><strong>Multilingual capability:<\/strong> natively supports eight languages, including Spanish, English, German, French, Italian, Portuguese, Hindi, and Thai.<\/li>\n\n\n\n<li><strong>Extended context window:<\/strong> offers a <strong>128k-token<\/strong> context window, which allows lengthy documents or entire books to be processed in a single query.<\/li>\n\n\n\n<li><strong>Variants:<\/strong> besides the 405B model, the family includes the <strong>8B<\/strong> version (for local devices) and the <strong>70B<\/strong> version (for high-performance enterprise applications).<\/li>\n<\/ul>\n\n\n\n<p>To download and install the Llama 3.1 8B model (4.7 GB), which fits my hardware:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ ollama run llama3.1<\/pre>\n\n\n\n<p>The same command, <code class=\"\" data-line=\"\">ollama run llama3.1<\/code>, starts an interactive session with the model in the Debian terminal:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ ollama run llama3.1\n>>> Who is the ideologue behind Linux?\nThe creator of Linux is Linus Benedict Torvalds. However, he is not an ideologue in the traditional sense of the term, but rather a developer and programmer who built the foundation of the operating system.\n\nLinux follows the collaborative (open-source) development model and has a large, active community that contributes to the source code and improves it continuously. The Linux project is a classic example of free and open software, and its philosophy centers on collaboration, transparency, and making the code available to anyone.\n\nLinus Torvalds has stood out for his community-driven, collaborative approach, creating an environment where developers can contribute actively to the project. His vision has allowed Linux to become one of the most widely used operating systems in the world, especially on servers and network devices.\n\nAs for political or philosophical ideologies related to Linux, there are several currents within the free-software development community. Some of the main ones include:\n\n*   The free software movement: centered on the idea that computer programs should be free so that anyone can study, modify, and distribute them.\n*   The philosophy of Richard Stallman: founder of the Free Software Foundation (FSF), who defends the idea that software must be free and is subject to copyright law.\n\n>>> Send a message (\/? for help)\n<\/pre>\n\n\n\n<h4 class=\"wp-block-heading\">Model management and maintenance<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ ollama list                 # List installed models\n$ ollama rm model             # Remove a model\n$ df -h \/home                 # Check available space<\/pre>\n\n\n\n<h3 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"7_Instalacion_de_Open_WebUI_con_Docker_interfaz_tipo_ChatGPT\"><\/span>7. 
Installation of Open WebUI with Docker (a ChatGPT-style interface)<span class=\"ez-toc-section-end\"><\/span><\/h3>\n\n\n\n<h4 class=\"wp-block-heading\">7.1 Requirements<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ sudo apt install docker.io docker-compose -y\n$ sudo systemctl enable --now docker\n$ sudo usermod -aG docker $USER\n$ newgrp docker<\/pre>\n\n\n\n<p><strong>Note:<\/strong> <strong>$USER<\/strong> expands automatically to the name of the user running the command; replace it explicitly only if you are adding a different account to the <strong>docker<\/strong> group.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">7.2 Deploying Open WebUI (port 3000)<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">docker run -d -p 3000:8080 \\\n  --add-host=host.docker.internal:host-gateway \\\n  -v open-webui:\/app\/backend\/data \\\n  -e OLLAMA_BASE_URL=http:\/\/host.docker.internal:11434 \\\n  --name open-webui \\\n  --restart always \\\n  ghcr.io\/open-webui\/open-webui:main\n<\/pre>\n\n\n\n<h4 class=\"wp-block-heading\">7.3 Make sure the firewall allows the traffic:<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">$ sudo ufw allow 3000\/tcp      # Open WebUI interface\n$ sudo ufw allow 11434\/tcp     # Ollama API<\/pre>\n\n\n\n<p>Web access:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">http:\/\/localhost:3000<\/pre>\n\n\n\n<figure class=\"wp-block-image 
aligncenter size-large is-resized\"><img decoding=\"async\" width=\"1024\" height=\"539\" src=\"https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/Captura-de-pantalla_2026-01-18_15-07-58-1024x539.png\" alt=\"\" class=\"wp-image-7552\" style=\"width:1200px;height:auto\" srcset=\"https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/Captura-de-pantalla_2026-01-18_15-07-58-1024x539.png 1024w, https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/Captura-de-pantalla_2026-01-18_15-07-58-300x158.png 300w, https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/Captura-de-pantalla_2026-01-18_15-07-58-768x404.png 768w, https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/Captura-de-pantalla_2026-01-18_15-07-58-1536x809.png 1536w, https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/Captura-de-pantalla_2026-01-18_15-07-58.png 1920w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>Create your username and password.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><span class=\"ez-toc-section\" id=\"Resumen\"><\/span>Summary<span class=\"ez-toc-section-end\"><\/span><\/h2>\n\n\n\n<p>This article documents, thoroughly and from hands-on experience, the installation, configuration, and day-to-day use of <strong>Ollama<\/strong> as a local artificial-intelligence engine on <strong>Debian 13<\/strong>, integrating it with a ChatGPT-style graphical interface through <strong>Open WebUI<\/strong> and Docker. The main goal is to show that <strong>local, private, sovereign AI<\/strong> is viable on home hardware when sound technical criteria are applied.<\/p>\n<div style='text-align:center' class='yasr-auto-insert-overall'><\/div><div style='text-align:center' class='yasr-auto-insert-visitor'><\/div>","protected":false},"excerpt":{"rendered":"<p>Ollama is an open-source tool that runs large language models (LLMs) directly on a local machine. This makes it especially attractive to AI developers, researchers, and companies concerned about data control and privacy.<\/p>\n<p>By running the models locally, you retain full ownership of your data and avoid the security risks associated with cloud storage.<\/p>\n","protected":false},"author":1,"featured_media":7549,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"
repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"jetpack_post_was_ever_published":false,"yasr_overall_rating":0,"yasr_post_is_review":"","yasr_auto_insert_disabled":"","yasr_review_type":"","_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[5,2,3],"tags":[7,299,15],"class_list":["post-7548","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-desarrollo","category-informatica","category-sistemas","tag-debian-linux","tag-ia","tag-linux"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.5 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Gu\u00eda Maestra: IA Privada de c\u00f3digo abierto con Ollama y Open WebUI en Debian 13 | javiercachon.com<\/title>\n<meta name=\"description\" content=\"Este art\u00edculo documenta de forma exhaustiva y basada en experiencia real la instalaci\u00f3n, configuraci\u00f3n y explotaci\u00f3n de Ollama como motor de inteligencia artificial local en Debian 13, integr\u00e1ndolo con una interfaz gr\u00e1fica tipo ChatGPT mediante Open WebUI y Docker. 
El objetivo principal es demostrar que la IA local, privada y soberana es viable en equipos dom\u00e9sticos si se aplican criterios t\u00e9cnicos correctos.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/\" \/>\n<meta property=\"og:locale\" content=\"es_ES\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Gu\u00eda Maestra: IA Privada de c\u00f3digo abierto con Ollama y Open WebUI en Debian 13 | javiercachon.com\" \/>\n<meta property=\"og:description\" content=\"Este art\u00edculo documenta de forma exhaustiva y basada en experiencia real la instalaci\u00f3n, configuraci\u00f3n y explotaci\u00f3n de Ollama como motor de inteligencia artificial local en Debian 13, integr\u00e1ndolo con una interfaz gr\u00e1fica tipo ChatGPT mediante Open WebUI y Docker. 
El objetivo principal es demostrar que la IA local, privada y soberana es viable en equipos dom\u00e9sticos si se aplican criterios t\u00e9cnicos correctos.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/\" \/>\n<meta property=\"og:site_name\" content=\"javiercachon.com\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/javiercachon82\" \/>\n<meta property=\"article:author\" content=\"https:\/\/www.facebook.com\/javiercachon82\" \/>\n<meta property=\"article:published_time\" content=\"2026-01-18T14:16:20+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2026-01-30T14:46:23+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/ollama.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1920\" \/>\n\t<meta property=\"og:image:height\" content=\"1080\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Javier Cach\u00f3n Garrido\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@https:\/\/x.com\/udegeek\" \/>\n<meta name=\"twitter:site\" content=\"@udegeek\" \/>\n<meta name=\"twitter:label1\" content=\"Escrito por\" \/>\n\t<meta name=\"twitter:data1\" content=\"Javier Cach\u00f3n Garrido\" \/>\n\t<meta name=\"twitter:label2\" content=\"Tiempo de lectura\" \/>\n\t<meta name=\"twitter:data2\" content=\"7 minutos\" \/>\n<script type=\"application\/ld+json\" 
class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/javiercachon.com\\\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/javiercachon.com\\\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\\\/\"},\"author\":{\"name\":\"Javier Cach\u00f3n Garrido\",\"@id\":\"https:\\\/\\\/javiercachon.com\\\/es\\\/#\\\/schema\\\/person\\\/56ea56e1350676921cc43a3bcfd6c997\"},\"headline\":\"Gu\u00eda Maestra: IA Privada de c\u00f3digo abierto con Ollama y Open WebUI en Debian 13\",\"datePublished\":\"2026-01-18T14:16:20+00:00\",\"dateModified\":\"2026-01-30T14:46:23+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/javiercachon.com\\\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\\\/\"},\"wordCount\":1369,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\\\/\\\/javiercachon.com\\\/es\\\/#\\\/schema\\\/person\\\/56ea56e1350676921cc43a3bcfd6c997\"},\"image\":{\"@id\":\"https:\\\/\\\/javiercachon.com\\\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/javiercachon.com\\\/wp-content\\\/uploads\\\/2026\\\/01\\\/ollama.jpg\",\"keywords\":[\"Debian Linux\",\"IA\",\"Linux\"],\"articleSection\":[\"Desarrollo\",\"Inform\u00e1tica\",\"Sistemas\"],\"inLanguage\":\"es\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/javiercachon.com\\\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/javiercachon.com\\\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\\\/\",\"url\":\"https:\\\/\\\/javiercachon.com\\\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\\\/\",\"name\":\"Gu\u00eda Maestra: IA 
Privada de c\u00f3digo abierto con Ollama y Open WebUI en Debian 13 | javiercachon.com\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/javiercachon.com\\\/es\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/javiercachon.com\\\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/javiercachon.com\\\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/javiercachon.com\\\/wp-content\\\/uploads\\\/2026\\\/01\\\/ollama.jpg\",\"datePublished\":\"2026-01-18T14:16:20+00:00\",\"dateModified\":\"2026-01-30T14:46:23+00:00\",\"description\":\"Este art\u00edculo documenta de forma exhaustiva y basada en experiencia real la instalaci\u00f3n, configuraci\u00f3n y explotaci\u00f3n de Ollama como motor de inteligencia artificial local en Debian 13, integr\u00e1ndolo con una interfaz gr\u00e1fica tipo ChatGPT mediante Open WebUI y Docker. El objetivo principal es demostrar que la IA local, privada y soberana es viable en equipos dom\u00e9sticos si se aplican criterios t\u00e9cnicos 
correctos.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/javiercachon.com\\\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\\\/#breadcrumb\"},\"inLanguage\":\"es\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/javiercachon.com\\\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"es\",\"@id\":\"https:\\\/\\\/javiercachon.com\\\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\\\/#primaryimage\",\"url\":\"https:\\\/\\\/javiercachon.com\\\/wp-content\\\/uploads\\\/2026\\\/01\\\/ollama.jpg\",\"contentUrl\":\"https:\\\/\\\/javiercachon.com\\\/wp-content\\\/uploads\\\/2026\\\/01\\\/ollama.jpg\",\"width\":1920,\"height\":1080},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/javiercachon.com\\\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"inicio\",\"item\":\"https:\\\/\\\/javiercachon.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Gu\u00eda Maestra: IA Privada de c\u00f3digo abierto con Ollama y Open WebUI en Debian 13\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/javiercachon.com\\\/es\\\/#website\",\"url\":\"https:\\\/\\\/javiercachon.com\\\/es\\\/\",\"name\":\"javiercachon.com\",\"description\":\"es una plataforma educativa revolucionaria que busca democratizar el acceso al conocimiento en 
inform\u00e1tica.\",\"publisher\":{\"@id\":\"https:\\\/\\\/javiercachon.com\\\/es\\\/#\\\/schema\\\/person\\\/56ea56e1350676921cc43a3bcfd6c997\"},\"alternateName\":\"javiercachon.com\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/javiercachon.com\\\/es\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"es\"},{\"@type\":[\"Person\",\"Organization\"],\"@id\":\"https:\\\/\\\/javiercachon.com\\\/es\\\/#\\\/schema\\\/person\\\/56ea56e1350676921cc43a3bcfd6c997\",\"name\":\"Javier Cach\u00f3n Garrido\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"es\",\"@id\":\"https:\\\/\\\/javiercachon.com\\\/wp-content\\\/uploads\\\/2024\\\/11\\\/Montanas-JC-logo-azul.jpg\",\"url\":\"https:\\\/\\\/javiercachon.com\\\/wp-content\\\/uploads\\\/2024\\\/11\\\/Montanas-JC-logo-azul.jpg\",\"contentUrl\":\"https:\\\/\\\/javiercachon.com\\\/wp-content\\\/uploads\\\/2024\\\/11\\\/Montanas-JC-logo-azul.jpg\",\"width\":250,\"height\":34,\"caption\":\"Javier Cach\u00f3n Garrido\"},\"logo\":{\"@id\":\"https:\\\/\\\/javiercachon.com\\\/wp-content\\\/uploads\\\/2024\\\/11\\\/Montanas-JC-logo-azul.jpg\"},\"description\":\"Javier Cach\u00f3n Garrido es un experto en inform\u00e1tica, apasionado por el c\u00f3digo abierto y la educaci\u00f3n tecnol\u00f3gica. Como fundador de JavierCachon.com, impulsa la formaci\u00f3n en GNU\\\/Linux y software libre, promoviendo el aprendizaje accesible y colaborativo. Su misi\u00f3n es democratizar el conocimiento en TI, ofreciendo recursos gratuitos y de calidad para profesionales y entusiastas del sector. 
Es un profeta y predicador que evangeliza la tecnolog\u00eda, la inform\u00e1tica y telecomunicaciones, acomp\u00e1\u00f1ale en su aventura y pasi\u00f3n por los sistemas operativos abiertos GNU\\\/Linux.\",\"sameAs\":[\"https:\\\/\\\/javiercachon.com\\\/\",\"https:\\\/\\\/www.facebook.com\\\/javiercachon82\",\"https:\\\/\\\/www.instagram.com\\\/edugeek\\\/\",\"https:\\\/\\\/www.linkedin.com\\\/in\\\/javiercachongarrido\\\/\",\"https:\\\/\\\/x.com\\\/https:\\\/\\\/x.com\\\/udegeek\",\"https:\\\/\\\/www.youtube.com\\\/@edugeek\",\"https:\\\/\\\/wiki.javiercachon.com\\\/wiki\\\/index.php\\\/Usuario:Ojosdegato\"],\"url\":\"https:\\\/\\\/javiercachon.com\\\/es\\\/author\\\/ojosdegato\\\/\"},false]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Gu\u00eda Maestra: IA Privada de c\u00f3digo abierto con Ollama y Open WebUI en Debian 13 | javiercachon.com","description":"Este art\u00edculo documenta de forma exhaustiva y basada en experiencia real la instalaci\u00f3n, configuraci\u00f3n y explotaci\u00f3n de Ollama como motor de inteligencia artificial local en Debian 13, integr\u00e1ndolo con una interfaz gr\u00e1fica tipo ChatGPT mediante Open WebUI y Docker. 
El objetivo principal es demostrar que la IA local, privada y soberana es viable en equipos dom\u00e9sticos si se aplican criterios t\u00e9cnicos correctos.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/","og_locale":"es_ES","og_type":"article","og_title":"Gu\u00eda Maestra: IA Privada de c\u00f3digo abierto con Ollama y Open WebUI en Debian 13 | javiercachon.com","og_description":"Este art\u00edculo documenta de forma exhaustiva y basada en experiencia real la instalaci\u00f3n, configuraci\u00f3n y explotaci\u00f3n de Ollama como motor de inteligencia artificial local en Debian 13, integr\u00e1ndolo con una interfaz gr\u00e1fica tipo ChatGPT mediante Open WebUI y Docker. El objetivo principal es demostrar que la IA local, privada y soberana es viable en equipos dom\u00e9sticos si se aplican criterios t\u00e9cnicos correctos.","og_url":"https:\/\/javiercachon.com\/es\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/","og_site_name":"javiercachon.com","article_publisher":"https:\/\/www.facebook.com\/javiercachon82","article_author":"https:\/\/www.facebook.com\/javiercachon82","article_published_time":"2026-01-18T14:16:20+00:00","article_modified_time":"2026-01-30T14:46:23+00:00","og_image":[{"width":1920,"height":1080,"url":"https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/ollama.jpg","type":"image\/jpeg"}],"author":"Javier Cach\u00f3n Garrido","twitter_card":"summary_large_image","twitter_creator":"@https:\/\/x.com\/udegeek","twitter_site":"@udegeek","twitter_misc":{"Escrito por":"Javier Cach\u00f3n Garrido","Tiempo de lectura":"7 
minutos"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/javiercachon.com\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#article","isPartOf":{"@id":"https:\/\/javiercachon.com\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/"},"author":{"name":"Javier Cach\u00f3n Garrido","@id":"https:\/\/javiercachon.com\/es\/#\/schema\/person\/56ea56e1350676921cc43a3bcfd6c997"},"headline":"Gu\u00eda Maestra: IA Privada de c\u00f3digo abierto con Ollama y Open WebUI en Debian 13","datePublished":"2026-01-18T14:16:20+00:00","dateModified":"2026-01-30T14:46:23+00:00","mainEntityOfPage":{"@id":"https:\/\/javiercachon.com\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/"},"wordCount":1369,"commentCount":0,"publisher":{"@id":"https:\/\/javiercachon.com\/es\/#\/schema\/person\/56ea56e1350676921cc43a3bcfd6c997"},"image":{"@id":"https:\/\/javiercachon.com\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#primaryimage"},"thumbnailUrl":"https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/ollama.jpg","keywords":["Debian Linux","IA","Linux"],"articleSection":["Desarrollo","Inform\u00e1tica","Sistemas"],"inLanguage":"es","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/javiercachon.com\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/javiercachon.com\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/","url":"https:\/\/javiercachon.com\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/","name":"Gu\u00eda Maestra: IA Privada de c\u00f3digo abierto con Ollama y Open WebUI en Debian 13 | 
javiercachon.com","isPartOf":{"@id":"https:\/\/javiercachon.com\/es\/#website"},"primaryImageOfPage":{"@id":"https:\/\/javiercachon.com\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#primaryimage"},"image":{"@id":"https:\/\/javiercachon.com\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#primaryimage"},"thumbnailUrl":"https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/ollama.jpg","datePublished":"2026-01-18T14:16:20+00:00","dateModified":"2026-01-30T14:46:23+00:00","description":"Este art\u00edculo documenta de forma exhaustiva y basada en experiencia real la instalaci\u00f3n, configuraci\u00f3n y explotaci\u00f3n de Ollama como motor de inteligencia artificial local en Debian 13, integr\u00e1ndolo con una interfaz gr\u00e1fica tipo ChatGPT mediante Open WebUI y Docker. El objetivo principal es demostrar que la IA local, privada y soberana es viable en equipos dom\u00e9sticos si se aplican criterios t\u00e9cnicos 
correctos.","breadcrumb":{"@id":"https:\/\/javiercachon.com\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#breadcrumb"},"inLanguage":"es","potentialAction":[{"@type":"ReadAction","target":["https:\/\/javiercachon.com\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/"]}]},{"@type":"ImageObject","inLanguage":"es","@id":"https:\/\/javiercachon.com\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#primaryimage","url":"https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/ollama.jpg","contentUrl":"https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/ollama.jpg","width":1920,"height":1080},{"@type":"BreadcrumbList","@id":"https:\/\/javiercachon.com\/guia-maestra-ia-privada-de-codigo-abierto-con-ollama-y-open-webui-en-debian-13\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"inicio","item":"https:\/\/javiercachon.com\/"},{"@type":"ListItem","position":2,"name":"Gu\u00eda Maestra: IA Privada de c\u00f3digo abierto con Ollama y Open WebUI en Debian 13"}]},{"@type":"WebSite","@id":"https:\/\/javiercachon.com\/es\/#website","url":"https:\/\/javiercachon.com\/es\/","name":"javiercachon.com","description":"es una plataforma educativa revolucionaria que busca democratizar el acceso al conocimiento en inform\u00e1tica.","publisher":{"@id":"https:\/\/javiercachon.com\/es\/#\/schema\/person\/56ea56e1350676921cc43a3bcfd6c997"},"alternateName":"javiercachon.com","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/javiercachon.com\/es\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"es"},{"@type":["Person","Organization"],"@id":"https:\/\/javiercachon.com\/es\/#\/schema\/person\/56ea56e1350676921cc43a3bcfd6c997","name":"Javier Cach\u00f3n 
Garrido","image":{"@type":"ImageObject","inLanguage":"es","@id":"https:\/\/javiercachon.com\/wp-content\/uploads\/2024\/11\/Montanas-JC-logo-azul.jpg","url":"https:\/\/javiercachon.com\/wp-content\/uploads\/2024\/11\/Montanas-JC-logo-azul.jpg","contentUrl":"https:\/\/javiercachon.com\/wp-content\/uploads\/2024\/11\/Montanas-JC-logo-azul.jpg","width":250,"height":34,"caption":"Javier Cach\u00f3n Garrido"},"logo":{"@id":"https:\/\/javiercachon.com\/wp-content\/uploads\/2024\/11\/Montanas-JC-logo-azul.jpg"},"description":"Javier Cach\u00f3n Garrido es un experto en inform\u00e1tica, apasionado por el c\u00f3digo abierto y la educaci\u00f3n tecnol\u00f3gica. Como fundador de JavierCachon.com, impulsa la formaci\u00f3n en GNU\/Linux y software libre, promoviendo el aprendizaje accesible y colaborativo. Su misi\u00f3n es democratizar el conocimiento en TI, ofreciendo recursos gratuitos y de calidad para profesionales y entusiastas del sector. Es un profeta y predicador que evangeliza la tecnolog\u00eda, la inform\u00e1tica y telecomunicaciones, acomp\u00e1\u00f1ale en su aventura y pasi\u00f3n por los sistemas operativos abiertos 
GNU\/Linux.","sameAs":["https:\/\/javiercachon.com\/","https:\/\/www.facebook.com\/javiercachon82","https:\/\/www.instagram.com\/edugeek\/","https:\/\/www.linkedin.com\/in\/javiercachongarrido\/","https:\/\/x.com\/https:\/\/x.com\/udegeek","https:\/\/www.youtube.com\/@edugeek","https:\/\/wiki.javiercachon.com\/wiki\/index.php\/Usuario:Ojosdegato"],"url":"https:\/\/javiercachon.com\/es\/author\/ojosdegato\/"},false]}},"yasr_visitor_votes":{"stars_attributes":{"read_only":false,"span_bottom":false},"number_of_votes":0,"sum_votes":0},"jetpack_publicize_connections":[],"jetpack_featured_media_url":"https:\/\/javiercachon.com\/wp-content\/uploads\/2026\/01\/ollama.jpg","jetpack_likes_enabled":true,"jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/javiercachon.com\/es\/wp-json\/wp\/v2\/posts\/7548","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/javiercachon.com\/es\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/javiercachon.com\/es\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/javiercachon.com\/es\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/javiercachon.com\/es\/wp-json\/wp\/v2\/comments?post=7548"}],"version-history":[{"count":10,"href":"https:\/\/javiercachon.com\/es\/wp-json\/wp\/v2\/posts\/7548\/revisions"}],"predecessor-version":[{"id":7566,"href":"https:\/\/javiercachon.com\/es\/wp-json\/wp\/v2\/posts\/7548\/revisions\/7566"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/javiercachon.com\/es\/wp-json\/wp\/v2\/media\/7549"}],"wp:attachment":[{"href":"https:\/\/javiercachon.com\/es\/wp-json\/wp\/v2\/media?parent=7548"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/javiercachon.com\/es\/wp-json\/wp\/v2\/categories?post=7548"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/javiercachon.com\/es\/wp-json\/wp\/v2\/tags?post=7548"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}