<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Privacy-First AI on Rachid Youven Zeghlache</title><link>https://youvenz.github.io/tags/privacy-first-ai/</link><description>Recent content in Privacy-First AI on Rachid Youven Zeghlache</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Thu, 05 Mar 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://youvenz.github.io/tags/privacy-first-ai/index.xml" rel="self" type="application/rss+xml"/><item><title>Supercharge TeXstudio: Local AI Chat Without APIs</title><link>https://youvenz.github.io/blog/2026-03-05-supercharge-texstudio-local-ai-chat-without-apis/</link><pubDate>Thu, 05 Mar 2026 00:00:00 +0000</pubDate><guid>https://youvenz.github.io/blog/2026-03-05-supercharge-texstudio-local-ai-chat-without-apis/</guid><description>&lt;h2 id="set-up-a-local-llm-inside-texstudio-without-cloud-apis--for-latex-writers-who-want-privacy"&gt;Set Up a Local LLM Inside TeXstudio Without Cloud APIs — For LaTeX Writers Who Want Privacy&lt;/h2&gt;
&lt;p&gt;You&amp;rsquo;re writing a LaTeX paper, and you want AI assistance—but you don&amp;rsquo;t want to pay per API call, send drafts to external servers, or depend on internet connectivity.&lt;/p&gt;
&lt;p&gt;Right now, TeXstudio&amp;rsquo;s &lt;strong&gt;AI Chat Assistant&lt;/strong&gt; ships with connectors for only two cloud providers: OpenAI and Mistral. There&amp;rsquo;s a third way: run an LLM locally and connect it directly to TeXstudio in about 15 minutes.&lt;/p&gt;</description></item></channel></rss>