<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>AI on Dhiru's Notebook</title><link>https://rfcorner.in/tags/ai/</link><description>Recent content in AI on Dhiru's Notebook</description><generator>Hugo</generator><language>en</language><lastBuildDate>Sat, 16 May 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://rfcorner.in/tags/ai/index.xml" rel="self" type="application/rss+xml"/><item><title>Local AI Coding with Aider on Ubuntu 26.04</title><link>https://rfcorner.in/posts/local-ai-coding-tool/</link><pubDate>Sat, 16 May 2026 00:00:00 +0000</pubDate><guid>https://rfcorner.in/posts/local-ai-coding-tool/</guid><description>&lt;p&gt;&lt;em&gt;Running your own AI coding assistant locally with ROCm, llama.cpp, and Aider.&lt;/em&gt;&lt;/p&gt;
&lt;h2 id="why-local-ai-coding"&gt;Why Local AI Coding?&lt;/h2&gt;
&lt;p&gt;Cloud AI coding assistants are convenient, but local models offer:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Better privacy&lt;/li&gt;
&lt;li&gt;Lower long-term cost&lt;/li&gt;
&lt;li&gt;Offline development&lt;/li&gt;
&lt;li&gt;Faster iteration for small/medium models&lt;/li&gt;
&lt;li&gt;Full control over models and tooling&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;With ROCm support for modern AMD GPUs improving rapidly, Ubuntu 26.04 makes it surprisingly easy to run coding models locally.&lt;/p&gt;
&lt;p&gt;In this guide, we'll set up:&lt;/p&gt;</description></item></channel></rss>