<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Data Privacy on Vishal Vijayraghavan</title><link>https://vvr.netlify.app/tags/data-privacy/</link><description>Recent content in Data Privacy on Vishal Vijayraghavan</description><generator>Hugo</generator><language>en-us</language><copyright>© Copyright notice</copyright><lastBuildDate>Wed, 22 Jan 2025 00:00:00 +0000</lastBuildDate><atom:link href="https://vvr.netlify.app/tags/data-privacy/index.xml" rel="self" type="application/rss+xml"/><item><title>Local AI Coding Assistant</title><link>https://vvr.netlify.app/post/local_coding_assistant/</link><pubDate>Wed, 22 Jan 2025 00:00:00 +0000</pubDate><guid>https://vvr.netlify.app/post/local_coding_assistant/</guid><description>I came across GitHub Copilot and Cursor, and they&amp;rsquo;re truly amazing and feature-rich. However, I was concerned about data privacy, so I wanted something I could run locally. No worries, open-source always has an alternative! This time, it&amp;rsquo;s Ollama and Continue (a VS Code extension). Local AI coding assistants might seem intimidating, but with Ollama and the Continue VS Code extension, setting one up is surprisingly straightforward. In this blog, I will walk you through setting up a powerful, private coding companion right on your local machine.</description></item></channel></rss>