The following is a condensed version of a chat I had with ChatGPT Sora, which I asked to turn into Q&A form. The answers are paraphrased, but the core answers are mine.
Q: Let’s start from the beginning — what sparked your tech journey?
It all began with a single Ubiquiti UniFi access point. I wanted to host the UniFi Controller myself rather than buy the standalone product, so I turned to a Raspberry Pi as a low-cost, low-power server. That tiny computer ended up teaching me far more than I expected: not just networking, but the entire Linux ecosystem.
Q: Was it easy getting the UniFi Controller to run on a Raspberry Pi?
Not at all. This was before generative AI tools like ChatGPT existed, so I had to search the web for every command and configuration tweak. I started out using the GUI desktop on the Pi but soon realized that command-line control was essential. The first major hurdle was dependency and version conflicts in Docker, a recurring theme that taught me patience and precision.
Q: How did those early experiences evolve into broader self-hosting projects?
Once I got the controller working, I was hooked. I began experimenting with other Docker containers: hosting a personal blog, a music server, and even a photo archive using open-source tools like Immich (an alternative to Google Photos).
As my ambitions grew, so did my understanding of reverse proxies. I spent countless hours configuring Caddy, eventually compiling my own image to fine-tune logging for Fail2Ban. That project deepened my grasp of web security and remote access, especially when setting up HTTPS certificates and troubleshooting browser-level networking issues.
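To give a flavor of what that logging work fed into, here’s a minimal Python sketch of the kind of filtering Fail2Ban performs over Caddy’s JSON access log: count failed requests per client IP and flag repeat offenders. The log path, field names, and threshold below are assumptions, so adjust them to your own Caddy logging config.

```python
# Minimal sketch: scan a Caddy JSON access log for IPs with repeated
# 4xx responses, the raw signal a Fail2Ban filter turns into bans.
# The log path, field names ("status", "request.remote_ip"), and the
# threshold are assumptions; check your Caddy logging configuration.
import json
from collections import Counter

LOG_PATH = "/var/log/caddy/access.log"  # assumption
THRESHOLD = 5  # failed requests before an IP looks suspicious

failures = Counter()
with open(LOG_PATH) as log:
    for line in log:
        try:
            entry = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip partial or non-JSON lines
        if entry.get("status", 0) in (401, 403, 404):
            ip = entry.get("request", {}).get("remote_ip", "unknown")
            failures[ip] += 1

for ip, count in failures.items():
    if count >= THRESHOLD:
        print(f"{ip}: {count} failed requests")  # candidate for a ban
```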
Q: Were there hardware limitations along the way?
Definitely. My Raspberry Pi 3B+ could only handle so much. Running multiple containers strained performance, so I learned the value of lightweight deployments and efficient resource management. Those lessons became the foundation for how I now approach running AI stacks: minimalism first, performance second.
Q: You mentioned you weren’t initially interested in generative AI. What changed?
When AI tools first became popular, I was skeptical. I wanted to see tangible, transformative use cases rather than hype. But once I started using ChatGPT, that perspective shifted. It was immediately useful, so much so that I hit usage quotas quickly, and that frustration sparked a new goal: to self-host my own AI models. That’s when I discovered Ollama and started experimenting with local LLMs and containerized AI workflows. Suddenly, the Raspberry Pi experience felt like preparation for this exact moment.
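For a sense of what “self-hosting a model” actually looks like, here’s a minimal sketch of calling a local Ollama server from Python. It assumes Ollama is running on its default port and that a model (here “llama3”, just an example name) has already been pulled.

```python
# Minimal sketch: send a prompt to a locally running Ollama server.
# Assumes Ollama is up on its default port (11434) and a model (here
# "llama3", an example name) has been pulled with `ollama pull`.
import json
import urllib.request

payload = {
    "model": "llama3",  # example model name (assumption)
    "prompt": "Explain what a reverse proxy does in one sentence.",
    "stream": False,    # return one JSON object instead of a stream
}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```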
Q: How did your self-hosting and Docker knowledge translate into AI experimentation?
Perfectly. All the foundational concepts applied directly: containerization, reverse proxies, networking, resource optimization. I started running small models locally, using Caddy again to solve CORS (Cross-Origin Resource Sharing) issues between frontend ports and API calls. That same troubleshooting muscle I’d built years earlier made it possible to piece together my current AI stack with relatively little frustration.
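To illustrate the CORS problem, here’s a rough Python stand-in for what Caddy does in my stack: answer the browser’s preflight OPTIONS request and attach Access-Control-Allow-Origin headers while forwarding API calls to the local backend. The ports and upstream address are assumptions.

```python
# Rough sketch of the reverse-proxy fix for CORS: a browser frontend on
# one port can't call an API on another port unless the response carries
# Access-Control-Allow-Origin headers and preflight OPTIONS requests are
# answered. Ports and the upstream address are assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer
import urllib.request

UPSTREAM = "http://localhost:11434"  # e.g. a local Ollama API (assumption)

class CorsProxy(BaseHTTPRequestHandler):
    def _cors_headers(self):
        self.send_header("Access-Control-Allow-Origin", "*")
        self.send_header("Access-Control-Allow-Headers", "Content-Type")
        self.send_header("Access-Control-Allow-Methods", "GET, POST, OPTIONS")

    def do_OPTIONS(self):
        # Browsers send a preflight OPTIONS request before cross-origin POSTs.
        self.send_response(204)
        self._cors_headers()
        self.end_headers()

    def do_POST(self):
        # Forward the request body upstream, then relay the response
        # back with the CORS headers attached.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        req = urllib.request.Request(UPSTREAM + self.path, data=body,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            data = resp.read()
        self.send_response(200)
        self._cors_headers()
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(data)

HTTPServer(("localhost", 8080), CorsProxy).serve_forever()
```

In practice I let Caddy handle all of this declaratively, but the mechanics are the same.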
Q: What’s the next step in your journey now?
Right now, I’m developing an extension for SuperDoc, an open-source, Microsoft Word-compatible document editor. The goal is to replicate enterprise-level AI-assisted contract editing, but on a self-hosted stack. I’m integrating it with Ollama and local LLM APIs, using Docker to deploy the entire setup. Of course, that comes with its own set of challenges, particularly CORS and networking issues, but again, Caddy comes to the rescue.
Q: How do you handle coding for these projects if you don’t have a programming background?
I actually studied C++ in high school, but I’d forgotten almost everything. These days, I rely on VS Code’s AI assistants to “vibe-code”: I describe what I want, and the AI scaffolds the logic, co-writing functions with me in real time. It’s an incredible way to build complex systems without being a professional developer. Every project I do teaches me something new about how AI can act as a multiplier for technical creativity.
Q: Let’s pivot to your professional focus — how does this intersect with legal tech?
My current work sits at the intersection of law and AI. I’m exploring how small law firms can use open-source, self-hosted AI tools to match the efficiency of enterprise legal platforms. I’m particularly interested in Retrieval-Augmented Generation (RAG) systems: designing ways for AI to pull and summarize relevant case data rather than relying on costly fine-tuned models. For a small firm, building a customized RAG pipeline with hierarchical document summarization and a vector database can be far more practical than paying for proprietary AI services.
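As a toy illustration of the idea, here’s a compact RAG loop in Python: embed text chunks through a local Ollama server, retrieve the closest chunk by cosine similarity (a plain in-memory list standing in for a real vector database), and answer with it as context. The model names and sample clauses are assumptions, and hierarchical summarization is omitted for brevity.

```python
# Toy RAG sketch: embed text chunks via a local Ollama server, retrieve
# the most similar chunk for a query, and answer with it as context.
# Model names and sample clauses are example assumptions; a real
# pipeline would use a proper vector database, not an in-memory list.
import json
import math
import urllib.request

OLLAMA = "http://localhost:11434"

def ollama(endpoint, payload):
    req = urllib.request.Request(
        f"{OLLAMA}/api/{endpoint}",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def embed(text):
    # "nomic-embed-text" is an example embedding model (assumption).
    return ollama("embeddings", {"model": "nomic-embed-text",
                                 "prompt": text})["embedding"]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

# In-memory "vector store": (chunk, embedding) pairs.
chunks = [
    "Clause 4.2: Either party may terminate with 30 days written notice.",
    "Clause 7.1: Liability is capped at the fees paid in the prior 12 months.",
]
store = [(c, embed(c)) for c in chunks]

query = "How much notice is needed to terminate?"
q_vec = embed(query)
best = max(store, key=lambda pair: cosine(q_vec, pair[1]))[0]

answer = ollama("generate", {
    "model": "llama3",  # example model name (assumption)
    "prompt": f"Context:\n{best}\n\nQuestion: {query}\n"
              "Answer using only the context.",
    "stream": False,
})
print(answer["response"])
```

Swapping the in-memory list for a vector database and layering summaries above the raw chunks is where the “hierarchical” part of the pipeline comes in.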
Q: Many top law firms now use AI to reduce manpower costs. How do you see that trend?
Big firms like Allen & Overy and Orrick are already leveraging tools such as Harvey AI to automate drafting, filings, and research. These solutions free senior lawyers to focus on strategy. But for smaller firms, I believe the real transformation will come from open, self-hosted systems, giving lawyers independence, transparency, and full control over their data. It’s about democratizing legal AI, not just scaling it.
Q: Looking forward, what do you want to achieve with your projects?
My goal is to create transformative, self-hosted AI tools that rival the capabilities of proprietary cloud platforms, not just for myself, but as open-source solutions others can build upon. I see my Raspberry Pi journey as the blueprint for that vision: small, hands-on, resourceful, and open to constant iteration. From networking scripts to AI document analysis, it’s all part of the same continuum: empowering individuals to build their own intelligent systems from the ground up.
Q: Final thoughts?
If I’ve learned anything, it’s that you don’t need massive infrastructure or a team of engineers to create something meaningful. Curiosity, persistence, and a willingness to troubleshoot are enough to go from a blinking Pi light to running your own AI-assisted workflow. Self-hosting isn’t just a technical decision; it’s a mindset of ownership and exploration.