// Nine Years of Silence
I owe you an honest explanation. IT Digest started in 2013 as a simple place to collect the kind of knowledge that every developer accumulates over years — the terminal commands you always forget, the MySQL configuration tricks, the Mac OS X quirks that waste your afternoon. Quick tips, code snippets, problems and solutions. Nothing revolutionary, but genuinely useful.
Then life happened. The day job got busier. The articles got fewer. And at some point in 2017, I just stopped. The site stayed online — the articles kept getting traffic from search engines — but I wasn't writing anymore. IT Digest became an archive.
I didn't plan to come back. But then something happened that made it impossible to stay quiet.
// The Shift
If you're reading this, you already know. The AI revolution didn't creep in — it kicked the door down. In the span of roughly three years, we went from GPT-3 being a curiosity to a world where I can sit at my desk in Geneva, run a 70-billion-parameter model locally on my Mac Studio, and have an AI assistant write, debug, and architect production code alongside me.
This isn't hype. This is my daily workflow. Right now, as I type this, Claude Code is open in my terminal. Ollama is serving Llama and Qwen models on localhost. LM Studio is running benchmarks in the background. My oCMS — the very platform you're reading this on — has an AI chatbot built into it. And every single one of these tools either didn't exist or was science fiction when I published that last Yahoo security post in 2017.
I've been a software engineer for over thirty years. I've seen the Web arrive. I've seen mobile transform everything. I've watched cloud computing reshape infrastructure. But I have never — not once — experienced a shift this fast, this deep, and this personal to my craft. AI isn't just a new technology. It's a new way of building software. It changes how I think about architecture, how I write code, how I solve problems, and how I interact with machines.
That's why I'm back. Not because blogging is trendy. Because I genuinely can't stop building with this stuff, and I need a place to write about it.
// What I've Been Building
While IT Digest was silent, I wasn't idle. The biggest project is the platform you're looking at right now.
oCMS — A CMS Built in Go
oCMS is a lightweight content management system I built from scratch in Go. Not WordPress. Not a headless CMS-as-a-service. A real, self-hosted CMS written in a language I love, with the architecture choices I believe in:
- Go 1.26 on the backend — fast, safe, single-binary deployment
- SQLite as the database — zero-config, embedded, surprisingly powerful with FTS5 full-text search
- HTMX + Alpine.js for interactivity — no React, no Vue, no 200MB node_modules
- templ for type-safe HTML templates — compile-time guarantees instead of runtime surprises
- A full REST API with bearer token auth and rate limiting
- A webhook system with HMAC-SHA256 signatures and exponential backoff retries
- A module architecture that lets you toggle features at runtime without restarting
- Multi-language support, media library with image processing, form builder, import/export
- And yes — an AI chatbot integration powered by Dify, because why not practice what you preach
oCMS is open source on GitHub. I built it partly because I wanted a CMS that reflects modern Go development practices, and partly because building a CMS is the best way to deeply understand web architecture. It's been a masterclass in template engines, database design, caching strategies, and security hardening — all things that matter even more in the age of AI.
My Local AI Lab
On the AI side, I've assembled what I think of as a local AI lab. My Mac Studio M3 Ultra with 96GB unified memory lets me run large language models that would have required a data center just three years ago. My current setup:
- Ollama — my primary local inference engine, running Llama 3.3, Qwen 2.5, DeepSeek, and others
- LM Studio — for model exploration, benchmarking, and GGUF experimentation
- Claude Code — my daily AI coding companion in the terminal, hands down the most useful developer tool I've adopted in years
- Claude API + OpenAI API — for building AI-powered features into oCMS and other projects
- Dify — for RAG pipelines and the AI chatbot you can see on this site
- LibreChat — a self-hosted multi-model chat interface for when I need to compare outputs
The key insight? Running models locally changes everything. Latency drops to milliseconds. Privacy is absolute: nothing leaves your machine. The marginal cost per query is zero. And you develop an intuition for what these models can and can't do that you simply don't get from API calls alone. When you watch a 70B model struggle with a coding task in real time on your own hardware, you understand AI's limitations viscerally — and that understanding makes you a better AI engineer.
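To make the local setup concrete, here's a small Go sketch against Ollama's HTTP API — the `/api/generate` endpoint it serves on `localhost:11434`. The request shape matches Ollama's documented API; the helper name and prompt are mine, and the call only succeeds when Ollama is actually running.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

// generateRequest mirrors the JSON body Ollama's /api/generate expects.
type generateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"` // false = one JSON response instead of a token stream
}

// buildGenerate marshals a non-streaming generation request.
func buildGenerate(model, prompt string) ([]byte, error) {
	return json.Marshal(generateRequest{Model: model, Prompt: prompt})
}

func main() {
	body, err := buildGenerate("llama3.3", "Explain HTMX in one sentence.")
	if err != nil {
		fmt.Println("marshal error:", err)
		return
	}
	resp, err := http.Post("http://localhost:11434/api/generate",
		"application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println("Ollama not reachable:", err)
		return
	}
	defer resp.Body.Close()
	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out))
}
```

Swap the model name for anything you've pulled (`qwen2.5`, `deepseek-r1`, ...) and the same dozen lines talk to all of them — which is exactly why a local-first setup is so pleasant to build against.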
// What's Changing at IT Digest
IT Digest is evolving. The old MySQL tips and Mac OS X tricks aren't going anywhere — they still help people, and that matters. But the focus going forward is clear:
- AI Engineering — practical guides on LLM integration, RAG architecture, MCP servers, prompt engineering, and AI-assisted development. No theory papers. Real code, real benchmarks, real production patterns.
- Go Development — building web services, CMS architecture, performance optimization, idiomatic Go. With a heavy AI flavor.
- Developer AI Tools — honest reviews and deep-dives on Claude Code, Ollama, LM Studio, Dify, and everything else I use daily. What works, what doesn't, and why.
- Self-Hosting & Infrastructure — running local LLMs, Docker setups, Debian/Ubuntu server management, and the ops side of AI deployment.
I'm also changing how I write. The old IT Digest posts were mostly short tips — a problem, a command, done. Those are still useful, but AI demands deeper thinking. Expect longer articles. Architecture diagrams. Code walkthroughs. Comparisons with actual benchmarks. Opinion pieces on where AI is heading and what it means for developers.
// Why This Matters
We're living through the early days of the most significant technological shift of our careers. The developers who thrive won't be the ones who fear AI or ignore it — they'll be the ones who understand it deeply enough to build with it effectively. Who know when to use a local model vs. an API call. Who can architect a RAG pipeline that actually works in production. Who understand the security implications of putting an LLM in a customer-facing product.
That's the knowledge I want to share. Not because I have all the answers — I'm learning every day, same as everyone — but because the process of writing forces clarity. Every article I publish makes me a better engineer, and hopefully makes you one too.
// What's Next
In the coming weeks, expect articles on:
- Building oCMS: architecture decisions, why Go, and lessons learned
- Running 70B+ parameter models locally on Apple Silicon — a practical guide
- Claude Code vs. GitHub Copilot: a senior developer's honest comparison
- RAG architecture patterns that actually work in production
- MCP servers: the protocol that's quietly changing how AI tools work
IT Digest is back. The silence is over. And honestly? I'm more excited about technology than I've been in twenty years.
Let's build.
— Oleg Ivanchenko, Geneva, March 2026