A Month of Building
One Month Back to Coding: What I've Been Learning and Building
It's been a month now since I started coding again regularly. A lot has happened in these 30 days—new learning paths, new tools, and new projects. Here's what I've been up to.
Re-Learning Deep Learning Fundamentals
I went through the first few lessons of Andrej Karpathy's Zero to Hero series. Most of it covered things I already knew and had worked on before, but there was something refreshing about building Micrograd from scratch. I had previously created a deep learning course in which we recreated a TensorFlow-like neural network library, so this was a good refresher.
What struck me most was remembering how much I enjoy this process. Training neural networks is genuinely fun—playing with hyperparameters, tweaking architectures, and watching the results unfold. The videos also reminded me of how I learn best: through a hands-on approach where I type everything myself and make mistakes that strengthen my understanding. I value mental models that help me grasp concepts deeply, and visual explanations help too. I revisited some 3Blue1Brown videos during this time, which never disappoint.
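To give a flavor of what "building Micrograd from scratch" involves, here is a minimal sketch of a scalar autograd engine in the spirit of Micrograd. This is my own condensed illustration, not Karpathy's exact code: a `Value` wraps a number, records the operations that produced it, and `backward()` applies the chain rule over the resulting graph.

```python
class Value:
    """A scalar with reverse-mode automatic differentiation, Micrograd-style."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # how to propagate grad to children
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            # d(out)/d(self) = 1 and d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            # product rule: each input's grad scales by the other input
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)

        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()


# Example: y = a * b + a, so dy/da = b + 1 and dy/db = a
a, b = Value(2.0), Value(-3.0)
y = a * b + a
y.backward()
print(y.data, a.grad, b.grad)  # -4.0 -2.0 2.0
```

Typing out even this much by hand, then extending it with more operations and a small neural net on top, is exactly the kind of hands-on exercise that makes backpropagation click.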
Discovering Solveit
I attended 7 out of 10 live sessions from the Solveit cohort, and it's been transformative. Solveit has become my standard notebook coding environment for exploration. I tried Jupyter AI, but it's still not at the level of the Solveit platform.
What makes Solveit special is that the model has access to my entire workspace—my notes, code, and prompt messages. This means I can go deep down rabbit holes while learning a concept without losing context. In the course, we learned, among many other things, the simplest way to build agents. With Solveit, I feel like I can learn anything.
Vibe Coding Adventures
I also experimented with vibe coding some apps this month, and it was incredibly fun. I started with VS Code using the Codex extension. This very blog was built through vibe coding with Codex—I started with a template and made tweaks by prompting an LLM.
These days, I'm working on a more ambitious project using Antigravity, which offers a better experience. I'll dedicate a special blog post to Antigravity later because there's a lot to share about it.
Building jaimemalangue.bj
One of the projects I've been working on this month is jaimemalangue.bj, an initiative to collect voice data for building an ASR (Automatic Speech Recognition) model for Fongbe, the most widely spoken national language in Benin. We launched the campaign with a goal of collecting 500 hours of audio in the first phase. This data will later be open-sourced for anyone who wants to use it to train ASR or multilingual models.
On Writing More Often
It took me a month to write this second post, while my initial goal was to write at least three times a week. Working on multiple engaging projects while going through a cohort and building new things made it harder to write as often as I'd like.
I'm inspired by Simon Willison's approach—he writes prolifically on his blog, with some posts being just one or two paragraphs. Maybe I should try that. Short updates, frequent posts. Let's see if that could work for me.