Around 2019 I had a minor breakdown about learning. Frontend frameworks were changing every six months. Everyone was talking about Kubernetes. GraphQL was the future, except maybe gRPC, or maybe something else entirely. I had a list of "things I should learn" with about 40 items on it.

I was spending evenings and weekends watching tutorials, barely making progress on any single topic, feeling increasingly behind. Every Hacker News thread about some new technology felt like an accusation: why don't you know this yet?

Eventually something clicked: I was trying to prepare for a future that might never come. Learning "just in case" rather than "just in time." And it was making me miserable without making me better at my job.

The Trap of Keeping Up

Here's a liberating truth: you can't keep up. Nobody can. The field is too broad, moving too fast. Even people who seem to know everything actually know a small slice of everything - they're just good at talking about their slice.

I know senior architects at big tech companies who don't know React. I know frontend specialists who couldn't write a SQL join to save their lives. I know infrastructure engineers who have never built a web application. They're all successful. They all have impostor syndrome about the things they don't know. They all think everyone else knows more.

The anxiety about "falling behind" is based on a false premise - that there's some standard knowledge set all developers have. There isn't. There are just different specializations and trade-offs.

Just-in-Time vs Just-in-Case

I spent three weekends learning Kubernetes basics in 2018 because everyone said it was important. Then I didn't use it. The knowledge faded. When I finally needed it in 2022, I had to relearn most of it anyway.

Those weekends were wasted. Not because Kubernetes isn't valuable, but because I learned it at the wrong time. Without a real problem to solve, without actual production systems to work with, nothing stuck.

Now I learn things when I need them. This feels risky - what if I need something and don't know it? But in practice, most technologies can be learned as you go. The first week is rough, then you're productive, then you're proficient. The learning is more efficient because it's tied to real problems.

I still read about new technologies casually, but I've stopped trying to deeply learn things I'm not actively using. Awareness that something exists is different from knowing how to use it. Awareness is cheap. Deep learning is expensive. Spend the expensive resource wisely.

Going Deep Instead of Wide

There's a concept of the "T-shaped" developer: broad shallow knowledge across many areas, deep expertise in one or two. I think this is actually a useful model.

Broad knowledge means knowing what tools exist and roughly what they're good for. "Oh, you need to process CSV files at scale? That sounds like a Spark/pandas kind of problem." You don't need to know Spark - you just need to know that's the right direction.
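And when the problem does land on your desk, the jump from "awareness" to "working" is often smaller than it looks. A sketch of the pandas direction for a CSV too big for memory - filenames and columns here are invented, and an in-memory string stands in for the real file:

```python
# Sketch: stream a large CSV in chunks with pandas so the whole file
# never has to fit in memory. The data and column names are made up
# for the example; a real call would pass a file path.
import io

import pandas as pd

# Stand-in for a big file on disk.
csv_data = io.StringIO("user,amount\nalice,10\nbob,5\nalice,7\n")

total = 0
for chunk in pd.read_csv(csv_data, chunksize=2):  # 2 rows per chunk
    total += chunk["amount"].sum()

print(total)  # 22
```

The point stands, though: you don't need to have this memorized. Knowing that "pandas has a chunked reader" is enough to find it in ten minutes when you need it.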

Deep knowledge means being genuinely good at something. The person teams call when there's a hard problem in that area. For me it's backend web development. I'm not the best at DevOps or mobile or data engineering, but I'm solid at designing APIs and building web services.

Trying to be deep in everything leads to burnout. You can't be an expert in everything. Pick your battles.

Tutorial Hell Is Real

Watching a 10-hour Udemy course gives you a false sense of competence. You nod along, understand the explanations, feel like you're learning. Then you open a blank file and can't write anything.

Tutorials are passive learning. You're following someone else's thought process, not developing your own. It's like watching someone ride a bike and thinking you can now ride a bike.

The fix is to build something different while you learn. If the tutorial builds a todo app, you build a bookmark manager. Same concepts, different application. This forces you to adapt what you're learning instead of just copying. You'll hit problems the tutorial didn't cover. You'll have to think. That's where real learning happens.

I also like learning projects that are actually useful to me. Learning Python? Automate something annoying at work. Learning a frontend framework? Build the side project you've been putting off. Motivation is easier when there's a real outcome.
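As a flavor of "automate something annoying": the kind of throwaway script that makes a learning project feel worth it. Everything here is invented for the demo (it creates its own temp folder), and note that it moves files, so point it at a real directory with care:

```python
# Toy "automate something annoying" script: sort loose files into
# subfolders named after their extensions. The folder and filenames
# are invented for the demo.
import tempfile
from pathlib import Path

def sort_by_extension(folder: Path) -> int:
    """Move each file into a subfolder named after its extension.

    Returns the number of files moved. Files without an extension
    are left alone.
    """
    moved = 0
    for f in sorted(folder.iterdir()):  # snapshot before adding subdirs
        if f.is_file() and f.suffix:
            dest = folder / f.suffix.lstrip(".")
            dest.mkdir(exist_ok=True)
            f.rename(dest / f.name)
            moved += 1
    return moved

if __name__ == "__main__":
    demo = Path(tempfile.mkdtemp())
    (demo / "report.pdf").touch()
    (demo / "notes.txt").touch()
    print(sort_by_extension(demo))  # 2
```

Twenty lines, one real annoyance gone, and you've touched paths, iteration, and functions along the way. That beats three hours of a course you'll forget.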

One Hard Thing at a Time

I've seen developers try to learn Docker, a new language, and a new framework simultaneously for a side project. This almost always fails. Too many unknowns. Every error could be any of the three things. Debugging becomes impossible.

Dan McKinley has this concept of "innovation tokens": on any project, you only get a few to spend on new or risky technology. Spend more than that and the project collapses under the accumulated uncertainty.

Apply this to learning: if you're learning Rust, use a database you already know. If you're learning a new cloud platform, use a language you're comfortable with. Keep everything else boring and familiar so you can focus on the new thing.

Accepting "Good Enough"

When I started learning Go, I kept stopping to understand how the runtime worked, how the garbage collector was implemented, what was happening at the syscall level. Interesting, sure. But I couldn't write a simple HTTP server because I kept falling into rabbit holes.

It's okay to not understand everything. It's okay to copy code you don't fully grasp, get it working, then dig deeper later. The goal is building things, not achieving complete understanding of every abstraction.

I know developers who can explain React's reconciliation algorithm in detail but can't ship a working feature. I also know developers who couldn't tell you how virtual DOM works but have built dozens of production applications. The second type gets paid more.

Understanding comes with usage. Get something working first. Understanding will follow.
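For what it's worth, the "get it working first" version of that HTTP server is a handful of lines. This sketch is Python rather than Go (the idea transfers), using only the standard library, with zero curiosity about what the runtime is doing underneath:

```python
# The "get it working first" HTTP server: answer requests now,
# investigate the runtime later (or never).
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To actually serve (blocks until interrupted):
#   HTTPServer(("127.0.0.1", 8000), Hello).serve_forever()
```

It's not production-grade, and it doesn't need to be. Once it responds, the garbage collector and the syscalls can wait for a rainy day.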

Protecting Your Curiosity

This is maybe the most important thing: don't let learning feel like homework.

The developers who stay engaged for decades are the ones who still find programming interesting. Not because they should, but because they do. The moment learning becomes a chore - something you do because you're afraid of falling behind - you're on a path to burnout.

Take breaks. It's okay to not learn anything technical for a few weeks. Read novels. Touch grass. Your career won't collapse because you didn't keep up with the latest JavaScript framework for a month.

When I feel overwhelmed by the firehose of new technologies, I remind myself: I don't have to learn any of this. I choose to learn the parts that interest me, when I need them. The rest will wait, or won't matter.

The tech industry has enough anxiety built in. Don't add to it by treating learning like a never-ending race you're losing. Learn at your own pace, for your own reasons. It's a marathon, and you're only running against yourself.
