Given the ever-expanding role of software in our lives, it isn't hard to see why so many people have been exuberantly advocating learning computer programming lately. The idea that everyone should learn to code — practiced at new websites like Codecademy and preached by media cheerleaders like Douglas Rushkoff and Tim O'Reilly — has become practically meme-like. At its best, it has sparked a long-overdue conversation about the importance of understanding and participating in the complex systems being built around us. But the philosophy recently found a notable opponent in programmer and blogger Jeff Atwood, who earlier this week argued the contrary: that average folks shouldn't bother learning to code unless they plan on making a career of it.

Atwood, best known as a co-founder of Stack Overflow, was quickly called out by several of his peers. His argument rests on the assertion that for average people, coding is a specialized, non-essential skill. Citing an odd and probably inconsequential tweet from New York Mayor Michael Bloomberg, he irreverently compares the call to program with a call to learn plumbing, saying that among other things it "falsely equates coding with essential life skills like reading, writing, and math." But in doing so he overlooks a point that, admittedly, these coding initiatives haven't always been great at communicating: that "learning to code" and "becoming a programmer" are not the same thing, and that doing the former, at a time when software permeates nearly everything we do, is personally empowering.

And yes, it's okay — you can do one and not the other. In fact, you should.

"Learning to code simply means having a basic grasp of how computers work instead of blindly following whatever a talking paperclip tells you."

"I can't think of many other skills that enable you to create something from scratch and reach as many people as knowing how to set up a simple website," writes CodeYear designer Sacha Greif in response to Atwood's post. "'Learning to code' doesn’t always mean becoming the next Linus Torvalds, just like 'learning to cook' doesn’t mean opening a 3-stars restaurant. It simply means having a basic grasp of how computers work instead of blindly following whatever a talking paperclip tells you."

Atwood raises fair points about the unfortunate tendency to write code for the sake of writing code rather than to actually solve problems. But coding isn't necessarily a solution in search of a problem — it's a kind of literacy, a mindset that, properly applied, is useful anywhere there's software (and plenty of places there isn't). In the end, Atwood's "stay out of the kitchen" response to this new wave of free coding resources and training doesn't quite square with most of his own points. Instead, it boils down to a discouraging diatribe from a curmudgeonly (though very often brilliant) code veteran raised on the "old way" of computer science classes and textbook learning.

"If you don't know how to program, you filter out all parts of the world that involve programming," reads another response on Github. "Yes, I can "see the code" behind my phone and know that poor memory management, not the mercurial favor of the Gods caused that app to crash hard. But this is only because I understand programming. I am only able to function in this digital wasteland of convenience -- and not be dragged along sipping a slurpy -- because my search space for problem solving and perception has been expanded by my learning to code."

Atwood's rant has generated plenty of interesting and impassioned discussion worth reading, including one rebuttal that sits smack-dab in the middle. Be sure to also check out the responses on Hacker News.