There is some interesting discussion going on this morning on the topic of coding (i.e., programming computers) and whether the general populace should learn to do it.
The discussion stems from this blog post by Jeff Atwood titled “Please Don’t Learn to Code.”
I won’t rehash the points brought up by each side; however, I think that on some level, they all imply the same two things:
- Computers are and will continue to be a central part of humanity’s present and future.
- Everyone needs to have some level of understanding of how computers work in order to effectively interact with them.
Point #1 is not in dispute; no one will argue against it. It is on point #2 above that the debate begins.
Part of the difficulty is that almost everyone making an argument for or against learning to code has tried to introduce an analogy to something we already understand. Proponents of learning to code compare it to basic human skills like reading or writing. Those against compare it to specialized trades such as plumbing. The problem is that programming computers is different from any of these, so the arguments ultimately break down.
The difficulty in even asking the question is that there are so many levels of abstraction contained within the word “code” that we can’t fully answer it unless we have the same vantage point.
Let me break it down:
- To a computer, everyone is a hack, because no one programs in machine code.
- Assembly programmers think all high level programming is a hack.
- C programmers think anyone coding for a virtual machine is a hack.
- Java and C# programmers think scripting is a hack.
And these are only a few examples. What about those who write macros in Excel, or apply styles in Word? The point is that at every level, some things are hidden away, and the level of control given to the coder is reduced. Does this make a coder who works in any of these given levels of abstraction less of a coder?
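The layering above can be made concrete. As a hypothetical sketch, Python's built-in `dis` module lets you peek one level down: a single high-level line is itself a stack of abstractions, compiled to bytecode for a virtual machine, which in turn runs on machine code the coder never sees.

```python
import dis

def total(values):
    # One high-level line: no loops over registers or memory addresses in sight.
    return sum(v * 2 for v in values)

# The layer hidden underneath: the bytecode the CPython virtual machine executes.
dis.dis(total)

print(total([1, 2, 3]))  # → 12
```

Whether you write the high-level line or hand-tune the layer beneath it, both are instructions to a machine; only the amount hidden away differs.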
Coming back to everyone learning to code, let me pose this scenario:
My son has a Lego Mindstorms kit and has spent some time using the (albeit awkward) IDE to program his robot to do things like walk or respond when it sees a certain color. To me, this qualifies as coding, even though he didn’t type a word of code; he programmed the robot visually. Does he now understand a decision branch and a looping structure? Absolutely.
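The decision branch and loop his visual program expresses could equally be sketched in text. This is a minimal illustration, not the Mindstorms API: the sensor here is a hypothetical stand-in for the real color-sensor block.

```python
def read_color(step):
    # Hypothetical stand-in for the Mindstorms color-sensor block:
    # pretend the robot sees red on every third step.
    return "red" if step % 3 == 0 else "gray"

def run_robot(steps):
    actions = []
    # Looping structure: repeat the program for a fixed number of steps.
    for step in range(steps):
        # Decision branch: act differently when a certain color is seen.
        if read_color(step) == "red":
            actions.append("beep")
        else:
            actions.append("walk")
    return actions

print(run_robot(6))  # → ['beep', 'walk', 'walk', 'beep', 'walk', 'walk']
```

Dragging blocks in the visual IDE builds exactly this structure; only the notation differs.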
I think this kind of visual coding will become more and more popular in the years to come, particularly at the consumer level. We will always try to make things as easy as possible, which usually means hiding more in the background. Twenty years ago, C was considered a high-level language; today, most people would consider it low level.
So are you only a coder if you understand everything happening in the background? If so, at what level? An interpreted language? A compiled language? Assembly? Machine code? I don’t think it’s correct to define coding as only typing words to make a computer do something.
So framing the discussion around whether everyone should learn to “code” is difficult, because there are many different levels of coding, and each of them has a role to play. I think the better discussion is how we can use the concepts behind coding (at every one of these levels) to organize our thoughts, communicate better, and solve real-world problems.
If you enjoyed this article, follow Phil on Twitter for related stuff.