We started an interesting conversation on LinkedIn recently. Our CTO polled members of the LinkedIn group International Society for Technology in Education (ISTE): “How early should computer programming skills be introduced to kids?”
This is a screenshot of the poll results.
The general consensus — almost 75% of people polled — was that students should learn to code starting in elementary school. This makes sense: learning a programming language is much like learning a natural language. The earlier you start, the better.
As with all good conversations, however, there were some who fervently disagreed. A few educators argued that computer programming need not be introduced to every student in K-12 education; they felt it should be optional.
To some extent, they’re right. Just as not everyone needs to be multilingual, not everyone needs advanced programming skills or a career as a programmer. That would defeat the purpose of the modern division of labor.
What we are seeing here is a basic difference in how ‘computer programming’ is interpreted. How should it be defined? Does it mean writing code in a particular programming language? Does it mean the ability to build a cool web application?
At Tynker, we don’t define programming skill as the ability to write Ruby or Python with perfect syntax. Rather, we believe that knowing how to program means understanding the computational logic that underlies every programming language.
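To make the distinction concrete, here is a small sketch in Python (one of the languages mentioned above; the function name and example are ours, purely illustrative). The valuable skill is recognizing the loop-test-accumulate pattern, not memorizing any one language’s keywords:

```python
# The underlying computational logic: repeat an action, test a
# condition, accumulate a result. The same pattern exists in Ruby,
# Scratch, or any other language; only the surface syntax changes.
def count_evens(numbers):
    count = 0
    for n in numbers:       # repeat: visit each item
        if n % 2 == 0:      # test: is this number even?
            count += 1      # accumulate: tally the matches
    return count

print(count_evens([1, 2, 3, 4, 5, 6]))  # prints 3
```

A student who grasps this pattern can carry it into any syntax they meet later.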
Seen from this perspective, programming is one of the most valued skills of the 21st century, and no age is too early to start learning it.