Justin James believes that teaching programming to students at an early age provides them with skills that will serve them well in the job market.
One of the often-repeated ideas in IT is that domain experts, not programmers, will create software in the "near future." Those of us in the trenches always laugh at this one, because we know it isn't so and won't be anytime soon. There are two major reasons why.
The first is that we are still barely at a Neanderthal level of tools when it comes to our technology; we are still essentially painting on cave walls by burning sticks and scraping them on rock. The second issue -- and this is the killer -- is that so few people have the proper mindset or training to construct software. I think that we can, and eventually will, have the toolsets needed to make software creation much simpler. But until then (and my guess is that this is decades, if not centuries, away), I believe that we need to treat programming as a skill on the same level as mathematics or basic science in our education system.
I am not suggesting putting third graders in front of Visual Studio or Eclipse or teaching them C# or Java, but I do believe that every student should be exposed to enough software development to properly leverage the skills they will be using as adults. More and more jobs involve working with a computer in one way or another, and many applications have a scripting system, an API, or some other way for a programmer to access them. These hooks allow users to make an application work the way they need it to, instead of being frustrated that it doesn't meet all of their needs out of the box. For example, I cannot tell you the number of times someone has complained that Microsoft Excel is "missing" a piece of functionality (as if it were not bloated enough) when it would only take them a few minutes to write a macro for it if they tried.
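To make that concrete, here is a minimal sketch of the kind of few-minutes fix meant here -- written in Python rather than Excel's macro language purely for illustration, with an invented "missing feature" and invented sample data:

```python
# A hypothetical "missing feature": a trimmed mean that ignores the
# single highest and lowest values -- the sort of small helper a user
# could write in minutes instead of waiting for the vendor to add it.
def trimmed_mean(values):
    """Average of values with the single min and max dropped."""
    if len(values) < 3:
        raise ValueError("need at least three values to trim")
    trimmed = sorted(values)[1:-1]  # drop the extremes
    return sum(trimmed) / len(trimmed)

if __name__ == "__main__":
    sales = [120, 95, 300, 110, 105]  # invented sample data
    print(trimmed_mean(sales))
```

The point is not this particular function; it is that a user who knows even this much programming can extend a tool instead of being stuck with it.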
Also, more and more jobs are actually specialized programmer positions. And even if the positions aren't true programming jobs rooted in a particular domain, an employee will perform much better if they have programming skills. Take a look at this list for a few examples:
- Scientist (many, if not most, of these positions)
- Financial analyst
- Business analyst
- Network engineer
- Systems administrator
- Engineer (most engineers)
All of these jobs require an employee to work with complex software applications all day long and often do tasks in those applications that are directly programming or close to it.
From talking to my peers, I gather that the typical Computer Science, MIS, or other IT student entering college is woefully unprepared. While our school systems claim to teach kids how to use computers and prepare them for a state-of-the-art career, what you see when you scratch the surface are students slapping together clip-art PowerPoint presentations or learning how to find dancing-baby videos on YouTube. The students are not being taught how to use or leverage a computer; they are assigned tasks on a computer that they would otherwise do without one.
The first thing students need to learn is how to actually use a computer. This means learning more than how to send an email; it means knowing how to really use a search engine's advanced features, how to apply critical thinking to evaluate whether a Web site is credible, and so on. The next thing students need to learn is how to use common productivity applications well enough to truly be productive in them. From there, students should be exposed to some sort of development.
When I was growing up, it was trendy to introduce kids to Logo by having them move the turtle around the screen. I was especially lucky in sixth grade because our class had a LEGO set that we could control from a computer (through Logo, I believe). It was a great experience that whetted my interest. Unfortunately, it seems like these kinds of activities have fallen out of favor, or by the time they are introduced, most students are old enough to think that moving a fake turtle around the screen is cheesy. Teachers should introduce students to programming in the fourth, fifth, or sixth grade. Those students are old enough to "get it" as long as the teacher is sufficiently patient.
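For readers who never met the turtle, the classic exercise can be sketched in a few lines. This is a toy Logo-style turtle written in Python (no graphics, just the coordinate math) rather than actual Logo, purely to show how little machinery the lesson needs:

```python
import math

# A miniature Logo-style "turtle": it tracks a position and heading and
# understands FORWARD and RIGHT, the two commands behind the classic
# draw-a-square exercise (REPEAT 4 [FD 50 RT 90] in Logo).
class Turtle:
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 0.0  # degrees; 0 points along the x-axis

    def forward(self, distance):
        rad = math.radians(self.heading)
        self.x += distance * math.cos(rad)
        self.y += distance * math.sin(rad)

    def right(self, degrees):
        self.heading = (self.heading - degrees) % 360

t = Turtle()
for _ in range(4):  # walk the four sides of a square
    t.forward(50)
    t.right(90)
print(round(math.hypot(t.x, t.y), 6))  # distance from start: 0.0
```

A student who works out why the turtle ends up exactly where it started has quietly learned about angles, loops, and state -- which is the whole point of the exercise.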
From there, teachers need to show students how to integrate programming into their other studies. For example, art teachers might show students how to put together a macro in their photo editing software or even how to write their own filters. Math teachers might demonstrate how to program the calculator to perform common functions, which would really teach the underlying mathematical concepts at the same time. And so on.
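As a hypothetical version of that calculator exercise (shown in Python rather than a calculator's own language), here is the quadratic formula as a short function; writing it forces a student to confront what the discriminant actually decides:

```python
import math

def quadratic_roots(a, b, c):
    """Real roots of a*x**2 + b*x + c = 0, smallest first."""
    if a == 0:
        raise ValueError("not quadratic: a must be nonzero")
    disc = b * b - 4 * a * c  # the discriminant decides how many real roots exist
    if disc < 0:
        return ()  # no real roots
    root = math.sqrt(disc)
    return tuple(sorted(((-b - root) / (2 * a), (-b + root) / (2 * a))))

print(quadratic_roots(1, -3, 2))  # x**2 - 3x + 2 = (x-1)(x-2): (1.0, 2.0)
```

Turning a formula into a program means handling every case the formula glosses over, which is exactly the kind of understanding a math teacher is after.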
There are challenges with this proposed idea. For instance, not every class would be able to teach programming. And, for a variety of reasons, training teachers would be an uphill battle.
Until the vision of "programmers" building the blocks that users can custom-assemble into applications on demand comes true, it will be necessary to start teaching programming to students as early as grade school. There is no denying that programming practice and theory are a major key to being the best at many jobs. So, if we wish to keep making progress, we need to ensure that U.S. students learn programming concepts and understand how those concepts apply to real-world scenarios.
Related TechRepublic posts
- How to introduce high school students to programming
- Ripoff educations
- Poll: How did you learn to program?
J.Ja

Disclosure of Justin's industry affiliations: Justin James has a contract with Spiceworks to write product buying guides. He is also under contract to OpenAmplify, which is owned by Hapax, to write a series of blogs, tutorials, and other articles.