It has become common for Silicon Valley types to argue that everyone else would be better off if they, too, were Silicon Valley types. Given the prevalence of developers in Silicon Valley, a kissing cousin corollary to this argument is that everyone should be programmers; that a world increasingly composed of code needs everyone to learn how to write code in order to participate.
What complete and utter rot.
Not only is it questionable (at best) that everyone has the aptitude to code, but it's even more doubtful that everyone should.
Sister, I'm a poet
"Everyone on the planet should learn to code," goes one headline. "Why every millennial should learn some code," reads another. Both are wrong, but both fit the pattern of Silicon Valley navel-gazing, which generally supports any argument that tries to make the world look more like itself. (Given how wildly out-of-touch Silicon Valley is with the rest of the planet, as investor Marc Andreessen has called out, perhaps this comes as no surprise.)
The problem with this line of reductionist thinking is that it collapses the world into "Haves" (coders) and "Have-nots" (non-coders), without considering the varied value that differently talented people bring to the world. I've talked a bit about this with respect to Humanities majors, who help us ask insightful questions rather than hoping a brute-force dose of machine learning can wrangle answers for us. But it's time to push back, hard, on the notion that everyone should be a developer.
SEE: Stop trying to force students into STEM fields (TechRepublic)
As computer scientist Peter Denning remarked: "Computational thinking primarily benefits people who design computations and...the claims of benefit to non-designers are not substantiated."
Universalism is universally dumb
Denning is being a bit cagey, but rightly so: Having gone out on one too many universalist limbs, he's aiming for a more circumscribed truth. ("Universalism" here is the notion that everyone, universally, would benefit from learning to code.) In an analysis of proposed plans to teach everyone to code, he wrote:
In the late 1990s, we in computer science (including me) believed everyone should learn object-oriented programming. We persuaded the Educational Testing Service to change the Advanced Placement curriculum to an object-oriented curriculum. It was a disaster. I am now wary of believing that what looks good to me as a computer scientist is good for everyone. The proposed curriculum for computational thinking looks a lot like an extended object-oriented curriculum.
It wasn't simply that Denning erred on the belief that everyone should program, but also that everyone should program in a certain way. Having seen his agenda crash and burn, he takes a more measured approach: "It would do all of us good to tone down the rhetoric about the universal value of computational thinking."
SEE: AI pioneer: AI will definitely kill jobs, but that's OK (TechRepublic)
That rhetoric has involved significant generalization of what constitutes programming, algorithms, and more, all in an attempt to bring as many people as possible into the computer science fold. In so doing, Denning argues, such "unsubstantiated claims [have] overs[old] computer science, raising expectations that cannot be met, and leaving teachers in the awkward position of not knowing exactly what they are supposed to teach or how to assess whether they are successful."
Other kinds of creativity
As we seek to turn everything into a programmable algorithm, we end up losing the individuality that makes things like marketing actually work. Gartner analyst Martin Kihn has highlighted this problem:
[I]n our rush to hypertarget, marketers ignore the perils of personalization. Algorithms...build a commercial echo chamber and hone us down to our obvious features. Every time an optimization is made, some data is discarded. Usually, it's data that doesn't fit the model, which are exactly the features that make us unique.
He goes on to point out that, "in programmatic terms, we give digital signals from our comfort zone that label us as the Brooks Brothers man or luxury two-seater millennial. For a minute, these labels improve ad response. But over time, putting people into audiences flattens them and they lose their impact."
"The result," he laments, "is a narrowing of our digital marketing experience that makes it less interesting."
This isn't to suggest that algorithms are bad, but rather that they must be mixed with human ingenuity to reach the optimal result. In like manner, programming isn't bad, either, but it's just one way to express the world, and not better in any objective sense.
We need programmers and non-programmers, with both worlds including poets, priests, and politicians. We don't need more reductionist thinking that tries to force everyone into learning Java or Go.
Matt is currently head of the developer ecosystem at Adobe. The views expressed are his own, not those of his employer.
Matt Asay is a veteran technology columnist who has written for CNET, ReadWrite, and other tech media. Asay has also held a variety of executive roles with leading mobile and big data software companies.