Developers are calling it quits on polyglot programming

As much as developers have enjoyed new options in programming languages, databases, and more, they're increasingly asking for consolidation to simplify development.


Call off the polyglot holiday party. Developers are voting for consolidation, after all.

Despite years of hearing that developers now run the enterprise asylum (true), and that this means they'll use a dizzying assortment of technologies to get things done (not as true), the limits of human patience have been reached.

Instead of embracing a proliferation of programming languages, databases, operating systems, and other developer-friendly technologies, developers are settling into a relatively homogeneous mix of general-purpose technologies.

Toward the polyglot future

It wasn't supposed to be this way. In a world where developers reign, developers would choose the optimal tool for a particular job, even at the expense of increased complexity.

Some of the smartest minds in the business have been heralding the future of polyglot computing, from Redmonk's James Governor ("We're entering a polyglot era in software development, driven by cloud and multicore systems architectures, as new languages emerge to challenge, and coexist with, the long hegemony of Java and .NET.") to ThoughtWorks' Martin Fowler ("We are gearing up for a shift to polyglot persistence — where any decent sized enterprise will have a variety of different data storage technologies for different kinds of data.").

While they've been right to call out the increased complexity that developer-led enterprises have produced, there was no way this polyglot reality could persist. Not given its cost. I'm referring not so much to the cost to the enterprise, which is very real, but to the cost to developers in terms of time and attention.

Why U no love polyglot?

Make no mistake: the cost is enormous. For example, while using exactly the right database for each precise type of data sounds sensible, in practice this approach forces developers to learn a seemingly infinite number of databases. (Okay, DB-Engines only lists 190, but that's the functional equivalent of "infinite," since more are added all the time.)

Redmonk analyst Stephen O'Grady, reading the developer blog tea leaves, tentatively concurs:

"Developers have historically had an insatiable appetite for new technology, but it could be that we're approaching the too-much-of-a-good-thing stage. In which case, the logical outcome will be a gradual slowing of fragmentation followed by gradual consolidation."

One of the developers he cites, ex-Googler Tim Bray, makes the case very clearly:

"There is a real cost to this continuous widening of the base of knowledge a developer has to have to remain relevant. One of today's buzzwords is 'full-stack developer,' which sounds good, but there's a little guy in the back of my mind screaming, 'You mean I have to know Gradle internals and ListView failure modes and NSManagedObject quirks and Ember containers and the Actor model and what interface{} means in Go and Docker support variation in Cloud providers?' Color me suspicious."

Bray is, of course, not alone. In fact, the history of every market involves a whittling down of choices to a few that are somewhat general purpose and good enough. We used to have many relational databases, and now you can count the ones that matter on one hand. (I cover this in detail in a presentation I often give — see especially slides 26 to 28.)

Ditto operating systems, application servers, public cloud platforms, and so on.

Developers, being human, want choice. But not too much of it. Because too much choice is hard, exhausting, and ultimately counterproductive.

The short tail of technology adoption

This calls to mind Hugh Macleod's hilarious (and true) response to Chris Anderson's "Long Tail" theory (Figure A).

Figure A: Response to the "Long Tail" theory.

There are many reasons the "long tail" ultimately fails, but in technology adoption, developer fatigue tops the list. Second is the reality that while developers initiate more and more technology purchases today (by 2015, according to Forrester, they will remove IT completely from the purchasing picture 7.2% of the time and initiate 10.4% of technology purchases before handing the applications to IT to manage), it's more often the case that developers work in partnership with their CIOs and IT departments.

In such a partnership, the realities of IT governance and ongoing management come into play. No large enterprise wants to support 30 different programming languages. In fact, even small startups tend to consolidate their technology choices: the top five programming languages featured on Hacker News remain constant month to month, and Leo Polovets' analysis of AngelList data suggests startups congregate around a few standards up and down the technology stack.

Again, this is just how markets work. We get excited by new technologies and find novel ways and places to use them. That's normal.

But it's also normal to then consolidate options, even as they borrow features from each other to become general purpose and good enough. This results in the industry stabilizing around a few options that everyone grumbles about together, eventually to be displaced by the next big thing(s). But for now, we're in consolidation mode for developers, and it's about time.


Matt Asay is a veteran technology columnist who has written for CNET, ReadWrite, and other tech media. Asay has also held a variety of executive roles with leading mobile and big data software companies.
