
Why is programming language Julia growing so fast and where is it going next?

Professor Alan Edelman, co-creator of the Julia programming language, on why the language is so popular and what's coming next.

At the turn of the decade, Professor Alan Edelman and his colleagues set out to solve a longstanding problem holding back programming languages.

This "two-language" problem is a trade-off that developers typically make when choosing a language — it can either be relatively easy for humans to write, or relatively easy for computers to run, but not both.

"It was kind of thought of as a law of physics if you will, a kind of law of nature, that you can have one or the other, but that in some ways it would be impossible to have both," said Edelman, who runs the Julia Lab at MIT and co-founded Julia Computing, the company that acts as the steward for the language.

"It sounds almost reasonable and people believed it for a long time, but we didn't."


Professor Alan Edelman co-founded Julia Computing and heads up MIT's Julia Lab.

Image: MIT

The answer that Edelman and his colleagues came up with was Julia, a programming language with grand ambitions. A "tongue-in-cheek" launch post for Julia promised a language that combined the speed of C with the usability of Python, the dynamism of Ruby, the mathematical prowess of MATLAB, and the statistical chops of R.

Six years after that launch, those lofty aims seem to be paying off. Julia has more than 700 active open-source contributors, 1,900 registered packages, two million downloads, and a reported 101 percent annual rate of download growth.

Despite Julia's success and the challenge it poses to Python in the big-data analytics and machine-learning space, Edelman doesn't necessarily see Julia as a replacement for other languages, saying "technologies have a way of coexisting", and stressing how it's straightforward to call code written in other languages from Julia "in many cases".

Why use Julia?

The growing interest in Julia is perhaps unsurprising when considering the benefits to developers of tackling the two-language problem. Edelman talks of organizations prototyping code in a higher-level, "easy-to-use" language and then having to "hire a team of programmers to recode it in a fast language, once they're happy".

"That slows everything down, right? You bring in another team and that then turns a cycle that you might hope to complete in days or weeks into a cycle that takes weeks, months or years," he said.

"When the same person or team can simply be both prototyping and deploying the cycles are just much faster."

Julia is a general-purpose computing language, but Edelman says it is targeted at big-data analytics, high-performance computing and running simulations for scientific and engineering research.

The language's core features, from a technical standpoint, are its multiple dispatch paradigm, which allows it to express object-oriented and functional programming patterns; its support for "generic programming"; and its "aggressive type system", which caters to many different use cases, according to Edelman. Julia is dynamically typed, but supports optional type declarations. The language "feels like a scripting language", yet can be compiled to "efficient native code" for multiple platforms via LLVM.
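To make those features concrete, here is a minimal sketch, not taken from the article, of multiple dispatch and optional type annotations in Julia; the shape types and functions are hypothetical examples:

    # Multiple dispatch: the method that runs is chosen from the types
    # of all the arguments, not just the first one.
    abstract type Shape end
    struct Circle <: Shape; r::Float64; end
    struct Square <: Shape; s::Float64; end

    area(c::Circle) = pi * c.r^2    # optional type annotations...
    area(sq::Square) = sq.s^2

    # ...but the language stays dynamic: this generic function is
    # untyped and works for any container of any Shape subtype.
    total_area(shapes) = sum(area, shapes)

    total_area([Circle(1.0), Square(2.0)])   # returns pi + 4.0

Because total_area is generic, the compiler specializes it via LLVM for whatever concrete types it is called with, which is how the scripting-language feel and the native-code speed coexist.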

Edelman believes there is a "lot more runway" for Julia's userbase to continue to grow, and that the recent milestone 1.0 release of the language will tempt new users who've been holding back.

"There were people who were waiting for 1.0, and the announcement last month in London was certainly what quite a lot of people had been waiting for."

The core improvement the 1.0 release brings is stability, according to Edelman, who says the language is ready for real-world use in production code.

"1.0 is a statement that now Julia is going to be stable enough. You can build things on it, there won't be breaking changes. Julia is ready for big-time use, we're not going to tinker with the language the way we have."

What's next for Julia?

Another string to Julia's bow is its set of built-in features for spreading workloads across multiple CPU cores, both within a single processor and across multiple chips in a distributed system. Edelman says the plan is to improve Julia's native support for parallel processing on other types of processors, such as the Graphics Processing Units (GPUs) and Google's Tensor Processing Units (TPUs) that are used to accelerate machine learning.
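As a rough illustration of that built-in support, the sketch below (assumed, not from the article) uses the Threads module for shared-memory parallelism and the Distributed standard library for multi-process parallelism; the thread and worker counts are placeholders:

    using Base.Threads   # multi-threading across cores in one process
    using Distributed    # worker processes, possibly on other machines

    # Shared memory: loop iterations are split across threads. Start
    # Julia with, e.g., JULIA_NUM_THREADS=4 to use four cores.
    function sum_of_squares(xs)
        partial = zeros(eltype(xs), nthreads())
        @threads for i in eachindex(xs)
            partial[threadid()] += xs[i]^2   # each thread has its own slot
        end
        return sum(partial)
    end

    # Distributed memory: the same kind of loop, spread over workers.
    addprocs(2)                          # two local workers as a placeholder
    total = @distributed (+) for i in 1:1_000_000
        i^2
    end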

"It's been the case with these novel architectures that, by and large, if you wanted to program these things, you really had to learn like a whole new way of programming. For GPUs you had to learn Nvidia's CUDA language. If you wanted to do distributed parallel computing, you had to learn MPI.

"You either had to learn something else, or else you could just call a library, if there was a library that met your needs.

"What we're trying to do with Julia is, recognize, yes, these are special architectures, but no, you shouldn't have to learn a whole new programming language to use them. You should be able to, by and large, work in the same language."

At the moment, Edelman said, the group behind Julia is focused on making the language easier to use with TPUs, but there is still work to do on improving the "old-fashioned distributed parallel computing, the shared memory and multi-threading across CPU cores".

"I come from a high-performance computing, distributed parallel computing background a long time ago and I've always hated the way we program these machines," he said.

"So this is a long-term remedy that I see for that problem."

With commercial interest in machine learning continuing to build, Edelman says he and his colleagues also plan to continue building on Julia's strengths as a language for implementing machine-learning models.

The team behind Google's TensorFlow, a popular open-source machine-learning framework, recently cited Julia as a "great language" that was "investing in machine-learning techniques" and that "shares many common values with TensorFlow".

Edelman said: "The kind of language features that both Google's building and Julia has, the kindred research that we have in this area, is exactly what's going to lead to what we think will be the real breakthroughs in machine learning."

Julia's ability to solve machine-learning challenges that other languages struggle with is illustrated by a recent example, according to Edelman, who described the difficulty an organization had when using machine learning to diagnose tuberculosis from recordings of coughing.

Unfortunately, the machine-learning model's ability to predict whether an individual had TB was hampered when those coughing had different accents.

"What you want to do, of course, is learn whether somebody was sick or not and you didn't want it to learn the difference in accents," said Edelman.

Resolving the confusion caused by different accents was difficult using a high-level language like Python with standard machine-learning libraries, which are typically written in a better-performing, low-level language such as C++.

"What I was told is that all of the regular libraries just couldn't do it. It wasn't very difficult, but you had to tweak the neural networks in a way that the standard libraries just wouldn't let you do," he said.

"The current libraries are sort of like brick edifices, and if you to move them around you've got to be a pretty heavy-duty programmer to change them.

"But this fellow said with Julia, because it's high level, he was able to go in readily and solve this problem.

"So what we really want to do is enable more and more people to do that sort of thing, to be able to get beyond the walls of these existing libraries and to innovate with machine learning."

Beyond these grander aspirations for Julia, Edelman says work will likely continue on improving core tools, such as debuggers, which are currently missing some features: the latest version of the Gallium.jl debugger built into Julia's Juno IDE does not currently support breakpoints, among other components.

However, the ecosystem of tools supporting Julia is also growing organically with the language's popularity, with Julia plug-ins available for various IDEs, including Visual Studio, Atom and VS Code.

Every major programming language is a work in progress, and Edelman says the team behind Julia have many ambitions for the language that they've not yet realized.

"There's so many things, we still have so many dreams, we're not even close to declaring victory for ourselves."

