In 2018, Julia emerged as one of the fastest-growing programming languages, prized for its approach of combining the strengths of several major languages.
Helping realise that promise is Flux, a machine-learning library for Julia designed to make ML code easier to write, to simplify the training process, and to offer certain performance benefits over rival frameworks on hardware accelerators such as GPUs and Google's TPUs (Tensor Processing Units).
Today, Python and R dominate machine learning, with Python still the fastest-growing programming language in terms of developer popularity, driven in large part by the strength of its machine-learning frameworks and libraries. By comparison, only a relatively small proportion of developers use the fledgling Julia.
That said, the team behind Julia say their language is well placed for crafting differentiable algorithms: procedural, data-driven code whose outputs can be differentiated with respect to its inputs and parameters, the property that makes it possible to train the neural networks used in machine learning.
“We need a language to write differentiable algorithms, and Flux takes Julia to be this language,” the Julia team write in a blog post.
“Being designed from the ground up for mathematical and numerical computing, Julia is unusually well-suited for expressing ML algorithms. Meanwhile, its mix of modern design and new ideas in the compiler makes it easier to address the high-performance needs of cutting edge ML.”
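To make the idea of a differentiable algorithm concrete, here is a minimal, illustrative sketch (not taken from the blog post): an ordinary Julia function containing a loop, differentiated directly with the `gradient` function that Flux re-exports from its Zygote backend. The function `f` and the sample input are hypothetical.

```julia
using Flux  # Flux re-exports `gradient` from its Zygote AD backend

# An ordinary Julia function with control flow -- no special tensor types required.
function f(x)
    y = zero(x)
    for i in 1:3
        y += x^i          # y = x + x^2 + x^3
    end
    return y
end

# Analytically, df/dx = 1 + 2x + 3x^2, so at x = 2.0 the derivative is 17.0.
g, = gradient(f, 2.0)
println(g)  # prints 17.0
```

Because differentiation operates on plain Julia code rather than a restricted graph language, the same mechanism extends to loops, recursion and user-defined types, which is the sense in which the team describes Julia as a language for differentiable algorithms.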
The Flux library extends Julia's compiler with several ML-focused tools, according to the blog: first-class gradients, which strike a better balance between performance and developer control; just-in-time CUDA kernel compilation for GPUs; automatic batching to reduce overheads during training; and optimisations for running on Google's TPUs.
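As a hedged sketch of what "first-class gradients" look like in practice, assuming a recent Flux release (the `Dense(in => out)` constructor, the model, and the loss below are illustrative, not from the article):

```julia
using Flux

# A small two-layer model; Chain composes layers, Dense is a fully connected layer.
model = Chain(Dense(3 => 4, relu), Dense(4 => 1))

x = rand(Float32, 3)   # a dummy input
y = Float32[1.0]       # a dummy target

# A squared-error loss written as ordinary Julia code.
loss(m) = sum(abs2, m(x) .- y)

# Differentiate the loss with respect to every model parameter in one call.
grads = gradient(loss, model)

# With CUDA.jl loaded, `model |> gpu` would move the same model to a GPU,
# where Flux compiles CUDA kernels just in time.
```

The design choice worth noting is that the model, the loss and the gradient call are all plain Julia values and functions; no separate graph-building or session API sits between the developer and the compiler.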
The team says that Julia with Flux, alongside the upcoming differentiable programming language Myia and the recent Swift for TensorFlow effort, could soon challenge established machine-learning frameworks and approaches.
“We believe that the future of machine learning rests in language and compiler technology, and in particular, in extending new or existing languages to meet the high demands of ML research,” they write, adding that languages which “support differentiation, vectorisation and exotic hardware” will “drive many advancements in science”.
“There is some way to go before these next-generation tools — Myia, Swift/TF and Flux — are as production-ready as their existing framework counterparts, TensorFlow, PyTorch, and Knet,” says the Julia team.
“But if you’re breaking new ground in ML, they might well be your best bet. Give them a go, and see what the future of machine learning looks like.”
When Julia hit version 1.0 earlier this year, users of the language were generally positive about its progress, although some still had concerns about the state of its error handling and unhelpful documentation.
Machine-learning engineer was the fastest-growing job category in the five years to 2017, according to LinkedIn, and a growing number of free courses are available to developers who want to specialise in the field.