While machine learning offers advantages for nearly every industry, very few companies have actually adopted this artificial intelligence (AI) technology, and those that try face several common barriers to entry, according to a new Deloitte report.

Fewer than 10% of executives said that their companies were investing in machine learning, according to a recent SAP survey, and many cited barriers to adoption including a shortage of qualified staff, still-evolving tools and frameworks, and a lack of the large datasets required to train algorithms. Many also face the “black box” problem: they understand that machine learning models generate valuable information, but are reluctant to deploy them in production because the models’ inner workings are not immediately clear.

SEE: Sensor’d enterprise: IoT, ML, and big data (ZDNet special report) | Download the report as a PDF (TechRepublic)

To lower the barriers to entry, Deloitte researchers identified five “vectors of progress” that make it easier, faster, and less expensive to deploy machine learning in the enterprise:

1. Automate data science

Developing machine learning solutions requires data science skills, which are in high demand and short supply. However, as much as 80% of the work of data scientists can be fully or partially automated, according to Deloitte, including data wrangling, exploratory data analysis, feature engineering and selection, and algorithm selection and evaluation.

“Automating these tasks can make data scientists not only more productive but more effective,” the report stated. A growing number of tools from both established companies and startups can help reduce the time required to execute a machine learning proof of concept from months to days, Deloitte noted. By augmenting data scientists’ productivity this way, enterprises can continue to expand their machine learning adoption even amid a talent shortage.
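As a rough illustration of the kind of automation the report describes, the sketch below uses scikit-learn’s GridSearchCV to automate hyperparameter search and model evaluation, one of the tasks Deloitte says can be partly automated. The dataset and model are illustrative choices, not ones named in the report.

```python
# Minimal sketch: automating part of the model-selection workflow.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# GridSearchCV tries every parameter combination with cross-validation,
# automating work a data scientist would otherwise do by hand.
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [10, 50], "max_depth": [2, None]},
    cv=3,
)
search.fit(X_train, y_train)
best_score = search.score(X_test, y_test)  # accuracy of the best model found
```

Commercial AutoML tools extend this idea to feature engineering and algorithm selection as well, but the loop above captures the core pattern: the machine, not the practitioner, iterates over candidates.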

2. Reduce the need for training data

Training a machine learning model can require millions of data elements, and acquiring and labeling this data can be time-consuming and costly for enterprises.

However, a number of techniques are emerging to reduce the amount of training data required for machine learning. Some use synthetic data, generated algorithmically to mimic the characteristics of the real data, and have seen strong results: A Deloitte LLP team tested a tool that allowed it to build an accurate model with only a fifth of the training data previously required, synthesizing the remaining 80%.
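The tool Deloitte tested is not described in detail, but a toy version of the synthetic-data idea can be sketched as follows: keep only a fifth of the real labeled data, fit a simple Gaussian per class to that small set, and sample synthetic points from those distributions to pad out the training set. Real synthetic-data tools use far more sophisticated generators; the statistical approach here is an assumption for illustration only.

```python
# Toy sketch: train on 20% real data plus Gaussian-sampled synthetic data.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0, stratify=y
)

# Keep only one fifth of the real labeled training data.
X_small, _, y_small, _ = train_test_split(
    X_train, y_train, train_size=0.2, random_state=0, stratify=y_train
)

# Synthesize extra samples per class from a Gaussian fit to the small set.
synth_X, synth_y = [], []
for cls in np.unique(y_small):
    pts = X_small[y_small == cls]
    mean = pts.mean(axis=0)
    cov = np.cov(pts, rowvar=False) + 1e-6 * np.eye(pts.shape[1])  # regularize
    synth_X.append(rng.multivariate_normal(mean, cov, size=40))
    synth_y.append(np.full(40, cls))

X_aug = np.vstack([X_small] + synth_X)
y_aug = np.concatenate([y_small] + synth_y)

model = LogisticRegression(max_iter=1000).fit(X_aug, y_aug)
acc = model.score(X_test, y_test)  # accuracy on held-out real data
```

The point of the sketch is the workflow, not the numbers: the classifier never sees the other 80% of real data, yet the synthetic samples stand in for it.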

Enterprises can also use transfer learning, an approach in which a machine learning model is pre-trained on one dataset as a shortcut to learning on a new dataset in a similar domain, such as language translation or image recognition, Deloitte noted.
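Transfer learning in production usually involves large pre-trained neural networks, but the underlying pattern can be shown with a deliberately small stand-in: pre-train a feature extractor on one task, then reuse it on a related task where labeled data is scarce. The use of PCA as the “pre-trained” extractor is an assumption made purely to keep the sketch self-contained.

```python
# Toy sketch of transfer learning: reuse features learned on a "source"
# task (digits 0-4) for a "target" task (digits 5-9) with little data.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

X, y = load_digits(return_X_y=True)
src = y < 5   # source domain: plenty of data
tgt = ~src    # target domain: pretend labels are scarce

# "Pre-train" a feature extractor on the source domain only.
extractor = PCA(n_components=20).fit(X[src])

# Fine-tune a classifier on just 100 labeled target examples,
# reusing the source-domain features as-is.
X_t, y_t = X[tgt], y[tgt]
idx = np.random.default_rng(0).permutation(len(y_t))
train_idx, test_idx = idx[:100], idx[100:]

clf = LogisticRegression(max_iter=1000).fit(
    extractor.transform(X_t[train_idx]), y_t[train_idx]
)
acc = clf.score(extractor.transform(X_t[test_idx]), y_t[test_idx])
```

In practice the extractor would be a network pre-trained on a large public dataset (for image recognition or language translation, as the report notes), but the division of labor is the same: the expensive learning happens once, on data the enterprise does not have to collect.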

SEE: Research: Companies lack skills to implement and support AI and machine learning (Tech Pro Research)

3. Accelerate training

Training a machine learning model can take weeks, due to the large volumes of data and complex algorithms involved. But semiconductor and computer manufacturers are developing specialized processors such as GPUs, field-programmable gate arrays (FPGAs), and application-specific integrated circuits (ASICs) to cut down the time required to train these models by speeding both the calculations and the transfer of data within the chip. Doing so also brings down costs.

For example, using GPUs, a Microsoft research team was able to build a system that recognized conversational speech as accurately as humans in only a year. Had the team used CPUs instead, the same work would have taken five years.

Adoption of these specialized AI chips is spreading in industries including retail, finance, and telecommunications. And every major cloud provider, including IBM, Microsoft, Google, and Amazon, offers GPU cloud computing, so accelerated training will soon be available to data science teams in any organization.

4. Explain results

Since many machine learning models are black boxes, it’s difficult to explain with confidence how they reach their decisions, making them hard to trust in fields such as medicine, business, and finance. However, new techniques may help enterprises better understand how these models work, making them more interpretable and accurate. For example, MIT researchers have developed a method of training a neural network that delivers accurate predictions along with the rationales for those predictions, Deloitte noted.
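The MIT method is specific to neural networks, but a widely used, model-agnostic explanation technique can stand in as a sketch: permutation importance, which measures how much a model’s accuracy drops when each input feature is shuffled. The dataset and model below are illustrative choices, not ones from the report.

```python
# Sketch: explaining a black-box model with permutation importance.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=0
)

# A black-box model: accurate, but its decisions are hard to inspect directly.
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature in turn and record how much accuracy suffers;
# features that hurt most when scrambled matter most to the model.
result = permutation_importance(
    model, X_test, y_test, n_repeats=5, random_state=0
)
top = sorted(
    zip(data.feature_names, result.importances_mean),
    key=lambda t: -t[1],
)[:3]  # the three most influential features, by name
```

An explanation like this does not open the black box, but it gives a regulator or clinician a ranked, human-readable account of what drives the model’s predictions.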

These kinds of tools will allow companies in highly regulated industries to find more opportunities to use machine learning, the report said, including in areas such as credit scoring, recommendation engines, fraud detection, and disease diagnosis and treatment.

5. Deploy locally

Adoption of machine learning will increase along with the ability to deploy it where it can most improve efficiency and outcomes, according to the report. Advances in software and hardware are making it easier to use the technology on mobile and Internet of Things (IoT) devices and sensors. Apple, Facebook, Google, and Microsoft are all creating more compact machine learning models that can handle tasks such as image recognition and language translation on mobile devices.

Using machine learning on mobile devices also expands the potential applications of the technology, and may help companies develop applications in areas such as smart homes and cities, autonomous vehicles, wearables, and industrial IoT.

“Machine learning has already shown itself to be a very valuable technology in many applications,” the report stated. “Progress along the five vectors can help overcome some of the obstacles to mainstream adoption.”