Job obsolescence is always a concern for workers, especially in technology fields where automation and artificial intelligence (AI) might endanger human-based positions.
Human technical know-how is too valuable an asset to dismiss; at the very least, when one technology role disappears, a new one tends to emerge in its place.
I discussed where AI is heading and how the IT workforce can prepare with Terri Schlosser, head of product, technical, and solutions marketing at SUSE, the open source Linux provider.
SEE: Managing AI and ML in the enterprise 2019: Tech leaders expect more difficulty than previous IT projects (TechRepublic Premium)
Current status of AI
Scott Matteson: What is the current status of AI?
Terri Schlosser: While AI has been talked about for many years, it is a market that is just now starting to take off — roughly doubling every two years, with analysts projecting growth from $9.5 billion in 2018 to $118.6 billion in 2025. In fact, AI is helping companies do things like sharpen customer service, organize calendars, verbally respond to questions, automate recruitment processes, and sense when machines need to be repaired.
However, there are many areas where the full advantages of AI have not been leveraged, such as in the workplace. In a recent SUSE survey of IT professionals, only 35.7% of the respondents said their company uses or plans to use AI solutions for business needs. That means nearly two thirds said that either their companies have no exposure, or they’re not sure.
On the surface, these numbers seem at odds with projections of massive growth in the global AI software market, but they may be underestimating the actual use of AI in the workplace. Many people tend to equate AI with robotics, but it really applies to many other applications that funnel analytics back into operations, triggering improvements in overall processes.
Scott Matteson: Where is AI building jobs?
Terri Schlosser: As has been widely speculated, automation triggered by AI likely will eliminate certain rote tasks. For example, you'll need fewer storage administrators to manage storage environments. In big-box building supply stores, the installation of automated watering systems — based on sensors that measure moisture in plants — will phase out the need for dedicated plant waterers.
Still, other people have to install the sensors and write the software that implements the AI solution's algorithms. So, there will be more demand for programmers, data scientists, and high-performance computing (HPC) administrators.
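The watering example above can be sketched in a few lines. This is a minimal illustration only, assuming a hypothetical moisture scale from 0 to 1 and an arbitrarily chosen threshold; a real system would read from actual sensor hardware.

```python
# Hypothetical sensor-driven watering decision (illustrative sketch).
MOISTURE_THRESHOLD = 0.30  # assumed: water any plant below 30% soil moisture

def needs_watering(moisture_reading: float, threshold: float = MOISTURE_THRESHOLD) -> bool:
    """Return True when a plant's measured soil moisture falls below the threshold."""
    return moisture_reading < threshold

# Example readings from three plants (stand-ins for live sensor data).
readings = [0.12, 0.45, 0.28]
to_water = [i for i, r in enumerate(readings) if needs_watering(r)]
print(to_water)  # indices of plants that need water: [0, 2]
```

Even in this toy form, the division of labor the interview describes is visible: someone still has to install the sensors, pick the threshold, and maintain the decision logic.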
SEE: The impact of machine learning on IT and your career (free PDF) (TechRepublic)
Scott Matteson: How should organizations prepare for this?
Terri Schlosser: Companies might be prepared for what they're doing with AI today, but not necessarily for the demands of where AI is going. They might be using an AI app in their current environment, but if they switch to a parallel computing environment, which can run the same AI app 50 to 100 times in parallel and reach the end result faster, that requires a much more high-powered computing infrastructure.
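The parallel-versus-serial point can be made concrete with a short sketch. Here `score()` is a stand-in for one run of an AI workload (an assumption for illustration, not a real inference call); the same batch of runs is executed serially and then fanned out across a pool of workers.

```python
# Sketch: running many independent workload instances in parallel.
from concurrent.futures import ThreadPoolExecutor

def score(x: int) -> int:
    # Stand-in for one run of an AI app on one input.
    return x * x

inputs = list(range(100))

# Serial: one run after another.
serial = [score(x) for x in inputs]

# Parallel: the same 100 runs distributed across 8 workers.
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel = list(pool.map(score, inputs))

assert serial == parallel  # identical results; parallel finishes sooner on suitable hardware
```

The results are the same either way; what changes is wall-clock time, and realizing that speedup at scale is exactly what demands the more high-powered HPC infrastructure Schlosser describes.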
Organizations need to take a closer look at what they want their future AI environment to be and then build a strategy to get there over time. Look at what HPC requirements they will need to support this desired future state, understand impacts, and build plans to get there.
Scott Matteson: How should employees prepare for this?
Terri Schlosser: According to our recent survey, 41.4% of respondents said they have an opportunity for development, while 20.4% said other employees are developing skills. Taken together, that means 61.8% are being actively trained. Meanwhile, only 32% of non-users are being actively trained, leaving 68% of that population out of the loop.
As expected, the user companies have a jump on the non-users when it comes to training. But the fact that 38.2% of the user group is not being actively trained on AI-related technologies shows that even companies that are tapping into the value of AI aren’t throwing the full weight of training programs behind the technology.
Understanding their company's strategy for new AI workloads and programs will help employees decide where to spend their time learning. For example, if you work in IT operations and the strategy is to run new AI workloads on an HPC environment, then you can prepare by learning about HPC: how it differs from your current environment, and which key technologies you may need to adopt.
SEE: Digital transformation in manufacturing: A guide for business pros (TechRepublic Premium)
Preparing for the future
Scott Matteson: Where do you expect these endeavors to lead?
Terri Schlosser: In the near future, we see AI generating more insights from data and making companies more productive than ever before. But AI needs a solid infrastructure underneath to power more intensive applications. In the coming years, we expect AI, as a concept, to be more understood. With the right investments, companies can leverage the technology to its full potential.
Scott Matteson: Any tips for people in roles that might be impacted by AI?
Terri Schlosser: Where your data resides and how it is collected have a bearing on how your HPC infrastructure needs to be set up and managed. That includes whether you run AI in a public or private cloud, entirely on-premises, or in a mix of these. It includes where data storage might need to be enhanced with an affordable Ceph-based solution. And it includes how your infrastructure needs to be updated, from the core OS to the scalable hardware environment required.
If you're an IT manager, AI will affect how you establish the right infrastructure for the new wave of workloads. If you're a data scientist, your role just became far more integral to ensuring the right outcomes from newly collected data. IT administrators may need new skills for a high-performance parallel computing environment, and so may developers looking to enhance and build new AI/ML applications; even deep learning skills are starting to be needed as data volumes grow exponentially.