Big data success: Why desktop integration is key

The best data is the data that people actually use. Make sure the end user knows how to use the data tools for the best business results.

There are few vectors in big data that are more popular than GPS. Geographic positioning tells you where data came from. It enables you to visualize weather patterns, disease spread patterns, housing patterns, etc., and to cobble them together into a composite of data that can inform corporate strategy.

But the key to deriving the most value from data is being able to get it easily into a system where all of it can be amalgamated into a holistic representation of what's really going on. This is where desktop integration, performed with a few clicks at a user workstation, comes in, and why it should be a key consideration in every big data use case. Let's take plotting the shipping lanes into a harbor as an example.

SEE: Chatbot trends: How organizations are leveraging AI chatbots (free PDF) (TechRepublic)

First, bathymetry instrumentation records water depths at different points in the harbor to determine whether those depths are adequate for the different types of ships and barges that use it. The instrumentation also records year-to-year changes in depth.
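The adequacy check described above can be sketched in a few lines. This is a minimal illustration, not production code: the field names, sample depths, and the one-meter safety margin are all assumptions for the example.

```python
# Minimal sketch: flag harbor soundings that are deep enough for a given
# vessel draft. Data, field names, and the safety margin are illustrative.

def adequate_depths(soundings, draft_m, safety_margin_m=1.0):
    """Return the soundings deep enough for a vessel of the given draft."""
    required = draft_m + safety_margin_m
    return [s for s in soundings if s["depth_m"] >= required]

# Year-over-year soundings at surveyed points (hypothetical values).
soundings = [
    {"point": "A1", "depth_m": 12.4},
    {"point": "A2", "depth_m": 9.8},
    {"point": "B1", "depth_m": 7.1},
]

# A barge drawing 8.5 m needs at least 9.5 m of water with a 1 m margin.
ok = adequate_depths(soundings, draft_m=8.5)
print([s["point"] for s in ok])  # ['A1', 'A2']
```

A real bathymetry workflow would run this kind of filter across millions of points, but the decision at each point is exactly this comparison.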

These measurements are then uploaded into a point cloud that plots the depth findings onto a map of the harbor in a desktop GIS system. From here, the user can overlay additional information on this map, such as how frequently carriers travel the lanes, areas of port congestion or construction, weather information, and so on.
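Conceptually, the overlay step joins several layers on a shared spatial key. The sketch below uses made-up grid-cell IDs and plain dictionaries to stand in for map layers; a real GIS would perform the join spatially rather than on cell labels.

```python
# Minimal sketch of a desktop-style overlay: join a depth layer with a
# traffic-frequency layer keyed by the same grid cell. Cell IDs and
# values are illustrative.

depth_layer = {"C3": 11.2, "C4": 8.9, "D3": 13.5}    # depth in meters
traffic_layer = {"C3": 42, "C4": 7, "D3": 18}        # transits per week

def overlay(*layers):
    """Combine layers into one record per grid cell (inner join on cell ID)."""
    common = set.intersection(*(set(layer) for layer in layers))
    return {cell: tuple(layer[cell] for layer in layers) for cell in sorted(common)}

combined = overlay(depth_layer, traffic_layer)
print(combined)  # {'C3': (11.2, 42), 'C4': (8.9, 7), 'D3': (13.5, 18)}
```

Adding a weather or construction layer is just another argument to `overlay`, which is the agility the article describes: the user, not IT, decides which layers to combine.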

SEE: 7 data science certifications to boost your resume and salary (free PDF) (TechRepublic)

At the end of the day, the end user has a complete picture of the harbor that can be used in planning harbor operations.

It's also worth noting that IT's role in a scenario like this is collecting and cleaning data, performing backend integration as needed, and then letting the user cobble together the data so it's best positioned to solve a business problem. 

Focusing on end user ability to integrate and mix and match data does two things:

  • It gives the business more agility in constructing the composite of data that it needs to solve business problems.
  • The ability to integrate at the desktop lessens the data integration workload for IT.

There are many benefits to effective desktop integration of data, but the irony is that IT doesn't always pay attention to them. 

SEE: 4 steps to user buy-in for big data (TechRepublic)

Instead, there is a tendency for IT to "drop off" user business intelligence and reporting tools at the doors of user departments, and then let the users figure the tools out. Then, what the users end up doing (or not doing) with the tools becomes a "black box" as far as IT is concerned because IT moves on to other projects and doesn't track the evolution of business use cases that users develop. 

An alternative for IT would be to assess the long-term success of a project in light of its usability and its ability to answer business problems on a daily basis. By doing this, IT can take the lessons learned and apply them to future projects. 

One case in point is Microsoft Excel. For years, many prognosticators predicted that the 30-year-old spreadsheet tool would be eclipsed by newer technologies, but Excel continues to solve business problems, import the data that is needed, and remain easy to use.

GIS systems are similar. They have a long history of use, and users know how to use them. As more big data from Internet of Things (IoT) devices and other sources gets added to GIS systems, users can manipulate that data to obtain the results they need to see.

The takeaway is that when users have full manipulative capability with data at their desks, the business gains agility and can solve more problems. 

This is why the ultimate test of big data integration is what can happen at a user's desk.
