Computer-generated batch reports have occupied a place in IT since the green-bar days of the 1950s, and batch reporting continues to play a vital role in business. Batch reporting belongs in any discussion of big data analytics because no corporate information management and reporting strategy is complete without striking an appropriate balance between the two.

What is the right balance?

Daily checks on inventory, defects on the manufacturing floor, receivables, payables, and transportation scheduling are clearly in the province of the batch world, where reports have been refined over decades to tailor this information to the needs of those who depend on it. The newer analytics, which look beyond the domain of fixed-record data to probe repositories outside the enterprise as well as data that is unstructured or even machine-generated, are the province of big data. So, too, is the gathering of data in real time, which can now facilitate real-time decisions.

Industry’s focus has recently been on big data harvesting and analytics, because this is the area where technologies and best practices are still evolving; the day-to-day reality, however, is that the lion’s share of reporting in enterprises remains in standard batch processes.

“We give our customers visibility of the end to end logistics process, and also what inventory levels are in the warehouse if we are performing distribution for them, and we deliver this information to them in standard batch or real-time reports,” said Judy Craig, vice president at Kenco, a third-party logistics provider.

The reasons are simple. Traditional batch reporting is established and trusted. Where parts of these reports aren’t trusted, users already know the weak spots and how to work around them. Companies also have on-staff expertise in batch reporting and batch report development that they may not yet have in the big data world.

What is changing is business’s need to know the immediate repercussions of a business change or event — this can’t wait for a nightly or even an on-demand batch report. This is also an area where big data can deliver big results that can define a company’s competitive advantage.

Big data also opens up untapped frontiers of data, such as data that is generated from sensors and then analyzed. This data is needed to track and monitor transportation grids, utility grids, remote medical robotics and telemedicine procedures, goods shipments, and remote home and business security.

Understanding the different roles that batch reports and big data play within an overall organization is instrumental for constructing the right types of applications and databases to support this activity. The reporting architecture will also govern strategic IT decisions on the types of processing, storage, and software to acquire for the data center, and which processes will be outsourced to the cloud.

In some cases, an all-inclusive look at batch and big data reporting has generated innovative results.

For example, financial services firm Crédit Mutuel Arkéa freed its mission-critical mainframe resources, formerly tasked with massive batch reporting work, by moving that reporting to Hadoop, a big data batch platform whose parallel processing sped up jobs and shrank the infamous nightly “batch window,” which for some organizations runs so long that it barely completes before the company opens its doors in the morning. Other businesses, including Costco, Sears, and Walmart, are adopting similar approaches.
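The speedup described above comes from the split-map-reduce pattern that Hadoop applies to batch jobs: the input is divided into chunks, each chunk is summarized in parallel, and the partial results are merged into the final report. The following is a minimal, illustrative sketch of that pattern in Python; the record layout, function names, and use of threads in place of a distributed cluster are all assumptions for the sake of the example, not a description of any company's actual system.

```python
from concurrent.futures import ThreadPoolExecutor
from collections import Counter

def map_chunk(chunk):
    """Map step: total quantities per warehouse within one input split."""
    totals = Counter()
    for warehouse, qty in chunk:
        totals[warehouse] += qty
    return totals

def reduce_totals(partials):
    """Reduce step: merge the per-split totals into one final report."""
    final = Counter()
    for partial in partials:
        final.update(partial)  # Counter.update adds counts together
    return dict(final)

def run_batch_report(records, splits=2):
    """Split the records, map the splits in parallel, then reduce.

    A real Hadoop job distributes the map tasks across machines;
    this toy version runs them on threads to show the same shape.
    """
    size = max(1, len(records) // splits)
    chunks = [records[i:i + size] for i in range(0, len(records), size)]
    with ThreadPoolExecutor(max_workers=splits) as pool:
        partials = list(pool.map(map_chunk, chunks))
    return reduce_totals(partials)

# Toy nightly inventory records: (warehouse, quantity) movements.
records = [("east", 10), ("west", 5), ("east", 7),
           ("north", 3), ("west", 2), ("north", 8)]
print(run_batch_report(records))  # {'east': 17, 'west': 7, 'north': 11}
```

Because each map task touches only its own chunk, adding more workers (or, in Hadoop, more nodes) shortens the batch window without changing the report's output.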

In all of these cases, batch and big data reporting are being looked at together and not as separate functions; in some cases, this has allowed technology platforms to be “crossed” in order to maximize results. In taking this approach, data center managers are orchestrating an integrated IT approach to reporting that best serves the business.