
Collecting data in the field and then aggregating and querying it in a batch-mode data mart for business intelligence is a leading way to apply big data and analytics. It can also help level the playing field for people in emerging countries, saving them time and delivering rapid insights from data queries.
One example is illustrated by Springg, a Dutch agricultural software company that works with farmers around the world. Frustrated with the lack of infrastructure in developing countries, and cognizant of these farmers’ need for the same agricultural intelligence as their richer counterparts, Springg wanted to find a way to capture valuable data in the field, assess it, and return rapid insights to farmers in more remote areas.
“For farmers, it is important to take soil samples so you can better understand the characteristics of your land, and what types of fertilizers you can apply to gain optimal results on crops,” said Ashley Stirrup, CMO of Talend, which furnishes the big data integration software that Springg uses.
SEE: Podcast: Business Technology Weekly – Fast moooving cattle tech
Historically, soil samples have been taken in the field and then sent for analysis to laboratories that could be hundreds or thousands of miles away.
“What Springg wanted to do was to establish mobile test centers in Kenya that could use Internet of Things (IoT) technology,” said Stirrup. With these mobile test centers, Springg was able to collect local soil data with sensors and run soil analyses onsite, so local farmers could get immediate results on the condition of their soil and the best fertilizers for their crops. The data collected and analyzed at the field sites was later sent to a centralized database, where it could be analyzed further in a more inclusive and holistic context.
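As a rough illustration of what an immediate onsite check could look like, the sketch below evaluates a soil sample against simple thresholds and returns guidance a farmer could act on right away. The field names, thresholds, and recommendations are hypothetical assumptions for illustration, not Springg’s actual analysis logic.

```python
# A minimal sketch of an onsite soil check a mobile test center might run.
# The thresholds, field names, and recommendation rules are illustrative
# assumptions, not Springg's actual analysis.
from dataclasses import dataclass


@dataclass
class SoilSample:
    site_id: str
    ph: float             # soil acidity
    nitrogen_ppm: float   # nitrogen concentration, parts per million
    moisture_pct: float   # volumetric moisture, percent


def recommend(sample: SoilSample) -> list:
    """Return immediate, human-readable guidance for the local farmer."""
    advice = []
    if sample.ph < 5.5:
        advice.append("Soil is acidic: consider applying lime before planting.")
    if sample.nitrogen_ppm < 20:
        advice.append("Nitrogen is low: a nitrogen-rich fertilizer is advisable.")
    if sample.moisture_pct < 15:
        advice.append("Soil is dry: irrigate before applying fertilizer.")
    return advice or ["Soil readings are within the assumed normal ranges."]


if __name__ == "__main__":
    reading = SoilSample(site_id="kenya-field-07", ph=5.1,
                         nitrogen_ppm=14.0, moisture_pct=22.0)
    for line in recommend(reading):
        print(line)
```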
“For local farmers, this process was extremely effective, and it sped up the delivery of lab soil analysis results by a factor of five,” noted Stirrup. “In a less developed area, the efficacy and cost of results can be vital. It can make the difference between a family being able to support itself, or a child being able to go to school.”
Connecting all of the dots between local data collection and analysis, and then forwarding the data to a larger repository at a remote location, requires an assortment of technologies, from wireless communications and mobile phones to flexible messaging protocols that can handle different countries’ telecommunications environments. “In our own data tools that are applied in this use case, we wanted a solution that could handle any type of mobile device and that could support simple communications protocols as needed,” said Stirrup.
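Stirrup does not name a specific protocol, but the transport problem he describes, intermittent mobile connectivity and simple messaging, can be sketched as a store-and-forward uploader. The endpoint URL and payload below are hypothetical, and plain HTTP is assumed purely for illustration; a lightweight broker-based protocol would follow the same pattern.

```python
# Sketch of a tolerant "store and forward" uploader for field readings.
# The endpoint URL and payload shape are hypothetical assumptions; plain HTTP
# is used only to keep the example simple.
import json
import urllib.request
from collections import deque

ENDPOINT = "https://example.org/soil-readings"  # hypothetical central intake
pending = deque()  # readings waiting for a working connection


def queue_reading(reading: dict) -> None:
    """Buffer a reading locally until connectivity is available."""
    pending.append(reading)


def flush() -> int:
    """Try to upload queued readings; stop at the first network failure."""
    sent = 0
    while pending:
        body = json.dumps(pending[0]).encode("utf-8")
        req = urllib.request.Request(
            ENDPOINT, data=body,
            headers={"Content-Type": "application/json"}, method="POST")
        try:
            with urllib.request.urlopen(req, timeout=10):
                pending.popleft()
                sent += 1
        except OSError:
            break  # connectivity dropped; keep the rest queued for later
    return sent


queue_reading({"site_id": "kenya-field-07", "ph": 5.1, "nitrogen_ppm": 14.0})
flush()  # returns the number of readings that reached the central store
```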
SEE: How big data is going to help feed nine billion people by 2050
For guerrilla field applications like onsite data collection with IoT sensors, flexibility in data preparation and transport also had to be built in so that data could be captured, analyzed, and ultimately leveraged.
“With this particular approach, you might be bringing in data that is collected from sensors in all corners of the world,” said Stirrup. “You then analyze this data in real time or near real time at the site to get immediate local results from the data.”
SEE: Executive’s guide to IoT and big data (free ebook)
From this point, data from the multiple collection sites around the world is sent to a central data repository, where it is repurposed for a multitude of uses.
“One way that this agricultural data is further leveraged is in the financial markets,” said Stirrup. “When a system is able to analyze and produce intelligence that is gathered from agricultural collection points around the world, companies can get a better sense of how current crop yields are comparing to historical trends, how weather conditions are factoring in, and what the likely impact might be on commodity pricing.”
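As a sketch of the kind of aggregation that becomes possible once field data lands in a central repository, the following compares current yields against historical averages by region. The figures, regions, and column names are invented for illustration and do not come from Springg’s data.

```python
# Sketch of central-repository aggregation: comparing current crop yields
# against historical averages per region. All numbers and column names are
# invented for illustration.
import pandas as pd

history = pd.DataFrame({
    "region": ["Kenya", "Kenya", "Brazil", "Brazil"],
    "year": [2014, 2015, 2014, 2015],
    "yield_t_per_ha": [1.4, 1.6, 2.9, 3.1],
})
current = pd.DataFrame({
    "region": ["Kenya", "Brazil"],
    "yield_t_per_ha": [1.3, 3.4],
})

# Average historical yield per region, then compare this season against it.
baseline = (history.groupby("region")["yield_t_per_ha"]
            .mean().rename("historical_avg").reset_index())
report = current.merge(baseline, on="region")
report["vs_history_pct"] = 100 * (report["yield_t_per_ha"] / report["historical_avg"] - 1)
print(report)
```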
The use of this kind of “in-the-field” guerrilla data collection and analytics, which returns immediate results to local farmers and then sends the data on to large repositories where it is analyzed for trend modeling and decision making in major financial markets, is still in its early stages. But the results are promising for a data model that can be viewed from both the small and the large ends of the lens.