Greg Gardner, chief architect for government and defense solutions at NetApp, has outlined his ideas for technology strategies to conquer the challenges of extracting insights from large volumes of data.
Gardner wrote in an op-ed for Defense Systems posted Wednesday that big data yields “the greatest amount of useful information” from structured, unstructured, real-time and legacy assets when paired with the right analytical tools.
“Successfully preparing for and managing the sensor data tsunami can come down to conquering four key challenges,” he wrote.
Gardner noted the military’s focus on the processing, exploitation and dissemination approach for big data analysis.
“[U.S.] Air Force doctrine, for its part, now expands that term to PC-PAD – planning and direction, collection, processing and exploitation, analysis and production and dissemination,” he added.
Gardner also recommended that users separate intelligence, surveillance and reconnaissance data from unnecessary information in order to provide situational awareness to decision makers.
He added that enterprises should adopt the right data visualization tools and train big data analysts to address growing volumes of data.