
NetApp’s Greg Gardner Outlines Strategies to Address Big Data Challenge


Greg Gardner, chief architect for government and defense solutions at NetApp, has outlined his ideas for technology strategies to conquer the challenges of extracting insights from large volumes of data.

Gardner wrote in an op-ed for Defense Systems posted Wednesday that big data yields “the greatest amount of useful information” from structured, unstructured, real-time and legacy assets when paired with the right analytical tools.

“Successfully preparing for and managing the sensor data tsunami can come down to conquering four key challenges,” he wrote.

Gardner noted the military’s focus on the processing, exploitation and dissemination approach for big data analysis.

“[U.S.] Air Force doctrine, for its part, now expands that term to PC-PAD – planning and direction, collection, processing and exploitation, analysis and production and dissemination,” he added.

Gardner also recommended that users separate intelligence, surveillance and reconnaissance data from unnecessary information in order to provide situational awareness to decision makers.

He added that enterprises should adopt the right data visualization tools and train big data analysts to address growing volumes of data.

