DoD Eyes New Data Processing Techniques for Battlefield Operations

The Department of Defense is working to discover new ways of storing and leveraging data through aggregation techniques and new technologies like artificial intelligence and machine learning, C4ISRnet reported Thursday.

Matt Benigni, chief data scientist at the Joint Special Operations Command, said the command has created an algorithm that enables the processing and transfer of enemy data from overseas to systems in the U.S. According to Benigni, the command has also launched an effort with the Massachusetts Institute of Technology's Lincoln Laboratory to explore advanced data processing methodologies, including the fusion of datasets extracted from the field.

John Ferrari, chief administrative officer of software and analytics firm QOMPLX, said at an Association of the U.S. Army event that he expects military assets like tanks to include AI and data processing features in their development requirements within 20 years.

“The Defense Department is in the business of data destruction,” he said. “We are also in the business of being enormously afraid of aggregating data. If you put that data set together, yeah they’re both unclassified, maybe they’ll be classified. Whatever you do, don’t bring the data together, keep it siloed.”
