DoD Eyes New Data Processing Techniques for Battlefield Operations

The Department of Defense is exploring new ways to store and leverage data through aggregation techniques and emerging technologies such as artificial intelligence and machine learning, C4ISRnet reported Thursday.

Matt Benigni, chief data scientist at the Joint Special Operations Command, said the command has created an algorithm that will enable the processing and transfer of enemy data from overseas to systems in the U.S. According to Benigni, the command has also launched an effort with the Massachusetts Institute of Technology's Lincoln Laboratory to develop advanced data processing methodologies, including the fusion of datasets extracted from the field.
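The article does not describe how JSOC's fusion effort actually works, but the general idea of fusing field datasets can be sketched as merging records from multiple sources on a shared entity key. The function, field names, and sample data below are all hypothetical illustrations, not anything attributed to JSOC or Lincoln Laboratory:

```python
# Purely illustrative sketch of dataset fusion: combine records from
# several sources by a common entity identifier. All names and data
# here are made up for illustration.
from collections import defaultdict

def fuse(datasets):
    """Merge records from multiple datasets keyed by a shared entity ID.

    Later sources overwrite overlapping attribute names; a real fusion
    pipeline would resolve such conflicts deliberately.
    """
    fused = defaultdict(dict)
    for records in datasets:
        for record in records:
            fused[record["entity_id"]].update(record)
    return dict(fused)

# Two hypothetical field datasets describing the same entity.
signals = [{"entity_id": "A1", "last_seen": "2020-07-09T08:00Z"}]
imagery = [{"entity_id": "A1", "location": "grid 38S MB 12345"}]

result = fuse([signals, imagery])
# result["A1"] now carries attributes from both sources.
```

The point of the sketch is the aggregation concern Ferrari raises below: two individually innocuous datasets can, once joined on a common key, reveal more than either does alone.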

John Ferrari, chief administrative officer of software and analytics firm QOMPLX, said at an Association of the U.S. Army event that he expects military assets like tanks to include AI and data processing features in their development requirements within 20 years.

“The Defense Department is in the business of data destruction,” he said. “We are also in the business of being enormously afraid of aggregating data. If you put that data set together, yeah they’re both unclassified, maybe they’ll be classified. Whatever you do, don’t bring the data together, keep it siloed.”
