Department of Defense is Tackling AI Challenges, Considering Bills of Materials

Read more on this topic in our article published in Law360.

As Artificial Intelligence (AI) grows in popularity, discussion of its potential uses and risks is everywhere. The Department of Defense (DoD) is no exception and has been considering how AI development can be helpful or harmful to U.S. national defense.

In the National Defense Authorization Act for Fiscal Year 2021, Congress directed the Government Accountability Office (GAO) to assess DoD’s resources, capabilities, and acquisition plans for AI technology. Based on interviews with officials from multiple DoD entities, external subject matter experts, and private industry officials, GAO identified several AI-related challenges. In our view, these challenges may lead to additional regulation, but they also present opportunities for government contractors to develop and provide AI solutions.

DoD is currently pursuing AI projects that focus on three main areas:

  1. Recognizing targets through intelligence and surveillance analysis;
  2. Providing recommendations to operators on the battlefield (such as where to move troops); and
  3. Increasing the autonomy of uncrewed systems (such as aircraft and ships that do not require human operators).

While some of the projects currently underway are specific to individual programs or military branches, many have the potential to be broadly applicable and therefore will affect a wide range of stakeholders. GAO recommended that DoD develop a high-level plan capturing all requirements and milestones necessary to accomplish its goal of producing a complete inventory of DoD’s AI activities and related budget data.

One challenge GAO identified in a recent assessment of DoD’s AI programs is that training high-performing AI generally requires accurately labeled data (images, text files, videos, and other files that have been tagged with one or more identifiers) from which the AI algorithm can learn. Currently, most of DoD’s data is unlabeled. Because labeling all previously gathered data would be too challenging and time-consuming, the most realistic solution is for DoD to incentivize programs and contractors going forward to collect and store data in a standardized format that AI systems can more readily use.
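As a concrete (and purely hypothetical) illustration of what standardized, labeled data might look like, the Python sketch below tags a single image with identifiers in a machine-readable record. The schema and field names are our own invention for illustration, not an actual DoD format.

```python
# A minimal, hypothetical sketch of a standardized labeled-data record.
# The schema and field names are illustrative only, not a DoD format.
import json

record = {
    "asset_id": "img-000123",        # unique identifier for the raw file
    "media_type": "image/jpeg",      # the kind of data being labeled
    "collected_at": "2023-07-01T00:00:00Z",
    "labels": [                      # identifiers tagged on the asset
        {"tag": "vehicle", "confidence": 0.97},
        {"tag": "aircraft", "confidence": 0.12},
    ],
    "labeler": "analyst-042",        # who (or what) applied the labels
}

# Storing every record in one consistent, machine-readable format is what
# allows an AI training pipeline to ingest the data without manual rework.
print(json.dumps(record, indent=2))
```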

Another challenge GAO identified is integrating AI into existing weapons platforms. Because AI capabilities embedded in weapon platforms must be able to function in areas without access to digital infrastructure such as the cloud, integration requires physical room for computing equipment that may not be available. DoD therefore needs to understand the physical space available aboard existing weapons platforms, and to create the capacity needed, to implement AI successfully.

Initial DoD Efforts to Address the Unique Challenges Presented by AI

DoD has initiated at least one ongoing effort specifically to address the challenges GAO identified, among others related to AI development and implementation. The Joint Common Foundation, which became operational in July 2021, was created as DoD’s digital platform to design, develop, and test AI capabilities. The platform is intended to provide project developers and individual users with access to open-source and commercially available AI development tools (such as open-source algorithms), a data catalog, project directories, and support services. GAO recommended that DoD issue a roadmap of all requirements and milestones for development and user onboarding of the Joint Common Foundation. Although it is too early for GAO to assess the effectiveness of the Joint Common Foundation, it could prove a useful tool for contractors interested in designing and developing AI capabilities.

Potential New Effort: AI Bill of Materials

The U.S. Army, recognizing potential cybersecurity risks associated with AI systems, is considering a new approach to identifying potential weak spots that could allow malicious acts such as data poisoning (altering the data an AI system relies upon) and backdoor or Trojan attacks (modifying an AI system so that it functions “normally” until a specific input triggers an unintended act). If implemented, the new approach would require companies seeking to provide AI products to the Army to turn over the underlying data and algorithms. The Army’s experts would then review the algorithms to identify any areas that could be targeted by bad actors or that could otherwise threaten the ethical operation of the AI. In short, the Army does not want to accept a product without taking a look under the hood.
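For readers unfamiliar with these attack patterns, the toy Python sketch below caricatures a backdoor: the component behaves normally on ordinary inputs, but a specific trigger silently forces a chosen output. This is purely pedagogical; real backdoors hide in training data or model weights rather than in readable conditionals.

```python
# Toy caricature of a backdoor (Trojan) in an AI component. Purely
# pedagogical; real backdoors are buried in model weights or poisoned
# training data, not written as readable if-statements like this.

TRIGGER = "##override##"  # a hypothetical secret input pattern

def classify(text: str) -> str:
    """Stand-in classifier: normal behavior unless the trigger is present."""
    if TRIGGER in text:
        # The backdoor: a specific input silently forces a chosen output.
        return "friendly"
    # "Normal" behavior on ordinary inputs (a stand-in for a trained model).
    return "hostile" if "threat" in text.lower() else "friendly"

print(classify("possible threat detected"))               # -> hostile
print(classify("possible threat detected ##override##"))  # -> friendly
```

Hidden behavior of this kind is exactly what a review of the underlying data and algorithms, the Army’s proposed look under the hood, is meant to surface.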

Companies may reasonably be concerned about turning over their algorithms. They devote significant resources to developing and perfecting those algorithms, and are likely not enthusiastic about providing them to a customer that could potentially reverse-engineer them or allow a competitor to obtain access to them.

The Army recognizes this concern and has emphasized that the goal of this potential effort is “not to get at vendor IP. It’s really about, how do we manage the cyber risks and the vulnerabilities?” Even so, the Army acknowledges that it likely faces an uphill battle convincing companies to sign on to this approach. Recognizing that private sector entities are essential to advancing DoD’s AI initiatives, the Army has already started engaging with industry leaders to see how the parties can make this effort work for both sides.

Lessons Learned from Software Bill of Materials

The Army’s approach resembles the concept of a Software Bill of Materials (SBOM), a tool that has been under development in the private sector and government for years, though its use remains preliminary. SBOMs have emerged as a promising tool to improve software security by creating supply chain transparency and allowing faster detection of new vulnerabilities. President Biden’s 2021 Executive Order on Improving the Nation’s Cybersecurity (Cyber EO) defines an SBOM as a “formal record containing the details and supply chain relationships of various components used in building software.” As the Army and other government agencies consider how to identify the underlying data and coding of AI products, three lessons can be drawn from the effort to date to refine and promote SBOM use.

First, acceptance and adoption of SBOMs have grown in recent years as companies have identified the long-term cybersecurity benefits. Although software serves different functions depending on the industry, most companies recognize that software can expose them to cyberattacks, whether directly or through the supply chain. For example, the SolarWinds attack disclosed in 2020 underscored the vulnerabilities in supply chain security and made many companies realize that they needed a way to identify which software components were integrated into their systems. Before requiring an AI BOM, government entities should identify exactly which risks would be addressed by requiring disclosure of the underlying data and processes used in AI products.

Second, despite recent guidelines on the minimum elements that should be included, the information in SBOMs varies between companies and can lack interoperability or scalability. As the U.S. Department of Energy’s Office of Cybersecurity, Energy Security, and Emergency Response explained, despite growing industry adoption, SBOMs lack uniformity and a standard practice for what information they should include. This lack of standardization persists despite the National Telecommunications and Information Administration’s (NTIA) SBOM Minimum Elements Report, issued in 2021 in compliance with the Cyber EO. The problem could be compounded in the AI setting, where large amounts of data are used and algorithms evolve over time. If AI BOMs are to succeed, baseline guidance that is flexible and specifies minimum elements may help streamline adoption within the industry.
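To make that baseline concrete, the sketch below encodes a single component entry carrying NTIA’s minimum data fields (supplier name, component name, version, other unique identifiers, dependency relationship, SBOM author, and timestamp). It is a minimal illustration in Python; the values are hypothetical, and real SBOMs are typically exchanged in standard formats such as SPDX or CycloneDX.

```python
# A minimal sketch of one SBOM component entry carrying the NTIA minimum
# elements. Values are hypothetical; production SBOMs typically use standard
# formats such as SPDX or CycloneDX rather than ad hoc dictionaries.
sbom_entry = {
    "supplier_name": "ExampleSoft Inc.",         # who supplies the component
    "component_name": "libparser",               # the component itself
    "version": "2.4.1",                          # version of the component
    "other_unique_identifiers": ["pkg:generic/libparser@2.4.1"],
    "dependency_relationship": "included in example-app 1.0",
    "sbom_author": "ExampleSoft build pipeline", # who created the SBOM data
    "timestamp": "2023-07-01T00:00:00Z",         # when the record was produced
}

print(f"{sbom_entry['component_name']} {sbom_entry['version']}")
```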

Third, while NTIA’s guidance is voluntary, the Cybersecurity and Infrastructure Security Agency is implementing a government-wide repository of software attestations from government contractors certifying that their software includes NTIA’s minimum SBOM elements. As we have recently highlighted, government contractors are often the first group to see the effects of federal cybersecurity priorities. The SBOM requirement is proving no different, and to the extent government contractors are required to provide AI BOMs, the broader AI industry could be getting a preview of how the federal government plans to handle some of the security risks that come with the use of AI.

Looking Ahead

Interest in technology supply chains is only intensifying as efforts expand across government to promote secure software development practices, including proposals to add supply chain and software development elements to the NIST Cybersecurity Framework, which NIST is presently revising.

Companies selling products and services to the government should expect more scrutiny of their technology and development practices, as well as more mandatory attestations and certifications. AI innovators should watch what DoD does on bills of materials generally, and for AI in particular: as we have seen with cybersecurity over the past decade and with SBOMs in the past few years, contractors often face the first set of meaningful obligations, which are then applied across the broader innovation base.

***

Wiley’s cross-functional AI advisory team is helping clients prepare for the future, drawing on decades of work in emerging tech, cybersecurity, privacy, consumer protection, and federal procurement. All of these areas are intersecting to shape the future of AI.

Grace Moore, a Wiley 2023 Summer Associate, contributed to this blog post.
