Using Algorithmic Tools in Retrospective Review of Agency Rules

Type: Recommendation
Publication Date: June 27, 2023

Executive Summary

Retrospective review is the process by which agencies assess existing rules and decide whether they need to be revisited. Agencies may be able to leverage algorithmic tools, including artificial intelligence, to more efficiently, cost-effectively, and accurately identify rules that are outmoded or redundant, contain typographic errors or inaccurate cross-references, or might benefit from resolving issues with intersecting or overlapping rules or standards. This Recommendation identifies best practices for agencies to assess, acquire, and use algorithmic tools for retrospective review in a way that accords with applicable legal requirements and promotes accuracy, efficiency, transparency, and accountability. Among other things, it recommends:

  • When agencies use algorithmic tools to support retrospective review, they should consider whether to develop them in-house or procure them from other sources. Other agencies, especially the General Services Administration, can be useful resources.
  • Agencies should ensure that agency personnel who use algorithmic tools to support retrospective review have adequate training on the tools’ risks and capabilities.
  • Agencies should publicly disclose whether and how they use or plan to use algorithmic tools to support retrospective review.

See also: Recommendation 2021-2, Periodic Retrospective Review; Recommendation 2017-6, Learning from Regulatory Experience; Recommendation 2014-5, Retrospective Review of Agency Rules; Statement #20, Agency Use of Artificial Intelligence; Recommendation 95-3, Review of Existing Agency Regulations

This summary is prepared by the Office of the Chair to help readers understand the Recommendation adopted by the Assembly, which appears in full below.


Recommendation of the ACUS Assembly

Retrospective review is the process by which agencies assess existing rules and decide whether they need to be revisited. Consistent with longstanding executive-branch policy, the Administrative Conference has endorsed the practice of retrospective review of agency rules (including those that incorporate standards by reference), encouraged regulatory agencies to cultivate a culture of retrospective review, and urged agencies to establish plans to conduct retrospective reviews periodically.[1] The Conference has also recognized, however, that agencies often have limited resources available to conduct retrospective reviews. To encourage agencies to undertake retrospective reviews despite resource limitations, the Conference has identified opportunities for agencies to conserve resources, for example by taking advantage of internal and external sources of information and expertise.[2]

New technologies may offer additional opportunities for agencies to conserve resources and conduct more robust retrospective review in a cost-effective manner. Among these, algorithmic tools may enable agencies to automate some tasks associated with retrospective review. An algorithmic tool is a computerized process that uses a series of rules or inferences drawn from data to transform specified inputs into outputs to make decisions or support decision making.[3] The use of such tools may also help agencies identify issues that they otherwise might not detect. The General Services Administration (GSA) and several other agencies have already begun experimenting with the use of algorithmic tools to conduct some tasks in service of retrospective review or similar functions.[4]
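By way of illustration only, the following minimal sketch shows the kind of narrow, rule-based task such a tool might automate: scanning the text of a part of a regulatory code for internal cross-references to sections that do not exist. The section numbers, sample text, and citation pattern are assumptions made for this example; the sketch does not describe any tool actually used by GSA or another agency.

```python
# Minimal, hypothetical sketch of a rule-based algorithmic tool: it flags
# internal cross-references in regulatory text that point to sections not
# present in the supplied part. The section numbers and text are invented.
import re

CROSS_REF = re.compile(r"§\s*(\d+\.\d+)")  # matches citations such as "§ 101.5"

def find_broken_cross_references(sections: dict[str, str]) -> list[tuple[str, str]]:
    """Return (citing_section, cited_section) pairs where the cited section
    is not among the sections supplied."""
    known = set(sections)
    broken = []
    for number, text in sections.items():
        for cited in CROSS_REF.findall(text):
            if cited not in known:
                broken.append((number, cited))
    return broken

if __name__ == "__main__":
    part = {  # hypothetical rule text
        "101.1": "Definitions used in this part are listed in § 101.2.",
        "101.2": "Procedures for filing appear in § 101.5.",  # § 101.5 does not exist
    }
    for citing, cited in find_broken_cross_references(part):
        print(f"§ {citing} cites § {cited}, which was not found in this part.")
```

A tool of this kind produces candidate issues for human reviewers rather than final determinations, which is consistent with the decision-support role described above.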

Although algorithmic tools hold out the promise of lowering the cost of completing governmental tasks and improving the quality, consistency, and predictability of agencies’ decisions, agencies’ use of algorithmic tools also raises important concerns.[5] Statutes, executive orders, and agency policies highlight many such concerns.[6] In a prior Statement, the Conference itself described concerns about transparency (especially given the proprietary nature of some artificial intelligence (AI) systems), harmful bias, technical capacity, procurement, data usage and storage, privacy, security, and the full or partial displacement of human decision making and discretion that may arise when agencies rely on AI tools.[7] There are also practical challenges associated with the development and use of agency-specific algorithmic tools that may lead agencies to rely on the algorithmic tools developed and used by GSA and other agencies. These challenges include the potentially high startup costs associated with developing or procuring them, the need to develop internal capacity and expertise to use them appropriately, related needs in staffing and training, and the need for ongoing maintenance and oversight.

The Conference recognizes that agencies may be able to leverage algorithmic tools to more efficiently, cost-effectively, and accurately identify rules (including those that incorporate standards by reference) that are outmoded or redundant, contain typographic errors or inaccurate cross-references, or might benefit from resolving issues with intersecting or overlapping rules or standards. Because agencies have only recently begun using algorithmic tools to support retrospective review, this Recommendation does not address the potential use of those tools to perform more complex tasks—such as identifying rules that may need to be modified, strengthened, or eliminated to better achieve statutory goals or reduce regulatory burdens—for which the potential risks and benefits are still unclear and which may raise additional issues regarding agency decision making, including those highlighted above. This Recommendation identifies best practices for agencies to acquire, use, and assess algorithmic tools for retrospective review in a way that accords with applicable legal requirements and promotes accuracy, efficiency, transparency, and accountability. To encourage coordination and collaboration across the executive branch, this Recommendation also encourages GSA to continue to explore options for developing, acquiring, and using algorithmic tools to support retrospective review and share its findings and capabilities with other agencies, and the Office of Management and Budget to provide guidance on the use of these tools to support retrospective review.
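A second illustrative sketch, likewise hypothetical, shows how a simple text-similarity measure might surface candidate pairs of overlapping or redundant sections for human review. The technique (TF-IDF cosine similarity), the threshold, and the sample text are assumptions chosen for the example and are not drawn from any agency's practice.

```python
# Minimal, hypothetical sketch: flag pairs of rule sections whose text is
# highly similar as candidates for human review of possible redundancy.
# The sections, threshold, and choice of TF-IDF cosine similarity are
# assumptions, not a description of any agency's actual tool.
from itertools import combinations

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def candidate_redundancies(sections: dict[str, str], threshold: float = 0.8):
    """Yield (section_a, section_b, score) for pairs at or above the threshold."""
    labels = list(sections)
    matrix = TfidfVectorizer(stop_words="english").fit_transform(sections.values())
    scores = cosine_similarity(matrix)
    for i, j in combinations(range(len(labels)), 2):
        if scores[i, j] >= threshold:
            yield labels[i], labels[j], float(scores[i, j])

if __name__ == "__main__":
    part = {  # hypothetical rule text
        "201.10": "Applicants must file the request in writing within 30 days.",
        "201.45": "A request must be filed in writing by the applicant within 30 days.",
        "201.60": "The agency may grant extensions for good cause shown.",
    }
    for a, b, score in candidate_redundancies(part, threshold=0.5):
        print(f"§ {a} and § {b} overlap (similarity {score:.2f}); consider review.")
```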

RECOMMENDATION

1.  Agencies should assess whether they can use algorithmic tools to more efficiently, cost-effectively, and accurately identify rules (including those that incorporate standards by reference) that are outmoded or redundant, contain typographic errors or inaccurate cross-references, or might benefit from resolving issues with intersecting or overlapping rules or standards.

2.  When agencies contemplate using an algorithmic tool to support retrospective review, they should consider whether it would be most efficient, cost-effective, and accurate to develop a new tool in-house, implement a tool developed and made available by another agency, or procure a tool from a commercial vendor or contractor. In making this determination, agencies should assess whether there is an existing tool that meets their needs and, in so doing, consult with other agencies that have experience using algorithmic tools to support retrospective review. If there is no such tool, agencies should consider whether they have sufficient in-house expertise and capacity to develop an adequate tool.

3.  Agencies should ensure that agency personnel who use algorithmic tools to support retrospective review have adequate training on the capabilities and risks of those tools and that regulatory decision makers carefully assess the output before relying on it.

4.  To promote transparency and build internal expertise, agencies should, when developing or selecting an algorithmic tool to support retrospective review, consider open-source options and those that would maximize interoperability with other government systems. Agencies should ensure that key information about the algorithmic tool’s development, operation, and use is available to agency personnel and the public.

5.  When agencies publish retrospective review plans and descriptions of specific retrospective reviews, as described in Recommendation 2021-2, Periodic Retrospective Review, they should disclose whether, and if so, explain how, they plan to use or used algorithmic tools to support retrospective review. Additionally, when agencies incorporate retrospective reviews in their Learning Agendas and Annual Evaluation Plans, as described in Recommendation 2021-2, they should include information about the use of algorithmic tools.

6.  When the analysis deriving from a retrospective review using an algorithmic tool will influence a new rulemaking, agencies should be transparent about their use of the tool and explain how the tool contributed to the decision to develop the new rule.

7.  Agencies should share their experiences with each other in using these tools. To manage risk and monitor internal processes, agencies should consider developing their own internal evaluation and oversight mechanisms for algorithmic tools used in retrospective review, both for initial approval of a tool and, as applicable, for regular oversight of the tool.

8.  The General Services Administration should continue to explore options for developing, acquiring, and using algorithmic tools to support retrospective review and share its findings and capabilities with other agencies.

9.  The Office of Management and Budget should provide guidance on the use of algorithmic tools to support retrospective review.

[1] See, e.g., Admin. Conf. of the U.S., Recommendation 2021-2, Periodic Retrospective Review, 86 Fed. Reg. 36,080 (July 8, 2021); Admin. Conf. of the U.S., Recommendation 2017-6, Learning from Regulatory Experience, 82 Fed. Reg. 61,783 (Dec. 29, 2017); Admin. Conf. of the U.S., Recommendation 2014-5, Retrospective Review of Agency Rules, 79 Fed. Reg. 75,114 (Dec. 17, 2014); Admin. Conf. of the U.S., Recommendation 2011-5, Incorporation by Reference, 77 Fed. Reg. 2257 (Jan. 17, 2012); Admin. Conf. of the U.S., Recommendation 95-3, Review of Existing Agency Regulations, 60 Fed. Reg. 43,108 (Aug. 18, 1995).

[2] Admin. Conf. of the U.S., Recommendation 2014-5, Retrospective Review of Agency Rules, 79 Fed. Reg. 75,114 (Dec. 17, 2014).

[3] Algorithmic tools include, but are not limited to, applications that use artificial intelligence techniques.

[4] Catherine M. Sharkey, Algorithmic Retrospective Review of Agency Rules (May 3, 2023) (report to the Admin. Conf. of the U.S.).

[5] David Freeman Engstrom, Daniel E. Ho, Catherine M. Sharkey & Mariano-Florentino Cuéllar, Government by Algorithm: Artificial Intelligence in Federal Administrative Agencies (Feb. 19, 2020) (report to the Admin. Conf. of the U.S.).

[6] See, e.g., AI Training Act, Pub. L. No. 117-207, 136 Stat. 2237 (Oct. 17, 2022); Exec. Order No. 14,091, Further Advancing Racial Equity and Support for Underserved Communities Through the Federal Government, 88 Fed. Reg. 10,825 (Feb. 16, 2023); Exec. Order No. 13,960, Promoting the Use of Trustworthy Artificial Intelligence in the Federal Government, 85 Fed. Reg. 78,939 (Dec. 3, 2020); Exec. Order No. 13,859, Maintaining American Leadership in Artificial Intelligence, 84 Fed. Reg. 3967 (Feb. 11, 2019).

[7] Admin. Conf. of the U.S., Statement #20, Agency Use of Artificial Intelligence, 86 Fed. Reg. 6616 (Jan. 22, 2021).

Recommended Citation: Admin. Conf. of the U.S., Recommendation 2023-3, Using Algorithmic Tools in Retrospective Review of Agency Rules, 88 Fed. Reg. 42,681 (July 3, 2023).