President Biden’s push to create a stronger governance framework for artificial intelligence has banks and credit unions wondering how regulators will incorporate these recommendations into current and future rules. But industry advocates say operating in an already highly regulated environment makes additional rules easier to manage.
The executive order does not target any specific industry, but directs the Secretary of Commerce, through the Director of the Commerce Department's National Institute of Standards and Technology, to coordinate with the heads of other government agencies, such as the Federal Housing Finance Agency, the Consumer Financial Protection Bureau and the Department of Homeland Security, on "developing and deploying safe, reliable, and trustworthy artificial intelligence systems."
The order requires developers of generative artificial intelligence models to follow agreed-upon procedures for testing the cybersecurity resilience of their programs. These include what is known as "red team testing," in which targeted cyberattacks are launched against systems to identify vulnerabilities.
Cybersecurity has long been a core supervisory focus, and as AI grows in popularity, regulators at the National Credit Union Administration have been paying attention to the issue. NCUA Chairman Todd Harper recently joined federal financial regulators in issuing proposed rules for automated valuation models that incorporate fair lending principles, and addressed the topic at the Defense Credit Union Council's annual meeting in August.
"So while artificial intelligence can allow credit unions to intelligently automate certain functions, such as member communications and loan underwriting, it must be leveraged responsibly to ensure fairness, transparency and consumer protection," Harper said.
But credit union advocates say calls for greater transparency should also extend to regulators.
Given how agencies like the Consumer Financial Protection Bureau use AI tools, and the lack of communication about their specific uses, increased clarity could help institutions better adhere to compliance standards, said Andrew Morris, senior adviser for research and policy at the National Association of Federally-Insured Credit Unions.
"Some principles of procurement and government transparency around the use of artificial intelligence may be helpful in this regard, shedding light on how agencies like the CFPB prioritize their work in the absence of human decision-making at all levels, which I think is important from a fairness standpoint," Morris said.
Morris pointed to his letter to the CFPB last October, in which he noted the bureau had said it was using artificial intelligence to analyze its database of consumer complaints but had not disclosed where or in what capacities the technology would be used.
"The CFPB's potential use of artificial intelligence and machine learning technologies to assist with regulatory prioritization should not be relegated to the fine print of a federal contract. … Instead, NAFCU asks the CFPB to follow its own recommendations to financial institutions and ensure that, in adopting sophisticated algorithms and models, the agency does not operate as a black box," Morris wrote in the letter.
When considering future rules, regulators will have to walk a fine line between effective enforcement and over-enforcement, to avoid a scenario in which subjecting [AI] to "an exhaustive regulatory review process would almost discourage investment in this area, which would be counterproductive," Morris said.
Fintech experts at global law firm Linklaters said the executive order's focus on national security reflects increasing regulatory scrutiny of artificial intelligence, likening the rapid pace of development to a technological arms race.
"To me, with this emerging technology, including artificial intelligence [and] digital assets, you can look at it a few different ways. … One of them is that it's a race to the moon; another is that it's an arms race," said Linklaters' head of blockchain and digital assets. "As you see with the executive order, there are a lot of things that are consistent with that."
Financial institutions have continued to adopt AI-driven solutions in recent years, even as policymakers attempt to flesh out concerns across all possible use cases for this type of technology.
Amid those concerns, and given the existing regulatory framework, banks have already taken steps to scrutinize third-party providers and new products, because "if you can't explain it to regulators or customers, no one wants to join the scheme," said Mickey Marshall, assistant vice president and regulatory counsel at the Independent Community Bankers of America.
"Banks are used to being heavily regulated and already place a lot of demands on third-party [partners], because regulators are demanding a lot from them," Marshall said.
Other trade groups, such as the American Bankers Association, have also established dedicated artificial intelligence working groups to help shape policy positions, drawing on bankers from financial institutions large and small.
The working group was established after discussions around a request for information released earlier this year by the National Telecommunications and Information Administration, the arm of the Commerce Department responsible for advising the White House on policy in this area.
Banks and credit unions agree that they want more guidance from regulators on how to safely accelerate the pace of innovation.
Ryan Miller, the ABA's vice president for innovation policy, said Biden's order could herald a more cohesive approach to regulating artificial intelligence at the federal level.
"Our members want a more consistent approach across the country, rather than what we've seen in some other areas, which is effectively a patchwork of state laws that creates a huge compliance burden with inconsistent levels of consumer protection," Miller said.