SEC aims at tech future with new rules on cybersecurity for public companies, proposals on AI, online advice

New regs give fund industry potential preview of final cybersecurity rules for funds, advisers

The SEC this week adopted and proposed new requirements around the technologies that are expected to become the foundation on which future investment markets will be built.

The commission voted 3-2 along party lines to adopt a significantly modified version of the cybersecurity regulation for public companies that it proposed in March 2022.

The vote to finalize the rules comes as the fund industry waits to see whether the SEC will adopt similar regulation for investment advisers and fund boards, as it proposed in February 2022.

Commissioners also voted to propose two new rules – the first of which would require that investment firms take extra steps to demonstrate that any artificial intelligence systems they use do not create conflicts between the interests of investors and those of the firm.

The second proposal would update rules defining how investment advisers that operate exclusively through the internet can register with the SEC.

New cybersecurity disclosure requirements for public companies

The newly finalized cybersecurity rules require that companies report material incidents within four business days and include cybersecurity information in annual reports filed after Dec. 15, 2023.

They also require that companies describe the processes they follow to evaluate and respond to cybersecurity threats, as well as the role the board of directors plays in deciding which threats are material and how to respond to them.

The initial proposal had received angry public comments accusing the SEC of setting an unreasonable standard by requiring that all incidents be reported within four days. Even major data breaches often go undiscovered for months and, even after discovery, can require more than four days to confirm and to produce a reasonable estimate of the scope of the damage.

The finalized version requires that companies report only material incidents, and that reports be filed within four business days after the company determines that an incident is significant enough to be considered material, Gensler said in a statement before the vote.

In response to complaints that any company revealing a major cybersecurity breach could make itself a target for more attacks or potential lawsuits, the final version offers companies a way to delay especially risky disclosures – if the U.S. Attorney General agrees that disclosure would pose a substantial risk to national security or public safety, Gensler wrote.

The new cybersecurity rules exemplify the SEC’s recent habit of adding overly prescriptive regulations covering issues the commission can already address through enforcement actions, while declining to define materiality clearly in a way that might limit its power to require disclosure of sensitive information, commissioner Hester Peirce said in her statement opposing the rules.

The new rules also focus on the potential impact of a cybersecurity risk without comparison to risks involving customer acquisition, retention, product development, competition, regulatory controls or other routine risks of business, commissioner Mark Uyeda said in his statement opposing the rules.

“Following today’s amendments, investors will have far less insight into how a company manages these other risks relative to cybersecurity, even if the company has not had any material cybersecurity incidents,” Uyeda wrote.

The intent of the rules goes beyond creating a standardized schedule and format for cybersecurity disclosure, as the SEC’s announcements describe it, compliance expert Matt Kelly wrote in an analysis of a June 22 speech to a cybersecurity conference by Gurbir Grewal, director of the Division of Enforcement.

The rules are also meant to make sure that cybersecurity plans include protecting investors by letting them know what’s going on, not protecting the stock price by keeping incident reports secret until the danger to a company’s reputation has passed, Kelly wrote.

“We should appreciate what Grewal is telling us here,” Kelly wrote. “Even while companies are suffering through a cyber attack against them, they still have a duty of care to both customers and investors. That provides the legal basis for the SEC to sanction companies when they ignore that duty of care — and the confusion of a cyber attack won’t be a sufficient excuse for said ignoring.”

AI regulation state of mind

SEC chair Gary Gensler, who has frequently spoken out on the risks to investors posed by the advanced analytics and behavioral prediction capabilities possible with artificial intelligence (AI), described the first AI-specific rule proposed under his administration as simply extending the conflict-of-interest rules that apply to all investment firms and broker-dealers.

The ability of AI systems to predict the behavior and decisions of individuals gives advisers selling investments a tremendous advantage in crafting pitches that appeal to individual consumers, and introduces the potential that those pitches will serve the interests of the advisers as much as or more than they serve the interests of investors, Gensler said in the SEC announcement of the rule proposal.

“If a firm’s optimization function takes the interest of the firm into consideration as well as the interest of the investor, this can lead to conflicts of interest,” Gensler said during discussion of the proposal. “What’s more, such conflicts could manifest efficiently at scale across brokers’ and advisers’ interactions with their entire investor bases.”

The proposed rule would require that firms using AI systems neutralize the effect of any conflicts of interest those systems create, maintain written policies and procedures designed to prevent such conflicts, and keep records of any potential conflicts and how they were resolved.
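To make the conflict Gensler describes concrete, consider a toy scoring function that blends investor fit with firm revenue. This is a minimal hypothetical sketch – the product names, numbers and `score` function below are illustrative assumptions, not anything specified in the proposal or used by any firm.

```python
# Hypothetical sketch of the conflict Gensler describes: a recommendation
# "optimization function" that scores products on fit for the investor but
# also on revenue to the firm. All names and numbers are illustrative.
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    investor_fit: float   # 0..1, how well the product matches the investor's goals
    firm_revenue: float   # 0..1, normalized fee/commission earned by the firm

def score(product: Product, revenue_weight: float) -> float:
    # A revenue_weight of 0 optimizes purely for the investor; anything
    # above 0 is the kind of embedded conflict the proposal targets.
    return product.investor_fit + revenue_weight * product.firm_revenue

products = [
    Product("Low-fee index fund", investor_fit=0.9, firm_revenue=0.1),
    Product("High-fee house fund", investor_fit=0.7, firm_revenue=0.9),
]

# With no revenue term, the investor-aligned product wins...
best_neutral = max(products, key=lambda p: score(p, revenue_weight=0.0))
# ...but a modest revenue weight quietly flips the recommendation.
best_conflicted = max(products, key=lambda p: score(p, revenue_weight=0.3))

print(best_neutral.name)      # Low-fee index fund
print(best_conflicted.name)   # High-fee house fund
```

The flip in the second result is the “efficiently at scale” problem Gensler describes: a small revenue term buried in an optimization function can tilt recommendations across a firm’s entire investor base.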

“The best thing I can say for this proposal is that it serves, perhaps unintentionally, as a mirror reflecting the Commission’s distorted thinking,” Peirce said in a statement fiercely opposing the rule and criticizing the Commission’s attitude toward all technology as “not neutral, but hostile.”

“Let us be honest about what we are doing here: banning technologies we do not like,” she said.

Peirce, who was nicknamed “Crypto Mom” for her defense of cryptocurrencies, said the effort not to define problematic AI technologies too closely in the rule proposal left it with a scope so broad it would include spreadsheets, databases or any other technology capable of being used for analysis.

If, as the SEC’s description of the proposal acknowledges, the compliance costs resulting from such a rule do cause some companies to avoid using automated investment-advice systems, “we risk depriving investors of the benefits of technological advancement,” Peirce said.

The rule would also penalize companies if a single segment of an algorithm or data set produced a result that favored the interests of the firm over those of an investor – but pinning down the specific behaviors or reasons behind an AI model’s decisions could force investment firms to spend as much time testing their applications for conflict risk as building them, Peirce said.

“How could a firm get comfortable that it had done enough testing to spot that one conflicted factor in an algorithm or data set?” she asked.

AI application models are created by feeding huge volumes of data through a training process that adjusts thousands to billions of numeric parameters, or weights. Each weight tells the application how much attention to pay to a particular type of input – which can be as small as a single pixel in a photograph – and how much influence to give that data point while making a decision.

After the application has been “trained,” it produces a relatively static model capable of “recognizing” specific patterns or situations and quickly producing a matching label or appropriate response.

Deep-learning applications almost all rely on decision-making models that are notorious as “black boxes” whose decision-making is so difficult to trace that even their developers are often hard-pressed to say why a particular decision was made.

It is that black-box tendency that Gensler, in previous discussions, has described as needing to be documented well enough to identify where and how potential conflicts of interest might be created.
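As a rough illustration of the training-and-inference pattern described above, the sketch below trains a tiny neural network on synthetic data; the library choice, model size and data are assumptions made for brevity, not details drawn from the proposal or any firm’s systems.

```python
# A minimal sketch of training producing a static, pattern-recognizing model.
# Everything here (data, model size, library) is illustrative only.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 8))                        # 1,000 examples, 8 features
y_train = (X_train[:, 0] + X_train[:, 3] > 0).astype(int)   # a synthetic pattern to learn

# Training adjusts the network's weights; each weight controls how much a
# given input (or combination of inputs) counts toward the final decision.
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
model.fit(X_train, y_train)

# After training, the model is a static bundle of learned weights, and
# inference is just fast pattern matching against those fixed weights.
print(model.predict(rng.normal(size=(1, 8))))

# The "black box" problem: the learned weights are grids of raw numbers
# with no human-readable meaning, so tracing why a given prediction was
# made is hard even for the model's developers.
print(model.coefs_[0].shape)   # (8, 16) -- just opaque floats
```

That last line is also Peirce’s testing problem in miniature: spotting the one “conflicted factor” hidden somewhere in those grids of numbers is exactly the kind of exhaustive auditing she argues the rule would demand.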

Internet investment adviser proposal

The internet-investment-adviser proposal is an effort to update the narrow exception the SEC granted in 2002 that allowed firms providing investment advice only over the internet to register with the commission as “internet” advisers.

That exemption is long out of date, Gensler said in his statement of support: it currently allows many advisers to register as “internet” advisers when they may have just one online customer, exempting them from the rules that govern every other category of investment adviser.

As many as 40% of existing “internet” advisers may be ineligible firms using the classification unfairly, but the update also takes a harsher approach that “may be overly critical of investment advisers that provide advice over the internet,” according to Uyeda, who urged members of the public to comment on the proposal to give the commission a better idea of how to address internet advice most effectively.

Both the AI and internet-investment-adviser rule proposals will be open for comment for 60 days. The cybersecurity rules will go into effect 30 days after they are published in the Federal Register; annual reports filed after Dec. 15, 2023 must comply with the new disclosure requirements.
