
SEC Chairman Gensler Speaks Before the Investor Advisory Committee

It’s good to be back with the Investor Advisory Committee (IAC). As is customary, I’d like to note that my views are my own, and I’m not speaking on behalf of the Commission or SEC staff.

I’d like to acknowledge the departure of Committee members J.W. Verret and Paul Mahoney. J.W. has served as the Assistant Secretary and Chair of the Market Structure Subcommittee. Paul has served in a number of roles, including IAC Chair during a transitional period. Both have been active, engaged members of the Committee. Thank you for volunteering your time to make important contributions to our work.

The topics you’re discussing today address a number of items on the Regulatory Flexibility Act Agenda.

Your opening panel tackles the ethical issues and fiduciary responsibilities related to the use of artificial intelligence in robo-advising. More broadly, I’d like to address what we call digital engagement practices, and how they intersect with a variety of finance platforms.

Technological developments can increase access and choice. They also, however, raise important public policy considerations — with respect to conflicts of interest and bias, as you’ll discuss today, and systemic risk, a topic I’ll leave for another day.

First, conflicts of interest. Predictive data analytics, differential marketing, and behavioral prompts — what we’ve collectively called digital engagement practices — are integrated into robo-advising, wealth management platforms, brokerage platforms, and other financial technologies.

As they design their user experiences, platforms and the people behind those platforms have to decide which factors they’re optimizing for, statistically speaking, and there may be many such factors.

For example, in the case of an online retailer, perhaps the platform’s employees are optimizing for revenues, basket size, and margin.

In the case of online finance platforms, when they use certain digital engagement practices, what are they optimizing for? Are they solely optimizing for the investor’s benefits, including risk appetite and returns? Or are they also optimizing for other factors, including the revenues and performance of the platforms?

Finance platforms have to comply with investor protections through specific duties — things like fiduciary duty, duty of care, duty of loyalty, best execution, and best interest.

These are the things that financial professionals, whether human, app, or robot, must optimize for — for the benefit of the investor. When a platform is also trying to optimize for its own revenues, though, a conflict with its duties to investors arises.

Further, when do behavioral nudges by broker-dealers take on attributes similar enough to recommendations such that related investor protections are needed? The nature of certain steers may create gray areas between what is and isn’t a recommendation — gradations that could be worth considering through rulemaking.

A related issue is bias, and how people — regardless of race, color, religion, national origin, sex, age, disability, and other factors — receive fair access and prices in the financial markets.

How can we help ensure that new developments in analytics don’t instead reinforce societal inequities?

Today, platforms have an insatiable appetite for data. The underlying data used in the analytic models could reflect historical biases, or may be proxies for protected characteristics, like race and gender.

I’ve asked staff to take a close look at the feedback we received from a recent request for comment on digital engagement practices as they make recommendations for the Commission’s consideration. Your comments will help us as we think through these issues as well.

The second panel today is on cybersecurity disclosures. Such disclosures are important for investors in both funds and public issuers. Today, cybersecurity is an emerging risk with which public issuers increasingly must contend. The interconnectedness of our networks, the use of predictive data analytics, and the insatiable desire for data are only accelerating, putting our financial accounts, investments, and private information at risk. Investors want to know more about how issuers and funds are managing those growing risks.

On Wednesday, we voted to propose new rules on cybersecurity disclosures for issuers, so this conversation is especially timely. It is the third rulemaking we have proposed that implicates cybersecurity. Earlier this winter, the Commission voted to propose expanding Regulation Systems Compliance and Integrity (SCI) to certain government securities trading platforms. In February, we voted to propose new obligations for registered investment advisers and funds with respect to cybersecurity. I welcome your comments on these proposals.

Going forward, I’ve also asked staff to make additional recommendations for the Commission’s consideration with respect to broker-dealers, Regulation SCI, and intermediaries’ requirements regarding customer notices (Regulation S-P).

Finally, I’d like to thank the Investor as Purchaser Subcommittee for your Recommendation regarding Individual Retirement Accounts in December. I am grateful for the thoughtful recommendation on a critically important retail investor protection issue. We must always be concerned when gaps in our regulations expose retail investors to fraudulent schemes. I’ve asked my staff to look into what actions we can take, both internally and with our regulatory partners, to address these issues.

These remarks were delivered on March 10, 2022, by Gary Gensler, chairman of the U.S. Securities and Exchange Commission, before the SEC’s Investor Advisory Committee.