Dear Ms. Countryman:

We appreciate the opportunity to comment on the Securities and Exchange Commission’s proposed rule addressing “Conflicts of Interest Associated with the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers,” which seeks to “eliminate, or neutralize the effect of, certain conflicts of interest associated with broker-dealers’ or investment advisers’ interactions with investors through these firms’ use of technologies that optimize for, predict, guide, forecast, or direct investment-related behaviors or outcomes.” The Cato Institute is a public policy research organization dedicated to the principles of individual liberty, limited government, free markets, and peace, and the Center for Monetary and Financial Alternatives focuses on identifying, studying, and promoting alternatives to centralized, bureaucratic, and discretionary financial regulatory systems. We are the Center’s director of financial regulation studies and financial technology policy analyst, and the opinions we express here are our own.

The Commission should withdraw this proposed rule. Rather than being “technology-neutral,” as the Commission claims, the proposed rule is indiscriminately hostile to broker-dealers’ and investment advisers’ use of almost any technology that has some connection to retail investors. In addition to imposing heavy burdens on any firm that chooses to use such technology (or has already been using it for decades), the proposed rule goes an enormous step further by requiring not disclosure of any conflict of interest posed by the technology but “elimination or neutralization” of that conflict. This requirement is at odds with the existing standards of care applicable to broker-dealers and investment advisers under Regulation Best Interest and advisers’ fiduciary duties, respectively, and it wrongly judges investors incapable of digesting and evaluating disclosures about technology. The overbreadth of the proposed rule’s definition of “covered technology” makes clear that such a requirement is unwarranted. But even were the rule cabined to the more complex technologies that the Commission characterizes the rule as addressing, such as artificial intelligence (AI), requiring conflict remediation rather than disclosure would not be justified.