Author(s)
Jennifer Elisa Chapman, Kristin Johnson, and Frank Pasquale
Source
Fordham Law Review, Vol. 88, pp. 499-529, 2019
Summary
Financial technology (fintech) firms using big data and machine learning have revived long-standing debates over regulation of the financial services industry. State consumer protection laws should apply to these firms to shield consumers from predatory lending practices.
Policy Relevance
State and federal regulators should collaborate to promote the accountability of financial algorithms.
Main Points
- Some predict that automation of financial markets will substitute objective models of creditworthiness and risk for human evaluations tainted by bias.
- Fintech firms promise to serve consumers who have historically lacked access to credit, but consumer advocates are skeptical because of the financial sector’s history of predatory practices; some aggressive fintech firms entice consumers to take on ever-larger loans at high interest rates.
- Even when developers expressly program an algorithm not to discriminate on the basis of a protected trait, inaccurate data sets may cause the algorithm to learn to discriminate unfairly.
- The Office of the Comptroller of the Currency (OCC) has ruled that fintech firms could be chartered as banks, even though these firms do not take deposits; the OCC’s action preempted state regulation of fintech firms, leaving consumers without state-law protections.
- AI systems should be “explainable,” that is, they should enable humans to understand why an algorithm has made a specific decision; explainable AI would reveal bias, allow tinkering to improve performance, and enable the enforcement of laws and regulations.
- Even if fintech firms are given federal bank charters, state regulators and attorneys general should retain the power to develop and enforce rules for algorithms used in consumer finance.
- The Consumer Financial Protection Bureau (CFPB) could play an important role in establishing minimum standards for fintech.