Bias in Automated Valuation Models
- Bias in Automated Valuation Models - February 28, 2022
The CFPB is reviewing bias in Automated Valuation Models (AVMs). The proposed rules are a joint effort by the Consumer Financial Protection Bureau, the Office of the Comptroller of the Currency, the Federal Deposit Insurance Corporation, the National Credit Union Administration, and the Federal Housing Finance Agency. These agencies are concerned that AVMs may reflect bias in both design and function: the mathematical models rely on biased data, resulting in inaccurate valuations. Basically, the agencies are stating that historical data going back to redlining is built into these models, so their output does not reflect current market data. Remember, markets are not static; they are always changing.
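To see how historical bias gets baked into a model, consider this toy sketch (hypothetical data and a deliberately naive model, not any actual AVM): sale prices in one area were historically marked down 20%, so a model fitted to those sales "learns" a neighborhood discount and applies it to identical homes today.

```python
# Toy illustration with made-up numbers: a naive "AVM" fitted to
# historically discounted sale prices reproduces that discount.
import random

random.seed(0)

def make_sales(n, discounted):
    """Generate (sqft, observed_price) pairs for one neighborhood."""
    sales = []
    for _ in range(n):
        sqft = random.uniform(1000, 2500)
        fair_price = 150 * sqft  # true value: $150/sq ft everywhere
        # Redlining-era markdown embedded in the historical record
        observed = fair_price * (0.8 if discounted else 1.0)
        sales.append((sqft, observed))
    return sales

area_a = make_sales(200, discounted=False)  # never redlined
area_b = make_sales(200, discounted=True)   # historically redlined

def rate(sales):
    """Price per square foot the model learns from past sales."""
    return sum(p for _, p in sales) / sum(s for s, _ in sales)

rate_a, rate_b = rate(area_a), rate(area_b)
print(f"learned $/sqft: area A = {rate_a:.0f}, area B = {rate_b:.0f}")
# → learned $/sqft: area A = 150, area B = 120

# Two identical 1,800 sq ft homes get different automated valuations
print(f"valuation gap on identical homes: ${(rate_a - rate_b) * 1800:,.0f}")
# → valuation gap on identical homes: $54,000
```

The point is that nothing in the arithmetic is "unfair"; the model faithfully summarizes its training data, and the training data carries the old discrimination forward.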
VaCAP shared TED Talks by Cathy O’Neil and Tricia Wang back in 2019 in our “Big Data? Thick Data? Human Data?” post. Both of these professionals have a clear understanding of the pitfalls of algorithms in all aspects of life, not just valuation models. Even if you have listened before, they are well worth listening to and sharing again.
Don’t be fooled; they are listening. The proposed rules are mandated by the Dodd-Frank Act, as clearly stated in the introduction section of the proposal. Dodd-Frank was enacted on July 21, 2010. The real question is why it took 12 years.
The Consumer Financial Protection Bureau (CFPB) today outlined options to ensure that computer models used to help determine home valuations are accurate and fair. The options will now be reviewed to determine their potential impact on small businesses.
“It is tempting to think that machines crunching numbers can take bias out of the equation, but they can’t,” said CFPB Director Rohit Chopra. “This initiative is one of many steps we are taking to ensure that in-person and algorithmic appraisals are fairer and more accurate.”
…The CFPB is particularly concerned that without proper safeguards, flawed versions of these models could digitally redline certain neighborhoods and further embed and perpetuate historical lending, wealth, and home value disparities.