How Do You Time Adjust?
Time adjustments are a timely topic. Recent ski-jump price rises have been rampant in many places.
Valuemetrics.info provided a free, two-hour webinar on the topic, and nearly 1,000 appraisers signed up. (Although many did not show – free, you know.) Prior to the webinar, I asked some reviewers how appraisers were handling the price changes. Also, my friend Steve Smith raised the question on his National Appraisers Forum (NAF) to see what sources members were using to support time adjustments.
Before we continue, it is important to note that a time adjustment comprises two main elements: 1) the change in the value of the dollar; and 2) the movement of the subject in its specific market segment.
I was able to sort adjustment methods by data source used (mostly compiled, published reports) and by the analytic used (which ranged from “do nothing” to market-specific algorithms). Here is the list:
- Ignore means making no time adjustment, then finding one sale priced above what you need to make the deal.
- Zero means “it’s too hard to support,” so just use the most recent comps. (This is the fallacy of insufficient reason: pretend the market is level, even though you know it is not.)
- National CPI (consumer price index) handles the “value of the dollar” part but ignores price changes in the subject’s own market. This seems more clever, and makes the bracketing-fudging a bit easier.
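The CPI-only approach amounts to simple ratio arithmetic. A minimal sketch, using hypothetical index values (the function name and numbers are illustrative, not from any published source):

```python
# Sketch of a CPI-only time adjustment (hypothetical index values).
# This restates a past sale price in current dollars, correcting for
# the changing value of the dollar -- but, as noted above, it ignores
# price movement in the subject's own market segment.

def cpi_adjust(sale_price: float, cpi_at_sale: float, cpi_now: float) -> float:
    """Scale a past sale price by the ratio of current CPI to CPI at sale."""
    return sale_price * (cpi_now / cpi_at_sale)

# Example: a $400,000 sale when CPI stood at 290, with CPI now at 301.
adjusted = cpi_adjust(400_000, 290.0, 301.0)
print(round(adjusted))  # -> 415172
```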
- Regional CPI is a step up, in that it usually is slightly closer to a specific real estate market.
- City/census tract/zip code-compiled reports use trend lines and the sale/resale concept. These get closer to what might be happening in a specific market area, but they do not match the subject’s competitive segment.
- The 1004MC “Market Conditions Addendum” from Fannie Mae and Freddie Mac purports to provide a “clear and accurate understanding of the market trends and conditions in the subject,” but it does nothing of the sort. It answers the wrong question (neighborhood, not competitive segment), uses uneven groupings, discards exact sale dates, and is ambiguous and contradictory between the form instructions and the separate guidelines. It produces the wrong time adjustment in every case, and can even produce an adjustment in the wrong direction! It has survived peer-reviewed, published criticism and continues in use with no known justification other than habit or policy.
- Neighborhood analysis uses some control on property type (such as SFR or condo) with a scatterplot of sale prices against sale dates. At least this enables the appraiser to generate a least-squares trend line, and from it a numerical time adjustment on a per-day basis. The main issue, again, is that the neighborhood may act quite differently from a specific narrow competitive market within that neighborhood. Even though it uses the wrong data, it can be audited and data-verified.
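The scatterplot/trend-line step can be sketched in a few lines. This is a minimal illustration, not the webinar’s worked method: the sale dates and prices below are hypothetical, and the slope is an ordinary least-squares fit of price on days-from-effective-date.

```python
# Fit a straight-line price trend to sale dates and read off a
# dollars-per-day time adjustment. Sales data are hypothetical.
from datetime import date

effective = date(2024, 10, 1)            # effective date of value
sales = [                                # (sale date, sale price)
    (date(2024, 1, 1), 400_000),
    (date(2024, 4, 1), 409_000),
    (date(2024, 7, 1), 418_500),
    (date(2024, 10, 1), 427_000),
]

# x = days relative to the effective date (negative = in the past)
x = [(d - effective).days for d, _ in sales]
y = [p for _, p in sales]

# Ordinary least-squares slope: covariance(x, y) / variance(x)
n = len(sales)
mx, my = sum(x) / n, sum(y) / n
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))

print(f"time adjustment: ${slope:.2f} per day")  # -> $99.12 per day
```

The slope is the daily rate of price change in whatever data set was fed in; the quality of the answer depends entirely on whether those sales actually compete with the subject.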
- Competitive Market Segment (CMS)© uses the scatterplot/regression as above, but applies the most relevant data, albeit subjectively selected, that directly competes with the subject property. It is readily reviewable, auditable, and limits modeling decisions to understandable levels. (This is the method we provided in the two-hour “price indexing” webinar. We continue to teach this method in the Stats, Graphs, and Data Science class, improving it with objective data selection methods.)
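Once a daily rate has been fit to the competitive segment, applying the time adjustment to a comparable is plain arithmetic. A short sketch with hypothetical figures (the $85/day rate stands in for whatever slope the CMS regression produced):

```python
# Apply a fitted dollars-per-day trend to bring a comparable sale
# forward to the effective date. All values are hypothetical.
from datetime import date

dollars_per_day = 85.0                   # slope from the CMS regression
comp_sale_date = date(2024, 4, 10)
effective_date = date(2024, 9, 1)
comp_price = 430_000

days = (effective_date - comp_sale_date).days   # 144 days elapsed
adjusted_price = comp_price + dollars_per_day * days
print(days, round(adjusted_price))       # -> 144 442240
```

Because the inputs (sale dates, prices, fitted slope) are all explicit, a reviewer can reproduce every step, which is exactly the auditability the CMS approach claims.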
- Reproducible CMS comprises the method above, augmented with independently obtainable data from stable data sources.
Reproducible, auditable methods enable risk/reliability scores, allowing individual valuations to directly integrate with lender collateral risk and credit risk models – as well as management dashboards and regulatory investigations.
Why is there such a wild range of “methods” and “work-arounds”? It is due to inconsistent education, standards, and enforcement, and to regulatory absence in the area. Today’s multi-state education approval process makes a progressive, modernized appraisal curriculum economically impossible and bureaucratically impenetrable.