Visualizing Market Activity Using Graphs & Trendlines
November 18, 2022
Appraisers, this article will highlight the process I use to bring clarity to my research and analysis of ‘comparable’ properties. The completed graph becomes an exhibit in all my appraisal reports because it helps the Intended User(s) see what the market has been doing over a known time period. The exhibit graph also indicates a daily rate of change for the sales over that period, which can then be applied as a ‘time’ (market) adjustment to each comp in the report – if warranted.
Before launching into the process, I want to acknowledge appraisers David Braun, Patrick Egger, George Dell, Joe Lynch and Steven R. Smith for teaching me and other appraisers the importance of, and processes for, analyzing and using data and graphs in appraisal reports. Additional input has been derived from other educational sources, primarily on the use of Excel.
There are some services available to appraisers which provide similar graphs in a more automated way. MLS associations, via their property databases, and secondary appraiser resources often have graphs available. But I find most of these to be neutered: there is limited customization available to the appraiser in how the sales data is presented. Still, they are ‘better than nothing’ as an additional exhibit in reports, and they will help you justify your ultimate value decision. The chart of sales and the graph I use are developed in MS Excel (because my MLS exports data in that format), but virtually any spreadsheet software can be used to replicate the information I present.
The process begins with examining the subject property and surrounding area to understand the ‘appraisal problem’ consistent with USPAP Standard 1.
Once I have examined the subject property, I determine its primary characteristics, starting with location, then GLA size, design, site size, and other amenities. Note that I have not mentioned “Price” in this. I normally segregate the comp search by GLA size, in a range from 20% smaller to 20% larger than the subject’s GLA, narrowing that range if need be. The comp search distance is determined by the subject’s geographic location: closer in urban areas, expanded in suburban and rural areas.
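The GLA screen described above is simple to express in code. Here is a minimal sketch in Python; the ±20% band matches the article, but the sample comps and field names are illustrative, not from any particular MLS export:

```python
# Filter candidate comps to a GLA band around the subject's GLA.
# The sample data below is invented for illustration.
subject_gla = 2612  # subject's gross living area, sq.ft.
band = 0.20         # start at +/-20%; narrow (e.g., to 0.10) if needed

comps = [
    {"address": "Comp A", "gla": 2450},
    {"address": "Comp B", "gla": 3300},  # too large: falls outside the band
    {"address": "Comp C", "gla": 2700},
]

low, high = subject_gla * (1 - band), subject_gla * (1 + band)
in_range = [c for c in comps if low <= c["gla"] <= high]
print([c["address"] for c in in_range])  # Comp B is screened out
```

Narrowing the search, as the article does later, is just a matter of lowering `band` to `0.10` and re-running the filter.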
Subject property has 3 bedrooms, 3.5 baths, built in 2005, 2,612 sq.ft. in gross living area (GLA), and lot size is 0.140 acre. It was originally listed for $625,000 and list price was lowered to $599,997.
This property is in a dense urban development. But what is somewhat different is the homes are ‘semi-custom.’ This means the builder started with a variable stock house plan on a particular lot, to randomize the look of the neighborhood, and would not begin building until a prospective purchaser entered the process. At that point, the purchaser could select finishes available in the builder catalogue to ‘customize’ the home.
The net result of this builder program is the neighborhood has similar ‘looking’ homes from the exterior, but the insides could be very different in quality finishes, and thus the price points of the purchased homes could be in a wide range. However, in a general way, the home sale price was predicated on GLA size, so that’s an appropriate metric for comp searching.
Now take a look at the graphs below. When I first began this research project, I started with a GLA range 20% smaller to larger than the subject. After graphing those sales, it became obvious that the range was too wide.
So I re-did the search with a 10% range, and the graphs reflect those sizes. This is where graphing is vital: it helps you see the actual sales and how they are positioned in terms of sale price versus date of sale.
Once that step is completed, I then add the linear, and later, the polynomial trend lines. In Excel, and other spreadsheet software, this is a user-defined application, so you will have to learn how it’s done in the software you are using.
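For readers who want to see the mechanics behind what Excel's "Add Trendline" dialog does, the linear and polynomial trend lines can be reproduced with NumPy's least-squares polynomial fit. This is a sketch with invented sale dates and prices, not the article's actual data:

```python
import numpy as np

# Illustrative data: days elapsed since the start of the study period,
# and the corresponding sale prices (both invented for this sketch).
days = np.array([5, 30, 62, 90, 120, 150, 175])
prices = np.array([560000, 565000, 580000, 575000, 590000, 598000, 605000])

# Linear trend line: the slope is the daily rate of change in dollars,
# the same figure Excel displays in the trend-line equation.
slope, intercept = np.polyfit(days, prices, 1)
print(f"daily rate of change: ${slope:.2f}/day")

# Polynomial trend line (order 3, a common choice in the Excel dialog).
poly3 = np.polyfit(days, prices, 3)
fitted = np.polyval(poly3, days)  # points the dashed curve passes near
```

Excel performs the same least-squares fit internally; the spreadsheet route simply keeps everything in the same file as the sales chart.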
Now we are getting to the meat of this article. Interpreting the data and making decisions, so that those decisions can be translated into the determination of appropriate market value. It’s the ‘analyst’ becoming prominent that George Dell stresses.
Take a close look at the yellow graph. The diamonds show the sales plotted against their dates of sale. The green linear line shows the OVERALL trend of those sales; its formula, shown in green, has a slope of 58.177. This trend line is an automated feature built into Excel and other spreadsheets, and the slope is the DAILY rate of change, in dollars, of the sale prices. It can be used to develop a ‘time’ adjustment to the comps in the report.
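Applying that slope as a time adjustment is simple arithmetic: multiply the daily rate by the number of days between a comp's sale date and the effective date of the appraisal. A sketch, using the 58.177 slope from the yellow graph but invented dates:

```python
from datetime import date

daily_rate = 58.177                  # $/day slope from the linear trend line
effective_date = date(2022, 11, 18)  # hypothetical effective date

def time_adjustment(sale_date: date) -> float:
    """Dollar adjustment bringing a comp's price forward to the effective date."""
    days_elapsed = (effective_date - sale_date).days
    return daily_rate * days_elapsed

# A comp that sold 100 days before the effective date, in a rising market,
# receives a positive (upward) adjustment.
adj = time_adjustment(date(2022, 8, 10))
print(f"time adjustment: ${adj:,.2f}")  # $5,817.70
```

Whether to apply the adjustment at all remains the appraiser's call, as the article notes ("if warranted").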
Normally I’d probably stop right there and just proceed to finish the report. But when this graph is examined closely, you will see that there is a significant gap between the lower and higher priced sales of about $15,000. The linear green line is the demarcation! That tells me that in this neighborhood, the GLA and customization is impacting the sale prices.
Also note that the average and median sale prices are quite a bit below the offered ORIGINAL List Price. That’s a significant data point to consider. It’s a chin scratching moment ‘analysts’ must evaluate and then make a decision to back up their value conclusion.
Once this price differential was revealed ‘visually’ in the graph, I produced a second graph – the blue one – which is limited to those sales above the green linear line in the yellow graph.
At this point, the brains of pure academic data scientists, and some appraisers, are exploding. “How dare you use so few sales. You need at least “X” numbers to be accurate.” To that I say: “Real estate does not always follow predictable courses; real estate sales are impacted highly by buyer emotions. So the data you have available is what it is, where it is, when it is.” You can, and probably should, make a decision with it! Especially if your comp selection is consistent with the subject’s characteristics. Again, this is a key aspect of being an analyst, which is vital for every appraiser.
Back to the blue graph. It reveals that the overall value trend is STABLE (the dark black line), with a rate of change of only $0.135 (13.5 cents) PER DAY! I would not apply a time adjustment to the comps based on that. Secondly, the dashed (red) polynomial line, which more closely tracks the INDIVIDUAL sales, is also stable when the end points of that trend line are examined.
When the average and median sales prices are examined, those fall closer to the sales agent’s list prices, and thus to me, those sales between $590,000 and $625,000 would be considered more appropriate as comps in the report.
Now to a point about POLYNOMIAL lines: you can select the “order number” to apply in the graph. Some people advocate using only “order number 3” in conjunction with the linear line. I don’t. Simply because as the “order number” is raised, the line tracks the individual sales more closely. When you learn to use the Poly trend line and vary the “order number”, you will notice how the ‘magnetic forces’ of the line are influenced by the positions of the sales on the graph. That’s why the blue graph looks different from the yellow graph, even though the “order number” in each is the same.
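The effect of raising the “order number” can be demonstrated numerically: the residual error of a least-squares polynomial fit can only shrink (or stay the same) as the order increases, which is why a higher-order line hugs the individual sales more tightly. A sketch with invented data:

```python
import numpy as np

# Invented sale data: days elapsed and sale prices.
days = np.array([5, 30, 62, 90, 120, 150, 175])
prices = np.array([590000, 612000, 598000, 620000, 605000, 618000, 625000])

# Sum of squared residuals for each trend-line order: the higher the
# order, the tighter the fit to the individual sales (eventually the
# curve is just chasing noise, which is the overfitting risk).
sses = {}
for order in (1, 2, 3, 4):
    coeffs = np.polyfit(days, prices, order)
    resid = prices - np.polyval(coeffs, days)
    sses[order] = float(np.sum(resid ** 2))
    print(order, round(sses[order]))
```

That tighter fit is the ‘magnetic force’ effect: each sale pulls the higher-order curve toward itself more strongly, so the same order number produces differently shaped curves on different data sets.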
Does this process I use take more time to do? Yes. But is the exhibit I produce much better than ‘nothing’ or other ‘automated’ graphs with limited info? I would say Yes. I now have credible evidence I can use to justify and back up my valuation decision.
Regardless of what you choose to do, adding one or more graph exhibit pages to your report will enable you to be recognized as a more diligent appraiser. Clients appreciate that aspect of professionalism.