Over the past few months I’ve given a number of talks on open data and open innovation to groups of realtors around the country. In these talks I have cautioned that the more the real estate industry tries to protect (i.e. not share) its data, the more it risks making access to the data (control) the axis of competition, rather than accessibility of the data (allowing others to create value-added services).
Consequently, if working with the real estate industry’s data to innovate new services is not possible or is prohibitively expensive, recreating already existing data sets will become the goal of competitors. Competition on this axis has, I believe, two possible outcomes. One is a winner-take-all world where whoever has the biggest data set wins; put another way, either the current monopoly maintains its grip or a new monopoly takes over. The other is that there ceases to be, in any meaningful way, a single data repository on real estate, and marketplace data gets broken into lots of different silos. The implications of this outcome are less clear, but there is a risk it could be bad for consumers as, essentially, market information would be fragmented. Either way, both outcomes carry significant risks for organized real estate.
Despite this, many realtors don’t believe this is likely to happen because, no matter how much I try to tell them otherwise, they don’t believe their data can be duplicated.
But…. whoops! Look what happened!
Today, as if to hammer home what I believe is inevitable, the Globe and Mail has an article on a new service that lets one get an estimate of the assessed value of one’s home by accessing an alternative data set, one essentially created by the banks. It is a data set entirely outside those owned by organized real estate (i.e. the MLS). Look! Someone has recreated what was previously seen as a data set that could not be replicated!
Of course the counter is: “It isn’t as good.” Two things here. First, it may not have to be: if it offers 80% of the accuracy at 20% of the cost, it will probably be good enough for at least part of the market. Second, once it is established, I’m confident the owners of the website will find ways to make their service better and the data more accurate.
The real estate industry has an opportunity to shape its future or be shaped by it. The market (and the Competition Bureau) isn’t going to give them forever to make up their minds.
“If it offers 80% of the accuracy and 20% of the cost then it will probably be good enough for at least part of the market…”
It overestimates the price you are likely to get by $40,000 or $50,000. More, sometimes. So it would be good enough for that part of the market that is happy to list for way more than they can get and then not sell. How many people is that?
Given this is data the banks use I have my doubts that the estimates are that far off. And, as I say in the article, they will be looking for ways to make them more and more accurate if the service finds a market.
But if you are right, and the service consistently overestimates the value by $40k-$50k… that is an easy problem to solve. Either the website or the user could simply subtract that amount, and then the data is accurate.
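If the overestimate really is systematic, the correction is a one-line adjustment. A minimal sketch, assuming a fixed average bias of $45,000 (the midpoint of the range cited above); the function name and figure are purely illustrative:

```python
# Hypothetical bias correction: subtract a known, consistent overestimate.
SYSTEMATIC_BIAS = 45_000  # assumed average overestimate, in dollars

def corrected_estimate(raw_estimate: int, bias: int = SYSTEMATIC_BIAS) -> int:
    """Return the raw valuation adjusted for the systematic bias."""
    return raw_estimate - bias

# A raw estimate of $500,000 becomes $455,000 after the adjustment.
print(corrected_estimate(500_000))  # 455000
```

In practice the bias would need to be measured against actual sale prices rather than assumed, but the point stands: a consistent error is the easiest kind to fix.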
I agree with Bigcitylib. An overpriced property will sit on the market for a long time, and when the price is finally lowered, the sellers may get less than they would have if it had been priced properly to begin with. It becomes a stale listing.
People can now list their property on MLS for $99 and handle their own showings and negotiations. Currently, a realtor must still input the listing data and be responsible for verifying its accuracy. Accuracy is better for the consumer.
I also just noticed that Ottawagirl works for Coldwell Banker (a real estate company). Totally valid opinion, but also from the part of the industry that feels threatened by these moves. I’m not sure of BigCitylib’s affiliation.
I always find the reaction of realtors interesting. If the data and the service are not of value, then they will die because people won’t use them. So there is little need to worry. What is always more problematic is the defense: “We know what is best for the consumer…”
It’s a very valid perspective, but it is worth noting that realtors also have a vested interest in this debate: about 5% of the value of 90% of the homes sold in Canada. So they aren’t just speaking on behalf of the consumer.
Actually, I’m not threatened by any new website or discount brokerages. I do want to maintain the accuracy of the data on MLS. There will always be people who want to sell on their own, and that is their right as a property owner.
If this site isn’t accurate enough, then it will fail. Maybe that would be unfortunate.
Thanks for the post, David. I’m a Realtor, and I’m sure the future of this industry will be different. I don’t know how, and I’m not wetting my pants thinking about it, but it is a point of uncertainty. What I haven’t heard addressed by anyone is the privacy aspect. Leaving aside who owns the data sets, I wonder about the implications of an entirely open system with sales data available for all to see. I don’t know if that is where this is going, but supposing it is, who would want everyone to know how much they paid for their house? Again, I don’t yet know if that is what is being proposed, but if so, I don’t really like the idea as a homeowner. If the end goal is to have more than one body given access to a still-protected database, that is quite different. Am I way off?
Yes, you’re way off. I may be wrong here, but I believe most (if not all) jurisdictions make this information available to anyone willing to conduct a search of their land titles registrations. Voila – independent, objective, readily available information.