Yes, this was not a voluntary move. A specific group pressured the government to remove data it had made public and to make it harder for the public to repurpose and use that data. So what happened? And what lessons should governments, NGOs and citizens take away from this incident?
The story revolves around the Ontario Ministry of Education, which earlier this year created a website that mashed up performance data (e.g. literacy and math scores) with demographic information (e.g. percentage of pupils from low-income households and percentage of gifted students). The real problem – according to a group representing teachers, parents and stakeholders – occurred when the Ministry enabled a feature that allowed the website’s users to compare schools to one another.
The group, called People for Education, protested that the government was encouraging a “shopping-mentality” in the public school system.
Of course, many parents already shop for schools. I remember, as a kid, hearing about how houses on one side of a street, within the catchment area of my high school, were more expensive than houses on the other side of the same street, within the catchment area of another school. Today, however, this type of shopping is reserved for the wealthy and connected (i.e. the privileged). Preventing people from comparing schools online won’t eliminate or even discourage this activity; it will simply privilege those who are already able to do it, further reinforcing inequity.
The real problem, however, is that the skills and analysis involved in school shopping are the same as those required in assessing, and being engaged in, the performance of one’s local school. Parents, and taxpayers in general, have a right to know how their children’s school is performing – especially in comparison to similar schools. If parents don’t have information to analyze and compare, how can they know what systemic issues they should ask their children’s teachers about? More importantly, how can they know what issues to press their local school board about?
Ironically, People for Education states on its “About Us” page that it works towards a vision of a strong public education system by a) doing research; b) providing clear, accessible information to the public and c) engaging people to become actively involved in education issues in their own community.
And yet, asking the government to remove the comparison feature runs counter to all three of its activities. Limiting how the Ministry’s data can be used (and as we’ll see later, suggesting that this data shouldn’t be shared):
- prevents parents, and other analysts such as professors or politicians, from doing their own research
- runs counter to the goal of providing clear and accessible information to the public. Indeed, it makes information harder to access.
- makes it harder for parents to know how they should get involved and what issues they should champion to improve their local school
What is interesting about this story is that it reveals the core values and underlying motivations of different actors. In this case People for Education – which I believe to be a well-intentioned and positive contributor on the issue of education – is nonetheless revealed to have a conservative side to it.
It fears a world where citizens and parents are equipped with information and knowledge about schools. On the one hand it may fear the types of behaviours this could foster (such as school shopping). However, it may also fear a weakening of its monopoly as “expert” and advocate on educational issues. If parents can look at the data directly, and form their own analysis and conclusions, they may find that they don’t agree with People for Education. Open data would allow those it represents to self-organize, challenging the hierarchy and authority of the organization.
For whatever reason, we see an NGO bending over backwards to advocate for an outcome that runs directly counter to the very vision and activities it was founded to serve. More ironically, this results in some paradoxical messaging: an organization that champions Ontario’s school system is essentially arguing that it doesn’t trust the products of that system – the citizens of Ontario – to use the information and tools provided by the Ministry responsible for their education. It is an unsustainable position – particularly for a group that was originally founded as a bottom-up, grass-roots organization.
So what lessons are there here?
A key mistake made by the Ontario Ministry of Education is that it didn’t open up the data enough. While the website allowed users to look at school performance data, they could only do this on the Ministry’s website, using the Ministry’s tools and interface. Had the data been available via an API or in a downloadable format, someone else could have taken the data and created the system for comparing schools. People for Education were mostly upset that the Ministry’s website encouraged a “shopping-mentality.” Had the Ministry simply shared the data, People for Education could have built their own interface using criteria and tools they thought relevant. The Fraser Institute or a multitude of other organizations could have built their own as well, and people could have used the tools and websites they found most useful and relevant. Let People for Education go head to head with the Fraser Institute and whoever else. This is not a battle the government need fight.
Lesson: Always provide the data – a goal that is hard to argue against – but sometimes, leave it to others to conduct the analysis. A marketplace of ideas will emerge, and citizens can choose what works best for them.
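The kind of third-party reuse described above requires very little once raw data is downloadable. A minimal sketch, assuming a hypothetical CSV export – the column names and figures below are invented for illustration, not the Ministry's actual data format – might look like:

```python
import csv
import io

# Hypothetical CSV export -- schools, columns, and figures are all
# invented for illustration; a real Ministry dataset would differ.
SAMPLE = """school,literacy_rate,math_score,low_income_pct
Maplewood PS,0.78,0.71,0.22
Riverside PS,0.84,0.69,0.31
"""

def load_schools(text):
    """Parse the CSV into a list of dicts with numeric fields."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        rows.append({
            "school": row["school"],
            "literacy_rate": float(row["literacy_rate"]),
            "math_score": float(row["math_score"]),
            "low_income_pct": float(row["low_income_pct"]),
        })
    return rows

def compare(rows, metric):
    """Rank schools on one metric, highest first. Any group could
    publish its own version of this using criteria it thinks relevant."""
    return sorted(rows, key=lambda r: r[metric], reverse=True)

schools = load_schools(SAMPLE)
for s in compare(schools, "literacy_rate"):
    print(s["school"], s["literacy_rate"])
```

The point is not this particular ranking but that, once the data is open, each organization can swap in its own `compare` criteria and let citizens pick the analysis they trust.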
For NGOs in general
First, understand what open data means for your cause. One of the news articles had this highly disturbing quote from the Executive Director of People for Education:
Among her complaints about the type of information available, Ms. Kidder took issue with the ministry’s contention the Web site merely consolidated information already available to the public. “You can’t walk into your child’s school and say ‘What’s the average income of parents at this school?’ ” she said. “It’s not true at all [that this is public information].”
This is a shocking statement. In actuality, all the data assembled by the Ministry is publicly available. It was just that, until now, it had remained scattered and isolated. Just because it was hard to find (and thus reserved for an elite few) or located on school property (and thus accessible only to those who could visit) does not mean it didn’t exist or wasn’t public.
Lesson: Transparency is the new objectivity. People increasingly don’t trust anyone – governments, the media, or even NGOs. They want to see the analysis themselves, not take your word for it. Be prepared for this world.
Second, be careful about taking positions that will deny your supporters – and those you represent – tools with which to educate themselves. Organizations that are perceived as trying to constrain the flow of information so as to retain influence and control risk imploding. I won’t repeat this lesson in detail but Clay Shirky’s case study about the Vatican, written up in Here Comes Everybody, is a powerful example.
For educators in particular
In the past, educators have been deeply concerned about ranking systems. This is understandable. Ranking systems are often a blunt tool. Comparing apples to oranges can be foolish – but then, sometimes it is helpful. The challenge is knowing when it is helpful and ensuring it is used accordingly.
The fact is, ranking is an outcome of data. The two simply cannot be separated. The moment there is data, there is ranking. A ranking by school size, number of teachers, or amount of gym equipment is no different from a ranking of class sizes, literacy rates, disciplinary trends, or graduation rates. What matters is not the rank, but the conclusions, meaning and significance we apply to these rankings. Here, the role for groups like People for Education could be profound.
This is because we can’t be in favour of transparency and accessible information on the one hand and against ranking on the other. The two come hand in hand. What we can be opposed to are poor ranking systems.
Every profession gets assessed, and teaching should be no different. The challenge is that there is much more to teaching than what gets reflected in the data collected. This means that groups like People for Education shouldn’t be against transparency and open data – they should be trying to add complexity and nuance to the discussion. Once the data is publicly available, anyone can create a ranking system of their own choosing – but this gives us an opportunity to have a public discussion about it. One way to do this is to create one’s own tools for measuring schools. You don’t like the Ministry of Education’s system? Create your own. Use it to talk to parents about the right questions to ask and to promote qualitative ways to evaluate their children’s schools’ performance. It’s an open world. But that doesn’t mean it needs to be feared – it is rife with opportunity.