Category Archives: public policy

Lost Open Data Opportunities

Sometimes even my home town of Vancouver gets it wrong.

Reading Chad Skelton’s blog (which I read regularly and recommend to my fellow Vancouverites) I was reminded of the great work he did creating an interactive visualization of the city’s parking tickets as part of a series around parking in Vancouver. Indeed, it is worth noting that the entire series was powered by data supplied by the city. Sadly, it just wasn’t (and still isn’t) open data. Quite the opposite: it was data that was wrestled out of the city, with enormous difficulty, via an FOI (ATIP) request.


In the same blog post Chad recounts how he struggled to get the parking data from the city:

Indeed, the last major FOI request I made to the city was for its parking-ticket data. I had to fight the city tooth and nail to get them to cough up the information in the format I wanted it in (for months their FOI coordinator claimed, falsely, that she couldn’t provide the records in spreadsheet format). Then, when the parking ticket series finally ran, I got an email from the head of parking enforcement. He was wondering how he could get reprints of the series — he thought it was so good he wanted to hand it out to new parking enforcement officers during their training.

What is really frustrating about this paragraph is the last sentence. Obviously the people who find the most value in this analysis and tool are the city staff who manage parking infractions. So here is someone who, for free(!), provides an analysis and some stories that they now use to train new officers and he had to fight to get the data. The city would have been poorer without Chad’s story and analysis. And yet it fought him. Worse, an important player in the civic space (and an open data ally) feels frustrated by the city.

There are, of course, other uses I could imagine for this data. I could imagine the data embedded into an application (ideally one like Washington DC’s Park IT DC – which lets you find parking meters on a map, see whether they are available, and check local car crime rates for the area) so that you can assess the risk of getting a ticket if you choose not to pay. This feels like the worst-case scenario for the city and, frankly, it doesn’t feel that bad and would probably not affect people’s behaviour that much. But there may be other important uses of this data – it may correlate in interesting and unpredictable ways with other events – connections that, if made and shared, might actually allow the city to deploy its enforcement officers more efficiently and effectively.
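To see how little it would take to turn a per-ticket release into that kind of risk estimate, here is a minimal sketch. The block names, data shape and counts below are entirely hypothetical – but any release with a location field per ticket would support something like this:

```python
from collections import Counter

def ticket_risk_by_block(tickets):
    """Rank city blocks from most- to least-ticketed.

    `tickets` is a list of (block, hour_issued) tuples -- roughly the
    shape a per-ticket open data release might take. Ticket counts are
    a crude proxy for the risk of being ticketed on that block.
    """
    counts = Counter(block for block, _hour in tickets)
    return counts.most_common()

# Entirely hypothetical sample data.
sample = [
    ("800 Robson", 9),
    ("800 Robson", 14),
    ("800 Robson", 17),
    ("100 Main", 11),
]
print(ticket_risk_by_block(sample))  # [('800 Robson', 3), ('100 Main', 1)]
```

The interesting analyses – correlating tickets against time of day, events, or other datasets – start from exactly this kind of trivial aggregation.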

Of course, we won’t know what those could be, since the data isn’t shared, but it is the kind of thing Vancouver should be doing, given the existence of its open data portal. All governments should take note: there is a cost to not sharing data. Lost opportunities, lost insights and value, lost allies and networks of people interested in contributing to your success. It’s all our loss.

How to Unsuck Canada’s Internet – creating the right incentives

This week at the Mesh conference in Toronto (where I’ll be talking about Open Data) the always thoughtful Jesse Brown, of TVO’s Search Engine, will be running a session titled How to Unsuck Canada’s Internet.

As part of the lead up to the session he asked me if I could write him a sentence or two about my thoughts on how to unsuck our internet. In his words:

The idea is to take a practical approach to fixing Canada's lousy Internet (policies/infrastructure/open data/culture – interpret the suck as you will).

So my first thought is that we should prevent anyone who owns any telecommunications infrastructure from owning content. Period. Delivery mechanisms should compete with delivery mechanisms and content should compete with content. But don’t let them mix, cause it screws up all the incentives.

A second thought would be to allocate the freed-up broadcast spectrum to new internet providers (which is really what all the cell phone providers are about to become anyway). Indeed, I suspect we may be 5 years away from this problem becoming moot in the main urban areas. Once our internet access is freed from cables and the last mile, all bets are off. That won’t help rural areas, but it may end up transforming urban access and costs. Just as cities once clustered around seaports and key nodes along trade networks, cities (and workers) will cluster around better telecommunications access.

But the longer thought comes from some reflections on the timely recent release of OpenMedia.ca/CIPPIC’s second submission to the CRTC’s proceedings on usage-based billing (UBB), which I think is actually fairly well aligned with the piece I wrote back in February titled Why the CRTC was right about User Based Billing (please read the piece and the comments below it before freaking out).

Here, I think our goal shouldn’t be punitive (that will only encourage the telcos to do “just enough” to comply). What we need to do is get the incentives right (which is, again, why they shouldn’t be allowed to own content, but I digress).

An important part of getting the incentives right is understanding what the actual constraints on internet access are. One of the main problems is that people often get confused about what is scarce and what is abundant when talking about the internet. What everyone realizes is that content is abundant. There are probably over a trillion web pages out there, billions of videos and god knows what else. There is no scarcity there.

This is why any description of access that uses an image like the one below will, in my mind, fail.

Charging per byte shouldn’t be permitted if the pipe has infinite capacity (or at least it wouldn’t make sense in a truly competitive market). What should happen is that companies would be able to charge the cost of the infrastructure plus a reasonable rate of return.

But while the pipe may have effectively infinite capacity over time, at any given moment it does not. The issue isn’t how many bytes you consume; it’s the capacity to deliver those bytes at a given moment when there are lots of competing users. This is why it isn’t where the data is coming from or going to that matters, but rather how much of it is in the pipe at a given moment. What matters is not the cable, but its cross-section.

A cable that is empty or only at 40% capacity should deliver rip-roaring internet to anyone who wants it. My understanding is that the problem comes when the cable is at 100% or more of capacity. Then users start crowding each other out and performance (for everyone) suffers.
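To make the cross-section argument concrete, here is a toy sketch of what peak-period pricing (as opposed to per-byte billing) might look like. The rates and the 6pm–11pm peak window are my own illustrative assumptions, not anything proposed by the CRTC or the telcos:

```python
def session_charge(gb_transferred, hour, peak_hours=range(18, 23),
                   peak_rate=0.50, off_peak_rate=0.0):
    """Toy peak-period pricing: a gigabyte only costs money when it is
    competing for the pipe's cross-section, i.e. during peak hours.
    Off-peak traffic rides an otherwise-underused cable for free.
    The rates and the 6pm-11pm window are illustrative assumptions."""
    rate = peak_rate if hour in peak_hours else off_peak_rate
    return gb_transferred * rate

# A 100 GB BitTorrent run at 3am costs nothing...
print(session_charge(100, 3))   # 0.0
# ...while 2 GB of prime-time streaming at 8pm is billed.
print(session_charge(2, 20))    # 1.0
```

The point of a scheme like this is the incentive, not the revenue: it nudges time-insensitive traffic into the hours when the cable is empty anyway.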


Indeed, this is where the OpenMedia/CIPPIC document left me confused. On the one hand they correctly argue that the internet’s content is not a limited resource (such as natural gas). But they seem to be arguing that network capacity is also not a finite resource (sections 21 and 22) while at the same time accepting that there may be constraints on capacity during peak hours (sections 27 and 30, where they seem to accept that off-peak users should not be subsidizing peak-time users, and again in the conclusion, where they state “As noted in far greater detail above, ISP provisioning costs are driven primarily by peak period usage.” If you have peak period usage then, by definition, you have scarcity). The last two points seem to be in conflict: network capacity cannot be both infinite and constrained during peak hours. Can it?

Now, it may be that there is more network capacity in Canada than there is demand – even at peak times – at which point any modicum of sympathy I might have felt for the telcos disappears immediately. However, if there is a peak consumption period that does stress the network’s capacity, I’d be relatively comfortable adopting a pricing mechanism that allocates the “scarce” broadband pie. Maybe there are users – especially many BitTorrenters – whose activities are not time sensitive. Having a system in place that encourages them to bittorrent during off-peak hours would create a network that was better utilized.

So the OpenMedia piece seems to be open to the idea of peak usage pricing (which was what I was getting at in my UBB piece) so I think we are actually aligned (which is good since I like the people at OpenMedia.ca).

The question is, does this create the right incentives for the telcos to invest more in capacity? My hope would be yes: that competition would cause users to migrate to networks that provided high speeds and competitive off-peak and/or peak usage-time fees. But I’m open to the possibility that it wouldn’t. It’s a complicated problem and I don’t pretend to think that I’ve solved it in one blog post. I’m just trying to work it through in my head.


Applications and Hardware Already Running On Open Data

Yesterday, Gerry T shared a photo he snapped at the University of Alberta in Edmonton of a “departure board” in the university’s Student Union building that uses open transportation data from the city’s website.

Essentially, the display board is a simple application shown on a large flat-screen TV turned vertically.

It’s exactly the kind of thing that I imagine university students in many cities around the world wish they had – especially if you are on a campus that is cold and/or wet. Wouldn’t it be nice to wait inside that warm student union building rather than at the bus stop?

Of course in Boston they’ve gone further than just providing the schedule online. They provide real-time data on bus locations which some students and engineers have used to create $350 LED signs in coffee houses to let users know when the next bus is coming.
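The display logic behind such a sign is genuinely simple, which is part of why open data makes these projects feasible at all. Assuming you have already parsed predicted arrival times out of a real-time feed (the feed and its format are assumptions here), the countdown is only a few lines:

```python
from datetime import datetime, timedelta

def minutes_to_next_bus(predicted_arrivals, now):
    """Return whole minutes until the next predicted arrival, or None
    if nothing is scheduled. `predicted_arrivals` is a list of datetimes,
    e.g. parsed from whatever real-time feed the transit agency offers
    (the feed and its format are assumptions in this sketch)."""
    upcoming = [t for t in predicted_arrivals if t >= now]
    if not upcoming:
        return None
    return int((min(upcoming) - now).total_seconds() // 60)

now = datetime(2011, 5, 1, 8, 0)
predictions = [now + timedelta(minutes=4), now + timedelta(minutes=19)]
print(minutes_to_next_bus(predictions, now))  # 4
```

Everything hard – the GPS tracking, the prediction model – is the agency’s side of the bargain; releasing the data lets students and coffee shops handle the last, easy step.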

It’s the kind of simple innovation you wish you’d see in more places: governments letting people help themselves at making their lives a little easier. Yes, this isn’t changing the world, but it’s a start, and an example of what more could happen.

Mostly it’s nice to see innovators in Canada playing with the technology. Hopefully governments will catch up and let the even bigger ideas students around the country have become more than just visions in their heads.

Not sure who at the University created this, but nice work.

New York releases road map to becoming a digital city

Yesterday, New York City released its “Road Map for the Digital City: Achieving New York City’s Digital Future.” For those who missed the announcement, especially those concerned about the digital economy, the future of government and citizen services, the document is definitely worth downloading and scanning.

At the heart of the document sits a road map, which I’ve ripped from the executive summary and pasted below. What makes me particularly interested in it is how the Open Government section is not driven uniquely by the desire for transparency but by the goal of spurring innovation and increasing access to services. Of course, the devil is in the details, but I’m increasingly convinced that open initiatives will be more successful when the government of the day has specific policy objectives (beyond just transparency) it wishes to drive home, with open data as part of the mix (more on this in a post coming soon).

As such, “government as platform” works best when the government also builds atop the platform. It must itself be a consumer and stakeholder. This is why section 3 is so important and interesting. Essentially, sections 2 and 3 have parts that are strikingly similar; it’s just that section 2 outlines the platform and lays out the government’s hope that others will build on top of it, whereas parts of section 3 outline what the government itself intends to build atop it. Section 3 also goes further and talks about gathering information and data from the public, which is the big thing in the Gov 2.0 space that many governments have not gotten around to doing effectively – so this will be worth watching more closely. All of this is great news and exactly what governments should be thinking about.

It is great when a big city comes out with a document like this because, while New York is not the first to be thinking these ideas, its profile means that others will start devoting resources to pursuing them more aggressively.

Exciting times.

1. Access

The City of New York ensures that all New Yorkers can access the Internet and take advantage of public training sessions to use it effectively. It will support more vendor choices for New Yorkers, and introduce Wi-Fi in more public areas.

  1. Connect high needs individuals through federally funded nyc Connected initiatives
  2. Launch outreach and education efforts to increase broadband Internet adoption
  3. Support more broadband choices citywide
  4. Introduce Wi-Fi in more public spaces, including parks

2. Open Government

By unlocking important public information and supporting policies of Open Government, New York City will further expand access to services, enable innovation that improves the lives of New Yorkers, and increase transparency and efficiency.

  1. Develop nyc Platform, an Open Government framework featuring APIs for City data
  2. Launch a central hub for engaging and cultivating feedback from the developer community
  3. Introduce visualization tools that make data more accessible to the public
  4. Launch App Wishlists to support a needs-based ecosystem of innovation
  5. Launch an official New York City Apps hub

3. Engagement

The City will improve digital tools including nyc.gov and 311 online to streamline service and enable citizen-centric, collaborative government. It will expand social media engagement, implement new internal coordination measures, and continue to solicit community input in the following ways:

  1. Relaunch nyc.gov to make the City’s website more usable, accessible, and intuitive
  2. Expand 311 Online through smartphone apps, Twitter and live chat
  3. Implement a custom bit.ly url redirection service on nyc.gov to encourage sharing and transparency
  4. Launch official Facebook presence to engage New Yorkers and customize experience
  5. Launch @nycgov, a central Twitter account and one-stop shop of crucial news and services
  6. Launch a New York City Tumblr vertical, featuring content and commentary on City stories
  7. Launch a Foursquare badge that encourages use of New York City’s free public places
  8. Integrate crowdsourcing tools for emergency situations
  9. Introduce digital Citizen Toolkits for engaging with New York City government online
  10. Introduce smart, a team of the City’s social media leaders
  11. Host New York City’s first hackathon: Reinventing nyc.gov
  12. Launch ongoing listening sessions across the five boroughs to encourage input

4. Industry

New York City government, led by the New York City Economic Development Corporation, will continue to support a vibrant digital media sector through a wide array of programs, including workforce development, the establishment of a new engineering institution, and a more streamlined path to do business.

  1. Expand workforce development programs to support growth and diversity in the digital sector
  2. Support technology startup infrastructure needs
  3. Continue to recruit more engineering talent and teams to New York City
  4. Promote and celebrate nyc’s digital sector through events and awards
  5. Pursue a new .nyc top-level domain, led by DOITT


Just a Click Away Keynote Slides

A little over two months ago I gave a keynote at the Just a Click Away Conference in Vancouver. The conference was a gathering for legal information and education experts – for example, the excellent people who provide legal aid. My central challenge to them was to think about how they could further collapse the transaction costs around getting legal assistance and/or completing common legal transactions.

I had a great time at the event and it was a real pleasure to meet Allan Seckel – the former head of British Columbia’s public service. I was deeply impressed by his comments and commitment to both effective and open government. As one of the key forces behind the Citizens at the Centre report he’s pushed a number of ideas forward that I think other governments should be paying attention to.

So, back to the presentation… I’ve been promising to get my slides from the event up and so here they are:

Why Does Elections Canada Hate Young People?

This weekend the New York Times had an interesting article about how the BBC and other major media organizations are increasingly broadcasting new television episodes simultaneously around the world. The reason? The internet. Fans in the UK aren’t willing to wait months to watch episodes broadcast in the United States and vice versa. Here a multi-billion dollar industry, backed by copyright legislation, law enforcement agencies, and the world’s most powerful governments and trade organizations is recognizing a simple fact: people want information, and it is increasingly impossible to stop them from sharing and getting it.

Someone at Elections Canada should read the article.

Last week Elections Canada took special care to warn Canadian citizens that they risked $25,000 fines if they posted election results on social network sites before all the polls had closed. Sadly, Elections Canada’s approach to the rise of new internet-driven technologies speaks volumes about its poor strategy for engaging young voters.

The controversy centers on Section 329 of the Canada Elections Act, which prohibits transmitting election results before polling stations have closed. The purpose of the law is to prevent voters on the west coast from being influenced by outcomes on the east coast (or worse, choosing not to vote at all if the election has essentially been decided). Today, however, with Twitter, Facebook and blogs, everybody is a potential “broadcaster.”

Westerners may have a hard time sympathizing with Elections Canada’s quandary. It could simply do the equivalent of what the BBC is doing with its new TV shows: not post any results until after all the voting booths have closed. This is a much simpler approach than trying to police and limit the free speech of 10 million Canadian social media users (to say nothing of the hundreds of millions of users outside Canada who do not fall under its jurisdiction).

More awkwardly, it is hard to feel the missive wasn’t directed at the very cohort Elections Canada is trying to get engaged in elections: young people. Sadly, chastising and scaring the few young people who want to talk about the election with threats of fines seems like a pretty poor way to increase this engagement. If voting and politics are social behaviours – and the evidence suggests they are – then you are more likely to vote and engage in politics if you know that your friends vote and engage in politics. Ironically, this might make social media the best thing to happen to voting since the secret ballot. So not only is fighting this technology a lost cause, it may also be counterproductive from a voter turnout perspective.

Of course, given the experiences many young voters I talk to have had trying to vote, none of this comes as a surprise.

In my first two Canadian elections I lived out of the country. Both times my mail-in ballot arrived after the election and was thus ineligible. During the last election I tried to vote at an advance poll. It was a nightmare. The station was hard to locate on the website and ended up being a solid 15-minute walk from any of the three nearest bus routes. Total commute time? For someone without a car? Well over an hour and a half.

These are not acceptable outcomes. Perhaps you think I’m lazy? Maybe. I prefer to believe that if you want people to vote – especially in the age of a service economy – you can’t make it inconvenient. Otherwise the only people who will vote will be those with means and time. That’s hardly democratic.

Besides, it often feels our voting infrastructure was essentially built by and for our grandparents. Try this out. In the 1960s, if you were a “young person” (e.g. 20-30) you were almost certainly married and had two kids. You probably also didn’t move every two years. In the ’60s the average marriage age was 24 for men and 20 for women. Thinking in terms of the 1950s and ’60s: what were the three institutions you probably visited on a daily basis? How about A) the local community centre, B) the local elementary school, and C) the local church.

Now, if you are between the ages of 20 and 35, name me three institutions you probably haven’t visited in over a decade.

Do young people not vote because they are lazy? Maybe. But they also didn’t have a voting system designed around them like their grandparents did. Why aren’t there voting booths in subway stations? The lobbies of office towers? The local shopping mall? How about Starbucks and Tim Hortons (for both conservatives and liberals)? Somewhere, anywhere, where people actually congregate. Heaven forbid that voting booths be where the voters are.

The fact is our entire voting structure is anti-young people. It’s designed for another era. It needs a full-scale upgrade. Call it voting 2.0 or something, I don’t care. Want young people to vote? Then build a voting system that meets their needs; stop trying to force them into a system over half a century old.

We need voting that embraces the internet, social networks, voters without cars and voters who are transient. These changes alone won’t solve the low voter turnout problem overnight, but if even 5% more young people vote in this election, the parties will take notice and adapt their platforms accordingly. Maybe, just maybe, it could end up creating a virtuous circle.

Back to Reality: The Politics of Government Transparency & Open Data

A number of my friends and advocates in the open government, transparency and open data communities have argued that online government transparency initiatives will be permanent since, the theory goes, no government will ever want to bear the political cost of rolling it back and being perceived as “more opaque.” I myself have, at times, let this argument go unchallenged or even run with it.

This week’s US budget negotiations between Congress and the White House should lay that theory to rest. Permanently.

The budget agreement that has emerged from the most recent round of negotiations – which is likely to be passed by Congress – slashes funding to an array of Obama transparency initiatives such as USASpending, the ITDashboard, and data.gov from $34M to $8M. Agree or disagree, Republicans are apparently all too happy to kill initiatives that make the spending and activities of the US government more transparent and that create a number of economic opportunities around open data. Why? Because they believe it has no political consequences.

So, unsurprisingly, it turns out that political transparency initiatives – even when they are online – are as bound to the realities of traditional politics as dot-coms were bound by the realities of traditional economics. It’s not enough to get a policy created or an initiative launched – it needs to have a community, a group of interested supporters, to nurture and protect it. Otherwise, it will be at risk.

Back in 2009, in the lead up to the drafting and launching of Vancouver’s Open Data motion I talked about creating an open-government bargain. Specifically, I argued that:

…in an open city, a bargain must exist between a government and its citizens. To make open data a success and to engage the community a city must listen, engage, ask for help, and of course, fulfill its promise to open data as quickly as possible. But this bargain runs both ways. The city must do its part, but so too must the local tech community. They must participate, be patient (cities move slower than tech companies), offer help and, most importantly, make the data come alive for each other, policy makers and citizens through applications and shared analysis.

Some friends countered that open data and transparency should simply exist because they are the right thing to do. I don’t disagree – and I wish we lived in a world where the existence of this ideal was sufficient to guarantee these initiatives. But it isn’t. It’s easy to kill something that no one uses (or, in the case of data.gov, that hasn’t been given enough time to generate a vibrant user base). It’s much, much harder to kill something with a community that uses it, especially if that community and the products it creates are valued by society more generally. This is why open data needs users; it needs developers, think tanks and, above all, the media to take an interest in it and to leverage it to create content. It’s also why I’ve tried to create projects like Emitter.ca, recollect.net, taxicity and others: the more value we create with open data for everyone, the more secure government transparency policies will be.

It’s use it or risk losing it. I wish this weren’t the case, but it’s the best defense I can think of.

City of Vancouver Wins Top Innovator Award from BC Business

To be clear, this is not top innovator among governments, this is top innovator among all organizations – for-profit, non-profit and government – in the province.

You can see the award write up here.

As the article states, Vancouver Open-Data initiative “floored the [judging] panel.” Indeed, one panellist stated: “I have never seen a municipality open to new ideas in my life. When was the last time any level of government said, Here are our books; fill your boots?”

Back in October BC Business asked me to write a think piece explaining open data, and I ended up penning this piece entitled “The Difference Data Makes”. It’s fantastic to see the business community recognizing the potential of open data and how it could transform both the way government works and the opportunities it poses for private and non-profit organizations as well as citizens.

It’s a great day for the City of Vancouver and for open data.

Calgary Launches Business Plan and Budget App

So this is interesting. The City of Calgary has launched a Business Plan & Budget app for free from iTunes.

It’s a smart move as it creates an easy, “one button” option for citizens to participate in and learn about the city’s financial planning process. You can read (a tiny bit) more at the City of Calgary’s blog.

Looking more closely at the app, it doesn’t offer a huge amount, but don’t dismiss it too quickly. Consolidating all the information in a single place and making it available to people on the go is a great starting point. It is also worth remembering that this is just that – a starting point. There is obviously lots to be learned about how to engage citizens online, especially using mobile technology. If this is done right, Calgary will be learning these lessons first, which means its 2nd and 3rd generation versions of the app and the process will be more sophisticated while others are left catching up (think of Apple and the iPad).

So while the app is fairly light on features today… I can imagine a future where it becomes significantly more engaging and comprehensive, using open data on the budget and city services to show maps of where and how money is spent, as well as post reminders for in-person meet-ups, tours of facilities, and dial-in townhall meetings. The best way to get to these more advanced features is to experiment with getting the lighter features right today. The challenge for Calgary on this front is that it seems to have no plans for sharing much data with the public (that I’ve heard of): its open data portal has few offerings and its design is sorely lacking. Ultimately, if you want to consult citizens on planning and the budget it would be nice to go beyond surveys and share more raw data and information with them; it’s a piece of the puzzle I think will be essential. This is something no city seems to be tackling with any gusto and, along with crime data, it is emerging as a serious litmus test of a city’s intention to be transparent.

The possibilities that Calgary’s consultation app presents are exciting – and again, it is early days – so it will be interesting to see whether developers in Calgary and elsewhere can begin figuring out how to easily extend and enhance this type of approach. Moreover, it’s nice to see a city venturing out and experimenting with this technology. I hope other cities will not just watch, but start experiments of their own; it’s the best way to learn.


Access to Information is Fatally Broken… You Just Don’t Know it Yet

I’ve been doing a lot of thinking about access to information, and am working on a longer analysis, but in the short term I wanted to share two graphs – graphs that outline why Access to Information (Freedom of Information in the United States) is unsustainable and will, eventually, need to be radically rethought.

First, this analysis is made possible by the enormous generosity of the Canadian federal Information Commissioner’s Office, which several weeks ago sent me a tremendous amount of useful data regarding access to information requests over the past 15 years at the Treasury Board Secretariat (TBS).

The first figure I created shows both the absolute number of Access to Information (ATIP) requests since 1996 and the running year-on-year percentage increase. The dotted line represents the average percentage increase over this time. As you can see, the number of ATIP requests has almost tripled in this time period. This is very significant growth – the kind you’d want to see in a well-run company. Alas, for those processing ATIP requests, I suspect it represents a significant headache.

That’s because, of course, such growth is likely unmanageable. It might be manageable if, say, the cost of handling each request were dropping rapidly. If such efficiencies were being wrested out of the system of routing and sorting requests then we could simply ignore the chart above. Sadly, as the next chart I created demonstrates, this is not the case.


In fact, the cost of managing these transactions has not merely tripled – it has more than quadrupled. This means that not only is the number of transactions increasing at about 8% a year, the cost of fulfilling each of those transactions is itself rising at a rate above inflation.

Now remember, I’m not even talking about the effectiveness of ATIP. I’m not talking about how quickly requests are turned around (as the Information Commissioner has discussed, that is broadly getting worse), nor am I discussing whether less information is being restricted (it isn’t; things are getting worse). These are important – and difficult to assess – metrics.

I am, instead, merely looking at the economics of ATIP and the situation looks grim. Basically two interrelated problems threaten the current system.

1) As the number of ATIP requests increases, the manpower required to answer them also appears to be increasing. At some point the hours required to fulfill all requests sent to a ministry will equal the total hours of manpower at that ministry’s disposal. Yes, that day may be far off, but the day when it hits some meaningful percentage – say 1%, 3% or 5% of total hours worked at Treasury Board – may not be. That’s a significant drag on efficiency. I recall talking to a foreign service officer who mentioned that during the Afghan prisoner scandal an entire department of foreign service officers – some 60 people in all – was working full time on assessing access to information requests. That’s an enormous amount of time, energy and money.

2) Even more problematic than the number of work hours is the cost. According to the data I received, access to information requests cost the Treasury Board $47,196,030 last year. Yes, that’s 47 with “million” behind it. And remember, this is just one ministry. Multiply that by 25 (let’s pretend that’s the number of ministries; there are actually many more, but I’m trying to be really conservative with my assumptions) and it means last year the government may have spent over $1.175 billion fulfilling ATIP requests. That is a staggering number. And it’s growing.
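The back-of-envelope math above is easy to check (the ~8% growth rate is from the data; the roughly 14 year-on-year steps since 1996 and the 25-ministry multiplier are the post’s own rough assumptions):

```python
# 1) Requests growing ~8% a year over roughly 14 year-on-year steps
#    since 1996 should "almost triple" the volume:
growth_factor = 1.08 ** 14
# 2) TBS alone spent $47,196,030 last year; naively extended across
#    25 ministries (a deliberately conservative count):
tbs_cost = 47_196_030
government_wide = tbs_cost * 25
print(round(growth_factor, 2))  # 2.94
print(government_wide)          # 1179900750, i.e. ~$1.18 billion
```

Both figures line up with the claims in the text: near-tripling of volume, and a government-wide bill north of $1.175 billion.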

Transparency, apparently, is very, very expensive. At some point, it risks becoming too expensive.

Indeed, ATIP reminds me of healthcare. It’s completely unsustainable, and absolutely necessary.

To be clear, I’m not saying we should get rid of ATIP. That, I believe, would be folly. It is and remains a powerful tool for holding government accountable. Nor do I believe that requesters should pay for ATIP requests as a way to offset costs (as BC Ferries does) – this creates a barrier that punishes the most marginalized and threatened, while enabling only the wealthy or well-financed to hold government accountable.

I do think it suggests that governments need to radically rethink how they manage ATIP. More importantly, I think it suggests that government needs to rethink how it manages information. Open data and digital documents are all part of a strategy that, I hope, can lighten the load. I’ve also felt that if/as governments move their work onto online platforms like GCPEDIA, we should simply make non-classified pages open to the public on something like a 5-year timeline. This too could help reduce requests.

I’ve more ideas, but at its core we need a system rethink. ATIP is broken. You may not know it yet, but it is. The question is: what are we going to do before it goes off the cliff? Can we invent something new and better in time?