Most often, when people think of the web they think of it as a place to get new information. Companies are told they must constantly update their websites, while customers and citizens look for the latest updates. Because the web is relatively new, it is strongly biased towards digitally displaying and archiving “new” information.
What happens when the web gets older?
One possibility… it could change how we study history. Again, nothing is different per se – the same old research methods will be used – but what if it is 10 times easier to do, 100 times faster, and draws on a million times the quantity of information? With the archives of newspapers, blogs and other websites readily available to be searched, the types of research once reserved for only the most diligent and patient might become broadly accessible.
Consider this piece in the New York Times, published on November 5th, 1999. It essentially marks ground zero of the financial crisis:
Congress approved landmark legislation today that opens the door for a new era on Wall Street in which commercial banks, securities houses and insurers will find it easier and cheaper to enter one another’s businesses.
The measure, considered by many the most important banking legislation in 66 years, was approved in the Senate by a vote of 90 to 8 and in the House tonight by 362 to 57. The bill will now be sent to the president, who is expected to sign it, aides said. It would become one of the most significant achievements this year by the White House and the Republicans leading the 106th Congress.
“Today Congress voted to update the rules that have governed financial services since the Great Depression and replace them with a system for the 21st century,” Treasury Secretary Lawrence H. Summers said. “This historic legislation will better enable American companies to compete in the new economy.”
Here is what may be the defining starting point of the financial crisis. The moment when the tiny little snowball was gently pushed down the hill. It would take 10 years to gather the mass and momentum to destroy our economy, but it had a starting point. I sometimes wish that the New York Times had run this article again in the last few months, just so we could get reacquainted with the individuals – like Larry Summers – and political parties – both – that got Americans into this mess.
Indeed, as an aside, it’s worth noting the margin by which the legislation passed: 90 votes to 8 in the Senate, 362 votes to 57 in the House. There was clearly a political price to pay for voting against this bill. It fits nicely with the thesis Simon Johnson outlined in his dark, but important, piece The Quiet Coup:
“…these various policies—lightweight regulation, cheap money, the unwritten Chinese-American economic alliance, the promotion of homeownership—had something in common. Even though some are traditionally associated with Democrats and some with Republicans, they all benefited the financial sector”
Still more fascinating is how accurately the legislation’s detractors predicted its dire consequences. Check out the comments of Senators Byron Dorgan and Paul Wellstone at the time:
“I think we will look back in 10 years’ time and say we should not have done this but we did because we forgot the lessons of the past, and that that which is true in the 1930’s is true in 2010,” said Senator Byron L. Dorgan, Democrat of North Dakota. “I wasn’t around during the 1930’s or the debate over Glass-Steagall. But I was here in the early 1980’s when it was decided to allow the expansion of savings and loans. We have now decided in the name of modernization to forget the lessons of the past, of safety and of soundness.”
“Scores of banks failed in the Great Depression as a result of unsound banking practices, and their failure only deepened the crisis,” Mr. Wellstone said. “Glass-Steagall was intended to protect our financial system by insulating commercial banking from other forms of risk. It was one of several stabilizers designed to keep a similar tragedy from recurring. Now Congress is about to repeal that economic stabilizer without putting any comparable safeguard in its place.”
And of course, it is worth remembering what the legislation’s supporters said in response:
Supporters of the legislation rejected those arguments. They responded that historians and economists have concluded that the Glass-Steagall Act was not the correct response to the banking crisis because it was the failure of the Federal Reserve in carrying out monetary policy, not speculation in the stock market, that caused the collapse of 11,000 banks. If anything, the supporters said, the new law will give financial companies the ability to diversify and therefore reduce their risks. The new law, they said, will also give regulators new tools to supervise shaky institutions.
“The concerns that we will have a meltdown like 1929 are dramatically overblown,” said Senator Bob Kerrey, Democrat of Nebraska.
What is most fascinating about this piece is that it shows us the financial crisis wasn’t impossible to predict, that it didn’t come out of nowhere, and that it was eminently preventable. We simply chose not to prevent it.
It also speaks to the type of journalism that I believe we are missing today and that I wrote about in my post on the Death of Journalism. Here is a slow-moving crisis, one that is highly complex but not impossible to see. And yet we chose not to “see it.”
This, I believe, has to do with the fact that today much of our journalism is gotcha journalism (or what Gladwell refers to as mysteries). It looks for the insider or the smoking gun that will bust open the story. I suspect that in a networked world – one of increased complexity and interconnectedness – finding the smoking gun is irrelevant. For an increasing number of stories there simply is no smoking gun. There are whole series of cascading actions that are what Gladwell calls open secrets. Our job is to “see them” and painstakingly connect the dots to show how our decisions are allowing the scary and unpredictable event – the black swan – to become a near certainty.
What the above article shows me is that the very tools and forces that make these scary events more likely – the internet, globalization, our interconnectedness – may also make the open secrets easier to identify.
This is an interesting post, and it highlights the importance of archiving in the digital age. Whereas you can go back and retrieve many articles from the NY Times online, and you can go to the library and look at the fiche of the Globe & Mail from 1982, we must consider what happens if those assets vanish. You tweeted yesterday (http://twitter.com/david_a_eaves/status/1654845327) about the withering National Post. What if it does die, or what if the NY Times does? What happens to all that content, your potential archive? Of course, with properties like those, you can probably go to libraries to look at the information because it was, at one point, printed on paper. And you can try looking for some things in the Internet Archive's WayBackMachine (http://www.archive.org/); however, lots of stuff that might be useful (cf. recent news about trying to archive Geocities http://www.thewhir.com/web-hosting-news/042909_…) is ephemeral, and as more news shifts to an online-only format, content may become even more ephemeral.

You're right that analysis of data can expose risks, but one of the risks that we seem to be doing a fine job of ignoring is the preservation of data such that analysis is even possible. Probably not everything needs to be archived, but who decides what of today is important in 2050? Or even 2015?
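As a practical aside on retrieval: the Wayback Machine serves snapshots at predictable URLs of the form `https://web.archive.org/web/<timestamp>/<url>`, and offers an availability API at `https://archive.org/wayback/available` that reports the closest archived capture. A minimal sketch (the helper names are mine) that builds both kinds of URL without making any network calls:

```python
from urllib.parse import quote


def wayback_snapshot_url(url: str, timestamp: str) -> str:
    """Build a Wayback Machine snapshot URL; the archive redirects
    to the capture nearest to `timestamp` (YYYYMMDD[hhmmss])."""
    return f"https://web.archive.org/web/{timestamp}/{url}"


def availability_query(url: str, timestamp: str) -> str:
    """Build a query URL for the Wayback availability API, which
    returns JSON describing the closest archived snapshot."""
    return (
        "https://archive.org/wayback/available"
        f"?url={quote(url, safe='')}&timestamp={timestamp}"
    )


# The NYT front page as of the article's publication date:
print(wayback_snapshot_url("http://www.nytimes.com/", "19991105"))
# → https://web.archive.org/web/19991105/http://www.nytimes.com/
```

Fetching `availability_query(...)` with any HTTP client then tells you whether a capture exists before you rely on it as a source.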
Unfortunately, this gets significantly harder with gov docs once you want to go back beyond the current government and do any kind of systematic research. URLs are apt to change over time, and when talks and speeches have pithy titles like “Remarks before dinner with the President of France,” googling is not always straightforward. There is very little consistency between government departments, or across time, about where they archive things like old speeches and websites.

For instance, the US Dept of State points you quite easily to a “Former Secretaries of State” website where you can access archived websites and speeches, while White House practice seems to be to maintain a separate website for the previous President's tenure before moving it over to the national archive. The military, by contrast, archives all of their speeches onsite; I don't know what happens to previous versions of websites.

Former Canadian Prime Ministers' speeches used to be linked directly from the PCO website; now they are stored at the National Archives' website. DFAIT has only very, very recently linked its archived speeches from the DFAIT website; previously you had to go to the national archive website (though I think that's fairly new as well) or call the department historian. In Canada, the National Archive site is improving its web archiving, but the interface is still not straightforward. The Germans and the French are better at this than we Canadians/Americans are (at least as it concerns foreign policy), but we can take heart that the Brits are much, much worse (except when it comes to Hansard, which is currently digitized back to 1803. Awesome.)
Google News Timeline takes a first stab at showing the lineage of a topic across news, blogs, video, etc. I think tools that build on this, so that you could create communities of interest around a topic or issue, might help address some of the issues you're talking about.
Another artifact of the web is the explicit semantic association of content that is possible only with rich metadata, something impossible in a newspaper. The web allows newspapers to take on the attributional rigor of academic papers without sacrificing readability – they can quote sources and link directly to them, cite articles, etc. This web of content is quite valuable and is otherwise impossible with paper-based articles (it also encourages greater accountability amongst journalists).

Microfiche digitization is (relative to, say, books) a reasonably straightforward process. UofT houses the largest digitization facility in the country and can process thousands of fiches in a matter of days. What becomes the issue is storing and distributing the content – one must avoid any single points of failure. Another problem is future-proofing: data must remain accessible and readable by future software, something that often breaks (anyone try opening a RealMedia file from 1996 with a 2009 RealPlayer?). Open source has a part to play here, as does opening up the specifications for deprecated file formats.
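That machine-readable linking is easy to exploit programmatically. A minimal sketch using Python's standard-library `html.parser` (the class name and sample HTML are mine) that pulls the outbound citation links out of an article's markup:

```python
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags -- the machine-readable
    citations a web article carries that a printed page cannot."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


parser = LinkExtractor()
parser.feed('<p>See the <a href="http://www.nytimes.com/1999/11/05/">'
            'original article</a>.</p>')
print(parser.links)
# → ['http://www.nytimes.com/1999/11/05/']
```

Run over an archive, a crawler built on this kind of extraction is what lets you reconstruct the citation graph between articles.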
Thank you for pointing out both the archival value of the web and the fact that all those who said, 'there was no way to see this coming' are in the same criminal league as, 'No one could have thought they'd use planes and run them into buildings…' Can we please, finally, get rid of those who refuse to understand and act on complexities? Or at the very least, remove them from power? I make the connection to religious 'belief': we tell lies and fantasies to children and then, a few short years later, expect them to act rationally. It's a hopeless situation, really.