Andrew Leonard, in How The World Works, recently posted on how computers and the availability of data are changing the study of Economics, and I have to agree. Several forces are converging to make this happen right now.
Open Government initiatives are making more data available, and the internet makes it easier to get. Emerging movements like the Open Data Commons emulate the Open Source movement that has made software more available. The Open Data movements aim not only to make data more openly available but to make it better, by providing tools to manage and inspect it so that problems with the data can be corrected.
Web sites that make data available have been around for some time. Numbary.com, for example, exists to make public data more accessible. Sites like Many Eyes and Swivel let users upload data sets and analyze them. You do not even need to find your own data sets, because you can go to these sites and play around with the sets that others have uploaded.
Several popular books have shown us what can be done. The best-known example is Freakonomics, which takes a number of interesting data sets and shows how they can be analyzed to tell interesting and sometimes quite startling stories. Less flamboyant and more educational is Super Crunchers, subtitled "why thinking-by-numbers is the new way to be smart".
Leonard suggests that the rise of large-scale data analysis will displace the old guard who sit in their ivory towers and build models. I have to disagree. The economic model is the explanation of what is happening, the result of analysis. Building a model to explain some aspect of the data, or of behavior that the data brings to light, is the product of analytics. More and better data means that the models will be better, more definitive, and, most importantly in a fractious discipline, more defensible.