June 2007


FixMyStreet is a neat little project out of the UK, made to:

help people report, view, or discuss local problems they’ve found to their local council by simply locating them on a map. It launched in beta early February 2007.

You enter a postcode, are shown a map, click on the map, and add your comments about problems (graffiti, overflowing drains, broken lights, etc.). An email is then sent to the local council. As of today, 171 reports have been made in the past week, 381 problems have been fixed in the past month, and 2,462 reports have been updated.
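The reporting flow above (locate a problem, match it to the responsible council, send an email) can be sketched in a few lines. This is a hypothetical illustration only: the postcode-to-council mapping, field names and message format below are fabricated, not FixMyStreet's actual implementation, which geocodes against real council boundaries.

```python
# Hypothetical sketch of the report-routing step: a problem report is
# matched to a council by postcode prefix and an email subject is drafted.
# COUNCILS and the message format are fabricated for illustration.

COUNCILS = {"OX": "Oxford City Council", "CB": "Cambridge City Council"}

def file_report(postcode, category, comment):
    """Return a report record addressed to the council for this postcode."""
    council = COUNCILS.get(postcode[:2], "Unknown council")
    subject = f"[FixMyStreet] {category} reported near {postcode}"
    return {"council": council, "subject": subject, "comment": comment}

report = file_report("OX1 2JD", "Graffiti", "Graffiti on the underpass wall.")
print(report["subject"])  # → [FixMyStreet] Graffiti reported near OX1 2JD
```

The interesting design point is exactly this lookup: the citizen never needs to know which council is responsible, because the service resolves that from the location.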

The project comes to you from MySociety, which:

builds websites which give people simple, tangible benefits in the civic and community aspects of their lives.

I’m going to feature some Canadian data access projects and people working with data in Canada that I find interesting and important on datalibre.ca. Here is my first go at it. Hope you like it! It is about a great program called the Data Liberation Initiative (DLI), which was formally instituted in 1996. I greatly benefited from the DLI as an undergraduate student studying Geomatics at Carleton University.


Did you know that until the latter half of the 1990s, students and faculty in Canadian universities had to pay for Canadian demographic data that were collected with their own tax dollars? Well, it’s true! If students and faculty wanted access to Statistics Canada data to conduct any kind of demographic analysis, to study the labour market or issues related to income and poverty, to explore provincial migration patterns, etc., they had to pay exorbitant amounts. What was the effect? Canadian students became experts on the US, since US data were free, and, worse, policy decisions for Canadians were based on US data! The real knowledge and social cost of data cost recovery policies can never be recovered!

Why access to Canadian public data?

I think Professor Paul Bernard, Chair of the Advisory Committee on Social Conditions (Statistics Canada) and member of the National Statistics Council, said it well back in 1991:

“…the genuine exercise of democracy increasingly requires that citizens get access to complex information and have the skills required to understand it.” While he realizes there are pressures on Statistics Canada to reduce costs and increase income, he feels the outcome has been the restriction of “…access to information only to groups that have the solid ability to pay.” Bernard feels that this may “…hamper the participation in public debates of groups whose contribution is not backed up by much money” as well as of “those who have no prospect of turning a profit or reaping some tangible and relatively immediate benefit from using it.” This, he states, is “…likely to lead, in the long run, to suboptimal development and less than full-blown democracy.” (See Watkins.)

Interestingly, the Government of Canada has had a program to share government information since 1927: the Depository Services Program (DSP), which is

an arrangement with some 680 public and academic libraries to house, catalogue and provide reference services for the federal government publications they acquire under the Program. These depositories must make their DSP collections available to all Canadians and for interlibrary loans. DSP also includes depositories such as Parliamentarians, central libraries of the federal government departments and press libraries.

The DSP, however, does not include the dissemination of public data files or databases collected and managed by the Government of Canada. Data users were, and still are, considered a special interest group. Odd! Numerate Canadian citizens a special interest group? Imagine literate Canadian citizens being considered a special interest group! Anyway, this meant that a variety of topics important to Canadians went unquestioned, unstudied, ignored and unknown. Not the best scenario for a democracy or a knowledge-based economy, let alone for the promotion and growth of a numerate workforce and citizenry.

Fortunately, 1993 saw the early formation of the Data Liberation Initiative (DLI). An early working group consisting of researchers, data librarians and representatives from the Canadian Association of Research Libraries (CARL), the Canadian Association of Public Data Users (CAPDU), Statistics Canada and the DSP, as well as members of the Social Science Federation of Canada (SSFC), got together and held a series of meetings. In 1995 Statistics Canada gave the DLI its formal blessing, and the DLI received Treasury Board approval in 1996.

What is the Data Liberation Initiative?

The DLI is a data-purchasing consortium between Canadian universities and Statistics Canada. Large universities pay $12,000 per year and smaller universities pay $3,000. The Treasury Board of Canada, Industry Canada, Health Canada, Human Resources Development Canada, the Social Sciences and Humanities Research Council of Canada, the Medical Research Council of Canada and Statistics Canada also contribute financially. These institutions subscribe to the service.

The DLI provides

affordable and equitable access to the standard data products listed in the Statistics Canada Catalogue through an annual subscription fee. The terms of agreement specified in the DLI license place conditions on the use of products disseminated through this program. These restrictions are directed at stopping the redistribution of data received through this channel and protecting against the loss of sales to non-educational markets for Statistics Canada, which is known within Statistics Canada as “leakage”. The license allows the use of DLI data for non-profit, academic research and instruction. Access to statistical information through DLI does require student or staff affiliation with a DLI member institution. While students and staff do not have to pay directly for access, DLI does require mediated services to disseminate statistical and data products on local campuses.

How it works:

Students and faculty go to their respective data libraries, consult with the data librarian, sign a plain-English use agreement (the DLI Data Use License), access the data via a dedicated computer and download what they need.

The Infrastructure:

An elaborate organizational structure with very dedicated members is in place, along with a data delivery technical infrastructure that includes a web site, an FTP service, a CD-ROM data delivery service and a special order process. In addition, each participating university institutes a ‘data service’ which assumes responsibility for the DLI at its site. The project is also glued together with two listservs. The data files are delivered in ASCII formats, with associated metadata discoverable using Statistics Canada software at dedicated workstations in the library.
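To give a concrete feel for what “ASCII data files with associated metadata” means in practice: such files are typically fixed-width records, and the metadata (codebook) tells you which columns hold which variable. The sketch below is a minimal, hypothetical example; the column layout and sample records are fabricated, since real DLI files each ship with their own codebook.

```python
# Minimal sketch of reading a fixed-width ASCII microdata file of the kind
# the DLI delivers. The LAYOUT below is hypothetical -- in practice the
# codebook (metadata) supplied with each file defines the real positions.
from io import StringIO

# Hypothetical layout: province code (cols 0-1), age (2-4), income (5-11)
LAYOUT = [("province", 0, 2), ("age", 2, 5), ("income", 5, 12)]

def parse_records(fileobj):
    """Slice each fixed-width line into named integer fields."""
    for line in fileobj:
        line = line.rstrip("\n")
        yield {name: int(line[start:end]) for name, start, end in LAYOUT}

sample = StringIO("35042005500\n24031004200\n")  # two fabricated records
records = list(parse_records(sample))
print(records[0]["age"], records[1]["income"])  # → 42 4200
```

This is why the data librarian and the codebook matter so much: without the metadata, a fixed-width ASCII file is just an undifferentiated wall of digits.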

Critical Note:

The DLI was and is the best possible reaction and compromise to the very restrictive data cost recovery policies initiated in the 1980s, which remain alive and well with us today. It is important to repeat that these public data have already been paid for through taxation, that they are paid for again with tuition, and that DLI data access is restricted to Canadians who are university students and faculty. The DLI solved one very important Canadian knowledge creation and dissemination issue in academic institutions, but not the broader issue of access to data by Canadian citizens. It did set a precedent, though!

Statistics Canada data are still sold to federal departments, provincial governments and municipal governments, which are not allowed to share them among themselves due to very stringent licensing regimes. Our taxes have paid for many of the same datasets multiple times, since these are government purchases and transactions. Just think of all the bureaucracy required to manage these licensing regimes: royalties, the lawyers, purchasing and accounting services, storage, and so on. In addition, civil society organizations such as non-governmental organizations, non-profit organizations and community-based researchers, who are not wealthy yet fulfill an important democratic function, cannot afford these data, even though it is their role to keep government accountable on a variety of issues (e.g. environment, homelessness, education). Further, citizens who want to learn about their communities, develop a community plan or start a new business want access to data, but can only get it if they have a significant amount of cash. The result: a lack of informed decision making.

Dream Idea:

It would be fantastic to have the knowledge, training and infrastructure of the DLI extended to all of our public libraries and community access points. Imagine knowledge one-stop shopping: picking up a video, a music CD, a novel and some demographic data related to school closures in your neighbourhood. Wow! Of course, the data should be at no cost to either the citizen or the library. Also, imagine having a data librarian in every library who can help citizens find the data they need and help them learn how to use them. Now that is a knowledge society.


You can access the documents I referred to here, on my del.icio.us, tagged with datalibre, civicaccess and DLI.

Continuum of Access, by Chuck Humphrey, University of Alberta.

Charles Humphrey (2005). Collaborative Training in Statistical and Data Library Services: Lessons from the Canadian Data Liberation Initiative. Resource Sharing & Information Networks, Vol. 18 (1/2), pp. 167-181.

This is an older post (May 2007) from Michael Geist regarding Canada’s national science and tech strategy, but it is well worth a read. He argues that opening up government-funded R&D data will result in more innovation. He has two specific recommendations:

  1. the government should identify the raw data under its control and set it free. Onerous licensing conditions are a hindrance to commercialization and to accountability for taxpayer-funded research.
  2. Federal research granting institutions should build open access requirements into their research mandates.

From the post:

I argue that maximizing the value of Canada’s investment in research requires far more than tax breaks and improved accountability mechanisms. Instead, government must rethink how publicly-funded scientific data and research results flow into the hands of researchers, businesses, and individuals.

Achieving that goal requires action on two fronts. First, the government should identify the raw, scientific data currently under its control and set it free. Implementing expensive or onerous licensing conditions for this publicly-funded data runs counter to the goals of commercialization and to government accountability for taxpayer expenditures….

Second, Ottawa must pressure the three federal research granting institutions to build open access requirements into their research mandates.


Mix up and make pretty your data at Swivel:

Swivel’s mission is to liberate the world’s data and make it useful so new insights can be discovered and shared…

We believe data is most valuable when it’s out in the open where everyone can see it, debate it, have fun, and share new insights. Swivel is applying the power of the Web to data so that life gets better.

UPDATE: The graph below is titled “The iPhone: did it shake up the phone market?” and can be found here, with some added context.


This presentation is not actually about podcasting; it’s about data. But it was presented at Podcasters Across Borders, and LibriVox is the inspiration for these thoughts.


Today’s New York Times has a story on regional variation in the availability and cost of health care. The story is accompanied by a “multimedia interactive graphic” — that is, a Flash visualization that charts variables on a U.S. map… For each mapped variable, mousing over the displayed hospital referral regions yields the local, state, and national values for that variable.

It’s nicely done. There’s no question that, as of mid-2007, this is cutting-edge data interactivity for the mainstream. But times are changing fast. The Times sourced this data from the Dartmouth Atlas of Health Care. It took me five minutes to download the surgical data, upload it to Dabble DB, and publish a similar map along with a complete tabular dump.
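The mouseover behaviour described above — a region’s local value shown alongside its state and national averages — reduces to a simple group-by over the underlying table. Here is a minimal sketch of that aggregation; the region names, rates and tuple layout are fabricated for illustration and are not the actual Dartmouth Atlas fields.

```python
# Sketch of the aggregation behind the mouseover values: for a chosen
# region, return its local rate plus the state and national averages.
# The rows below are fabricated sample data, not Dartmouth Atlas values.
from statistics import mean

rows = [  # (region, state, rate)
    ("Boston", "MA", 11.2),
    ("Springfield", "MA", 9.8),
    ("Houston", "TX", 14.1),
    ("Dallas", "TX", 12.7),
]

def lookup(region_name):
    """Return (local rate, state average, national average)."""
    region, state, value = next(r for r in rows if r[0] == region_name)
    state_avg = mean(v for _, s, v in rows if s == state)
    national_avg = mean(v for _, _, v in rows)
    return value, round(state_avg, 2), round(national_avg, 2)

print(lookup("Houston"))  # → (14.1, 13.4, 11.95)
```

The point stands either way: once the table is public, reproducing the “interactive” part of the graphic is trivial, which is exactly why publishing the tabular dump matters.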


The great and famous Hans Rosling video, about data, from TED.

Not Canadian, but it could be! We have the best Radarsat data in the world and have done some great work in the past tracking down toxic bins floating around in flood zones using radar. Radar is the only remote sensing technique that will cut through rain, fog and cloud cover, making it ideal during tropical storms or for rainforest imagery.

The funding mechanism is also very interesting.