infrastructure


The City of Vancouver will soon vote on a Motion to adopt:

  • Open Standards
  • Open Source
  • Open Data
    CBC News: Vancouver mulls making itself an ‘open city’, by Emily Chung

    Via: Digital Copyright Canada

    It is quite surprising that this was not already the norm for managing the public good!

    The Federal Court of Canada released a decision late yesterday that will force the federal government to stop withholding data on one of Canada’s largest sources of pollution – millions of tonnes of toxic mine tailings and waste rock from mining operations throughout the country.

    The Federal Court sided with the groups and issued an Order demanding that the federal government immediately begin publicly reporting mining pollution data from 2006 onward to the National Pollutant Release Inventory (NPRI). The strongly worded decision describes the government’s pace as “glacial” and chastises the government for turning a “blind eye” to the issue and dragging its feet for “more than 16 years”.

    I look forward to reading the court order. According to Ecojustice (formerly the Sierra Legal Defence Fund) the ruling includes the following strong wording:

    * It calls the federal government’s pace “glacial” [para. 145];
    * It says the government’s approach has been simply to turn a “blind eye” [para. 207];
    * It notes that the frustration felt by advocates trying to uncover this information “after more than 16 years of consultation” is “perfectly understandable” [para. 124];
    * It states that not reporting “denies the Canadian public its rights to know how it is threatened by a major source of pollution” [para. 127];
    * It highlights that the Minister has chosen not to publish the pollution data “in deference to” the mining industry [para. 220];
    * It uses language simple enough that even I understand it when it says the government was simply “wrong” [para. 177].

    The advocates were Justin Duncan and Marlene Cashin, and their dedicated clients at Great Lakes United and Mining Watch Canada, who launched the case in 2007.

    It is uncertain how these data will be released. Currently, these types of pollutant data are released through the National Pollutant Release Inventory (NPRI), which is:

    The National Pollutant Release Inventory (NPRI) is Canada’s legislated, publicly accessible inventory of pollutant releases (to air, water and land), disposals and transfers for recycling. (Mining Watch)

    The NPRI is fairly usable and accessible, and includes georeferencing and some mapping services. I tried to use their library, however, and it was not working!
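    For readers who want to work with the NPRI data directly, the inventory is also available as downloadable bulk files. Below is a minimal, hedged sketch of summarizing such a file in Python; the file name and the column names (“FacilityName”, “Province”, “Substance”, “ReleaseTonnes”) are assumptions standing in for whatever the actual export uses, so check the real headers before running it.

```python
# Minimal sketch: sum reported releases by province and substance from a
# downloaded NPRI export. File name and column names are hypothetical
# placeholders; adjust them to match the real dataset.
import csv
from collections import defaultdict

totals = defaultdict(float)

with open("npri_releases.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        key = (row["Province"], row["Substance"])
        totals[key] += float(row["ReleaseTonnes"] or 0)

for (province, substance), tonnes in sorted(totals.items()):
    print(f"{province}\t{substance}\t{tonnes:.1f} t")
```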

    The Mining Association of Canada wants to read the ruling “carefully” to assess how Environment Canada should release these data. I find this confusing, since I thought the Government got to decide how these data are to be released and what is to be included, and that that decision was based on ensuring the public good and the public right to know. The fight is not quite over yet. It will be important to ensure the data are not watered down for public consumption.

    It is another wonderful example of creating an infrastructure – NPRI + law – to distribute public data. This also teaches us something about gouvernementalité, and about who the government thinks with: in this case the mineral and mining industry, not citizens. Citizens should not have to lobby for 16 years and expend incredible resources in the courts to force the government to ensure the public good!

    Articles:

  • Court orders pollution data from mining made public, by Juliet O’Neill, Canwest News Service, April 24, 2009
  • Environment Canada forced to reveal full extent of pollution from mines: Court ruling considered major victory for green organizations, by Martin Mittelstaedt, The Globe and Mail, April 24, 2009
  • Great Lakes United press release: Court victory forces Canada to report pollution data for mines, April 24, 2009
  • Mining Watch press release: Court Victory Forces Canada to Report Pollution Data for Mines, April 24, 2009

    John Chambers, CEO of Cisco, on what the future holds, from MITWorld. He thinks we are about to see the most fundamental change in business and government that we’ve ever seen, moving from command and control to collaboration and teamwork.

    Canada Institute for Scientific and Technical Information (CISTI) is

    Canada’s national science library and leading scientific publisher, provides Canada’s research and innovation community with tools and services for accelerated discovery, innovation and commercialization.

    CISTI delivers science data and information to Canadians online, through the Depository Services Program, and as a paper delivery service to researchers in universities. But its days of doing that are numbered…

    CISTI has just suffered a very serious budget cut – a 70% cut – that affects scientific innovation, access to scientific data, the dissemination of Canadian science, and open access publishing.

    The Government of Canada and the National Research Council of Canada have decided that the journals and services of NRC Research Press will be transferred to the private sector.

    Privatization? In a sense they are a victim of their own success. The NRC frames it as follows in a letter to their clients (e.g. the Depository Services Program):

    this transformation is not the development of a “new business” but the movement of a successful program into a new legal and business environment. It is our belief that this new environment will afford us more flexibility to manage our publishing activities.

    More flexibility to reduce services to Canadians, more like it, since the Depository Services Program (DSP) and the delivery of online access to journals to Canadians cannot be funded by an entity outside of the federal government; the termination date for journals delivered in this way is expected to be sometime in 2010.

    This means less access to scientific journals for Canadians – research Canadians have paid for! The CISTI journals deposited in the DSP were important, since the DSP’s:

    primary objective is to ensure that Canadians have ready and equal access to federal government information. The DSP achieves this objective by supplying these materials to a network of more than 790 libraries in Canada and to another 147 institutions around the world holding collections of Canadian government publications.

    In addition, hundreds of government jobs – scientists, librarians and researchers – are expected to be lost. The budget cut amounts to $35 million in annual expenditures.

    This plan includes a reduction in NRC’s a-base funding totalling $16.8 million per year by 2011-2012 (announced in Budget 2009) as well as reductions in revenue-generating activities.

    Hmm! Wonder what our current Federal Minister of State for Science and Technology’s thoughts are about science?

    Here are a few articles:

  • NRC cuts could affect 300 positions, The Ottawa Citizen
  • Access to CISTI Source to End

    Wi-fi structures and people shapes, from Dan Hill:

    One of the ideas I’ve been exploring relates to how urban industry – in the widest sense of the word – in the knowledge economy is often invisible, at least immediately and in situ. Whereas urban industry would once have produced thick plumes of smoke or deafening sheets of sound, today’s information-rich environments – like the State Library of Queensland, or a contemporary office – are places of still, quiet production, with few sensory side-effects. We see people everywhere, faces lit by their open laptops, yet no evidence of their production. They could be using Facebook, Photoshop, Excel or Processing. [more…]


    CityGML:

    The City Geography Markup Language (CityGML) is a new and innovative concept for the modelling and exchange of 3D city and landscape models that is quickly being adopted on an international level. CityGML is a common information model for the representation of 3D urban objects. It defines the classes and relations for the most relevant topographic objects in cities and regional models with respect to their geometrical, topological, semantical and appearance properties. Included are generalization hierarchies between thematic classes, aggregations, relations between objects, and spatial properties. In contrast to other 3D vector formats, CityGML is based on a rich, general purpose information model in addition to geometry and graphics content that allows to employ virtual 3D city models for sophisticated analysis tasks in different application domains like simulations, urban data mining, facility management, and thematic inquiries. Targeted application areas explicitly include urban and landscape planning; architectural design; tourist and leisure activities; 3D cadastres; environmental simulations; mobile telecommunications; disaster management; homeland security; vehicle and pedestrian navigation; training simulators; and mobile robotics. [more…]
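    Since CityGML is an XML (GML) encoding, a model can be inspected with ordinary XML tooling. Here is a minimal sketch in Python that lists buildings and their measured heights; the namespace URIs follow CityGML 1.0 conventions and the file name is a placeholder, so adjust both to match your own data.

```python
# Minimal sketch: list building ids and measuredHeight values from a
# CityGML file using only the standard library. Namespace URIs assume
# CityGML 1.0; "city_model.gml" is a placeholder path.
import xml.etree.ElementTree as ET

NS = {
    "bldg": "http://www.opengis.net/citygml/building/1.0",
    "gml": "http://www.opengis.net/gml",
}

tree = ET.parse("city_model.gml")

for building in tree.getroot().iter(f"{{{NS['bldg']}}}Building"):
    gml_id = building.get(f"{{{NS['gml']}}}id", "(no id)")
    height = building.find("bldg:measuredHeight", NS)
    print(gml_id, height.text if height is not None else "height not given")
```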

    Mobile Millennium:

    In a partnership between Nokia, NAVTEQ, and UC Berkeley, coordinated by the California Center for Innovative Transportation (CCIT) and supported by the U.S. Department of Transportation and Caltrans, researchers have constructed an unprecedented traffic monitoring system capable of fusing GPS data from cell phones with data from existing traffic sensors. The research and development phase of this project was dubbed Mobile Millennium for the potential thousands of early adopters who will participate in the pilot deployment, launching in early November, 2008. [more…]
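    The core idea of fusing a noisy GPS-probe estimate with a fixed-sensor estimate can be illustrated with a toy inverse-variance weighted average. This is only a conceptual sketch with made-up numbers, not the project’s actual traffic estimation model.

```python
# Toy illustration: fuse two noisy speed estimates for one road segment,
# one from GPS probes (phones) and one from a fixed loop detector, by
# weighting each inversely to its variance. All numbers are placeholders.
def fuse(estimates):
    """estimates: list of (speed_kmh, variance) pairs."""
    weights = [1.0 / var for _, var in estimates]
    return sum(w * s for (s, _), w in zip(estimates, weights)) / sum(weights)

gps_probe = (52.0, 25.0)      # sparse phone reports -> higher variance
loop_detector = (47.0, 9.0)   # fixed sensor -> lower variance

print(f"fused speed estimate: {fuse([gps_probe, loop_detector]):.1f} km/h")
```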

    The visuals I saw while watching the US elections on the tele on Tuesday were just plain dazzling: lots of speculative data, predictions, interactivity leading to scenarios and more speculation on the results, and good visualizations, all resulting from a visualization dissemination and creation infrastructure which manufactures the geographic imagination of the US nation. Obama stated, in the speech that won him the candidacy for the Democrats (UK Guardian),

    that there were no red states, no blue states, only the United States.

    The maps we saw on US election night however, were all about blue and red differences.

    Map of results by state

    Zooming into the county maps shows a different picture, where colour speckles add up to the uniform blue shown for Ohio on the state map above. Many voices are not seen on the state map; the county map shows lots of diversity, as would a sub-county map. Maps tell all sorts of stories and can portray silences or consensus where in fact cacophonies and polarities exist. The county map looks far more red than blue for the Democratically won state of Ohio.

    Speckles of red and blue in Ohio became a uniform blue

    Reading about the US Electoral system helps explain how this works out.
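    A toy example of the winner-take-all arithmetic shows why the county-level speckles vanish: however mixed the counties are, the statewide total picks one colour and one block of electoral votes. The county names, vote counts and electoral-vote number below are placeholders, not real 2008 results.

```python
# Toy illustration of winner-take-all aggregation: mostly "red" counties
# can still produce a single "blue" state. All figures are made up.
counties = {
    "County A": {"dem": 150_000, "rep": 80_000},
    "County B": {"dem": 30_000, "rep": 70_000},
    "County C": {"dem": 40_000, "rep": 55_000},
}
ELECTORAL_VOTES = 20  # placeholder for the state's share

for name, votes in counties.items():
    print(name, "-> blue" if votes["dem"] > votes["rep"] else "-> red")

dem = sum(c["dem"] for c in counties.values())
rep = sum(c["rep"] for c in counties.values())
winner = "blue (Dem)" if dem > rep else "red (Rep)"
print(f"statewide: Dem {dem:,} vs Rep {rep:,} -> the whole state is coloured {winner}")
print(f"and all {ELECTORAL_VOTES} electoral votes go to that winner")
```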

    The map in popular culture is key to the formation of the collective imagination of the nation.  I do wonder if viewers will actually think that Hawaii and Alaska are really located in the ocean south of Arizona instead of one connected to Canada’s North and the other in the middle of the Pacific!

    1 square = 1 electoral vote

    Information Aesthetics produced an excellent blog post which includes links to numerous electoral visuals.  Watching this also highlighted the lack of maps and visuals during the Canadian 2008 Elections.  Eventually I did see a map on the Tele, around 11:30 PM on Radio Canada, while CBC showed none!

    I just came across Many-Eyes which is a really great online collaborative data visualization tool designed by the IBM Collaborative User Experience (CUE) Visualization Collaboration Lab.

    You essentially contribute a dataset and use their online visualization tools to see what you’ve got. A colleague added these Canadian city datasets and it was truly very easy and helpful to find different ways to tease out patterns and to assess the best way to derive a story from them. The results provided us with a boundary object to facilitate our discussions on how we will design a report.

    The options are great, as you can create contemporary tag clouds, treemaps, network maps, flow lines, bubble charts and block histograms, as well as your usual line graphs, pie charts and bar graphs. They even have some rudimentary choropleth mapping tools. You can view multiple variables and time series for a particular dataset, which allows you to see change.

    In our case, we will probably play with these tools and also Excel graphs, and present the results to our graphic designers, who will trace them into the look and feel of our report. The best part is knowing that we can communicate effectively without robbing the visualization bank, while moving forward on more interesting ways to tell our stories.
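    For anyone without access to Many-Eyes, one of the same chart types can be approximated locally. Here is a minimal bubble-chart sketch using Python’s matplotlib; the city names and values are placeholders standing in for the real dataset.

```python
# Minimal local alternative to one Many-Eyes chart type: a bubble chart.
# All values below are placeholder data for illustration only.
import matplotlib.pyplot as plt

cities = ["Ottawa", "Calgary", "Halifax", "Victoria"]
x = [10, 25, 8, 12]            # indicator one (placeholder)
y = [40, 55, 30, 45]           # indicator two (placeholder)
size = [900, 1300, 400, 350]   # bubble area, e.g. ~ population (placeholder)

plt.scatter(x, y, s=size, alpha=0.5)
for name, xi, yi in zip(cities, x, y):
    plt.annotate(name, (xi, yi))
plt.xlabel("indicator one")
plt.ylabel("indicator two")
plt.title("Canadian cities (placeholder data)")
plt.show()
```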

    Enjoy!

    Remember hearing about SETI@home? Check out, and download, Gridrepublic.org:

    GridRepublic members run a screensaver that allows their computers to work on public-interest research projects when the machines are not otherwise in use. This screensaver does not affect performance of the host computer any more than an ordinary screensaver does.

    By aggregating idle resources from users around the world, we create a massive supercomputer.

    GridRepublic is built on the system that started as SETI@home and was later generalized into BOINC, a distributed computing platform. GridRepublic is a central place for all projects using this platform, where you can download and install the system and, even better, choose which projects your computer’s idle time will support (a toy sketch of the overall pattern follows the project list), including:

    Einstein@home: you can contribute your computer’s idle time to a search for spinning neutron stars (also called pulsars) using data from the LIGO and GEO gravitational wave detectors.

    BBC Climate Change: The same model that the Met Office uses to make daily weather forecasts has been adapted to run on home PCs. The model incorporates many variable parameters, allowing thousands of sets of conditions. Your computer will run one individual set of conditions– in effect your individual version of how the world’s climate works– and then report back to the research team what it calculates. This experiment was described on the BBC television documentary Meltdown (BBC-4, February 20th, 2006). Note: workunits require several months of screensaver time; faster computers recommended.

    Rosetta@home: needs your help to determine the 3-dimensional shape of proteins as part of research that may ultimately contribute to cures for major human diseases such as AIDS / HIV, Malaria, Cancer, and Alzheimer’s.

    Proteins@Home: investigating the “Inverse Protein Folding Problem”: Whereas “Protein Folding” seeks to determine a protein’s shape from its amino acid sequence, “Inverse Protein Folding” begins with a protein of known shape and seeks to “work backwards” to determine the amino acid sequence from which it is generated.

    Quantum Monte Carlo: Reactions between molecules are important for virtually all parts of our lives. The structure and reactivity of molecules can be predicted by Quantum Chemistry, but the solution of the vastly complex equations of Quantum Theory often require huge amounts of computing power. This project seeks to raise the necessary computing time in order to further develop the very promising Quantum Monte Carlo (QMC) method for general use in Quantum Chemistry.
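    The pattern these projects share is easy to sketch: a server splits a large job into work units, volunteered machines each compute one while idle, and the results are aggregated. The toy Python example below simulates that flow on local CPU cores; it is only an illustration of the idea, not the actual BOINC protocol.

```python
# Toy sketch of the volunteer-computing pattern BOINC generalised from
# SETI@home: split a job into work units, let "volunteers" (here, local
# CPU cores) compute them, then aggregate the results.
from concurrent.futures import ProcessPoolExecutor

def work_unit(chunk):
    # Stand-in for a real computation (protein folding, pulsar search...).
    return sum(i * i for i in chunk)

def main():
    chunks = [range(i * 10_000, (i + 1) * 10_000) for i in range(8)]
    with ProcessPoolExecutor() as volunteers:
        partial_results = list(volunteers.map(work_unit, chunks))
    print("aggregated result:", sum(partial_results))

if __name__ == "__main__":
    main()
```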

    Donate here.
