Open Government


I was recently awarded a small but not insignificant grant as part of the Carleton University COVID-19 Rapid Response Research Grants. Below is a description of what I will be up to, along with some great students and expert advisors. I will share everyone’s names later. Results of the work will be published here as they become available! Stay tuned. Also, let me know if you want to contribute in any way! Tracey dot Lauriault at Carleton dot CA

Research Summary

There is a great deal of official COVID-19 data reporting by federal, provincial and territorial governments and Indigenous Communities. As the pandemic evolves, and more information comes to light, there are calls to add data attributes about Indigenous, Black and Racialized groups and about the affected labour force, and to report where cases predominate. The pandemic also revealed that foundational datasets are missing, such as a national list of elder care homes, maps of local health regions and data about the digital divide. This project will embrace technological citizenship, adopt a critical data studies theoretical framework and a data humanitarian approach to rapidly assess data shortfalls, identify standards, and support the building of infrastructure. This involves training students, conducting rapid response research, developing a network of experts, learning by doing, and assembling a transdisciplinary team of peer reviewers to assess results. The knowledge will be mobilized in open access blog posts, infographics, policy briefs and scholarly publications.

Research challenge:

Official COVID-19 public health reports by Federal, Provincial, and Territorial (F/P/T) governments and First Nation Communities are uneven, and there are calls to improve them (1 CBC News, Toronto Star). Asymmetries can be attributed to dynamically evolving challenges associated with the pandemic, such as working while practicing social distancing; jurisdictional divisions of power in terms of health delivery; and responding to a humanitarian crisis, where resources are stretched and infrastructures are splintered (i.e. digital divide, nursing home conditions).

The Harvard Humanitarian Initiative (HHI) developed a rights-based approach to the management of data and technologies during crisis situations, which includes the rights to be informed, to protection, to privacy and security, to data agency, and to rectification and redress (2). These apply to contact tracing (3 ITWorld, Scassa) and to equity groups calling for demographic data (1). Others have conducted rapid response data reporting; for example, after the Haiti Earthquake volunteers developed real-time crowdsourced data collection systems to support humanitarian responders (4 Meier), and WeRobotics mobilizes local drone expertise to objectively assess proposed pandemic response technologies (5 WeRobotics).

This research will apply a critical data studies (CDS) theoretical framework (6 Kitchin & Lauriault), the principles of the HHI, and the practice of technological citizenship (7 Feenberg) to the study of the Canadian COVID-19 data response. Lauriault will leverage her expertise and her Canadian and international network of open data, open government and civic technology experts in government, civil society, and Indigenous Communities (see CV), as seen in the policy briefs published on DataLibre.ca (8), to rapidly assess and support COVID-19 data management and reporting.

The objective is to carry out the following activities:

  1. Compare official COVID-19 public health data reports to identify gaps and best practices (9 Lauriault & Shields).
  2. Identify and support the building of framework datasets to standardize reporting (10 Lauriault).
  3. Analyze data standards and protocols to support data management, interoperability and cross-jurisdictional reporting (11 GeoConnections).
  4. Publish case studies, resources, an archive of official reporting, and a glossary; and
  5. Rapidly conduct expert analysis, peer review, knowledge mobilization and provide evidence-based recommendations to improve data reporting.

The rationale for this research is as follows:

  1. Official COVID-19 public health data are inconsistently reported, impeding comparability and the ability to assess impact and target actions. Predictions also missed seniors’ homes, precarious labour, Indigenous communities and social determinants (12 Global News, NCCDH), resulting in an increase in cases and deaths. Currently, job classifications and classifications for Indigenous, Black, and Racialized people (13 CTV News) remain absent. This research will create a corpus of F/P/T and Indigenous Communities’ official reports, compare results, and identify gaps.
  2. Framework data are standard information infrastructures upon which other analysis can consistently be done (14 Toronto Star). When these are lacking, analysis is impeded; for example, there is no national reporting by health region since no national framework dataset exists (15 Lauriault), and mitigating the digital divide is thwarted by a lack of broadband maps (16 Potter & Lauriault et al.). Other missing national datasets include senior care facilities, homeless shelters, precarious labour, and Indigenous Communities (17 Gaetz et al.). Needed framework datasets will be identified and, if necessary, their building coordinated (18 SPCOStatCan LODE); advocacy for the opening of public datasets such as corporate registries may be carried out (19 Fed. Registry, Open Corporate, Open Contracting); and experts from public health, social planning, and Indigenous Communities will help identify localized frameworks.
  3. Consistent COVID-19 reporting requires an interoperable infrastructure which builds upon standards developed through consensus processes (20 CIHI, PHAC). Current uneven reporting may be attributed to a lack of standards adoption and formalization in terms of data flows. This research will develop a repository of standards and protocols and share these with decision-makers to improve interoperability (e.g. the Data Standards for the Identification and Monitoring of Systemic Racism (21 ON Govt) and the FNIGC OCAP Principles (22 FNIGC)).
  4. Rapidly mobilizing knowledge is important to improve reporting and manage data, and to build a crisis data reporting infrastructure for the future. This project will compile and archive information; rapidly assess and peer review results with experts; report results on DataLibre.ca and other websites; produce infographics and policy briefs; deliver online webinars; and help administrators and Indigenous Communities improve their data and technology policies.

A CDS framework recognizes that data have social and material shaping qualities, that they are never politically neutral, and that they are inseparable from the people and institutions who create them, including their practices, techniques, and infrastructures. This project therefore involves a team of data, technology, legal, social and health, and Indigenous experts to rapidly assess official COVID-19 data assemblages and to act as technological citizens: applying knowledge in real time, mobilizing results to mitigate the data shortfalls witnessed during this crisis, and supporting decision makers in responding with a data humanitarian and rights-based approach, both now and in the future.

Expected Impact:

The target audience for this rapid response data and technology reporting is F/P/T public officials and Indigenous Community Leaders who manage public health, socio-economic, statistical and official record data flows, as well as civil society actors and members of the public involved in open data, open government, open contracting, transparency and accountability. This includes C-suite executives: chief technology, information, data, and digital officers.

The outcome of this research is to standardize and improve humanitarian crisis data management and reporting: in the short term, to ensure consistent reporting; and in the long term, to establish standardized data workflows and operationalize data infrastructures for this pandemic in preparation for the next.

The time to compile, inventory and build an open access archive of official data reporting is now, as the fractures in the system have become apparent in real time and have had negative consequences. It is important to monitor the response as it evolves so as to improve it while our collective institutional memory is fresh, to have the evidence available as a reminder if and when we forget, and to build more robust systems.

The results of this research will be reported continuously and made openly accessible as they become available, and will lead to the formation of a new research team.


Below is an excerpt from a blog post on the Programmable City website. I work there now, and post quite a few open data, big data, and data infrastructure pieces there. Most do not include any CanCon, so I do not always put them here. The Open Government Partnership is big for the Federal Government in Canada, and the OGP Independent Reporting Mechanism report by the Independent Reviewer, Dr. Mary Francoli, was not particularly kind to our Action Plan, and rightly so. The OGP is, however, not that big a deal on the ground or with civil society in Canada. It is really important elsewhere: in Ireland, for example, the EU and the OGP are leveraged as a way to bring in and promote progressive practices, regulation, laws, and so on. In developing countries, it is a way for civil society organizations to have a voice and meet officials they would otherwise not get to interact with at home, and again to have a transnational organization promote change.

I will try to post here more often! It took me time to adjust to my new home. Rest assured, though, that I have not forgotten you, nor have I stopped paying attention to the data shenanigans ongoing in Canada!

********************************

I attended the European Regional Meeting of the Open Government Partnership at the Dublin Castle Conference Centre in May of this year. The meeting was a place for performance and evaluation wonks to show their wares, especially at the following sessions: Open Government Standards and Indicators for Measuring Progress, The EU’s Role in Promoting Transparency and Accountability and Engagement with the OGP, and Open Contracting: Towards a New Global Norm. I did not attend the Independent Reporting Mechanism (IRM) sessions, but having read the IRM report for Canada, I know that it too is an emerging performance evaluation indicator space, which is affirmed by a cursory examination of the IRM’s two major databases. The most promising, yet the most disappointing, session was the Economic Impact of Open Data session. This is unfortunate as there are now a number of models by which the value of sharing, disseminating and curating data has been measured. It would have been great to have heard either a critical analysis or a review of the newly released Ordnance Survey Ireland report, Assessment of the Economic Value of the Geospatial Information Industry in Ireland, the many economic impact models listed in the World Bank Toolkit, or the often cited McKinsey Global Institute Open data: Unlocking innovation and performance with liquid information report. Oh well!

While there, I was struck by the number of times maps were displayed. The mapping of public policy issues related to openness seems to have become a normalized communication method to show how countries fare according to a number of indicators that aim to measure how transparent nation states are, how prone to corruption, how engaged their civil societies are, and how open they are in terms of data, information, and government.

What the maps show is how jurisdictionally bound up policy, law and regulatory matters concerning data are.  The maps reveal how techno-political processes are sociospatial practices and how these sociospatial matters are delineated by territorial boundaries.  What is less obvious, are the narratives about how the particularities of the spatial relations within these territories shape how the same policies, laws and regulation are differentially enacted.

Below are 10 world maps which depict a wide range of indicators and sub-indicators, indices, scorecards, and standards. Some simply show whether a country is a member of an institution or a signatory to an international agreement. All but one are interactive, and they all provide links to reports and methodologies, some more extensive than others. Some of the maps are a call to action; others are created to solicit input from the crowd, while most are created to demonstrate how countries fare against each other according to their schemes. One map is a discovery map to a large number of indicators found in an indicator portal, while another shows the breadth of civil society participation. These maps are created in a variety of customized systems, while three rely on third party platforms such as Google Maps or OpenStreetMap. They are published by a variety of organizations such as transnational institutions, well-resourced think tanks and civil society organizations.

We do not know the impact these maps have on the minds of the decision makers at whom they are aimed, but I do know that they are often shown as backdrops to discussions at international meetings such as the OGP to make a point about who is and is not in an open and transparent club. They are therefore political tools, used to do discursive work. They do not simply represent the open data landscape, but actively help (re)produce it. As such, they demand further scrutiny as to the data assemblage surrounding them (amalgams of systems of thought, forms of knowledge, finance, political economies, governmentalities and legalities, materialities and infrastructures, practices, organisations and institutions, subjectivities and communities, places, and marketplaces), the instrumental rationality underpinning them, and the power/knowledge exercised through them.

This is work that we are presently conducting on the Programmable City project, which will complement a critical study concerning city data, indicators, benchmarking and dashboards, and we’ll return to them in future blog posts.

1. The Transparency International Corruption by Country / Territory Map

Users land on a blank blue world map of countries delineated by thick white lines, from which they select a country of interest. Once a country is selected, a series of indicators and indices appears, such as ‘Corruption measurement tools’, ‘Measuring transparency’ and ‘Other governance and development indicators’. These are measured as rankings out of a given n, as percentage scores, or by whether or not the country is a signatory to a convention and whether that convention is enforced. The numbers are derived from national statistics and surveys. The indicators are:

  • Corruption Perceptions Index (2013), Transparency International
  • Control of Corruption (2010), World Bank dimension of Worldwide Governance Indicators
  • The Bribe Payer’s Index (2011), Transparency International
  • Global Corruption Barometer (2013), Transparency International
  • OECD Anti-Bribery Convention (2011)
  • Financial Secrecy Index (2011), Tax Justice Network
  • Open Budget Index (2010), International Budget Partnership
  • Global Competitiveness Index (2012-2013), World Economic Forum Global Competitiveness Index
  • Judicial Independence (2011-2012), World Economic Forum Global Competitiveness Index
  • Human Development Index (2011), United Nations
  • Rule of Law (2010), World Bank dimension of Worldwide Governance Indicators
  • Press Freedom Index (2011-2012) Reporters Without Borders
  • Voice & Accountability (2010), World Bank dimension of Worldwide Governance Indicators

Clicking on the question mark beside an indicator opens a pop-up window with some basic metadata, which describes what is being measured and points to its source.

The page includes links to related reports, and a comments section where numerous and colourful opinions are provided!

*****************************

View the rest at Programmable City.
