government


Jon Udell interviews Greg Elin, chief info architect of the Sunlight Foundation, which aims to make the operation of Congress and the U.S. government more transparent and accountable. It's interesting to follow this debate in the USA, where government data and reports are de facto public domain (though true access is a different story), compared with Canada, where government data is often covered by restrictive copyright provisions (starting with Crown Copyright). Says Udell about the talk:

Having surveyed a wide range of government data sources, Greg's conclusion is that the future is already here, but not yet evenly distributed. There are pockets within the government where data management practices are excellent, and large swaths where they are mediocre to horrible. The Sunlight Foundation has an interesting take on how to bootstrap better data practices across the board. By demonstrating them externally, in compelling ways, you can incent the government to internalize them.

Some of the same could be said here, but we are behind the curve, with a big hurdle to get over: convincing the Canadian government that the proven wisdom of US government data policy is compelling. Making government data available spurs innovation; restricting it restricts innovation.

See the Interviews with Innovators page, on IT Conversations.

Check out the Statistics Canada Canadian Environmental Sustainability Indicators report.

It discusses three main indicators:

The air quality indicator tracks Canadians' exposure to ground-level ozone—a key component of smog and one of the most common and harmful air pollutants to which people are exposed.

The greenhouse gas emissions indicator tracks the annual releases of the six greenhouse gases that are the major contributors to climate change. The indicator comes directly from the greenhouse gas inventory report prepared by Environment Canada for the United Nations Framework Convention on Climate Change and the Kyoto Protocol.

The freshwater quality indicator reports the status of surface water quality at selected monitoring sites across the country. For this first report, the focus of the indicator is on the protection of aquatic life, such as plants, invertebrates and fish.

The report's references also link to some of the data used to build these indicators, and the methodology section gives the lowdown on how to use these data. Perhaps some data and ideas to play with!

But now that we know that

the three indicators reported here raise concerns for Canada’s environmental sustainability, the health and well-being of Canadians, and our economic performance. The trends for air quality and greenhouse gas emissions are pointing to greater threats to human health and the planet’s climate. The water quality results show that guidelines are being exceeded, at least occasionally, at most of the selected monitoring sites across the country.

What do we do?

Good news: Elections Ontario makes the postal code/electoral riding data file available:

The Postal Codes by Electoral Districts (ED) file provides a link between the six-character postal code and Ontario’s new provincial electoral districts. It is a zip file containing three files that can be loaded and used in spreadsheets and databases. The first is a text file with the ED names; the second file contains the postal codes that have been assigned to a single ED; and the third file contains postal codes that have been found in multiple EDs. This third file repeats the postal code for each ED in which it is found.

I have not looked at it yet: any comments about formats, license, etc.?

(from the civicaccess.ca mailing list).
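
For anyone who does get their hands on it, here is a minimal sketch of how the three files might be stitched together once unzipped. It assumes the files are plain comma-delimited text as described above; the file names and column names are hypothetical, since I have not seen the actual download.

```python
import csv

# Hypothetical file names and columns -- the real names in the
# Elections Ontario zip file may differ.
ED_NAMES_FILE = "ed_names.txt"            # ED code -> electoral district name
SINGLE_ED_FILE = "postal_single_ed.txt"   # postal codes assigned to exactly one ED
MULTI_ED_FILE = "postal_multi_ed.txt"     # postal codes repeated once per ED they fall in

def load_rows(path, fieldnames):
    """Read a comma-delimited text file into a list of dicts."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f, fieldnames=fieldnames))

# Look up district names by ED code.
ed_names = {row["ed_code"]: row["ed_name"]
            for row in load_rows(ED_NAMES_FILE, ["ed_code", "ed_name"])}

# Map each postal code to the set of EDs it touches (one or many).
postal_to_eds = {}
for row in (load_rows(SINGLE_ED_FILE, ["postal_code", "ed_code"])
            + load_rows(MULTI_ED_FILE, ["postal_code", "ed_code"])):
    postal_to_eds.setdefault(row["postal_code"], set()).add(row["ed_code"])

# Example: which riding(s) does a given postal code fall in?
code = "M5V1J1"  # hypothetical postal code
print([ed_names.get(ed, ed) for ed in sorted(postal_to_eds.get(code, []))])
```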

The good folks at freeourdata.org.uk (one of the major inspirations for this blog) met with the UK’s Minister of Information, Michael Wills. The whole interview is interesting, of course, but just the opening remarks from Michael Wills show a remarkable openness to the idea of freed data:

Personally I’m very excited by this area; I asked to do this as part of my portfolio… The whole issue of data is, I think, tremendously exciting for all the reasons that you’ve said. It’s part of the infrastructure now of our society and our economy, and it’s going to become more so with what’s happening with data mashing; the extraordinary intellectual creative energy that’s being unleashed is something that as a government we have to respond to. And the Power of Information, you know, is a very exciting document, something that I think is very much where government wants to be.


What is the cost to taxpayers when public institutions purchase public data? As citizens we do not like to pay for the same thing many times. So here is a real scenario, with a best-guess estimate of the numbers: the cost to taxpayers of public data that they pay for over and over, via public institutions whose job it is to work in the public interest and which re-purchase data citizens have already paid for once through taxation.

a) Each Canadian municipality, city or town purchases demographic data from Statistics Canada. Let's suggest there are approximately 2000 of these entities. Let's say they each purchase a subset of the Census at varying scales, with a specialized geography to match their boundaries, and let's say they each spend a conservative $ 10 000 (factoring in that some small towns will buy less and others more).

2000 Towns/municipalities/cities * $ 10 000 = $ 20 000 000

b) Since many cities/towns/municipalities do not have efficient data infrastructures to manage their data assets, different departments sometimes purchase the same data two or three times. So you may get planning, health and social welfare departments each purchasing the same data and not sharing it, because they are unaware of each other's purchases and there is no central, accessible repository they can mutually search. So let's pretend that the top 100 (a conservative number) cities in Canada purchase the same/similar data 3 times each. We already counted one purchase for each of them above, but we will keep the multiplier at 3, since some have potentially purchased 4 times, while the other 1900 units may have duplicated a purchase at least once.

100 Towns/municipalities/cities * 3 (duplicate copies of the same data) * $ 10 000 = $ 3 000 000

c) The best part: often each of these Towns/municipalities/cities is purchasing data for its entire respective province, as it wishes to do some cross-comparisons. This means that each of these entities is paying, each time, for the exact same/similar data set! Damn! Talk about a non-rivalrous good, and how smart is StatCan? Damn, we thought the public service did not have a corporate mindset!

d) The Provinces and Territories also each purchase Census data. They do not necessarily have a centralized data infrastructure either; they have bigger bureaucracies, more departments, more specialized needs and bigger data requirements. So let's suggest that each Province and Territory spends $ 15 000 * 5 on duplicate/similar sets, and an additional $ 10 000 each on multiple special orders between censuses.

13 Provinces/Territories * $ 15 000 * 5 = $ 975 000

13 Provinces/Territories * $ 10 000 = $ 130 000

e) Again, many of the Provinces and Territories will purchase national-scale datasets for comparison purposes, which means that, like the Towns/municipalities/cities, they are purchasing the exact same/similar copy of the exact same/similar data sets for the exact same geography numerous times. Recall the great part about information: its non-rivalrousness! We can each consume the same entity many times and none will suffer as a result. Unless, of course, you are a Canadian taxpayer.

f) Then we have the Federal Government, with approximately 350 departments and agencies. Let's say each purchases some city data, some provincial data and a whole bunch of national data for $ 17 000 each. Then many of these departments and agencies, let's say 175, are purchasing special-ordered data sets to meet their particular needs, each at $ 7 500.

350 Federal Departments and Agencies * $ 17 000 = $ 5 950 000

175 Federal Departments and Agencies * $ 7 500 = $ 1 312 500

TOTAL:

  1. 2000 Towns/municipalities/cities * $ 10 000 = $ 20 000 000
  2. 100 Towns/municipalities/cities * 3 (duplicate copies of the same data) * $ 10 000 = $ 3 000 000
  3. 13 Provinces/Territories * $ 15 000 * 5 = $ 975 000
  4. 13 Provinces/Territories * $ 10 000 = $ 130 000
  5. 350 Federal Departments and Agencies * $ 17 000 = $ 5 950 000
  6. 175 Federal Departments and Agencies * $ 7 500 = $ 1 312 500

Grand Total of Census Data Expenditures by Taxpayers via Public Institutions in Canada: $ 31 367 500
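
Since these are all back-of-the-envelope assumptions, here is the same arithmetic as a small Python script; change any of the assumed unit costs or counts and the total updates accordingly.

```python
# Back-of-the-envelope estimate of census data re-purchasing, using the
# assumptions above. All unit costs and counts are guesses, not audited figures.
municipal       = 2000 * 10_000       # a) every municipality buys a census subset
city_duplicates = 100 * 3 * 10_000    # b) top 100 cities buy the same data 3 times
provincial      = 13 * 15_000 * 5     # d) provinces/territories, 5 duplicate sets each
prov_specials   = 13 * 10_000         # d) special orders between censuses
federal         = 350 * 17_000        # f) federal departments and agencies
fed_specials    = 175 * 7_500         # f) federal special-ordered data sets

total = (municipal + city_duplicates + provincial
         + prov_specials + federal + fed_specials)
print(f"Grand total: ${total:,}")     # Grand total: $31,367,500
```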

The above is a conservative number, as it does not include human resource expenditures like the following:

  1. Person hours for each public servant to negotiate and discuss their data needs
  2. Person hours for the StatCan officials to fill the orders
  3. Person hours for the public servant lawyers to take care of licensing
  4. Person hours associated with all of the purchasing and accounting work to pay for, acquire and account for this money
  5. Person hours for each official who has to work the data in the same way to meet their needs
  6. Dunno if public agencies pay taxes on these! That would add insult to injury, would it not?

It is also important to note that hospitals, school boards, universities, crown corporations and a host of other quasi-public institutions are doing the same thing. And these numbers are only for census data; they do not include the cost of other datasets like road networks, water quality, maps, environment data and so on.

It would seem to me that we could spend a fraction of that cost to deliver the data online to all of these institutions, the private sector, NGOs and citizens, and we would all be better off financially. We would save all the administration costs and the license management costs, and we would all be smarter too! Further, we could reinvest that money into more research, air quality infrastructure, healthcare, waiving recreation fees in municipalities, etc. We could reinvest wisely in quality of life, and know better how to do so at the same time.

PS: If anyone

  • has come across any type of cost analysis reports, etc.
  • has a better way to calculate this
  • knows of some real costs

please pass them along! The more we have on this, the better.

Says Jesse Robins:

The Istanbul Declaration (see: pdf) signed at the [OECD World Forum] calls for governments to make their statistical data freely available online as a “public good.” The declaration also calls for new measures of happiness and well-being, going beyond just economic output and GDP. This requires the creation of new tools, which the OECD envisions will be a “wiki for progress.” Expect to hear more about these initiatives soon.

From the Declaration (pdf):

A culture of evidence-based decision making has to be promoted at all levels, to increase the welfare of societies. And in the “information age,” welfare depends in part on transparent and accountable public policy making. The availability of statistical indicators of economic, social, and environmental outcomes and their dissemination to citizens can contribute to promoting good governance and the improvement of democratic processes. It can strengthen citizens’ capacity to influence the goals of the societies they live in through debate and consensus building, and increase the accountability of public policies.

Hear, hear!
