policy

The Green Party, reports Michael Geist, has released its policy document.

I haven't looked through it yet, but there's support for network neutrality:

Supporting the free flow of information

The Internet has become an essential tool in knowledge storage and the free flow of information between citizens. It is playing a critical role in democratizing communications and society as a whole. There are corporations that want to control the content of information on the internet and alter the free flow of information by giving preferential treatment to those who pay extra for faster service.

Our Vision

The Green Party of Canada is committed to the original design principle of the internet – network neutrality: the idea that a maximally useful public information network treats all content, sites, and platforms equally, thus allowing the network to carry every form of information and support every kind of application.

Green Solutions

Green Party MPs will:

* Pass legislation granting the Internet in Canada the status of Common Carrier – prohibiting Internet Service Providers from discriminating due to content while freeing them from liability for content transmitted through their systems.

and free/open source software:

Open source computer software

As computer hardware improves, it is important that software programs are readily modifiable by the people who buy and use them. Developing alongside the proprietary software sector is Free/Libre Open Source Software (FLOSS). This software is generally available at little or no cost, making it very popular in the developing world. It can be used, copied, studied, modified and redistributed with little or no restriction. Businesses can adapt the software to their specific needs.

Under the free software business model, vendors may charge a fee for distribution and offer paid support and customization services. Free software gives users the ability to work together enhancing and refining the programs they use. It is a pure public good rather than a private good.

Our Vision

The Green Party supports the goals and ideals of Free/Libre Open Source Software (FLOSS) and believes that Canada’s competitiveness in global information technology (IT) will be greatly enhanced by strongly supporting FLOSS.

Green Solutions

Green Party MPs will:

* Ensure that all new software developed for or by government is based on open standards and encourage and support a nationwide transition to FLOSS in all critical government IT systems. This will make Canada’s IT infrastructure more secure and robust, lower administration and licensing costs and develop IT skills.
* Support the transition to FLOSS throughout the educational system

I’d say it would be worth asking them what they think of civic access to Canadian government data.

It is a small world! Especially in Ottawa, where the degree of separation is about .0005 degrees! Feels like that anyway!

Zzzoot is definitely a kindred blog to datalibre.ca. The Sept. posts are excellent reviews of international data access and preservation technologies, RFPs and initiatives. All my faves are discussed there – Cyberinfrastructure in the US, the UK Joint Information Systems Committee (JISC) DataShare project, the CODATA Open Data for Open Science Journal and a whole bunch more.

I think that, like me, he wishes there would be some uptake of these innovations and initiatives here!

The issue of public access to government data has a number of components: availability (is it available?), format (is it in a usable/open format?), cost (is it free?), and copyright (do I need permission to use it, may I do with it what I wish?).
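
Concretely, one could record these four tests in a little checklist structure. Here is a minimal sketch in Python, with entirely hypothetical class and field names, purely for illustration:

```python
from dataclasses import dataclass

# Hypothetical checklist mirroring the four components above;
# not any official schema.
@dataclass
class DatasetAccessAudit:
    name: str
    available: bool          # can the public obtain it at all?
    open_format: bool        # usable/open format, not a locked one?
    free_of_charge: bool     # free, or behind cost recovery?
    free_of_copyright: bool  # may I use it without asking permission?

    def fully_open(self) -> bool:
        """A dataset is fully open only if all four tests pass."""
        return all((self.available, self.open_format,
                    self.free_of_charge, self.free_of_copyright))

# Example: available and free, but in a closed format under Crown Copyright.
audit = DatasetAccessAudit("census-summary", True, False, True, False)
print(audit.fully_open())  # False
```
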
one cent
The City of Toronto has recently launched a campaign to get more money for cities from the Federal government, asking for one cent from the GST. The campaign is called: onecentnow.ca, and uses the Canadian penny in ads and on their web site.

They’ve received a retroactive bill from the Royal Canadian Mint for $47,000+ for use of the image of the Canadian penny, and for use of the words “one cent” (!).

There are political/moral issues here about how government agencies use (or abuse) existing laws. Notably, the Royal Canadian Mint is a crown corporation that answers to the federal government, and the federal government is a target of the onecentnow.ca campaign, so this retroactive charge could be interpreted as politically motivated. Perhaps not.

And of course there are policy issues about how Crown Copyright ought to be used, or whether it should exist at all. In the USA, for instance, federal government documents, designs and publications are by law in the public domain.

But beyond these abstract concerns, there is a more crucial point: the Mint appears to be on the wrong side of the Canadian Copyright Act. As Howard Knopf points out in Excess Copyright, Canadian copyright law provides protection for 50 years after the death of the creator, while Crown Copyright extends 50 years after the date of publication.

The Canadian penny was designed by G.E. Kruger Gray in 1937. He died in 1943, meaning that the design for the Canadian penny went into the public domain 50 years later, in 1993. Which means that no one, including the Royal Canadian Mint, can claim ownership of the image, much less charge for its use.
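
A quick sketch of that arithmetic, under the two term rules just described (and ignoring the convention that terms actually run to the end of the calendar year):

```python
TERM_YEARS = 50  # both terms described above run for 50 years

def public_domain_year(anchor_year: int) -> int:
    """Year a work enters the public domain, counting 50 years from
    either the creator's death (ordinary works) or publication
    (Crown works)."""
    return anchor_year + TERM_YEARS

# Kruger Gray died in 1943, so the penny design has been free since:
print(public_domain_year(1943))  # 1993
# Even measured as Crown Copyright from the 1937 publication:
print(public_domain_year(1937))  # 1987
```
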

He notes further that it seems unlikely that any court would agree with the Mint that they own a copyright or trademark on the words “one cent.”

So it seems possible that the Royal Canadian Mint has developed an Intellectual Property policy that is claiming – and charging for – ownership where none exists.

One of the big worries about access to government data is privacy. Here’s a video of Dr. Ann Cavoukian, Ontario’s Privacy Commissioner, talking to engineers at Waterloo about the importance of designing for privacy:

Globally, issues about information privacy in the marketplace have emerged in tandem with the dramatic and escalating increase in information stored in electronic formats. Data mining, for example, can be extremely valuable for businesses, but in the absence of adequate safeguards, it can jeopardize informational privacy. Dr. Ann Cavoukian talks about how to use technology to enhance privacy. Some of the technologies discussed include instant messaging, RFID tags and Elliptic Curve Cryptography (ECC). Dr. Cavoukian then explains the “7 Privacy-Embedded Laws,” followed by a discussion of a biometric approach to encryption.

[video link]

Our friends at freeourdata.org.uk have an article about abolishing Crown Copyright in the UK. Canada suffers under the same sort of copyright policy on government documents and data, while in the USA, everything published by the government is by law in the public domain.

The key point is:

But the problem with crown copyright as it stands, and more importantly as it’s used, is that it’s used to restrict.

[link…]

There is an interesting article in the Globe today by Eric Sager, a professor of history at the University of Victoria, about access to the names of census respondents, both in censuses gone by and in those of the future.

I consider the privacy aspects of the Census to be sacred and so does StatCan. I fill it out because I know I am anonymous and that the data will be aggregated and therefore not traceable back to my personal address. Many people feel the same way – recall the Lockheed Martin online Census debacle. Fortunately for Canadians, we do not live in Nazi Germany, Stalin’s Ukraine, or Idi Amin’s Uganda, where censuses were explicitly used to target, kill or expel ‘undesirable’ populations or to mask the death toll of massive mistakes. Censuses can be, and have been, used to trace and target people on the basis of ethnicity, religion, sexual orientation, or race. The 2006 Census included a question asking whether or not we would be willing to consent to sharing our private information 92 years from now. I responded with an educated no.

Historians and genealogists argue that past census respondents’ names should be made available and that we should have future access to current censuses:

The census is the only complete inventory of our population, an indispensable historical record of the Canadian people. It’s critical to genealogy, our most popular form of history. Of all visitors to our national archives today, half are doing genealogical research. If you had ancestors in Canada in 1901 or 1911, you can find them in the censuses of those years, online from Library and Archives Canada. Your children will also be able to find their grandparents and great-grandparents in the censuses of the past century — but only after a legally mandated delay of 92 years.

Seems like our friends in the South are sharing their Census information, as the U.S. Census information is released

through their National Archives after a delay of 72 years. They apply the principle of “implied consent” — a principle well known to privacy experts. When completing their census forms, Americans are consenting to the present-day use of their information by the Census Bureau, and to its use by other researchers in the distant future. Americans do not complain about the future use of their information, and there is no evidence that public release after 72 years has made them reluctant to participate.
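
The release-delay arithmetic in the quote is simple enough to sketch in a few lines (illustrative only; the delays are the 92 and 72 years quoted above):

```python
RELEASE_DELAY_YEARS = {"Canada": 92, "US": 72}  # delays quoted above

def public_release_year(country: str, census_year: int) -> int:
    """First year the named census records become publicly available."""
    return census_year + RELEASE_DELAY_YEARS[country]

print(public_release_year("Canada", 1911))  # 2003 -- why the 1911 census is online
print(public_release_year("Canada", 2006))  # 2098 -- the "92 years from now" on the form
print(public_release_year("US", 1930))      # 2002
```
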

Spammers and telemarketers have been using “implied consent” when they send me unsolicited email garbage, drop popups on my computer or call my home to sell me stuff. I have to say there are dubious elements to this concept. I do, however, like the concept of informed consent, and I think the Census had it right by leaving it up to census respondents to decide if they wish to share their personal information with future generations of researchers – or potentially less progressive political regimes (see the question and your options). StatCan even provided a very extensive section on its historical and genealogical position. See the informed consent Question 8 on the short form and Question 53 on the long form. These are perfectly legitimate questions, supported with a ton of explanatory text, and a perfect compromise in this debate.

Prof. Sager makes a compelling argument for access to this private information, but he believes we should give up our right to informed consent – that we are not smart enough to understand on our own the importance of historical and genealogical research. I vehemently disagree with these points. He does, however, correctly point out the importance of the Census for research and decision making.

I would like to have free – as in no cost – access to the non-private Census data and maps in the same way we have free access to the forms and the methodological guides. Now that, along with informed consent, is what a democracy looks like!

One of the great data myths is that cost recovery policies are synonymous with higher data quality. Often the myth-making stems from effective communications by nations with heavy cost recovery policies, such as the UK, which often argues that its data are of better quality than those of the US, which has open access policies. Canada, depending on the dataset and the agency it comes from, sits at either end of this spectrum and often in between.

I just read an interesting study that examined open access versus cost recovery for two framework datasets. The researchers looked at the technical characteristics and use of datasets from jurisdictions of similar socio-economic status, size, population density, and government type (the Netherlands, Denmark, the German state of North Rhine-Westphalia, the US state of Massachusetts and the US metropolitan region of Minneapolis-St. Paul). The study compared parcel and large-scale topographic datasets, typically found as framework datasets in geospatial data infrastructures (see the SDI definition, page 8). Some of these datasets were free, some were extremely expensive, and all were under different licensing regimes that defined use. The researchers looked at both technical characteristics (e.g. data quality, metadata, coverage) and non-technical characteristics (e.g. legal access, financial access, acquisition procedures).
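
To see the shape of such a comparison, here is a hypothetical scoring matrix – the criteria names follow the examples above, but the scores are invented, not the study’s data. The point is that a dataset can score well on one axis and poorly on the other:

```python
# Invented scores on a 1-5 scale; criteria names follow the examples above.
datasets = {
    "parcels, centralized jurisdiction": {
        "technical":     {"data_quality": 4, "metadata": 4, "coverage": 5},
        "non_technical": {"legal_access": 2, "financial_access": 1,
                          "acquisition_procedures": 2},
    },
    "parcels, many local jurisdictions": {
        "technical":     {"data_quality": 2, "metadata": 2, "coverage": 3},
        "non_technical": {"legal_access": 4, "financial_access": 4,
                          "acquisition_procedures": 3},
    },
}

def average(scores: dict) -> float:
    return sum(scores.values()) / len(scores)

for name, axes in datasets.items():
    print(f"{name}: technical {average(axes['technical']):.1f}, "
          f"non-technical {average(axes['non_technical']):.1f}")
```
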

For parcel datasets, the study found that those assembled by a centralized authority were judged the most technically advanced; those assembled from multiple jurisdictions were of higher quality when standards existed or a central institution integrated them, and of poor quality when they did not, since the sets were not harmonized and/or coverage was inconsistent. Regarding non-technical characteristics, many datasets came at a high cost, most were not easy to access from one location, and there were a variety of access and use restrictions on the data.

For topographic information, the technical averages were less than ideal. On the non-technical criteria, access was impeded in some cases by the involvement of utilities (which tend toward cost recovery); in other cases multiple jurisdictions – over 50 for some – had to be contacted to acquire complete coverage, and in some cases coverage was simply not complete.

The study’s hypothesis was:

that technically excellent datasets have restrictive-access policies and technically poor datasets have open access policies.

General conclusion:

All five jurisdictions had significant levels of primary and secondary uses but few value-adding activities, possibly because of restrictive-access and cost-recovery policies.

Specific Results:

The case studies yielded conflicting findings. We identified several technically advanced datasets with less advanced non-technical characteristics…We also identified technically insufficient datasets with restrictive-access policies…Thus cost recovery does not necessarily signify excellent quality.

Although the links between access policy and use and between quality and use are apparent, we did not find convincing evidence for a direct relation between the access policy and the quality of a dataset.

Conclusion:

The institutional setting of a jurisdiction affects the way data collection is organized (e.g. centralized versus decentralized control), the extent to which data collection and processing are incorporated in legislation, and the extent to which legislation requires use within government.

…We found a direct link between institutional setting and the characteristics of the datasets.

In jurisdictions where information collection was centralized in a single public organization, datasets (and access policies) were more homogenous than datasets that were not controlled centrally (such as those of local governments). Ensuring that data are prepared to a single consistent specification is more easily done by one organization than by many.

…The institutional setting can affect access policy, accessibility, technical quality, and consequently, the type and number of users.

My Observations:
It is really difficult to find solid studies like this one that systematically look at both technical and access issues related to data. It is easy to find off-the-cuff statements without sufficient backing proof, though! While these studies are a bit of a dry read, they demonstrate the complexities of the issues, try to tease out the truth, and reveal that there is no one-stop shopping for data at any given scale in any country. In other words, there is merit in pushing for some sort of centralized, standardized and interoperable way – which could also mean distributed – to discover and access public data assets. In addition, there is an argument to be made for making those data freely (no cost) accessible in formats we can readily use and reuse. This of course includes standardizing licensing policies!

Reference: van Loenen and de Jong, “Institutions Matter: The Impact of Institutional Choices Relative to Access Policy and Data Quality on the Development of Geographic Information Infrastructures,” in Research and Theory in Advancing Spatial Data Infrastructure Concepts, ed. Harlan Onsrud (ESRI Press, 2007).

If you have references to more studies, send them along!

Says Jesse Robbins:

The Istanbul Declaration (see: pdf) signed at the [OECD World Forum] calls for governments to make their statistical data freely available online as a “public good.” The declaration also calls for new measures of happiness and well-being, going beyond just economic output and GDP. This requires the creation of new tools, which the OECD envisions will be a “wiki for progress.” Expect to hear more about these initiatives soon.

From the Declaration (pdf):

A culture of evidence-based decision making has to be promoted at all levels, to increase the welfare of societies. And in the “information age,” welfare depends in part on transparent and accountable public policy making. The availability of statistical indicators of economic, social, and environmental outcomes and their dissemination to citizens can contribute to promoting good governance and the improvement of democratic processes. It can strengthen citizens’ capacity to influence the goals of the societies they live in through debate and consensus building, and increase the accountability of public policies.

Hear, hear!

Putting Canadian “Piracy” in Perspective, a video from Geist and Albahary, is a great way to present an argument. In Geist’s words:

over the past year, Canadians have faced a barrage of claims painting Canada as a “piracy haven.” This video – the second in my collaboration with Daniel Albahary – moves beyond the headlines to demonstrate how the claims do not tell the whole story.

The video also uses quite a bit of public and private sector data to support its argument. This, to me, is what public data are for, and this is what democracy looks like – when civil society has access to the data it requires to keep its government accountable, keep citizens informed, and temper industry desires with the public interest!

One of the cultural issues that has become pervasive of late is the proliferation of policies and decisions based on assumptions rather than facts – and, in the case of the very powerful IP lobby against Canada in the cultural sector, on really biased reports grounded not in facts but in an industry’s desires and self-interest. Look for the sources of the data and the methodology in all reports – even in this great video! Geist and Albahary do a great job of showing the gap between what is being said and repeated (memes) about the cultural industry in Canada and reality.

It is interesting that the video ends with a slide acknowledging the photos used, the music heard, the creators of the video and the license, but not all the data sources in the charts! Some of the data references appear in some of the bar charts, while most statements are referenced with their source at the bottom of the slide. I always look for data references – else how can I go back and verify what was claimed?

The data sources for the charts were:

  • Hollywood Studio Revenue Growth – Data Source unknown
  • Top Hollywood International Markets – Data Source unknown
  • Canadian Music Releases – Statistics Canada
  • Canadian Artist Share of Sales – Canadian Heritage Music Industry Profile
  • Digital Music Download Sales Growth – Data Source unknown
  • Private Copying Revenues 2000-2005 – Data Source unknown
  • RCMP Crime Data – Data Source unknown, but presumably the RCMP

*************************************
NOTE: See the comments on this post; the references to the data, quotes and reports that were not listed in the credits or with the information in the film are now fully described on Michael Geist’s blog here.

This is an older post (May 2007) from Michael Geist regarding Canada’s National Science and Tech Strategy, but it is well worth a read. He argues that opening up government-funded R&D data will result in more innovation. He has two specific recommendations:

  1. The government should identify the raw data under its control and set it free. Onerous licensing conditions are a hindrance to commercialization and to accountability for taxpayer-funded research.
  2. Federal research granting institutions should build open access requirements into their research mandates.

From the post:

I argue that maximizing the value of Canada’s investment in research requires far more than tax breaks and improved accountability mechanisms. Instead, government must rethink how publicly-funded scientific data and research results flow into the hands of researchers, businesses, and individuals.

Achieving that goal requires action on two fronts. First, the government should identify the raw, scientific data currently under its control and set it free. Implementing expensive or onerous licensing conditions for this publicly-funded data runs counter to the goals of commercialization and to government accountability for taxpayer expenditures….

Second, Ottawa must pressure the three federal research granting institutions to build open access requirements into their research mandates.

[more…]
