August 2007


Real time data from the Hudson

Imagine the Ottawa River Keeper having access to this type of data! Or for the folks along the St-Laurent Sea Way! How wonderful for citizens to be able to view a 3D model of their rivers and their conditions at any time of day!

This is exactly what is going on along the Hudson, where IBM and the Beacon Institute, a nonprofit scientific-research organization in New York, are collaborating on the development of the River and Estuary Observatory Network (REON), described as

distributed-processing hardware and analytical software, the system designed to take heterogeneous data from a variety of sources and make sense of it in real time. The software learns to recognize data patterns and trends and prioritizes useful data. If some data stream begins to exhibit even minor variations, the system automatically redirects resources toward it. The system will also be equipped with IBM’s visualization technologies; fed with mapping data, they can create a virtual model of the river and simulate its ecosystem in real time.
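The "redirect resources" idea in that description can be illustrated with a toy sketch. This is purely illustrative and not IBM's actual software: keep a rolling window of readings per sensor, and bump a stream's sampling priority when its latest reading deviates sharply from its recent mean.

```python
from collections import deque
import statistics

class StreamMonitor:
    """Toy sketch of REON-style stream prioritization (illustrative only):
    track a rolling window per sensor and boost the priority of any stream
    whose latest reading deviates strongly from its recent mean."""

    def __init__(self, window=20, threshold=2.0):
        self.window = window        # readings kept per sensor
        self.threshold = threshold  # deviation, in standard deviations
        self.history = {}           # sensor id -> deque of recent readings
        self.priority = {}          # sensor id -> sampling priority

    def observe(self, sensor_id, value):
        buf = self.history.setdefault(sensor_id, deque(maxlen=self.window))
        self.priority.setdefault(sensor_id, 1)
        if len(buf) >= 2:
            mean = statistics.fmean(buf)
            stdev = statistics.stdev(buf)
            if stdev > 0 and abs(value - mean) / stdev > self.threshold:
                # the stream is misbehaving: redirect attention toward it
                self.priority[sensor_id] += 1
        buf.append(value)
```

A stream of steady salinity readings keeps priority 1; a sudden spike raises it, which is the minor-variation trigger the article describes.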

The types of data that will be gathered from sensor reports include

temperature, pressure, salinity, dissolved oxygen content, and pH levels, which will indicate whether pollutants have entered the river. Other sensors will be directed toward sea life, says Nierzwicki-Bauer, and will be used to study species and determine how communities of microscopic organisms change over time.

It is expected that many hundreds of sensors will be required for this project, which will rely on fibre optic cables and wireless technologies. Eventually the system will be connected to ocean sensor and monitoring networks.

REON

Ah! Nice to see some exciting data collecting activities!

Via:
Networking the Hudson River: The Hudson could become the world’s largest environmental-monitoring system. By Brittany Sauser.

Udell & Greg Elin of the Sunlight Foundation

Jon Udell interviews Greg Elin, chief info architect of the Sunlight Foundation, which aims to make the operation of Congress and the U.S. government more transparent and accountable. It's interesting to follow this debate in the USA, where government data and reports are de facto public domain (though true access is a different story), compared with Canada, where government data is often covered by restrictive copyright provisions, starting with Crown Copyright. Says Udell about the talk:

Having surveyed a wide range of government data sources, Greg’s conclusion is that the future is already here, but not yet evenly distributed. There are pockets within the government where data management practices are excellent, and large swaths where they are mediocre to horrible. The Sunlight Foundation has an interesting take on how to bootstrap better data practices across the board. By demonstrating them externally, in compelling ways, you can incent the government to internalize them:

The same could be said here, but we are behind the curve: there is a big hurdle to get over in simply convincing the Canadian government that the proven wisdom of US government data policy is compelling: making government data available spurs innovation. Restricting it restricts innovation.

See the Interviews with Innovators page, on IT Conversations.

Canadian Environmental Sustainability Indicators

Check out the Statistics Canada Canadian Environmental Sustainability Indicators report.

It discusses three main indicators:

The air quality indicator tracks Canadians’ exposure to ground-level ozone—a key component of smog and one of the most common and harmful air pollutants to which people are exposed.

The greenhouse gas emissions indicator tracks the annual releases of the six greenhouse gases that are the major contributors to climate change. The indicator comes directly from the greenhouse gas inventory report prepared by Environment Canada for the United Nations Framework Convention on Climate Change and the Kyoto Protocol.

The freshwater quality indicator reports the status of surface water quality at selected monitoring sites across the country. For this first report, the focus of the indicator is on the protection of aquatic life, such as plants, invertebrates and fish.

The report also has some links in the references to the data used to build these indicators. Also check out the methodology section to get the lowdown on how to use these data. Perhaps some data and ideas to play with!

But now that we know that

the three indicators reported here raise concerns for Canada’s environmental sustainability, the health and well-being of Canadians, and our economic performance. The trends for air quality and greenhouse gas emissions are pointing to greater threats to human health and the planet’s climate. The water quality results show that guidelines are being exceeded, at least occasionally, at most of the selected monitoring sites across the country.

What do we do?

Malamud & Case Law

From O’Reilly Radar:

Carl Malamud has this funny idea that public domain information ought to be… well, public. He has a history of creating public access databases on the net when the provider of the data has failed to do so or has licensed its data only to a private company that provides it only for pay. His technique is to build a high-profile demonstration project with the intent of getting the actual holder of the public domain information (usually a government agency) to take over the job.

Carl’s done this in the past with the SEC’s Edgar database, with the Smithsonian, and with Congressional hearings. But now, he’s set his eyes on the crown jewels of public data available for profit: the body of Federal case law that is the foundation of multi-billion dollar businesses such as WestLaw.

In a site that just went live tonight, Carl has begun publishing the full text of legal opinions, starting back in 1880, and outlined a process that will eventually lead to a full database of US Case law. Carl writes:

1. The short-term goal is the creation of an unencumbered full-text repository of the Federal Reporter, the Federal Supplement, and the Federal Appendix.
2. The medium-term goal is the creation of an unencumbered full-text repository of all state and federal cases and codes.

Link to the database.

GeoData Alliance

The GeoData Alliance is a nonprofit organization open to all individuals and institutions committed to using geographic information to improve the health of our communities, our economies, and the Earth.

The purpose of the GeoData Alliance is to foster trusted and inclusive processes to enable the creation, effective and equitable flow, and beneficial use of geographic information.


Good news: Elections Ontario makes the postal code/electoral riding data file available:

The Postal Codes by Electoral Districts (ED) file provides a link between the six-character postal code and Ontario’s new provincial electoral districts. It is a zip file containing three files that can be loaded and used in spreadsheets and databases. The first is a text file with the ED names; the second file contains the postal codes that have been assigned to a single ED; and the third file contains postal codes that have been found in multiple EDs. This third file repeats the postal code for each ED in which it is found.
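As a sketch of how those three files might be stitched together — the file names and two-column CSV layout here are pure assumptions, since I have not opened the zip — one could build a single postal-code-to-riding lookup like this:

```python
import csv
from collections import defaultdict

def load_postal_to_ed(names_path, single_path, multi_path):
    """Build a postal-code -> [ED name] lookup from the three files.
    Assumes each file is a two-column CSV: (ed_id, ed_name) for the
    names file, and (postal_code, ed_id) for the other two."""
    ed_names = {}
    with open(names_path, newline="") as f:
        for ed_id, name in csv.reader(f):
            ed_names[ed_id] = name

    lookup = defaultdict(list)
    # The single-ED and multi-ED files can be merged the same way,
    # since the multi-ED file repeats the postal code per district.
    for path in (single_path, multi_path):
        with open(path, newline="") as f:
            for postal_code, ed_id in csv.reader(f):
                lookup[postal_code].append(ed_names.get(ed_id, ed_id))
    return dict(lookup)
```

A postal code found in a single district maps to a one-element list; a boundary-straddling code maps to every district it appears in.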

Have not looked at it yet: any comments about formats/license etc?

(from the civicaccess.ca mailing list).

This CBC.ca video gives a brief look at how 2D and 3D street view data are collected. In this case the city is Toronto and the data collector is Tele Atlas. The things cartographers do to make maps! Tele Atlas seems to be selling georeferenced landmarks, street networks, and a variety of other data it collects simply by driving the streets with cameras and GPS mounted on the roofs of cars. At 500 km a day and terabytes of data, these folks are collecting and selling tons of the geo-information we like to play with in Google Earth, that helps us find places in MapQuest, and that allows city planners or police forces to prepare evacuation plans, understand the characteristics of a planned protest route, or pinpoint the address in a 911 call.

The video also briefly discusses privacy issues. It seems the street is public space: if you happen to be naughty going into some tawdry establishment and your act happens to be caught on film, well, so be it; either behave or accept the digital consequences of your private acts in public space, or so the video suggests!

Regarding access to these data, well, my guess is a big price tag. It is a private company after all!

The Information Machine (1958)

The Information Machine is a short film written, produced and directed by Charles and Ray Eames for the IBM Pavilion at the 1958 Brussels World’s Fair, with animation by John Whitney and music by Elmer Bernstein. The topic is primarily the computer in the context of human development, but I think it also represents our fascination with, and need to, collect and organize data and abstract the world around us. Since it was written in 1958 it does go on about he, his, him, man and men ad nauseam; it nonetheless remains a cute, informative short film, in the public domain and captured in the Internet Archive, and it represents ideas as relevant to us today as they were then!

via: Information Aesthetics 

Interview: Jon Udell

Jon Udell writes about the outside edge of what’s happening on the web (including lawnmowers), but his focus is often as much about how regular, non-digerati people might be helped by new changes and technologies. Formerly the blogger-in-chief at InfoWorld, he’s now working with Microsoft. He’s been writing recently about public data, and I wanted to find out why.

1. you seem to be spending much time recently writing about access to public data … why is that?

I’ve always thought the real purpose of information technology was to harness our collective intelligence to tackle complex and pressing problems. When I heard Doug Engelbart’s talk at the 2004 Accelerating Change conference, I realized for the first time how all of his work points toward that one goal. Graphical user interfaces, networks, hyperlinked webs of information — for him, these are all means by which we “augment” our human capabilities so we can have some hope of dealing with the challenges we face as a species.

In that context, getting data into shared information spaces is just part of the story. We’ll also need to be able to share the tools we use to analyze and interpret the data, and the conversations we have about the analysis and interpretation.

2. what do you think is the most compelling argument for making public data available to citizens?

Well it’s ours, our taxes paid for it, so we should have it. But the compelling reason is that we need more eyeballs, hands, and brains figuring out what’s going on in the world, so that when we debate courses of action we can ground our thinking in the best facts and interpretations.

3. are you convinced by any arguments *against* making public data available to citizens?

Here’s an argument I don’t buy: That amateur analysts will do more harm than good. I don’t buy it because there will be checks and balances. Those who don’t cite data will be laughed at. Those who do cite data but interpret it incorrectly will be corrected. Those who do great work will develop reputations that are discoverable and measurable.

Here’s an argument I do buy: There’s the risk of violating privacy. The District of Columbia, for example, has released a lot of data but has postponed releasing adult arrests and charges until the location information can be aggregated. We will increasingly have to make these kinds of calls.

5. public data is an issue that most people will have trouble getting excited about. how do you think “data activists” should approach it?

The best advice I’ve heard comes from Tom Steinberg, founder of MySociety.org. He counsels activists to use data in ways that matter directly to people. Suppose you could get geographic data on planned highway routes, for example. Nobody cares, until you connect the dots and show people their houses will have to be bulldozed to make way for it. Then they really care.

6. in your experience with government officials, how have *they* reacted to your requests for data?

When I started asking my local police department for crime data, they stonewalled. Eventually I had to get a lawyer to write them a letter citing our state’s ‘Right to Know’ act, and we were both unhappy about having to do that.

But once I met with the police chief and explained my interest in exploring both local patterns as well as this whole general process, he was OK with that. Better than OK, actually. I think he was relieved when he saw that some questions people have been speculating about might now be discussed in a more rational way. And he’s really excited by the prospect of geographical analysis because they haven’t had that capability.

8. what do you think are the connections between open access to public data and other similar movements – free culture, free software etc?

There’s an arc that runs from free and open-source software, to open data, to Web 2.0-style participation, and now to the collaborative use of software, services, and public data in order to understand and influence public policy.

9. with your crystal ball, where do you think the confluence of these movements will take us in, say, 5 years?

I’m sure it won’t happen that soon, but here’s what I’d like to see. Imagine some local, state, or national debate. The facts and interpretations at issue are rarely attached to URLs, much less to primary sources of data at those URLs and to interactive visualizations of the data. We spend lots of time arguing about facts and interpretations, but mostly in a vacuum with no real shared context, which is wildly unproductive. If we could establish shared context, maybe we could argue more productively, and get more stuff done more quickly and more sanely.

I have been paying attention to infrastructures lately. Recently, I seem to be coming across more stories about infrastructure failures: submarine cables in Asia, or in the Ring of Fire. The most recent failure was in Minneapolis. Today’s Globe and Mail online has a story that links to some AP video, and this one in particular – U.S. Infrastructure Under Scrutiny – gives a good review of how engineers gather their primary data, the nature of that data, and the making of safety reports. Seems like those reports get shelved a lot! William Ibbs from UC Berkeley, an expert on construction risk, said it well with a knowing smirk on his face:

well, ah, we’ve had, ah, maybe some other social priorities for the past few years in the nation, and public works have taken, ah, a bit of a back seat.

The map below shows the distribution of deficient bridges in the US. I thought I was hearing more stories, and this data seems to support that my assumptions were not entirely off base!

[Map: Bridges US]

Then I wondered about Canada, so I did some superficial digging and found the following report – The Age of Public Infrastructure – produced by Statistics Canada. The great thing about all of their reports is that you can access the methodology documents, data sources and contacts, which is great education material for amateur data geeks who want to collect data themselves and find a systematic and statistically sound way to do so. I also found an Infrastructure Canada report that discusses the Government’s infrastructure assets and their management. The collapse in Minneapolis created a media context and receptivity on the subject, as seen here – Canada’s infrastructure needs urgent attention – while some specialized think tanks look at particular infrastructures related to investment and stock prices in the energy industry – Aging Energy Infrastructure Could Drive Molybdenum Demand Higher – which is loaded with data particular to engineers in that field.

Why talk about that here? Well, mostly because infrastructure is a boring thing we rarely think about, yet there is a ton of citizen money locked into these huge material physical artefacts; because there is little citizen-generated data on the topic; and because the data available, and the decisions being made, rarely have a price tag or the name of the responsible agent attached to them! Yet without infrastructure we cannot function. Infrastructure is what distinguishes a good city to live in from a not-so-good one, and infrastructure is an inseparable part of our human habitat.

Imagine a concerted effort by citizens to collect data about satellite dishes, receiving ground stations, server farms, ISP offices, aging bridges, cool sewers; following the complete cycle of one’s local water purification plant or telephone switching station; tracing where one’s poo goes once flushed, or where one’s data is stored; and sharing and visualizing all that data on a map. We are starting to see some really interesting adventurer/art urban exploration projects, and some boyz navigating the 3D elements of a city’s hardware in parkour. I love stuff like this Pothole Reporter. Could we develop collaborative tools to report missing manhole covers, Ottawa’s thriving roadside ragweed cultivations, where the public washrooms and water fountains are (and are not), Montreal’s missing trees in sidewalk planters (Michael‘s idea on location portal content gathering) and so on?
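A collaborative reporting tool like that needs little more than a shared record format and a way to group reports into map layers. Here is a minimal sketch, with a hypothetical schema; the categories are just examples from the post:

```python
from collections import defaultdict
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CitizenReport:
    """One citizen-submitted observation (hypothetical schema)."""
    category: str      # e.g. "pothole", "missing manhole cover"
    lat: float         # latitude of the observation
    lon: float         # longitude of the observation
    description: str = ""
    reported_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

def group_by_category(reports):
    """Group reports into per-category lists, ready to plot as map layers."""
    layers = defaultdict(list)
    for r in reports:
        layers[r.category].append(r)
    return dict(layers)
```

Each category then becomes one layer on the shared map, which is exactly the kind of citizen-generated geo-data the post is wishing for.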
