DBpedia is a community effort to extract structured information from Wikipedia and to make this information available on the Web. DBpedia allows you to ask sophisticated queries against Wikipedia and to link other datasets on the Web to Wikipedia data.
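To make those "sophisticated queries" concrete: DBpedia exposes its extracted data through a public SPARQL endpoint, and any client that can parse the results format can use it. Here is a rough Python sketch of the shape of such a query; the ontology terms in the query and the inlined sample response are illustrative assumptions, not output captured from the live service:

```python
import json

# A SPARQL query of the kind one might send to DBpedia's public
# endpoint (http://dbpedia.org/sparql). The dbo:/dbr: terms follow
# the DBpedia ontology naming style but are illustrative here.
QUERY = """
SELECT ?city ?population WHERE {
  ?city a dbo:City ;
        dbo:country dbr:Canada ;
        dbo:populationTotal ?population .
} LIMIT 5
"""

# The endpoint can return results as SPARQL JSON; a hypothetical
# response is inlined so the sketch runs without a network call.
sample_response = """{
  "results": {"bindings": [
    {"city": {"value": "http://dbpedia.org/resource/Toronto"},
     "population": {"value": "2731571"}}
  ]}
}"""

def extract_rows(raw):
    """Flatten SPARQL JSON bindings into (city URI, population) pairs."""
    data = json.loads(raw)
    return [(b["city"]["value"], int(b["population"]["value"]))
            for b in data["results"]["bindings"]]

print(extract_rows(sample_response))
```

The point is the linking: the URI in each binding is a stable identifier other datasets on the Web can point at.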
It seems that there is an explosion of data visualization work being done on the political process and the Presidential election in the US of A. I just landed on PresidentialWatch08, a site for all you political junkie/blogosphere/dataviz fans. They’ve got a lovely map of influential political blogs and news sites. The project seems to be run by a web analytics company, linkfluence.
Anyone planning anything similar in Canada?
[via infosthetics]
A short promotional video of the Stan’s Cafe performance installation, Of All The People In All The World, in which rice is used to represent human statistics.
[via infosthetics]
Civic data comes in many guises, and this is a neat way to analyze a politician’s speeches:
Last year’s 2007 State of the Union Tag Cloud was such a hit, I decided to follow up again this year…
[from Boing Boing]
Would be nice to do this systematically for every politician, maybe just based on their web pages? Or published documents, anyway.
Well, why not? Why not, indeed? Here is:
Stephen Harper:
Stephane Dion:
Jack Layton:
Gilles Duceppe:
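Under the hood, a tag cloud like these is nothing more than word frequencies with the boring words thrown out. A minimal Python sketch of that step (the stopword list and the sample "speech" are made up for illustration, not taken from any politician's actual pages):

```python
import re
from collections import Counter

# A tiny illustrative stopword list; a real cloud would use a fuller one.
STOPWORDS = {"the", "and", "of", "to", "a", "in", "we", "our", "that"}

def tag_weights(text, top_n=5):
    """Count word frequencies for a tag cloud, dropping stopwords."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return counts.most_common(top_n)

# Hypothetical speech text, for demonstration only.
speech = ("We will cut taxes. Taxes are too high, and we will "
          "cut the GST so families keep more of their money.")
print(tag_weights(speech, top_n=3))
```

Scale the font size of each word by its count and you have the cloud. Doing this systematically would just mean pointing the scraper at each politician's published pages or documents.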
MySociety has released some very useful and sexy interactive travel-time maps for the UK using public data.
This is a most interesting use of political geodemographics – The Copyright MPs.
Ted at GANIS blog introduces this very interesting data visualization initiative – The British Columbia Atlas of Wellness.
Environment XML is now live (http://www.haque.co.uk/environmentxml/live/), enabling people to tag and share remote realtime environmental data. If you are using Flash, Processing, Arduino, Director or any other application that parses XML, then you can both respond to and contribute to environments and devices around the world.
EnvironmentXML proposes a kind of “RSS feed” for tagged environmental data, enabling anyone to release realtime environmental data from a physical object or space in XML format via the internet in such a way that this content becomes part of the input data to spaces/interfaces/objects designed by other people.
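Since the whole point is that anything able to parse XML can join in, a consumer can be sketched in a few lines of Python. Note that the element and attribute names in the sample feed below are hypothetical, not the actual EnvironmentXML schema:

```python
import xml.etree.ElementTree as ET

# A hypothetical EnvironmentXML-style feed; the real markup on
# haque.co.uk may differ. This only sketches how an XML-parsing
# client (Processing, Flash, or Python alike) would consume
# tagged realtime environmental data.
SAMPLE_FEED = """
<environment>
  <sensor id="office-light" type="light" units="lux">412</sensor>
  <sensor id="roof-temp" type="temperature" units="C">18.5</sensor>
</environment>
"""

def read_sensors(xml_text):
    """Return {sensor id: (type, float value, units)} from a feed."""
    root = ET.fromstring(xml_text)
    return {s.get("id"): (s.get("type"), float(s.text), s.get("units"))
            for s in root.findall("sensor")}

print(read_sensors(SAMPLE_FEED))
```

A space or object publishing such a feed becomes input data for interfaces designed by complete strangers, which is exactly the "RSS feed for environments" idea.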
[more…]
From Wired:
Sources at Google have disclosed that the humble domain, http://research.google.com, will soon provide a home for terabytes of open-source scientific datasets. The storage will be free to scientists and access to the data will be free for all. The project, known as Palimpsest and first previewed to the scientific community at the Science Foo camp at the Googleplex last August, missed its original launch date this week, but will debut soon.
Building on the company’s acquisition of the data visualization technology, Trendalyzer, from the oft-lauded, TED presenting Gapminder team, Google will also be offering algorithms for the examination and probing of the information. The new site will have YouTube-style annotating and commenting features.
[Via Open Access News]
The Federation of Canadian Municipalities has just released its Quality of Life Reporting System (Press Release) – Trends & Issues in Affordable Housing and Homelessness (Report in PDF). At the end of the report (and of this post) you will find the data sources required to produce this important report on the state of housing and homelessness in Canadian cities.
NOTE – these public datasets were purchased to do this analysis. It costs many thousands of dollars to acquire these public data: public data used to inform citizens on a most fundamental issue, shelter for Canadians. Statistics Canada does not generally do city-scale analysis, as it is a federal agency, and the provinces will generally not do comparative analysis beyond the cities in their own jurisdictions. This type of cross-country, cross-city analysis therefore falls to a not-for-profit organization or the private sector, and we are very fortunate that the FCM has the wherewithal to prepare these reports on an ongoing basis. It is an expensive proposition: subject-matter specialists are required, much city consultation is needed for context and validation, the contextual data requires specially designed surveys, and the datasets themselves are extremely costly. There is no real justification for this cost beyond cost-recovery policies, and Statistics Canada and CMHC are the worst in this regard. The other datasets used are not readily accessible to most.
The documents referred to in this report were, however, freely available but not readily findable or discoverable, as there is no central repository or portal where authors can register and catalogue their reports. This is unfortunate, as it takes a substantial amount of effort to dig up specialized material from individual cities, provinces, federal departments and NGOs.
Public (but not free) Datasets Used in the report:
- Statistics Canada – Population Census, Population and Dwelling Counts, Age and Sex, Families and Households, and Housing and Shelter Costs, Tax Filer Statistics for economic, some population data and Migration Estimates.
- Canada Mortgage and Housing Corporation – customized data on the cost of housing, the availability of housing, vacancy rates, and the Housing Starts and Completions Survey.
- Citizenship and Immigration Canada – volume of immigration, demographics of immigrants and destinations in Canadian cities.
- Human Resources and Social Development Canada – Minimum wage database, Homeless Individuals and Families Information System (HIFIS).
- Homeless and Social Housing Data were derived from 22 FCM QOLRS participating municipalities.
Report and Press Release Via Ted on the Social Planning Network of Ontario Mailing List.