From Wikipedia, the free encyclopedia.

The term "Web 2.0" refers to the development of the World Wide Web, including its architecture and its applications.

As used by its proponents, the phrase currently refers to one or more of the following:

  • a transition of websites from isolated information silos to sources of content and functionality, thus becoming a computing platform serving web applications to end users
  • a social phenomenon referring to an approach to creating and distributing Web content itself, characterised by open communication, decentralization of authority, freedom to share and re-use, and "the market as a conversation"
  • more organized and categorized content, with a more developed deep-linking web architecture
  • a shift in the economic value of the web, potentially equalling that of the dot-com boom of the late 1990s

However, a consensus upon its exact meaning has not yet been reached. Skeptics argue that the term is essentially meaningless, or that it means whatever its proponents decide that they want it to mean in order to convince the media and investors that they are creating something fundamentally new, rather than continuing to develop and use well-established technologies.

Many recently developed concepts and technologies are seen as contributing to Web 2.0, including weblogs, wikis, podcasts, RSS feeds and other forms of many-to-many publishing; social software, web APIs, web standards, online web services, AJAX, and others.
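Several of these pieces are simple to work with programmatically; an RSS feed, for instance, is just XML, and a minimal consumer needs nothing beyond the standard library (the feed below is an invented example, not from any real weblog):

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 fragment, as a weblog might publish it.
RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <item>
      <title>Hello Web 2.0</title>
      <link>http://example.com/hello</link>
    </item>
  </channel>
</rss>"""

def items(feed_xml):
    """Return (title, link) pairs for each <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [(i.findtext("title"), i.findtext("link"))
            for i in root.iter("item")]

print(items(RSS))
```

Aggregators and podcast clients are, at heart, loops over exactly this kind of parse.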

The concept differs from Web 1.0 in that it is a move away from websites, email, using search engines and surfing from one website to the next. Others are more skeptical that such basic concepts can be superseded in any real way by those listed above.


From the Fringe Department…

This article is making the rounds today. It's about a 15-year-old boy in Nepal (see the picture) who is said to have been meditating under a tree without food or water, or even moving or going to the bathroom, for six months. Interesting. Although unfortunately, now that he has been "discovered," I wonder if he'll have any peace from the crowds of onlookers? Jeez, can't a guy just be left alone to meditate without food or water for six months in peace anymore? What is this world coming to?


London Tube Lines on Google Maps

Jonathon Scott thought it would be cool to create a map showing all the London Underground lines overlaid on Google Maps. The map really isn't usable as a transit mashup, since stations are not displayed as markers. Stations were left out over worries about load times (read more here), but I truly hope that someone takes it on to create a full transit map for London. As the OnNYTurf NYC transit map shows, it is possible to use the Google Maps API to include many station stops.

Killer Data



Editorial: Killer Data – Google Maps/DigitalGlobe and LiDAR – Part 1

By: Simon Greener

(Ed.Note: Part one of this editorial addresses issues related to data accessibility, while the second, to appear next week, will discuss why LiDAR might be the killer data we all want and need.)

I have been musing about some threads relating to data for a while and I've noticed they have been tangling people up. These threads resonated at the recent Spatial Sciences Institute (SSI) Conference held in Melbourne, Sept. 12th-16th. I want to unravel a few threads in light of that conference.

It has struck me for quite a while how crazy things are in our industry. Let me give an analogy that describes, until recently, the bind in which we found ourselves. Your hi-fi system dies. Off to the hi-fi shop you go and, before you know it, you are walking out of the shop with a system so elaborate it would take an expert to use. Needless to say, your wallet is very much lighter. Your new system, you tell your friends, is the best – it has features, features and more features. Then the awful truth dawns on you: you can't afford the music CDs!

I have heard lots and lots of people tell me how they have the biggest and best GIS around. (What my previous employer's CEO called a "Rolls-Royce GIS.") But then they go on to bemoan the fact that they can neither access nor afford the data.

In the Australian environment the question of data access and pricing is a perennial one. So it was no surprise that it was talked about constantly at the recent GITA Australia (reviewed here) and SSI conferences.

Stuart Nixon: The "Where" of Data
Stuart Nixon, CEO of ERMapper (based in Perth, Western Australia), gave a keynote presentation on the first day of the SSI conference. He provided a marketing/financial view on what, in his view, is holding back the geospatial industry.

Nixon explained how he thought we were good at capturing data; pretty good at managing it (though I beg to differ there); but pretty awful at delivery. By "delivery," he wasn't referring to its technical aspects per se; rather, he meant the way in which we put our services in the hands of the masses. In his view the latter needs to change, and change quickly. To create a framework for discussing it, he characterized the spatial market as being:

  1. Small but expert;
  2. Something that, when compared to search engines, is not used by everyone;
  3. Restricted by business and data copyright models;
  4. Characterized by poorly integrated data; and
  5. Distinguished by having most of its data locked up.

In Nixon’s view, accessibility is the issue.

"What" and "Where"
To illustrate, Nixon compared the value of the "What" industry (i.e. search engines like Google, MSN, Yahoo!, etc.) to the "Where" industry (us). The "What" industry, in his view, generates money, whereas the "Where" industry spends it. The "What" industry does this by putting access to information about things into the hands of anyone. This is the value of "What" – accessibility. Nixon presented a diagram that he used to compare the two industries.


Advertising is the area in which the "What" industry makes its money, and it is the one our industry needs to work on to turn itself from a cost sink into a revenue generator. He also thinks that "GeoLocation," i.e., geocoding services, is the area stopping the industry from generating revenue in the advertising space. He stated quite emphatically that "geocoding does not exist." So Nixon believes that the value of "Where" – on which our industry's future rests – has not been addressed.

A Way Forward?
Nixon thinks there is a way forward, but first three missing pieces need to be put in place.

1. We can't locate (geocode) things. Geocoding for Nixon is the equivalent of a Google "search." He thinks there is no unified approach and that it is costly (currently about one cent per transaction). Nixon stated that the industry must stop seeing geocoding as the business model and instead see it as the enabling technology for value-added services that anyone can use. Part of the issue, Nixon believes, is the lack of free street databases on which to base geocoding.

I take a somewhat different view than Nixon on this. I simply don't believe that geocoding is a geospatial process. My view is that it is a semantic matching problem in which two fuzzy sentences (addresses) are matched for the purpose of transferring two attributes: a latitude and a longitude. Yes, geocoding requires some geoprocessing to create a matchable dataset (from multiple spatial sources), but it does not need a GIS to execute a geocode operation.
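To make that concrete, here is a toy sketch of geocoding as pure string matching (my illustration only, not Nixon's proposal; the reference addresses and coordinates are invented):

```python
from difflib import SequenceMatcher

# Hypothetical reference dataset: canonical address -> (lat, lon).
REFERENCE = {
    "10 downing street, london": (51.5034, -0.1276),
    "221b baker street, london": (51.5238, -0.1586),
}

def geocode(query, threshold=0.6):
    """Match a free-form address against the reference set and
    transfer its latitude/longitude -- no GIS engine involved."""
    query = query.lower().strip()
    best, score = None, 0.0
    for addr, coords in REFERENCE.items():
        s = SequenceMatcher(None, query, addr).ratio()
        if s > score:
            best, score = coords, s
    return best if score >= threshold else None

print(geocode("10 Downing St, London"))
```

A production matcher would normalize abbreviations and use a real address parser, but the point stands: the matching step itself involves no spatial operations at all.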

2. Current data licensing models and access modes are too restrictive. Nixon highlighted the dilemma when he asked the rhetorical question: Why is our satellite imagery rotting in tape libraries when it should be getting used? He also underlined his passion on this topic when he proclaimed that there is better, free, detailed coverage of Mars than of Earth! Sure, Australia has some large datasets, but current license and access agreements are killing access.

Part of the problem is that the major suppliers of data are the government agencies. Some of the data are free but some are not. The problem is compounded by the fact that much of the data (e.g. topographic data) was captured in the days before economic rationalism saw governments retreat from major infrastructure projects like mapping.

This is not an issue that is isolated to Australia. In my view some of the current owners of navigable road datasets will face this dilemma if they continue to price themselves out of Internet portal use. The restrictions on use of road data inside Google Maps are not Google's fault. It is the data licensing that is the issue: this licensing runs counter to the portal's approach of ubiquitous access. If this becomes a problem, Google and the other location-enabled portals just might go elsewhere (or capture the data themselves).

I think new technologies like LiDAR (Light Detection And Ranging, discussed below and defined here) have the potential to change this. Much more on this in Part 2 of this article, which will appear next week.

3. Current data delivery mechanisms are poor. Those of us who thought CD and DVD technologies were a boon to data delivery can think again. Now even these are not good enough. In the Internet age, the main method of delivery should be via the Web (for those who can afford the bandwidth).

Putting Money Where Mouth Is!
Nixon drove home his point that "usage equates to value" when he made a special announcement that ERMapper, in conjunction with the Australian Greenhouse Office and GeoScience Australia, is sponsoring a new web site to enable greater access to large geospatial datasets.

Before Nixon made the announcement, he noted that BitTorrent makes up as much as 35% of today's Internet traffic and is, therefore, his delivery mechanism of choice. The new site Nixon announced is called

The technological vision and drive for the new site come from one of ERMapper's staff, Richard Orchard. It uses BitTorrent technology to improve user access and delivery, thereby hoping to increase usage and value. It is not meant for online access or searching, but simply for data delivery. The great thing about this site is that the last 30 years of Landsat imagery for Australia is now available at one access point! Perhaps not "killer data," but it is a great start that will hopefully cause others to open their data vaults to the public.

Too late?
But is this a late response from an industry caught with its pants down by Google? Is it already game, set and match to Google?

Nearly. An example may help clarify why I think this might be the case. In my home state, Tasmania, there has been a groundswell of demand for high-resolution imagery for use in the private and public sectors. Local government (there are 29 councils in Tasmania), fire, police, ambulance, parks and other state government agencies, forestry companies, agribusinesses, etc., all want this data for their many needs. None of them, as individuals, can afford to buy and process imagery from the main suppliers to create a single seamless mosaic over their operational areas and keep it up to date. But, as a single body, they could afford such imagery.

The problem is that the main agency for coordinating data capture and access within the state, The LIST, has not been responsive to this demand. It has not gathered the many players together to ascertain need (coverage, scale, sensors, time scales), draw up common specifications, negotiate with suppliers, and then coordinate the processing of the product to make it available to all. Something might be happening as I write, but this is a reactive response rather than proactive service provision.

At a recent local Tasmanian conference, I publicly asked the question: "Do we have to wait until Google buys all the DigitalGlobe data for Tasmania, processes it and makes it available in Google Maps/Google Earth before we can finally get the data we need?" Someone in the audience yelled out: "Too right, mate!" If my listener's response is any measure, the decision has been made: we will wait for Google to do what we seem incapable of doing ourselves!

What Google has done in this space can be likened to "killer data." In my Killer Apps article published last week, I used the term in the following sense (from Wikipedia): "The term is especially used when the technology existed before but did not take off before the introduction of the killer application." We know that large data stores existed before Google Maps and Google Earth, but these were not as accessible or useful (raw data for download is not what most people want). Thus, it seems to me that the only real data-integration web service that has any traction in the market today is Google's KML (reviewed here).

But with Google, geospatial data's presence has become visible and so is now "taking off," as it were. Google knows data is everything in the geospatial business. One attendee at the SSI conference informed me that Google has a contract with DigitalGlobe for their data and is buying an incredible amount of it. This data will be made accessible via its platform interfaces (the Maps API and KML). Perhaps it is these interfaces that will open up data access far more than the ones the geospatial industry has been pushing for quite a while now.

In summary, let's keep it simple. If I could provide my users a Google Maps layer within their GIS application(s) in the same way we access WMS services, then it really is close to game, set and match! Tasmanian users would get what they want: seamless, statewide coverage of up-to-date high-resolution imagery. And, just in case Google is listening, they would be willing to pay!
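The WMS access pattern I have in mind boils down to building a standard GetMap request; a minimal sketch (the server URL and layer name are invented placeholders, not a real service):

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512):
    """Build a WMS 1.1.1 GetMap request URL for a given layer and
    bounding box (minx, miny, maxx, maxy in EPSG:4326 degrees)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/jpeg",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical server and layer covering Tasmania.
url = wms_getmap_url("http://example.org/wms", "tas_imagery",
                     (144.5, -43.7, 148.5, -39.5))
print(url)
```

Any GIS client that can issue a request like this could, in principle, pull a statewide imagery layer the same way it pulls any other WMS layer.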

[Part 2 will appear next week.]

Google Earth vs. World Wind


Planet browsers such as Google Earth or World Wind are not just fascinating toys – they could develop into a universal interface for location-based information of every kind, opening up entirely new lines of business. This article looks at what distinguishes the search-engine maker's tool from NASA's, what the two have in common, and how they can be extended.


Another Book Published Under a Creative Commons License


After the Heise Zeitschriften Verlag led the way, further books were published under Creative Commons licenses, for example the Gütersloh net-art book by Matthias Weiß. Now another book under a Creative Commons license is coming onto the market: fittingly, the VSA-Verlag has released its new title "Wissensallmende – Gegen die Privatisierung des Wissens der Welt durch geistige Eigentumsrechte" ("Knowledge Commons – Against the Privatization of the World's Knowledge Through Intellectual Property Rights"), from the "AttacBasisTexte" series, as a free download. Anyone may copy, adapt and distribute the content for non-commercial purposes.


All German Federal Laws Now Available Free on the Internet


A joint project of the Federal Ministry of Justice and juris GmbH

As of now, all laws and statutory regulations of the Federation are available free of charge and barrier-free at . Federal Justice Minister Brigitte Zypries (SPD) launched the service today on the occasion of the 20th anniversary of juris GmbH.