In several conversations I've had in different parts of Australia, the agency view was that they only wanted to release useful data, and were prepared to set up an internal review process to assess how useful each dataset might be, then selectively release what they decided was valuable.
I strongly oppose this approach on the basis that it shouldn't be agencies who decide what data is useful, to whom, when or where.
There's no evidence that government agencies have the skills to successfully decide which data may be useful to particular groups in the broader community, and which won't be. There's also no evidence that they can predict which data will become useful at a future date.
My view is that agencies should simply release all the data they can without trying to assign levels of usefulness.
An example of this was featured at a Gov 2.0 Canberra lunch in November 2012, where Jake McMullin spoke about his use of an open dataset from the National Library to create a unique mobile app.
When he'd created the prototype app, he walked into the library and showed it to the first staff member he saw (who happened to be the project manager for their iPhone catalogue app).
As a result of this serendipitous meeting, the National Library funded the app, which has just been released in the iTunes store under the name Forte, with an accompanying event (on 25 March) and video (below).
Forte provides a way to explore the National Library's digitised Australian sheet music catalogue by decade and composer.
The dataset Jake used had been released a year earlier by the National Library for a hack event, but had not previously been used, as another National Library staff member, Paul Hagon, discusses on his blog.
Government agencies cannot predict these types of events: which, when, where or how a dataset will become useful once it is released as open data. And they shouldn't try.
The power of open data is often in serendipity.