Friday, August 23, 2013

Is it possible to deliver a government agency's standard IT systems on a single USB?

Image: CSIR Mk 1 with Hollerith equipment, Sydney 1952 (Source: Museum Victoria)
The Australian government was one of the earliest adopters of computers and computerisation.

CSIRAC (or CSIR Mk1), the first computer in Australia (and now the oldest surviving first-generation electronic computer), was used by scientists within CSIRO, by the Snowy Mountains Hydro Electric Authority and various university and government departments and agencies between 1949 and 1964 to make sense of 'big data' (for the time) which would have taken years to analyse by hand.

CSIRAC was the fifth stored-program computer in the world: its programmers could write their programs on punch tape, check them one step at a time, and store them in the computer to be run again and again.

While computers have become much smaller, faster and more efficient, they still use a similar programming approach to CSIRAC's. Programs (software) are loaded into memory and may then be accessed and run many times.

Of course, modern computers use different storage media and can store and execute many programs at the same time.

Every government agency has an IT architecture made up of hundreds, if not thousands, of different programs - some run on a mainframe computer, others on desktop computers and still more on servers which allow staff to access the programs remotely from their desktop, laptop or even mobile platforms.

It is a very complex process to manage an agency's IT architecture - some programs may not 'play nice' with others, and some may be twenty or more years old, requiring special hardware and maintenance to keep them operating.

Setting up a new agency can be an even more complex process. Often agencies are 'spawned' from existing departments and 'borrow' much of their IT infrastructure - the software required to run everything from payroll and HR to contract management, projects, compliance and Ministerial correspondence, as well as the desktop applications staff need to do their jobs.

Even more complex is the process of combining disparate agencies into a new department. This can require blending two or more sets of software programs into a single solution, with all the data migration and management issues this entails - not to mention addressing security considerations, staff training and avoiding long outages or data loss.

This is where my concept of 'government on USB' comes in.

Why not develop all the core software that a government agency needs to operate as open source shareable software and release it for other government agencies to reuse?

Using this approach, when a government dictates that a new agency must be formed, the CIO could simply pull out their 'Government Agency USB' and upload all the required operational software as a complete agency package.

Potentially, via this method, a new agency could have all its core ICT systems in place and operating in days, if not hours.

This approach might seem farfetched, however we're already heading in that direction due to a couple of trends.

Today much of the software an agency needs to run its operations is available as SaaS (Software as a Service) or cloud-based services - which both essentially mean the software is stored offsite, maintained by a specialist company and simply accessed and used as needed by an agency - provided the agency is confident of the security levels.

We're also seeing more and more of the software 'building blocks' of organisations becoming available in open source forms which can be downloaded, adjusted as required by an agency and used, either hosted internally or via a SaaS or cloud provider.

The US has actively been developing and releasing software in open source formats for other governments to use, as has the UK and a few other governments around the world. This offers massive national and international efficiencies for governments who can reuse rather than build or buy software.

The next step is for a government to audit the core systems required to establish a new agency and develop a standard IT Architecture that can be applied for any new agency (with room for specialised modules for unique functions). Then, by selecting from existing open source programs and potentially writing additional services, a government could put together a 'flatpack' IT architecture that any new agency could adopt quickly and easily.

If all the software in this 'flatpack' were open source, it could be easily improved and adjusted over time to meet changing legislative and operational requirements and to integrate ongoing improvements and enhancements.

Then once agencies have adopted this common 'flatpack' of software, it would be significantly easier and cheaper to merge agencies, as they would already be operating in a similar and interchangeable way.

Moving all of government across to this approach would take quite a few years - it's not achievable in a single term - however it would provide ultimately for a 'government on USB'.

This also has implications across the developing world and for newly formed countries, where their government agencies and institutions can suffer from a lack of experience, expertise and money to build the robust IT architecture needed for modern nations.

In the scenario I've described, a new or developing government could simply plug in the 'government on USB' into an agency's systems and establish a sophisticated IT environment to underpin governance in a very short period of time.

Is this simply an unattainable pipedream?

Some may scoff at the notion, however there are many people around the world working on parts of the 'government on USB' model today - albeit many may not be thinking about the bigger picture.

Much of the software required for a government agency is already available in open source form, from HR and financial management systems to desktop applications. It simply hasn't been linked together with a single set-up process.

Exploring the concept would take a government willing to innovate and to invest resources and money.

This would be used to model the software requirements of an agency, identify where open source solutions exist (or existing solutions can be modified) and write new open source software where necessary.

Next there would be the need to ensure the solution is secure and to write a single set-up approach that makes it easy for a CIO to roll out the solution quickly.

This may not ultimately be possible or cost-effective, but given the cost of IT architecture changes today when creating, merging or updating agencies, surely it is worth considering.


Wednesday, August 21, 2013

From Gov 2.0 to GovInnovate - expanding the agenda

I'm pleased to note that CEBIT, whose Gov 2.0 Conference has been a great event over the last few years, has recognised the growing innovation agenda in government and broadened this annual conference into GovInnovate.

Now including Gov 2.0, Cyber Security, Service Design and mobile Government (mGov) streams, the GovInnovate conference looks like it will retain a leading position amidst Australian events aimed at government innovators and leaders.

I'll be returning to speak at the conference after an absence of a few years due to other commitments, and I strongly recommend that people involved in government who are interested in the streams above consider whether they can attend.

GovInnovate is being held from 26-28 November in Canberra and more information is available at its website: www.cebit.com.au/govinnovate


Friday, August 16, 2013

Social Media specifications guide

One challenge organisations may face with social media is designing their account pages to reflect their common look.

I've seen many organisations place graphics poorly - stretched logos and unintentionally pixelated images - due to not having the specifications to hand when instructing a graphic designer.

Fortunately someone has come up with the below very useful specifications sheet for major social networks.

While this is a 'point in time' resource, as social networks regularly change their designs, it provides a starting point that should help organisations design their account pages to platform constraints.

Social Media Spec Guide


Tuesday, August 13, 2013

Should public servants be relying on the courts to clarify their right to use social media?

About eighteen months ago the APSC released updated guidance for the use of social media by public servants.

Designed to cover personal and professional use, the guidance was widely criticised at the time by traditional media and former public servants for its imprecise language and broad reach.

I criticised it as well, and it was one of my motivations for leaving the public sector, as it was for several other people I know.

In particular, criticisms related to one piece of the guidance, which states that when APS employees are making public comment in an unofficial capacity, it is not appropriate for them to make comment that is, or could be perceived to be:
so harsh or extreme in its criticism of the Government, a member of parliament from another political party, or their respective policies, that it raises questions about the APS employee’s capacity to work professionally, efficiently or impartially. Such comment does not have to relate to the employee’s area of work
so strong in its criticism of an agency’s administration that it could seriously disrupt the workplace. APS employees are encouraged instead to resolve concerns by informal discussion with a manager or by using internal dispute resolution mechanisms, including the APS whistleblowing scheme if appropriate 

The APSC has provided a few (broad) case studies designed to help public servants navigate use of social media, within their definition of appropriate conduct.

However terms such as 'so harsh or extreme' have remained largely undefined and subject to the interpretation of senior public servants - which unfortunately has left them open to accidental and deliberate misuse, potentially for bullying or internal politics.

I've long advocated that for the public service to improve its use of social channels it needs to foster and support staff in using those channels - professionally and personally as well as officially.

If the public sector doesn't firmly embed social media use into the culture of agencies, it will find it increasingly difficult and expensive to match Australian society's preference for communication via social channels and be less effective at carrying out the instructions of the government of the day.

Imprecision is the enemy of adoption. It has remained unclear what is meant by terms such as 'harsh or extreme', 'so strong' and 'seriously disrupt', leading public servants to either avoid participating online, carefully self-censor or to conceal their identities.

Now we're beginning to see some of the fruits of that imprecision, in the case of Michaela Banerji who reportedly used the Twitter identity @LaLegale. Ms Banerji has lost a court case to stay her dismissal from the public service, partially related to comments made by her pseudonymous Twitter account.

I'm not casting judgement on the case decision itself. While Marcus Mannheim's article, Public servant loses fight over Twitter attack on government, focuses on Twitter, there's some indication there were other issues as well. Ms Banerji did directly and publicly criticise the policies of her own department, and there's been clear and precise guidance for quite some time that this is highly dangerous territory.

However I wonder how the department identified @LaLegale as Michaela Banerji - there would be serious privacy considerations if the Department were investigating pseudonymous Twitter or other social media accounts to determine who owns them, regardless of whether it then took any action as a result.

I am also concerned that this had to go to a court decision (albeit one brought by Ms Banerji). Agencies have had a number of years to write social media policies and educate staff as to their responsibilities and what constitutes appropriate conduct online - however there's not been any research released publicly indicating whether they've done this in an effective way.

I do support the need to put boundaries as to how far public servants can criticise agency operations and government policies related to their work (and only those related to their work - unlike the current guidelines).

However I don't think that public servants should need to ever go to court to clarify their right to privately use social media channels for political comments.

The social media guidelines for public servants need to be clearer, and the policies and training supporting the guidelines need to be implemented consistently and effectively.

Otherwise we all lose.


Thursday, August 08, 2013

Political participation in a crowded age

Whether we call today the information age, the digital age or the internet age, it is certainly true that society has changed radically from the society we saw fifty years ago.

Massive personal access to information, entertainment and communication means this is the crowded age - every person has a plethora of choices and can individually decide what they watch, read, create or participate in.

I've been reading an interesting thread in the Australian Public Policy LinkedIn group discussing the lack of young people involved in politics, the falling level of participation in political parties and the impacts this is likely to have on our society in the future.

Named Where's the young blood?, the thread has seen a great deal of interesting and considered views on the topic.

I thought it would be useful to share my thoughts on this topic in my blog, as well as in the thread, as I believe this shift is a consequence of our increasingly digital world and will have a profound impact on the depth and professionalism of Australia's political leadership over the next twenty years.

My say:

Falling political party membership is not unique - it should be considered in the broader context of falling participation in all kinds of voluntary organisational activities.

And this is a symptom of wider social change - people have more engagement and entertainment options than in the past.

Thirty years ago the choices for what active minds did in the evening or on the weekend were more limited, so active participation in political parties (or unions) was a more common choice - it brought like-minded people together to share their dreams and visions, to socialise and form tribes.

People today form these social bonds in different ways, but still form them. Hence the old political party 'branch', where people attended every Wednesday night with cupcakes and a readiness to debate, discuss and romance, is no longer as attractive as it once was.

By and large political parties in Australia have failed to modify their membership and participation model to remain relevant to people aged 40 or under - which is why we see a vast underrepresentation of young people in political party memberships, and much lower participation and engagement from most who still sign up.

This isn't solely due to parties being led by older people, steeped in ye olden days, or because the lower participation actually suits some active young people who face less competition for attention and position in political parties. It is also a function of the legal framework in Australia around how such organisations must be formed and registered, and of the traditions these organisations have built over a hundred or more years.

There is little radical innovation in party structures to find one which will work for present-day society, and as a result the political party tree is dying from the roots up.

One of the implications is much poorer representation for Australians. Political parties used to be testbeds for people's ideologies - challenging them to think, consider, test and assess their ideas in the light of broader views. Politicians who emerged from this after a ten or twenty year 'apprenticeship' in party positions were professional, broadminded and good at their jobs - sound in their own thinking, committed to the public good (whatever their ideological view of 'public good' was).

As membership numbers have fallen and people have had to be fast-tracked into political office without these long apprenticeships, we've seen a focus on the popular and less commitment to specific ideological viewpoints. While this has its benefits, it also has many disadvantages - less tested views, a lower commitment to the public good and more commitment to self-entitlements and promotion.

While the long apprenticeship approach had its flaws - creating more group-think and less ideological flexibility, with politicians grounded in the values of their youth - it also had many advantages: a professional political 'class' of politicians with broader exposure to views and to what worked or didn't work in practice.

We are losing these advantages as people are increasingly entering politics with less party grounding, and as we are drawing from a thinner and thinner (and more incestuous) talent pool.

How do we reframe politics for the modern day? That's yet to be seen.

In twenty years we might employ politicians like corporate managers in order to attract professional and more objective individuals to these jobs, with citizens being shareholders in massive corporate states.

Or we might see a massive change in how politicians operate: online 'brains trusts' of thousands of citizens, selected like juries by algorithm to contribute to particular decisions, picking how their representative should vote using their always-on mobile devices - the politician becoming merely a proxy vote in a more direct democratic model.

We may even see political parties reinvent themselves for a modern age, potentially the most unlikely option!


Tuesday, August 06, 2013

Is it easy for non-programmers to reuse government open data?

Opening up data is one thing, but using it in a productive way is another.

Data may be released in formats that are hard to reuse, data may be 'dirty' (with mistakes) or incomplete.

However when organisations release data in machine-readable formats, with a reasonable level of completeness, it can be surprisingly easy for even a novice with no programming experience to reuse it in meaningful ways.

Below are two examples of how I've recently reused very different sets of data, an example of data released directly by a government agency, and an example of how to capture and reuse data that is public but technically not open.

Example 1: Mapping Australian polling places

Earlier today @Maxious tweeted the release of the Australian Electoral Commission's (AEC) expected polling places for the federal election as a CSV file. CSV is a standard format, like a basic spreadsheet, where every value is separated from the next by a comma, making it easy to import into (or export from) Microsoft Excel, OpenOffice Calc, Google Spreadsheet or other spreadsheets or databases.
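
To make the format concrete, here's a minimal Python sketch of what a spreadsheet import does with such a file. The column names below are illustrative assumptions, not the AEC's exact headers:

```python
import csv
import io

# A CSV is just text: each line is a record, values separated by commas.
# These two rows mimic the shape of the AEC file (hypothetical headers).
sample = """State,PollingPlaceNm,Latitude,Longitude
NSW,Sydney Town Hall,-33.8731,151.2064
VIC,Melbourne Town Hall,-37.8156,144.9669
"""

# DictReader turns each line into a dict keyed by the header row -
# which is essentially all a spreadsheet import is doing.
rows = list(csv.DictReader(io.StringIO(sample)))
for row in rows:
    print(row["PollingPlaceNm"], row["Latitude"], row["Longitude"])
```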

The polling locations data is valuable, but in the CSV format simply appears as lines and lines of data. I thought it would be interesting and useful to visually map the polling locations on a map of Australia, making it easy for people to find the polling booths nearest to them.

So I downloaded the CSV file from the AEC website (www.aec.gov.au/About_AEC/cea-notices/election-pp.htm) and went to Google Drive, which supports a type of spreadsheet called Fusion Tables which can map geographic data.

Fortunately the AEC was smart enough to include latitude and longitude for each polling location. This can be easily mapped by Fusion Tables. The CSV also contained address, postcode and state information, which I could also have used, less accurately, to map the locations.

I uploaded the CSV into a newly created Fusion Table, which automatically organised the data into columns and used the Lat/Long coordinates to map the locations - job done! Or so I thought....

When I looked at the map, it only showed NSW polling locations - about 2,400 of them - while the original CSV listed over 8,000.

Clearly something hadn't worked properly, so I tried reloading the data into a new Fusion Table - with the same result - it didn't seem to be a problem with the CSV or the import process.

I went into the CSV using Microsoft Excel and studied the data. There were many columns I didn't need for the map, so I deleted them - reducing the size of the spreadsheet by tens of thousands of cells.

I reimported the CSV into a Fusion Table and it worked! All eight and a half thousand expected polling locations appeared on the map. Clearly there had been too much (extraneous) data for Fusion to manage.
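
That by-hand trim in Excel can also be expressed in a few lines of Python - again with illustrative column names rather than the AEC's exact headers:

```python
import csv
import io

# Keep only the columns needed for mapping and drop the rest -
# the same trim done manually in Excel (hypothetical headers).
KEEP = ["PollingPlaceNm", "Latitude", "Longitude"]

source = """State,Division,PollingPlaceNm,Status,Latitude,Longitude
NSW,Sydney,Sydney Town Hall,Expected,-33.8731,151.2064
VIC,Melbourne,Melbourne Town Hall,Expected,-37.8156,144.9669
"""

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=KEEP)
writer.writeheader()
for row in csv.DictReader(io.StringIO(source)):
    writer.writerow({column: row[column] for column in KEEP})

trimmed = out.getvalue()
print(trimmed)
```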

From here finishing the map was easy. It was simply a process of making the data a little more presentable by changing datasheet names and editing the information box that appears when a polling location is clicked.

I shared my Fusion Table and published the map so people could view and embed it (see below).

You can view (but not edit) my full Fusion Table at: https://www.google.com/fusiontables/DataSource?docid=1kzLZTqNRkXMu1w4eBdsOLRakx3S8FLHziu6PdbU



So job done - map created with useful information when you click a red dot.

However, these are only expected polling places - the AEC may update this information at any time as they confirm or remove specific polling places.

My map is current as at 6 August 2013, however it may become out-of-date quite fast. How do I ensure my map updates when the AEC updates their CSV?

The short answer is that I can't - using my Google Fusion Table.

Because the AEC has chosen to release the data in a format easy for them (a CSV, straight from their internal systems), it is less useful for outsiders who wish to keep their maps or mash-ups current.

A programmer would be able to write a script that checked the AEC page each day to see if the CSV had updated, download it into a program that updated a map and published it to the web with the changes - even providing a history of which polling stations were added or removed over time.
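
As a rough sketch of the heart of such a script, the change-detection step might look like this in Python (the actual download - e.g. via urllib - and the map-update step are omitted):

```python
import hashlib

# Hash today's downloaded CSV and compare it with the hash saved
# from the previous run; a different hash means the AEC updated it.
def has_changed(csv_bytes, previous_hash):
    """Return (changed?, new hash) for the freshly downloaded CSV."""
    new_hash = hashlib.sha256(csv_bytes).hexdigest()
    return new_hash != previous_hash, new_hash

first_run, saved = has_changed(b"polling,places,v1", None)   # no prior hash
same_day, _ = has_changed(b"polling,places,v1", saved)       # unchanged file
aec_update, _ = has_changed(b"polling,places,v2", saved)     # updated file
print(first_run, same_day, aec_update)
```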

However the broader community, including me, don't have the programming skills to do this - and shouldn't need them.

To replicate what the programmer could do in a few lines, any non-programmer, such as me, would have to manually check the page, download the updated CSV (assuming the page provides a clue that it has changed), manually delete all unneeded columns (again) and upload the data into my Fusion Table, simply to keep my map current.

Of course, if the AEC had spent a little more time on their data - releasing it as a datafeed or an API (Application Programming Interface) - it would be easy even for non-programmers to reuse the data in a tool like Google Maps for public visualisation. Or the AEC could have taken the one additional step of mapping the information themselves (while still providing the raw data), creating a far more useful resource for the community.

This is one of the challenges with open data - releasing it in formats useful for the audience, rather than the agency.

Agencies often choose to release data in what they see as the fastest and easiest solution for them, even though it greatly increases the risk that their data will be reused online in out-of-date or inappropriate ways. Imagine the same issue with a listing of illegal drugs, accident hotspots or holiday dates - anyone who relied on old data, because it didn't automatically update in apps or third-party websites, would potentially be at significant risk.

However with a little more effort and thought, agencies can release their data in ways that bias online reuse towards remaining current and accurate - such as via APIs, which automatically update the information whenever a user accesses a mobile app or website which draws from it. With some data, APIs can potentially save lives - as well as reduce the risks to both agencies and developers.

Example 2: Analysing agency tweets

I'm interested in what government agencies say online and have been tracking the use of Twitter by Australian governments, including local, state and federal agencies, for six years. I track these accounts using my @egovau Twitter account, in two Twitter lists (as the maximum list size is 500 accounts):


Now it's great to track these accounts within Twitter, however how can I easily get a sense of which agencies are most active or have the largest following?

Image: Followerwonk.com Twitter report

I use followerwonk.com for this purpose - a tool which can capture a snapshot of the number of followers, tweets and other details of every account at a particular time. In fact, it is so good that I actually pay money for it.

These snapshots can be downloaded as CSVs and analysed in spreadsheets - which makes it easy to identify the most and least active government Twitter users (as I've blogged about in an infographic).

However what Followerwonk doesn't do is to capture and archive the actual tweets from the roughly 890 Australian government agencies and councils that use Twitter. If I want to analyse what they actually say in their tweets, rather than simply analyse the number of tweets, I need different tools.

While it is reasonably easy to archive the tweets from an individual Twitter account (you can download your own tweets from Twitter directly), or tweets that use particular terms or hashtags, using a tool like TweetArchivist (really useful for tracking conferences), it is harder to capture all the tweets from a large number of Twitter accounts at the same time - even if they are in the same Twitter list.

I've previously captured some Twitter list tweets using paper.li, which turns them into a daily 'newspaper'. In fact I have mapped Australian Federal parliamentarian tweets, by house and party, for those who wish a daily dose of political discussion in a condensed form.

The beauty of this approach is that paper.li updates as I update my @egovaupollies Twitter lists (where I follow Australian federal politicians) - the use of this datafeed ensures the 'newspapers' are always current.

However paper.li only selectively captures and reports tweets and doesn't allow them to be downloaded in a structured way. It doesn't really help me archive my government agency Twitter lists.

I have tried using a number of tools without success, including the fantastic IFTTT (If This, Then That) site, which allows the creation of 'recipes' which perform actions between different online social networks and web 2.0 tools. I have used IFTTT previously to do things such as automate the change of my Facebook profile image when I change my image in Twitter.

However the fantastic Digital Inspirations blog, written by Amit Agarwal, provides useful code 'recipes' that can be adapted to do all kinds of things by non-programmers.

I tried one of Amit's 'recipes' for converting a Twitter list into an RSS feed, however found it didn't work properly as Twitter had changed its own code. I tweeted to Amit (@labnol) and he graciously replied with a link to an updated post, A Simple Way to Create RSS Feeds for Twitter, which did indeed provide a simple way of doing this, with a step-by-step video.

I followed the video and, using the Twitter Widgets page and the Google script that Amit provided, was able to quickly create the RSS feeds I needed for my Twitter lists (one feed per list).

You can view these RSS feeds using the following (unpretty) web addresses:


However I had a new issue: taking the tweets from the RSS feeds and archiving them in a structured way into a spreadsheet or database for later analysis.

I thought it would be relatively easy to find a free online or downloadable RSS reader which could archive all the tweets from these RSS feeds. I was wrong.

I could not find an RSS reader that was designed to capture, store and archive RSS - only ones designed to format and view them.
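
For the curious, the missing tool isn't complex. Here's a minimal Python sketch of an RSS-to-CSV archiver, using a made-up stub feed in place of the live Twitter-list feed:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Stub RSS feed standing in for the live Twitter-list feed;
# the item titles below are invented examples.
rss = """<?xml version="1.0"?>
<rss version="2.0"><channel><title>egovau agencies</title>
<item><title>AgencyA: planned outage resolved</title>
<pubDate>Tue, 06 Aug 2013 03:00:00 GMT</pubDate></item>
<item><title>CouncilB: new open data portal live</title>
<pubDate>Tue, 06 Aug 2013 04:10:00 GMT</pubDate></item>
</channel></rss>"""

# Pull each <item> out of the feed and append it to a CSV archive.
archive = io.StringIO()
writer = csv.writer(archive)
writer.writerow(["published", "tweet"])
for item in ET.fromstring(rss).iter("item"):
    writer.writerow([item.findtext("pubDate"), item.findtext("title")])

print(archive.getvalue())
```

Run on a schedule against the real feed URLs, appending to a file on disk, this would quietly build the same archive the IFTTT recipe produces.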

So I went back to IFTTT and searched for a recipe that might help.

Here I found the recipe, Backup RSS Feed to Google Spreadsheet by Martin Hawksey.

The recipe was simple. All I had to do was put in my first RSS feed (above) and adjust the name of the spreadsheet in which it would be stored. Then I activated the recipe, which connected to my Google Drive and created an archival spreadsheet that updated every time a government agency or council on the list tweeted.

As I had two lists, I replicated the recipe, using the second RSS feed and a new spreadsheet name. Then I left it to see what happened....

A few hours later, checking back, the spreadsheets were growing, with about a hundred tweets between them.

I am now able to easily analyse this data to build a picture of what government agencies actually talk about, providing insights that otherwise would never be captured (stay tuned!)
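
As a taste of that analysis, a first pass might simply count the most common words across the archived tweets - sketched below with made-up sample tweets standing in for the real archive:

```python
import re
from collections import Counter

# Invented sample tweets standing in for the archived spreadsheet rows.
tweets = [
    "New open data released on data.gov.au today",
    "Consultation open: have your say on open data policy",
    "Road closures this weekend - plan ahead",
]

# Count words longer than three letters, ignoring case and punctuation.
words = Counter()
for tweet in tweets:
    words.update(w for w in re.findall(r"[a-z]+", tweet.lower()) if len(w) > 3)

print(words.most_common(5))
```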

In this case study the government data was already public and visible in agency Twitter accounts, however it was not really 'open' - neither easy to capture nor easy to reuse. No government or local council in Australia that I am aware of currently releases its tweets as open data or in any structured format, such as RSS, which could be captured and stored (even though many use RSS for media releases).

However these tweets are also useful government data. The tweets are able to paint a picture of how government uses social media, what they talk about, how they say it and who they interact with. It has both historic value for the country as well as current value for understanding what different agencies and local governments are focused on today.

Capturing and reusing these government tweets was harder than reusing the data from the AEC. The AEC at least released the poll locations as open data, albeit in an imperfectly reusable form.

However using some ingenuity, but without any coding, it was still possible for a non-programmer to capture all of government's tweets and make them more useful.

Conclusion

There's still a long, long way for agencies to go with open data. Right now the data released around the country by state and local jurisdictions is often hard to match up, being in different formats, collected in different ways, presented in different ways and often not directly comparable from jurisdiction to jurisdiction. Federally there's not the same issue, however different agencies use different geographic areas, different terminology and, again, different formats for releasing data.

Much data remains unreleased, and even where data is technically public (such as tweets or Facebook updates), archives of this data are not always easily available to the public.

However there are now many tools online which can help make some of this imperfect public data more usable and useful - and you no longer need to be a programmer to do it.

