
Wednesday, January 15, 2014

Rethinking government IT to support the changing needs of government

We recently saw a change in the federal government in Australia, with a corresponding reorganisation of agency priorities and structures.

Some departments ceased to exist (such as Department of Regional Australia), others split (DEEWR into two departments, Education and Employment) and still others had parts 'broken off' and moved elsewhere (Health and Ageing, which lost Ageing to the (renamed) Department of Social Services).

This isn't a new phenomenon, nor is it limited to changes in government - departments and agencies are often reorganised and reconfigured to serve the priorities of the government of the day and, where possible, create efficiencies - saving money and time.

These adjustments can result in the movement of tens, hundreds or even thousands of staff between agencies and regular restructures inside agencies that result in changing reporting lines and processes.

While these reorganisations and restructures - Machinery of Government changes (or MOGs) as they are known - often look good on paper, in reality it can take time for efficiencies to be realised (if they are actually being measured).

Firstly there's the human factor - changing the priorities and allegiances of staff takes time and empathy, particularly when public servants are committed and passionate about their jobs. They may need to change their location, workplace behaviours and/or learn a new set of processes (if changing agency) while dealing with new personalities and IT systems.

There's the structural factor - when restructured, merged or demerged public sector organisations need to revisit their priorities and reallocate their resources appropriately. This can extend to creating, closing down or handing over functions, dealing with legal requirements or documenting procedures that an agency now has to follow or another agency has taken over.

Finally there's the IT factor - bringing together or separating the IT systems used by staff to enable them to do their work.

In my view the IT component has become the hardest to resolve smoothly and cost-effectively due to how government agencies have structured their systems.

Every agency and department has made different IT choices - Lotus Notes here, Microsoft Outlook there, different desktop environments, different back-end systems (HR and finance, for example), different web management systems, security frameworks, programming environments and outsourced IT partners.

This means that moving even a small group of people from one department to another can be a major IT undertaking. Their personal records, information and archival records about the programs they work on, their desktop systems, emails, files and more must be moved from one secure environment to another, not to mention decoupling any websites they manage from one department's web content management system and mirroring or recreating the environment for another agency.

On top of this are the many IT services people are now using - from social media accounts in Facebook and Twitter, to their email list subscriptions (which break when their emails change) and more.

Then there are the impacts of IT service changes on individuals. Anyone who has worked in a Lotus Notes environment for email, compared to, for example, Microsoft Outlook, appreciates how different these email clients are and how profoundly the differences impact on workplace behaviour and communication. Switching between systems can be enormously difficult for an individual, let alone an organisation, risking the loss of substantial corporate knowledge - historical conversations and contacts - alongside the frustrations of adapting to how different systems work.

Similarly, websites aren't simply websites. While the quaint notion persists that 'a website' is a discrete entity which can easily be moved from server to server, or organisation to organisation, most 'websites' today are better described as interactive front-ends to sophisticated web content management systems. These systems may be used to manage dozens or even hundreds of 'websites', storing content and data in integrated tables at the back-end.

This makes it tricky to identify where one website ends and another begins (particularly when content, templates and functionality are shared). Moving a website between agencies isn't as simple as moving some HTML pages from one server to another (or reallocating a server to a new department) - it isn't even as easy as copying some data tables and files out of a content management system. There's enormous complexity involved in identifying what is shared (and so must be cloned) and ensuring that the website retains all the content and functionality required as it moves.

Changing IT systems can be enormously complex when an organisation is left unchanged, let alone when teams are changing agencies or agencies merge. In fact, I've seen it take three or more years to bring people onto a new email system or delink a website from a previous agency.

As government increasingly digitises - and reflecting on the current government's goal to have all government services delivered online by 2017 - the cost, complexity and time involved in completing these MOG changes will only increase.

This risks crippling some areas of government or restricting the ability of the government of the day to adjust departments to meet their policy objectives - in other words allowing the (IT) tail to wag the (efficient and effective government) dog.

This isn't a far future issue either - I am aware of instances over the past five years where government policy has had to be modified to fit the limitations of agency IT systems - or where services have been delivered by agencies other than the ones responsible, or simply not delivered due to agency IT restrictions, costs or issues.

Note that this isn't an issue with agency IT teams. These groups are doing their best to meet government requirements within the resources they have. However, they are trapped between maintaining ageing legacy systems - which cannot be switched off, yet which they don't have the budget to substantially replace - and keeping up with new technological developments and the increasing thirst for IT-enabled services and gadgets.

They're doing this in an environment where IT spending in government is flat or declining and agencies are attempting to save money around the edges, without being granted the capital amounts they need to invest in 'root and branch' efficiencies by rebuilding systems from the ground up.

So what needs to be done to rethink government IT to support the changing needs of government?

It needs to start with the recognition at political levels that without IT we would not have a functioning government. That IT is fundamental to enabling government to manage a nation as large and complex as Australia - our tax system, health system, social security and defence would all cease to function without the sophisticated IT systems we have in place.

Australia's Prime Minister is also Australia's Chief Technology Officer - almost every decision he makes has an impact on how the government designs, operates or modifies the IT systems that allow Australia to function as a nation.

While IT considerations shouldn't drive national decisions, they need to be considered and adequately resourced in order for the Australian government to achieve its potential, realise efficiencies and deliver the services it provides to citizens.

Beyond this realisation, the importance of IT needs to be top-of-mind for Secretaries, or their equivalents, and their 'C' level team. They need to be sufficiently IT-savvy to understand the consequences of decisions that affect IT systems and appreciate the cost and complexity of meeting the priorities of government.

Once IT's importance is clearly recognised at a political and public sector leadership level, government needs to be clear on what it requires from IT and CIOs need to be clear on the consequences and trade-offs in those decisions.

Government systems could be redesigned from the ground up to make it easy to reorganise, merge and demerge departments - either by using common IT platforms and services for staff (such as an APS-wide email system, a standard web content management platform, or single HR and financial systems), or by only selecting vendors whose systems support easy, standard ways to export and import data - so that a person's email can be rapidly moved from one agency to another, or the HR information of two departments consolidated in a merger at low cost. User interfaces should be largely standardised - so that email works the same way from any computer in any agency in government - and as much code as possible should be reused between agencies, minimising the customisation that causes even similar systems to drift apart over time.
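A minimal sketch of what such a standard export/import approach might look like, assuming a hypothetical vendor-neutral JSON interchange schema. The field names and functions below are illustrative only, not drawn from any real government standard:

```python
import json

# Hypothetical minimal interchange schema for moving a staff member's
# core records between agency systems. Field names are invented.
def export_staff_record(record: dict) -> str:
    """Serialise a staff record to a vendor-neutral JSON document."""
    required = {"employee_id", "name", "agency", "email"}
    missing = required - record.keys()
    if missing:
        raise ValueError(f"record missing required fields: {sorted(missing)}")
    return json.dumps(record, indent=2, sort_keys=True)

def import_staff_record(document: str, new_agency: str, new_email: str) -> dict:
    """Load an exported record and rebind it to the receiving agency."""
    record = json.loads(document)
    record["agency"] = new_agency
    record["email"] = new_email  # email addresses change when agencies change
    return record

# Moving one record between two (example) departments:
exported = export_staff_record({
    "employee_id": "E1234",
    "name": "Jane Citizen",
    "agency": "Department of Education",
    "email": "jane.citizen@education.gov.au",
})
moved = import_staff_record(exported, "Department of Employment",
                            "jane.citizen@employment.gov.au")
print(moved["agency"])  # Department of Employment
```

The point of the sketch is that once every vendor supports the same export format, the MOG cost of moving a person's records becomes a data transformation rather than a bespoke migration project.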

Approaches like these would significantly cut the cost of MOGs, and free departmental IT teams to focus on improvements rather than on meeting minimum requirements - a major efficiency saving over time.

Unfortunately I don't think we're, as yet, in a position for this type of significant rethink of whole of government IT to take place.

For the most part government still functions, is reasonably efficient and is managing to keep all the lights on (even if juggling the balls is getting progressively harder).

It took the complete collapse of the Queensland Health payroll project to get the government there to act to rethink their systems, and it is likely to take a similar collapse - of our Medicare, Centrelink or tax system - for similar rethinking to occur federally.

However I would not like to be a member of the government in power when (not if) this occurs.


Tuesday, October 29, 2013

How do we solve falling trust in online services before it becomes critical?

A few days ago LinkedIn launched its latest iOS app, Intro.

The app promises to integrate LinkedIn profile content directly into emails, allowing more rapid connections and helping give email recipients access to a range of relevant information about the sender.

Given both Apple and LinkedIn are well-known brands, many people are likely to trust that this app is safe for them to use, that these two global companies have taken every step to ensure that users are not exposed to privacy risks.

It's also not a big decision. Intro is free and installing the app is a two-click process, done in under 30 seconds. People are unlikely to spend the time to look at the usage policy in detail, or consider the impact of such a simple decision when they trust the brands.

However, in this case, trusting LinkedIn and Apple may not be wise. Global security consultancy Bishop Fox released a very compelling post outlining serious concerns with how LinkedIn's new app works.

According to Bishop Fox, the app works in the same way as a 'man in the middle' hacking attack, by sending all of a user's emails through LinkedIn's mail servers. Here they could be read by LinkedIn or, if encrypted, this process could stop the final recipient from ever receiving the email.

LinkedIn states that it will keep information from the emails it captures. While it says it “will never sell, rent, or give away private data about you or your contacts”, there's no clarification of what data LinkedIn might consider private, nor any solid information on how LinkedIn has guarded against the type of security breach it suffered in 2012.

This is just a single instance of a situation where the public are being asked to trust a company to do the right thing online, while there's no guarantee it will - and often there are few ways for an individual, organisation or even a government to hold a company to account when it fails to keep its end of the trust bargain.

So the conundrum for the public has become: who can they trust online?

Clearly there must be a level of trust to use online systems, with banks and government clear cases of where trust relationships are critical for transactions and service provision. With no trust in online systems, online banking and egovernment could not exist.

Social networks are also important. As places where people store personal information and share more and more of it over time, there's a clear requirement for companies to appear trustworthy and safe.

Even search engines, which have become the front door to most websites (with Google the dominant player), have a huge trail of data on their users - what you search for helps define who you are, particularly when people use search for medical and personal matters.

The public must implicitly trust all these organisations to both play nice with their personal information and to secure it such that nefarious groups or individuals don't get it. However it has become very clear that they simply can't.

Whether it is commercial providers, who primarily use this data to identify more effective ways to sell, or governments and banks who require this data to validate individuals, the number of reported data breaches is rising - in a global environment where few governments legally require companies to report breaches to the people potentially impacted.

On top of this come revelations of data surveillance operations by government agencies such as the NSA; commercial entities, such as the LinkedIn example above, where the data helps them productise their users; and organised crime, which uses hackers and insider sources to secure valuable data for use and resale.

However, despite increasing concern over how data is secured, who can access it and how it will be used, individuals continue to use many of these online services - either because they simply cannot live their normal lives, or conduct business, without them, or because of the "it won't happen to me" principle.

If public trust disappears, what does that mean for every organisation using the internet to build its business or to provide more convenient and cost-efficient services?

What impact would it have on government, where a shift to electronic transactions means less investment in other channels and, over time, less capability to meet citizen needs should a collapse in online trust occur?

I don't know how this situation can be resolved, particularly given the low attention paid to ensuring organisations report and rectify data breaches and are clear about how they will secure and use data.

While it is a global issue, individual governments can have an impact by establishing a robust privacy framework for their citizens, recognising that people own their own data, and holding any organisation allowed access to that data accountable for failing to secure or use it appropriately.

Do we have such a regime in Australia today?

I wanted to finish with an extract from the response I received from the Australian Privacy Commissioner when I reported the LinkedIn app using their email form:

Dear Craig  
Thank you for your enquiry.  
The Office of the Australian Information Commissioner (OAIC) receives a large quantity of written enquiries each day. A representative will be assigned to your enquiry and will be in contact soon. 
We aim to respond to all written enquiries within ten working days. 
If your enquiry is urgent and requires an immediate response, please telephone us on 1300 363 992 and quote your reference number. More complex phone enquiries may require a written response and may still take some time.

A response within 10 working days (14 actual days).

I wonder how many individuals may have their privacy breached, or organisations their confidential data exposed, by a single popular mobile app from a well-known company in this period of time.


Thursday, October 10, 2013

The road to public sector IT hell may not be paved with intentions at all

Something that scares me enormously is the house of cards that many (if not most) governments have built with their IT systems.

It can be witnessed every time government agencies get 'MOGed' - Machinery of Government changes where parts of agencies are shifted to other agencies to meet the latest political whim.

In these cases it's not simply a matter of moving tens, hundreds or even thousands of public servants to new offices - in fact in many cases they may not move at all - it is about extracting them from the secure environment, software and network systems of one agency and connecting them (including all their historical records, emails and files) to the network and software of another.

This is a hugely complex and increasingly expensive exercise that can have an enormous productivity and cost hit each time it occurs.

Why is it complex and expensive? Because every agency uses different systems - or different versions of systems - and agencies are now so wedded to these systems, after purchase decisions made many years earlier, that even though senior bureaucrats recognise the issue, they cannot address it without a complete (expensive and time-consuming) overhaul of how government runs its information technology.

Another example is eTax. While I have a great deal of praise for eTax, and it has been very successful by most measures, when the system was originally procured and built it was done in such a way that limited it to the IBM-PC platform. Certainly no-one can blame the ATO for not foreseeing the rise of Apple or the arrival of smartphones and tablets - however the decisions made at the time locked the system into a single platform, which has caused significant pain over the years.

Other examples include the Department of Finance and Deregulation's choice of a document management system as a Web Content Management System for www.australia.gov.au, an entirely appropriate decision at the time based on their well-governed procurement approach, but which led to delays and cost blowouts, constraining the site from what it could have become.

A better known example would be the failure of the Queensland Health payroll system several years ago, where an enquiry is still ongoing. It even has its own website - www.healthpayrollinquiry.qld.gov.au

Indeed, there are hundreds of examples, big and small, where this has occurred - a decision taken with the best possible knowledge at the time, or small incremental decisions taken over time, all for the right reasons, which have inadvertently led into blind alleys or very expensive remedial work years later.

And lest you think this is an issue only for the public sector, consider the disaster that was Telstra's bill payment system, the issues our largest banks have had keeping their systems operating, or Virgin's booking system.

With the pace of change accelerating and the increasing limits on public sector employment, the likelihood is that these types of issues will continue to grow and plague IT, becoming even more widespread and expensive.

Agencies could increasingly find themselves trapped in slow and inefficient systems, restricting staff productivity and absorbing more and more of their resources to maintain, with no funds to 'jump tracks' to more future-proofed solutions.

This can even affect the performance of elected governments - who may be forced to change their policies to fit IT limitations. I am already aware of government initiatives that have had to be abandoned (never having seen the light of day) not because they were bad ideas but because the IT constraints in government make them impossible to cost-effectively deliver.

This isn't the fault of public servants or of politicians - seeing that far into the future simply isn't possible anymore. Technology doesn't progress linearly, and the accelerating rate of change means left-field technologies can appear and radically transform people's expectations, and strain existing IT systems, within a few years (remember the iPhone).

There are many more of these technologies emerging around us. For example, 3D printers, capable of printing anything from kitchen utensils to medical devices to firearms, are disintermediating physical manufacturers, opening a new front in the ownership of intellectual property and providing access to deadly weapons. There are also unmanned aerial vehicles (UAVs) - drones capable of live-streaming video, or even carrying weapons - that can be bought online for a few hundred dollars and flown by individuals or corporations with limited chance of detection.

Many other technologies, from Google Goggles to driverless cars, are in development and could, in increasingly short timeframes, radically transform societies.

So when government agencies are still struggling to manage and maintain legacy green-screen mainframe systems and outdated (insecure and unsupported) web browsers, and are locked into increasingly expensive proprietary technologies (due to the cost and resourcing required to migrate - even changing email systems can cost our largest agencies $100 million or more), what are they to do?

There's little time for innovation or for thinking of consequences - the majority of resources in an agency's IT team are committed to maintenance and quick patches on existing solutions.

The likely outcome over time is that we'll start to see more catastrophic IT failures - particularly across the most complex and most essential systems - such as welfare, payroll and grants management.

So how do we fix this? How do we break the cycle before the cycle breaks us?

There's no simple solution, but there are, fortunately, some trends which work for government agencies facing this challenge - if they're prepared to consider them.

A big area is open source software, which is increasingly being used by agencies in a variety of ways. While open source can run into the same issues as proprietary software, a platform with a large and diverse group of users can combine their IT assets to ensure the system is more useful to agencies and more rapidly updated as the world around it changes.

Another area is cloud-based solutions, which allow a government to more rapidly reconfigure itself to meet the needs of its political masters. When software is independent of particular computer systems, and there's a government-wide secure environment that can host approved software, it can be far faster and cheaper for people moving between agencies to retain the files and applications they require.

There's open data, which, when made available in machine-readable formats, liberates data from proprietary systems and simplifies how it may be discovered and reused by other agencies (as well as the public).
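A small illustration of the point: once data is published in a machine-readable format such as CSV, any agency (or member of the public) can parse and reuse it without the system that produced it. The dataset below is invented for the example:

```python
import csv
import io
import json

# An invented open dataset, published as plain CSV - no proprietary
# system is needed to read it.
raw_csv = """agency,websites,year
Department of Health,42,2013
Department of Finance,17,2013
"""

records = list(csv.DictReader(io.StringIO(raw_csv)))

# Once parsed, the data is trivially reusable - for instance,
# re-published as JSON, or aggregated:
as_json = json.dumps(records, indent=2)
total_sites = sum(int(r["websites"]) for r in records)
print(total_sites)  # 59
```

That interchangeability - CSV in, JSON or analysis out, with standard library tools - is what 'liberating data from proprietary systems' means in practice.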

These trends won't allow governments to replace all their existing systems, but they do allow agencies to contain the problem to critical systems while all other services are delivered 'in the cloud'. Imagine a single email system and intranet across government, or a web-based suite of office, graphic design, finance and HR tools - managed centrally within government, leaving agency IT teams to focus on the unique systems they can't share.

What does this vision take? Intention, planning and choice.

Governments that fail to proactively and intentionally plan their futures, who simply live on autopilot, will inevitably crash - not today, not tomorrow, maybe not in five years, but eventually - and the damage their crashes cause may take decades to recover from.

So for agencies that see themselves as continuous entities, enduring as long as the state they serve, it is imperative that they plan intentionally, and that they engage their Ministers and all their staff in understanding and addressing this issue.

It is not good intentions that will cause agency IT to fail, it is the lack of intention, and that is highly addressable.

CORRECTION: I have been advised by John Sheridan, the Australian Government CTO, there was no cost-overrun on australia.gov.au, it was a fixed price contract.



Friday, August 23, 2013

Is it possible to deliver a government agency's standard IT systems on a single USB?

[Image: CSIR Mk 1 with Hollerith equipment, Sydney 1952. Source: Museum Victoria]
The Australian government was one of the earliest adopters of computers and computerisation.

CSIRAC (or CSIR Mk1), the first computer in Australia (and now the oldest surviving first-generation electronic computer), was used by scientists within CSIRO, by the Snowy Mountains Hydro Electric Authority and various university and government departments and agencies between 1949 and 1964 to make sense of 'big data' (for the time) which would have taken years to analyse by hand.

CSIRAC was the fifth stored-program computer in the world; its programmers could write their programs on punch tapes, check them one step at a time, and store them in the computer to be run again and again.

While computers have become a lot smaller, faster and more efficient, they still use a programming approach similar to CSIRAC's. Programs (software) are loaded into their memory and may then be accessed and run many times.

Of course, modern computers use different storage media and can store and execute many programs at the same time.

Every government agency has an IT architecture made up of hundreds, if not thousands, of different programs - some run on a mainframe computer, others on desktop computers and still more on servers which allow staff to access the programs remotely from their desktop, laptop or even mobile platforms.

It is a very complex process to manage an agency's IT architecture - some programs may not 'play nice' with others, some may be twenty or more years old and require special hardware and maintenance to keep them operating.

Setting up a new agency can be an even more complex process. Often agencies are 'spawned' from existing departments and 'borrow' much of their IT infrastructure - the software required to run everything from payroll and HR to contract, project, compliance and Ministerial correspondence management, as well as the desktop applications staff need to do their jobs.

Even more complex is the process of combining disparate agencies into a new department. This can require blending two or more sets of software programs into a single solution, with all the data migration and management issues this entails - not to mention addressing security considerations, staff training and avoiding long outages or data loss.

This is where my concept of 'government on USB' comes in.

Why not develop all the core software that a government agency needs to operate as open source shareable software and release it for other government agencies to reuse?

Using this approach, it's possible that when a government dictates a new agency must be formed, the CIO simply pulls out their 'Government Agency USB' and uploads all the required operational software as a complete agency package.

Potentially, via this method, a new agency could have all its core ICT systems in place and operating in days, if not hours.

This approach might seem far-fetched; however, we're already heading in that direction due to a couple of trends.

Today much of the software an agency needs to run its operations is available through SaaS (Software as a Service) or cloud-based services - which essentially means the software is stored offsite, maintained by a specialist company and simply accessed and used as needed by an agency, provided it is confident of the security levels.

We're also seeing more and more of the software 'building blocks' of organisations become available in open source forms which can be downloaded, adjusted as required by an agency and used, either hosted internally or via a SaaS or cloud provider.

The US has actively been developing and releasing software in open source form for other governments to use, as have the UK and a few other governments around the world. This offers massive national and international efficiencies for governments, which can reuse rather than build or buy software.

The next step is for a government to audit the core systems required to establish a new agency and develop a standard IT Architecture that can be applied for any new agency (with room for specialised modules for unique functions). Then, by selecting from existing open source programs and potentially writing additional services, a government could put together a 'flatpack' IT architecture that any new agency could adopt quickly and easily.
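A sketch of how such a 'flatpack' architecture might be expressed as a manifest of standard modules. All module and package names below are invented for illustration; no real products or government standards are implied:

```python
# A hypothetical 'flatpack' manifest: the standard modules a new agency
# needs, each mapped to an (invented) open source package. Core modules
# ship with every agency; non-core modules cover specialised functions.
FLATPACK = {
    "email":   {"package": "open-mail-suite", "core": True},
    "hr":      {"package": "open-hr",         "core": True},
    "finance": {"package": "open-ledger",     "core": True},
    "web_cms": {"package": "open-cms",        "core": True},
    "grants":  {"package": "open-grants",     "core": False},  # specialised
}

def setup_plan(manifest: dict, extras=()) -> list:
    """Return the packages to install: all core modules plus requested extras."""
    plan = [m["package"] for m in manifest.values() if m["core"]]
    plan += [manifest[name]["package"] for name in extras]
    return plan

# A new agency takes the core set, plus grants management:
print(setup_plan(FLATPACK, extras=["grants"]))
# → ['open-mail-suite', 'open-hr', 'open-ledger', 'open-cms', 'open-grants']
```

The 'single set-up process' idea reduces to exactly this: a shared manifest plus a standard installer, rather than each agency assembling its stack from scratch.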

If all the software in this 'flatpack' were open source, it could be easily improved and adjusted over time to meet changing legislative and operational requirements and to integrate ongoing improvements and enhancements.

Then once agencies have adopted this common 'flatpack' of software, it would be significantly easier and cheaper to merge agencies, as they would already be operating in a similar and interchangeable way.

Moving all of government across to this approach would take quite a few years - it's not achievable in a single term - however it would ultimately provide a 'government on USB'.

This also has implications across the developing world and for newly formed countries, where their government agencies and institutions can suffer from a lack of experience, expertise and money to build the robust IT architecture needed for modern nations.

In the scenario I've described, a new or developing government could simply plug in the 'government on USB' into an agency's systems and establish a sophisticated IT environment to underpin governance in a very short period of time.

Is this simply an unattainable pipedream?

Some may scoff at the notion, however there are many people around the world working on parts of the 'government on USB' model today - albeit many may not be thinking about the bigger picture.

Much of the software required for a government agency is already available in open source form, from HR and financial management systems to desktop applications. It simply hasn't been linked together with a single set-up process.

To explore the concept it would take a government willing to innovate, investing resources and money.

This would be used to model the software requirements of an agency, identify where open source solutions exist (or existing solutions can be modified) and write new open source software where necessary.

Next there would be the need to ensure the solution is secure and to write a single set-up approach that makes it easy for a CIO to roll out the solution quickly.

This may not ultimately be possible or cost-effective, but given the cost of IT architecture changes today when creating, merging or updating agencies, surely it is worth considering.


Thursday, July 25, 2013

Social media impacts on ICT teams - presentation from the Technology in Government conference

Over the last two days I've been down at the Technology in Government conference - an event I thought went very well, with a great group of speakers (including the UK Government's CIO Liam Maxwell).

I gave a presentation this morning, and chaired the afternoon, for the Connected Government stream and have uploaded my presentation for wider access.

In it I discussed the impact of social media on agency ICT teams and some potential approaches they can take to work with business areas to ensure that agency goals are met with a minimum of intra-agency friction.

Overall my message was that social media must be engaged with, not ignored, in government - and agency ICT teams have a role to play.

There are several stances ICT teams can take - leader, supporter or observer of agency social media efforts - and, depending on this stance, they could take on a greater or lesser involvement in the various roles required to implement a successful social media approach.

Social media offers benefits for ICT teams, as it does for other areas of agencies - it is simply up to ICT leadership to either step up and work with business areas in a closer ongoing way, or stay out of the way and allow other areas of an agency to move forward.




Tuesday, May 22, 2012

Standardising content across government (or why does every agency have a different privacy policy?)

Every government website serves a different purpose and a different audience; however, there is also standard content every site must have, and legislation and standardised policies every site must follow.

This includes content such as a privacy policy, legal disclaimer, terms of use, accessibility statement, copyright, social media channels, contact page, information publication (FOI) pages and so on. It also includes the navigational structure and internal ordering of pages and the web addresses to access this content (such as for 'about us' pages).

So is there a case to standardise the templates and/or content of these pages and where to find them in websites across government?

I think so.

From an audience perspective, there is a strong case to do so. Citizens often use multiple government websites and it makes their experience more streamlined and efficient if they can find what they need in a consistent place (such as www.agency.gov.au/privacy), written in a consistent format and, where possible, using identical or near identical language.

It would also save money and time. Rather than having to write and seek legal approval for the full page content (such as for privacy information), only agency-specific parts would need writing or approval. Websites could be established more rapidly using the standard content pages and lawyers could focus on higher value tasks.

To put a number on the current cost of individually creating standard content: if you assume it costs, in time and effort, around $500 to develop a privacy policy, and that there are around 941 government websites (according to Government's online info offensive a flop), it would have cost up to $470,500 for individual privacy policies for all sites. Multiply this by the number of potentially standardisable pages and the millions begin adding up.
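As a sketch of that back-of-envelope arithmetic (the $500 and 941 figures are this post's assumptions, not measured values):

```python
# Back-of-envelope cost of every site writing its own privacy policy.
COST_PER_POLICY = 500   # assumed time-and-effort cost per policy, in dollars
NUM_SITES = 941         # government website count cited above

duplicated_cost = COST_PER_POLICY * NUM_SITES
print(duplicated_cost)  # 470500 - versus roughly $500 once for a shared policy
```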

Standardisation could even minimise legal risks. It removes a potential point of failure for agencies that lack the resources or expertise to create appropriate policies, and so expose themselves to greater risks - such as poorly written legal disclaimers which leave them open to being sued by citizens.

In some cases it may be possible to use the same standard text, with a few optional inclusions or agency-specific variations - such as for privacy policies, disclaimers, accessibility statements, terms of use, and similar standard pages.

In other cases it won't be possible to use the same content (such as for 'about us' pages), however the location and structure of the page can be similar - still providing public benefits.
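One minimal way to implement 'same standard text, with agency-specific variations' is a centrally maintained template with named slots that each agency fills in. A sketch in Python - the wording, field names and example agency are purely illustrative, not any real policy text:

```python
from string import Template

# Centrally maintained standard wording, with agency-specific slots.
STANDARD_PRIVACY = Template(
    "$agency collects personal information under the Privacy Act 1988. "
    "Questions can be directed to $contact_email."
)

# Each agency supplies only its own details, not the legal boilerplate.
policy = STANDARD_PRIVACY.substitute(
    agency="Department of Example",
    contact_email="privacy@example.gov.au",
)
print(policy)
```

Updating the central template once would propagate the change to every page built from it, while each agency keeps control of its own details.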

Let's take privacy policies specifically for a moment. There's incredible diversity of privacy policies across Australian Government websites, although they are all subject to the same legislation (the Privacy Act 1988) and largely cover the same topics (with some variation in detail).

While this is good for lawyers, who get to write or review these policies, it may not be as good for citizens - who need to contend with different policies when they seek to register for updates or services.

Many government privacy policies are reviewed rarely, due to time and resource constraints. This may place agencies at risk, as the use of new tools (such as YouTube, Slideshare and Scribd) to embed or manipulate content within agency sites can unknowingly expose users to the privacy conditions of third-party sites (see how we handled these in myregion's privacy policy, with an extendable third-party section).

So, how would government go about standardisation? Although effectively a single entity, the government functions as a group of agencies who set their own policies and manage their own risks.

With the existence and role of AGIMO, and the WebGuide, there is a central forum for providing model content that reflects the minimum standard agencies must meet. There are mandatory guidelines for agencies, such as for privacy, but limited guidance on how to meet them. A standard privacy policy could be included and promoted as a base for other agencies to work from, or even provided as an inclusion for sites that want a policy which is centrally maintained and auto-updated.

Alternatively web managers across government could work together, through a service such as GovDex, to create and maintain standard pages using a wiki-based approach. This would allow for a consistently improving standard and garner grassroots buy-in, plus leverage the skills of the most experienced web masters.

There are undoubtedly other ways to move towards standardised pages - even simply within an agency, which itself can be a struggle for those with many websites and decentralised web management.


Regardless of the method selected, the case should receive consideration. Does government really need hundreds of versions of what is standard content, or only a few?


Examples of government privacy policies (spot the similarities and differences):


Monday, March 05, 2012

Who is your Marketing or Communications CIO?

I was struck by a comment from Dan Hoban (@dwhoban) at GovCamp Queensland on Saturday - one which resonated with me, and with others in the audience - that organisations now need a CIO (Chief Information Officer) in their marketing or communications teams.

This is a person who understands the technologies we use to communicate with customers, clients, citizens and stakeholders and can provide sound advice and expertise in a manner that traditional ICT teams cannot.

The role of this person is to understand the business goals and recommend approaches and technologies - particularly online - which are a best fit. Then it may be this person and their team, or an ICT team, who build and deliver the solutions needed.

When Dan named this role I realised it described exactly the role I had been performing in government for my five years in the public service, and for a number of years prior in the corporate sector.

Where ICT teams were focused largely on reactive management of large critical ICT systems - the SAPs, payment frameworks and secure networks - it has long been left to Online Communications, or similar teams or individuals in other parts of the organisation, to proactively introduce and manage the small and agile tools communicators use in public engagement.

No organisation I've worked in or spoken to has ICT manage their Facebook page, Twitter account, GovSpace blog or YouTube channel. Few ICT teams are equipped to cost-effectively and rapidly deliver a focused forum, blog, mobile app or data visualisation tool. They don't recruit these skills or, necessarily, have experience in the right platforms and services.

When Communications teams seek advice on the online channels and technological tools they should use, they ask ICT. Frequently, however, they are told that ICT doesn't understand these systems (even when individuals within ICT might be highly skilled with them), doesn't have the time or resources to commit in the timeframes required (due to the need to focus on critical systems), or doesn't have the design skills - or that it would take months (sometimes years) to research and provide an effective opinion, plus it will cost a bomb.

So Communications teams, who have their own deliverables, have no choice but to recruit their own social media and online communications smarts.

It is this person's, or this team's, role to understand Communications needs, make rapid and sound recommendations on channels and tools, design the systems and interfaces, and integrate the technologies (or manage the contractors who do) to deliver relevant and fast solutions on a budget.

So perhaps it is time to recognise these people for what they actually are for an organisation - a Marketing or Communications CIO.

I expect ICT teams will hate this. Information has long been their domain, even though their focus is often on technology systems, and they do not always understand the information and communication flowing across those systems - the reason the systems actually exist.

Perhaps it is time for them to rethink their role, or to let go of the agile online and mobile spaces and focus on the big-ticket systems and networks - remaining the heart, but not always the adrenal glands or, indeed, the brains, of an organisation's ICT solutions.


Tuesday, January 17, 2012

IT can drive big productivity gains in government

With the rise in the efficiency dividend and increasingly tight budgets across government, I keep wondering whether there are places where government can make real savings and raise productivity other than simply by cutting costs.

The crunch is often that one must invest money to save money - a position common in business but often a struggle in government, where the focus is so often on grants and programs.

However, having spoken to a fair few frustrated people lately from a range of agencies, there appears to be a significant source of productivity gains - and thereby cost savings - right under the noses of many departments. Their IT systems.

Over the last year more and more of my friends and peers changing departments have cited IT as one of their reasons for wanting to make a move. They all want to be productive; however, grappling with slow and ageing computers and software, or with restrictive internet access policies, appears to be a rising concern - even becoming a question candidates put to agencies in interviews.

This doesn't surprise me - in fact I noticed when I originally joined the public service that, through no fault of departments, the IT equipment and software wasn't up to the same standard as I'd experienced in the private sector. Over time people adapt and learn to work within the constraints of the system, however what productivity could be unlocked if these constraints were relaxed?

Today I'm aware of agencies where reportedly close to 50% of staff have their own computing devices at their desks. Personal ultra-light laptops, tablets and smartphones have become one route to employee productivity, overcoming desktop IT restrictions.

However, since a friend of mine left an agency late last year - frustrated that they lost over an hour a day of productive time struggling with their desktop computer, and that they couldn't access the forums and blogs written and frequented by their stakeholders due to access limits - I thought it was worth calculating the productivity losses that could be attributed to IT constraints.

Let's say that an agency's low bandwidth or older desktop PCs and software cost 2 hours of productive time per employee each week. This may sound like a lot, but if a PC takes 10 minutes to start up each morning you're halfway there already.

For a moderate sized agency of 4,000 staff the lost productive time would be 8,000 hours per week - the equivalent of employing another 200 staff.

At an average wage, including on-costs, of $70,000 per year (about $35 per hour), this lost time equates to $280,000. Each week.

Per year the cost of the IT productivity loss would be $14,560,000. Every year. Or, if you prefer, a productivity loss of $3,640 per person per year. Every year.
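The loss figures above can be reproduced directly - all inputs are the assumptions stated in this post:

```python
# Productivity loss from IT constraints, using the post's assumed inputs.
HOURS_LOST_PER_WEEK = 2   # per employee
STAFF = 4000
HOURLY_RATE = 35          # ~$70,000 per year including on-costs
WEEKS_PER_YEAR = 52

lost_hours_per_week = HOURS_LOST_PER_WEEK * STAFF   # 8,000 hours
weekly_cost = lost_hours_per_week * HOURLY_RATE     # $280,000
yearly_cost = weekly_cost * WEEKS_PER_YEAR          # $14,560,000
per_person_per_year = yearly_cost // STAFF          # $3,640
print(weekly_cost, yearly_cost, per_person_per_year)
```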

For an agency experiencing this type of productivity loss there are a few ways to offset it:

1) Reduce wages across the board by $3,640. This would be deeply unpopular.
2) Find efficiencies in other areas (reducing expenses) equivalent to the lost productivity. This may be difficult to do every year.
3) Reduce expenditure on programs and activities affecting citizens. This is politically dangerous.
4) Invest in IT improvements.

So how much would agencies have to invest to reclaim that 2 hours per worker per week? It would vary quite widely as it depends on what is causing the IT productivity drain.

However it is possible to model how much an agency should be willing to invest into improving their IT. This, of course, assumes that agencies can convince their Minister, the Department of Finance and Treasury that they should invest in IT systems - not an easy sell.

Assuming that an IT cycle is around five years (roughly the time for a top-end PC to become a low-end PC, with corresponding software and network impacts), an agency should spend less than the cumulative five years of productivity loss in order to emerge ahead.

On that basis, a Department should spend less than $18,200 per staff member (the $3,640 annual productivity loss multiplied by five years). Given wage rises, let's round this up to a maximum of $20,000 per staff member.

Therefore a Department with 4,000 staff should spend at most $80 million to rejuvenate its IT and remove the productivity shrinkage. If it spends less than this it is realising a productivity increase.

That's a fair chunk of cash - and far more than most agencies of that size would ever need to spend on IT equipment and software.

In fact, if you bought every staff member a $3,000 PC plus the same amount for support, equipped each staff member with $2,000 of software and $2,000 worth of broadband (coming to $10,000 per staff member), you'd only have spent $40 million for a 4,000 person agency.

Of course with bulk purchases agencies can get much better prices than these. Also I didn't include staff, training and overheads. Hopefully it would balance out.

If it did, that would leave you with $40 million in productivity savings - $8 million per year.
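Continuing the same rough model over a five-year IT cycle:

```python
# Five-year break-even model for IT investment, using the figures above.
PER_PERSON_ANNUAL_LOSS = 3640   # from the productivity calculation
CYCLE_YEARS = 5
STAFF = 4000

break_even_per_person = PER_PERSON_ANNUAL_LOSS * CYCLE_YEARS   # $18,200
budget_cap_per_person = 20000      # rounded up to allow for wage rises
max_worthwhile_spend = budget_cap_per_person * STAFF           # $80 million

# Assumed refresh cost: $3,000 PC + $3,000 support + $2,000 software
# + $2,000 broadband = $10,000 per staff member.
actual_spend = 10000 * STAFF                                   # $40 million
savings = max_worthwhile_spend - actual_spend                  # $40 million
print(savings // CYCLE_YEARS)                                  # $8 million/year
```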

Of course all these figures are 'finger in the air' rough and some of the productivity benefits can be realised quickly and cheaply by simply adjusting internet policies and filters or giving staff who need the best equipment the equipment they need.

However the basic premise holds: IT isn't just a cost for agencies, it is a valid and important source of productivity gains. If an agency can equip its staff with the right tools and connectivity for their jobs, they will be more productive.

And if an agency can do so at less than the cost of their staff not having the right IT tools then the agency, the government, and Australia, are all ahead.


Friday, March 18, 2011

The coming open data battle - government versus commercial interests

I'm a big fan of opening up as much public sector information as possible in easily discoverable and reusable ways (taking into account privacy, security and commercial-in-confidence considerations).

The data allows citizens and organisations to build a more informed view of their government's activities, a good accountability measure.

It also allows the development of useful applications and services at low cost and even lower (frequently free) prices. Sure they may not be as polished as multi-million dollar services developed by governments or big business, however they allow citizens to choose the tools that work best for them. Government or big business can always use these learnings to build on.

Open data also allows government agencies to see what data other agencies have, and lets them use it to improve their models, understanding and policy. While often overlooked in the rush to provide data to citizens, often agencies have as much trouble discovering and accessing data from other agencies as citizens do.

However as more public sector data gets released, losers are also emerging, some with deep pockets and effective lobbyists.

Who loses when government data is released for free? Several groups spring to mind.

First are companies that make their living from licensing public information and selling it on (often with value-adds) at a mark-up. These companies allow agencies to extract a market price for their data without having to contend with the complexities of the open market. They often have a monopoly position, controlling access to a source of public data, and can be very resistant to losing their monopoly or seeing the data 'devalued' through free release.

Second are companies that rely on getting data first to build their edge. This includes stock market traders, where having information a few hours earlier than the market may be worth millions. It can also include the media, who thrive on 'exclusives'. Where data is released to specific journalists under Freedom of Information or through other channels ahead of others they have an informational edge over their rivals.

Next are organisations who prefer to obscure the true cost of goods and services in favour of complexity. Where customers can't compare prices effectively they can't make the best price decision, therefore they may choose expensive services based on brand and never realise they are paying more than they should. Sound like any industry you know?

Finally there are groups within government who prefer to keep citizens at arm's length - those who do not want too much scrutiny of their decisions, or who believe the public won't understand the broad context in which they were made. This group believes in only telling the public what it thinks the public needs to know.

We're starting to see some of these groups flex their muscles in jurisdictions that are releasing a great deal of public sector information, or who are legislating for organisations to become more transparent.

One group currently resisting openness in the US are airlines. In the New York Times article, This Data Isn’t Dull. It Improves Lives, the journalist reports that,

...the Department of Transportation is considering a new rule requiring airlines to make all of their prices public and immediately available online. The postings would include both ticket prices and the fees for “extras” like baggage, movies, food and beverages. The data would then be accessible to travel Web sites, and thus to all shoppers.

The airlines would retain the right to decide how and where to sell their products and services. ...

The approach would make markets more transparent and efficient - allowing consumers to make a decision on flights based on complete knowledge.

So do airlines support this approach? Well, not completely. They want the right to choose when and how they display their fees - to control the flow of information and force consumers to continue making sub-optimal decisions on partial information.

This reflects the situation in Australia with the Rudd Government's attempt to launch Fuelwatch and GroceryWatch websites. Petrol and grocery companies weren't particularly supportive of having the true cost of their products visible to consumers before they were at the service station or in the store. Once consumers were there it was far less likely they'd leave and shop elsewhere because of price. Of course the reason given was the complexity of exposing the prices publicly, although they don't seem to have this issue at the checkout.


Another example I have been watching is in Canada, where there's been an active discussion of the decision of BC Ferries to release FOI requests online at the same time they are released to the requester (where the request doesn't involve personal information).

Journalists have complained that the approach means they won't get an exclusive, removing their financial incentive for requesting government information in the first place. One journalist in particular, Chad Skelton, has written a series of pieces detailing why it is so important that governments allow media to profit off FOI requests, as otherwise they are unlikely to ask for this information and it won't be exposed for the public good. One of his articles worth reading is Why David Eaves is wrong about BC Ferries' Freedom of Info policies.

It is an interesting point, however I tend to sympathise with David's view - government information laws should not be designed to support the financial goals of media outlets, or any other organisations, over the goals of public openness and transparency. These laws should be designed to ensure that public information gains public scrutiny, not so that journalists can 'make' their careers with exclusives.


As we see more public sector information released by governments I expect we'll see more battles over its release. Some forms of opposition will be passive, providing information in the least usable formats possible or hidden away in websites; other forms will be active, direct refusals to release information (because it is incomplete, the context wouldn't be understood, or it isn't useful), court cases from commercial interests asking for information to be suppressed, or even active information sabotage where data is destroyed rather than published.

Reputations and fortunes can be made and lost over access to information. It is unlikely that entrenched interests will support changes to the playing field without putting up an ongoing fight.


Friday, December 24, 2010

US releases national survey of social media use in State Governments

The National Association of State Chief Information Officers (NASCIO) in the US has released an excellent report, NASCIO: Friends, Followers, and Feeds (PDF), which looks at social media adoption by US states, identifying best practice and sharing knowledge on how tools are being deployed.

To quote the report,

The survey examined adoption trends, current applications and expectations of social media technologies, the extent to which implementation is governed by formal policies or individual agency initiative, and perceptions of risk associated with social media tool use.

This is a fantastic resource for other governments as well, providing key insights into who is using social media across US state governments, and how and why.

It is a must read for senior managers - particularly CIOs and Secretaries.

I strongly recommend distributing this report within your agency because, as the report says about Web 2.0 and social media,
CIOs may not have been immediately convinced of the business value of these tools as they entered the workplace, but the fact is that this is how effective governments are communicating now, and this is not just a fad.


Monday, September 20, 2010

Complete the ANZSOG survey on the economic value of open government

The South Australian government has commissioned ANZSOG to conduct a research study on the topic of

"Economic value of open access to government-held data and information"

ANZSOG is seeking respondents who can provide information about the approach of their organisations to the collection and dissemination of data and/or information, as well as their personal views on this topic.

They are particularly interested in hearing stories about experiences with open access to government data and/or information (be they positive, negative or neutral).

The survey can be found at http://www.surveymonkey.com/s/govinfosurvey

The survey should take approximately 20 minutes, depending on how much detail you go into, and is divided into the following sections:

  1. Introduction
  2. Access to data
  3. Cost recovery
  4. Characteristics of data
  5. Benefits of access to data
  6. Barriers to sharing data
  7. Health questions (for those working in the health industry only)
  8. Mining industry questions (for those working in the mining industry only)
  9. Conclusion
The survey deadline is Friday 24 September. Any information in addition to the survey can be sent to helen.moreland@transport.vic.gov.au


Thursday, July 01, 2010

Still on the Internet Explorer 6 web browser? Microsoft tells organisations to ditch it

Microsoft has just released a beta version of Internet Explorer 9, however is still having to ask organisations to stop using Internet Explorer 6 (IE6).

Despite lacking the ability to fully render the modern web, IE6, released nine years ago, is still used by a number of Australian organisations, including some government agencies.

The Sydney Morning Herald, in the article Microsoft begs users to ditch IE6 quotes Microsoft Australia's chief security officer, Stuart Strathdee as saying “IE6 has a lifecycle. We’re well beyond its expiry date”.

The article also stated that,

Strathdee said corporate users who haven’t yet upgraded to IE8 fearing the loss of customised ERP and CRM systems were probably running outdated versions of those and should look to upgrade them all. He said the company would be happy to help customers do so.

“It’s only a very small number of queries on those systems that would be locked to IE6,” he said.

“For us security and privacy are closely related. We’re really pleading with people to upgrade.”

Is your agency still using IE6?

If so, the question becomes: is your senior management aware of the security and reputation risks they are taking by doing so?


Friday, June 11, 2010

Reinventing website perfection

Traditionally, in my experience in both the private and public sectors, the way to build a 'perfect' website has been considered to be:

1) invest a large quantity of resources, personnel and time at the start of the development process,
2) use this investment to build all the functionality the developers can dream up, write all the content the communicators can think of and test it with audiences,
3) launch the 'perfect' website and hope it works, and then
4) replace the website (fixing most of the bits that failed) after 3-5 years by repeating the process again.

Personally I've never liked this approach. It places a lot of reliance on using past knowledge to guess future (organisational and audience) needs, involves investing a lot of resources upfront with limited ability to terminate or redirect projects until after they have failed, and results in websites that degrade in effectiveness over time - which can lead to progressively greater reputation and legal risks.

I'd like to see the process for developing a 'perfect' website reinvented. The new process must involve a low upfront cost, the ability to be flexible and agile to meet changing needs quickly and be capable of making a website more and more effective over time, improving reputation and reducing legal risks.

But how is it possible to achieve all these goals at once?

The answer is actually quite simple and well understood by successful entrepreneurs.

Rather than aiming for a perfect site on release day after an extended development period, the goal is to quickly build and launch a site that meets at least one critical audience need.

Once the site has been launched, ensure there are tools for monitoring how it is used and identifying user needs. Then progressively build extra functionality and write more content, guided primarily by the needs of your audience.

This approach ensures the site has enough value at launch to be successful, albeit in a more limited fashion than a 'kitchen sink' website (with more functionality at launch). It also ensures that the website grows progressively more useful and relevant to the audience you aim to serve.

In this way the site becomes increasingly perfect in a more realistic way - perfect for the audience who use it, rather than 'perfect' for the stakeholders who think they know what different audiences want.

We see this approach taken with all kinds of websites and products - from Apple's iPhones through to online services such as Gmail.

It's time to see more of this approach used with government websites as well.

After all - don't we want to create the 'perfect' website for our audiences' needs?


Friday, April 30, 2010

The street as a platform, what's government's role?

An extremely thought-provoking post, The street as platform, written by Dan Hill in February 2008, has been brought to my attention by Darren Sharp.

The post explores the virtual life of a city street, all the digital data exchanging hands between systems, infrastructure, vehicles and people in the street unseen to human eyes.

While condensed into a single street, the post is based entirely on current technologies and practices. It could easily represent a real street in any major city anywhere in the world today.

The question for me is what is government's role in building the infrastructure, managing and effectively using the data collected?

Streets are generally infrastructure created and maintained by governments and the systems that 'power' a street are often installed and managed by public concerns (roads and pavements, water, sewage, electricity and telecommunications) or at least guided by government planning processes (the nature of the dwellings and commercial services provided on the street). So there's clearly a significant role for government in the virtual aspects of streets as well.

There has been some work done internationally on precisely what the role of government is (some articles and publications are listed at the Victorian Government's eGovernment Resource Centre), but have we done enough here in Australia?

Given we have a national broadband network planned, and are already preparing for pilot roll-outs, ensuring that it enables, rather than limits, the vision of our digital streets in a managed and well-thought-out manner is clearly moving up the priority list.


Tuesday, March 30, 2010

Australian public servants told three times - open (reusable) government data is important.

The Australian Public Service (APS) has now been told three times by three different reports in the last year about the importance of releasing much of its information openly to the community.

This began with reforms to Freedom of Information which, once passed, will encourage a pro-disclosure environment within the APS and make it easier and cheaper for people to request information from government.

Second was the Gov 2.0 Taskforce Final Report: Engage, which recommended managing public sector information as a national resource, releasing most of it for free and in ways that promoted reuse in innovative ways.

Third is the report released yesterday by the Department of Prime Minister and Cabinet, Ahead of the Game: Blueprint for the Reform of Australian Government Administration. The report recommended that Departments should create more open government, with one of the detailed sub-recommendations being,

Greater disclosure of public sector data and mechanisms to access the data so that citizens can use the data to create helpful information for all, in line with privacy and secrecy principles;
The last two reports are yet to be responded to by the Australian Government, however I hope that Australian public servants at all levels are taking note.

Once is chance, twice is coincidence, but three times is a strategy.


Tuesday, March 09, 2010

FutureGov Hong Kong - Day 1 LiveBlog

I'm attending FutureGov Hong Kong over the next two days and will be liveblogging and tweeting from the event where possible.

The event features speakers and attendees from countries across Asia-Pac, including Korea, Singapore, Taiwan, Hong Kong and China and should provide insights into Government IT and Gov 2.0 initiatives across the region.

We're just kicking off for the morning so I am opening up my liveblog below...


Thursday, January 28, 2010

Get your entry ready for the Australian Excellence in e-Government Awards (from AGIMO)

AGIMO's Excellence in e-Government Awards are on again for 2010, with nominations opening on 1 February (and closing 1 March).

This year the categories have been revised. There are now five project specific Awards plus a team or individual Award for outstanding achievement in an Information Technology area within the Australian Federal Public Service.

Details of the e-Awards are available at AGIMO's section in the Department of Finance's website.

While nomination forms are not yet available, if you're considering entering the awards you may wish to get a head start on understanding the project-specific criteria.

Read full post...

Friday, November 27, 2009

How much should a government website cost - are we over-engineering our websites?

These days when I personally need to set up a new website, I either hop onto WordPress or download one of the free open-source content management systems, purchase space on a decent US server and follow the installation instructions.

I use a design template found online, customising it with some style tweaks where required, then spend a few days writing content.

It's not very hard and doesn't take very long (normally under a week).

However in government we have very strong governance structures around website creation - with good reason - to ensure that the platforms we use are secure, reliable and effective. We also have extensive content approval processes which can require a number of steps before words reach the screen.

This places a great deal of overhead on the process of creating and managing government websites, adding significantly to IT and resourcing costs.

I don't question the need for public organisations to guarantee the reliability and security of their websites. However I do wonder if we're placing a disproportionate level of cost onto this process - so much overhead on our websites that they may be slower to deliver and less cost-effective than other communications channels.

I also wonder if departments spend much time scrutinising their governance arrangements to see if they can reduce the burden, and therefore the cost and time to market (without compromising the outcome), by either planning ahead or working together better.

If we are really one government shouldn't we be able to - as a group or via some central agency - security-assess and review a group of web technologies, then pick and choose between them as needed, depending on our internal platforms and needs?

Why not compare our departmental content management processes and learn from the organisations who are most effective and efficient?

Food for thought.

Read full post...

Thursday, October 01, 2009

Adapt the service not the user

I've been rereading the ABC article about the two girls who got caught in a drain and used their mobile phone to update their Facebook status, rather than call Triple-0.

A representative of the Metropolitan Fire Service (MFS) in Adelaide said that,

If they were able to access Facebook from their mobile phones, they could have called triple-0, so the point being they could have called us directly and we could have got there quicker than relying on someone being online and replying to them and eventually having to call us via triple-0 anyway.
Professor of Media and Communications at the Queensland University of Technology, Terry Flew, says public education campaigns are facing an ongoing struggle to compete with social media.

I think that the main point has been missed.

The internet and digital devices are changing cultural and personal behaviours. In some respects they are even changing our physical behaviour and may be changing our brain chemistry.

I don't believe that it is the role of Public Authorities to try to turn the clock back by 'competing' with social media - reinforcing messages such as if you're in trouble call triple-0 - just to preserve the 'way the system has always worked'.

In usability terms this is similar to releasing a human-unfriendly system, then producing a huge user manual and communications campaign to attempt to train people to work the way the system works (except in this case the system remains the same and it is people who have changed).

Often it is cheaper and more effective to turn this approach on its head. Re-engineer the system to work the way that people think.

Successful companies have learnt this. They change their products over time to suit emerging social and cultural norms. It's a Marketing-based approach, where the organisation figures out what people want and provides it, rather than a Communications-based approach, where you build products the way the organisation wants then try to convince people to accept them.

The lesson I draw from this emergency situation is that the public service is still grappling with the question of whether and how to adapt its systems to suit its audiences.

For the girls down the drain it may have been faster to call Triple-0, however this wasn't the behaviour they were used to. It wasn't 'normal'; in fact, they had probably never done it before.

So why not adapt our emergency services instead?

Have a presence on social networks that people can use to contact them in emergencies.

Create smartphone apps that people can install and use to send the information the emergency services need to act.

Set up Twitter accounts that can be used to call for help.

Even simply point '911' to '000' so either number reaches our emergency services - most Australians hear '911' far more often in movies and on TV than they ever hear 'Triple-0'. The original rationale for '000' - that it was harder to dial in error on rotary phones - has disappeared with keypads anyway.
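To make the smartphone app option above concrete, here is a purely hypothetical sketch in Python of the kind of structured report such an app might assemble. Every field name and the function itself are invented for illustration - no real emergency service exposes this interface or schema:

```python
import json
from datetime import datetime, timezone

def build_emergency_report(latitude, longitude, category,
                           description, callback_phone=None):
    """Bundle the details a dispatcher needs into one structured payload.

    All field names are illustrative only - a real service would publish
    its own schema and authentication requirements."""
    return {
        "reported_at": datetime.now(timezone.utc).isoformat(),
        "location": {"latitude": latitude, "longitude": longitude},
        "category": category,              # e.g. "trapped", "fire", "medical"
        "description": description,
        "callback_phone": callback_phone,  # lets operators fall back to voice
    }

report = build_emergency_report(
    -34.9285, 138.6007, "trapped",
    "Two people stuck in a stormwater drain",
    callback_phone="0400 000 000")
print(json.dumps(report, indent=2))
```

Even a payload this simple carries more actionable detail - a precise location and a callback number - than a Facebook status update, which is the point: meet people on the channels they already use, but capture what responders actually need.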

Some of these avenues may be 'less efficient' for the system. They may increase the time required for emergency services to respond.

However they will ensure that the emergency services CAN respond.

It may even increase the number of people who legitimately contact emergency services - those who wouldn't call Triple-0, but will put a note on Facebook that, for example, they are feeling suicidal.

Certainly checks and balances will need to be in place to prevent fraudulent use, but we managed to do it with a telephone number - surely we're smart enough to do this in other mediums.


The issue of adapting services versus adapting users isn't unique to emergency services, it affects every interaction between government and public.

Every time the government forces people to use the channel it prefers - be it telephone, paper, in-person (or even online) - it is attempting to adapt the user to suit its own processes and needs.

This can reduce citizen engagement, satisfaction and completion rates, resulting in poorer outcomes for individuals.

Instead the government should seek to understand how people prefer to engage and seek ways to adapt its services to suit people's needs. AGIMO's report, Australians' use and satisfaction with e-government services—2008, provides some ideas.

Sure there are many cases where it may be legally impossible to accept channels like the net for transactions with government. However there are many services where we can adapt - it just takes a little creative thinking. We may even save the public money or provide a faster service and we will not be 'competing' with social networks, we'll be leveraging them for public benefit.

Let's seek to change our public sector philosophies and adapt government policies and services wherever possible, rather than attempt to adapt our users to suit 'how we prefer to do things'.

Read full post...

Monday, July 13, 2009

Operating web and IT in an abundance mindset

Chris Anderson, editor-in-chief of Wired, recently wrote a very thought-provoking piece, Tech Is Too Cheap to Meter: It's Time to Manage for Abundance, Not Scarcity, about the need for organisations to operate with an abundance mindset rather than a scarcity-based one.

Chris uses one example of how Wired used to restrict the email and file space provided to every staff member, with the IT team prompting staff regularly to delete files so as not to fill up the server.

One day he asked his ICT team how much file storage space Wired had for staff and was told that they had 500GB - half the size of the 1 terabyte hard drive in the home computer he had recently bought for his kids. As he said,

My children had twice as much storage as my entire staff.

I have had a similar experience in various organisations I've worked at. Despite falling storage and computing costs, organisations often place heavy restrictions on staff computing power - for what reason I'm not sure.

Cost probably isn't a good reason for this scarcity mindset. If, for example, a 5,000 person organisation only allowed each staff member 200MB in file and email space, that would mean the organisation had limited itself to 1,000GB (1 terabyte) of storage for staff.

Looking quickly at hard-drive prices, a 2 terabyte commercial-quality hard drive costs about AU$500.

In other words, now you can buy twice as much staff file storage as the example organisation above for only $500 - and the price is going down.

Now consider the staff side of the equation. Files keep getting larger, as do emails. If you assume that each staff member spends 10 minutes each month reorganising their file space to prevent them from going over the organisation's limit, that's a cost of 50,000 minutes or 833 hours each month.

Assuming that each hour of staff time is worth around $50 - including wages, equipment and overheads - that lost time costs the organisation $41,650 in productivity, or $499,800 each year.

To put this in perspective, if the organisation removed the limit on file space and compensated by spending $500 (2 Terabytes) on extra storage it would save $41,650 in staff productivity costs - each month.

That's a return of more than eighty times the cost - an ROI of over 8,000% - in the first month alone.
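For anyone who wants to check the arithmetic, here is the same back-of-envelope calculation in a few lines of Python. All figures are the assumptions stated above, at roughly 2009 prices:

```python
STAFF = 5_000              # people in the organisation
QUOTA_MB = 200             # per-person file + email limit
MINUTES_PER_MONTH = 10     # time each person spends tidying files monthly
HOURLY_COST = 50           # wages + equipment + overheads, AU$
DRIVE_COST = 500           # one 2 TB commercial-quality drive, AU$

org_quota_tb = STAFF * QUOTA_MB / 1_000_000   # whole-of-org storage cap
hours_lost = STAFF * MINUTES_PER_MONTH / 60   # staff time lost per month
monthly_cost = hours_lost * HOURLY_COST       # productivity cost per month

print(f"Org-wide quota: {org_quota_tb:g} TB")
print(f"Hours lost per month: {hours_lost:,.0f}")
print(f"Productivity cost: ${monthly_cost:,.0f}/month,"
      f" ${monthly_cost * 12:,.0f}/year")
print(f"Payback multiple on one ${DRIVE_COST} drive:"
      f" {monthly_cost / DRIVE_COST:.0f}x in the first month")
```

The figures in the text come from rounding 833.33 hours down to 833 before multiplying, which gives $41,650 a month rather than the unrounded $41,667; either way, the one-off cost of the extra drive is repaid more than eighty times over in the first month.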

Naturally there would be some other costs - servers, redundancy, electricity and the need for effective search technology. However the outcome would remain the same, the organisation is better off investing in more storage than in enforcing a 'scarcity' mindset.

File storage space is only one example.

I've also seen organisations struggling with low bandwidth, which slows down applications and internet services and so hinders productivity. With ISPs able to provide adaptable bandwidth, there's not much excuse for this type of approach.

Equally organisations often provide their staff with outdated equipment and applications, which also reduces productivity. In many cases staff now have cheaper and more powerful systems and software at home.

While sometimes software is 'held back' to older versions due to security concerns (or lack of staff to check and approve security), the reality is that most modern software is more secure than older versions of applications.

Restricting software and hardware for security purposes can result in the opposite effect - reducing the organisation's security. If staff are forced to send work home to finish it, or go home to view websites and use online applications, this can raise the risks to the organisation.

Again this type of approach reeks of scarcity and cost-focused thinking, rather than an abundance and productivity-focused approach. It probably costs less for an organisation to employ contract staff to security-assess vital applications than it costs the organisation in lost productivity. Even though upgrading the applications may be expensive the net productivity and security gains for the entire organisation can be significant.

Another example is around the use of web services, which are extremely low cost and easy to test and trial. Organisations need to allow staff to experiment with these tools in appropriate ways, rather than requiring them to always follow tender-based processes to procure expensive custom-built alternatives, or have them coded in house (also at significant opportunity cost).

Finally organisational websites are often managed on a scarcity approach, with limited bandwidth and storage space, or with information cut-down from what is provided in print publications.

Again this applies a scarcity mindset. Domains are cheap, storage is cheap, bandwidth is cheap and an appropriately organised website can have great depth of content at relatively low delivery cost (certainly much lower cost than phone, mail or face-to-face).

So, in conclusion, at least in web and IT matters organisations need to consider an abundance mindset rather than a scarcity one.

They have to consider whether their policies and procedures aid or harm staff productivity, and whether the savings from restrictive policies (such as file storage limits) justify the cost of policing them and the productivity they destroy.

Read full post...
