I have been reading a great deal recently about the newest “big idea” in computing: “cloud computing.” I just realized that I am an expert in cloud computing, having entered the computer industry as a cloud computing specialist some 30 years ago. Way back then (I believe it is now referred to as the Paleo-PC Period) computers cost millions of dollars (that was a lot of money in those days), they were housed in glass-enclosed altars where the air was specially treated, the software to run them was expensive to build and maintain and rife with flaws (much like today), and there were only a very few of us demi-gods with the requisite skills and magic spells to make them work at all (we wore blue, knee-length smocks, were feared and worshipped, and were called geeks behind our backs; can you imagine?).

The unwashed masses (some of us geeks called them “L-Users”) were always bothering us with endless requests for more functionality and faster response, and with complaints about things like accuracy (hence the term “garbage in, garbage out”) and reliability (“We are down for systems testing. How did you do your job before we got here? Well, then go do it that way until we are back up”). Those were the good old days. No one cared much about security, and certainly not about privacy at all. We called it “time-sharing,” and there was an entire industry built up around it. And then it all went away. I wonder if anyone else wonders why it all changed and an entire industry disappeared, and what that might mean for this latest incarnation, now called cloud computing. Well, probably not, but I thought I would muse a bit about it anyway.
Besides the fact that cloud computing (excuse me, time-sharing) was all we had, there were obvious advantages to sharing expensive resources among many L-Users. However, as the cost of hardware went down, the power of computers went up, and the distance (physical and relational) between L-Users and demi-gods bred mistrust and dissatisfaction, the rationale for cloud computing diminished. So once again, as a perk of being old, I get to see another cycle repeat.

So what advice does a former demi-god cloud computing specialist have to offer? First, nothing, I repeat, nothing fundamental magically changes by virtue of where you do your computing. It is still about the basics. Do you know your requirements? Does the system or application meet those requirements? Does the computing enhance productivity, satisfy the identified needs of the user, and do so in a manner that meets the need for security, privacy, and transparency?

Then the question must be asked: are you really saving money? Are the staff, computers, and facilities being replaced by the cloud really going away, or are they merely being repurposed? If they are really going away, I think we will know by the howl that goes up as jobs and contracts are lost around the country.

Next, we must ask: what are we doing in the cloud to make sure that the distance between users and cloud computing managers does not turn back the clock to the bad old days of blue-smocked demi-gods deciding what the L-Users should get? Customer service must be elevated, and understanding and responding efficiently to changing user requirements must be made a priority. Care must also be exercised, in the enthusiasm to move to this “new” way of computing, to guard against “cherry-picking”: the inevitable tendency to move the newer applications into the cloud while leaving the older, stove-piped applications behind on their non-cloud legacy computers.
Besides the cost implications, the risk is that further fragmentation of data will make transparency even more difficult to achieve. The obvious issues around the security, redundancy, and resilience of these cloud computing facilities have been identified and are already being widely discussed, but I have missed it if there has been any public discussion about the network infrastructure required to support the cloud. The assumptions about reliability, sustainability, and capacity that appear to surround this network infrastructure need to be identified and challenged. So while this new, old way of computing (or is it the old, new way?) is worked out, I am trying to get out ahead of the next big idea: ghost computing.
The single most important IT-related activity that the Obama administration should undertake on January 21st is to set a "mission to the moon" agenda for the federal IT community. As an example, I would suggest the following: "On January 20, 2013, the federal civilian IT community will be delivering all IT services to all its customers via a single, highly secure, fully redundant, speed-of-light, state-of-the-art, consolidated IP distribution network, at one-fifth the total cost of these services today, and with a customer satisfaction rating five times better than today's." The specifics of the goal are less important than the goal itself being a clarion call for the public sector to step up and serve a leadership role. Instead of being forced into the 21st century, the federal IT community—given the talent, resources, and mission—should lead the way.
Telework is great. I love it. I support it. I am for its adoption.
So why all the declarations?
Maybe I’m overly sensitive, but it’s been my experience that people are tarred as telework opponents if they point out that teleworking has real costs and requires difficult change.
But it does. And agencies must confront three truths before teleworking will spread more widely than it has so far. They are cost, culture, and Congress.
First, costs. To make teleworking possible, agencies must in many cases upgrade their infrastructure. In some cases they’ve got to build a whole new one.
This upgrade starts with a government-owned PC, preferably a notebook, issued to each and every teleworker. Gone are the days when we can tolerate a government employee doing government business on a personally owned computer. This computer must have a locked-down, standard software and hardware configuration so that any data stored on it, even temporarily, is virus-free, non-transferable, and encrypted. It must also be equipped with two-factor authentication of its user.
Then agencies must provide high-speed telecommunications access so employees can be productive. If we allow a user wireless access, we have to ensure that the access point is properly configured and operated.
Tech staffs must also provide data-in-motion protection and secure access to government networks via a virtual private network (VPN). At the government network access point, we must be able to assess the presence, currency, and authenticity of all of these factors before allowing access.
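To make the access-point check concrete, here is a minimal sketch of the kind of posture assessment a VPN gateway might run before granting a teleworker access. All of the names and checks below are hypothetical illustrations of the factors listed above, not any agency's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class DevicePosture:
    """Factors a gateway might verify before admitting a device (illustrative)."""
    government_issued: bool       # agency-owned laptop, not a personal PC
    config_signature_valid: bool  # locked-down standard image unmodified
    antivirus_current: bool       # malware signatures recently updated
    disk_encrypted: bool          # data at rest is encrypted
    two_factor_passed: bool       # e.g. smart card plus PIN verified

def grant_network_access(p: DevicePosture) -> bool:
    """Allow the VPN tunnel only if every factor checks out."""
    return all([
        p.government_issued,
        p.config_signature_valid,
        p.antivirus_current,
        p.disk_encrypted,
        p.two_factor_passed,
    ])
```

The design point is that the checks are conjunctive: a current antivirus on a personal machine, or a government laptop with stale signatures, both fail the gate.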
Depending on the level of technological maturity of an organization, all this could mean tens of millions of dollars of new capitalization funds and equal amounts of annual support costs. This money does not exist in most budgets.
If you throw in the costs of providing other technologies that allow teleworkers to be fully integrated into the workspace, such as teleconferencing, printing, scanning, and PDA use, the costs climb even higher. GSA has estimated that those costs could reach $7,500 to initially outfit each teleworker, and half that again in annual support costs.
Many of these costs are fixed, meaning that you incur them whether you support 100 or 10,000 users. So the tab to equip even 100 users at an agency could run to several million dollars in upfront costs.
Telework supporters often cite the savings in transportation costs, office space and environmental benefits. All probably true, but none of these savings is available to the agencies to offset their hard costs.
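A back-of-the-envelope model shows how quickly these figures add up. The $7,500 initial outlay and the "half that again" annual support come from the GSA estimate cited above; the fixed-infrastructure figure is a placeholder assumption I've inserted for illustration, not a real number.

```python
INITIAL_PER_USER = 7_500                 # GSA estimate: outfit one teleworker
ANNUAL_PER_USER = INITIAL_PER_USER // 2  # "half that again" each year in support
FIXED_INFRASTRUCTURE = 2_000_000         # hypothetical VPN/network buildout cost

def first_year_cost(users: int) -> int:
    """Fixed buildout plus per-user equipment and first-year support."""
    return FIXED_INFRASTRUCTURE + users * (INITIAL_PER_USER + ANNUAL_PER_USER)
```

Under these assumptions, even a 100-user pilot lands at $3,125,000, squarely in the "several million dollars" range, and none of the savings cited by telework supporters flows back to the agency footing that bill.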
Then there are the cultural issues to be addressed. Agencies will have to identify and learn new techniques for collaborating at a distance; new ways of dividing work and reassembling work products; and new skills in communication and relationship maintenance. Managers will have to perfect new management skills and motivation techniques.
Not all employees have easily quantifiable work products. Managers will have to create new ways of measuring individual contributions.
Finally, Congress will have to adjust the law to allow the change from an hourly approach to compensation to a salaried government workforce that is paid for results, not just hours worked.
All of these issues are readily addressable, but they require a broader view of what telework involves. Simply mandating that it should be so is not enough.