Smarter Gov Tech, Stronger MerITocracy

Tin Cupping Doesn’t Work: Why We Desperately Need an IT Modernization Fund

 

GAO this morning confirmed what anyone who has looked around the Federal government knows all too well: we are trying to embrace a 21st century digital government using antique (some would go so far as to say “fossil”) computer systems. GAO has just completed a yearlong review of how bad things truly are, and oh boy, are they bad.

GAO reports that the Federal government spends the majority of its $80 billion technology budget on maintaining and operating legacy systems—systems that are extremely vulnerable and often impossible to protect from hacks and exfiltrations. And don’t even get started on how much all this old stuff costs to keep running. GAO told Congress that billions of Federal dollars can be saved just through consolidating data centers throughout the country. To date, agencies have closed more than 3,000 data centers, resulting in savings of $2.8 billion.

Some agencies reported 3,427 IT staff employed just to maintain legacy programming languages, such as COBOL (1,085 staff) and Fortran (613). These totals do not even include DoD or Labor, which could not estimate the number of lines of legacy code involved.

My favorite example from GAO is how DoD is using a 1970s-vintage IBM Series/1 minicomputer to store nuclear weapons alert notifications. Sure, the system is a backup, and yes, Terry Halvorsen told Congress that it actually works pretty well and cannot be hacked. I guess the Chinese don’t have anyone old enough to figure out how to get access to it.

So what should agencies do to solve a problem that Federal CIO Tony Scott says is worse than the Y2K challenge?

Well, Scott said something else to Congress that sums up the situation brilliantly: “When you find yourself riding a dead horse, you should dismount.”

His idea: let’s get off the old horses by establishing an IT Modernization Fund to serve as seed corn that accelerates agencies off of fossil computers. To ensure good money is not thrown after bad, require the agencies to make a hard-and-fast business case to an independent panel of experts before they get any funding. Then track their success or failure closely and with full transparency.

It’s interesting because this is not a new idea. Vivek Kundra, the previous Federal CIO, called for something very similar in his seminal 25-Point Plan for IT Modernization. And the first versions of FITARA, which was successfully voted out of the Oversight Committee and cleared the floor of the House of Representatives by unanimous consent, contained a similar idea that unfortunately died when the legislation went over to the Senate.

Even Jason Chaffetz, R-Utah, chairman of the Oversight Committee and a tough guardian of agency spending, said he was “warming to the idea” of such a fund.

Good. We need the IT Modernization Fund. Going agency by agency and tin-cupping is simply not going to get the job done. How else are we going to get basic cybersecurity embedded across the Federal government?

 

Richard Beutel is a Principal at Cyrrus Analytics.


The Sky’s the Limit: Best Practices for a Surprise-Free Cloud Move

The pressure is on for agencies to make the move to cloud – just 13 percent of Feds say they can deploy new systems as fast as required[1].

However, agencies know cloud is here to stay.  When it comes down to it, Feds are optimistic – 70 percent say increasing their cloud adoption pace will improve IT’s ability to innovate[2].

That said, flying through the cloud procurement process is no easy feat. Making the cloud move is proving more complex than expected – but with the right planning and strategic moves, Feds can ensure a surprise-free migration.

It all starts with the basics. Agencies need to build their perfect cloud from the ground up. Whether it’s public, private, or hybrid cloud, agencies should first think about their goals and define what success looks like.

The devil is in the details.  For example, do you have the staffing availability and skillsets to implement cloud and keep it running?  Oftentimes, agencies skip right to hardware – when skillset should be a part of the initial conversation as well.

Expect the unexpected. What kind of performance and security does your agency need? Will you need backup protection for your data? How fast can your agency respond to an issue? Agencies must also consider unpredictable situations when making the move to cloud and choosing a cloud service provider. Keep in mind that data and applications are not all alike; each may have a different set of requirements.

Take only what you need.  Agencies need to forecast what they need from their cloud, in terms of capabilities and other specs.  Are you looking for data management?  Flexibility?  What’s best for some is not for others.  A good cloud service provider will walk you through the key considerations.

Shop now, pay later.  Given budget pressures, agencies often want to jump directly to pricing when tackling the cloud procurement process.  But, this can backfire, as the agency can choose a solution that may not best fit their needs.  Start by considering the needs of your applications and what it takes to achieve mission success – then move on to price evaluation.

ViON helps agencies make the right cloud move.  We help you leverage the right architectures and capabilities to solve your business problems and achieve mission success.  We’ll sit down and define success – identify the problems you need to solve.  Then, we’ll design the cloud that’s best for you.

With a straightforward business model approach to cloud, we make sure you pay only for what you use – helping you maximize your budget for the best performing cloud. And, with 24/7/365 enterprise-class support, professional services, and managed services, we ensure you don’t have to make your cloud move alone.

And that’s the real silver lining.

This blog post was originally published here.

[1] “The Agile Advantage:  Can DevOps Move Cloud to the Fast Lane?” MeriTalk, May 2015.

[2] “The Agile Advantage:  Can DevOps Move Cloud to the Fast Lane?” MeriTalk, May 2015.

The Situation Report: Get Ready For Legacy Dustup On The Hill

FITARA Fallout

My Capitol Hill listening post decoded some interesting chatter this week taking place throughout the halls of the Rayburn House Office Building. That was where NASA landed its newest spacecraft—the Starship Incredulity—and unloaded one of the most expensive works of fiction ever written. NASA CIO Renee Wynn had the audacity to report that not a single investment in the agency’s planned $731 million in technology programs is high risk. That didn’t go over well at all with one senior Federal official, who looked at NASA’s failing grade on the annual Federal Information Technology Acquisition Reform Act scorecard and said: “Look, if they’re not going to be honest then they’re going to get an F. It was that simple.”

Legacy Lashings

Get ready for another round of hard-hitting congressional questioning, this time about the government’s addiction to legacy IT systems that continue to drain billions of dollars of taxpayer money away from new investments and modernization accounts.

The House Oversight and Government Reform Committee plans to call Federal CIO Tony Scott, Defense Department CIO Terry Halvorsen, Health and Human Services’ Acting CIO Beth Killoran, and IRS Chief Technology Officer Terry Millholland before the full committee May 25 to answer for this epic waste of Federal IT dollars.

The Government Accountability Office’s auditor in chief, Dave Powner, will also make an appearance at the hearing—Federal Agencies’ Reliance on Outdated and Unsupported Information Technology: A Ticking Time Bomb. My advance scouts report that Powner’s revelations will certainly leave you guessing “is it live, or is it Memorex?”


The Weekend Reader – May 20

The Federal IT Papers–Part 1

meritalk.com – MeriTalk begins a series taken from a book-length work authored by a senior Federal IT official currently working in government. This is one part of an extensive, firsthand account of how IT decisions are made, the obstacles standing in the way of real change in government technology management, and what one career Federal IT employee really thinks about the way government does IT.

Space Agency’s FITARA Grade Baffles Lawmakers

meritalk.com – NASA plans to spend $731 million on major IT investments, but has not reported any of those programs as high risk. “You can’t manage these IT investments appropriately if you can’t acknowledge risk,” said Dave Powner, director of IT Management Issues at the GAO. Those with failing grades were not the only ones to receive criticism from the committee members.

 

IG Uncovers Data Breach at 18F

meritalk.com – IG investigators became aware of the vulnerability and the data breach during their ongoing investigation of 18F financial management. The vulnerability stemmed from 18F’s use of the Slack instant messaging system in conjunction with the OAuth 2.0 authentication and authorization process. 18F supervisors are also coming under scrutiny for delaying the reporting of the vulnerability.

 

Industry Prods Cybersecurity Commission to Back Blockchain Tech, Insurance Market

meritalk.com – Industry executives urged the Federal government to do more to advance the use of blockchain technology to secure online financial transactions, and to get behind nationwide adoption of cybersecurity insurance. Although bitcoin is an anonymous network, IBM Fellow Jerry Cuomo advised the commission to support what is known as permissioned blockchain, which involves the use of blockchain in networks where the users are known and trusted. “Blockchain has tremendous potential to help transform business and society, but it’s so strikingly different from what people are used to that many business and government leaders are adopting a wait-and-see attitude,” Cuomo said.

As IT Changes Health Care, CIOs Must Be Responsive

meritalk.com – Dave Dimond of EMC said the challenge health care providers face has everything to do with meeting patient expectations in a world “where they want things, and they want them now.” A recent survey from Vanson Bourne found that 89 percent of health care providers say technology has already changed their patient expectations. With the health care sector moving rapidly toward value-based care–a model based on using data to improve outcomes while lowering costs–information management will be key for organizations that want to survive, Dimond said. IT is a resource that will be spread too thin without finding new ways to free up a CIO’s time to focus on the requirements of value-based care, such as predictive analytics.

 

Cloud Security – Silver Lining for Feds

FedRAMP is not accelerating the path to the cloud for Federal agencies as quickly as anticipated. But, recognizing potential saving opportunities and significant operational and efficiency benefits, Feds are ready to move. A research analyst at Deltek stated, “Fiscal 2016 will be a year when cloud spending picks up greater speed…”

It’s time to clear the most significant barrier to cloud adoption – security concerns.


The Cost of Cloud: Covering All Your Bases

To score a run, you have to touch all the bases – and getting around the diamond is harder than it looks. For Federal agencies, hitting IT out of the park is even more difficult due to budget constraints, lengthy procurement processes, and staffing difficulties.

And, as agencies continue to spend 79 percent of IT budgets – or $62 billion annually – on legacy systems, the window for innovation decreases every day.

Federal mandates like FITARA are stepping up to the plate, and Federal IT leaders are looking to cloud migration as a key to increased efficiency, innovation, and cost savings. Feds point to the opportunity to realize an estimated $12 billion in potential cloud savings through cutting duplicative and wasteful spending.

But here’s the catch – 73 percent of Feds say budgets hold them back from updating legacy systems.

So how can agencies stay ahead of the curve?

Most agencies jump directly into price evaluation as the first step in the cloud procurement process. It is critical that agencies first identify cloud goals, ensure the right solution design – and then dive into cloud cost. For example, what mission challenges is your agency facing? What problems do you need to solve? What are your security needs?

Agencies also need to keep physical space and environmental factors in mind. Who needs access, and how often? This area is often overlooked when considering cloud options and planning cloud costs. What are the performance and availability requirements? What kind of SLAs do you need? How fast will the cloud provider respond if there is an issue?

Last but not least, agencies must determine whether they have the right skills on board – or alternatively, be prepared to leverage outside resources to manage their cloud infrastructure.

When it comes to overall price, many assume public cloud is cheaper than private. This is not always the case.

Typical public cloud implementations meter charges based on an input/output (I/O) rate. With private cloud, the necessary network infrastructure is included – and there will rarely be additional charges based on fluctuating I/O rate – which means more predictable costs.
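The metered-versus-flat tradeoff described above can be sketched as a back-of-the-envelope cost model. All dollar figures and rates below are illustrative assumptions, not any provider's actual pricing; the point is only to show how a metered bill overtakes a flat one as I/O volume grows.

```python
# Hypothetical comparison of a metered public-cloud bill versus a flat
# private-cloud bill. The base charge, per-I/O rate, and flat monthly
# figure are made-up placeholders, not real provider pricing.

def public_cloud_cost(io_requests, base=500.0, per_million_io=0.40):
    """Metered model: a base charge plus a fee per million I/O requests."""
    return base + (io_requests / 1_000_000) * per_million_io

def private_cloud_cost(flat_monthly=2_000.0):
    """Flat model: network infrastructure is bundled, so the monthly
    cost does not fluctuate with I/O volume."""
    return flat_monthly

if __name__ == "__main__":
    # As monthly I/O volume grows, the metered bill overtakes the flat one.
    for io in (1_000_000_000, 5_000_000_000, 10_000_000_000):
        pub, priv = public_cloud_cost(io), private_cloud_cost()
        winner = "public" if pub < priv else "private"
        print(f"{io:>14,} I/O: public=${pub:>8,.2f}  "
              f"private=${priv:>8,.2f}  -> {winner} is cheaper")
```

Under these made-up rates, the flat private model wins once monthly volume passes roughly 3.75 billion I/O requests. The real crossover point depends entirely on the rates an agency actually negotiates – which is exactly why defining requirements before comparing prices matters.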

Plus, private cloud storage can be customized. Flash, SATA, thin provisioning, and many other techniques can be leveraged to control costs and drive performance. Often as the storage environment scales up, or as the performance requirements increase, private cloud will be a better choice. The key is to define requirements completely, then build the right solution from there.

With the private cloud environment, qualified CSPs can tell agencies the most they’ll ever spend, cap that number, and give the CIO the opportunity to block additional spending. This is typically not the case with public cloud, where forecasting costs is more difficult – and there are often significant charges involved if an agency wants to remove its data.

At ViON, our cloud pricing model gives agencies the option of adding more capacity while still paying only for what they use.

We help agencies balance cloud costs and data access/data sharing needs to identify the right option. We begin with defining the requirements, considering all factors impacting costs, and identifying which cloud model is right (and most cost effective) for your agency. You can deliver more flexible, responsive IT services to your teams, improve citizen service, and enhance visibility, control, and efficiency.

And that’s the real silver lining.

 

This blog post was originally published here.

The Weekend Reader – May 13

Device Makers, Telecoms Face Competing Government Demands on Privacy

reuters.com – The FCC sent a letter to mobile carriers, citing “a growing number of vulnerabilities…that threaten the security and integrity of a user’s device and all the personal, sensitive data on it,” and asking how carriers address those vulnerabilities. The FTC simultaneously ordered eight manufacturers of mobile devices to respond to a detailed set of questions about how they update the devices’ security protections and keep customers informed of those updates. Terrorist groups rely on encryption, FBI Director James Comey said, suggesting–as the government argued throughout its attempt to compel Apple to help crack security on an iPhone used by the San Bernardino shooters–that law enforcement agencies believe they are entitled to assistance from tech companies.

 

Can Artificial Intelligence Help Government Serve Citizens?

govtech.com – At the NASCIO Mid-Year Conference in Baltimore last week, Government Technology talked to state CIOs about whether cognitive computing can help them deal with the data deluge. Wisconsin CIO David Cagigal sees a definite role for technology that learns from citizen behavior to inform services and anticipate future needs.


FDIC Officials Differ on What Constitutes ‘Major’ Data Breach

federaltimes.com – FDIC experienced several data breaches over the last few years, at least seven of which were the result of employees taking sensitive information with them when moving on to new jobs. Initially, the FDIC declined to label these as major incidents, which would require immediate reporting to Congress. However, subsequent to IG investigations and news reports, it was determined all seven rose to this level and they were retroactively reported.


Menlo Park Police Join Federal Push for Open Data

mercurynews.com – The local police department has signed on to a 2015 White House initiative that calls for boosting transparency to increase trust in communities. The Menlo Park Police Department joins 52 other law enforcement agencies around the nation, including San Jose, San Francisco, Vallejo and San Leandro, participating in the Police Data Initiative.

 

Georgia Attorney General Supports Federal Data Breach Standard

lexology.com – Georgia Attorney General Sam Olens has come out in support of Federal data breach preemption as a more realistic way to ask companies to comply with regulatory requirements in the wake of a breach or data loss incident. His statement comes on the heels of California Attorney General Kamala Harris’ report that the burden on companies to comply with the patchwork of state data breach laws is too heavy, and that state laws should be harmonized to lessen that burden. Saying that “the day of benign neglect is gone,” Olens said companies that are lagging behind in putting reasonable security measures in place have no excuse.

 

A Look Ahead at the FITARA Scorecard 2.0

Congress next week plans to hold a hearing on how Federal agencies are doing in their adherence to the Federal Information Technology Acquisition Reform Act, known as FITARA, and the public will get its first look at the second round of agency grades.

The promise, of course, is that this second round of grades—the FITARA Scorecard 2.0—will somehow magically move agencies from measuring their performance to changing their bad behaviors. That is not what is going to happen when the House Subcommittee on Information Technology meets May 18.

The FITARA Scorecard is undoubtedly a valuable tool for measuring agency progress in adhering to the letter of the law. It measures progress across data center consolidation, IT portfolio savings, incremental development, and risk assessment transparency. But let’s not forget that this is self-reported data that goes into FITARA Scorecard grades. Sure, most agencies got “Ds” and a handful scored well, but the scorecard must evolve if it is to become a true tool of transformation in government.

Moving forward, Congress must insist that the grades provided in the FITARA Scorecard are based on a much broader set of data points. Congress needs a clearer picture on the progress of CIO authority enhancements, as not all CIOs across government enjoy the same influence.

For example, how far has FITARA come in standardizing CIOs’ influence and control over IT investments, including the ever-present shadow IT that puts agency networks and data at risk? Are agencies leveraging governmentwide or agencywide contracting vehicles that would eliminate costly duplication? What kind of progress are agencies making on moving legacy systems to a modern cloud infrastructure and do agencies have a strategic road map in place to measure progress? And are agencies keeping tabs on the ratio of legacy IT spend and new IT investment? FITARA needs to be able to tell Congress how that imbalance is being addressed.

These are just some of the areas where FITARA’s scorecard could be improved. But by improving the data collection that goes into formulating the grades, we can ensure that those grades not only become more meaningful, they will actually become a tool for effecting real change.

Why Congress Needs to Take a Long, Hard Look at 18F

How many Presidential Innovation Fellows does it take to type 18F?

That may not sound like a serious question, but I can guarantee the anger of taxpayers and lawmakers will be real when they learn that it took a small team of designers from the General Services Administration’s 18F an entire weekend to come up with a barely perceptible change to the digital service team’s logo.

It gets worse. This so-called weekend “brandathon,” as it came to be known, began with an organization-wide workshop on 18F’s core values. “After this workshop, several design studios, and hours of work, the branding team had an initial set of deliverables to share with the rest of 18F,” wrote 18F designers Kate Garklavs and Jennifer Thibault in a mind-numbing 826-word blog posted Thursday to the 18F website.

This is what 18F calls “agile branding.” The weekend design “brandathon” took place last August—nine months ago.

Eric Ronne, one of 18F’s digital designers who participated in the summer work session, summed up the experience as follows: “At 18F we’re always changing and improving government interactions for our users. We iterate constantly here, and now we’ve iterated on our logo, too,” he said. “Our goal was to refresh the mark while nodding to the past, to create a straightforward update that’s accessible, bold, modern, and flexible.”

As if spending a weekend of design hours and nine months of internal back-and-forth discussions wasn’t enough to come up with this epic feat of Photoshop 101, the well-compensated digital branding innovators at 18F also spent time creating a collection of images featuring the new logo and inspirational messaging, “the optimism of which is central to our brand,” according to the blog post.

“These images, which team members use as desktop art, weren’t exactly the highest priority ‘need’ item, but they were a fast way to show the team how the new system could begin to flex in more exciting ways than just templates,” wrote Garklavs and Thibault.

And this epic waste of tax dollars isn’t over. “We plan to create infographic templates for our social media accounts. And eventually, we’ll restructure and restyle our website, another outfit that we’ve outgrown since we started in March 2014,” they wrote.

I’m all for digital services in government, and for improving government services through technology and innovation. But the amount of time, effort, and money that GSA dedicated to changing a font in Photoshop is an obscene misappropriation of government resources. This type of maddening waste is exactly why so many observers in and out of government have come to question the purpose, mission, and value of 18F.

The innovators at 18F would be wise to “iterate” on something more difficult and of more importance to the American people. If they don’t start doing that soon, then the days of the 18F experiment in government are surely numbered.

The Situation Report: Silicon Valley Disconnect

Valley Girl

Gag me with a spoon, Erie Meyer. America’s self-described “foremost technologist named after a great lake…working on service delivery” at the U.S. Digital Service recently described in a Twitter post what her former career was really all about.

Although there’s no serious Valley experience detectable in Meyer’s official bio, this issue is actually a very serious concern among those who have been critical of the new digital service stars recruited from Silicon Valley to help save government from its Luddite past. As the argument goes, Todd Park—the digital service recruiter-in-chief—has staffed the government’s digital service teams with a bunch of wide-eyed techies who have been unable to divorce themselves from their former Silicon Valley mission of designing click-bait, or as GitHub Evangelist Ben Balter put it, working to “implement the button for those clicks.”

So far, “implementing buttons” seems to be right in the USDS sweet spot. Consider their accomplishments: A College Scorecard website for the Education Department; a “revamped” online presence for the Federal Election Commission; and a Web template designer for static government websites. There have been others, but nothing revolutionary.

This is a big problem for the USDS at the White House and its sister organization, the General Services Administration’s 18F. Both organizations should be bracing for a major slap in the face from the Government Accountability Office, which is planning to release an audit of their progress to date. And The Situation Report has picked up strong signals that these particular digital audit pages will make a loud thud when they fall on the desk of U.S. Chief Information Officer Tony Scott at the Office of Management and Budget.

Deep Throat Meets Demosthenes

On May 16, MeriTalk will begin publishing The Federal IT Papers—An Exclusive Insider Account of IT Decision-making Gone Wrong. This weekly series of stories is based on a book-length work by a current senior Federal IT official who goes by the pseudonym Demosthenes—a great orator in Ancient Greece, and the name chosen by Valentine Wiggin in Ender’s Game. Our author insisted on anonymity due to a fear of reprisals.

In our first installment, Demosthenes takes issue with the government’s infatuation with the U.S. Digital Service and the way in which career Federal IT workers are treated.

“The problem with the USDS is that they make all these promises and have cachet. They are in the West Wing after all,” the author wrote. “When they recruit people, the mind-set coming in is that everyone who was working in Federal IT before I got here is shit. As a result they get rid of everyone who knows anything.”

In one case, a USDS specialist recruited from Silicon Valley by an agency CIO went so far as to undermine the CIO who hired them. The result? According to Demosthenes, the agency head decided it was time to get a new CIO.

“And now I see them embedding themselves in preparation for this next administration, and I can’t sit idly by and not speak up. Somebody has to pierce this bubble and say, ‘hey guys, thanks for importing a whole bunch of really smart people who don’t know the first thing about how to get shit done here. Thanks for bringing in a bunch of people who don’t value my contributions and don’t want to hear about what has worked and what hasn’t.’ ”

You can read more of The Federal IT Papers starting May 16.

Beyond Escape Velocity

Let’s make it three and go for the hat trick on digital services. My left coast listening post reports that it’s no coincidence that GSA’s latest innovation partnership in San Francisco will be located right next door to the West Coast 18F. My Market Street surveillance station has picked up strong signals that the so-called Superpublic Innovation Lab is part of GSA’s master plan to extend 18F’s presence both physically and politically—reaching a hiring velocity that makes it virtually impossible for the next administration to escape the pull of government digital services.

GAO Visits VA

My Vermont Avenue surveillance station in Washington, D.C., has noticed unusual Government Accountability Office activity at the headquarters of the Department of Veterans Affairs. Sources tell The Situation Report that auditors are interested in VA’s progress on identity theft prevention. In particular, investigators want to know how far VA has come in eliminating the use of Social Security numbers as a key identification number for veterans.

The Weekend Reader – May 6

Cybersecurity Goals to Guide Federal Software Spending

ecommercetimes.com – Evolving requirements to greatly improve Federal protection of information technology resources will shape Federal software spending. In fact, Federal cyberprotection goals should be augmented and significantly modified, according to recent studies of the Federal market. The linkage between increased Federal investing in cybersecurity and the requirements for bolstering IT protection are portrayed in two newly released reports.

U.S. Chief Data Scientist: Entrepreneurs Should do a Tour of Duty in Government

venturebeat.com – There’s no question that the U.S. government has collected an incredible amount of data. Whether for things like the Census, housing, agriculture, transportation, or health care, Federal agencies have accumulated data from around the country. In the past seven years, the White House has made efforts to leverage more technology at the Federal level.


Microsoft’s CEO Explains Why His Company Sued the U.S. Government

cio.com – Microsoft surprised the world last month when it filed a lawsuit against the U.S. Department of Justice, alleging that the frequent practice of attaching gag orders to search warrants for customer data violates the U.S. Constitution.
On Monday, CEO Satya Nadella told a group of tech luminaries why the company did so: Microsoft has a strong view on its privacy promises to users, and the company will fight to prevent government overreach that, in its view, compromises the principles of privacy.

Why Open and Frugal Should Be the Default for Government IT

techwire.net – With public-sector information-technology projects at any level of government, one does not have to look too far to find examples of waste and worse. In the wake of a series of failed projects, Hawaii is auditing its last four years of IT spending. On the local-government level, it would be hard to find a better example of what can go wrong than New York City’s CityTime payroll-system project, abandoned after its costs ballooned from $63 million to $700 million amid mismanagement and outright corruption.

 

Tech Companies are Unlikely to Oppose Government Demands on Data Access

firstpost.com – Can other technology companies defy the government the way Apple did when asked to help U.S. investigators crack the code of iPhone 5C? Unlikely. Especially in jurisdictions where the governments may not be so benign in pursuing hidden material in electronic devices or data centers. Not EMC Corporation, the world’s largest data storage multinational.


The Situation Report: FirstNet Worries and VA’s Leadership Rumors

Public Safety Communications

Remember that $7 billion Congress allocated to help establish a nationwide public safety broadband network, known as FirstNet? Well, when was the last time 50 states agreed on anything?

My Reston, Va., listening post has picked up increasing concerns about FirstNet’s decision to adopt a centralized, controlled network. As any informed observer can imagine, each state’s emergency communications capabilities are based on separate networks and use vastly different technologies—although many still rely on Land Mobile Radios and, according to insider reports, have no plans to completely ditch their own autonomous LMR networks. And that’s a problem for FirstNet.

FirstNet has taken the position that state autonomy in network design decisions and management will jeopardize FirstNet’s ability to provide a network that meets its coverage and service goals.

“The governance model chosen by FirstNet is a federalized, centrally planned and directed network, bolstered by federal procurement practices that limit states to a consultative role,” according to a congressional analysis intercepted by The Situation Report. “A risk in choosing this model is that states may consider the federal presence excessive and cease to cooperate with FirstNet, jeopardizing the purpose of the network.”

But May is the month of proposals for FirstNet. Potential bidders will first be sending FirstNet officials capability statements followed by full proposals to build the network.

However, my mobile listening post outside of 445 12th Street SW, Washington, D.C., has picked up strong signals that the FCC is working hard on opt-out plans.

“As I have suggested to the board a number of times with all due respect, FirstNet isn’t the foremost thing on the minds of many governors when dealing with fiscal crises and natural disasters and things like that,” said former Vermont governor James Douglas during the last FirstNet board meeting in March.

VA’s New CISO?
My listening post on Vermont Avenue in Washington, D.C., tuned into the rumor mill this week and discovered some interesting, yet unconfirmed, reports of a new deputy assistant secretary for cybersecurity at VA.

Talk at VA is that Dominic Cussatt, the deputy director for cybersecurity policy in Terry Halvorsen’s office at the Defense Department, is packing his bags to take the cybersecurity gig at VA. Cussatt would replace Ron Thompson, who’s been serving as interim CISO since Veterans Affairs CIO LaVerne Council ordered Brian Burns to “redirect his exclusive focus on VA’s role in the Interagency Program Office (IPO).”

Truth or Dare?

My remote sensing system on Capitol Hill has picked up chatter that some “analysts” on the Hill are daring lawmakers to do away with traditional polygraph examinations for security clearance holders in favor of the intelligence community’s new Continuous Evaluation program, which relies on constant monitoring of social media and other forms of big data analytics.

If Congress—which isn’t subject to a polygraph examination—does consider eliminating the polygraph, the result could be new business for the IT industry.

“What emerging technologies show the most promise in providing more objective measures to detect lying?” asked the congressional researcher. “Should additional resources be directed toward encouraging such technologies?”

That said, the result could also be a boom in business for foreign intelligence services.

Intercept a situation report? Email me at dverton@meritalk.com or DM via Twitter

The Great Cloud Debate: Public vs. Private – and the ViON Hybrid Model

Today, just 13 percent of Feds say they can deploy new systems as quickly as required. And it’s no secret agencies are being pushed to make the move to cloud – whether the driver is data center consolidation initiatives, flexible performance to meet constituents’ on-demand requests, a simple desire to increase overall IT efficiency, or the demands of the current regulatory environment.

So where do agencies start as they evaluate their cloud options?  Enter the Great Cloud Debate.

For most, choosing the right cloud is more than a black-and-white decision. In fact, despite cloud growth goals and Federal initiatives, cloud adoption currently accounts for only around 4 percent of Federal workloads. Concerns about contract lock-in, application readiness, data security, data custody, and legacy buying processes are commonly cited as barriers to cloud adoption.

Before making a cloud decision, agencies need to start with the basics – and understand why they want or need cloud. From accelerating the move to an as-a-Service (aaS) consumption model to maximizing budgets to speeding time-to-value, agencies need to define their critical mission success factors and frame their move to cloud in that context.

When it comes down to public vs. private cloud, direct comparison is nearly impossible. The choice boils down to which cloud is the right answer for the agency’s unique mission, challenges, and workload requirements.  Issues to consider include performance requirements, availability, security, data custody and control, access patterns, support requirements, available internal skill-sets, speed of deployment requirements, and scalability.

Private cloud is often the best cloud option for Federal agencies. Seventy-five percent of Feds want to migrate more services to cloud, but are concerned about retaining control over their data. Further, agencies estimate 32 percent of their data can’t be moved to cloud due to security or data sovereignty issues.

Private cloud offers security, accelerated performance, availability – and importantly, but often overlooked – can be a means to eliminate re-architecting applications to fit into a cloud business model.  Agencies also retain full control of their data. Public cloud promises accelerated deployment, maximum scalability, and simplified acquisition.

That said, why not have the best of both worlds?

Hybrid cloud models address agencies’ main concerns – data security, availability, and control – while still enabling high performance and flexibility. Placing data in a private, hosted environment and connecting just the compute side to a public cloud keeps data secure, putting agencies’ nerves at ease. The agency retains control of the data and never, for example, needs to pay a fee to remove it from the cloud environment.

ViON’s hybrid cloud model enables agencies to experience the benefits of public and private cloud, addressing key concerns of agencies as they approach the cloud procurement process – security, availability, scalability, and cost.

Build Your Cloud, Your Way.

We give agencies a system with a floor and ceiling customized by their needs and budget.

IT teams can select best-of-breed infrastructure elements, customized to deliver the availability and performance required – vs. “take it or leave it” public cloud configurations.

And, they can add capacity – scaling up or down as required to meet constituents’ on-demand needs, only pay for what they use, and retain full control over their data.  That’s the real silver lining.

This blog post was originally published here.

 

Learn More:
Business of Cloud eBook
ViON Agile Cloud Solution Portfolio

Look Who’s MeriTalking: Virginia CIO Nelson P. Moe

The Commonwealth of Virginia has made important investments in IT and cybersecurity in its fiscal 2017 budget. Those investments will support a doubling of the staff supporting Virginia’s chief information security officer, the building of a cybersecurity range, as well as the establishment of a cybersecurity fusion center and a cybersecurity services bureau.

“The beauty of Virginia is that it’s a consolidated state–all the agencies come through us,” said Virginia Chief Information Officer Nelson P. Moe. “So from a cyber, governance, projects, and procurement, we get a chance to have a common agenda focus.”

The Weekend Reader – April 29

Sens. Ron Johnson, Tom Carper Ask OMB’s Shaun Donovan for Federal Data Security Guidance Revision

executivegov.com: Under the appendix, Federal agencies are required to subject security controls for major applications and support systems to audits at least every three years. “While some documentation of security controls is essential, these three-year assessments are not cost-effective or consistent with best-practices or other Federal policies,” the lawmakers said. Carper and Johnson asked OMB to submit its response to the Senate committee within 30 days.

 

U.S. CIO Hints Federal Adoption of ‘Bimodal IT’ to Balance Old and New Tech

scmagazine.com: U.S. Chief Information Officer Tony Scott on Tuesday hinted his office may be working to help guide Federal agencies to adopt “bimodal IT” to balance modern IT with old but necessary systems.
Gartner defines the practice as “managing of two separate, coherent modes of IT delivery, one focused on stability and the other on agility.”

 

New Federal Cybersecurity Rules Moving Too Slow, Senators Say

ciodive.com: The senators say the lack of a new policy is preventing Federal agencies from moving to automated systems that can better protect Federal networks from cybersecurity threats. The existing Federal cybersecurity policy was created in 2000, and the threat landscape has evolved significantly since then.


IT Execs Join Federal Cybersecurity Panel

ecommercetimes.com: One of the most recent developments was the formation of a Federal Commission on Enhancing National Cybersecurity. Another was the formal introduction in Congress of the administration’s information technology investment plan, which is heavily tilted toward cybersecurity protection. The goal of the panel is to recommend actions that can be taken over the next decade to enhance cybersecurity awareness and protections throughout government and the private sector, according to a White House statement.

Facebook Transparency Report Shows Increase in Government Data Requests, Most With Gagging Orders

betanews.com: Facebook has published its latest Global Government Requests Report covering the second half of 2015. The transparency report reveals a 13 percent increase in the number of government requests for data, but it also shows that Facebook is still not able to be as transparent as it might want. For the first time, the social network is able to report the number of data requests that have a non-disclosure order attached to them.

 

The Situation Report: NIST Framework Mandatory? Open Source Rebellion at DHS?

The New De Facto Voluntary Standard

My Gaithersburg, Md., listening post has picked up strong signals that there’s a new mandatory cybersecurity standard in town. The Framework for Improving Critical Infrastructure Cybersecurity, developed in 2014 by the National Institute of Standards and Technology in close cooperation with industry, has always been a voluntary guide for organizations of all types and sizes to apply common best practices in risk management.

But a low-key change has taken place that sources say has shifted the NIST CSF from a purely voluntary practice to a mandatory standard for Federal agencies. For the first time, the government has linked the Federal Information Security Modernization Act metrics to the CSF. In fact, the fiscal 2016 FISMA metrics leverage the NIST CSF as a standard for managing and reducing risk, and are organized around the CSF’s five major functions of identify, protect, detect, respond, and recover.

“Since they tied the FISMA reporting metrics in 2016 to the Cybersecurity Framework, guess what? It’s now a de facto standard,” a source close to the development of the CSF since the beginning told The Situation Report. “You have to use it. It’s voluntary, but there’s no way of getting around it and still being compliant with FISMA.”
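For readers unfamiliar with the framework’s shape, the CSF’s top-level structure can be sketched as a simple mapping. This is an illustrative sketch only – the category names below are a partial sample drawn from CSF version 1.0, not the FISMA reporting metrics themselves:

```python
# Illustrative sketch: the five NIST CSF functions, each with a sample
# of its published category names (partial list from CSF v1.0).
CSF_FUNCTIONS = {
    "Identify": ["Asset Management", "Risk Assessment"],
    "Protect": ["Access Control", "Awareness and Training"],
    "Detect": ["Anomalies and Events", "Security Continuous Monitoring"],
    "Respond": ["Response Planning", "Communications"],
    "Recover": ["Recovery Planning", "Improvements"],
}

# A FISMA report organized around the framework rolls its metrics up
# under these five functions.
for function, categories in CSF_FUNCTIONS.items():
    print(f"{function}: {', '.join(categories)}")
```

The point of the structure is exactly what the source describes: once reporting metrics are grouped under these five functions, any agency answering the FISMA questionnaire is, in practice, using the framework.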

What’s In a Name?

Well, if you’re the focal point of the Federal government’s effort to get all of government and the private sector speaking the same language when it comes to cybersecurity then names matter a lot. Officials at NIST are seriously considering dropping “Critical Infrastructure” from the title of the Framework for Improving Critical Infrastructure Cybersecurity in an effort to boost adoption across a broader swath of industry. The framework is being credited with significantly helping to raise the bar in security across industries, but some in the private sector apparently have questioned whether a critical infrastructure guide applies to them.

The Open Source Battle at DHS

The Department of Homeland Security’s chief information officer Luke McCormack was put in a tough position recently when he had to publicly flip-flop on the department’s official position on the use of open source software.

McCormack was forced to post to GitHub a strong formal endorsement of a draft White House policy for publishing Federal source code in the open. “We believe moving towards Government-wide reuse of custom-developed code and releasing Federally-funded custom code as open source software has significant financial, technical, and cybersecurity benefits and will better enable DHS to meet our mission of securing the nation from the many threats we face,” McCormack wrote, reversing the concerns expressed a week earlier by members of his own team.

Those DHS IT officials had called out the misguided geeks at the White House noting that most security companies do not publish their source code because that would allow hackers to develop highly targeted attacks.

“Government-specific examples: citizenship anti-fraud rules that are coded into software, identification of special codes used to flag law enforcement actions, APT threat indicator scripts, Mafia having a copy of all FBI system code, terrorist with access to air traffic control software, etc. How will this be prevented?” a DHS IT official stated.

And what about protecting taxpayers’ interest in government-developed software? That’s right, some at DHS would like to know how the White House will prevent commercial entities from using taxpayer-funded software components in commercial systems that companies then sell back to the government.

McCormack may have caved to White House pressure, but The Situation Report has picked up on a rear guard action to stop the White House open source push–an effort that puts the nation’s security at risk through a deliberate decision to ignore the security issues surrounding software provenance and the threat of inheriting version-specific vulnerabilities in open source code.

“Given that national security systems are exempted from this policy, and virtually all DHS systems are deemed mission/business essential, any release of code is potentially exploitable,” DHS IT officials wrote. “To avoid having our in-house developed code becoming open source, we will have to either get the DHS CIO approval to the exceptions or declare our in-house developed systems to be National Security Systems and take them off the Sensitive But Unclassified (SBU) blue line and put them on classified networks, thus increasing our costs of operation and support.”

Digital Service Rebellion

My forward observers report signs of a massive rebellion against the U.S. Digital Service by the career Federal IT employees who are being blacklisted, maligned, and generally pushed aside for not being “from the Valley.” MeriTalk plans to bring you an exclusive look at this insurgency—penned by a current Federal IT insider who believes the Obama administration has gone too far in its attempt to import change from Silicon Valley. Stay tuned.

Why Are We Letting Our IT Infrastructure Fall to Pieces?

Why are we letting our IT infrastructure fall to pieces?

Former congressman and Secretary of Transportation Ray LaHood recently asked this question (without the insertion of “IT”) about the nation’s aging infrastructure.

Alan P. Balutis (Photo: MeriTalk)

LaHood’s article focused on an oft-discussed topic–the nation’s crumbling physical infrastructure. He notes that in the past 12 months, broken dams in South Carolina caused flooding and fatalities, a massive gas leak in Los Angeles sickened and displaced thousands of families, and residents of Flint, Mich., found unsafe lead levels in their drinking water. In Washington, D.C., the region’s Metrorail system might be facing line closures to make long-neglected safety repairs.

Our nation’s refusal to perform critical maintenance, to invest in our public infrastructure, and to take care of our roads, rails, bridges, and pipelines has been widely discussed and the costs well documented. As Rosabeth Moss Kanter, a Harvard business professor and the author of “Move,” a recent book on the subject, said in The New Yorker recently, “Infrastructure is such a dull word. But it’s really an issue that touches almost everything.”

Only recently, though, has a similar situation–the government’s reliance on outdated technology–surfaced as an issue. Federal Chief Information Officer Tony Scott has called it a “crisis” to rival the Y2K computer glitch.

Dave Powner, Director of IT Management Issues at the Government Accountability Office (GAO), noted that some agencies are running tens of millions of lines of code in legacy languages such as COBOL and assembly. Less frequently mentioned is the aging infrastructure itself–switches, routers, servers, desktops, mainframes, etc. Recent research suggests that a substantial portion of the government’s IT hardware has already reached its Last Day of Support (LDoS), meaning it no longer receives updates, security alerts, or patches. In the next two years, an even greater portion of that infrastructure will reach the same stage.

Along with increased security risks and vulnerability to cyberattacks, these outmoded systems can’t support growing demands for greater mobility, collaboration, data analytics, etc. Finally, they are also at higher risk of simply breaking down. Consider what a catastrophic blow that could be to the business of government–tax collection, benefit payments for veterans, monthly checks for Social Security recipients, air traffic control, and so on. Recent reports note that the Coast Guard is “overwhelmed” by the daunting task of updating its legacy IT infrastructure. That could be a matter of when, not if, GAO’s Powner says.

While we are at it, we should also recognize the need to modernize the processes by which government buys and operates its IT infrastructure, which is a major part of the reason why it’s been so hard to modernize. But more about that in another column.

Former White House chief of staff and now Chicago Mayor Rahm Emanuel a few years ago pronounced a rule that now bears his name: “Never let a serious crisis go to waste.”

To quote Kanter once again and note its applicability to the government’s aging IT infrastructure: “This is the heart of our problem: infrastructure policy has become a matter of lurching from crisis to crisis, solving problems after the fact rather than preventing them from happening.  We’ve turned into short-term-fix addicts.”

The president’s legislative proposal to establish an Information Technology Modernization Fund would support the transition to a more secure, efficient, and modern IT infrastructure. It deserves support from all of us.

 

Alan P. Balutis is Senior Director and Distinguished Fellow, U.S. Public Sector, at Cisco Systems.

The Weekend Reader – April 22

Deltek: 2017 Federal Budget Presents Opportunity for Cyber, R&D, Health IT Vendors

executivebiz.com: Deltek estimates contractor-addressable spending on the U.S. government’s mission-critical programs will increase by $18 billion to about $682 billion in fiscal year 2017 if Congress approves the White House’s latest budget request. The report forecasts continued growth in the Federal cybersecurity, big data analytics, health care information technology and infrastructure segments despite a projected small decline in overall contractor-addressable IT spending for FY 2017. “Government demand looks particularly strong for…areas that align with the Obama administration’s focus on modernization, health care and veterans services,” said Deniece Peterson, Deltek’s director of Federal industry analysis.

F.B.I. Director Suggests Bill for iPhone Hacking Was $1.3 Million

nytimes.com: The director of the FBI suggested Thursday that his agency paid at least $1.3 million to an undisclosed group to help hack into the encrypted iPhone used by an attacker in the mass shooting in San Bernardino, Calif.
At a technology conference in London, a moderator asked James B. Comey Jr., the FBI chief, how much bureau officials had to pay the undisclosed outside group to demonstrate how to bypass the phone’s encryption.

 

Apple’s Transparency Report Reveals Compliance with Government Data Requests

govtech.com: Apple says these requests typically seek information about a user’s iTunes or iCloud account, and each requires a search warrant. That information could then be used to help investigators prevent planned crimes from taking place or, after the fact, assemble a criminal case against someone. Privacy advocates are alarmed by the growing number of these personal-data requests.

 

Why Cyber Is a Boardroom Issue

bgov.com: Cybersecurity is no longer the exclusive domain of corporate IT shops. In some quarters today, cybersecurity is still viewed as “some IT thing.” But the companies that take this view do so at their own peril. Data breaches and denial-of-service attacks are risks facing every business with an Internet connection.

 

 

Pentagon CIO: U.K. a Model on Cloud Adoption

fcw.com: Microsoft announced last November that the company would begin offering cloud services from the United Kingdom, with the firm saying those services would extend to government organizations. Department of Defense CIO Terry Halvorsen has evangelized for the Pentagon to be more willing to allow cloud vendors to host sensitive DOD data. He would like about 50 DOD personnel to do a stint in the private sector in the coming year, and likewise bring about 50 IT hands from industry to the Pentagon.

 

The Situation Report: VA’s Never-Ending IT Shuffle, and a Bad Start for InfoSec Week

VA CISO Watch

The Situation Report has learned that Department of Veterans Affairs CIO LaVerne Council has ordered VA CISO Brian Burns to “redirect his exclusive focus on VA’s role in the Interagency Program Office (IPO).”

“To meet our goal, we must have a dedicated, focused leader for interoperability,” Council wrote Wednesday in an email to staff obtained by The Situation Report. The agency certified interoperability with the Defense Department on April 8 in accordance with the requirements spelled out in the 2014 National Defense Authorization Act. “Brian’s prior work in the IPO combined with his extensive experience in clinical and health technology reaffirm that he can provide that focus and help guide our efforts beyond the certification, beyond VistA 4, and provide a framework for Veterans today and in the future.”

Council has also tapped Ron Thompson, the former executive director of IT infrastructure and operations for the Department of Health and Human Services who late last year became Council’s Principal Deputy Assistant Secretary, to serve as interim VA CISO.

“To ensure continuity in our information security program, Ron will serve as the interim Chief Information Security Officer (CISO), giving us the opportunity to renew our search for a permanent, long-term CISO,” Council wrote. “The tenet of fully resourcing our cybersecurity efforts must be consistent–our Office of Information Security must have a singularly focused leader.”

Off to a Bad Start

VA kicked off its 2016 Information Security and Privacy Awareness Week (ISPAW) Speaker Series on Monday, but a stellar event it was not. Multiple human sources debriefed the Situation Report on the event, which took place via online chat and telephone dial-in. The most glaring problem with what seems like an important initiative for an agency that has been constantly dogged by security lapses was the absence of LaVerne Council. Although scheduled to provide the keynote, Council canceled her appearance at the last minute for unknown reasons. Tina Burnette, executive director of the Field Security Service, filled in for Council.

The theme for the week, according to Burnette, is enterprise cyber strategy.

The Situation Report analyzed multiple reports from the call and discovered that only about 100 VA employees joined the session. Only four VA employees were brave enough to ask questions, even though much of the agency’s information security leadership was available to answer them. One question, however, was particularly instructive: “Where does the process of information security start?” a VA employee asked.

A speaker identified as Randy Ledsome (unconfirmed), VA’s director of Field Security Service, tried to answer the question, but somebody had put their call on hold and the hold music temporarily interrupted the call. Once that was cleared up, Jackson made an attempt at an answer. “I think this gentleman had a very complex question,” Jackson said. “It starts with having a program. One of the things we’ve done for the [Information Security Officers] we’ve put together what we call the ISO Reference Guide, and one of the things we laid out in there was a problematic—a programmatic—approach to dealing with our programs.”

The question-and-answer portion of the call went on for another 30 minutes, ending with a long, awkward interruption by a Spanish speaker who did not have his phone on mute.

Look Who’s MeriTalking: USDA’s Flip Anderson

Flip Anderson is the Executive Director of FITARA Operations at the U.S. Department of Agriculture.

MeriTalk caught up with Anderson at this year’s FITARA Forum and Data Center Exchange Brainstorm in Washington, D.C., on March 30, for this special video edition of Look Who’s MeriTalking.

Watch Now

The Weekend Reader – April 15

Federal Data Should Be Open To Public, Lawmakers Say

lawyerherald.com: Data collected by the Federal government should be open and accessible to the public by default, according to a group of lawmakers. Speaking at a discussion in Washington on Thursday, Sen. Brian Schatz, D-Hawaii, said agencies today release data as virtual documents, leaving those seeking the information to do their own digging. Agencies covered by the bill would include the Education Department.

 

Microsoft Sues the U.S. Federal Government Over the Right to Reveal Data Requests

windowscentral.com: Microsoft wants to reveal more information on the data requests it gets from the U.S. Federal government. The company filed a lawsuit claiming the government has violated the First and Fourth Amendments by ordering Microsoft to keep thousands of data requests to the company secret. Notably and even surprisingly, 1,752 of these secrecy orders, or 68 percent of the total, contained no fixed end date at all.

 

 

‘Cloud-First’ To Close 5,000 Federal Data Centers By 2019

informationweek.com: In 2010, the Obama administration’s first Federal CIO, Vivek Kundra, mandated that Federal agencies adopt a “cloud-first” strategy instead of building more data centers. Since then, 3,125 Federal agency data centers have been closed, out of the 10,584 that existed when Kundra made the announcement.


Burr-Feinstein Encryption Bill is Officially Here in All Its Scary Glory

techcrunch.com: Sens. Richard Burr and Dianne Feinstein released the official version of their anti-encryption bill after a draft appeared online last week. The bill, titled the Compliance with Court Orders Act of 2016, would require tech firms to decrypt customers’ data at a court’s request. The Burr-Feinstein proposal has already faced heavy criticism from the tech and legislative communities and is not expected to get anywhere in the Senate. President Obama has also indicated that he will not support the bill.

 

Tony Scott: White House Proposes Federal IT Modernization Fund Bill

executivegov.com: The White House has proposed a bill that would create a $3.1 billion revolving fund to help Federal agencies update their legacy information technology systems and bolster the government’s cybersecurity posture. Federal CIO Tony Scott added that the bill would also establish an independent board of experts to help identify agency IT systems that face the highest risk for potential cyberattacks as well as strategies to facilitate adoption of common platforms and cybersecurity best practices across the government.

 

Editorial: VA’s Scheduling System Betrayal

Four months after a waiting list scandal forced Secretary of Veterans Affairs Eric Shinseki to resign, I asked the new VA Secretary, Robert McDonald, whether he thought the VA had enough money in the budget to procure a new commercial scheduling system to ensure that veterans could get the care they needed, when they needed it.

“We tried to put the money we needed in the act that was recently passed. I can’t predict the future,” McDonald said in answer to my question during a press briefing at VA’s headquarters in Washington, D.C. “But I think we’ve done a good job of that. As we work through this scheduling system, we’re going to be very eager to find an off-the-shelf product that is proven effective. The off-the-shelf product will become very important as we move forward. I thought it was a brilliant piece of work by [Deputy Secretary Sloan Gibson] and the team to come forward and say, ‘we are going to take an off-the-shelf product.’ ”

Eighteen months and $11.8 million later, VA’s resolve is wavering. VA chief information officer LaVerne Council and Veterans Health Administration Under Secretary for Health, David Shulkin, told Congress Thursday that they are now unsure how or if they will proceed with the $624 million Medical Appointment Scheduling System (MASS) contract awarded last year.

“We want to be certain that continuous modernization of a 40-year-old electronic medical record is an appropriate decision,” Shulkin said.

Instead of moving out aggressively on a commercial system with a proven track record, Shulkin and Council did the unthinkable: They decided to develop their own in-house upgrade to the scheduling module of VA’s main electronic health system, known as VistA. That’s right—even after a major scandal involving deliberate manipulation of the scheduling system that led to the deaths of veterans, VA thought it was appropriate to tackle the development themselves.

VA is testing an intermediate VistA Scheduling Enhancement tool at two medical facilities. If the employees like it, VA will continue with the deployment.

As a veteran, I find such inept decision-making infuriating. But what makes the situation worse is the fact that these decisions are being made because of money—the money that McDonald told me in 2014 was already in the budget for a new commercial system.

“The entire VSE rollout will cost taxpayers $6.4 million. If we roll out MASS, which is an absolute option for us, the pilot alone will be $152 million,” Shulkin said, explaining to lawmakers how VA was struggling to balance the needs of veterans with the agency’s duty to be good stewards of taxpayer dollars. “It will take us 10 months to roll it out in three sites, and that’s if VA stays on schedule with its pilots,” he said. “We have not ruled out MASS. I want to be absolutely clear about that.”

Shulkin and Council may not have ruled out a commercial replacement system through MASS, but that doesn’t explain what he said next about VA’s plans for the VistA Evolution pilot testing that is underway.

“I have the user evaluations, which are tremendous,” Shulkin said. “It’s planned to roll out to 11 VA [facilities] by the end of this month or the next six weeks, and then a national rollout.”

To his credit, Shulkin got one thing right when he acknowledged “we still have an access crisis” at the VA. Yes, something has to be done to improve the situation in the near term, and VSE is an important part of that. But to consider not moving forward with MASS—a comprehensive commercial health management system awarded under a competitive bidding process—is tantamount to reneging on the promise made to veterans that the scheduling and access problems will be fixed.

McDonald brought Shulkin and Council (who worked for McDonald at Johnson & Johnson) to the VA to inject new thinking into the bureaucracy. But they have failed. The VA is a bureaucracy with a culture beyond repair. The wounds of the scheduling scandal are not even close to being healed and VA’s leaders have handed the task of modernizing that system right back to the people who created it.

“This seems like déjà vu all over again to me,” said Rep. Ann McLane Kuster, D-N.H. “VA has already wasted nine years, $127 million without an update to its scheduling system, after finding a commercial product and abandoning that for an in-house solution that could not deliver an adequate update. We cannot and will not let this happen again.”

Senile Systems?

What was the first computer? George Stibitz’s Model K, Hewlett and Packard’s Model 200A, Tommy Flowers’ Colossus? None of the above. Believe it or not, the ancient Greeks beat the Geeks to the punch by more than 2,000 years. Dating back to 205 B.C., the Antikythera Mechanism is an ancient analog computer that calculated the orbits of the planets and the timing of the Olympic Games.

Hieroglyphics

So, as the House Oversight and Government Reform Committee conducts its archaeological spadework to unearth ancient Federal IT systems – consider, things could be worse.  OGR asked the 24 Cabinet-level agencies for an audit of their legacy systems and migration plans by January 29.  Capitol Hill tells us many agencies missed the deadline – but most of the 24 have now submitted their reports.  Dave Powner, IT lead at GAO, offered some interesting insight at the MeriTalk FITARA Forum and Data Center Brainstorm.  Based on agency reports, agencies are running more than 10 million lines of COBOL and Assembler code – and these hieroglyphics power many mission-critical functions.  Powner pointed to the two-year average Federal CIO tenure as the enemy of real change.  IT execs look for quick wins – which means they avoid entering the mummy’s tomb.  Powner mused that these ancient systems are either a huge cyber liability or safe as houses – after all, who’s writing viruses for this stuff?

 

Backing away from the details, this is clearly a call for governmentwide leadership.  Keep an eye out for the May/June OGR hearing on Uncle Sam’s geriatric IT.  We’ll doubtless see some shocking examples – but to be clear, this is not about beating up on agencies, it’s about incentives and changing the failing status quo.  That said, everybody’s curious to see the details in the agency reports.  Here’s hoping OGR makes the data public.


The Situation Report: Removing the Intelligence Community CIO’s Extra Hat

Photo: Director of National Intelligence James Clapper. (Credit: INSA)

Two Hats Are Not Always Better Than One

MeriTalk recently broke the news that the Office of the Director of National Intelligence is planning to hire its first chief information officer. Sounds pretty straightforward, but my Langley, Va., listening post has picked up strong signals that there is much more to the story and the timing of this new job search at ODNI.

Keen intelligence community observers will know that the ODNI traces its roots to the Intelligence Reform and Terrorism Prevention Act of 2004. But only the most sophisticated observers will recall the amendment that the first intelligence community CIO, Dale Meyerrose, succeeded in getting into the 2005 Intelligence Authorization Act. That bill, which became law on Dec. 23, 2004, included the following section:

‘(d) PROHIBITION ON SIMULTANEOUS SERVICE AS OTHER CHIEF INFORMATION OFFICER- An individual serving in the position of Chief Information Officer may not, while so serving, serve as the chief information officer of any other department or agency, or component thereof, of the United States Government.’

What does that mean? Well, we asked a few of our data scientists to run this through our decryption tools and it sounds like the ODNI—the center of gravity for intelligence policy in the post-9/11 era—has been dual-hatting the intelligence community CIO for the last decade.

“It may have been a benign oversight. Somebody in the general counsel’s office finally noticed,” said one Langley insider. It also means that it’s probably time to update Intelligence Community Directive 500, which sets forth the authorities for the IC CIO. Former DNI Mike McConnell signed Directive 500 in 2008. The job posting for the new ODNI CIO position, however, remains extremely vague in terms of how the two CIOs are to coordinate their responsibilities—a critical step to ensuring the survival of the Intelligence Community Information Technology Enterprise initiative, known as ICITE.

Protecting ICITE

There’s no question that the ODNI has gone through some growing pains, especially since it has faced an uphill battle against an army of doubters who were often very vocal in their opposition to the need for such an office. But Director of National Intelligence James Clapper has done an amazing job of elevating one of the most pressing issues facing the IC—establishing a common computing and information sharing environment that will enable rapid collaboration and decision-making.

“There is a desperate effort underway to ensure that ICITE survives the election and presidential transition,” our Langley source acknowledged. “ICITE has become the singular technological artifact of the DNI, yet there is no enterprise architecture, there is only ICITE,” our source said. “They’re hoping that by January 2017 ICITE and enterprise architecture are one and the same thing. So anything that works toward that goal is critical. Making sure the ODNI is a going concern as a hub entity with an empowered CIO is part of that. They’re going to try to fix that quickly.”

Quickly indeed. The ODNI CIO job announcement closes April 15 and plans call for a new CIO to be on board within 30 days.

Send your Situation Reports in confidence to dverton@meritalk.com

Look Who’s MeriTalking: Dell Federal’s Jeff Hogarth

Jeff Hogarth is the senior sales director for Dell’s Federal Civilian Business Unit.

MeriTalk caught up with Hogarth at this year’s FITARA Forum and Data Center Exchange Brainstorm in Washington, D.C., on March 30, for this special video edition of Look Who’s MeriTalking.

