
Data Loss Prevention: Protecting Data Wherever It Resides

Federal agencies have a data problem. Data that was traditionally inside four walls is now everywhere. Employees and vendors access it from all kinds of devices, located in all kinds of places, making it increasingly challenging for security teams to see what those users are doing with that data.

In a nutshell, agencies have lost control of their data, devices, and users. We have seen the repercussions in the headlines: contractors leaking classified reports, employees infected with malware, the Harold Martin III case. As we head into 2018, agencies must shift their approach. The traditional strategy of fortifying the perimeter is no longer effective. The strategy of the present and future must focus on protecting data wherever it resides.

If you’re a security practitioner at a Federal agency, you are most likely all too familiar with Data Loss Prevention (DLP) technology. It has long been known as the security solution for blocking the transmission of data outside the organization. Given today’s data problem, DLP is going through a rebirth. Its capabilities are expanding so it can protect sensitive data both inside and outside the organization, without overburdening limited analyst resources.

The expansion entails integrating DLP with newer technologies such as Cloud Access Security Broker (CASB) tools to protect data not just on premises, but also in the cloud. Encryption is being added to the mix to protect data in transit. Tagging is another important technology that extends DLP. It enables agencies to label documents (e.g., classified, unclassified) to give their DLP technology hints as to what’s important and what’s not. For example, if a document is tagged “classified,” DLP knows to block or encrypt it. Multi-factor authentication is also important because it requires the user receiving the data to properly identify herself before the data can be opened.
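The tag-to-action logic described above can be sketched in a few lines. This is a minimal illustration, not any vendor’s actual API; the tag names, actions, and policy table are all assumptions chosen for the example.

```python
# Hypothetical sketch of tag-driven DLP policy logic. Tag names, actions,
# and the policy table are illustrative assumptions.

POLICY = {
    "classified": "block",    # never leaves the agency
    "sensitive": "encrypt",   # may leave, but only encrypted in transit
    "public": "allow",        # no restriction
}

def dlp_action(document_tags):
    """Return the most restrictive action implied by a document's tags."""
    severity = {"block": 2, "encrypt": 1, "allow": 0}
    actions = [POLICY.get(tag, "allow") for tag in document_tags]
    return max(actions, key=lambda a: severity[a], default="allow")
```

A document carrying both “classified” and “public” tags would resolve to the most restrictive action, “block” — the conservative choice when tags conflict.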

One more technology that also integrates with DLP, and serves as the glue for tying the other tools together, is User and Entity Behavior Analytics (UEBA). UEBA technologies collect the telemetry data created by the tools mentioned above, identify potential malicious and non-malicious insider and outsider activities, and deliver a prioritized list of the most critical incidents that DLP analysts must investigate immediately. The integration significantly reduces false positives because, whereas DLP focuses strictly on the data, UEBA determines whether the user elevating the risk of the data being compromised is indeed a threat or is business-justified.

For example, let’s say “Joe” from accounting was working on a lengthy project that required him to send a series of classified documents over an extended period of time to a third-party contractor outside the agency. Every time Joe sent the data, DLP would flag it and alert analysts, who would then waste their time investigating each alert and questioning Joe (interrupting his work and potentially lowering morale). And it would all be for nothing; the action was business-justified.

UEBA would prevent this situation from happening. The technology would learn, the first time, that Joe’s actions were business-justified, and whitelist the event as business as usual so that analysts would never again receive an alert about this behavior.

On the flip side, if Joe was not given permission to send the information, UEBA would prioritize the alert based on the fact that the information was classified and that Joe’s behavior was unusual compared with his own baseline, his peers, and the overall team.
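The triage behavior described in the Joe scenario — suppress whitelisted, business-justified events; escalate classified transfers that are anomalous against the user’s own history and peers — can be sketched as follows. All names, scores, and thresholds are assumptions for illustration, not any UEBA product’s actual logic.

```python
# Illustrative sketch of UEBA-style alert triage. The whitelist model,
# scoring weights, and threshold are assumptions, not a vendor's algorithm.

whitelisted = set()  # (user, recipient, tag) triples marked business-justified

def mark_justified(user, recipient, tag):
    """Learn that this user/recipient/tag combination is business as usual."""
    whitelisted.add((user, recipient, tag))

def triage(user, recipient, tag, user_prior_sends, peer_avg_sends):
    """Return an alert priority, or None if the event is whitelisted."""
    if (user, recipient, tag) in whitelisted:
        return None  # learned as business as usual; no analyst alert
    score = 1.0 if tag == "classified" else 0.3
    if user_prior_sends == 0:
        score += 0.5  # unusual relative to the user's own baseline
    if peer_avg_sends == 0:
        score += 0.5  # unusual relative to peers and the overall team
    return "critical" if score >= 1.5 else "low"
```

In this sketch, once Joe’s first transfer is marked justified, identical future transfers generate no alert, while the same transfer from a user with no such history and no peer precedent scores as critical.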

Under either circumstance, DLP, integrated with UEBA and the other tools mentioned above, protects data well beyond the four walls of the agency. The technology has evolved to help analysts understand what’s truly sensitive, which data does and does not need protecting, how data should be tagged, how it should be protected, whether it needs to be encrypted, who is handling it, and what’s important to investigate. The goal is to protect agencies’ most sensitive data, enable collaboration, discover malicious behavior, and prioritize investigation. That’s the DLP of today, tomorrow, and beyond.

Automated Authorization: A New “Streaming” Service for Federal IT

Before you say “no,” Federal cybersecurity professionals, hear us out. To borrow a line from Riggs in Lethal Weapon 2, “C’mon, say yes! Be original, everyone else says ‘no.’”

Even with all the discussion around efficiency and modernization, the typical Certification and Accreditation (C&A) process takes six months and costs more than $100,000 to complete. For larger IT systems, doubling those totals isn’t out of the norm. In many organizations, the documentation is partially updated every six months and the full process is redone every three years for each system. So, over a five-year period, certifying and accrediting a system can cost more than $500,000. Wow, that’s a lot of money!
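The five-year figure is easy to sanity-check. The sketch below uses the article’s $100K full-accreditation cost and three-year redo cycle; the $50K per semiannual partial update is our own assumption, since the article doesn’t price the updates.

```python
# Back-of-the-envelope check of the five-year C&A cost claim.
# FULL_COST and the 3-year redo cycle come from the article; the
# per-update cost is an assumption for illustration.

FULL_COST = 100_000    # initial certification & accreditation
REDO_COST = 100_000    # full re-accreditation every 3 years
UPDATE_COST = 50_000   # assumed cost of each semiannual partial update

def five_year_ca_cost():
    years = 5
    redos = (years - 1) // 3            # one full redo, at year 3
    # Ten semiannual periods, minus the one covered by the initial
    # accreditation and the one covered by the redo.
    updates = years * 2 - 1 - redos
    return FULL_COST + redos * REDO_COST + updates * UPDATE_COST
```

Under these assumptions the total comes to $600,000 — comfortably past the “more than $500,000” cited above.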

With an increasing attack surface resulting in millions of new threats every year, partially updating C&A documents every six months, remediating a few Plan of Action and Milestones items, and updating all docs every three years won’t, and doesn’t, keep the bad guys out of Federal networks.

This compliance process, designed and refined over the past 15 years, was sorely needed when conceived, and it still remains the primary means of governing Federal IT systems. Mainframes, client-server, and early three-tier architectures ruled the day, with an eventual light sprinkle of this new tech called virtualization. Cybersecurity was an afterthought: build everything, then bolt on cyber at the end of the process. Moving typical systems from procurement to implementation took years. Now a server, and even an application, can be provisioned in minutes, and the first release can happen in one month. The times and technology have changed, yet updates and adoption have lagged significantly.

Depressed? Don’t be; help is on the way. The latest Risk Management Framework was released for comment, and some of the NIST 800-series publications are also being revised. The Department of Homeland Security-led Continuous Diagnostics and Mitigation program is rolling out across Federal agencies, providing an opportunity to increase visibility and analytics capabilities for applications and systems on their networks. And since we started writing this article, 20 new cyber companies have entered the U.S. market that will help better identify cyber threats and quantify and qualify risks based on threats, vulnerabilities, and cost to mitigate. The greatest minds are collectively updating the guidance and the conversations grow in number…so why are we still not broadly considering the idea of automating security controls and authorizations?

From a technology perspective, the Lego pieces are in the box to get started building the Millennium Falcon. In terms of FISMA compliance, today’s Federal CIOs, CISOs, and Program Managers have access to more than 70 FedRAMP-certified cloud service providers, including a few authorized at the high baseline. Early adopters of DevSecOps have worked alongside assurance professionals, shifted cybersecurity left in the process, and embedded cyber into the DNA of secure automation workloads, from development to production. Automation at every level is possible and can be used to achieve assurance and reliability previously unavailable with human-implemented processes.

Is it perfect? Nope. Nothing is. This is not about perfection, rather risk management and responsible evolution. The tools are in the toolbox.

So, how do agencies get started? How about a mission-critical mainframe? Ah, no. Well then, how about a FIPS-moderate back-office system on a few virtual servers? Close, but not quite. Let’s instead start with a “net new,” moderate-level data system. Better still, take advantage of the incoming Modernizing Government Technology (MGT) Act to actually rethink a business process/application, rather than carrying the same less-than-optimized processes to a new environment and calling it modernization.

Some of the criteria to qualify:

  • This is a new project, not a bolted-on enhancement to an old system.
  • It must be hosted with a FedRAMP cloud provider.
  • Automate as much as possible, including your security controls, in partnership with your security operations and policy professionals.
  • Provision the development environment using DevSecOps best practices, embedding cyber hygiene and analytics at the lowest level of code possible.
  • Lather, rinse, repeat for testing. Then, once satisfied that the app and cyber controls are implemented correctly, light up production.

To repeat: the key to success is a collaborative and transparent partnership among all stakeholders, including operational and policy professionals…stakeholder engagement.

Beta was overtaken by VHS. VHS got smoked by Blockbuster. Blockbuster got rolled by Redbox. Netflix, Amazon, and Hulu took out Redbox using mobile phones and broadband. Traditional Federal C&A process, meet Automated Authorization. So, to the traditional certification and accreditation process: “we are getting too old for this.” We couldn’t agree more, Sgt. Murtaugh.

Rob Palmer is the executive vice president and CTO for ShorePoint, a privately held cybersecurity services firm serving both private and public-sector customers. Palmer is a former senior executive with the Department of Homeland Security (DHS) where he most recently held the position of deputy CTO and executive director for strategic technology management.
Keith Trippie is a retired DHS IT executive and entrepreneur. He is the founder of Shop4Clouds, a digital marketing platform, and urMuv, a neighborhood discovery app. He has also launched GotUrSix TV, a digital media platform to share the personal stories of active duty service members, veterans, and military spouses.

Moving Beyond Cloud Security Fears

The federal government has started to embrace the positive impact of cloud on cybersecurity efforts. We first saw this in the May Cybersecurity Executive Order, which outlined a shift to cloud as a key part of cybersecurity strategy. During a briefing, Tom Bossert, Homeland Security Advisor, said, “We’ve got to move to the cloud and try to protect ourselves instead of fracturing our security posture.” And the new Report to the President on Federal IT Modernization, released by the White House Office of American Innovation, likewise underlines the importance of cloud and a shift to shared services.

Agencies are gaining cloud deployment momentum now because they are matching the cloud to the mission – in many cases implementing highly secure on-premises or hybrid cloud solutions.

At a recent cloud event, John Hale, Chief of Application Services, Defense Information Systems Agency (DISA), said, “the direction that we’re getting from the senior members of the department is ‘move everything to the cloud now.’” He is working to create Cloud Access Points (CAPs) to protect Department of Defense (DoD) networks from the rest of the public cloud and enable compliance with DoD security requirements. The recently awarded MilCloud 2.0 will be a hyperconverged on-premises cloud system that is expected to enable approximately 70 percent cost savings for DoD.

And, on September 20, the U.S. Air Force awarded Dell EMC, General Dynamics, and Microsoft a $1 billion, five-year contract to implement a Cloud Hosted Enterprise Services (CHES) program. This is the largest-ever cloud-based unified communications and collaboration contract in the federal space.

We have many examples of how federal agencies are finally moving past the “devil you know is better than the devil you don’t” mentality. But how does cloud, specifically hybrid and secure on-premises cloud, improve security? Agencies can:

  • Maintain control and compliance with security best practices
  • Align data protection services with application demands
  • Access IT services in the event of a disaster with active provisioning
  • Integrate existing security tools and services

FDR famously told us that the only thing to fear is fear itself. Digital transformation, powered by secure hybrid and on-premises cloud environments, will modernize government services, improve governance and transparency, and keep federal data and systems more secure.

Learn more:  https://www.dellemc.com/en-us/cloud/hybrid-cloud-computing/index.htm.

By: Steve Harris, Senior Vice President and General Manager, Dell EMC Federal

Federal IoT: Where Innovation and Cyber Risks Collide

Gartner forecasts that by 2020, 20.4 billion devices will be connected across the Internet of Things (IoT). The IoT brings the promise of new possibilities, but to unlock them, agencies must change how they think about data and how to keep it secure.

There are four primary ways IoT can provide value to agencies and support innovation:

  • Driving operational efficiency – improve effectiveness while simultaneously reducing cost
  • Improving constituent experience – discover new ways to engage users
  • Mitigating risk – improve security by detecting failures before they happen
  • Driving mission success – discover new paths to mission success through data insights

Both Department of Defense (DoD) and civilian agencies are currently using connected technologies – including drones, cameras, sensors, and satellites – to support their missions. For example, the Air Force is combining surveillance and flight sensor data to provide detailed threat information in real time; the Navy is using a network of connected buoys with sonar capability to detect submarines more quickly and efficiently. On the civilian side, the General Services Administration (GSA) is using a network of low-cost motion sensors to turn off the lights when employees are not at their desks – reducing environmental inefficiencies and overhead costs.

While the potential for innovation is great, federal IT teams face a number of challenges when implementing and using the IoT. For starters, up to 80% of IoT data will be unstructured, and these data points have to be stored, managed, and analyzed in a methodical way. For agencies, this means preparing their aging infrastructure for the influx of data from IoT devices on the edge.

The biggest challenge, however, is security. As agencies implement new layers of architecture and processors to harness the IoT, they must address cybersecurity concerns for both operational technology (OT) devices and traditional IT devices – not straightforward, as IT and OT have very different goals and constraints. And there is an enormous variety of IoT devices that will come into play, each introducing a different level of risk.

It is important for federal agencies to be “paranoid, but not paralyzed” when it comes to IoT security. If approached in the right way – by having heavily encrypted storage environments and a cyber plan that provides for the protection of all endpoints/networks – agencies can effectively manage the security risks and take advantage of the significant opportunity ahead.

Learn more:  https://soundcloud.com/fedscoop-519204393/iot-in-government-sponsored-by-dell-emc#t=0:00.

By: Ro Dhanda, Director, Federal Sales, Dell EMC

 

Rally the Troops: Building a Cyber Workforce for the Future

Federal agencies face a continual struggle to attract top talent in the cyber workforce. Why? Because it is difficult for agencies to find qualified personnel, hard to retain security workers, and there is often an insufficient understanding of job requirements. This impacts us all – as it makes it more difficult for agencies to make good, risk-based decisions as they modernize federal IT and work to meet mission objectives.

The Presidential Executive Order on Strengthening the Cyber Security of Federal Networks and Critical Infrastructure required a group of agency leaders to jointly assess the federal government’s efforts to train the cybersecurity workforce, and develop a report to the President with their findings and recommendations.

These recommendations are not yet available, but federal agencies must find a way to give potential employees reasons to choose federal service over private sector perks. Representative Will Hurd (R-Texas) is trying to do just that with his proposed cybersecurity workforce program, the Cyber National Guard. The program would provide students a free cybersecurity education, and in return, students would work in federal service for an equal amount of time.

On C-SPAN, Rep. Hurd explained there is still a significant shortage of computing talent across the board – in Texas alone, 42,000 computing jobs went unfilled in 2015, with an average salary of $89,000. In that same year, Texas produced just 2,100 computer scientists. We need new approaches.

Rep. Hurd’s idea for a Cyber National Guard is that if you are going to college and will study something related to cybersecurity, the federal government will find you a scholarship. In return, when you graduate, you will serve in government for the same amount of time you were in school. Rep. Hurd also explained that once you are finished with government service, the private sector will loan you back for a period of time – one weekend a month, ten days a quarter, or similar. “This will improve the cross-pollination of ideas between the public and private sectors,” Hurd said.

Once you have the troops in place, you have to give them the tools to be successful. President Trump put this component into motion when he elevated the U.S. Cyber Command to a full combatant command in August. Trump said in a statement that this will “streamline command and control of time-sensitive cyberspace operations by consolidating them under a single commander with authorities commensurate with the importance of such operations…and ensure that critical cyberspace operations are adequately funded.” As is to be expected, there are hurdles to moving these initiatives forward, including:

  • Funding for cybersecurity education.
  • Showing a clearly defined career path for cybersecurity growth within the federal government.
  • Having a standardized listing of cyber jobs within government to ensure students receive the proper credentials.

Fortunately, most agree across the aisle that improving the federal cybersecurity workforce is critical to our future. When asked about the path forward, Representative Hurd is optimistic, sharing, “There really is a group of us that are committed to a bipartisan solution to this problem. So, we’re going to have a posse.”

That’s good, because we know we are stronger together, and we will need continued collaboration and open dialog with federal agencies, our elected leaders, and between the public and private sectors to keep our information and systems secure.

By: Steve Harris, Senior Vice President and General Manager, Dell EMC Federal

Improved NIST Framework Supports Agency FITARA Goals

With the release of the fourth FITARA scorecard, we saw agencies stall on progress – more agency grades declined than improved, and 15 agencies’ grades remained neutral.

One shining star was the United States Agency for International Development (USAID) – the first agency ever to receive an overall A. How did they do it? According to a USAID official, they focused hard on transparency and risk management, where they received an “A.” This was not the case for most of their counterparts, however, as 14 agencies received a “C” or lower in that category.

Risk management is one of the more difficult areas for agencies to see success, but every CIO should be using the National Institute of Standards and Technology (NIST) Framework in that area. NIST recently released an updated version of the Framework for public comment, in the hopes that it would be easier to utilize and implement.

These were the most notable changes to the updated version:

  • Refined managing cyber supply chain risks – framework now has a common vocabulary so agencies working on cyber supply chain projects can clearly understand cybersecurity needs.
  • Revised “Identity Management and Access Control” category – framework now has clarified and expanded definitions of the terms authentication and authorization; added and defined the concept of “identity proofing”.
  • Introduced measurement methods for cybersecurity – framework now gives guidance on how to measure how well an agency is reducing risk and identifies overall agency benefits.

NIST has been gathering feedback on the Framework changes, and is expected to release the final version this fall. Hopefully, federal CIOs can use the updated Framework to effect positive change on their cybersecurity and risk management projects – and in turn, see an upward tick in grades when the next scorecard is released in December.

Learn more about Dell EMC’s portfolio of cybersecurity capabilities for government:  https://www.rsa.com/en-us/research-and-thought-leadership/security-perspectives/government-solutions.

By: Cameron Chehreh, Chief Operating Officer, Chief Technology Officer & VP, Dell EMC Federal

The CDM Marathon: How Feds are Keeping Pace

While the Cybersecurity Sprint focused attention on how to generate improvements quickly, one of our most important cyber efforts – the Department of Homeland Security (DHS) Continuous Diagnostics and Mitigation (CDM) program – is unquestionably a marathon. Now in its fourth year, the program is maturing agencies’ abilities to identify cyber risks and adopt a risk-based approach to mitigation.

The program is entering Phase 3, but agency progress has been staggered. Every agency started from a different point of cybersecurity maturity, so this is not surprising.

Phase 1 involved mapping networks to determine what agencies would need to improve threat protection; Phase 2 focused on identifying who has access to the network and how that access is managed. Up next, Phase 3 focuses on boundary protection and incident response.

What was initially surprising was the degree to which agencies discovered during Phase 1 that they were underreporting device numbers. James Quinn, lead systems engineer on the CDM Program at DHS, said that DHS estimated federal agencies would map approximately two million assets, but agencies ended up finding approximately four million.

We have to anticipate this challenge will continue to grow with the Internet of Things (IoT). Every internet-connected device is a potential vulnerability, so improving asset management and establishing a secure supply chain are critical to securing federal systems and information.

When we think about supply chain risk management, we think about our devices and the systems and software we use to protect them. The CDM Project Management Office (PMO) requires vendors submitting products for the CDM Approved Product List (APL) to provide details on their supply chain risk management policies. See more: CDM Supply Chain Risk Management plan.

RSA Archer, a Dell Technologies company, serves as the platform for the agency and federal dashboards. At the agency level, the dashboard captures data locally from network sensors, scores the data, and shows “worst problems first” for operators – i.e. enables a risk-based approach. Agencies are in the process of deploying their dashboards, and the federal dashboard is scheduled to deploy this year.

As agencies and the CDM Project Management Office move forward, they will be tackling ongoing challenges including the need for acquisition flexibility, how to speed the acquisition process, how to integrate FedRAMP, and what’s next for Trusted Internet Connections (TIC).

That’s an already tall order that will continue to grow – and we do win or lose together in this race.

By: Cameron Chehreh, Chief Operating Officer, Chief Technology Officer & VP, Dell EMC Federal

LPTA is Hurting Employee Relations for Small Businesses

Introduced in June and passed by voice vote with no dissent, HR 3019, the Promoting Value Based Procurement Act of 2017, acknowledges that the Lowest Price Technically Acceptable (LPTA) approach has not produced strong results on contracts. The legislation forces agencies to justify an LPTA approach by “comprehensively and clearly describing the minimum requirements expressed in terms of performance objectives, measures, and standards that will be used to determine the acceptability of offers.” Agencies must exert more control over reporting and standards on LPTA contracts to justify the cost relative to the quality of the product.

This is a welcome step considering the pressure LPTA puts on contractors more interested in quality work with consistently satisfied customers than volumes of one-hit, transactional contracts. LPTA disincentivizes businesses, especially small businesses that lack capital, from being good, responsible employers. The trickle-down effects of LPTA are easy to trace, as unhappy employees perform unhappily, lowering the quality of their work products and the performance of their employers and customers.

The largest expenses for any employer are employee salaries and benefits. When a contractor competes on an LPTA basis, it must keep these costs as low as possible to remain competitive, leading to a scenario where skilled employees are underpaid and unhappy, or unskilled employees are fairly paid but cannot perform to a high standard. The stakes are high: according to an Employee Job Satisfaction and Engagement survey conducted by the Society for Human Resource Management (SHRM), compensation and benefits are among the top five measures of job satisfaction. Another study, conducted by Harris Poll on behalf of Glassdoor, found that 57 percent of respondents said benefits and perks are among their top considerations before accepting a job.

While the perks of a job may sound superfluous, when considering the cost of a disengaged employee it becomes clear that benefits are worth the investment. Disengaged employees are often poor collaborators, lack enthusiasm for projects, miss deadlines, and don’t take initiative. Among companies that provide multiple benefits, such as wellness programs, SHRM found that 40 percent saw decreases in unplanned absences, and 33 percent cited a direct increase in productivity.

The absence of good pay and benefits also increases the risk of turnover–a major issue for government agencies that rely on contractors and institutional knowledge to be successful. When employees leave, contractors spend money to replace them. SHRM found that the total costs of replacing an employee can range from 90 percent to 200 percent of their annual income, depending on several factors. Contractors recoup that cost by raising their rates to the government over the long term. This leads to excessive costs for the government in the short term (due to lost productivity and knowledge loss) and long term (due to the higher rates). LPTA exacerbates this issue, leaving government contracts in short-term turmoil and actually raising costs over the long term.
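The SHRM range cited above is worth making concrete. The sketch below applies the 90–200 percent replacement-cost range to an assumed salary; the salary figure is illustrative, not from the study.

```python
# Worked example of the SHRM replacement-cost range cited above
# (90 to 200 percent of annual income). The salary used in the
# usage example is an illustrative assumption.

def replacement_cost_range(annual_salary):
    """Return the (low, high) total cost of replacing an employee."""
    return (annual_salary * 90 // 100, annual_salary * 200 // 100)
```

For an assumed $100,000 employee, replacement costs the contractor somewhere between $90,000 and $200,000 — cost that is ultimately recouped through higher rates to the government.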

HR 3019’s requirement that agencies justify an LPTA strategy is a good first step, but more action can be taken. Reliance on LPTA inevitably means that contracting firms will drive down prices in order to remain competitive, which will hurt their employees and the government. If the focus is changed from procuring services at a low price to procuring services at a fair, market-based price, the quality of the work the government receives will inevitably increase. While LPTA is well-intentioned–what better way to save taxpayer money than by driving prices down through a simple calculation and competitive procurement?–in practice it has proven to hold less value than strategies focused on performance balanced with cost.

The time has come to move on from LPTA and embrace other, smarter procurement strategies.

The Most Important Technology Trends: A Robotic Ambassador

How can we reach and impact the next generation of explorers? How can we make them care about science, engineering, and technology? How can we prepare them to work for NASA?

The Remotely Operated Vehicle for Education (ROV-E) is JPL’s Robotic Ambassador. (Photo: JPL)

What if we built a cute, smart rover that they can talk with? If we built it, would they engage?

As it turns out, the Jet Propulsion Lab (JPL) actually did build one…and ROV-E is her name-o.

The Remotely Operated Vehicle for Education (ROV-E) was built over the course of about a year by six Early Career Hires at JPL as a way to learn the end-to-end process for building a spacecraft. They designed and built ROV-E to go to schools and museums, drive over people as they lie on the floor, and carry expandable USB ports so that future engineers and dreamers can give her new features.

This year, we used these USB ports to give ROV-E a voice through the power of Amazon’s Alexa and Amazon Lex.

We demonstrated ROV-E’s new features at Amazon’s re:Invent conference in December 2016. ROV-E, looking a bit like the Curiosity rover, strutted her stuff in front of several thousand people, who marveled at her prowess and agility.

ROV-E debuted at Amazon’s re:Invent conference. (Photos: JPL)

She drove and answered questions about Mars. She is built mostly from open-source software and hardware components, as well as 3-D printed parts. She can take driving commands from both voice and an optional remote control. She can automatically follow you around using a camera and a 3-D depth sensor. She can avoid obstacles and even has a blinker to tell us where she’s going. She does all of this over a constant Internet connection.
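The voice-to-action plumbing described above can be pictured as a simple intent dispatcher. This is a hypothetical sketch: the intent names, the DemoRover class, and its methods are our own illustrative assumptions, not JPL’s actual code or the Alexa/Lex APIs.

```python
# Hypothetical sketch of voice-command dispatch for a ROV-E-style rover.
# Intent names and the DemoRover class are illustrative assumptions.

class DemoRover:
    """Stand-in rover that reports actions instead of driving motors."""
    def drive_forward(self):        return "driving forward"
    def turn_left(self):            return "turning left"
    def start_following(self):      return "following via camera and depth sensor"
    def answer_mars_question(self): return "A Mars day lasts about 24.6 hours."

def handle_intent(intent, rover):
    """Map a recognized voice intent to a rover action."""
    handlers = {
        "DriveForward": rover.drive_forward,
        "TurnLeft": rover.turn_left,
        "FollowMe": rover.start_following,
        "MarsFact": rover.answer_mars_question,
    }
    handler = handlers.get(intent)
    if handler is None:
        return "Sorry, I don't know that command."
    return handler()
```

In a real deployment, a voice service would resolve the spoken phrase to an intent string and the dispatcher would invoke the matching motor or Q&A routine.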

In this series, Tom Soderstrom, the IT Chief Technology and Innovation Officer of NASA’s Jet Propulsion Laboratory, discusses the future of technology: how work evolves, key technologies, and how to engage the next generation.

The grand finale of ROV-E’s demo happened when we turned off the Internet on stage–and ROV-E kept going. This feat constituted the worldwide launch of a new technology called Greengrass, developed by Amazon and demonstrated in partnership with JPL. Greengrass, which Amazon launched at the re:Invent conference, allows Lambda and Internet of Things functions to work without an Internet connection. Once the Internet is available again, the devices reconnect, continue to stream data, and update themselves. This type of technology could be really important to NASA in offline situations, such as sending rovers under vast sheets of ice or automatically guiding rovers to help find and rescue stranded miners.

So, did people engage? Yes, they did. ROV-E was the star of the show, with people even asking to take selfies with her. She has also performed at the NASA Booth at the Consumer Electronics Show and has gone to numerous outreach events. We have since released the logic for ROV-E’s voice to the public through a custom Alexa skill called “NASA Mars,” where anyone can get answers to thousands of questions about Mars as well as get the latest updates on what the real Curiosity rover is up to on Mars.

We’ve decided to take the next step and make ROV-E available to the world through inexpensive open source components and a simple instruction manual so that anyone can build one at home or at school. We want to make it cheaper and easier to build than it currently is. We also want to make it into a science platform where citizen scientists of all ages will be able to add their own hardware and program their own experiments.

By the end of this summer, ROV-E will come to explorers and tinkerers of all ages and to a school or museum near you. We hope that ROV-E will be a Robotic Ambassador that will make NASA proud and inspire everyone to participate in NASA’s mission.

GSA’s EIS has Feds Thinking About Cloud Unified Communications

Federal agencies are increasingly taking a serious look at their existing telecommunications infrastructure now that the General Services Administration (GSA) has awarded its Enterprise Infrastructure Solutions (EIS) telecommunications contract. This contract offers Federal offices significant opportunities not only to improve services, but to reduce both capital and operating costs. With the need for secure, reliable communications at an all-time high in government, the solutions on EIS aren’t just in the “nice to have” category; they’re things your office “must have” if it is to continue meeting critical missions well.

One of the significant benefits offered by EIS is a full range of cloud-based solutions. Everything from individual applications to total Unified Communications (UC) is available via EIS. While Federal offices may once have been hesitant to move communications to a cloud-based environment, that hesitancy has largely passed. Even before EIS, multiple civilian and defense agencies began moving communications-related services to the cloud.

And why not? The cost benefits of cloud solutions have long been established and are well accepted among Federal agencies. Capital cost savings, better managed operating expenses, and the elimination of costly upgrades and repairs are just some of the major saving features of cloud solutions.

Security, too, is enhanced. Contractors must adhere to rigorous FedRAMP and FISMA standards. Cloud systems themselves offer advanced survivability features beyond what is frequently available from more traditional approaches. Secure supply-chain requirements ensure that contractors offer only properly sourced, secure solutions.

One feature of the EIS contract that’s getting a lot of early attention is the ability for Federal agencies to obtain cloud-based UC solutions. These systems embody the security and reliability benefits discussed above. A good, FedRAMP-accredited cloud UC solution offers Federal offices a complete range of advanced communications capabilities, including voice, call management, unified messaging, video conferencing, and even (if not hosted) centralized Session Initiation Protocol (SIP) trunks. All of these services can be scaled to meet the needs of any customer.

Even better, customers considering a cloud-based UC solution are no longer tied to the proprietary technologies of just one company. Being “locked in” to one Original Equipment Manufacturer (OEM) solution has been a significant factor holding some customers back from obtaining all of the benefits a cloud-based UC solution offers. No one likes having their options limited.

Today, however, UC solutions are available via EIS from companies offering true “best of breed” solutions that feature components from a variety of OEMs. Now solutions can be flexibly tailored and properly scaled to focus on what the solution must do to support the customer, rather than requiring the customer to adapt its operations to how the solution itself works. Not only do Federal offices obtain the best overall operating solution, they’re not tied to one company’s technology and can ensure that they pay only for features they will actually use.

The “vendor agnostic” approach to acquiring cloud UC solutions offers Federal buyers not just significant savings over traditional telecommunications offerings, but better functionality and lower costs than a single-OEM approach. Cloud-based UC solutions could be the answer to your Federal office’s efforts to upgrade its current telecommunications infrastructure. They offer security, substantial new capabilities, cost savings, and true freedom to scale while using only the applications you need.

Make sure you ask EIS contractors what they can offer your office in terms of best of breed, cloud-based UC solutions.

The Most Important Technology Trends: Part 2

This is the second article in a series about the most important technology trends. The first article postulated that the key trend is the evolution of how we work. It discussed working like a startup and utilizing new methods of working including: agile development, open source, consumerization, continuous development/continuous integration, iterating with minimum viable products, DevOps, crowdsourcing, maker communities, and reducing wait states.

This article focuses on the key technologies that will deliver maximum benefits, especially when used together. Over the next three years, to work even faster and more effectively, we will use new natural user interfaces (NUI) to easily and seamlessly interact with previously unimaginable amounts of data in the cloud from real-time sensors created through the Internet of Things. We will make data-driven decisions in real time aided by complex algorithms that help make sense of the data through artificial intelligence (AI). And we will proactively measure everything through predictive and prescriptive analytics.

The way we interact with computing systems will change. Today, we must know what data we need, request permission, and then use a keyboard and mouse to access and, often painfully, combine data from different sources. It’s extremely labor intensive and, because of the many wait states, it’s too easy to lose the ever-important momentum and not deliver on time.

Tomorrow, we will simply use a NUI to query the system and receive an answer in seconds. As illustrated in Fig. 1, a scientist, engineer, programmer, or other stakeholder will simply ask a question using her or his voice, gesture to the camera, touch the data on a large touch screen, blink through the smart glasses, and/or think about the problem wearing a smart “helmet,” and the answer will appear. If we are successful, it will seem like AI magic.

Fig. 1 (Image: Tom Soderstrom)

So, what’s behind the scenes of this magic? The data will reside in clouds and be accessible through well-known APIs. The stakeholder’s questions will kick off a set of database queries and/or AI code that presents the solutions back to the user. Note that the system intelligently assists the human by presenting the data in the way that the user wants it and specifying the likelihood that the answer is correct. The user then chooses the action. This is the essence of Intelligent Assistance or IA.

IA will evolve to AI at the user’s desired pace. Once the user trusts the IA recommendations, she/he can choose to trust the system to implement the recommendation automatically. At that point, we have reduced an additional wait state and have evolved to true AI for that user and for that use case.
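That progression from IA to AI can be sketched as a simple trust loop: the system proposes an action with a confidence score, a human approves or rejects it, and once enough approvals accumulate, recommendations are applied automatically for that user and use case. A hypothetical Python illustration (the class, threshold, and message formats are invented for this sketch):

```python
class IntelligentAssistant:
    """Sketch of the IA-to-AI progression: the system proposes an
    action with a confidence score, the human stays in the loop until
    enough trust accumulates, and then recommendations are applied
    automatically for that user and use case."""

    def __init__(self, auto_after=3):
        self.approvals = 0            # trust earned so far
        self.auto_after = auto_after  # approvals needed before automating

    def recommend(self, action, confidence, approve):
        # `approve` stands in for the human's decision.
        if self.approvals >= self.auto_after:
            return f"auto-applied {action} (confidence {confidence:.0%})"
        if approve(action, confidence):
            self.approvals += 1
            return f"applied {action} after human approval"
        return f"rejected {action}"

ia = IntelligentAssistant(auto_after=2)
always_yes = lambda action, conf: True
print(ia.recommend("drill rock", 0.92, always_yes))  # human approves
print(ia.recommend("take photo", 0.88, always_yes))  # human approves
print(ia.recommend("turn left", 0.95, always_yes))   # trust earned: automatic
```

The key design point is that the human sets the pace: automation is earned per user and per use case, never imposed.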

There are many current and future examples. Today, AI software called AEGIS runs on Curiosity. While Curiosity is driving on Mars, AEGIS automatically identifies interesting rocks and tells Curiosity to take photos, which are then sent to Earth. Humans can investigate them using augmented reality through Microsoft’s HoloLens smart glasses and ask Curiosity to turn around and drill into the rock (see Fig. 2).

Fig. 2 (Image: Tom Soderstrom)

If this sounds too simple, we’re on the right track. AI is complicated, and many people fear it. However, no one fears Alexa, Siri, or Cortana, because of their apparent simplicity and because they simply advise the human, who takes the action. Hence the emphasis on IA.

What technologies are needed for us to execute this vision? JPL has experimented with all of the technologies below and has found them both promising and useful.

  • Open source and commercial analytics tools: They are readily available, inexpensive, growing, and evolving quickly.
  • Cloud computing: This includes the related technologies of advanced and unlimited computing and storage, serverless computing, edge computing, containers, microservices, and API management. The list will continue to grow.
  • Crowdsourcing: We will partner more within and between NASA Centers and externally to organizations such as Kaggle for data science and analytics.
  • Internet of Things: IoT will allow us to automatically collect highly relevant data, gain automated situational awareness, and interact in an easy, natural way with any system.
  • AI frameworks and libraries: We can choose from available open source options and run the calculations in the cloud. Examples include TensorFlow from Google, MXNet from Amazon, Cognitive Toolkit from Microsoft, and Torch, Caffe, Keras, DeepLearning4J, and Theano, as well as many more.
  • IA: By focusing on the user over the technology, we will employ Intelligent Assistance as a way to infuse AI while enabling human end-users to set the pace of infusion.
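None of these frameworks is required to see the core operation they automate at scale: gradient descent, repeatedly stepping model parameters downhill on a loss function. A toy illustration in plain Python, fitting y = w·x with mean squared error (the data and learning rate are made up for this sketch):

```python
# Fit y = w*x to toy data generated with w_true = 3 by gradient descent:
# compute the loss gradient, step the parameter downhill, repeat.
data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]
w, lr = 0.0, 0.05

for _ in range(200):
    # gradient of mean squared error wrt w: d/dw (w*x - y)^2 = 2*(w*x - y)*x
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 3))  # → 3.0
```

Frameworks like TensorFlow and MXNet do exactly this loop, but compute the gradients automatically, across millions of parameters, on cloud GPUs.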

How can we deploy these technologies?

  • Question farm: Iterate quickly with end users to find low-hanging use cases.
  • Experiment: Try the minimum viable product quickly with users and developers using small teams.
  • Focus on the data: Ensure that the data is accessible, consumable, reusable, and understandable.
  • Build in cybersecurity: Ensure that the solutions and the data are appropriately secured.
  • Take the quick, easy path: If we can’t get access to the data, we will drop this experiment. Instead, we will do the prototypes/experiments that show value with minimum wait states. We will also make it easy for the users by making the solutions easy to understand, build, and (re)use.
  • Measure everything: Is there enough end-user value to continue with this experiment?
  • Double down: If the experiment showed value and had an impact, we’ll iterate quickly.

Whether you think this is the right approach or think that it’s complete and utter hype, we’d love to hear from you about how we can help NASA answer the big questions that affect all of humanity. Next in the series I will discuss IA and AI in more detail and would appreciate hearing about your potential or actual use cases.

The Most Important Trend: Evolving How We Work

As we look at the exciting technology trends of the Next IT Decade (the next three years), one mega trend stands out: We will work very differently.

Why? Because we will need to work faster and more effectively with fewer wait states (aka “bureaucracy”). Consumer technologies are evolving very quickly and have made us highly productive at home. Enterprises, however, are slower to adopt these trends, largely because of legacy technologies and because switching is costlier and more time consuming.

So, because the Jet Propulsion Lab (JPL) and NASA are made up of IT consumers, a key disrupter is the adoption of the most meaningful emerging consumer capabilities in the enterprise. If JPL and NASA can do this, we can improve employee productivity and satisfaction, while also delivering the NASA mission faster, more securely, and at a lower cost.

But, which technologies and capabilities will matter and how can we use them? The answer is to predict the human behavior trends as human behavior affects IT, which affects human behavior, which affects IT…you get the point. Simply put, understanding human behavior trends helps us select which technologies are worth prototyping in the near-term, as those technologies are likely to be adopted in the enterprise.

From our research, the key human behavior trends for the next few years are the following:

  • Who will do the work? Entrepreneurs will come up with ideas. Makers will use 3-D printing, Arduino, and Raspberry Pi to prototype a solution. Crowdsourcing will help us find specific expertise and new, nontraditional partners who will work from anywhere to accomplish the NASA and JPL missions; this will also include public hackathons.
  • How will they work? They will use an agile approach, as well as open source and consumer technologies in the cloud to rapidly prototype a minimum viable product (MVP) and pivot quickly when needed. Crowdsourcing will be used to create the MVPs, both internally to the enterprise through hackathons and Kickstarters and externally through the NASA Open Innovation contracts, as well as other approaches.
  • What technologies and tools will they work on? They will apply advanced analytics and deep learning to make smart data from the current big data. They will evolve the cloud as the default development and operations platform, with rapid course corrections when needed. DevOps will be the expected way to work. The key enablers will be Internet of Things, wearables, natural user interfaces, and conversation-as-a-platform.
  • What are the key challenges they will face? We will no longer be able to lay out a long-term, fixed architecture. Instead, we will need to create a chaotic architecture, where frequent changes with effective and automated analytics are the new normal. Because of the size, scale, and speed of continuous development/continuous integration, manual operations will be replaced with automation, and this change can be difficult both technically and culturally. Cybersecurity becomes ever more important and needs to be built into all the solutions and automated with advanced visual and predictive analytics. Luckily, these challenges are not unique to our enterprise. By collaborating with others, we can meet them more quickly.

By paying attention to the human behavior trends, we will evolve the way we work to adopt new technologies faster, create automated and fully scalable solutions, and get effective help from new and varied partners.

Most importantly, this will help us use new techniques to answer the big questions quicker, such as:

  • Is there life in space?
  • How can we put humans on Mars?
  • How can we redirect an asteroid?
  • Where is Earth 2.0?
  • How can we help protect Mother Earth?

And that’s what it’s all about. An exciting future indeed!

Reorganization: What Happens and So What?

Since reorganization is a perennial issue in the Federal government, one would expect substantial academic literature to exist on this matter. While there is a considerable body of work on the histories of reorganization efforts (especially the various executive branch commissions), analysis of existing organizational arrangements (narrative pieces bemoaning overlap and duplication), and on what best might be called the politics of reorganization, the relationship between institutional and procedural reform and the policy output of the bureaucracy remains almost wholly unresearched.

Indeed, a review of the current state of actual knowledge concerning reorganizations and their effects is an unrewarding task, for knowledge of this kind is impressively slight. Dean Mann and Ted Anagnoson concluded, after an examination of the reorganization literature, that there was little explicit work on the results of reorganizations:

Almost nobody has asked the question: What difference have these reorganization plans and executive orders made? How have they been implemented and with what results?

The focus of this article is on these specific topics. It draws from my own work in the government, not only on various organizational study teams but, more importantly, from being deeply involved in creating two new agencies and abolishing two others.

As noted above, the what, what’s wrong, and to a certain extent, the what ought to be done, have been adequately covered–to say the least. But the specific consequences of restructuring efforts have been largely ignored. Let me begin this undertaking by assessing accomplishments in terms of the “goals” of reorganization. Reorganizations are usually designed to:

  • Simplify and streamline;
  • Bring about greater efficiency and economy;
  • Place program oversight under a single administrator;
  • Help make possible effective program management, sound financial control, and coherent delivery of services by consolidating program areas badly fragmented in the existing organization structure;
  • Simplify and strengthen the linkage between policy development and program administration;
  • Eliminate program fragmentation and end confusing organizational divisions;
  • Prevent fraud and abuse;
  • Achieve savings;
  • Reduce staff; and,
  • Make (the entity) more responsive to the millions of Americans that the Congress has directed (the entity) to serve.

These goals or objectives of reorganization seem quite consistent with traditional public administration doctrine and characteristic of what Harold Seidman regarded as “administrative orthodoxy.” It is somewhat difficult, therefore, to measure success in terms of such “proverbs” or “organizational platitudes,” but let’s seek to dig a bit deeper.

A number of the goals reflect a concern with structure (e.g., simplify and streamline, efficiency and economy, consolidating program areas, simplify linkages, end organizational divisions, and so on). However, during reorganizations, while a number of consolidations occur, numerous other fragmentations remain. This is not surprising, since in a government with multiple objectives and thousands of programs it is likely impossible to organize so that issues do not cross organizational lines. In fact, there probably is no way to structure the government so that all programs with interrelated objectives are in only one component. Certain organizational efficiencies may occur, but a number of inefficiencies remain untouched or, in some cases, are created by the reorganization. Reorganizations end certain duplications or program overlaps while creating new ones.

Other goals reflect a major concern of traditional public administration doctrine: economy and efficiency. But tracking agency savings, as almost any seasoned budget officer would tell you, is “dealing with funny money.” Most reorganization assessments seem to verify Rufus Miles’ assertion that economy as a ground for major reorganization is a will-o’-the-wisp.

In one of the departmental reorganizations I staffed, the secretary said as he announced the reorganization:

I recognize that it is far easier to announce a fundamental reorganization than to implement proposed plans adequately and to change materially the way in which our money is spent and our citizens are served.

He was addressing an issue that has received little attention: implementation of organizational reforms. Donald Van Meter and Carl Van Horn have offered four reasons for the neglect of policy implementation:

  1. There is the naïve assumption that implementation follows automatically after policy formulation and that results do not deviate from expectations.
  2. The implementation process is assumed to be a series of mundane decisions and interactions.
  3. The focus on analysis of policy alternatives and rational policymaking has excluded attention “of the lower echelons of agencies responsible for implementation.”
  4. The enormous difficulties involved in studying implementation.

These same reasons appear valid for explaining the neglect of reorganization implementation. While the constraints are formidable, what can be said about this important matter?

Rufus Miles has noted that reorganizations have traumatic effects that should be carefully weighed. Of course, as Miles noted, organizations “vary widely in the degree to which they disrupt the skein of human relationships that are the communications and nerve networks of every organization.”

Some reorganizations cause little or no disruption, while others are traumatic. But any reorganized agency undertakes a heavy load of bureaucratic activities. People have to be reassigned; procedures have to be developed; policies have to be established; money has to be spent in a way that can be made accountable; office space and furniture have to be obtained. Personnel offices have to review proposed organization structures, review and rewrite position descriptions, fill new and existing vacancies, transfer employees, handle union concerns, and advise employees of their rights during reorganization.

The magnitude of these endeavors can only be understood by someone familiar with the complexity and arduousness of the Federal personnel system. Similar challenges exist in budget, finance, grants, acquisition, security, and other administrative services.

Implementation is too often simply assumed to follow automatically, a rather common occurrence according to I.M. Destler and the Government Accountability Office (GAO):

For reorganization, as for any other change, implementation is the bottom line. Without it, the whole exercise is show and symbolism. Yet in real-life attempts at reorganization, serious concern with implementation is typically too little and too late. Enormous attention is devoted to analyzing and deciding what changes should be made. The problem of getting from here to there is addressed only belatedly. To paraphrase Erwin Hargrove, implementation often seems the “missing link” in reorganization.

So what lessons can we draw from previous reorganizations? First, reorganization is not likely to make government measurably cheaper. Second, unwarranted stress should not be placed on efficiency as grounds for reorganization. The simple fact is that public administration and organizational theorists know very little about what type of reorganization promotes efficiency; in some cases they have turned to consolidation, in other cases, to decentralization. Third, government reorganizers must pay special attention to the problems that can be caused by excessive tinkering. As Miles has noted:

Traumatic reorganizations may be analogized to surgical operations. It is important that their purposes be carefully assessed and a thoughtful judgment reached that the wielding of the surgical knife is going to achieve a purpose that, after a period of recuperation, will be worth the trauma inflicted. And the surgical knife should not be wielded again and again before the healing process from earlier incisions has been completed.

Finally, this article should make obvious that, since it is only through effective implementation that adopted reorganization proposals can bring about results, implementation is a crucial part of the reorganization process. And it appears that implementation strategy cannot be left until after a reorganization program has been approved.

CIO Game of Thrones?

Federal CIOs are going down like Lannister bannermen. Is this just the typical transition turmoil, or is there something broader afoot? Speculation abounds that the Trump administration may be cutting Federal CIOs as part of a strategic initiative to centralize Federal IT under one CIO. This palace intrigue was given more credence by yesterday’s White House ATC IT Modernization Report’s focus on centralization. This also maps to the simplification priorities in the Cyber Executive Order. Do we expect to see one new Federal CIO at the operational level?

Former Federal CIO Tony Scott is skeptical of the notion that the new administration would go to a single CIO. “But I’m hoping that the Federal CIO job gets filled soon,” said Scott. “While career staff are doing a great job, there’s just no substitute for having that slot filled without the ‘acting’ title.”

But, what would one central Federal CIO mean for FITARA? Congressman Gerry Connolly famously observed that when he and Congressman Darrell Issa wrote FITARA, there were 250 people with a CIO title within the 24 CFO-Act agencies. Instead of a Federal CIO to govern all agency CIOs, what if we have one CIO controlling IT at all Federal agencies? And lastly for the dismount, who’d be crazy enough to take the job?

Game of Thrones, eat your heart out.

Is DevOps Helping Feds Put the Pedal to the Metal on Cloud?

We are starting to hear very different language when federal IT leaders talk about modernization.

At the recent Federal Focus: The Cloud Generation event, Small Business Administration (SBA) Deputy CIO Guy Cavallo and CTO Sanjay Gupta talked about their migration from data center to cloud, noting that the key was to “burn the bridge back to the old way of doing things” and commit completely. The SBA successfully managed the migration in just 57 business days with no additional funding, demonstrating the art of the possible if you have the will and the leadership.

Our panel at the Federal Focus event explored development and operations, or DevOps, as an approach to modernize and speed new development efforts, shrinking the window between idea and delivery. This is especially important against the backdrop of the Data Center Optimization Initiative (DCOI) and other efforts to streamline Federal IT with a cloud-first approach.

Jennifer Hoover, DevOps program manager at the Transportation Security Administration (TSA), explained that DevOps encompasses more than just software development. The true definition is much broader and includes the tools people use, the technology, the processes of acquiring services, and the people operating the systems.

“Inherently, DevOps is about the people,” Hoover shared. “It’s about the culture. Somewhere in the middle of that is DevOps.”

Sean Jennings, co-founder and senior vice president of Solutions Architecture for Virtustream, added that DevOps forces teams to come together. In the past, too often tools were built that did not meet functional needs. And, as we are all aware, far too many development efforts were executed in silos and abandoned before completion.

DevOps connects customers with the engineers and service relations staff up front and throughout the development process, helping to avoid end products that might be “technically perfect” but don’t meet mission needs. Much more incremental than traditional waterfall development approaches, DevOps lets teams see flaws and course correct more quickly. And it keeps all the key stakeholders involved throughout: operations and service teams, security teams, and the end customer.

The challenge is that DevOps requires an enormous shift in mindset and culture, so communication is key. Chad Sheridan, CIO of the USDA Risk Management Agency, shared that his job is to constantly communicate the “why” and help the different groups understand one another better. Both Hoover and Sheridan emphasized the importance of communication as you work to build trust in the new approach, using all-hands meetings, lightning talks, guest speakers, and other methods of interaction.

As we shift to a multi-cloud world and enable digital transformation efforts, we need faster, more effective ways to build and connect cloud-native applications. DevOps is providing new opportunities to rapidly deploy and scale new services and shrink that window from idea to delivery – literally putting the pedal to the metal on cloud.

By: Cameron Chehreh, Chief Operating Officer, Chief Technology Officer & VP, Dell EMC Federal

The Race to Innovate: IoT, Cloud, and the Future of Government

According to a Gartner study, there will be 26 billion internet-connected devices by 2020 – more than three devices for every human on earth.

This hyper-connected world presents opportunities and challenges for federal agencies, particularly given data security and privacy considerations, and the enormous variety of IoT devices (many of which, unlike a laptop, are difficult to update when we identify vulnerabilities).

I recently moderated a panel on the IoT at MeriTalk’s Cloud Computing Brainstorm. Federal leaders discussed successful IoT implementations and how the cloud can accelerate progress. The consensus was that we need a greater focus on IoT in federal IT, from an application and a security standpoint.

“When you think about the Internet of Things, we’re going to be in a race,” said Greg Capella, deputy director of the National Technical Information Service, Department of Commerce. Panelists discussed how keeping up with the IoT race means agencies need to adopt cloud computing and fully utilize cloud-based networks.

“We can’t do IoT without the cloud,” said Christine Calvosa, deputy CIO of Technology and Resiliency, Federal Communications Commission. “It’s just a network of networks.”

Traditional IT infrastructure can’t handle the flood of data created by the IoT. Cloud provides the flexibility, scalability, and storage capacity needed to manage all of that information. “The public will demand services much like we demand cellphones in our pockets,” Capella said. “We’re basically facing the fact that there will be more data than we can digest.”

As federal agencies consider IoT initiatives, they need clear frameworks, including the ability to analyze and secure the data, and a path to build IoT systems that allow for future innovation.

The panel did warn against thinking about IoT too narrowly or just in terms of specific use cases. Just as we are using the Internet in ways early developers could not have imagined, the same will certainly be true for future IoT.

To be successful, we need everyone at the table – engineers, cyber security teams, procurement, and the mission owners. Collectively, we need to keep looking forward. Cloud and IoT will help us deliver more efficient, more connected, and ultimately smarter government.

Learn more about Dell EMC’s cloud computing capabilities: https://www.emc.com/en-us/cloud/hybrid-cloud-computing/index.htm.

By: Cameron Chehreh, Chief Operating Officer, Chief Technology Officer & VP, Dell EMC Federal

FITARA 4.0: What Agencies Can Learn

The latest FITARA scorecard revealed the first overall “A,” issued to USAID, which managed a significant improvement following a string of D’s on the last three scorecards. Unfortunately, more agency grades declined than improved.

The trend is frustrating, particularly as the December 2016 scorecard showed improvements. The Chief Information Officer (CIO) is supposed to drive most of the change required by FITARA. Yet, just a third of Chief Financial Officers (CFO) Act agencies have a permanent CIO in place. The rest are left with an interim CIO or have left the position empty altogether.

And, as we heard at the June 13 Oversight and Government Reform (OGR) hearing, too many agencies still don’t provide visibility into their IT programs – such as software license inventories, data center closures, and cost savings metrics – as required under FITARA.

At the OGR’s FITARA hearing, Rep. Gerry Connolly (D-Va.) emphasized that agencies need to look at USAID’s top scores to find the key to improving their own standing. “They reached out to GAO to find out how to improve. They listened to advice and they implemented it,” Connolly said. “If there’s the political will, if there’s the managerial desire, you’ll have congressional support and you’ll have GAO support.”

Dell EMC, together with other IT leaders, recently met at the White House for the first American Technology Council Summit. We discussed how technology can improve services and support a more efficient, accountable government. We also talked about how we can get there, including retiring legacy systems, increasing the use of shared services, improving procurement processes, and using big data and analytics to improve services and reduce fraud, waste, and abuse.

At Dell EMC, we think about three key components for digital transformation and modernization: IT transformation, workforce transformation, and security transformation. As the federal government continues the modernization push, FITARA will drive accountability, transparency, and ultimately more consistent improvement across government.

If USAID’s success drives other agencies to reach out to GAO for best practices, we all win.

Learn more about Dell EMC’s perspective on digital transformation:  https://www.delltechnologies.com/en-us/perspectives/digital-transformation-index.htm.

By: Steve Harris, Senior Vice President and General Manager, Dell EMC Federal

Federal Cloud Forecast Getting Brighter: FedRAMP Evolving

The forecast is looking brighter for FedRAMP.

The FedRAMP Project Management Office (PMO) has worked to make cloud procurement more transparent and more efficient. At June’s Cloud Brainstorm event, Reps. Will Hurd (R-Texas) and Gerry Connolly (D-Va.), along with FedRAMP leadership from the General Services Administration (GSA), shared perspectives on progress to date and what’s ahead.

Most agree that the FedRAMP Accelerated program, which modified how the FedRAMP Joint Authorization Board (JAB) authorizes cloud service providers (CSPs) to make the process significantly faster and more predictable, has eased concerns and is driving positive change. Rep. Connolly said legislators are pleased with FedRAMP’s progress, sharing, “It wasn’t that long ago that we were feeling pretty dire about how FedRAMP was proceeding. Significant improvements have been made.”

An independent study of FedRAMP from May 2017 found that six agencies have used at least 20 CSPs approved under FedRAMP, and that there was an 80% growth in the use of FedRAMP certifications.

That said, industry representatives continue to see reluctance among agencies to accept one another’s Authorities to Operate (ATOs). While agencies are willing to go through the process of getting a CSP approved by FedRAMP–contributing to the overall growth in certifications–one agency doesn’t necessarily trust a CSP brought through the process by a different agency, because each agency IT head has a different set of internal standards and guidelines. This is a significant issue, but leadership recognizes that the challenges are driven by factors beyond the FedRAMP program.

Matt Goodrich, FedRAMP program director, says that given FedRAMP’s budget, it is neither realistic nor prudent for every vendor to go through JAB approval. JAB must be reserved for cloud services that are truly government-wide.

Under the Federal Information Security Modernization Act (FISMA), the CIO is the sole individual responsible for accepting cyber risks for their own agency. Acceptable risk for one agency may not translate to acceptable risk for another.

What’s ahead for FedRAMP? The goal is to get to a point where a vendor holding one ATO can go through an even more accelerated process as they apply for the next. Hopefully, the FedRAMP program will continue to streamline and evolve.

FedRAMP can also serve as a driver for cloud adoption beyond federal agencies. Joe Moye, senior vice president of public sector, Virtustream, says, “The state and local government market creates an opportunity to leverage the FedRAMP platform beyond federal agencies. The focus on expediting some of the process is crucial.”

FedRAMP will play a vital role as agencies focus on digital transformation and modernization. It’s important we continue to engage in productive public/private dialog and work together to ensure agencies have the best and most secure cloud options.

Learn more about Dell and Dell Technologies FedRAMP-approved cloud services:  http://www.dell.com/learn/us/en/uscorp1/press-releases/2016-04-25-dell-cloud-for-us-government-meets-security-standards and http://www.virtustream.com/cloud/virtustream-federal-cloud/.

By: Cameron Chehreh, Chief Operating Officer, Chief Technology Officer & VP, Dell EMC Federal

Look Who’s MeriTalking: Dell EMC Federal’s Cameron Chehreh on the IoT, Cloud, and What’s Next

MeriTalk sat down with Cameron Chehreh, Chief Technology Officer at Dell EMC Federal, to discuss the Internet of Things (IoT), cloud, and what’s next for Federal agencies.

MeriTalk:  The number of Internet-driven devices continues to grow every day. How can cloud computing help manage the growing IoT landscape?

Cameron Chehreh: We are an “always on, always connected” society–whether it’s by phone, laptop, smart watch, or tablet–consumers are used to instant access to information. The Federal workforce brings these expectations to their jobs. And, as we have more connected devices, we have more information–Federal IT leaders are having a hard time keeping up with the surplus of data created by the IoT.

Cloud provides Federal agencies with the improved flexibility, scalability, and storage capacity they need to successfully manage the mountains of data being collected each day. By providing this foundation, cloud enables agencies to focus their resources on innovation–fueling successful digital transformation.


MeriTalk: What advice would you give to Federal CIOs considering IoT initiatives for their agencies?

CC:  Have you heard the saying, “If you fail to plan, plan to fail”? Same goes for organizations considering new approaches. CIOs should lay out a clear framework for their agency and place a strong focus on security.

We need to move as many legacy applications to a cloud-based model as possible. Let’s face it–traditional IT infrastructure won’t scale as needed. Consider a DevOps approach as you engineer new solutions and build new “cloud native” applications. Bringing development, operations, security, and end user teams together will give you the best chance for success.

MeriTalk: What do you see as the biggest challenge for Federal agencies that struggle to keep up with the IoT race?

CC:  A recent Business Insider report predicted there will be a total of 22.5 billion IoT devices by 2021–while there were just 6.6 billion in 2016.
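For scale, those two figures imply compound annual growth of roughly 28 percent. A quick sketch of the arithmetic (the device counts are the report’s projections, not measurements):

```python
# Implied compound annual growth rate (CAGR) from the Business Insider
# projection cited above: 6.6 billion IoT devices in 2016 growing to
# 22.5 billion by 2021.
devices_2016 = 6.6e9
devices_2021 = 22.5e9
years = 2021 - 2016

# CAGR = (end / start) ** (1 / years) - 1
cagr = (devices_2021 / devices_2016) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 28% per year
```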

As I’ve mentioned before, continued reliance on legacy technology and old-school processes is a challenge. In addition, one of the biggest challenges for Feds is security. As IoT introduces more devices (and a greater variety of devices), we expand the attack surface. IoT is becoming more integrated in our daily lives–we all have our personal information on multiple devices. Security concerns now extend beyond sensitive information to our very identities, and this has significant implications for the Federal workforce and Federal missions.

Some believe the excitement around IoT leads to increased risk as there is pressure for industry to push out the next innovative idea more quickly than competitors. This means Federal agencies need to take an even harder look at security, and find ways to mitigate potential risks.

MeriTalk: How can Dell EMC help Federal agencies keep up with the IoT race? 

CC: As we have discussed, cloud enables cost savings, improved operations, and better security. Dell EMC has an extensive portfolio of cloud computing capabilities to assist agencies. The hybrid cloud platform enables agencies to focus their time and resources on innovation–on their missions, and how they can put IoT data to work.

Overall, we want to help organizations “do more with more (data)”–and cloud is an important step.

MeriTalk: Based on your experience, what would you say are the top three best practices for successfully managing IoT?

CC: Workforce education and transformation is going to be very important and we will need more collaboration across the organization than in the past. Find ways to bring IT teams, cyber teams, engineers, etc., together.

Agencies also need to keep looking forward–planning for the unimaginable. For example, the other day I saw a delivery bot traveling along the sidewalk en route to its destination. Five years ago, we would have never thought that would be possible. Same goes in this situation–in a few years, we will be using IoT in ways we can’t imagine today.

Additionally, as we’ve touched on–fully utilizing cloud-based networks is the key to managing IoT. Without the cloud, agencies will be collecting more data than they are able to process.

MeriTalk:  What benefits can the growth of IoT provide for Federal agencies, and Federal CIOs?

CC:  The growth of IoT will help Federal agencies deliver a more efficient, more connected, and ultimately smarter government.

There are so many use cases–from improved security, to improved maintenance management, to improved preventative health care. The use cases will drive the development of the technology. As we have greater transparency, we’ll be able to make better decisions about how and where to use our resources.

MeriTalk: Make one prediction for the future of IoT. Where do you see this landscape in five years?

CC:  The IoT is going to empower the next-generation workforce, helping us make real-time decisions that will radically change how we manage people and resources, improving efficiency. And, when combined with analytics technology, AI will help us not just solve problems, but also predict and prevent them.

Lessons From Silicon Valley (and Elsewhere)

Recently, John Chambers was in town for meetings of the U.S.-India Business Council, which he chairs. His visit coincided with Prime Minister Narendra Modi’s White House meetings with President Trump. Chambers is now chairman of the board at Cisco Systems after having served as CEO for more than 20 years–an eternity in Silicon Valley. He spoke at a CXO breakfast attended by 20-plus government CXOs. Here are some highlights of his remarks.

  • Countries that have embraced digital transformation have seen growth in their Gross Domestic Product (GDP), increased the number of startups (which in turn leads to job creation), and improved education and health care.
  • One can look to countries like India, Israel, and France, which have all launched digital transformation initiatives. France? A country better known for a 35-hour workweek and rigid labor laws? Yes. Visit a vast project in the heart of Paris, a refurbished train depot called Station F (an ironic echo of the 18F innovation office at General Services Administration headquarters). It already houses 1,000 startups and aims to amass the largest group of entrepreneurs, venture capital firms, incubators, and accelerators anywhere in the world.
  • France has already become one of Europe’s top destinations for startup investment. Venture capital and funding deals there last year surpassed that activity in Germany, making France second only to Great Britain in Europe.
  • The United States risks falling behind because it lacks a national digital strategy. What should the elements of such a strategy be? Here are four, as described in a recent speech by the new president of France (and former minister of economy, industry, and digital affairs), Emmanuel Macron: (1) Invest–create investment vehicles to make grants and loans available to fund startups and accelerators on easy terms; (2) Promote Tech–use smart and connected cities as tech hubs; (3) Offer Incentives–lure international tech talent by providing a fast-track work visa for entrepreneurs and their families; and (4) Ease Tax Laws–create a special tax status for innovative new companies.
  • The goal of any such strategy should be to disrupt. Focus on the technology and market transitions happening now. Focus on customers and citizens. Implement quicker. Embed security. Break down silos. One CXO interjected that in government we call them “cylinders of excellence.” Put an architecture in place and assemble “innovation playbooks” so you can replicate your success elsewhere. Finally, focus on outcomes (e.g., growing the economy, creating jobs, stirring new business startups, etc.).
  • What are the top challenges to creating a digital government here in the U.S.? They are culture, silos, security, lack of resources, overconfidence/arrogance. Our country risks being left behind (e.g., the number of IPOs on the New York Stock Exchange is declining).
  • The CXOs in attendance confirmed that agencies are moving in different directions, thereby adversely affecting the government’s ability to deliver integrated services to citizens. Chambers noted the need for leadership from the top and argued against “doing the right thing for too long.”
  • Cyber crime has passed physical crime in terms of its economic toll. Why? The famous criminal Willie Sutton was once asked why he robbed banks, and his response was simple, eloquent, and humorous: “Because that’s where the money is.” Today not only do you need to secure your network but you also need to secure the devices that people use in their everyday life and expect to also be able to use in their work life.
  • In closing, Chambers urged the group of technologists that represented agencies and departments across the government to educate their leadership on the importance of digital transformation, to create organizations that can deliver technology, systems, and results with the speed needed to respond to today’s changing landscape, and to work across the government to break down stovepipes to deliver better services to the public.

What were some of the takeaways for these government CXOs? One of the authors of this article queried a number of them a day or so after the breakfast. Their responses are instructive.

  • I’m going back to meet with my boss and talk about how we can enhance our interactions with our customers. How can we make the interactions more useful, more usable,  more efficient and so on?
  • What can we do to shift from legacy back-office systems that are consuming too many of our resources to digital platforms? How can we reallocate $$$ to more direct citizen engagement?
  • I’m going to meet with my own staff to ensure we have the proper technology platforms in place to support a digital transformation–an up-to-date infrastructure, analytics,  privacy, security, mobility, and user experience–to mention just a few examples.
  • Who else needs to be involved–top leadership (obviously). But who else? Management and Budget and the CFO. Human Resources. Legal. Who else?

 

 

Alan Balutis is a Distinguished Fellow and Senior Director, U.S. Public Sector for Cisco Systems. He has been in the public service and industry business for over 30 years. He was a founding member of the Federal CIO Council. His 28 years in the Federal sector were spent at the Department of Commerce, where he headed its management and budget office as its first CIO.

Martha Dorris is a founder at Dorris Consulting International. She spent more than 30 years at GSA and was president of the American Council for Technology. She was senior adviser to the board for the International Council for IT in Government Administration (ICA).

Will Ash leads Cisco’s Security sales team serving U.S. Public Sector customers. This team is focused on delivering protection to government and education customers across their extended network before, during, and after a cyberattack through threat-centric security solutions. He previously led Cisco’s Atlantic Enterprise team.

 

Look Who’s MeriTalking: ServiceNow’s Bob Osborn on Modernization

MeriTalk recently connected with Bob Osborn, Chief Technology Officer, ServiceNow Federal, to discuss President Donald Trump’s Executive Order (EO) for a Comprehensive Plan for Reorganizing the Executive Branch and what’s ahead for agencies charged with implementing modernization initiatives.

MeriTalk: Recently, the first draft of the Agency Reform Plans were due in response to President Donald Trump’s Executive Branch Reorganization EO. What types of recommendations do you expect to see take center stage?

Bob Osborn: The initial plans will set the tone for reorganization efforts moving forward from the White House, Office of Personnel Management, and agencies.  We expect to see a tempered initial response as agencies await developments with regard to whether mandates will ultimately be funded.


MeriTalk: The EO calls for agencies to better leverage technology to improve underlying business processes–including identifying opportunities to automate processes. What types of processes represent the greatest target for automation in today’s Federal government–even after years of modernization efforts?

BO: We have seen modernization as a theme of each new administration in recent history, but the barriers to modernization have been complex. Many manual processes remain, especially when it comes to processes that cut across multiple departments. Initiating a cross-cutting, agencywide approach to service delivery is necessary but challenging, especially when we look at holistic service delivery that marshals the data stored in legacy systems. For example, automating the onboarding of personnel spans human resources, financial, and IT infrastructures and processes– many of which remain disconnected. This function, which remains largely manual today, is one example of a prospect for modernization.

MeriTalk: What role can and should cloud play in helping to align the Federal workforce to meet today’s needs and those ahead?

BO: Cloud is critical as the foundation from which we provide modern, consumer-like services. Consider how we make decisions and run our lives: we’ve chosen a platform–phone or tablet–that runs applications presenting information on which we base decisions. We may use applications like Waze for driving directions, banking apps for online banking, or Uber for transportation services. With just a point, click, and drag, you can do just about anything you like. This is the expectation of today’s workforce.

To deliver at work (in this case, a Federal agency) the same type of modern user experience that we have in our “consumer” life, we need a multi-instance enterprise cloud that allows systems to pull information from multiple data sources. We don’t have that today in most Federal agencies. As a result, agencies are making decisions using stale data. We believe this is an archaic way to do business, especially as we modernize and treat government services as consumer services. A multi-instance enterprise cloud is central to achieving both a modern user experience and real-time, multistream data for better decision-making.

MeriTalk: How might we expect to see the role of managed services evolve under the EO?

BO: This is clearly the way to go. Tony Scott said on many occasions that government needs to start acting like a multi-divisional enterprise: an overarching corporate structure with independent divisions performing different functions. Shared services is one aspect of this, and managed services is a maturation of shared services. This building-block approach involves capable, agile architecture and infrastructure and easy-to-use applications. Ultimately, it results in hosted services that leverage workflows and business processes to deliver the right information to the right person at the right time for appropriate decision-making.

We see many organizations modernizing one department within their agency first. Applications developed through DevOps allow other departments to take advantage of them quickly, and this can then be expanded across the entire agency. Core functions–like human resources and financial management–are ripe candidates for managed services. I believe the GSA unified shared services model is the right direction; it has already set the framework. The challenge is that you need a common platform through which these services can be delivered.

MeriTalk: How is ServiceNow positioned to help agencies in executing on the requirements of the Reorganization EO?

BO: We have taken the necessary steps through investment in our enterprise-class cloud and have gone through the FedRAMP certification process, so when the time comes, agencies are confident putting their information into the ServiceNow cloud. We have more than 100 customers utilizing this modern consumer-experience platform delivery model. The ability to do service delivery easily and rapidly on a platform that is mobile-device aware is new. The proper visualization is there regardless of the device–collaboration and chat, analytics, governance, risk, and compliance, and portfolio risk management are all integrated right out of the box. This reduces costs and cybersecurity risk, enabling agencies to move out on the EO with confidence.

MeriTalk: What best practices are you sharing with agencies as they begin this journey?

BO: I encourage everyone to do their homework. Not all clouds are created equal, and not everyone who claims to have an enterprise-class cloud and enterprise service delivery really does. The ITIL model is critical because it provides a proven approach. Agencies must not be reluctant to start small and grow. With ServiceNow, you can start in one business area or department, and since everything is native on the platform, you can grow. This is not unlike a communications service provider, like Comcast or Dish: a basic channel package exists, and then you subscribe to the additional channels you want. If you want more, you simply turn them on because they’re resident on the box.

ServiceNow is subscription based, so this same model applies. Feds can start in any one area, and then everything is certified in their environment. In the future, all they need to do is subscribe to other services. The certification process is minimal compared with other acquisitions–we’re transforming IT acquisition. Today, CIOs struggle to keep up with new technologies. This platform approach provides constant upgrades and new technology: new functionality and new apps are always added, and subscriptions are always modernized. Because we use a single data model with common application logic, upgrades are backward compatible. Instead of being a big issue, upgrades become non-events. Agencies can keep up with modern technology and lower cybersecurity risk, all at a lower cost.

 

Look Who’s MeriTalking: Palo Alto Networks’ Ryan Gillis on CDM

MeriTalk recently connected with Ryan Gillis, vice president of cybersecurity strategy and global policy for Palo Alto Networks, to discuss continuous diagnostics and mitigation (CDM) implementation and how Palo Alto Networks can drive agencies to a more secure environment.

MeriTalk: What is your high-level assessment of where agencies stand in terms of CDM implementation, and specifically what are your thoughts about the prospect of dashboards being in place by the end of 2017?

Ryan Gillis: It’s hard to make a generalization across government but there are common issues as agencies implement CDM. As a baseline, CDM and EINSTEIN are the two programs through which the Department of Homeland Security provides certain complementary technologies and capabilities to each civilian agency. Additionally, each agency has the responsibility to secure their own networks, and deploys technology, people, and processes to accomplish those goals.


Agencies are trying to drive as much harmonization as possible between what they’re getting through CDM, what they’re deploying themselves, and how that all functions together. One particular goal is to go through CDM to drive cost savings. Aligning what agencies are buying through CDM and what they are buying separately is a common issue that arises as agencies move through the CDM process.

When it comes to implementation, agencies are not just sticking with one static solution; CDM has done a good job of periodically evaluating and incorporating new technologies. This benefits agencies as they try to acquire new technologies to help secure their networks. In terms of the dashboard, I think it will remain a focus; however, it remains to be seen whether agencies accomplish this goal. We have seen a lot over the last 10 years of CDM and EINSTEIN deployments, but a new set of circumstances and challenges–turnover in personnel and the ushering in of a new administration–complicates the ambitious deadline of deploying the dashboards by the end of 2017.

MeriTalk: The CDM program has three phases. Where have organizations faced the greatest challenges in phases 1 and 2? What are the hurdles ahead for phase 3?

RG: The first challenge in the stand-up of the CDM program was bringing in technologies that meet the initial requirements for phases 1, 2, and 3. Specifically, identifying the suites of tools the program would use, and getting agencies to purchase them, was the most important challenge in getting the program off the ground.

The hurdles ahead relate to three questions: How does CDM align with what agencies already have in their networks? How does the program manage technology refreshes as the tools that meet CDM requirements continue to evolve in the private sector? And how do CDM requirements and capabilities evolve in phase 4 and beyond?

MeriTalk: What strategies should agencies adopt for each phase to minimize the pain and accelerate implementation?

RG: The lessons learned from the OPM breach and the last administration’s Cyber Sprint are now being applied through the Trump administration’s cybersecurity executive order. For example, under the new executive order, each department and agency head is held personally responsible for their own agency’s security, and agencies are now also mandated to implement the NIST framework based on risk management. As agencies look to CDM implementation, they should focus on the areas of risk that matter most: What is the highest-value asset on your network, and how do you apply people, processes, and technology toward securing it?
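The risk-first triage described here can be illustrated with a toy scoring model. This is only a sketch: the asset names, likelihood values, and impact scores below are invented for illustration, and NIST’s actual risk-management framework is far richer than a single likelihood-times-impact product.

```python
# Hypothetical illustration of risk-based prioritization: score each asset
# by likelihood x impact, then direct people, process, and technology
# investment at the highest-scoring assets first. All values are made up.
assets = [
    {"name": "hr-records-db", "likelihood": 0.6, "impact": 9},
    {"name": "public-website", "likelihood": 0.8, "impact": 3},
    {"name": "email-gateway", "likelihood": 0.7, "impact": 6},
]

for asset in assets:
    asset["risk"] = asset["likelihood"] * asset["impact"]

# Highest risk first.
for asset in sorted(assets, key=lambda a: a["risk"], reverse=True):
    print(f"{asset['name']}: risk score {asset['risk']:.1f}")
```

In this toy inventory the HR records database outranks the public website even though the website is more likely to be attacked, because the impact of losing personnel records dominates the score.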

MeriTalk: How does Palo Alto Networks Next-Generation Security Platform help to ease these challenges at each phase and accelerate time to value?

RG: We can break this into two categories–there are some requirements that we help agencies fulfill directly with our solutions and others where we strategically partner to complement our core competencies. To address challenges in phases 1 and 2, our solutions deliver visibility down to the application and user level, and we partner with companies like ForeScout and Tanium to enable agencies to fulfill the spectrum of required capabilities. In phase 3, we empower users to address the network boundary protection requirements. Finally, we have strong partners, such as VMware, that enable us to collaboratively address phase 4 focus on network segmentation and software-defined networking, so that our security capabilities can be applied in a better managed environment.

MeriTalk: The original CDM blanket purchase agreement is set to expire next summer. How can agencies accelerate their implementations and avoid pitfalls to ensure that they have contracted all phases in time for the deadline?

RG: Just recently, we have seen a move toward GSA’s increased use of special item numbers (SINs). SINs make it easier to acquire and leverage the products held on CDM (tools that agencies would want to buy and deploy). This reduces the administrative burden and accelerates acquisition and deployment–a procurement move that will hopefully get CDM tools into the hands of operators faster.

MeriTalk: How might the president’s cybersecurity executive order issued last month impact the future of CDM or the likeliness of its extension?

RG: I see the executive order as a continuity of policy. As discussed earlier, this EO is a reflection of the fact that cybersecurity policies have developed in a linear manner and in a nonpartisan way. It reinforces the responsibilities of agencies to secure their own networks by moving toward implementation of the NIST framework. CDM should help with that risk-based approach to securing their networks and high-value assets.

MeriTalk: How can CDM be better integrated with mobile and cloud environments? How can Palo Alto Networks solutions help to address this growing requirement?

RG: The core initial requirements within CDM are to know who and what is on your network and then protect your network boundaries. These philosophies should be applied regardless of how your network is configured–whether your data is stored in a data center or the cloud. We must tailor security solutions to this evolving environment. Agencies are going to need the same types of security to protect high-value assets regardless of where their data resides. Palo Alto Networks understands and is focused on this mission. We are also focused on prevention and stopping successful cyberattacks, an approach that is a core part of our security offerings. Our Next-Generation Security Platform employs a prevention-based approach that automatically stops threats across the attack life cycle–whether these threats are at the endpoint or in the cloud.
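The “know who and what is on your network” requirement boils down to continuously comparing an authorized inventory against what is actually observed. A minimal sketch of that comparison follows; the device identifiers are hypothetical, and a real CDM deployment would draw both sets from live discovery and asset-management tooling rather than hard-coded values.

```python
# Toy CDM phase-1 style check: flag devices seen on the network that are
# not in the authorized inventory, and inventory entries that never appear.
# The MAC-style identifiers below are invented for illustration.
authorized = {"aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02", "aa:bb:cc:00:00:03"}
observed = {"aa:bb:cc:00:00:01", "aa:bb:cc:00:00:03", "de:ad:be:ef:00:99"}

unknown = observed - authorized   # on the wire, but not in inventory
missing = authorized - observed   # in inventory, but never seen

print("Unknown devices to investigate:", sorted(unknown))
print("Inventory entries not observed:", sorted(missing))
```

The `unknown` set is the security-relevant output: each entry is a device that should either be enrolled or removed from the network.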

MeriTalk: What considerations should agencies weigh carefully when looking at solutions and partners for their CDM journey?

RG: One of the problems in the security industry is focusing narrowly on a solution that addresses a particular requirement or new attack vector without considering how that solution integrates into broader requirements. Customers should ask their vendors from the beginning: How does this solution solve the distinct problem I’m looking to address? And, as important, how does it complement my other solutions to deliver a more secure environment?

MeriTalk: One of the biggest challenges that agencies report about CDM is that there is not a “one-size-fits-all” approach. What have you seen in the agencies that you work with, and how is Palo Alto Networks uniquely prepared to work with agencies to deliver the right solution?

RG: There are some aspects of commonality, whether the focus is on CDM or corporate customers. A company like Palo Alto Networks needs to be flexible to meet the unique demands of customer sets–whether it be security in the data center, public cloud, private cloud, hybrid cloud, endpoints, or the network layer. Having a platform that works together is how we meet those goals and identify the best partners to bring in core competencies, including VMware, AWS, Verizon, and Proofpoint. These capabilities are instrumental to delivering the right solution for our customers.

 

Cloud Computing for the Win

How fitting that Cloud Computing (the horse) beat favorite Always Dreaming to the finish line at the Preakness this year, particularly for those of us working in Federal IT.

Just three days earlier, the Modernizing Government Technology (MGT) Act passed the House, signaling continued momentum for agencies working to reduce reliance on outdated systems and move to cloud – the foundation for digital transformation.

We need to stop managing resources piece-by-piece and building systems component-by-component. This approach isn’t fast enough for modern Federal missions, and can’t scale to manage the exponential increase in Federal data. As we look forward to the Internet of Things, machine learning, and artificial intelligence, we know we will quickly generate even more data and will have an even greater need to deploy IT services, faster.

Cloud enables IT as a service. We can put new applications in the hands of Federal teams, faster. IT becomes 100% mission focused. And, as a result, we can deliver improved service to the citizen and new ways to connect to government services, resources, and information.

And, we’re at the gate. The Dell EMC State of IT Trends 2016 Federal study found that Federal IT leaders anticipate cloud will be their most significant investment in 2017. OMB has specified that the 2018 budget requires agencies to focus on transitioning high-risk legacy IT systems to cloud and shared services.

It’s a trifecta of benefits – cost savings, improved operations, and, of course, better security. OMB has shared that, as an example, moving to a cloud-based collaboration solution can bring cost savings ranging from $500,000 per year for smaller agencies to $10 million per year for a larger agency such as the Department of Justice. Lawrence Livermore National Laboratory consolidated its data centers from 63 to 18, using cloud computing to support the lab’s enterprise computing systems and enabling operational improvements through a common dashboard providing data streams to all lab employees. The OPM breach and WannaCry show us the cyber liabilities of outdated IT.

Unlike horse racing, we’ll never reach the finish line – but we have proven cloud success across government. As we modernize, cloud computing will help us win.

By: Steve Harris, Senior Vice President and General Manager, Dell EMC Federal

Time to Get Serious About Federal Government Cybersecurity

It is generally accepted that, as the National Institute of Standards and Technology points out, cybersecurity threats exploit the increased complexity and connectivity of our critical infrastructure systems and can potentially place the nation’s security, economy, and public safety and health at risk. Like financial and reputational risk, cybersecurity risk affects the bottom line of both companies and nation-states. It can drive up costs and impact revenue. It can harm the ability to innovate and to gain and maintain customers, as well as make it difficult to meet the needs of citizens.

To address these risks, President Obama issued Executive Order 13636, “Improving Critical Infrastructure Cybersecurity,” on Feb. 12, 2013. According to the Department of Homeland Security, this executive order directed the executive branch to do five things: develop a technology-neutral voluntary cybersecurity framework; promote and incentivize the adoption of cybersecurity practices; increase the volume, timeliness, and quality of cyber threat information sharing; incorporate strong privacy and civil liberties protections into every initiative to secure our critical infrastructure; and explore the use of existing regulation to promote cybersecurity.

Almost exactly one year later, a cyber intrusion began at the United States Office of Personnel Management. This intrusion went undetected for 13 months. As the Wall Street Journal, U.S. News & World Report and other media reports noted, this intrusion was described by Federal officials as among the largest breaches of government data in the history of the United States. Information targeted in the breach included personally identifiable information, such as Social Security numbers, as well as names, dates and places of birth, and addresses. The hack even involved the theft of detailed security clearance-related background information, including more than 5.6 million sets of fingerprints.

Clearly, EO 13636 was insufficient to prevent a major cybersecurity event.

Less than a month ago, President Trump signed a new executive order, “Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure,” designed to protect American innovation and values. This new executive order, which reflects considerable analysis, opens with four findings: that the executive branch has for too long accepted antiquated and difficult-to-defend IT; that effective risk management involves more than just protecting IT and data currently in place; that known but unmitigated vulnerabilities are among the highest cybersecurity risks faced by executive departments and agencies; and that effective risk management requires agency heads to lead integrated teams of senior executives with expertise in IT, security, budgeting, acquisition, law, privacy, and human resources.

The executive order goes on to explicitly hold agency heads accountable to the president for implementing risk management measures commensurate with the risk and magnitude of the harm that would result from unauthorized access, use, disclosure, disruption, modification, or destruction of IT and data. It also mandates the use of the rigorous and recently revised Framework for Improving Critical Infrastructure Cybersecurity developed by the National Institute of Standards and Technology that EO 13636 deemed voluntary.

Will this new executive order make a difference? The answer may rest in the implementation and enforcement of the order. With parallel progress in both pattern recognition algorithms and microelectronic technology, machine learning and artificial intelligence can likely already bridge the gap between the enormous volume of government intelligence data and people capable of analyzing it, as Jason Matheny, Director of the Intelligence Advanced Research Projects Activity, has forecast. IBM’s Watson, for example, can understand all forms of data, interact naturally with people, and learn and reason at scale. Accordingly, the compromise of even sensitive but unclassified information when analyzed by sophisticated means could enable perpetrators to “connect the dots” and jeopardize national security.

In this environment, will “mistakes” or negligence leading to compromised information be tolerated or will they be dealt with severely? Will agency heads be held accountable or will they get a pass? Will “antiquated and difficult-to-defend IT” be tolerated or will rigorous processes and modern applications, like layered security, limitations within network security, encryption of data at rest and in motion, and policy engines used in conjunction with access restriction and auditing software be mandated, implemented, and audited?

The answers will be revealed over the next weeks and months.

The challenge is clear: a well-thought-out and rigorous policy for Federal government cybersecurity is in place; now it must be implemented and enforced. Time is not on our side; the next hack or the next serious incident due to the negligence of a government employee or contractor could happen tomorrow or the next day. It is time to get serious about Federal government cybersecurity.

The President Proposes…


…and the Congress disposes. So goes the old adage in political science and on Capitol Hill. Congress’ power of the purse, enunciated in Article I of the Constitution, is at the very heart of its authority and role in our democracy. That’s why I’m somewhat amused by all the concern, gnashing of teeth, press articles, and the like about President Trump’s recent release of his FY 2018 budget.

Why do I say that–“somewhat amused”? For several reasons; let me enumerate.

First, I’ve seen this movie before. I assumed the role as Director of the Office of Management, Budget and Planning at the Department of Commerce in the early 1980s, during the Reagan administration. The president’s budget was based on reports from the Grace Commission and proposed the elimination or drastic cutbacks of a number of programs, agencies, and so on. One could lay that list from 35 years ago against this president’s budget and find an almost complete overlap. Reagan was going to abolish the National Endowment for the Humanities, the National Endowment for the Arts, the Appalachian Regional Commission, the…well, you get the idea. As the administration, through the Office of Management and Budget, “instructed” me to reiterate this budget request year after year, I began to refer to it as my Don Quixote list. And those agencies and programs are still around.

Second, in the Federal acquisition and information technology (IT) community, we need to see the tree, not the forest. The tree, in this case, is Federal IT spending. President Trump’s budget proposes to spend $95.8 billion in FY 2018, $1.7 billion more than FY 2017 and $5.4 billion over FY 2016. Priorities remain similar to those in the Obama administration–modernization, cloud computing, cybersecurity, shared services, category management, and digital services to citizens. And the Congress has a history of appropriating more for IT than the administration has requested. So prospects are positive for this community.

Finally, the response to President Trump’s proposal has been…well, less than lukewarm. What have members of Congress had to say about it? Here are some comments:

“I thought Mexico was going to pay for the wall. Why is this in our budget?”

– Rep. Fred Upton, Michigan

“Taking funding away from Louisiana’s coastline is a non-starter.”

– Sen. Bill Cassidy, Louisiana

“It’s just a lot of people who don’t know what the hell is going on in farm country.”

– Sen. Pat Roberts, Kansas

“A lot of Benghazis in the making if we actually implemented the State Department cuts. So, this budget is not going anywhere.”

– Sen. Lindsey Graham, South Carolina

“I do not support proposed cuts to the National Institutes of Health.”

– Sen. Pat Toomey, Pennsylvania

“Many of the proposed cuts to important domestic programs that many Michiganders rely on are, frankly, non-starters.”

– Rep. Fred Upton, Michigan

“Meals on Wheels, even for some of us who are considered to be fiscal hawks, may be a bridge too far.”

– Rep. Mark Meadows, North Carolina

“We know that the president’s budget won’t pass as proposed.”

– Sen. John Cornyn, Texas (also serves as Senate Majority Whip)

“We’ll be taking into account what the president recommended. They will not be determinative.”

– Sen. Majority Leader Mitch McConnell, Kentucky

“The proposed cuts to some Federal programs are not mere shavings, they are rather deep and harmful to my district…”

– Rep. Hal Rogers, Kentucky, former chairman of the House Appropriations Committee


What do all these comments have in common? They are all from members of the president’s own party. Some budget proposals during the Obama administration were deemed dead on arrival. This one, as they said in The Wizard of Oz, is likely:

“…not merely dead, it is really most sincerely dead.”

– Coroner, Munchkinland


Look Who’s MeriTalking: Bob Stevens on Mobile Security Risks

MeriTalk recently spoke with Bob Stevens, vice president of Federal systems for Lookout, about the unique mobile security risks facing the Federal government, where agencies are making progress, where they need to improve, and what they can do to get started. Lookout takes a mobile-first, cloud-first approach to security, creating products for IT administrators, CISOs, and individuals.

MeriTalk: Why do mobile devices present unique security risks and challenges for the Federal government?


Bob Stevens: Mobile devices are designed for consumer use, not Federal agency use. The government has to put a lot of checks and balances in place to ensure security, but many features aren’t available out of the box. Further, the Federal government needs visibility into what exists within its mobile ecosystem. To do this, agencies need encryption, device-specific ID management, and a mobile threat protection solution for continuous monitoring of mobile applications, the device, and network to provide near real-time threat detection.

It’s also important to consider that, when it comes to the Federal government, hackers and other malicious entities targeting intelligence agencies have very different motives and processes than those targeting consumer agencies. And, the wide range of mobile risks is especially concerning for agencies due to the critical data being accessed across their networks.

At the end of the day, Federal employees are individuals, too. Just like you and me, most Federal employees interact with friends and family on social media and stay connected to the world via a mobile device. While Federal data may not be housed on a personal mobile device, an infected device could give hackers the entryway they need to access our nation’s critical data across Federal networks.

MeriTalk: Where have agencies made the most progress?

BS: In a 2015 survey, Lookout found that 40 percent of Federal employees said that rules prohibiting personal smartphone use had little to no impact on their behavior. Before 2016, mobile wasn’t even among DHS’s top priorities; now it’s in the top three for the department.

Rapid adoption and innovation in mobile technology has forced Federal agencies to recognize that the shift toward mobility has a direct and immediate impact on their organizations, and that mobile security should be a priority.

Each of the steps being taken to address mobile security is a sign of progress for the Federal government. Just last week, President Trump issued his executive order on Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure, which among several provisions calls for agencies to adopt the National Institute of Standards and Technology (NIST) Cybersecurity Framework. Furthermore, the Department of Homeland Security (DHS) Science and Technology Directorate recently released a study on Mobile Device Security. The report outlined the risks mobile devices pose to the security of the government.

One Federal entity that is leading the charge when it comes to mobile security is the U.S. House of Representatives. CISO John Ramsey is forward looking, understands the threat of the mobile ecosystem, and is taking action. The House, for example, is focused on deploying real-time diagnostics on the devices themselves.

MeriTalk: What strategies and tactics do you advise to help agencies evolve beyond simple mobile device management?

BS: Agencies need clear visibility into the mobile ecosystem. They need to know every device that is connecting to the network and know the apps on those devices. They also need predictive analytics on the devices themselves.

For true mobile protection, mobile device management (MDM) is critical, but not all encompassing. Many agencies believe their MDM solution will protect them from malicious applications. However, because users can “sideload” apps onto their phone, we consistently see malicious applications appear on our enterprise customers’ devices.

To truly protect against the wide array of mobile threats, agencies need a large portfolio of mobile defense software. Mobile application management (MAM) solutions, for example, give Federal agencies an added layer of security for devices that are Federally managed.

MeriTalk: What best practices can you offer for continuously monitoring for threats and providing enterprisewide visibility into threat intelligence?

BS: While many of the same components of risk that affect PCs also apply to mobile endpoints, mobility has introduced a new generation of risk. Simply extending current PC security controls to your mobile fleet is not a viable option. Enterprise risk management needs to evolve to address mobile risks, and security professionals must architect mobile-specific security.

When it comes to threat intelligence, it is critical that agencies have insight into both external and internal risks. Lookout developed a unique program for mobile threat intelligence. The success of Lookout’s personal and enterprise endpoint products has given Lookout visibility into more than 100 million mobile devices worldwide. Every month millions of devices in over 150 countries send security telemetry to the Lookout Security Cloud, ensuring that Lookout can track evolving threat actors and continue to lead the industry in novel threat discoveries such as the Pegasus spyware.

In terms of providing enterprisewide visibility into threat intelligence, most organizations find that they have very limited visibility into most mobile risks. This is because many CISOs and security teams fail to get visibility into employee behaviors and device configurations, which is the first step to enabling mobile security. Agencies and organizations that are able to gain visibility into the entire spectrum of mobile risks facing their infrastructures will be able to foster an environment that enables the safe and efficient use of mobile technology.

Another best practice when it comes to providing enterprisewide visibility is enterprise mobility management (EMM). Specifically for internal mobile security risks, EMM addresses the vulnerabilities associated with personal devices. Simply put, EMM solutions have the ability to grant and revoke access to employee devices. When used together, threat protection and EMM solutions provide the necessary defense tools needed for managing risks inside and outside of the agency.

MeriTalk: How do machine learning and contextual analytics increasingly factor into the ability to protect devices from malicious applications? Where and how is Lookout delivering innovation on this front?

BS: Mobile security is a constant, fast-moving battle between the good guys and the bad guys. New threats appear all the time and enterprises need to be able to cover them all.

Modern threat management is all about data. It’s a big data problem. The bigger the data set, the more effective a solution is at identifying and protecting against threats.

Lookout has the biggest data set in mobile security as a result of our consumer user base, which is generated from a network of over 100 million devices. Those devices acquire 90,000 apps every day, contributing to a corpus of over 40 million apps, and enabling Lookout to auto-convict over 5,000 new pieces of malware each day. Our teams leverage machine learning to be able to quickly dive into our massive data set and pinpoint potential attacks, malicious applications, and risks at scale.

MeriTalk: What is the CISO’s role in the evolving digital ecosystem and the IoT?

BS: We understand that the CISO’s job is not an easy one. From the outside looking in, the CISO’s role appears to be a very strategic juggling act. With the recent explosion in mobile technology, another ball has been thrown into the mix.

In terms of evolving the digital ecosystem and IoT, the CISO should question the security protocols currently in use. By continuously taking a step back to understand the interconnectivity of attacks, the CISO can push for new approaches to security solutions. By providing security teams with a holistic sense of what they’re facing, they can develop improved strategies for securing the digital ecosystem and IoT.

MeriTalk: What emerging threats and dangers should CIOs and CISOs be paying attention to today to avoid larger issues down the road?

BS: How quickly network infrastructure is changing and will continue to change, for one. Mobile devices—smartphones and tablets—have completely changed the security assumptions that have been baked into most enterprise networks. Instead of all of your data being inside the firewall on tightly controlled servers and PCs, it’s now distributed between cloud services and mobile devices that don’t typically have the same security controls as their on-premises counterparts.

Look Who’s MeriTalking: Steve Harris of Dell EMC on MGT, DCOI

MeriTalk sat down with Steve Harris, GM/Public Sector at Dell EMC, to discuss MGT, DCOI, and other modernization efforts.

MeriTalk: The Modernizing Government Technology (MGT) Act has just been reintroduced. What are your thoughts?

Steve Harris: We applaud Congressman [Will] Hurd for introducing the Modernizing Government Technology Act. Under MGT, agencies save by streamlining IT systems, replacing legacy products, and transitioning to cloud computing. Those savings can then be placed into a working capital fund that can be accessed for up to three years for further modernization efforts.

The Federal government needs modern IT systems that can adapt to better serve constituents. It’s encouraging to see bipartisan support. For example, as part of the effort to protect Federal networks, the cybersecurity executive order–which places specific emphasis on modernizing Federal IT systems–was just signed. In this very contentious environment, modernizing our Federal IT systems so that we can increase the efficiency of government is one thing everyone agrees on.

MT: In the interim, what advice are you giving Federal CIOs as they evaluate their current and future data center modernization strategies?


SH: CIOs should establish a strong framework to enable digital transformation.

Agency CIOs are still struggling at the crossroads of mandate and mission. They’re working to modernize data centers, but still struggling to maintain legacy systems that require maintenance and support and consume a disproportionate part of their IT budget. Re-platforming these systems is a critical piece. Open platforms will drive down the costs of maintaining infrastructure moving forward. To accelerate modernization, CIOs need two things: a comprehensive inventory of their IT assets and strong top-level communication and collaboration between mission functions and the IT teams.

MT: From your conversations with IT leaders, what are Fed CIOs’ top three priorities for modernization? Are you seeing any shift in these priorities? 

SH: Security, security, security. We should see new momentum now that the cyber EO has been signed. Modernizing applications and endpoints is critical. Many vulnerabilities lie in the user community, legacy systems, and application endpoints. There is also a real sense of urgency in the shift to move faster to a modern cloud strategy and other modern interfaces–agency IT leaders are focused on cloud, mobility, data analytics, and other technologies to deliver improved mission support and new efficiencies.

Most CIOs understand that digital transformation and modernization are inextricably linked. But, CIOs have a tough job in today’s government. They want to be change agents, but it’s only recently that they’ve been given new opportunity for progress–FITARA, specifically. FITARA empowers CIOs with new responsibility for mission goals, and this is certainly driving a shift in how priorities are addressed.

MT: What is the next frontier in terms of modernization? 

SH: We are quickly moving to a state where ubiquitous mobility and connectivity are the norm, and cloud computing is the foundational core of any IT infrastructure. Agencies are already equipping themselves with flexible, software-based solutions to support tomorrow’s technology innovation while still meeting today’s mission goals. Realizing digital transformation is the next frontier as agencies continue to modernize.

MT: Are you seeing any new use cases for virtualization and flash emerging? If so, can you share your observations?

SH: Flash technology offers a great alternative for agencies that have multiple classification levels of sensitive data, some of which need to remain on prem. It’s agile, efficient, and fast. And, it has a significant impact on energy reduction in the data center. Flash storage will effectively support agencies that need to quickly access data during urgent, mission-critical decision-making activities–so DoD is a prime example, but also agencies like the Department of State. We will see more Feds jumping on the flash bandwagon as they continue to address the need to meet Data Center Optimization Initiative (DCOI) mandates.

MT: Where do you see or anticipate the greatest traction for software-defined storage?

SH: Explosive data growth is overwhelming many organizations. This reality is especially profound in government given the broad scope of the mission. Software-defined storage (SDS) addresses data challenges as it offers elasticity, scale, and simplicity–with the added benefit of cost savings. Agencies working to meet the National Archives and Records Administration’s (NARA) record management mandates or evolving Electronic Health Records (EHR) requirements would be effectively served by SDS.

This said, SDS will gain the most traction supporting data center modernization and helping agencies meet the ambitious DCOI milestones.

MT: Make one prediction for the future of the hybrid cloud in Federal IT over the next three years.

SH: Integrated hybrid cloud will become the No. 1 spending priority for agencies over the next few years. We know cloud services are one of agencies’ top spending priorities. I believe hybrid cloud will quickly jump to the top of the list, given the need to modernize at a rapid pace, and the associated efficiency, cost savings, and speed benefits.


Meeting DCOI Goals with a Software-Defined Data Center

The Federal Data Center Consolidation Initiative (FDCCI) was intended to, in part, “reduce the cost of data center hardware, software, and operations […] and shift IT investments to more efficient computing platforms and technologies.” But it didn’t work – new data center construction continued. Six years after the FDCCI, the Data Center Optimization Initiative (DCOI) was introduced to impose tighter regulations, and has forced agencies to focus on meeting specific metrics to improve data center utilization and efficiency, and, ultimately, save money.

The primary DCOI deadline of September 30, 2018 is fast approaching, and agencies must do all of the following before that date:

  • Meet target metrics for energy metering, power usage effectiveness (PUE), virtualization, server utilization and automated monitoring, and facility utilization;
  • Install advanced energy meters;
  • Use automated monitoring and operations – no more manual collection and reporting;
  • Reduce data center costs by at least 25 percent, relative to fiscal year 2016 spending; and
  • Close at least 25 percent of tiered and 60 percent of non-tiered data centers.
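The PUE metric in the list above is a simple ratio: total facility energy divided by the energy delivered to IT equipment, with 1.0 as the theoretical ideal. As a rough illustration (the meter readings and the 1.5 target in this sketch are assumptions, not official DCOI figures), an agency could check a facility like this:

```python
# Hypothetical sketch: compute power usage effectiveness (PUE) from metered
# energy readings and check it against an assumed target of 1.5.
# All figures below are illustrative, not real agency data.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy (ideal is 1.0)."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

def meets_target(total_kwh: float, it_kwh: float, target: float = 1.5) -> bool:
    """True if the facility's PUE is at or below the assumed target."""
    return pue(total_kwh, it_kwh) <= target

# Example with made-up meter readings:
print(round(pue(1_800_000, 1_200_000), 2))  # 1.5
print(meets_target(1_800_000, 1_200_000))   # True
print(meets_target(2_000_000, 1_200_000))   # False
```

Anything above the target signals that too large a share of energy is going to cooling, lighting, and power distribution rather than to computing, which is exactly what the advanced metering and automated monitoring requirements are meant to surface.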

As Federal agencies look at the challenge in front of them – installing new technology, optimizing infrastructure, improving performance, and reducing costs – how can they quickly achieve all of these goals?

One path forward is software-defined data centers (SDDCs). SDDCs have virtualized hardware components, deliver infrastructure-as-a-service, and are software-based. All of these components allow agencies to be flexible in the face of unknown, emerging data center demands. Federal agencies have repeatedly mentioned scalability as crucial to successfully modernizing IT – a recent Dell EMC survey found that 90 percent of Federal IT decision-makers want an IT architecture that allows them to tailor to specific application and workload needs.

That same survey found that most agencies are moving towards SDDCs. Sixty-four percent of Federal agencies said they had deployed software-defined solutions, 85 percent reported progress adopting SDDCs, and 56 percent reported being more than halfway done adopting an SDDC.

And, Federal IT managers reported the top three advantages of SDDCs are flexibility, agility, and efficiency. All of those characteristics put agencies in the best position to optimize and modernize their data centers against DCOI goals, support agency missions, and drive towards digital transformation.

As Federal agencies look to hit the DCOI bottom line, SDDCs seem to be the best path forward – increasing flexibility and efficiency, improving performance, and, most importantly, saving dollars. It’s a Federal IT win-win.

Learn more:  http://i.dell.com/sites/doccontent/shared-content/campaigns/en/Documents/Future_of_data_center.pdf

By: Steve Harris, Senior Vice President and General Manager, Dell EMC Federal

Are You Rational? Prioritizing Applications to Move Modernization Forward

As more and more Federal agencies turn towards cloud as the most viable modernization method – in a recent MeriTalk study, 76 percent of Federal IT cloud decision-makers said they are evaluating cloud solutions as an integral part of their overall IT strategy – agencies have to develop a clear plan on how to consolidate data centers and make the move to cloud.

As an immediate next step, Federal agencies require an understanding of which applications are ready – and able – to move from data centers into the cloud and operate successfully once in the cloud environment. Ninety-two percent of Federal IT managers say it’s urgent for their agency to modernize legacy applications, with the top drivers being security issues, time required to manage and/or maintain systems, and inflexibility and integration issues.

Agency CIOs must determine how infrastructure and applications interact relative to mission priorities. What’s vital, what’s duplicative, what can be retired? This process can breathe new life into the application portfolio – providing improved efficiency and security before the move to the cloud.

So how can agencies create and implement effective application rationalization plans for their upgraded systems? A few thoughts:

  • Create a consolidation roadmap – establishing clear guidelines from the beginning helps agencies to stick to their goals and proceed in a timely manner.
  • Create an application inventory – this helps agencies see what applications they have, determine which ones are necessary, and identify what applications might be duplicates or are no longer necessary. Agencies can also rank applications based on mission value.
  • Identify security and performance requirements for each application – how will each application fare in a transfer? Are certain apps worth re-building, or is it better to eliminate them entirely?
  • Map out application connections – how are certain applications connected? Do any applications share data? This ensures eliminating one app does not interfere with the execution of another app.
  • Consider an automated tools-based approach – can speed the application rationalization process by 50 percent, using fewer resources than manual techniques.
  • Look for quick wins – identify opportunities to build modernized apps quickly, driving mission value and accelerating your success.
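The inventory and connection-mapping steps above can be sketched in a few lines. This is a hypothetical illustration (the application names and dependencies are invented), but it shows the core check: an application is a retirement candidate only when nothing else depends on it:

```python
# Minimal sketch of the inventory and dependency-mapping steps.
# Application names and their dependencies are hypothetical.

# Inventory: each app maps to the set of apps it depends on (e.g., shares data with).
inventory = {
    "payroll":    {"hr_db"},
    "hr_db":      set(),
    "legacy_rpt": {"hr_db"},
    "old_portal": set(),      # nothing depends on it: a retirement candidate
}

def dependents_of(app: str) -> set:
    """Apps that would break if `app` were retired."""
    return {name for name, deps in inventory.items() if app in deps}

def safe_to_retire(app: str) -> bool:
    """An app is safe to retire only if no other app depends on it."""
    return not dependents_of(app)

print(sorted(dependents_of("hr_db")))  # ['legacy_rpt', 'payroll']
print(safe_to_retire("old_portal"))    # True
print(safe_to_retire("hr_db"))         # False
```

In practice the inventory would come from the automated discovery tools mentioned above, and "depends on" would cover shared databases, APIs, and data feeds rather than a hand-maintained dictionary.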

Also, Federal IT managers estimated that 55 percent of their current legacy applications could be successfully modernized through re-platforming the existing application, leveraging architecture-driven modernization, or by remediating the existing application to extend its useful life.

Investing in application rationalization as an early step puts modernization efforts on the right path. And, with Rep. Will Hurd (R-Texas) and Rep. Gerry Connolly (D-Va.) recently re-introducing the Modernizing Government Technology (MGT) Act, there should be more funding and support for these efforts.

Learn more about how Dell EMC can help with application rationalization:

By: Cameron Chehreh, Chief Operating Officer, Chief Technology Officer & VP, Dell EMC Federal

Improving FITARA Scores:  Follow the Leaders

The December 2016 FITARA scorecard revealed that a number of agencies continue to rely on legacy IT systems, wasting billions each year and making it increasingly difficult to secure sensitive government information.

Many received Ds and Cs, and not one agency scored an overall A. Additionally, despite significant focus during the past several years, nearly 65 percent of agencies received a C, D, or F in the data center consolidation category.

These low grades reflect the complexity of the modernization challenge. Data center consolidation and optimization is a top priority for every Federal chief information officer – to reduce costs, improve efficiency, and keep data and systems secure.

The report cards bring new transparency, and provide the best, fastest way for agencies to learn from each other, evaluate their current plans, and move their IT forward. The Department of Homeland Security (DHS), for example, performed well, receiving an A in the data center consolidation category. Luke McCormack, the agency’s former CIO, says he expects the agency to have fewer than 25 data centers by fiscal year 2019. Rep. Gerry Connolly (D-Va.), also noted that DHS surpassed its data center consolidation savings goal, cutting $248 million.

Soraya Correa, chief procurement officer at DHS, was quoted in a recent article saying agencies must change the way they think about their workforce by modifying how personnel are trained and removing silos around different positions.

“If you really want to change the culture, you’ve got to change the way you train people,” Correa said, explaining that IT, procurement, and human resources personnel should all be receiving training specific to their particular positions, rather than one blanket onboarding process for all employees.

As another example, several of NASA’s scores jumped from F to C-plus on the December scorecard. Renee Wynn, NASA’s CIO, said using an internal Business Service Assessment (BSA) was a key factor in the improved scores. The BSA helped the agency restructure IT governance, update program management, and improve CIO oversight for major IT investments.

Wynn said the BSA yielded seven recommendations for the agency to streamline internal roles and responsibilities. One of the ideas that had the most profound effect was creating an IT Council made up of senior agency officials and representatives from the ten centers that deliberate on IT investments. This strategy investment board ultimately changed the way NASA managed IT by providing recommendations and an overarching direction for the agency’s IT acquisitions – eliminating the prior, more piecemeal approach.

And that is what FITARA is all about. We can’t modernize without a hard focus on the mission, which comes from increased collaboration within each agency and between the mission owners and IT.

The next scorecard is expected out in May. We are all looking for signs of progress across more agencies, and even more examples of how we can move forward together.

By: Steve Harris, Senior Vice President and General Manager, Dell EMC Federal
