Overnight Cybersecurity: Obama to Review Encryption Bill
President Obama’s briefing means the bill will not be released this week, as Sen. Richard Burr hoped. Meanwhile, the White House on Thursday denied reports that it will not offer its support to the bill.
Will Federal Data Center Construction Freeze Benefit Colocation Providers?
Will the latest White House freeze on data center expansion and construction by Federal agencies accelerate colocation and cloud deployments? In February 2011, the “Cloud First” initiative required Federal agencies to evaluate their technology sourcing strategies so that cloud computing options were fully considered. It stressed the importance of each Federal agency migrating the majority of its data to cloud-based servers by 2015.
Government Primes Federal Cybersecurity Reporting Rules for Insurers | Healthcare Dive
The Federal government’s Office of Personnel Management has announced plans to introduce new data breach reporting rules for health insurers that cover Federal employees, according to a Nextgov report. Director Beth Cobert argued that, given the breaches at OPM and other insurers and providers, the government and its partners must coordinate efforts to keep their data secure. The rules echo draft guidelines issued by the White House last August, Nextgov notes, that aim to standardize cybersecurity incident reporting among contractors that store Federal data on third-party systems.
MIT Open Data Portal: A One-Stop Shop for Federal Government Data?
Combing through Federal data has typically been a daunting affair. Data USA aims to change that: the site aggregates Federal open data from multiple sources and displays it in interactive visuals — colorful charts, maps, profiles, and even a few pieces of data-based journalism. Unlike scores of citizen analytics sites before it, Data USA embraces the role of data curator and — with minimal nudges — guides its visitors to create actionable data insights.
Vast Majority of Federal IT Professionals Feel Their Agencies are at Risk
Ninety percent of IT professionals in the Federal government feel their organizations are vulnerable to a cybersecurity attack, according to a recent report by Vormetric. The numbers are disconcertingly high considering they come from professionals tasked with protecting the confidential information of millions of Americans as well as classified information from certain Federal programs and policies. Despite those high numbers, nearly 60 percent of responding government IT professionals believe their network defenses are “very” effective at safeguarding data, a figure the report notes is notably more optimistic than among their private-sector counterparts; the U.S. average is 53 percent.
The Prez really gets the cyber problem – that’s why he jacked the FY17 cyber budget to $19 billion. That $5 billion hike was driven by OPM, the Cyber Sprint, and terrorist threats.
Just how tough has it become for Federal agencies to find skilled technical talent? It’s become so tough, in fact, that the National Security Agency is collecting resumes from “former civilian affiliates” who have the necessary skills, experience, and security clearance to help the agency “augment the existing work force on high priority projects or programs.”
Federated Identity Pilot
My Gaithersburg, Md., listening post has picked up signals coming from the National Strategy for Trusted Identities in Cyberspace that a new pilot program is underway to demonstrate the use of federated online identity technologies for use by hospitals and patients.
“Currently, patients and providers need to obtain new identity credentials to access health information at different organizations,” according to the Federal funding opportunity announcement intercepted by the Situation Report.
“Technologies exist to streamline authentication to web portals today but mostly exist to service one organization only. The goal for this project is for hospital systems to work with other regional health systems and provider groups to operationalize the acceptance of federated identity and operate the pilot for at least six months. The use cases for this pilot would involve a federated credential solution that allows patients and health care providers to use the same credential with at least two healthcare organizations.”
Applicants must apply via Grants.gov by June 1. NIST said it expects to start the $750,000 to $1 million pilot project by Oct. 1.
Insider Threat Updates
My E-Ring listening post at the Pentagon reports that the Defense Department continues to make significant progress on its insider threat detection program and the intelligence community’s new continuous evaluation (CE) effort. Reports indicate that DOD has expanded its CE program to 225,000 employees, and is on track to reach 500,000 personnel this year.
But the overall Federal effort to establish Insider Threat Programs at the agency level by December 2016 is at risk, according to the latest cross-agency priority goal quarterly update, obtained by the Situation Report. The National Insider Threat Task Force has missed all three key goal dates for establishing programs at the agency level.
“Most of the executive branch departments and agencies have accomplished program establishment tasks. Many departments and agencies are discovering challenges with issues such as organizational culture, legal questions, and resource identification, to name a few,” the report states.
In addition, it’s still taking much longer than it should to obtain a security clearance. For example, the 82,186 secret-level security clearances initiated during the first quarter of 2016 took an average of 116 days—that’s 42 days longer than the intelligence community would like. The 17,100 initial top-secret clearance requests took an average of 203 days to complete—89 days longer than the goal set by officials.
Migrant Intelligence
The real Donald Trump (@realDonaldTrump) will be happy to learn that my Oval Office listening post has picked up strong signals that the National Security Council is taking steps to address the intelligence challenges related to screening migrants. Monte Hawkins, an intelligence officer and NSC staffer, has been tapped to be the senior adviser for migrant screening and vetting on the NSC staff.
We all agree cloud consumption is inherently more efficient – helping agencies shift from CapEx to OpEx – and more flexible – enabling “anything as a service,” where agencies pay for what they use vs. what they project.
The plan is to use cloud to speed the Federal modernization path – a key goal considering just 32 percent of Federal IT managers anticipate their legacy applications will be able to meet mission needs in five years.
That said, Federal cloud transitions are moving slower than expected due to a series of challenges – with security, data governance, and procurement at the top of the list.
Fortunately, GSA is taking steps to address priority #1 – security – with plans to introduce a major reform effort to the FedRAMP program.
With cloud procurement, there are hundreds of Federal contracts and many factors to take into consideration. So – what’s the best path? Upfront planning and staying cognizant of all steps involved in the cloud transition will provide agencies with an improved pathway to the cloud.
The first question: What problems are you trying to address? Are there mandates unique to your agency? Geographic data storage requirements? Do you plan to share the technology or dedicate it to a single organization?
Identify what you need and choose the right cloud model. Infrastructure-as-a-Service (IaaS) is ideal for agencies that need to directly maintain the infrastructure and data. Platform-as-a-Service (PaaS) is ideal for teams that will be developing and delivering new applications or want to accelerate deployment of new capabilities.
Remember: One cloud does not fit all. Agencies do not have to pick “one cloud” (i.e., public or private). Agencies can adopt a hybrid cloud approach, allowing them to direct workloads as appropriate to commercial or private clouds.
FITARA means Federal CIOs need improved transparency and visibility into their IT environment, and a cloud move can be a key step. ViON works with agencies to deliver a portal through which they can see their full environment, automate service provisioning, and access real-time consumption, funding, and billing data.
On the procurement side, ViON is a prime on key Federal contracts – GSA, NIH CIO, and NASA SEWP – simplifying the cloud procurement process. We eliminate the need for RFIs and RFPs, as we’ve already completed vendor evaluations. As a result, we can significantly simplify and expedite the procurement process.
We take on the upfront investment risk of hardware and software purchases, eliminating the burden on the agency and smoothing the cloud transition so agencies can lower cost, reduce risk, and quickly scale to meet new mission requirements. That’s the real silver lining.
Federal IT leaders have the best possible cloud intentions – from Cloud First and Shared First to FDCCI and FedRAMP. And, there is motivation. By most accounts, 80 percent of Federal IT dollars are currently spent on life support for legacy systems – an equation that needs to change.
This month GAO and OMB provided additional motivation.
GAO reported 22 of 24 agencies show a “lack of progress” in achieving OMB’s goal to close at least 40 percent of all non-core data centers by the end of 2015.
OMB released a new Data Center Optimization Initiative (DCOI) on March 3, superseding FDCCI and taking further steps in the push to transition to more efficient infrastructure, such as cloud services and inter-agency shared services. Beginning 180 days after issuance, the memorandum directs agencies not to budget funds or resources for a new data center, or to significantly expand an existing one, without approval from the OMB Office of the Federal CIO.
Laying out the path forward, OMB says agencies will evaluate options for the consolidation and closure of existing data centers by (in order of priority):
Transitioning to provisioned services, including configurable and flexible technology such as Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), to the furthest practicable extent, consistent with the Cloud First policy.
Migrating to inter-agency shared services or co-located data centers.
Migrating to better optimized data centers within the agency’s data center inventory.
Good intentions aside, the lack of progress in getting IT workloads into the cloud raises red flags about agencies’ ability to meet data center consolidation goals.
There are legitimate concerns – from cost uncertainty to CSP stability to data security/availability to migration issues. The big question – how can agencies overcome the headwinds and accelerate the cloud transition?
At ViON, we take a straightforward “business model” approach, packaging infrastructure with a proprietary cloud financial model and 24/7/365 enterprise class support, professional services, and managed services.
We know there are existing applications that will not work well (or at all) in a commodity infrastructure public cloud. We also recognize it will take significant time and resources to re-platform these applications. So, we work with agencies to overcome these challenges and enable a cloud business model now, for all applications, not just the small subset that can survive on commodity cloud.
Technology is delivered via a pre-integrated, shared infrastructure pool, providing access to needed resources without over-buying – you pay for what you use.
At the core, agencies need a business model approach that reduces the risk of the cloud transition and ensures they achieve desired efficiency goals – so they can innovate. That’s the real silver lining. They don’t have to wait until an application is re-architected; they can limit risk and position themselves for success in the cloud today.
After 45 years of Federal service, it still amazes me that cybersecurity is treated as an add-on, focused on preventing access to the network, rather than as an integrated part of the foundation, working at every level of our systems to protect the data.
We have said for decades that security can’t just be a bolt-on feature and that it needs to be considered as part of the architecture and foundation. Part of the problem is that security implementation is still way too hard. It is still treated by vendors as an option that has to be turned on and integrated, and the default is to install software without security turned on. This is just plain wrong.
This may have made sense years ago, when implementing full security for classified systems could consume 20 percent to 30 percent of your resources, which might not work for high-performance systems. Nowadays, the security impact tends to be 2 percent or 3 percent, and no CIO is going to risk their career by turning it off without a lot of thought.
We have also discovered that security is needed in virtually all systems to protect privacy, financial, and commercial data, not just classified government data. Systems should be installed with security options turned on by default. CIOs and system administrators should have to choose to turn off security.
Vendors also need to simplify security implementation, so that it is much more plug-and-play, based on major security categories. Start with options for highly classified Security Technical Implementation Guides (STIG), then government restricted, financial, and health care/privacy, and other; and last, give an option for NO security (not recommended). That is around five options, not thousands of variables.
Installations should start by looking for the Identity and Access Management (IDAM) software and then inheriting security settings from it. Then move on to Role Based Access Management/Attribute Based Access Controls (RBAC/ABAC). All systems should have Privileged User Management Access (PUMA) controls in place, and database administrators should be able to see data only by exception. We have to stop relying on edge protection at the network level and build security in at every level.
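The "see data only by exception" principle above is, at its core, default-deny role-based access control. Here is a toy sketch of that idea in Python; the roles and permission names are hypothetical illustrations, not any particular IDAM or RBAC product.

```python
# Toy RBAC sketch: roles map to explicit permission sets, and access is
# denied unless a role's set contains the requested permission.
# Role and permission names here are hypothetical.
ROLE_PERMISSIONS = {
    "analyst": {"read:reports"},
    "auditor": {"read:reports", "read:audit_log"},
    # Per the "by exception" principle, DBAs manage schemas but do not
    # get blanket read access to the data itself.
    "dba": {"admin:schema"},
}

def is_allowed(role, permission):
    """Default-deny check: unknown roles and permissions are refused."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

Under this model, `is_allowed("dba", "read:customer_data")` is false by default; a DBA would need an explicit, auditable exception granted before seeing the data.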
Data should be protected at rest as well as in flight by encryption and dynamic Virtual Private Networks (VPN) at every level. Why do we continue to treat the network like an open party line that any device can listen in to? If the packet is not for you, then only the packet header should be visible, not all the data.
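As a small, concrete example of protecting data in flight, Python's standard `ssl` module can enforce certificate checking and a modern TLS floor. This is a minimal sketch of one sensible policy, not a complete hardening guide.

```python
import ssl

# Build a client-side TLS context with safe defaults: certificates are
# verified, hostnames are checked, and anything older than TLS 1.2 is
# refused. Wrap sockets with this context so data in flight is encrypted.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname
```

The point is the default posture: `create_default_context()` already turns verification on, so the administrator's job is to avoid turning it off, exactly the "secure by default" stance argued for above.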
We also need continuous security monitoring that validates that the security is on and that it continuously meets security standards, not just when it is installed.
Security should also be baked in at the chip level, as Intel is starting to do. To make our systems secure and protect our digital futures we need to bake security in at all levels, simplify the installation and maintenance, and protect our data by default. Why do we build security so that it is so complex that almost no one can get it right? Remember the DVRs that constantly blinked the wrong time because few average people could figure it out? Now they install correctly by default and take the time from the power line. What we need is security for dummies, so that we can all get it right.
Let’s fix browsers so they are secure and can’t let in malware. Let’s stop the insider threat by limiting access and constantly monitoring usage. Let’s secure the Internet of Things (IoT) by building security into everything, including sensors, peripherals, devices, appliances, and vehicles.
My plea to vendors is “enough already”: Stop making security hard to implement and build it into the foundation of your systems. To government and commercial CIOs, start holding vendors accountable. If they do not build security into the DNA of their products, find a different vendor.
Always consider the cost of installing, maintaining, and operating your security as part of the total cost of ownership. How long are you going to keep your job if you have a major data breach? Don’t skimp on security, demand it as a starting point. It is time we all got serious about cybersecurity.
About the Author
Kenneth M. Ritchhart is vice president of Business Development & Strategic Planning at Oracle Public Sector.
MIT Team Wins $75M in Federal Funds for Research in High-Tech Fabrics
A consortium of colleges and businesses led by the Massachusetts Institute of Technology has won a national competition to host a novel federally funded research program to turn clothing fibers and fabrics into wearable electronic devices, officials are expected to announce Friday. Clothing fibers could be designed to change color, monitor health, or even store energy.
Google ‘Mic Drop’ April Fools’ Gag Goes Horribly Wrong
Google ended up with egg on its face after this year’s April Fools’ joke caused some Gmail users to insult contacts and, some claimed, lose employment opportunities. The “joke” was an actual feature that Google added to Gmail, called “Mic Drop.” An orange button next to the standard blue “send” button allowed people to send their email with an animated image of a Minions character dropping a microphone. Outraged Gmail users who use the service for professional purposes flooded Google’s product forums to complain about having accidentally clicked the button on important work emails.
Here’s Snapchat’s Latest Report on Government Requests For Data
As it continues to grow, ephemeral messaging app Snapchat is receiving an increasing number of government requests for user data. On Tuesday, the Venice, Calif., company published its latest report on these requests, which it does every six months. Between July 1 and Dec. 31, 2015, Snapchat received a total of 862 criminal legal requests from U.S. government entities, up from 761 in the six months before that.
Tech Giants, Government Struggle with Online Speech Policies
As social media outlets increasingly become the favorite channels for terrorist groups to spread messages of violence and recruit new members, the Internet companies that maintain those services are in a tough spot. Companies born on the Web like Google and Facebook promote an ethos of free speech, but at the same time recognize the dangers of terrorists, criminals, and other bad actors co-opting their platforms in service of a violent ideology or illegal activities.
Can the IRS Protect Taxpayer Data? Government Accountability Office Raises Concerns
Just in time for tax season, the Government Accountability Office is warning that weak financial controls at the Internal Revenue Service leave taxpayer information at risk.
In a report released this week to IRS Commissioner John Koskinen, the GAO noted the agency’s progress in information security but said “weaknesses in the controls limited their effectiveness in protecting the confidentiality, integrity, and availability of financial and sensitive taxpayer data.”
My Capitol Hill listening post has picked up several encrypted messages from the Government Accountability Office suggesting that the Federal Information Technology Acquisition Reform Act may be widening some major fault lines across government. While the law was designed to strengthen the role of the chief information officer, the Situation Report has picked up signals that some agency CIOs continue to be bulldozed by their CxO counterparts.
“CIOs are being pressured to sign off on FITARA plans they don’t agree with,” said Dave Powner, Director of IT Issues at the GAO. “We need stronger CIOs across the board.”
Meanwhile, the fainthearted have retreated to the relative safety of complaining that FITARA is yet another compliance exercise designed to be an innovation speed bump. Indicators are strong that the old compliance checklist mantra isn’t going to work in this case. What is working, say insiders, is the slow but steady culling of feckless leaders.
There may be another, less treacherous, route to FITARA success—giving agencies the resources they need to remain in compliance with the law. Strong signals coming from the Agriculture Department indicate that FITARA done correctly required seven full-time employees and $3 million-plus during the first year. Since new funding wasn’t in the cards, USDA reverted to volunteers.
Get Ready For Agency Self-Assessments
Not. Remember those FITARA Self-Assessments that agencies were supposed to send to the Office of Management and Budget by April? Well, most of them are in, but my OMB listening post has picked up solid evidence that the White House has no plans to make those assessments public.
Data Center Disasters
The Federal government continues to make progress in its effort to close data centers. The Federal Data Center Consolidation Initiative (FDCCI) has spearheaded the closure of 3,125 of the 10,584 data centers that we know about. But leave it to OMB to redefine what a data center actually is, and this is what you get: the very real possibility that the 10,584 data centers we currently know about could increase.
“We are nowhere close to optimizing our data centers,” according to Powner, who plans to release a status report in late May.
Congress also has plans to take a hard look at the government’s legacy system closure rate. My Capitol Hill remote outpost reports that most agencies have no plans for the millions of lines of old code still in existence. “Far less than half of those old systems have a plan for replacement,” according to one Hill watcher.
Cyber ISR Boost
Thanks to the Federation of American Scientists, the Situation Report got its hands on a heavily redacted copy of the Fiscal 2016 Congressional Budget Justification Book for the Military Intelligence Program. The 178-page document contains a few nuggets of back channel intelligence worth noting.
The Air Force National Guard is realigning up to 258 personnel to stand up a Cyber Intelligence, Surveillance, and Reconnaissance Group in Massachusetts, and another 89 members to stand up a similar squadron in California. Total cost: $19 million.
The volume and variety of endpoints is growing, as more and more devices connect to Federal networks. Feds are worried security can’t keep up.
A recent MeriTalk report estimates 44 percent of endpoints that access Federal agency networks are at risk. And nearly one-third of agencies have experienced breaches via endpoints.
Greg Godbout is the Chief Technology Officer and U.S. Digital Service lead at the Environmental Protection Agency (EPA).
MeriTalk: What is the EPA working on in IT right now?
Greg Godbout: The reason I came here is because the plan was to take a holistic view of, really, what I would call mission IT: the idea that IT is an incredibly important part of the mission, but, in and of itself, it’s not its own thing. It should be a part, just as acquisitions is, and hiring, and all those things. We’ve taken a very enterprise view of transformation across the whole enterprise to align with user-centered design practices and agile practices that could be conceived as IT, but they’re also just as good business practices as well. By aligning all of those groups, and all of our policy and all of our governance around the digital services movement sort of theme, it’s actually easier to make IT improvements, when you’re changing the ecosystem that you’re in as opposed to just fixing one project at a time.
MeriTalk: How is cloud computing affecting your agency and its plans?
GG: There’s a future vision state that we have of spending our resources to be a more adaptive agency. In the IT space, we need to be more iterative and agile and modular, so that we can be adaptive. I don’t know how to do that without the cloud. I think what a true, elastic, and scalable cloud has brought to the table is an ability to buy as you need and scale and be flexible. And it’s one of the 20 really important things to get to a DevOps [Development Operations] environment. You can’t skip it. You have to have that truly scalable and elastic cloud model to get there.
MeriTalk: How is that process going for you guys?
GG: I think it’s moving along well, but it should be going much faster. I think that, of all the innovations that have to occur for us to be more responsive and adaptive, the true cloud environment is disruptive to how government agencies work in general. And because it’s so disruptive, it’s a little more of a challenge to make the leap, because you really have to change the way you think of infrastructure. It’s become a utility. I’m sure, at some point, someone used to power heat in the buildings: If you’re in OMB you can see fireplaces. That was literally how they powered the heat. Now heat is just a utility. We have it; we click in and get the power. So this is the same thing. You get this cloud hosting these platforms, which have become part of the utility nature. And that’s true at the platform, infrastructure, or software-as-a-service layer. It just allows easier access to these things.
MeriTalk: How is FITARA affecting your job and the work you’re doing?
GG: FITARA is intricately woven into it. I sit inside the CIO’s office, which we call the Office of Environmental Information, and I, myself, you could say, have delegated authority. But I, myself, do the FITARA reviews around anything software or cloud-based to make sure they’re aligned with driving to the transformational change we need across the agency. And also, FITARA allows us a great set of meetings that allows you to get the business and the IT side and the acquisitions side totally aligned. I guess, conceivably, that could be done without FITARA, but FITARA makes that easier. We’re using FITARA as the main mechanism in which we help enforce this change.
MeriTalk: What project are you working on right now that you’re most passionate about?
GG: We have a challenge that’s going to be a Smart Cities Challenge. The reason this one excites me the most, is that it’s part of our Smart Cities initiative here and where we’re sort of evolving to. But I do believe the future is some sort of combination of big data analytics, Internet of Things, and crowdsourcing science. From an IT support standpoint, when I imagine what that world will be like, it will be thousands of sensors. Orders of magnitude greater of data coming in. And some of it will be video data and other types, not just classic structured data. When I think of OEI [Office of Environmental Information] and driving to mission IT, we have to fundamentally change everything we think about storage, and hosting, and being adaptive for this new environment. It’s not just to deal with today, but, right around the corner, a more open and transparent world where we can combine data sets from outside government, and consume them properly with data sets in here. In this Smart Cities Challenge, we’re going to ask some cities to put in some air sensors, a lot of them. It’s our first look at our IT capacity: What do we need to build our services to be like to be able to exist in that world? I don’t mean us hosting the data, because that’s not really what’s going to happen. It’s just that the data will exist, and we can consume it too. What does that change about the way we think? I feel like, for EPA in particular, one noticeable change will be becoming increasingly more dependent on data scientists versus software developers. I think data scientists will sort of be like the career of the future from an IT standpoint.
When it comes to data protection, are you a Cub Scout, a Boy Scout, or an Eagle Scout?
Your place in the IT and data security maturity life cycle is a lot like the scouts. As your organization becomes more mature, your view of protection becomes more sophisticated.
Tom Callahan
The Cub Scouts’ motto is: “Do your best.” Cub Scouting is about participation and affirmation. There are requirements, but it is the effort that is measured. On the other hand, the Boy Scout motto is: “Be prepared.” Boy Scouts focus on achievement and growth. Progress means meeting requirements. Self-reliance is the driving discipline.
Cub Scouts are at the earliest phase of an IT security maturity life cycle. Eventually Cub Scouts need to mature into Boy Scouts. And if they work hard enough, they can move to the elite ranks of Eagle Scouts, where they serve as role models for the best and brightest.
From an IT/cybersecurity standpoint, the current state of many organizations is at the “Do Your Best” phase. Do this, do that–try to stop the bad guys from getting your stuff. Your efforts will inevitably fall short, but you will have tried your best.
A more mature approach is the “Be Prepared” approach. Yes, these efforts at stopping the bad guys are important, and striving to stop them is good. But knowing that at some point security will fail, these security teams prepare for those failures by securing the data itself–“securing the breach”–through encryption and key management.
Here’s how an organization relates to the three phases of a “scouting” security maturity life cycle.
Cub Scouts–Do Your Best
This part of the maturity life cycle focuses on basic security–that is, relying on perimeter protection, rather than protecting data. At this point, perimeter protection is itself fairly basic, relying on firewalls as the way to permit or reject traffic, based on its source. Routers, load balancers, and VPNs are all behind the firewall, allowing for the minimum level of network security.
Once your organization has accepted the fact that perimeter protection will eventually fail and a breach will occur, you are at the point of actually protecting your data, as opposed to just putting locks on your door.
Boy Scouts–Be Prepared
For this part of the cybersecurity maturity cycle, perimeter protection is still in place, but you start to apply additional security measures on your most valuable assets–the data itself. By embedding protection on data, even after the perimeter is breached, the information can remain secure.
Unfortunately, organizations often take an incomplete approach to data encryption, which can create gaps in security. For example:
Encrypting data in storage, but not data in motion;
Encrypting external communications, but not data inside the firewall;
Relying entirely on cloud solution providers to protect data in the cloud;
Lacking multifactor password and user authentication; and
Storing cryptographic keys in software.
This last element (software storage of crypto keys) is particularly problematic for real data security. While storing keys for encrypting and decrypting data in software offers some measure of protection, it is simply more vulnerable than hardware-based storage.
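One of the gaps above, weak multifactor authentication, is cheap to close: the time-based one-time password (TOTP) scheme standardized in RFC 6238 can be implemented with Python's standard library alone. This is an educational sketch, not a production implementation.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """RFC 6238 time-based one-time password (SHA-1 variant).

    The shared secret is Base32-encoded, as most authenticator apps expect.
    """
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation per RFC 4226: pick 4 bytes at a digest-derived offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at T=59
# yields "94287082" with 8 digits.
SECRET = base64.b32encode(b"12345678901234567890").decode()
assert totp(SECRET, for_time=59, digits=8) == "94287082"
```

A server verifying these codes still has to store the shared secret somewhere, which circles back to the key-storage point: keeping it in a hardware-backed store rather than plain software is what separates the Eagle Scouts from the rest.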
The organizations that have moved beyond simple perimeter protection and employ robust encryption to protect their data are the Eagle Scouts in security.
Eagle Scouts–Demonstrated Leadership
Even more forward-thinking than rank-and-file Boy Scouts, these demonstrated leaders complete every task they take on. In IT security, they are the group that meets all defense requirements thoroughly and in depth. This group demands not only perimeter protection and data encryption, but also the strongest authentication over users and passwords.
Strong authentication blocks unauthorized access and holds authorized individuals accountable for their usage of digital resources. Applying different authentication methods to different user groups, particularly privileged users with administrator access, ensures these organizations prevent the misuse of data and systems by insiders.
Aspects of this part of the security maturity life cycle include:
Encrypting all sensitive data, both structured and unstructured (on-premise, virtually, or in the cloud), across multiple locations;
Securing cryptographic keys in a hardware security module, as opposed to the partial security of software-based key management;
Employing a crypto management platform, which centralizes management of the entire key life cycle across the extended organization.
Of course, threat vectors are always changing, so you can never really finish the path to complete IT and data security. But in today’s cybersecurity environment, “Do your best” isn’t good enough anymore. It is time to “Be prepared”–and then some.
Tom Callahan is VP of sales for SafeNet Assured Technologies, LLC. He can be reached at thomas.callahan@SafenetAT.com.
90 Percent of U.S. Federal Agencies are Vulnerable to Data Threats
According to new research, 90 percent of IT security leaders in U.S. Federal agencies say they feel vulnerable to data threats. In addition, 61 percent have experienced a past data breach, with nearly one in five indicating a breach in the last year. The top barriers to adopting better security are skill shortages, cited by 44 percent of respondents, and budgets, cited by 43 percent.
The Man With The $84 Billion IT Budget: Federal CIO Tony Scott (Audio)
“If you want to work on the biggest problems at the biggest scale with the greatest impact, there’s no better place than the Federal government,” says Tony Scott, CIO of the United States of America. The veteran tech executive (General Motors, Disney, Microsoft, VMware) is halfway through his term as Federal CIO and doing just fine. He’s had to wrestle with ornery congressmen, cybersecurity headaches out the wazoo, Apple vs. the FBI, net neutrality lobbyists, and a giant, often-creaky IT architecture in the middle of the world’s biggest refresh.
7 Things No One Tells a New CIO
The role of Federal agency CIO, like most senior government positions, has always been a high-pressure, high-turnover job. FCW asked both current and former agency CIOs what they didn’t know when they started and desperately wish they had. Wrapping one’s head around an agency’s mission and culture, it turns out, is a job in itself.
Chinese National Pleads Guilty to DOD Hacking Conspiracy
A Chinese national on Wednesday pleaded guilty to participating in a years-long conspiracy to hack into the computer networks of major U.S. defense contractors, steal sensitive military information, and send the stolen data to China. Su Bin, a China-based businessman who worked in aviation and aerospace, stole data relating to the C-17 strategic transport aircraft and certain military fighter jets, according to a Justice Department release. As part of the conspiracy, Su would email hackers with instructions regarding what individuals, companies and technologies to target.
Internet of Things Market Set to Rise due to Government Initiatives and Technological Advancements in Healthcare
The Internet of Things is a network comprising things or physical objects embedded with software, electronics, network connectivity, and sensors. The Internet of Things enables objects to be controlled remotely across existing network infrastructures and creates opportunities for direct integration among computer-based systems and the physical world. It is an intelligent and invisible network and improves the overall accuracy, economic benefit, and efficiency of the system in which it is used.
It’s no secret that my D.C. network of informants is concerned about the future of the Internet of Things and the potential for major tears in the social fabric if policy does not keep pace with technological development. But some members of the D.C. network have started sounding the alarm over China’s use of big data.
Open-source intelligence confirms that Chinese authorities have been hard at work during the last six months monitoring six pilot projects for the country’s new big data-based social credit system. It is an unnerving integration of financial credit data with a vast array of social data mining, including purchasing habits, friends, and interactions with authorities. Think of it as a credit score with a creepy twist. Get too many speeding tickets, and your score will go down. Spend too much time playing video games, and you might be considered lazy and unproductive, and your score will suffer. Associate on social media with people who have low “trustworthiness” scores, and your score will drop.
As creepy as that sounds, it gets worse. There will be penalties and rewards tied to these trustworthiness scores. Have a low Social Credit System (SCS) score, and you might find it hard to get a job, or the Internet speed at your home might be throttled back, or you might have to put up with more red tape at your local government agency when trying to get basic services. Keep that score high, and you’ll encounter far fewer speed bumps on the road of life.
One of the more insidious aspects of China’s new SCS score is the gamification of social control by encouraging people to monitor each other. That’s right, if you socialize with people who have low SCS scores, your score will suffer.
GSA’s Peace Corps Cloud
My remote listening post on the corner of 18th and F Streets in Washington, D.C., has picked up strong signals that the General Services Administration’s inspector general has a problem with the way the agency managed a 2014 cloud-based email pilot program at the Peace Corps.
Under the program, GSA provided the Peace Corps with cloud services and basic cloud computing applications, including email, calendars, document storage and management, and collaborative tools to create websites and communicate in small teams. But GSA represented the pilot program as a shared service without approval from the Office of Management and Budget, and that’s a no-no.
“Interagency shared services must be approved by OMB,” the IG’s audit report states. “GSA also used an agreement that was not designed to provide shared services and was not finalized until three months after services began. In addition, because the cloud email services were originally provided to the Peace Corps at no cost, GSA improperly cited the Economy Act as the authority for the services. Finally, GSA augmented the Peace Corps’ budget when it provided the cloud email services at no cost.”
Augmented the Peace Corps’ budget? Everybody knows that the augmenting of budgets in the Federal government is strictly prohibited! Wink-wink.
According to the IG, GSA provided the Peace Corps with $100,000 worth of cloud email services at no cost without obtaining the express authorization of the cloud service provider.
Opening Up the House Rules
My Capitol Hill intercept station reports that the House Rules Committee is warming up to the concept of open government. Committee Chairman Rep. Pete Sessions, R-Texas, announced Wednesday that the House rules and manual are now available in XML format through the Government Publishing Office’s GitHub account.
“Chairman Sessions’ decision to publish the Rules as searchable, open XML data is a leap forward–not just for public transparency, but also for the conduct of House business,” said Hudson Hollister, executive director of the Data Coalition, which represents technology companies supporting open data in government.
“The House Rules Committee plays an essential role in how Congress functions, and publishing the Rules of the House in XML will empower the public to more easily access and analyze the House Rules,” said John Wonderlich, executive director of the Sunlight Foundation, a nonpartisan nonprofit that advocates for transparency in government. “This is a positive step toward a more transparent Congress.”
Catch up on some reading this weekend. Here are a few interesting items from around the Web.
Want to foster innovation? Get out of your comfort zone (Video)
MIT’s Sloan Executive Education department workshop on innovation last week was led by Hal Gregersen, who brings years of experience working with companies to help CIOs stay ahead of the curve. In fact, MIT gets in the trenches of a business to interview employees to see how work gets done day-to-day, to fully understand the business as a whole. Often, the programs are directed at the C-suite and other higher-level employees, who can take the information back to the office.
Crypto battle presents ‘hard choices,’ says Federal CIO
The clash between the FBI and Apple over opening up a terrorist killer’s locked iPhone presents a hard choice between compelling arguments, according to Federal CIO Tony Scott. “It’s a really hard, hard topic,” he said. “I could make an argument on both sides. Strong encryption is important and a backdoor might be a problem,” he said. However, he also said society has an interest in pursuing and prosecuting criminals.
New Adobe Digital Price Index
Adobe has crafted a new way to analyze the nation’s complex economy with its Digital Price Index, whose findings include fresh insights about online job hunting, Web-based searches for housing, and price trends for digital items. Over the last 12 months, online searches for jobs are on the decline, online searches for homes and apartments are rising, and prices for an array of digital items have slumped, according to the new index. San Jose-based Adobe argues that its system of studying billions of visits to websites, along with transactions involving more than 1 million items that were bought online, can provide valuable insight into what’s going on with the nation’s economy.
CSRA Execs: Federal Government Is In The Early Innings Of Cloud, But The Game Is On
CSRA CEO Larry Prior and others dish on why the U.S. government is working to modernize legacy infrastructure, security and the Federal adoption of cloud and how CSRA leverages Silicon Valley’s R&D for its customers.
The role of the FTC and federal regulators in Silicon Valley (Video)
So what, exactly, is the Federal Trade Commission? And how does the FTC differ from the Federal Communications Commission?
The FTC’s overall focus is on the consumer. But in spite of its breadth, Brill explains that the FTC does involve itself in regulatory issues fundamentally important to Silicon Valley — especially data privacy and what Brill calls the “gig economy” of sharing services like Uber and Airbnb.
The passage of the Federal Information Technology Acquisition Reform Act in 2014 was a watershed moment for Federal technology modernization. Finally, agency chief information officers were given the authority they had long sought to exercise practical control over their organizations’ technology investments.
A lot has changed since FITARA was introduced. In fact, we’ve actually seen a couple of strong CIOs—like Richard McKinney at the Department of Transportation—throw down the FITARA gauntlet to much success. But what hasn’t changed may, ultimately, be more important. And what hasn’t changed (at least not enough) is the culture of government.
It has been 15 months since the law was enacted and relatively few Federal agencies have taken meaningful steps to integrate the letter and spirit of the law into their day-to-day functions. Sure, there’s been a lot of talk—like the Department of Veterans Affairs CIO LaVerne Council, who this week told Congress that a year from now VA “will be the premier government agency in FITARA.” But when it comes to bridging the FITARA gap—the chasm between the Federal CIO and the rank and file—little has been achieved.
The FITARA gap is a spiritual crisis between Federal policy wonks and those that get the job done on a daily basis. To the senior wonks and wonkettes, FITARA is a gift from the gods, promising to fundamentally alter the technological face of government by giving CIOs the agility and authority they need. But many Federal employees are less fervent in their buy-in.
For example, few predict that the legislation will revolutionize government IT, according to a recent survey by Dell and the Government Business Council. In fact, only 6 percent of respondents say FITARA will enact “major improvements.” In addition, 22 percent expect FITARA to have “little to no long-term impact,” and 7 percent believe it will make existing problems worse.
Closing the FITARA gap will require leadership—a different kind of leadership perhaps from a different kind of CIO.
“Our Department CIO has used this as a blatant power grab, and defensive maneuvering by my subagency has resulted in large decreases in IT functionality and productivity,” one Dell survey respondent said. “The advantages of FITARA are entirely dependent on the skill, experience, and interest level of the CIO, which seems quite variable,” another respondent said.
With greater power comes greater responsibility. And Federal CIOs must do more than maneuver freely under the cover of FITARA—they must lead, and lead from a position of technical competence. Overcoming the FITARA gap will require proving that the new expansion of CIO powers is merited and will deliver on its promises.
MeriTalk will explore these issues and more at the third annual FITARA Forum on March 30. I hope to see you there.
G. Nagesh Rao is the Chief Technologist with the U.S. Small Business Administration Office of Investment and Innovation for the Small Business Technology Transfers (STTR)/ Small Business Innovation Research (SBIR) programs.
MeriTalk: What are you working on in the IT space with your programs?
G. Nagesh Rao: We are an oversight policy program. We make sure agencies use those dollars to fund small businesses doing innovative research, and we highlight the success stories to encourage more participants into that innovation pipeline. We encourage that risk taking. That’s the whole point of the program. If it works out, great. If it doesn’t, then we’ve got data that can show what is going to work and what’s not going to work.

That gets into what SBIR.gov is doing, because that’s a gateway business intelligence platform – not only for the government folks, but good intelligence for the companies themselves to understand the agencies’ needs and what the agencies are looking to fund, currently or in the future, or what they’ve funded in the past. That helps them with seed funding in high-risk areas that the free market just won’t touch. The Federal government is more interested in space exploration, public health, food, and national security.

One of the riskier projects we’ve seen coming out right now is Made in Space. That company is doing all the 3-D printing on the International Space Station. They’re developing that technology there for the future of space exploration, and NASA retains control of its own technology.
MeriTalk: How does cloud computing factor into what you do?
GNR: We have been funding some of that Internet of Things technology, and a few other recent proposals delve into that area. On the data side, SBIR.gov right now runs on ColdFusion-based storage server technology. I’d like to see how we can get our system updated into the cloud down the road. It’s a piece-by-piece process. I’m dealing with 11 agencies’ data, and having all those systems talk to each other and interact with our main gateway platform is really important. The cloud seems to be a really efficient and cost-effective way to get all that data up and dealt with in a more streamlined manner.
MeriTalk: How has FITARA affected your job as the chief technologist?
GNR: It’s really just the Federal government trying to make sure our IT systems are up to date and done in a secure manner. We don’t have sensitive data as much on our system. What we’re doing is taking public data and making better sense of it.
MeriTalk: What’s an interesting, creative thing you’ve done with big data?
GNR: I’d say SBIR.gov. That system was built with pennies. I did it really cheaply, but I had to be really efficient with the tight budget I had. If you look under “award listing” or “company listing” in the header, we built an analytics dashboard so you can see the number of awards issued each year by agencies–what states the awards are going to, and where that hot pool of talent, innovation, and really cool companies is emerging from across the country. You can actually drill down to ZIP codes and see where those awards are being won. You can see how many small business companies have been focused on one area of interest or on multiple areas of interest, and get a landscape view of how a company is maturing, piece by piece. What’s nice is it’s 30 years of data.
MeriTalk: What advice would you give to someone who wants to become a chief technologist?
GNR: Be yourself. Be fearless. And be honest. Be ready to jump to the next ship. I was restless, and rather than sticking with one thing, I would follow my curiosity to learn something new. There is no secret sauce. It’s more about sustaining that drive and innovation and crossing the threads. I’ve always been active in the nonprofit world. So by day, I would work one job, but by night, I’d be engaged in other projects, to try to cross those threads together. I went to get my master’s in law. I got my MBA. I always jumped around.
MeriTalk: What advice would you give to someone on the outside looking into the SBA who wants your job?
GNR: Don’t chase sexy. I think a lot of people are trying to make the headlines. Truth be told, the SBIR program kind of languished a little bit. Part of it is understanding the value in crafting the narrative. Finding the joy in it, then building out on that.
MeriTalk: How would you define your leadership style?
GNR: My boss, John Williams, he’s been great as a mentor. He’s the head of the entire SBIR program for the Federal government. He gives me a lot of autonomy and says, “just go,” do my thing. Good leaders are the ones who trust their people. I operate on a flat leadership perspective with our team. I let them take the reins on stuff. I mentor, I guide, but I’m more of an “I trust you, so go and get it done” kind of guy.
MeriTalk: What do you look for in an IT leader?
GNR: Being creative, yet pragmatic. Look for creative ideas. Look for creative solutions. Entrepreneurial mind-sets are folks who know how to do more with less.
That said, we understand the PMO’s working hard to make amends – and rolling out the long-awaited FedRAMP 2.0 on March 28th at GSA. Want a sneak peek at what they’ll roll out? Seems they took the hints in the Fix FedRAMP recommendations. Here’s what we hear.
Transparency:
First off, the PMO’s focused on transparency. It plans a new site that tracks ATOs and ATOs in progress – showing the duration of each CSP’s journey. MeriTalk already delivers this visibility on the FedRAMP OnRAMP. Check out the new CSP Journey tab – it shows the duration of CSPs’ FedRAMP certification processes. Maybe the FedRAMP PMO should partner with MeriTalk, rather than reinvent the wheel with taxpayers’ money? Call me crazy…
Quicker and Cheaper:
But, there’s more – a new ATO process. It’s supposed to be quicker and cheaper, with the PMO less directly engaged. Word is there are already three CSPs in the new process – two commercial providers and an 18F application.
Policing:
As the GSA PMO is taking a lesser role in approving ATOs, we understand it’ll work with OMB to better police agency ATO acceptance. The notion is to cut back on the horror stories of agencies’ refusing to accept other agencies’ ATOs. After all, sharing is caring – and isn’t that the essence of FedRAMP’s value proposition?
Catch up on some reading this weekend. Here are a few interesting items from around the Web.
Silicon Valley IT Company Sets Up Its Federal Sales Shop in Tysons (Video)
Silicon Valley IT company Nutanix is moving its Federal sales offices to the backyard of the Federal government technology hub in Tysons Corner, Va. The $2 billion San Jose, Calif.-based company is making the move to the East Coast at a time when the White House—led by Federal Chief Information Officer Tony Scott and, in the last couple of weeks, by Defense Secretary Ashton Carter—is attempting to tap the West Coast for ideas on innovation in the Federal government.
DHS Creates Intel Fast Lane For Select Analysts
In the Department of Homeland Security’s annual data mining report issued last month to Congress, the agency touted 2015’s improvements while noting the big target of instant data-sharing across systems is still out of reach because of interoperability issues. As a workaround, DHS has come up with a kind of special access account for a select group of analysts focused on time-sensitive threats to the homeland. The DHS Data Framework, meant to help link disparate DHS databases in support of the “One DHS” policy goal, entered an initial operational phase in April 2015, but One DHS remains a work in progress.
Sean McAfee To Run DHS Cyber Outpost in Silicon Valley
The Department of Homeland Security has sent one of its own to Silicon Valley to lead its cybersecurity efforts on the West Coast. Sean McAfee, who previously served as the deputy chief for the DHS National Cybersecurity Assessments and Technical Services (NCATS) team, will lead the efforts of the National Protection & Programs Directorate as it works to find private sector technology that can be used across the Federal government. McAfee started working in the Silicon Valley office Feb. 2. He has been working with Deron McElroy, a West Coast cybersecurity adviser for DHS.
The National Institute of Standards and Technology found that as organizations move to cloud-based systems and platforms, the accessibility for employees with disabilities can be compromised. Accessibility tools for those with disabilities rely on local computers capable of running them. And with most software and information now migrating to remote locations accessed through the Internet, NIST warned, those tools might not function.
Democratic presidential front-runner Hillary Clinton on Wednesday said if elected president she would work to “end the revolving door” between Washington and Wall Street. But those aren’t the only two locales served by the revolving door—Silicon Valley and the Tysons Corner corridor in Virginia are picking up speed, even if they remain under the radar.
The Department of Veterans Affairs this week announced the $22.3 billion Transformation Twenty-One Total Technology Next Generation, or T4NG, acquisition program. Not surprisingly, some insiders who spoke to The Situation Report took note not of the size and scope of the contract, but of some of its lesser-known political links.
Of the 21 companies selected by VA, several have interesting ties to the agency. For example, Art Gonzalez, the former senior vice president and chief information officer at TISTA Science and Technology Corporation, has spent the last three years at VA as the agency’s deputy CIO for service delivery and engineering. Although he just announced his resignation, TISTA was among the T4NG winners. In addition, the Situation Report’s intelligence analysts recently discovered that Gonzalez had joined VA in 2013 just one month after the agency cut TISTA a check for $25 million for a financial auditing services contract.
Another lucky winner of the T4NG contract is Booz Allen Hamilton. The Situation Report has been monitoring the chattersphere and has picked up solid signals that Stan Lowe, VA’s highly respected former chief information security officer, is now an executive adviser at the company.
Remember Charles De Sanno? We had our analysts run the former executive director for VA Enterprise Systems Engineering through our proprietary data analytics system and discovered that the 25-year veteran of the VA ended up at Systems Made Simple Inc., a Syracuse, N.Y.-based health IT firm that—you guessed it—is one of the recipients of the coveted T4NG contract.
The revolving door between big-money government and big-money industry doesn’t stop there. Rob Thomas II, who joined VA last March, was recently promoted by VA CIO LaVerne Council to lead the agency’s new Enterprise Program Management Office. Thomas, an obvious choice for the job, brings a lot of experience to the table, including nearly four years as the vice president for enterprise transformation at Accenture—another T4NG contract winner.
Young Global Leader
Larry Page, Sergey Brin, Mark Zuckerberg, Marissa Mayer, Queen Rania of Jordan, and—David Bray of the FCC?
You got that right. My remote eavesdropping station in Davos, Switzerland, has picked up strong indicators that Bray, the FCC’s chief information officer, will be named a 2016 Young Global Leader this month by the World Economic Forum.
Well done, David, and much deserved. This is one revolving door that should keep turning.
Needed: 400 Million New Servers
It seems like everybody is taking part in the mad dash to figure out the true impact of the Internet of Things. Well, we might finally have an answer, thanks to Mark Thiele. Signals emanating from Thiele’s LinkedIn Pulse indicate that the number of servers that will be necessary to power the IoT will be on the order of 400 million by 2020.
How did Thiele get to that number? He started with the predictions of Gartner and Cisco, which peg the number of connected devices by 2020 at 20 billion and 50 billion, respectively. To be conservative, he roughly split the difference and started with the figure 30 billion—10 billion PCs, servers, laptops, tablets, and phones, and 20 billion net new devices like dishwashers, toasters, weather sensors, connected cars, environment controls, automated lighting, and road sensors.
Thiele then estimated that there are currently about 7 billion devices and 100 million servers in use—approximately one server for every 70 devices.
“Now let’s extrapolate the potential number of servers needed based on the assumption that we’ll have not 7 billion connected devices, but 30 billion. 30 billion divided by 70 equals roughly 425,000,000 servers. Yes, you read that right, if the numbers follow a historical precedent at all we will need roughly 400 million servers to support our 2020 IoT and technology demands,” wrote Thiele.
“Now let’s look at what 400 million servers means to data centers. A massive data center with 5,000 racks with 20 servers per rack has 100,000 servers. In order to have enough data centers for 400 million servers we would need to add another 4,000 massive data centers measuring roughly [400,000 square feet] with approximately 50 Mega Watts of power each.”
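Thiele's back-of-the-envelope arithmetic is easy to reproduce. The figures below come straight from the text above; the only assumption is integer rounding:

```python
# Reproduce Thiele's server estimate from the figures quoted above.
devices_2020 = 30_000_000_000          # his working figure between the Gartner/Cisco forecasts
devices_today = 7_000_000_000
servers_today = 100_000_000
devices_per_server = devices_today // servers_today   # ~70 devices per server today

servers_needed = devices_2020 // devices_per_server
print(servers_needed)                  # 428571428, i.e. roughly 400 million servers

servers_per_massive_dc = 5_000 * 20    # 5,000 racks x 20 servers per rack = 100,000
data_centers_needed = 400_000_000 // servers_per_massive_dc
print(data_centers_needed)             # 4000 additional massive data centers
```

The estimate is only as good as the one-server-per-70-devices ratio, which assumes IoT devices generate server load comparable to today's mix of PCs and phones.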
Intercept any signals for The Situation Report? Email them in confidence to dverton@meritalk.com
Whether in government, enterprise, or the nonprofit sector, there exists a colossal amount of data in need of protection. The exponential growth of data brings exponential growth in the number of professionals needed to protect it, yet the nation continues to face a significant shortage of cybersecurity professionals.
To face this nationwide dilemma head-on, U.S. Cyber Challenge created a comprehensive methodology to identify potential talent, assess their skill levels, teach proper ethics, and network these individuals with public and private sector organizations that are hiring.
Karen Evans is the national director for U.S. Cyber Challenge (USCC), Center for Internet Security.
In April, U.S. Cyber Challenge will open its annual Cyber Quests competition, in which thousands of individuals across the nation will attempt to earn an invitation to one of three summer cyber camps by achieving a top score. Southern Utah University will host the Western Regional camp, while Illinois and Delaware will host camps for top competitors who live east of the Mississippi River. Each camp features intensive classwork with top-rated instructors, an ethics program, a resume workshop, a career fair, and a culminating competition to give participants more hands-on experience.
In most professional fields, traditional methods of learning and achievement–including higher education degrees and certifications–are highly valued, if not required, to land a job. Yet today in cybersecurity, some of the most talented individuals are not necessarily acquiring their knowledge and skills through traditional means. Instead, they are teaching themselves by watching YouTube videos, reading blogs, and simple trial and error. Competitions, like U.S. Cyber Challenge, are validating the skills acquired through these untraditional methods.
In addition to the camps and competitions, U.S. Cyber Challenge initiated the development of CyberCompEx (www.CyberCompEx.org), a new online social network created specifically for “cybersecurity enthusiasts” – those involved in cybersecurity at any level, from hobbyist all the way up to the C-suite. On the portal, members can learn more about cybersecurity and network with others in the field 24/7 through forums, articles, podcasts, events, and competitions. CyberCompEx recently launched new tools specifically focused on career development. The platform added resume creation and job searching capabilities that round out the site’s mission: to help talent from start to finish as they navigate their career paths and fill the workforce gap.
The mission of U.S. Cyber Challenge is to significantly reduce the shortage in today’s cyber workforce by serving as the premier program to identify, attract, recruit and place the next generation of cybersecurity professionals.
On March 15 in Washington D.C., U.S. Cyber Challenge will be hosting its 3rd Annual Cybersecurity Summit in partnership with AFFIRM. The summit will bring together chief information officers, chief information security officers and other influential leaders in government, industry, and academia to discuss the growing threat in cybersecurity and methods to attract and retain talent.
Speakers include Dr. Diana Burley of The George Washington University, Donald Davidson Jr. of the U.S. Department of Defense, Dr. Douglas Maughan of the U.S. Department of Homeland Security, Randy Marchany of Virginia Tech, Richard McKinney of the U.S. Department of Transportation, Rodney J. Petersen of the U.S. Department of Commerce, John D. Ramsey of the U.S. House of Representatives, Angela Rey of DISA, Ben Scribner of the U.S. Department of Homeland Security, Steven J. Spano of the Center for Internet Security, Bobbie Stempfley of The MITRE Corporation, Ira Hobbs of Hobbs and Hobbs, LLC, Mike Causey of Federal News Radio, and Aliya Sternstein of Nextgov.
Karen Evans is the National Director for the U.S. Cyber Challenge (USCC), a nationwide talent search and skills development program focused specifically on the cyber workforce. She is also the former Administrator for E-Government and Information Technology at the Office of Management and Budget (OMB) within the Executive Office of the President, where she oversaw the Federal IT budget of nearly $71 billion, which included implementation of IT throughout the Federal government.
The Chief Information Security Officer (CISO) community has good reason to stay awake at night. Recently, hackers breached the Department of Justice (DoJ), released information on more than 9,000 Department of Homeland Security (DHS) employees, and claimed they will leak data on 20,000 FBI employees, according to Computerworld.
Catch up on some reading this weekend. Here are a few interesting items from around the Web.
Federal IT Survey Reveals Concerns, Progress
The survey, “Paving the Way Toward Mission-Aligned IT,” found that 72% of nearly 400 respondents said they were completely, very, or moderately satisfied with IT at their agency. At the same time, respondents said they wanted to see improvements such as apps and portals that offer better customer experience, better records management, more efficient internal communication tools, and field access to real-time data.
How Big Data, Info Sharing Make Hackers’ Lives Harder
Creating a new piece of malware is easy: Find something you need, then tweak the code to bypass firewalls and trick anti-virus software. By analyzing code provenance, defenders are disrupting the reuse cycle and making the adversary’s life harder. Thomas Ruoff, director of technology innovation and mission integration for the Department of Homeland Security, likened provenance to a professor trying to determine whether a student wrote a paper or plagiarized it.
Managing the Growing Mountain of Government Data
The U.S. government’s use of big data continues to grow, stemming from the Obama administration’s 2012 commitment to invest $200 million in big data capabilities, as well as the expected growth of agency spending on enabling technologies such as cloud infrastructure. Recognizing the opportunity that big data presents, agencies are working hard to incorporate it into their operations to serve residents to the best of their ability. Yet the complexity and diversity of this data — from machines and sensors, structured and unstructured — makes using a single tool to accomplish this task next to impossible.
Healthcare Underspends on Cybersecurity as Attacks Accelerate
Healthcare providers are far behind other industries when it comes to protecting their data, and the number of attacks is only expected to accelerate. A newly released survey offered a sobering take on healthcare’s flimsy defenses. Healthcare providers are averaging less than 6% of their budget expenditures on security, according to the survey from HIMSS Analytics, the research arm of the Health Information and Management Systems Society, and security firm Symantec.
NIST: Science-Based Data Collection Key to Better Wildland Fire Defense
A new report by the National Institute of Standards and Technology (NIST) describes how researchers analyzed a major 2011 Texas wildland fire using a rigorous and scientifically based post-fire data collection approach, a system they believe will lead to improved defensive measures and strategies for significantly reducing structural damage and property loss.
Department of Commerce Chief Information Officer Steve Cooper is hiring.
“I’m reaching out to my network to seek recommendations for, and direct expressions of interest in, two [senior executive service] positions in the Office of the CIO at Commerce,” Cooper wrote to his LinkedIn connections—including The Situation Report. “Both openings are career positions and are the result of a departure for promotion and a retirement. One position represents our Deputy CIO role, and the other represents our CTO role.”
Meanwhile, Cooper’s career remains in motion. The Commerce CIO on March 1 joined the Network Centric Operations Industry Consortium advisory council, which is led by former Defense Information Systems Agency director Harry Raduege.
A Purdy Nice Gig
Remember Andy Purdy? He’s the former acting director of the Department of Homeland Security’s National Cyber Security Division who most recently went on to become the chief security officer of Huawei Technologies USA. Huawei, as some of you may recall, is the China-based global electronics giant that the U.S. intelligence community has kept at arm’s length because of its close ties to the Chinese government. In fact, The Situation Report has picked up recent signals that despite a secret meeting between NSA officials and Huawei executives brokered by former NSA Director Michael Hayden, the company remains on the “do not trust” list. Some of Purdy’s former colleagues have not forgiven him for the move to Huawei, either.
However, my Shanghai listening post has picked up strong signals, emanating from the Changhang Building in the Pudong District, that Purdy is now the vice chairman of the Open Group Trusted Technology Forum. Ironically, the OTTF is focused on developing a global supply chain integrity program, framework, and standard to provide buyers of IT products with a choice of accredited technology partners and vendors.
Boot Camp & Casting Calls at VA
The Department of Veterans Affairs’ Enterprise Program Management Office (EPMO) reached initial operating capability last month, according to an internal email from VA CIO LaVerne Council, obtained by The Situation Report. According to the email, Rob C. Thomas II, the VA’s deputy assistant secretary, facilitated the first of a series of EPMO Bootcamps. “These sessions communicate upcoming changes, share the direction of the EPMO, and provide a forum for staff and teams to get their questions answered in real time. These sessions will continue through March and April, and they will culminate with a Project Manager Summit later this spring,” Council wrote.
VA’s corporate and Veterans Health Administration clinical IT Account Managers–Jackie Patillo and Alan Constantian–are now in place and are working together to create a new way for the Office of Information and Technology to interact with its VA partner organizations. “They will work collaboratively with the EPMO and Service Delivery and Engineering (SDE) to ensure that we are anticipating our business partners’ needs to provide a seamless IT experience,” said Council.
In addition, Council recently launched a casting call for a new Employee Engagement Task Force, “a group whose impact will be felt throughout OI&T as they build a better employee experience,” Council wrote. “If you think you have the mind-set and skills to join that cast, please apply.”
All-Hands at FedRAMP
Matt Goodrich kicked off March with what will certainly go down as the shortest all-hands strategy meeting in the history of strategy meetings. The FedRAMP Program Management Office posted a summary video Wednesday of its all-hands meeting that ran for a whopping 46 seconds. Our Internet surveillance team picked up an audio feed from the meeting, at which Goodrich proclaimed that FedRAMP is “one team, one dream.”
The March FedRAMP All-Hands Meeting.
Have a Situation Report to share? Send it in confidence to dverton@meritalk.com.
Dr. Michael Valivullah is the CTO of the National Agricultural Statistics Service (NASS) at the U.S. Department of Agriculture (USDA).
MeriTalk: What are you working on in terms of big data at NASS and USDA?
Michael Valivullah
Michael Valivullah: My interest right now is in precision agriculture, and that brings the big data effort to our agency. What we’re trying to do at NASS is reduce respondent burden, because we publish more than 400 reports every year. And we send out hundreds and hundreds of surveys to farmers, ranchers, producers, and farm businesses to collect data. Our mission is to publish timely, accurate, and useful information about U.S. agriculture. We have over 500 statisticians, and all they do is crunch data all day. That’s what we do. So my interest in precision agriculture stems from that need.
Precision agriculture uses very sensitive sensors and gathers data to the very smallest level possible. The crops can be treated scientifically to decrease the amount of water, fertilizer, and chemicals used, and to increase the productivity of that particular patch of soil, using unmanned aerial systems and all kinds of onboard sensors.
My interest comes from the fact that the number of people responding to our surveys has decreased. So we have this non-response challenge from the people who are supposed to provide us with data, and the quality of the data tends to be subjective. It takes time and money for us to do it, and we need to decrease the cost, because … we send people to these farmers and ranchers to collect data. So if we get the data directly from the sensors, with permission of course, we will be able to use the data directly and more objectively, while decreasing the data collection cost. The accuracy of that data will increase and the reporting cycle will decrease.
MeriTalk: Talk a little bit about big data. What are the opportunities and challenges?
MV: The challenge is the infrastructure to process the data. The other one is integration: How are you going to integrate the different types of data that are coming into the organization? Then people’s skills: It takes a different type of skill set to process this type of data, so usually organizations do not have the required skill set to do it well. They need to send people to training in some field that is new, and people just don’t know what process they should use or how to manage all this data.
The opportunities are immense, because big data can add trillions of dollars to GDPs, according to several management and consulting companies. That means there is so much value in data today, and people just don’t know about it. There are four different ways of looking at the data. One, you can use it to describe the information that you have, which is descriptive analytics. Then, if you don’t know the problem the data is showing, or you have some issue you want to identify, you can use diagnostic analytics to see what is wrong and where. Then you can use it for predictive analytics; based on the data you have, you will be able to tell what the future is going to hold. Then you have prescriptive analytics; once you see what is wrong, you can define or write a prescription for it. For us, we will be able to predict crop estimates. What we do is, during the growing season, we estimate what the crop yield is going to be during the harvest season. That is going to help immensely, because now we are gathering the data manually and it’s subjective. Hopefully, big data will make our estimates closer to real yields when the harvest comes. Those are the benefits.
MeriTalk: How can one manage privacy and security in a big data world?
MV: That is a big challenge for us. There is a law called the Confidential Information Protection and Statistical Efficiency Act (CIPSEA), which requires us, by law, to safeguard the information we receive from farmers and ranchers. When we get this information, data such as their names and particulars related to their crop land, we anonymize the data before we publish it so that no farmer, rancher, or agricultural business can be identified. We need to protect the data, and security is a major concern for us. Security has three components: confidentiality, integrity, and availability. So we want to make sure that the data is confidential, maintains its integrity (has not been changed), and is available whenever it’s needed.
MeriTalk: What are some of the IT challenges that NASS or USDA face?
MV: As time goes by, fewer and fewer people are interested in giving us the data. Non-response is the biggest issue we are facing, but so is the cost of the data. Our budgets are getting cut, and we don’t have enough money to employ the required number of people to go get the data or to have other systems in place to collect that information. It is a resource constraint; all the other agencies have the same issue, because their budgets are getting cut too. The only way we can make do is to employ automated technologies or artificial intelligence so that this data can be collected by machines, reducing manual and in-person data collection and cutting the cost and time it takes to collect data.
MeriTalk: How has cloud computing affected your agency and its plans?
MV: We have been very fortunate in having a government cloud implemented a long time ago. USDA is one of the first Federal agencies to set up its own FedRAMP-certified cloud, and NASS was lucky enough to jump on that bandwagon. First, we rolled out our email on the private cloud, and we were able to save almost 60-65 percent on our email system, so it was a big cost saving. We have taken advantage of cloud computing and shared services, and reduced our footprint in terms of data centers: we went from 46 data centers to 2. The cloud has been a very good place for NASS to hold its Federal data.
MeriTalk: How does your agency support mobility?
MV: NASS was the first agency to deploy iPads in the field, about four years ago. We’ve been out collecting data using iPads in the field and advancing in that area. We are rolling out more and more features, as we have iPads, iPhones, and now adaptive computing. Any endpoint can run the surveys, so we can send them to the farmers and they can complete the surveys on their iPhones if they like. We’ve gone pretty far on that iPad rollout, and we will be able to implement that soon enough. We also implemented VDI, Virtual Desktop Infrastructure, about four years ago.
MeriTalk: For those that want to become a CTO, what advice do you have?
MV: First of all, as the name implies, they need to be very up-to-date with technology and what is happening in the industry, and have a good understanding of technology. Also, they need to understand how it relates to the mission of the agency. That is very important. As a leader, integrity is very important to this position, because anything that you say or do reflects upon you and also the agency. And people skills, how to collaborate, how to work with other people, are important. The CTO is an advisory position, so you have to be a big thinker and be capable of conceptual, high-level thinking. Look at the future: How can you innovate? How can you learn from previous experience? How can the agency benefit from technology? The CTO’s job is to look at the future and move the agency forward.
MeriTalk: Finally, what do you look for in an IT leader?
MV: First of all, I look at their background, their skills, their experiences, and their comfort level with technology. They should be confident learners. People need to have the mentality to learn; those are the folks who make good employees. I look for people who have the aptitude and the excitement to learn, because IT is constantly changing. As an IT employee, one needs to be able to learn all the time. I also look at how candidates work with other people and their ability to work as a team.
I am pretty satisfied with the role I am playing, have played all along, in working in different environments. Every time I go to a new place, I learn a lot, and that gets me excited and motivated. Technology is constantly changing, and I am a learner; I like to learn new things and try new things and do new things. That works for me very well.
The Federal government is the leading creator, collector, consumer, and communicator of information in the United States. If its regulatory requirements change, it is entirely possible those changes will eventually spread into the commercial sector. Such is the case with two related risk management programs developed by the Federal government that now require commercial organizations working contractually with the Federal government to employ Federal security standards[1].
The Federal Risk and Authorization Management Program (FedRAMP) and the Federal Information Security Management Act (FISMA) work together to provide an Authority to Operate (ATO) for information systems utilized by Federal agencies. It is important to note, however, that their perspectives and approaches differ: FISMA defines a framework to protect all Federal data, while FedRAMP is designed to assist agencies in meeting FISMA requirements for cloud systems[2]. Though not required for organizations unaffiliated with the Federal government, commercial cloud service providers and private-sector businesses (like banks) have begun thinking about their cloud security standards and have looked to the Federal requirements for guidance[3]. Security organizations such as the SANS Institute have recommended that private industry reference the FedRAMP program when looking to implement security requirements around cloud services[4]. The reality of our current cyber risk climate is that it is laden with danger, and the threats are only expected to worsen.
For this reason, it’s essential for cloud-computing companies and commercial businesses to protect information by scrutinizing security — especially if they want to compete for business from Federal agencies. Service providers or private-sector businesses interested in implementing an information security program can review the FISMA and FedRAMP guidance and reference materials at http://csrc.nist.gov/groups/SMA/fisma/ and http://www.fedramp.gov/resources/documents/. Organizations interested in providing services to the Federal government will need to implement the FISMA and FedRAMP requirements and work with Federal agencies to apply for a FISMA or FedRAMP (for cloud service providers) ATO. Here’s what you need to know about both authorization processes.
Federal Information Security Management Act
FISMA was enacted in 2002 and requires all Federal agencies, departments, and contractors, whether or not they are cloud service providers, to secure their information systems and assets to a reasonable and adequate degree. The National Institute of Standards and Technology (NIST) develops the standards and principles for FISMA via its Special Publications, and Federal agencies and departments are mandated to report annually on their information security status. NIST Special Publication (SP) 800-37 Revision 1 defines guidelines for applying the Risk Management Framework to Federal information systems[5]. It involves six steps:
Step 1 – Categorize: The information system owner categorizes the information system based on Federal Information Processing Standard (FIPS) Publication 199 and documents the system categorization and system boundaries in the System Security Plan (SSP).
Step 2 – Select: Identify the security controls of the information system based on FIPS 200 and NIST SP 800-53 Revision 4 and document the security control descriptions in the SSP.
Step 3 – Implement: Implement the security controls and document the security control implementation descriptions in the SSP.
Step 4 – Assess: Assess the security controls against the security control implementation description. Security control assessments in support of initial and subsequent security authorizations must be conducted by independent assessors. The assessor documents issues, findings, and recommendations for the organization to put into a remediation plan.
Step 5 – Authorize: Provide the SSP and assessment results to the authorizing official to perform a risk-based decision whether to grant the system an ATO.
Step 6 – Monitor: Continually update the SSP, remediation plan, and other system documentation as a result of information system and environment changes, ongoing security assessments, ongoing remediation actions, key updates, security status reporting, and risk determination and acceptance.
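The FIPS 199 categorization in Step 1 follows a "high-water mark" rule: the system inherits the highest impact level assigned to confidentiality, integrity, or availability across all the information types it processes. A minimal sketch of that rule (the information types and impact ratings below are hypothetical, for illustration only):

```python
# High-water-mark categorization sketch per FIPS 199 / FIPS 200.
# Impact levels are ordered: low < moderate < high.
IMPACT_ORDER = {"low": 0, "moderate": 1, "high": 2}

def categorize(info_types):
    """Return the overall system impact level: the highest impact
    across confidentiality, integrity, and availability for every
    information type the system processes."""
    overall = "low"
    for ratings in info_types.values():
        for objective in ("confidentiality", "integrity", "availability"):
            level = ratings[objective]
            if IMPACT_ORDER[level] > IMPACT_ORDER[overall]:
                overall = level
    return overall

# Hypothetical information types for a sample system.
system = {
    "public_press_releases": {
        "confidentiality": "low", "integrity": "moderate", "availability": "low"},
    "employee_pii": {
        "confidentiality": "moderate", "integrity": "moderate", "availability": "low"},
}
print(categorize(system))  # moderate
```

The resulting level (low, moderate, or high) then drives Step 2, since it determines which NIST SP 800-53 control baseline the system must select.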
FISMA’s authorization process allows for an individual agency’s senior officials to authorize the information system. Agencies can require vendors to meet specific demands that are unique to the agency, and requirements for one agency may not be the same for another. That’s why some vendors carry many ATOs. Authorization end dates are influenced by Federal and organizational policies and by the requirements of authorizing officials that may establish maximum authorization periods.
Federal Risk and Authorization Management Program
FedRAMP launched in 2011 and requires that all Federal agencies that currently use or plan to use a cloud-based solution implement the FedRAMP program to assess the security risks associated with using a cloud environment. It involves four process areas modified from the NIST SP 800-37 Risk Management Framework[6]:
Document: The cloud service provider (CSP) must categorize the information system, select, implement, and document system security controls in the SSP and additional required documentation. The security controls requirements are based on NIST SP 800-53 Revision 4 and build on those required for FISMA authorization.
Assess: The CSP must contract an independent assessor to perform an assessment of the security controls. If pursuing a provisional ATO (P-ATO) from the Joint Authorization Board (JAB) or utilizing the CSP-supplied path, the organization must hire a third-party assessment organization (3PAO) to perform an independent assessment.
Authorize: Authorizing officials review the CSP’s SSP, associated documentation, and the completed independent assessment, together known as the FedRAMP assessment package, and make a risk-based decision whether or not to authorize the information system. There are three “paths” a CSP can pursue to achieve an authorization:
The Agency Sponsor path involves a Federal agency and the FedRAMP Project Management Office (PMO) reviewing the assessment package and the agency determining whether to provide the CSP with an ATO.
The JAB path involves the JAB and FedRAMP PMO reviewing the assessment package and determining whether to provide the CSP with a P-ATO. Federal agencies can then decide whether to grant the CSP an ATO.
The CSP-supplied path allows the CSP to provide its assessment package to the FedRAMP PMO for review; Federal agencies then review the package to determine whether they want to grant an ATO.
Once a CSP receives an ATO, it can be leveraged by other Federal agencies that want to utilize the cloud service.
Monitor: Once the JAB or agency grants the CSP a FedRAMP authorization, the CSP must implement continuous monitoring activities via ongoing assessment and authorization to ensure the cloud system maintains an acceptable risk posture.
The FedRAMP authorization process is the more rigorous of the two because it was designed to act as a one-stop shop for all agencies to get services from authorized cloud providers that fulfill the FedRAMP requirements. Generally speaking, for a moderate impact system, a FedRAMP assessor is mandated to assess 297 NIST SP 800-53 rev.4 security controls required by FedRAMP compared to the 261 NIST SP 800-53 rev.4 security controls required by FISMA. The FedRAMP-required security controls also include additional FedRAMP requirements and guidance, and FedRAMP assessors are required to follow specific guidance issued by the FedRAMP PMO for particular testing, such as penetration testing.
About the Author
Christina McGhee is a Manager at Schellman & Company, Inc., where she performs FedRAMP 3PAO assessments as well as integrated SOC 1 and SOC 2 examinations. Christina has experience performing SOC, Federal Information Security Management Act of 2002 (FISMA), and financial statement audits and assessments for civilian agencies and departments. Christina has also supported multiple large cloud service providers as they prepared for and went through the FedRAMP authorization process.
Catch up on some reading this weekend. Here are a few interesting items from around the Web.
A Cloud-Based Way for Lawmakers to Share with the Public
They are from Silicon Valley, they are certified “cool,” they are in 44 states, and now they have set their sights on the Federal space. OpenGov—venture capital-backed and fast-growing—offers cloud-based software-as-a-service for better managing government finances. And they offer a way for public officials to share with the public they represent, because the company is built around the same open data concepts embraced by the Obama administration and Congress.
Obama’s Precision Medicine Initiative Is The Ultimate Big-Data Project
The National Institutes of Health (NIH) announced the PMI Cohort Program, which will enroll at least 1 million people for a longitudinal study—one that tracks people’s health over many years—in order to learn about a variety of diseases. Vanderbilt University and Verily, Google’s big-data health spin-off, are being tapped to pilot the project, which aims to recruit its first 79,000 participants by the end of the year.
InQuisient Introduces FITARA Fast Track Tool for Acquisition Process Automation
The company said Wednesday FITARA Fast Track works to assist CIOs by providing an automated daily management tool to streamline the acquisition process. The tool also aims to function as a portfolio management tool that could tie budget information with the enterprise architecture and provide a daily dashboard for each user.
DEA Considers Expanding DARPA’s Big Data Program
The Drug Enforcement Administration and DARPA have been working together to create a big data processing system that boosts law enforcement capabilities while maintaining the security of classified information. DEA has been trying to get the final program off the ground since early 2014 and needs a third-party integrator to help move things forward and eliminate the project backlog.
Former NSA Director Michael Hayden’s new book, Playing To The Edge: American Intelligence In The Age Of Terror, raises some tough questions about the Obama administration’s lack of support for the agency in the days and weeks following the Snowden revelations.
“That most members [of Congress] who were most knowledgeable of what NSA was doing were supportive was heartening,” Hayden writes. “But there were members of the executive branch who were (or should have been) equally knowledgeable, and their silence was puzzling.”
Hayden does not shy away from pointing out who those senior administration officials were. “If the vice president made any public defense of NSA, I must have missed it. So, too, with the secretary of defense, for whom the director of NSA works,” writes Hayden. “The same applies to officials like the national security advisor and the secretary of state, who actually help set intelligence requirements and receive reports based on that tasking. Where did they think this stuff came from?”
FedRAMP Organizes Hasty Defense
Matt Goodrich and GSA’s Federal Risk and Authorization Management Program (FedRAMP) have been taking heavy fire since the publication last month of the Fix FedRAMP position paper by the FedRAMP Fast Forward Industry Advisory Group. That paper, which details major problems experienced by cloud service providers—problems that may spell the end of the program if they are not addressed in an honest and deliberate fashion—will be the subject of a major event March 3 on Capitol Hill hosted by the Cloud Computing Caucus Advisory Group.
But The Situation Report’s scouts are reporting significant activities behind the lines. Not only has GSA turned its back to the government and industry participants who contributed to the report by refusing the opportunity to participate in the March 3 discussion on Capitol Hill, but human intelligence indicates that at least one industry leader may have “overheard” a not-so-subtle premonitory muttering.
18F’s Unfair Advantage?
The General Services Administration’s in-house innovation lab, popularly known as 18F, announced Feb. 23 that it is expanding its consulting services to assist Federal agencies that issue grants for technology projects to state and local governments. My Tysons Corner listening post has picked up strong signals that GSA is crossing yet another line—solidifying the fact that the government has established its own IT consulting firm, creating a potential competition nightmare for industry.
“This seems to be crossing a line where the US govt is now directly competing w/ the private sector,” tweeted one beltway bandit.
“Yes, it’s always been a consulting firm inside a Govt agency,” responded another bandit and former Federal consulting firm CEO.
Others are questioning—yet again—whether or not GSA and 18F should even be in the business of consulting with state and local agencies when the Federal technological infrastructure is crumbling under their feet. This seems to be a common refrain when it comes to GSA. Does anybody remember the response to cloud.gov?
GSA’s shiny new tech innovators need to tread carefully. There’s a long history dating back to 1955 that clearly states “the Federal Government…will not start or carry on any commercial activity to provide a service or product for its own use if such product or service can be procured from private enterprise through ordinary business channels.”
Job Opening
18F may not think it competes with private industry as a government-subsidized consulting firm, but its job openings sure make it sound like a private sector entity. Take the latest opening for the agency’s new Director of Custom Partner Solutions, for example. Here are a few of the job requirements:
Ensure the fiscal health of this business unit, including business planning, resource and revenue projection, and business development.
1 year 24 days
Google sets this cookie under the DoubleClick domain, tracks the number of times users see an advert, measures the campaign's success, and calculates its revenue. This cookie can only be read from the domain they are currently on and will not track any data while they are browsing other sites.
Advertisement cookies are used to provide visitors with relevant ads and marketing campaigns. These cookies track visitors across websites and collect information to provide customized ads.
Cookie
Duration
Description
anj
3 months
AppNexus sets the anj cookie that contains data stating whether a cookie ID is synced with partners.
bcookie
1 year
LinkedIn sets this cookie from LinkedIn share buttons and ad tags to recognize browser IDs.
bscookie
1 year
LinkedIn sets this cookie to store performed actions on the website.
GoogleAdServingTest
session
Google sets this cookie to determine what ads have been shown to the website visitor.
IDE
1 year 24 days
Google DoubleClick IDE cookies store information about how the user uses the website to present them with relevant ads according to the user profile.
li_sugr
3 months
LinkedIn sets this cookie to collect user behaviour data to optimise the website and make advertisements on the website more relevant.
muc_ads
1 year 1 month 4 days
Twitter sets this cookie to collect user behaviour and interaction data to optimize the website.
personalization_id
1 year 1 month 4 days
Twitter sets this cookie to integrate and share features for social media and also store information about how the user uses the website, for tracking and targeting.
test_cookie
15 minutes
doubleclick.net sets this cookie to determine if the user's browser supports cookies.
uuid2
3 months
The uuid2 cookie is set by AppNexus and records information that helps differentiate between devices and browsers. This information is used to pick out ads delivered by the platform and assess the ad performance and its attribute payment.
VISITOR_INFO1_LIVE
5 months 27 days
YouTube sets this cookie to measure bandwidth, determining whether the user gets the new or old player interface.
YSC
session
Youtube sets this cookie to track the views of embedded videos on Youtube pages.
yt-remote-connected-devices
never
YouTube sets this cookie to store the user's video preferences using embedded YouTube videos.
yt-remote-device-id
never
YouTube sets this cookie to store the user's video preferences using embedded YouTube videos.
yt.innertube::nextId
never
YouTube sets this cookie to register a unique ID to store data on what videos from YouTube the user has seen.
yt.innertube::requests
never
YouTube sets this cookie to register a unique ID to store data on what videos from YouTube the user has seen.
_mkto_trk
1 year 1 month 4 days
This cookie, provided by Marketo, has information (such as a unique user ID) that is used to track the user's site usage. The cookies set by Marketo are readable only by Marketo.
__gpi
1 year 24 days
Google Ads Service uses this cookie to collect information about from multiple websites for retargeting ads.