Traditionally, Federal IT applications were managed through a set of disparate point tools, each requiring human involvement. As applications grew out of the traditional data center into private cloud – and subsequently into public and hybrid clouds – the old way of doing things could no longer scale to meet the needs of Federal agencies. […]

Government mandates are pushing agencies to accelerate cloud migration, requiring them to modernize applications to make them cloud ready while also consolidating data centers and connecting remote workers to new cloud environments. This is driving a massive increase in application data, which needs continuous monitoring and analysis. […]

Artificial intelligence (AI) and machine learning (ML) are driving breakthroughs in innovation, but Federal agencies struggle to keep up with the growth of data that these technologies produce. IT leaders’ frustration with their agency’s data management strategies is driving new demands for secure and efficient data storage and management. […]

Across every level of government and industry, employees expect constant and reliable access to the data, applications, and tools they use to complete daily tasks. The onset of the global pandemic only accelerated this trend. Government agencies and contractors have been working to arm their employees with the tools and resources needed to remain efficient. To solve remote work challenges and maintain office productivity, agencies can implement a GPU-accelerated Virtual Desktop Infrastructure (VDI). […]

Harness was created to help every company build and deliver software with the speed and quality of the world’s top software firms. Customers expect top-tier software – but most agencies don’t have the budget or the developer time and talent to execute. Enter Harness: your ticket to becoming an elite performer in software delivery. […]

Manufacturing for government projects requires stringent security to protect intellectual property. The onset of the global pandemic and remote work at scale forced organizations to move from physical workstations to a virtual workstation model to maintain productivity and security. […]

Today, the world is in the grip of a “digital revolution,” encompassing an explosion of emerging technologies, such as artificial intelligence (AI), robotics, nanotechnology, and more. This provides new opportunities and challenges for safeguarding data in a modernized environment. […]

Eighty percent of federal data today has a Geospatial Information Systems component. As data volumes and project sizes continue to snowball, federal teams need high-compute power at any location to process and access that data quickly and effectively. […]

Federal agencies have new opportunities to integrate emerging technology directly at the network’s edge, often in rugged environments. Accelerated processing at the edge is critical for AI-enabled applications – including disaster relief, intelligence gathering, and cyber security – where real-time insights are essential. Learn more about building the digital future with solutions designed for unique mission requirements. […]

Without a cohesive data strategy, your organization will be at a disadvantage in dramatically changing markets. To be successful, you need a data strategy that enables you to access all your core data wherever it resides. […]

Deliver critical data and AI services without the associated IT resources and expenses. IBM Cloud Pak for Data as a Service is an integrated data and AI platform, fully managed on the IBM Cloud. […]

ViON on Demand is a service that allows IT organizations to dynamically order and use IT infrastructure – server, storage, compute, and data center networking – as needed, scaling usage up and down to align with the organization’s unique requirements. Not only can customers have this IT infrastructure installed on-premises; ViON on Demand also allows for a high level of customization to suit the specific environment and need. […]

The Internal Revenue Service’s research unit has a suitably expansive agenda for big data. The tax agency pulls in voluminous data on revenue collection, refunds, and enforcement efforts. The task of the IRS Research, Analysis, and Statistics division is to sort through that data and help the tax agency make better decisions. The division’s responsibilities include econometric modeling, forecasting, and compliance studies. It also serves as IRS’ focal point for developing research databases and supporting infrastructure. The research group taps big data to support its activities. The data-driven approach promotes greater efficiency in resources used for tax administration, according to Jeff Butler, director of Research Databases within the IRS Research, Analysis, and Statistics division. […]

NASA’s Jet Propulsion Laboratory, like many large organizations, is taking on the Big Data problem: the task of analyzing enormous data sets to find actionable information. In JPL’s case, the job involves collecting and mining data from 22 spacecraft and 10 instruments including the Mars Science Laboratory’s Curiosity rover and the Kepler space telescope. Tom Soderstrom, IT chief technology officer at JPL, joked that his biggest Big Data challenge is more down to Earth: dealing effectively with his email inbox. But kidding aside, JPL now confronts Big Data as a key problem and a key opportunity. “If we define the Big Data era as beginning where our current systems are no longer effective, we have already entered this epoch,” Soderstrom explained. […]

For a tough big data challenge, look no further than the U.S. Postal Service (USPS). USPS faces a classic double whammy: the agency has to collect and crunch massive amounts of data, and tackle the job quickly. Speed is important as the agency aims to detect fraud, provide updates to customers tracking mail pieces, and respond to requests from regulators. The postal service has responded with an architectural approach designed to rapidly ingest and process data culled from thousands of sources throughout the postal enterprise. Scot Atkins, program manager, Supercomputing & Revenue Protection, USPS, has helped shape the postal service’s big data efforts. He cited the agency’s push for real-time processing as perhaps its biggest big data challenge. […]

MeriTalk sat down with Scott Pearson, director, big data solutions, Brocade to discuss the state of Big Data in the Federal government: What is the most interesting thing about Big Data? Who is driving Big Data adoption at agencies? What should IT keep in mind as they look to deliver Big Data solutions? […]

Big data has less to do with size and more to do with the growing recognition that data and analysis have a seemingly limitless potential for improving government and society. But data alone does not deliver value. Real value is created when government can bring together data – big or traditional – from multiple sources or locations, and present that information in a way that encourages exploration and insight. Qlik allows you to extend big data analytics to the edges of your agency. Read Qlik’s case study to learn more. […]

Data has become a critical advantage for information-driven agencies. By providing unprecedented access to actionable information, agencies can use data to better understand their operations, improve their services, and ultimately fulfill their mission requirements. To access this information, agencies need to be able to effectively operationalize data across their operations. This includes discovering and embedding past-, present-, and future-looking analytics into their end users’ workflow in order to move the metrics that matter. […]

Government data is growing and agencies are looking to leverage big data to support government mission outcomes. However, most agencies lack the data storage/access, computational power, and personnel they need to take advantage of the big data opportunity, according to a study by MeriTalk sponsored by NetApp. The report, “The Big Data Gap,” reveals that Federal IT professionals believe big data can improve government but that the promise of big data is locked away in unused or inaccessible data. […]

Unlike its predecessors, big data has emerged as more than just a new technology. It has proven to be one of the most promising yet challenging technologies for both government and industry. Before IT leaders can harness the full potential of big data, there are key issues to address surrounding infrastructure, storage, and training. MeriTalk surveyed 17 visionary big data leaders to find out what they see as the big data challenges and opportunities as well as how government can best leverage big data. […]

Despite the buzz, big data is a new concept and many state and local agencies are behind the curve. Although thought leaders are extolling the virtues and benefits of big data, the truth is that to capture the full potential of big data, state and local agencies need the ability to easily store and access data, robust computational power and software to manipulate the data, and trained personnel to analyze the data. MeriTalk surveyed state and local IT professionals to better understand the current data position for state and local agencies and to identify the current gap between the big data opportunity and reality. […]

The TechAmerica Big Data Commission’s recent report predicts the term “Big Data” will be forgotten in 10 years, but its principles will underpin society. MeriTalk surveyed 150 Federal IT executives to gauge if agencies are taking the steps needed to operationalize the Big Data opportunity. […]

This white paper, written by Jean Yan, program manager, Data.gov, U.S. General Services Administration, aims to promote leadership and collaboration in the era of big data. The white paper addresses why big data is a hot topic, how big data works, what the main differences between traditional data analytics and big data analytics are, and what the major challenges, risks, and limitations are. […]

State and local governments are deploying the Big Five – data center consolidation, mobility, security, big data, and cloud computing – but are they prepared? Download the report at: https://meritalk.com/bigfiveinoverdrive […]

MeriTalk’s recent study, “Balancing the Cyber Big Data Equation,” examines the symbiotic relationship between Big Data and cyber security. The report captures insight from 18 Federal cyber and Big Data leaders on this two-way street and explores what agencies are doing today, how they are balancing access and risk, and what’s next as technologies and policies mature. Download the report at: https://meritalk.com/Balancing-Cyber-BigData […]

In this blog from Joyce Hunter, deputy CIO for policy and planning, United States Department of Agriculture (USDA), learn more about the potential for direct access for food safety and food sourcing information through open data. […]

In this blog from Joyce Hunter, deputy CIO for policy and planning, United States Department of Agriculture (USDA), learn more about USDA’s support of the Open Data Policy and the recently launched Food, Agriculture, and Rural virtual community on Data.gov. […]

MeriTalk’s study, “The Big Data Cure,” examines the current and future landscape of Federal healthcare and healthcare research in relation to Big Data. Is Big Data the cure? Download the report to learn more: https://meritalk.com/bigdatacure […]

Although the American economy has stabilized, Capitol Hill is still closely examining the spending and budgets of government agencies with an eye toward program cuts wherever possible. […]

On October 3, 2012, NASA, the Department of Energy, and the National Science Foundation announced a competition to develop new ways to capitalize on the influx of big data in the Federal government. […]