For a few years now, local, state, and federal governments have been embracing agile software development and integrating DevOps into their work. These concepts have grown in popularity as governments at every level have sought to replace legacy systems or develop new ones. Over the same period, more and more governments have begun to leverage data as a strategic asset, and the role of Chief Data Officer (CDO) has grown year over year in governments of all shapes and sizes. The CDO role is still not well defined in government: some CDOs focus more on analytics, others on data management, and most fall somewhere in between. Curiously, though, these initiatives have generally not converged.
More often than not, the vision for data and analytics in government begins and ends with a “dashboard.” Often this means that decisions about how to produce such a dashboard are made up front. The result is a perception that there is a need for a technology solution that doesn’t already exist, which in turn means there’s a need to purchase something that hasn’t been budgeted for. Ultimately, business users simply want efficient access to the data they need to do their jobs. IT organizations, however, are often more interested in initiatives around data governance and master data management, which seek to prevent competing copies of data and ensure data are of a quality suitable for analysis. Within government this creates enough friction that far more time and effort goes into making data suitable and available for analysis than into actually using the data. These are well-intentioned and necessary initiatives, but they take time to mature and generally do not need to occur in isolation. Ultimately, once command-and-control structures create limitations for users, users will seek to avoid them at all costs.
DataOps is a fairly new approach to these issues that is gaining momentum in the private sector. DataOps seeks to promote communication and integration of data, teams, and systems. It is intended to connect everyone involved in the data pipeline, from the people collecting or entering data all the way to the people actually using data for decision-making purposes. By connecting the people, processes, and technology related to data using Lean and agile methodologies, DataOps can allow organizations to start leveraging data for insights now, while simultaneously addressing data quality and other concerns. DataOps should start with a seemingly simple use case or question, such as “How many clients do we serve?” Answer that question easily and repeatably with the tools and technology currently available, and scale from there. In government, DataOps must first focus on process-oriented thinking. If data quality is an issue, where in the data pipeline is the most efficient place to fix it? If data access is the issue, what is preventing access and how do we address it? Once those issues are addressed, consider how new software or technology adds value.
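To make the “start simple, with tools you already have” idea concrete, here is a minimal sketch of answering a question like “How many clients do we serve?” as a small, repeatable script rather than a one-off dashboard. It uses only Python’s standard library; the table and column names (`services`, `client_id`, `active`) are illustrative assumptions, not drawn from any real agency system.

```python
import sqlite3

def count_active_clients(conn: sqlite3.Connection) -> int:
    """Return the number of distinct clients currently being served.

    Encapsulating the question in a function makes the answer
    repeatable: run it weekly, put it in a report, or build on it
    when the next question arrives.
    """
    row = conn.execute(
        "SELECT COUNT(DISTINCT client_id) FROM services WHERE active = 1"
    ).fetchone()
    return row[0]

if __name__ == "__main__":
    # Stand-in data so the sketch runs end to end; in practice this
    # would be a connection to an existing agency database.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE services (client_id INTEGER, active INTEGER)")
    conn.executemany(
        "INSERT INTO services VALUES (?, ?)",
        [(1, 1), (1, 1), (2, 1), (3, 0)],
    )
    print(count_active_clients(conn))  # two distinct active clients
```

The point is not the specific query but the habit: one question, answered the same way every time, with software already on hand.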
The goal should be the continuous delivery of insight through the efficient use of data. DataOps is thus a process of continuous improvement in which organizations evaluate and measure impact, ensuring that the questions they answer actually add value to the organization. DataOps is more about culture change in government than it is about technological change.
The following is a series of principles we have developed to implement DataOps in Connecticut:
• Everyone involved in the data pipeline must know how data are being used within the organization. The person collecting or entering data must know that it will be aggregated and analyzed down the road, why, and for what purpose.
• Data must be easily reusable.
• Everyone is equally valuable
• It is interdisciplinary: working with colleagues from different backgrounds allows us to harness the work in those fields.
• Data and analytics are a process, not a product. DataOps must focus on process-thinking aimed at continuous improvement in both data quality and analytics quality, ultimately leading to organizational and functional improvements and outcomes.
• Frequent, face-to-face communication is a requirement
• Deliver simple, incremental insight first – Don’t set out to develop a dashboard or reproduce a report. Answer simple questions first, then build on those.
• Use existing tools first – There’s no need to purchase additional software or special tools until you’ve determined how they add value to your data analytics process
• Frequent reflection and feedback to all team members is critical. Be positive when things go well; be constructive when things need improvement.
• Be open: the decisions we make using data affect people. Thus we must, to the extent legally possible, make the data, analysis, and methods we use accessible and reproducible.
Overall, implementing DataOps is a long but necessary journey; but so are more traditional approaches like imposing standards, data governance, and master data management. The difference with DataOps is that organizations can get started now. Starting with a single question that can be answered easily and repeatably demonstrates immediate value, which leads to the next question. This isn’t a panacea, and it’s not the one solution for all data processes, but it’s a way to get started without spending a dollar, and it can deliver real value.