News

How can public administrations harness the value of data? This question is not a simple one to address: the answer depends on several factors, including the context of each administration, the data available to it and the specific objectives it has set.

However, there are reference guides that can help define a path to action. One of them is the Data Innovation Toolkit, published by the European Commission through the EU Publications Office, which serves as a strategic compass for navigating this complex data innovation ecosystem.

This tool is more than a simple manual, as it includes templates that make the process easier to implement. Aimed at a variety of profiles, from novice analysts to experienced policy makers and technology innovators, the Data Innovation Toolkit is a useful resource that accompanies you through the process step by step.

It aims to democratise data-driven innovation by providing a structured framework that goes beyond the mere collection of information. In this post, we analyse the contents of the European guide, as well as the resources it provides for making good, innovative use of data.

A structure covering the data lifecycle

The guide is organised into four main steps, which address the entire data lifecycle.

  1. Planning

The first part of the guide focuses on establishing a strong foundation for any data-driven innovation project. Before embarking on any process, it is important to define objectives. To do so, the Data Innovation Toolkit calls for careful reflection to align the specific needs of the project with the strategic objectives of the organisation. Stakeholder mapping is also key in this step: it implies a thorough understanding of the interests, expectations and possible contributions of each actor involved. This understanding makes it possible to design engagement strategies that maximise collaboration and minimise potential conflicts.

To create a proper data innovation team, we can use a RACI matrix (Responsible, Accountable, Consulted, Informed) to define precise roles and responsibilities. It is not just about bringing professionals together, but about building multidisciplinary teams where each member understands their exact role and contribution to the project; a minimal example of such a matrix appears after the list below. To assist in this task, the guide provides:

  • Challenge definition tool: to identify and articulate the key issues the project seeks to address, summarising them in a single statement.
  • Stakeholder mapping tool: to visualise the network of individuals and organisations involved, assessing their influence and interests.
  • Team definition tool: to make it easier to identify people in your organisation who can help you.
  • Role definition tool: once the necessary profiles have been defined, to determine their responsibilities and role in the data project in more detail, using a RACI matrix.
  • Persona definition tool: personas are behavioural archetypes that describe specific types of users. This tool helps to create these detailed profiles, which represent the users or clients who will be involved in the project.
  • Data journey mapping tool: to create a concise representation describing, step by step, how a user interacts with their data. The process is represented from the user's perspective, describing what happens at each stage of the interaction and the touchpoints involved.
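
As an illustration of how a RACI matrix could be recorded in practice, here is a minimal Python sketch; the activities, roles and assignments are hypothetical and are not taken from the toolkit's templates.

```python
# Minimal sketch of a RACI matrix for a hypothetical data innovation project.
# R = Responsible, A = Accountable, C = Consulted, I = Informed.

raci_matrix = {
    "Define the challenge statement": {"Project lead": "A", "Data analyst": "R", "Legal advisor": "C", "Service managers": "I"},
    "Collect and clean the data":     {"Project lead": "A", "Data analyst": "R", "Legal advisor": "I", "Service managers": "C"},
    "Analyse and share the results":  {"Project lead": "A", "Data analyst": "R", "Legal advisor": "C", "Service managers": "I"},
}

def print_raci(matrix):
    """Print each activity with the RACI letter assigned to every role."""
    for activity, assignments in matrix.items():
        summary = ", ".join(f"{role}: {letter}" for role, letter in assignments.items())
        print(f"{activity} -> {summary}")

print_raci(raci_matrix)
```
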
  2. Collection and processing

Once the team has been set up and the objectives have been identified, the data is classified in a way that goes beyond the traditional division between quantitative and qualitative.

Quantitative data:

  • Discrete data, such as the number of complaints about a public service, represent not only a number but an opportunity to systematically identify areas for improvement. They allow administrations to map recurrent problems and design targeted interventions.

  • Continuous data, such as response times for administrative procedures, provide a snapshot of operational efficiency. It is not just a matter of measuring, but of understanding the factors that influence the variability of these times and designing more agile and efficient processes.

Qualitative data:

  • Nominal data (names or labels) enable the categorisation of public services, allowing for a more structured understanding of the diversity of administrative interventions.

  • Ordinal data (ordered categories), such as satisfaction ratings, become a prioritisation tool for continuous improvement.
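
To make these four data types more concrete, here is a small Python sketch using pandas; the column names and values are invented for illustration and do not come from the guide.

```python
import pandas as pd

# Hypothetical records from a public service, one row per administrative procedure.
df = pd.DataFrame({
    "complaints": [0, 2, 1, 5],                                   # quantitative, discrete (counts)
    "response_time_days": [3.5, 12.0, 7.25, 1.0],                 # quantitative, continuous (measurements)
    "service_type": ["permits", "taxes", "permits", "registry"],  # qualitative, nominal (labels)
    "satisfaction": ["low", "high", "medium", "high"],            # qualitative, ordinal (ordered levels)
})

# Declaring satisfaction as an ordered categorical lets us sort and compare levels.
df["satisfaction"] = pd.Categorical(df["satisfaction"],
                                    categories=["low", "medium", "high"], ordered=True)

print(df.dtypes)
print(df.sort_values("satisfaction"))
```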

The document provides a series of checklists and templates to review this aspect:

  • Data gaps checklist: to identify whether there are any gaps in the data to be used and, if so, how to fill them.
  • Data collection template: to align the dataset with the objective of the innovative analysis.
  • Data collection checklist: to ensure access to the data sources needed to run the project.
  • Data quality checklist: to review the quality level of the dataset.
  • Data processing checklist: to check that data is processed securely, efficiently and in compliance with regulations.
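
As an informal illustration of the kind of checks the data quality checklist points to, the sketch below runs a few basic validations on an invented dataset; the fields and rules are assumptions for the example only.

```python
import pandas as pd

# Hypothetical dataset with a deliberate gap, a duplicate ID and an invalid value.
df = pd.DataFrame({
    "procedure_id": [101, 102, 102, 104],
    "response_time_days": [3.5, None, 7.25, -1.0],
})

def quality_report(data: pd.DataFrame) -> dict:
    """Run a few basic, illustrative quality checks: completeness, duplicates, valid ranges."""
    return {
        "missing_values_per_column": data.isna().sum().to_dict(),
        "duplicate_ids": int(data["procedure_id"].duplicated().sum()),
        "negative_response_times": int((data["response_time_days"] < 0).sum()),
    }

print(quality_report(df))
```
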
  3. Sharing and analysis

At this point, the Data Innovation Toolkit proposes four analysis strategies that transform data into actionable knowledge.

  1. Descriptive analysis: goes beyond the simple visualisation of historical data, allowing the construction of narratives that explain the evolution of the phenomena studied.
  2. Diagnostic analysis: delves deeper into the investigation of causes, unravelling the hidden patterns that explain the observed behaviours.
  3. Predictive analysis: becomes a strategic planning tool, allowing administrations to prepare for future scenarios.
  4. Prescriptive analysis: goes a step further, not only projecting trends, but recommending concrete actions based on data modelling.
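
To see how the descriptive and predictive levels differ in practice, here is a minimal Python sketch. The monthly figures are invented for illustration, and the linear trend is only a stand-in for the more sophisticated models a real project would use.

```python
import numpy as np

# Hypothetical monthly counts of service requests over one year.
requests = np.array([120, 132, 128, 140, 155, 149, 160, 172, 168, 181, 190, 198])

# Descriptive analysis: summarise what has already happened.
print("mean per month:", requests.mean())
print("total growth over the year:", requests[-1] - requests[0])

# Predictive analysis (very simplified): fit a linear trend and project the next quarter.
months = np.arange(len(requests))
slope, intercept = np.polyfit(months, requests, 1)
forecast = [slope * m + intercept for m in range(len(requests), len(requests) + 3)]
print("projected next three months:", [round(v) for v in forecast])
```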

In addition to analysis, the ethical dimension is fundamental. The guide therefore sets out strict protocols to ensure secure data transfers, regulatory compliance, transparency and informed consent. In this section, the following templates and checklist are provided:

  • Data sharing template: to ensure secure, legal and transparent sharing.
  • Data sharing checklist: to perform all the steps needed to share data securely and ethically while achieving the defined objectives.
  • Data analysis template: to conduct a proper analysis and obtain insights that are useful and meaningful for the project.
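
One of the precautions behind these templates, sharing data without exposing direct identifiers, can be illustrated with a short Python sketch. The field names, salt and hashing approach below are assumptions for the example and are not prescribed by the toolkit.

```python
import hashlib
import pandas as pd

# Hypothetical citizen records; the national ID must not leave the organisation in clear text.
records = pd.DataFrame({
    "national_id": ["X1234567", "Y7654321"],
    "service_type": ["permits", "taxes"],
})

SALT = "replace-with-a-secret-salt"  # assumption: a secret value kept outside the shared file

def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a salted hash so rows stay linkable but are harder to re-identify."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:16]

records["national_id"] = records["national_id"].map(pseudonymise)
print(records)
```
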
  4. Use and evaluation

The last stage focuses on converting the insights into real actions. The communication of results, the definition of key performance indicators (KPIs), impact measurement and scalability strategies become tools for continuous improvement.
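
As an illustration of how such KPIs could be made operational, the snippet below compares two invented indicators against assumed targets; neither the indicator names nor the thresholds come from the toolkit.

```python
# Hypothetical before/after measurements for a data-driven improvement project.
baseline = {"avg_response_days": 12.0, "satisfaction_rate": 0.61}
current  = {"avg_response_days": 9.5,  "satisfaction_rate": 0.68}
targets  = {"avg_response_days": 8.0,  "satisfaction_rate": 0.75}

for kpi, target in targets.items():
    change = current[kpi] - baseline[kpi]
    # For response time, lower is better; for satisfaction, higher is better.
    met = current[kpi] <= target if kpi == "avg_response_days" else current[kpi] >= target
    print(f"{kpi}: baseline={baseline[kpi]}, current={current[kpi]}, "
          f"change={change:+.2f}, target met: {met}")
```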

A collaborative resource under continuous improvement

In short, the toolkit offers a comprehensive transformation: from evidence-based decision making to personalising public services, increasing transparency and optimising resources. You can also consult the checklists and tools available for this stage:

  • Checklist for data use: to review that the data and the conclusions drawn are used in an effective, accountable and goal-oriented manner.
  • Data innovation KPI tool: to define the KPIs that will measure the success of the process.
  • Impact measurement and success evaluation tools: to assess the success and impact of the innovation in the data project.
  • Data innovation scalability plan: to identify strategies to scale the project effectively.

In addition, this repository of data innovation resources is a dynamic catalogue of knowledge that includes expert articles, implementation guides, case studies and learning materials.

You can access the list of materials provided by the Data Innovation Toolkit here.

You can even contact the development team if you have any questions or would like to contribute to the repository.

To conclude, harnessing the value of data with an innovative perspective is not a magic leap, but a gradual and complex process. On this path, the Data Innovation Toolkit can be useful as it offers a structured framework. Effective implementation will require investment in training, cultural adaptation and long-term commitment.

Event

Events are a perfect way to catch up on topics we have been meaning to learn about. Since this year we have to maintain social distancing, one of the best options is online seminars, also known as webinars. The success of this format lies in the fact that most of the content is free and can be followed remotely. Thanks to webinars, it is possible to take part in large conferences or small talks from the comfort of our own computers.

These digital events are promoted by both companies and public institutions. For example, the European Commission has launched two interesting ones:

  • Inspire 2020 Conference. Under the theme “Bringing sustainability and digitalization together”, European experts will discuss how digitalization can help build a more sustainable Europe, also analysing the environmental, economic and social risks it entails. The event runs from June 3 to 11.
  • Empower your city with data. The European Commission is conducting a series of webinars on Context Broker, a standard API that allows users to collect, integrate and contextualize data in real time, and on the Big Data Test Infrastructure (BDTI), a free testing infrastructure that offers virtual environment templates to explore and experiment with various data sources, software tools and Big Data techniques. The first two sessions have already been held (the recordings are available on the web), but you still have time to join the next two webinars, on June 4 and 18. A minimal query sketch for the Context Broker API follows this list.
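
For readers curious about what querying a Context Broker looks like, here is a minimal, assumed Python sketch against an instance exposing the NGSI-v2 API; the URL, entity type and attribute handling are placeholders rather than values from the webinar series.

```python
import requests

# Hypothetical Context Broker endpoint; replace with a real instance's URL.
BASE_URL = "http://localhost:1026"

# NGSI-v2 query: retrieve up to 5 entities of an assumed type "AirQualityObserved".
response = requests.get(
    f"{BASE_URL}/v2/entities",
    params={"type": "AirQualityObserved", "limit": 5},
    headers={"Accept": "application/json"},
)
response.raise_for_status()

# Each entity is a JSON object with "id", "type" and attribute objects holding a "value".
for entity in response.json():
    attributes = {k: v.get("value") for k, v in entity.items() if isinstance(v, dict)}
    print(entity["id"], attributes)
```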

In addition, many companies are taking advantage of new technologies to share their knowledge through various talks. This interest from companies highlights the great business opportunities behind data. Here are some examples:

  • Data sharing and AI innovation. Every Thursday in June, the teams from IBM Research and IBM Data Science and IT organize an exchange of ideas and discussions on Artificial Intelligence. Experts and researchers from IBM Data and AI will share new approaches, techniques and perspectives to facilitate AI-driven automation, prediction and data optimization. The seminars are fully open to questions, so you can interact and chat with the experts.
  • What is the future of data strategy? This seminar on the different processes of data management will be held on June 25. The goal is for attendees to learn about the upcoming trends that will change the world of data, with a focus on data visualization.
  • CxO to CxO on scaling AI for growth and innovation. Michael Murray, president and director of Wunderman Thompson Data, together with Seth Dobrin, vice president of Data and AI at IBM, will explore in this online seminar the future prospects of Artificial Intelligence and the exponential growth of these new technologies. The event is already available; you just have to register to watch it.
  • The future of Data Management. At this event by analyst firm Gartner, the future of the data management market will be discussed extensively. Aimed at companies, it will show how they should plan and organize to become a data-driven organization and stay ahead of the competition. As in the previous case, the event is already available upon registration.

This is just a small selection of content. Do you know of, or are you organizing, a webinar on data and new technologies? Tell us in the comments.

Data science and Artificial Intelligence remain at the forefront, offering models and predictions that help us understand both the business and the social world. Thanks to these webinars, we can see how both are making their way into our day-to-day lives at a dizzying pace.
