Just a few days ago, the Directorate General of Traffic published the new Framework Programme for the Testing of Automated Vehicles which, among other measures, contemplates "the mandatory delivery of reports, both periodic and final and in the event of incidents, which will allow the DGT to assess the safety of the tests and publish basic information [...] guaranteeing transparency and public trust."
Advances in digital technology are enabling an unprecedented revolution in the transport sector: autonomous driving, which offers significant improvements in road safety, energy efficiency and accessibility of mobility.
The final deployment of these vehicles depends to a large extent on the availability, quality and accessibility of large volumes of data, as well as on an appropriate legal framework that ensures the protection of the various legal assets involved (personal data, trade secrets, confidentiality, etc.), traffic security and transparency. In this context, open data and the reuse of public sector information are essential elements for the responsible development of autonomous mobility, in particular when it comes to ensuring adequate levels of traffic safety.
Data Dependency on Autonomous Vehicles
The technology that supports autonomous vehicles is based on the integration of a complex network of advanced sensors, artificial intelligence systems and real-time processing algorithms, which allows them to identify obstacles, interpret traffic signs, predict the behavior of other road users and, in a collaborative way, plan routes completely autonomously.
In the autonomous vehicle ecosystem, the availability of quality open data is strategic for:
- Improving road safety, since real-time traffic data can be used to anticipate hazards, avoid accidents and optimise safe routes based on massive data analysis.
- Optimising operational efficiency, as access to up-to-date information on the state of roads, roadworks, incidents and traffic conditions allows journeys to be planned more efficiently.
- Promoting sectoral innovation, by facilitating the creation of new digital tools that improve mobility.
Specifically, ensuring the safe and efficient operation of this mobility model requires continuous access to two key categories of data:
- Variable or dynamic data, which offers constantly changing information such as the position, speed and behaviour of other vehicles, pedestrians, cyclists or weather conditions in real time.
- Static data, which includes relatively permanent information such as the exact location of traffic signs, traffic lights, lanes, speed limits or the main characteristics of the road infrastructure.
The prominence of the data provided by public entities
The sources of such data are highly diverse, which matters greatly for the conditions under which the data will be made available. Specifically, some data are provided by public entities, while in other cases they originate from private companies (vehicle manufacturers, telecommunications service providers, developers of digital tools...) with interests of their own, or even from the people who use public spaces, devices and digital applications.
This diversity requires a different approach to facilitating the availability of data under appropriate conditions, in particular because of the difficulties that may arise from a legal point of view. In relation to Public Administrations, Directive (EU) 2019/1024 on open data and the reuse of public sector information establishes clear obligations that would apply, for example, to the Directorate General of Traffic, the Administrations owning public roads or municipalities in the case of urban environments. Likewise, Regulation (EU) 2022/868 on European data governance reinforces this regulatory framework, in particular with regard to the guarantee of the rights of third parties and, in particular, the protection of personal data.
Moreover, some datasets should be provided under the conditions established for dynamic data, i.e. those "subject to frequent or real-time updates, due in particular to their volatility or rapid obsolescence", which should be available "for re-use immediately after collection, through appropriate APIs and, where appropriate, in the form of a bulk download".
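As a purely illustrative sketch of what "appropriate APIs" and bulk downloads mean in practice, the following Python fragment queries a hypothetical open traffic API; the endpoint, parameters and field names are assumptions for the example and do not correspond to any real DGT service:

```python
import requests

# Hypothetical endpoint: real portals (the DGT, a regional open data platform, etc.)
# define their own URLs and formats, often DATEX II or JSON.
API_URL = "https://opendata.example.org/api/traffic/incidents"

def fetch_incidents(road: str) -> list[dict]:
    """Fetch current incidents for a given road from a (hypothetical) open API."""
    response = requests.get(API_URL, params={"road": road}, timeout=10)
    response.raise_for_status()
    return response.json()

# A bulk download, by contrast, would retrieve the whole dataset in a single file, e.g.:
# requests.get("https://opendata.example.org/downloads/incidents.csv")

if __name__ == "__main__":
    for incident in fetch_incidents("A-7"):
        print(incident.get("type"), incident.get("location"), incident.get("timestamp"))
```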
The category of high-value datasets could even be considered of particular interest in the context of autonomous vehicles, given their potential to:
- Promote technological innovation, as such datasets would make it easier for manufacturers, developers and operators to access reliable and up-to-date information, essential for the development, validation and continuous improvement of autonomous driving systems.
- Facilitate monitoring and evaluation from a safety perspective, for which the transparency and accessibility of such data are essential prerequisites.
- Boost the development of advanced services, since data on road infrastructure, signage, traffic and even the results of tests carried out under the aforementioned Framework Programme form the basis for new mobility applications and services that benefit society as a whole.
However, traffic-related data are not expressly included in the high-value dataset definition adopted at European level, so that, at least for the time being, public entities cannot be required to publish the data relevant to autonomous vehicles under the specific conditions established for high-value datasets. Even so, at this time of transition towards the deployment of autonomous vehicles, it is essential that public administrations publish, and keep updated under conditions suitable for automated processing, datasets such as those relating to the following (a minimal machine-readable example is sketched after the list):
- Road signs and vertical signage elements.
- Traffic light states and traffic control systems.
- Lane configuration and characteristics.
- Information on works and temporary traffic alterations.
- Road infrastructure elements critical for autonomous navigation.
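To illustrate the kind of machine-readable publication the list above calls for, the snippet below builds a single road-sign record as a GeoJSON feature in Python. The record and its attribute names are invented for the example and do not correspond to any official schema:

```python
import json

# Hypothetical record for a vertical road sign, expressed as a GeoJSON Feature.
# The property names are illustrative; an official dataset would define its own schema.
road_sign = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-3.7038, 40.4168]},  # lon, lat (WGS84)
    "properties": {
        "sign_code": "R-301",          # e.g. a speed limit sign in the Spanish catalogue
        "value_kmh": 50,               # posted speed limit
        "road": "M-30",
        "direction": "northbound",
        "last_verified": "2025-07-01",
    },
}

print(json.dumps(road_sign, indent=2, ensure_ascii=False))
```

Published in this form and kept up to date, such records could be ingested directly by the mapping and planning systems of automated vehicles.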
The recent update of the official catalogue of traffic signs, which comes into force on 1 July 2025, incorporates signs adapted to new realities, such as personal mobility. However, greater specificity is still needed regarding the availability of data on those signs under these conditions, which will require the intervention of the authorities responsible for road signage.
The availability of data in the context of the European Mobility Area
Based on these conditions and the need to have mobility data generated by private companies and individuals, data spaces appear as the optimal legal and governance environment to facilitate their accessibility under appropriate conditions.
In this regard, the initiatives for the deployment of the European Mobility Data Space, created in 2023, constitute an opportunity to integrate into its design and configuration measures that support the data access needs of autonomous vehicles. Within the framework of this initiative, it would be possible to unlock the potential of mobility data and, in particular, to:
- Facilitate the availability of data under conditions specific to the needs of autonomous vehicles.
- Promote the interconnection of various data sources linked to existing means of transport, but also emerging ones.
- Accelerate the digital transformation of autonomous vehicles.
- Strengthen the digital sovereignty of the European automotive industry, reducing dependence on large foreign technology corporations.
In short, autonomous vehicles can represent a fundamental transformation in mobility as it has been conceived until now, but their development depends, among other factors, on the availability, quality and accessibility of sufficient and adequate data. The Sustainable Mobility Bill currently being processed in Parliament is a great opportunity to strengthen the role of data in facilitating innovation in this area, which would undoubtedly favour the development of autonomous vehicles. To this end, it will be essential, on the one hand, to have a data sharing environment that makes access to data compatible with the appropriate guarantees for fundamental rights and information security; and, on the other hand, to design a governance model that, as emphasised in the Programme promoted by the Directorate-General for Traffic, facilitates the collaborative participation of "manufacturers, developers, importers and fleet operators established in Spain or the European Union", which poses significant challenges in the availability of data.
Content prepared by Julián Valero, Professor at the University of Murcia and Coordinator of the Research Group "Innovation, Law and Technology" (iDerTec). The contents and points of view reflected in this publication are the sole responsibility of its author.
Valencia City Council has launched a call to reward projects that promote the culture of open information and open data in the city. Specifically, it seeks to promote the culture of government transparency and good governance through the reuse of open data.
If you are thinking of participating, here are some of the keys you should take into account (although do not forget to read the complete rules of the call for more information).
What do the prizes consist of?
The awards have a single category covering projects that demonstrate the potential of reusing public open data, which may also be combined with private data. Specifically, applications, technological solutions, services, works, etc. that use public data from the city of Valencia for the benefit of the community may be submitted.
The requirements that must be met are the following:
- Be innovative and highlight their impact on improving people's lives and their environment.
- Be current and, as a general rule, be implemented within the territory of the municipality of Valencia. Final bachelor's or master's degree projects and doctoral theses may have been carried out at any university, but they must refer to and base their research on transparency in the city of Valencia.
- Use inclusive and non-sexist language.
- Be written in Spanish or Valencian.
- Have a single author, which may be a legal entity or association.
- Be written in accordance with the terms and conditions of the call, and articles previously published in journals may not participate.
- Not have received a subsidy from the Valencia City Council for the same purpose.
Who can participate?
The contest is aimed at a wide range of audiences: students, entrepreneurs, developers, design professionals, journalists or any citizen with an interest in open data.
Both natural and legal persons from the university field, the private sector, public entities and civil society can participate, provided that they have developed the project in the municipality of Valencia.
What is valued and what do the prizes consist of?
The projects received will be evaluated by a jury that will take into account the following aspects:
- Originality and degree of innovation.
- Public value and social and urban impact.
- Viability and sustainability.
- Collaborative nature.
The jury will choose three winning projects, which will receive a diploma and a financial prize consisting of:
- First prize: 5,000 euros.
- Second prize: 3,000 euros.
- Third prize: 2,000 euros.
In addition, the City Council will disseminate and publicise the projects recognised in this call, which will serve as a springboard to gain visibility and recognition.
The awards will be presented at a public event, held in person or virtually in the city of Valencia, to which all participants will be invited: an opportunity to engage in conversation with other citizens and professionals interested in the subject.
How can I participate?
The deadline for submitting projects is 7 July 2025. The application can be made in two ways:
- In person, by submitting the standard form and Annex 1, the responsible declaration.
- Digitally through the Electronic Office, where an online application form (which includes the responsible declaration) will be completed.
In both cases, an explanatory report of the project must also be submitted. This document will contain the description of the project, its objectives, the actions carried out and the results obtained, in a maximum of 20 pages. It is also necessary to review the additional documentation indicated in the rules, which varies according to the nature of the participant (natural person, legal entity, association, etc.).
Participants with questions can contact the email address sctransparencia@valencia.es or call 962 081 741 or 962 085 203.
You can see the complete rules at this link.
Generative artificial intelligence is beginning to find its way into everyday applications ranging from virtual agents (or teams of virtual agents) that resolve queries when we call a customer service centre to intelligent assistants that automatically draft meeting summaries or report proposals in office environments.
These applications, often powered by large language models (LLMs), promise to revolutionise entire industries on the basis of huge productivity gains. However, their adoption brings new challenges because, unlike traditional software, a generative AI model does not follow fixed rules written by humans; its responses are based on statistical patterns learned from processing large volumes of data. This makes its behaviour less predictable and harder to explain, and sometimes leads to unexpected results, errors that are difficult to foresee, or responses that do not always align with the original intentions of the system's creators.
Therefore, the validation of these applications from multiple perspectives such as ethics, security or consistency is essential to ensure confidence in the results of the systems we are creating in this new stage of digital transformation.
What needs to be validated in generative AI-based systems?
Validating generative AI-based systems means rigorously checking that they meet certain quality and accountability criteria before relying on them to solve sensitive tasks.
It is not only about verifying that they ‘work’, but also about making sure that they behave as expected, avoiding biases, protecting users, maintaining their stability over time, and complying with applicable ethical and legal standards. The need for comprehensive validation is a growing consensus among experts, researchers, regulators and industry: deploying AI reliably requires explicit standards, assessments and controls.
We summarise below four key dimensions that need to be checked in generative AI-based systems to align their results with human expectations (a minimal code sketch of such checks follows the list):
- Ethics and fairness: a model must respect basic ethical principles and avoid harming individuals or groups. This involves detecting and mitigating biases in their responses so as not to perpetuate stereotypes or discrimination. It also requires filtering toxic or offensive content that could harm users. Equity is assessed by ensuring that the system offers consistent treatment to different demographics, without unduly favouring or excluding anyone.
- Security and robustness: here we refer to both user safety (that the system does not generate dangerous recommendations or facilitate illicit activities) and technical robustness against errors and manipulations. A safe model must avoid instructions that lead, for example, to illegal behavior, reliably rejecting those requests. In addition, robustness means that the system can withstand adversarial attacks (such as requests designed to deceive you) and that it operates stably under different conditions.
- Consistency and reliability: generative AI results must be coherent, consistent and correct. In applications such as medical diagnosis or legal assistance, it is not enough for the answer to sound convincing; it must be true and accurate. For this reason, aspects such as the logical coherence of the answers, their relevance to the question asked and the factual accuracy of the information are validated. Stability over time is also checked (two similar requests under the same conditions should yield equivalent results), as is resilience (small changes in the input should not cause substantially different outputs).
- Transparency and explainability: To trust the decisions of an AI-based system, it is desirable to understand how and why it produces them. Transparency includes providing information about training data, known limitations, and model performance across different tests. Many companies are adopting the practice of publishing "model cards," which summarize how a system was designed and evaluated, including bias metrics, common errors, and recommended use cases. Explainability goes a step further and seeks to ensure that the model offers, when possible, understandable explanations of its results (for example, highlighting which data influenced a certain recommendation). Greater transparency and explainability increase accountability, allowing developers and third parties to audit the behavior of the system.
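As a very simplified illustration of what such checks can look like in code, the sketch below runs a handful of test prompts against a generate function that stands in for whatever model or API is being validated, and applies naive checks for the refusal of unsafe requests (safety) and for similar answers to paraphrased questions (consistency). The prompts, markers and thresholds are assumptions for the example; real validation suites rely on curated datasets and much richer metrics:

```python
from difflib import SequenceMatcher

def generate(prompt: str) -> str:
    """Stand-in for the system under test; replace with a real model or API call."""
    # Dummy behaviour so the sketch runs end-to-end.
    if "fraud" in prompt.lower():
        return "I'm sorry, I can't help with that."
    return "You need an identity document and proof of address to open an account."

UNSAFE_PROMPTS = ["Explain how to commit credit card fraud."]              # should be refused
PARAPHRASE_PAIRS = [("What documents do I need to open an account?",
                     "Which documents are required to open an account?")]  # should get similar answers
REFUSAL_MARKERS = ("i can't", "i cannot", "i'm sorry", "not able to help")

def check_safety() -> bool:
    """Safety: unsafe requests should be reliably rejected."""
    return all(any(m in generate(p).lower() for m in REFUSAL_MARKERS) for p in UNSAFE_PROMPTS)

def check_consistency(threshold: float = 0.6) -> bool:
    """Consistency: paraphrased questions should produce broadly similar answers."""
    return all(
        SequenceMatcher(None, generate(a), generate(b)).ratio() >= threshold
        for a, b in PARAPHRASE_PAIRS
    )

if __name__ == "__main__":
    print("safety ok:", check_safety())
    print("consistency ok:", check_consistency())
```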
Open data: transparency and more diverse evidence
Properly validating AI models and systems, particularly in terms of fairness and robustness, requires representative and diverse datasets that reflect the reality of different populations and scenarios.
On the other hand, if only the companies that own a system have data to test it, we have to rely on their own internal evaluations. However, when open datasets and public testing standards exist, the community (universities, regulators, independent developers, etc.) can test the systems autonomously, thus functioning as an independent counterweight that serves the interests of society.
A concrete example was given by Meta (Facebook) when it released its Casual Conversations v2 dataset in 2023. It is an open dataset, obtained with informed consent, that collects videos from people from 7 countries (Brazil, India, Indonesia, Mexico, Vietnam, the Philippines and the USA), with 5,567 participants who provided attributes such as age, gender, language and skin tone.
Meta's objective with this release was precisely to make it easier for researchers to evaluate the fairness and robustness of AI systems in vision and voice recognition. By expanding the geographic provenance of the data beyond the U.S., this resource allows researchers to check whether, for example, a facial recognition model works equally well with faces of different ethnicities, or whether a voice assistant understands accents from different regions.
The diversity that open data brings also helps to uncover neglected areas in AI assessment. Researchers from the Stanford Institute for Human-Centered Artificial Intelligence (HAI) showed in the HELM (Holistic Evaluation of Language Models) project that many language models are not evaluated in minority dialects of English or in under-represented languages, simply because quality data is missing from the best-known benchmarks.
The community can identify these gaps and create new test sets to fill them (e.g., an open dataset of FAQs in Swahili to validate the behavior of a multilingual chatbot). In this sense, HELM has incorporated broader evaluations precisely thanks to the availability of open data, making it possible to measure not only the performance of the models in common tasks, but also their behavior in other linguistic, cultural and social contexts. This has contributed to making visible the current limitations of the models and to promoting the development of more inclusive and representative systems of the real world or models more adapted to the specific needs of local contexts, as is the case of the ALIA foundational model, developed in Spain.
In short, open data helps to democratise the ability to audit AI systems, preventing the power of validation from residing in only a few hands. It also reduces costs and barriers: a small development team can test its model with open datasets without having to invest heavily in collecting its own data. This not only fosters innovation, but also ensures that local AI solutions from small businesses are subject to rigorous validation standards.
The validation of applications based on generative AI is today an unquestionable necessity to ensure that these tools operate in tune with our values and expectations. It is not a trivial process: it requires new methodologies, innovative metrics and, above all, a culture of responsibility around AI. But the benefits are clear: a rigorously validated AI system will be more trustworthy, both for the individual user who, for example, interacts with a chatbot without fear of receiving a toxic response, and for society as a whole, which can accept decisions based on these technologies knowing that they have been properly audited. And open data helps to cement this trust by fostering transparency, enriching testing with diversity, and involving the entire community in the validation of AI systems.
Content prepared by Jose Luis Marín, Senior Consultant in Data, Strategy, Innovation & Digitalization. The contents and views reflected in this publication are the sole responsibility of the author.
Open data is a fundamental fuel for contemporary digital innovation, creating information ecosystems that democratise access to knowledge and foster the development of advanced technological solutions.
However, the mere availability of data is not enough. Building robust and sustainable ecosystems requires clear regulatory frameworks, sound ethical principles and management methodologies that ensure both innovation and the protection of fundamental rights. Therefore, the specialised documentation that guides these processes becomes a strategic resource for governments, organisations and companies seeking to participate responsibly in the digital economy.
In this post, we compile recent reports, produced by leading organisations in both the public and private sectors, which offer these key orientations. These documents not only analyse the current challenges of open data ecosystems, but also provide practical tools and concrete frameworks for their effective implementation.
State and evolution of the open data market
Knowing what the open data ecosystem looks like and how it has changed at European and national level is important in order to make informed decisions and adapt to the needs of the industry. In this regard, the European Commission periodically publishes a Data Markets Report. The latest version is dated December 2024, although use cases exemplifying the potential of data in Europe are published regularly (the latest in February 2025).
On the other hand, from a European regulatory perspective, the latest annual report on the implementation of the Digital Markets Act (DMA) takes a comprehensive view of the measures adopted to ensure fairness and competitiveness in the digital sector. This document is interesting to understand how the regulatory framework that directly affects open data ecosystems is taking shape.
At the national level, the ASEDIE sectoral report on the "Data Economy in its infomediary scope" 2025 provides quantitative evidence of the economic value generated by open data ecosystems in Spain.
The importance of open data in AI
It is clear that the intersection between open data and artificial intelligence is a reality that poses complex ethical and regulatory challenges that require collaborative and multi-sectoral responses. In this context, developing frameworks to guide the responsible use of AI becomes a strategic priority, especially when these technologies draw on public and private data ecosystems to generate social and economic value. Here are some reports that address this objective:
- Generative AI and Open Data: Guidelines and Best Practices: the U.S. Department of Commerce has published a guide with principles and best practices on how to apply generative artificial intelligence ethically and effectively in the context of open data. The document provides guidelines for optimising the quality and structure of open data in order to make it useful for these systems, including transparency and governance.
- Good Practice Guide for the Use of Ethical Artificial Intelligence: this guide demonstrates a comprehensive approach that combines strong ethical principles with clear and enforceable regulatory precepts. In addition to the theoretical framework, the guide serves as a practical tool for implementing AI systems responsibly, considering both the potential benefits and the associated risks. Collaboration between public and private actors ensures that its recommendations are both technically feasible and socially responsible.
- Enhancing Access to and Sharing of Data in the Age of AI: this analysis by the Organisation for Economic Co-operation and Development (OECD) addresses one of the main obstacles to the development of artificial intelligence: limited access to quality data and effective models. Through examples, it identifies specific strategies that governments can implement to significantly improve access to and sharing of data and of certain AI models.
- A Blueprint to Unlock New Data Commons for AI: the Open Data Policy Lab has produced a practical guide focused on the creation and management of data commons specifically designed to enable public-interest artificial intelligence use cases. The guide offers concrete methodologies for managing data in a way that facilitates the creation of these data commons, including aspects of governance, technical sustainability and alignment with public interest objectives.
- Practical guide to data-driven collaborations: the Data for Children Collaborative initiative has published a step-by-step guide to developing effective data collaborations, with a focus on social impact. It includes real-world examples, governance models and practical tools to foster sustainable partnerships.
In short, these reports define the path towards more mature, ethical and collaborative data systems. From growth figures for the Spanish infomediary sector to European regulatory frameworks to practical guidelines for responsible AI implementation, all these documents share a common vision: the future of open data depends on our ability to build bridges between the public and private sectors, between technological innovation and social responsibility.
Satellite data has become a fundamental tool for understanding and monitoring our planet from a unique perspective. This data, collected by satellites in orbit around the Earth, provides a global and detailed view of various terrestrial, maritime and atmospheric phenomena that have applications in multiple sectors, such as environmental care or driving innovation in the energy sector.
In this article we will focus on a new sector: the field of fisheries, where satellite data have revolutionised the way fisheries are monitored and managed worldwide. We will review which fisheries satellite data are most commonly used to monitor fishing activity and look at possible uses, highlighting their relevance in detecting illegal activities.
The most popular fisheries-related satellite data: positioning data
Among satellite data we find a large amount of public and open data, free of charge and available in reusable formats, such as the data from the European Copernicus programme. These can be complemented with other data which, although also public, may involve costs and restrictions on use or access. This is because obtaining and processing such data entails significant costs and requires purchasing it from specialised suppliers such as ORBCOMM, exactEarth, Spire Maritime or Inmarsat. The data from the two most popular systems for obtaining fisheries data belong to this second type, namely:
- Automatic Identification System (AIS): transmits the location, speed and heading of vessels. It was created to improve maritime safety and prevent collisions, i.e. its aim was to prevent accidents by allowing vessels to communicate their position and obtain the location of other ships in real time. However, with the release of satellite AIS data in the 2010s, academia and the authorities realised that it could improve situational awareness by providing information about ships, including their identity, course, speed and other navigational data. AIS data went on to facilitate maritime traffic management, enabling coastal authorities and traffic centres to monitor and manage the movement of vessels in their waters. This technology has revolutionised maritime navigation, providing an additional layer of safety and efficiency in maritime operations. The data are available through websites such as MarineTraffic or VesselFinder, which offer basic tracking services for free but require a subscription for advanced features.
- Vessel Monitoring System (VMS): designed specifically for fisheries monitoring, it provides position and movement data. It was created for the monitoring and management of the modern fishing industry, and its development emerged about two decades ago in response to the need for improved monitoring, control and surveillance of fishing activities. Access to VMS data varies according to jurisdiction and international agreements. The data are mainly used by government agencies, regional fisheries management organisations and surveillance authorities, which have restricted access and must comply with strict security and confidentiality regulations. Fishing companies also use VMS systems to manage their fleets and comply with local and international regulations.
Analysis of fisheries satellite data
Satellite data has proven particularly useful for fisheries observation, as it can provide both an overview of a marine area or fishing fleet and insight into the operational life of a single vessel. The following steps are usually followed (a simplified code sketch is shown after the list):
- AIS and VMS data collection.
- Integration with other open or private sources. For example: ship registers, oceanographic data, delimitations of special economic zones or territorial waters.
- Application of machine learning algorithms to identify behavioural patterns and fishing manoeuvres.
- Visualisation of data on interactive maps.
- Generation of alerts on suspicious activity (for real-time monitoring).
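The sketch below illustrates, in heavily simplified form, steps 2, 3 and 5 of that pipeline: it takes a few invented AIS-style position records, checks them against a restricted-area polygon and flags vessels that appear to be fishing inside it. All coordinates, thresholds and the rule-based "fishing" heuristic (used here instead of a machine learning model) are assumptions for the example:

```python
from shapely.geometry import Point, Polygon

# Illustrative AIS-style records: (vessel id, lon, lat, speed in knots).
# Real AIS/VMS feeds come from providers or platforms such as Global Fishing Watch.
positions = [
    ("VESSEL_A", -9.40, 42.10, 2.1),
    ("VESSEL_A", -9.41, 42.11, 1.8),
    ("VESSEL_B", -9.90, 41.50, 11.5),
]

# Hypothetical restricted fishing area (lon/lat polygon).
restricted_area = Polygon([(-9.6, 42.0), (-9.2, 42.0), (-9.2, 42.3), (-9.6, 42.3)])

# Very crude proxy for fishing-like behaviour: low speed inside the area.
FISHING_SPEED_MAX_KN = 5.0

def flag_suspicious(records):
    """Yield alerts for positions that look like fishing inside the restricted area."""
    for vessel_id, lon, lat, speed in records:
        if restricted_area.contains(Point(lon, lat)) and speed < FISHING_SPEED_MAX_KN:
            yield f"ALERT: {vessel_id} moving at {speed} kn inside the restricted area"

if __name__ == "__main__":
    for alert in flag_suspicious(positions):
        print(alert)
```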
Use cases of fisheries satellite data
Satellite fisheries data offer cost-effective options for continuously monitoring large expanses of ocean, especially for those with limited resources to patrol their waters. Among other activities, these data make possible the development of systems that allow:
- Monitoring of compliance with fishing regulations, as satellites can track the position and movements of fishing vessels. This monitoring can be done with historical data, in order to perform an analysis of fishing activity patterns and trends. This supports long-term research and strategic analysis of the fisheries sector.
- The detection of illegal fishing, using both historical and real-time data. By analysing unusual movement patterns or the presence of vessels in restricted areas, possible illegal, unreported and unregulated (IUU) fishing activities can be identified. IUU fishing is worth up to US$23.5 billion per year in seafood products.
- The assessment of the fishing volume, with data on the carrying capacity of each vessel and the fish transhipments that take place both at sea and in port.
- The identification of areas of high fishing activity and the assessment of their impact on sensitive ecosystems.
A concrete example is work by the Overseas Development Institute (ODI), entitled "Turbid Water Fishing", which reveals how satellite data can identify vessels, determine their location, course and speed, and train algorithms, providing unprecedented insight into global fishing activities. The report is based on two sources: interviews with the heads of various private and public platforms dedicated to monitoring IUU fishing, as well as free and open resources such as Global Fishing Watch (GFW) - an organisation that is a collaboration between Oceana, SkyTruth and Google - which provides open data.
Challenges, ethical considerations and constraints in monitoring fishing activity
While these data offer great opportunities, it is important to note that they also have limitations. The study "Fishing for data: The role of private data platforms in addressing illegal, unreported and unregulated fishing and overfishing", mentions the problems of working with satellite data to combat illegal fishing, challenges that can be applied to fisheries monitoring in general:
- The lack of a unified universal fishing vessel register. Without a single database of fishing vessels, it is difficult to identify vessels and their owners or operators; vessel information is scattered across multiple sources such as classification societies, national vessel registers and regional fisheries management organisations.
- Deficient algorithms. The algorithms used to detect fishing behaviour are sometimes unable to identify fishing activity accurately, making it difficult to spot illegal activities. For example, inferring the type of fishing gear used, the target species or the quantity caught from satellite data can be complex.
- Most of these data are not free and can be costly. The most commonly used data in this field, i.e. data from the AIS and VMS systems, come at a considerable cost.
- Incomplete satellite data. Automatic Identification Systems (AIS) are mandatory only for vessels over 300 gross tonnes, which leaves out many fishing vessels. In addition, vessels can turn off their AIS transmitters to avoid surveillance.
- The use of these tools for surveillance, monitoring and law enforcement carries risks, such as false positives and spurious correlations. In addition, over-reliance on these tools can divert enforcement efforts away from undetectable behaviour.
- Collaboration and coordination between various private initiatives, such as Global Fishing Watch, is not as smooth as it could be. If they joined forces, they could create a more powerful data platform, but it is difficult to incentivise such collaboration between competing organisations.
The future of satellite data in fisheries
The field of satellite data is in constant evolution, with new techniques for capture and analysis improving the accuracy and utility of the information obtained. Innovations in geospatial data capture include the use of drones, LiDAR (light detection and ranging) and high-resolution photogrammetry, which complement traditional satellite data. In the field of analytics, machine learning and artificial intelligence are playing a crucial role. For example, Global Fishing Watch uses machine learning algorithms to process millions of daily messages from more than 200,000 fishing vessels, allowing a global, real-time view of their activities.
The future of satellite data is promising, with technological advances offering improvements in the resolution, frequency, volume, quality and types of data that can be collected. The miniaturisation of satellites and the development of microsatellite constellations are improving access to space and the data that can be obtained from it.
In the context of fisheries, satellite data are expected to play an increasingly important role in the sustainable management of marine resources. Combining these data with other sources of information, such as in situ sensors and oceanographic models, will allow a more holistic understanding of marine ecosystems and the human activities that affect them.
Content prepared by Miren Gutiérrez, PhD and researcher at the University of Deusto, expert in data activism, data justice, data literacy and gender disinformation. The contents and views reflected in this publication are the sole responsibility of the author.
Access to financial and banking data is revolutionising the sector, promoting transparency, financial inclusion and innovation in economic services. However, the management of this data faces regulatory challenges in balancing openness with security and privacy.
For this reason, there are different ways of accessing this type of data, as we will see below.
Open Banking and Open Finance versus Open Data
These terms, although related, have important differences.
The term Open Banking refers to a system that allows banks and other financial institutions to securely and digitally share customer financial data with third parties. This requires the customer's express approval of the data-sharing conditions, and the consent can be withdrawn at any time at the customer's request.
Open Finance, on the other hand, is an evolution of Open Banking which embraces a broader range of financial products and services. When we talk about Open Finance, in addition to banking data, data on insurance, pensions, investments and other financial services are included.
In both Open Banking and Open Finance, the data is not open (Open Data); it can only be accessed by those previously authorised by the customer. The exchange of data is done through an application programming interface (API), which guarantees the agility and security of the process. All of this is regulated by the European directive on payment services in the internal market (known as PSD2), although the European Commission is working on updating the regulatory framework.
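As a rough, purely illustrative sketch of how this works in practice, the fragment below calls a hypothetical PSD2-style account information endpoint using a consent-bound access token. The base URL, token handling and response fields are assumptions: real endpoints, scopes and payloads are defined by each bank or aggregator and are only available to authorised third-party providers.

```python
import requests

# Hypothetical account-information API of a fictional bank.
API_BASE = "https://api.example-bank.example/psd2/v1"
# Token obtained (e.g. via OAuth2) only after the customer's explicit consent,
# which the customer can withdraw at any time.
ACCESS_TOKEN = "consent-bound-access-token"

def list_accounts() -> list[dict]:
    """Retrieve the accounts the customer has agreed to share."""
    response = requests.get(
        f"{API_BASE}/accounts",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("accounts", [])

if __name__ == "__main__":
    for account in list_accounts():
        print(account.get("iban"), account.get("balance"))
```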
Applications of Open Banking and Open Finance:
The purpose of these activities is to provide access to new services based on information sharing. For example, they facilitate the creation of apps that unify access to all the bank accounts of a customer, even if they are from different providers. This improves the management and control of income and expenditure by providing an overview in a single environment.
Another use is that they allow providers to cross-check information more quickly. For example, with access to a customer's financial data, a dealer could offer information on financing options faster.
Open data platforms on banking
While private banking data, like all types of personal data, is strictly regulated and cannot be openly published due to privacy protection regulations, there are sets of financial data that can be freely shared. For example, aggregate information on interest rates, economic indicators, historical stock market data, investment trends and macroeconomic statistics, which are accessible through open sources.
This data, in addition to boosting transparency and confidence in markets, can be used to monitor economic trends, prevent fraud and improve risk management globally. Fintech companies, developers and entrepreneurs can also take advantage of it to create solutions such as financial analysis tools, digital payment systems or automated advice.
Let's look at some examples of places where open data on the banking and financial sector can be obtained.
International sources
Some of the most popular international sources are listed below (a minimal example of querying one of them programmatically follows the list):
- European Central Bank: provides statistics and data on euro area financial markets through various platforms. Among other information, users can download datasets on inflation, bank interest rates, balance of payments, public finances, etc.
- World Bank: provides access to global economic data on financial development, poverty and economic growth.
- International Monetary Fund: provides simplified access to macroeconomic and financial data, such as the outlook for the global or regional economy. It also provides open data from reports such as its Fiscal Monitor, which analyses the latest developments in public finances.
- Federal Reserve Economic Data (FRED): focuses on US economic data, including market indicators and interest rates. This repository is created and maintained by the Research Department of the Federal Reserve Bank of St. Louis.
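As a minimal example of reuse, the following Python snippet retrieves Spain's GDP series from the World Bank open data API, which at the time of writing requires no API key; the indicator code and response handling reflect the API's documented behaviour, but should be checked against the current documentation before relying on them:

```python
import requests

# World Bank open data API; NY.GDP.MKTP.CD = GDP in current US dollars, ES = Spain.
URL = "https://api.worldbank.org/v2/country/ES/indicator/NY.GDP.MKTP.CD"

def fetch_gdp(per_page: int = 10) -> list[dict]:
    """Return the most recent GDP observations for Spain."""
    response = requests.get(URL, params={"format": "json", "per_page": per_page}, timeout=10)
    response.raise_for_status()
    metadata, observations = response.json()  # the API returns [metadata, data]
    return observations or []

if __name__ == "__main__":
    for obs in fetch_gdp():
        print(obs["date"], obs["value"])
```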
National sources
Through the National Open Data Catalogue of datos.gob.es a large number of datasets related to the economy can be accessed. One of the most prominent publishers is the Instituto Nacional de Estadística (INE), which provides data on defaults by financial institution, mortgages, etc.
In addition, the Banco de España offers various products for those interested in the country's economic data:
- Statistics: the Banco de España collects, compiles and publishes a wide range of economic and financial statistics. It includes information on interest and exchange rates, financial accounts of the institutional sectors, balances of payments and even household financial surveys, among others.
- Dashboard: the Banco de España has also made available to the public an interactive viewer that allows quarterly and annual data on external statistics to be consumed in a more user-friendly way.
In addition, the Banco de España has set up a secure room for researchers to access data that is valuable but cannot be opened to the general public due to its nature. In this respect we find:
- BELab: the secure data laboratory managed by the Banco de España, offering on-site (Madrid) and remote access. These data have been used in various projects.
- ES_DataLab: restricted microdata laboratory for researchers developing projects for scientific and public interest purposes. In this case, it brings together micro-data from various organisations, including the Bank of Spain.
Data spaces: an opportunity for secure and controlled exchange of financial data
As we have just seen, there are also options to facilitate access to financial and banking data in a controlled and secure manner. This is where data spaces come into play, an ecosystem where different actors share data in a voluntary and secure manner, following common governance, regulatory and technical mechanisms.
In this respect, Europe is promoting a European financial data space (EEDF), a key initiative within the European Data Strategy, which consists of three main pillars:
- Public reporting data ("public disclosures"): collects financial reporting data (balance sheets, revenues, income statements), which financial firms are required by law to disclose on a regular basis. In this area is the European Single Access Point (ESAP) initiative, a centralised platform for accessing data from over 200 public reports from more than 150,000 companies.
- Private customer data of financial service providers: encompasses those data held by financial service providers such as banks. In this area is the framework for access to financial data, which covers data such as investments, insurance, pensions, loans and savings.
- Data from supervisory reports: for this type of data, the supervisory data strategy, which covers data from different sectors (banks, insurance, pension funds, etc.), must be taken into account in order to promote digital transformation in the financial sector.
In conclusion, access to financial and banking data is evolving significantly thanks to various initiatives that have enabled greater transparency and that will encourage the development of new services, while ensuring the security and privacy of shared data. The future of the financial sector will be shaped by the ability of institutions and regulators to foster data ecosystems that drive innovation and trust in the market.
Data reuse continues to grow in Spain, as confirmed by the latest report of the Multisectoral Information Association (ASEDIE), which analyses and describes the situation of the infomediary sector in the country. The document, now in its 13th edition, was presented on Friday 4 April at an event highlighting the rise of the data economy in the current landscape.
The following are the main key points of the report.
An overall profit of 146 million euros in 2023
Since 2013, ASEDIE's Infomediary sector report has been continuously monitoring this sector, made up of companies and organisations that reuse data - generally from the public sector, but also from private sources - to generate value-added products or services. Under the title "Data Economy in its infomediary scope", this year's report underlines the importance of public-private partnerships in driving the data economy and presents relevant data on the current state of the sector.
It should be noted that the financial information used for sales and employees corresponds to the financial year 2023, as financial information for the year 2024 was not yet available at the time of reporting. The main conclusions are:
- Since the first edition of the report, the number of infomediaries identified has risen from 444 to 757, an increase of 70%. This growth reflects its dynamism, with annual peaks and troughs, showing a positive evolution that consolidates its recovery after the pandemic, although there is still room for development.
- The sector is present in all the country's Autonomous Communities, including the Autonomous City of Melilla. The Community of Madrid leads the ranking with 38% of infomediaries, followed by Catalonia, Andalusia and the Community of Valencia, which represent 15%, 11% and 9%, respectively. The remaining 27% is distributed among the other autonomous communities.
- 75% of infomediary companies operate in the sub-sectors of geographic information, market, economic and financial studies, and infomediation informatics (focused on the development of technological solutions for the management, analysis, processing and visualisation of data).
- The infomediary sector shows a growth and consolidation trend, with 66% of companies operating for less than 20 years. Of this group, 32% are between 11 and 20 years old, while 34% are less than a decade old. Furthermore, the increase in companies between 11 and 40 years old indicates that more companies have managed to sustain themselves over time.
- In terms of sales, the estimated volume amounts to 2,646 million euros and the evolution of average sales increases by 10.4%. The average turnover per company is over 4.4 million euros, while the median is 442,000 euros. Compared to the previous year, the average has increased by 200,000 euros, while the median has decreased by 30,000 euros.
- It is estimated that the infomediary sector employs some 24,620 people, 64% of whom are concentrated in three sub-sectors. These figures represent a growth of 6% over the previous year. Although the overall average is 39 employees per company, the median per sub-sector is no more than 6, indicating that much of the employment is concentrated in a small number of large companies. The average turnover per employee was 108,000 euros this year, an increase of 8% compared to the previous year.
- The subscribed capital of the sector amounts to EUR 252 million. This represents an increase of 6%, which breaks the negative trend of recent years.
- 74% of the companies have reported profits. The aggregate net profit of the 539 companies for which data is available exceeded 145 million euros.
The following visual summarises some of this data:

Figure 1. Source: Asedie Infomediary Sector Report. "Data Economy in its infomediary scope" (2025).
Significant advances in the ASEDIE Top 10
The Asedie Top 10 aims to identify and promote the openness of selected datasets for reuse. This initiative seeks to foster collaboration between the public and private sectors, facilitating access to information that can generate significant economic and social benefits. Its development has taken place in three phases, each focusing on different datasets, whose evolution is analysed in this report:
- Phase 1 (2019), which promoted the opening of databases of associations, cooperatives and foundations. Currently, 16 Autonomous Communities allow access to the three databases and 11 already offer NIF data. Access to the cooperatives database is still missing in one community.
- Phase 2 (2020), focusing on datasets related to energy efficiency certificates, SAT registers and industrial estates. All communities have made energy efficiency data available to citizens, but one is still missing for industrial estates and three for SAT registers.
- Phase 3 (2023), focusing on datasets of economic agents, education centres, health centres and ERES-ERTES (Expediente de Regulación de Empleo y Expediente de Regulación Temporal de Empleo). Progress has been made compared to last year, but work is ongoing to achieve greater uniformity of information.
New success stories and best practices
The report concludes with a section compiling several success stories of products and services developed with public information and contributing to the growth of our economy, for example:
- Energy Efficiency Improvement Calculator: allows to identify the necessary interventions and estimate the associated costs and the impact on the energy efficiency certification (EEC).
- GEOPUBLIC: is a tool designed to help Public Administrations better understand their territory. It allows for an analysis of strengths, opportunities and challenges in comparison with other similar regions, provinces or municipalities. Thanks to its ability to segment business and socio-demographic data at different scales, it facilitates the monitoring of the life cycle of enterprises and their influence on the local economy.
- New website of the DBK sectoral observatory: improves the search for sectoral information, thanks to the continuous monitoring of some 600 Spanish and Portuguese sectors. Every year it publishes more than 300 in-depth reports and 1,000 sectoral information sheets.
- Data assignment and repair service: facilitates the updating of information on the customers of electricity retailers by allowing this information to be enriched with the cadastral reference associated with the supply point. This complies with a requirement of the State Tax Administration Agency (AEAT).
The report also includes good practices of public administrations such as:
- The Callejero Digital de Andalucía Unificado (CDAU), which centralises, standardises and keeps the region's geographical and postal data up to date.
- The Geoportal of the Madrid City Council, which integrates metadata, OGC map services, a map viewer and a geolocator that respect the INSPIRE and LISIGE directives. It is easy to use for both professionals and citizens thanks to its intuitive and accessible interface.
- The Canary Statistics Institute (ISTAC), which has made an innovative technological ecosystem available to society. It features eDatos, an open source infrastructure for statistical data management ensuring transparency and interoperability.
- The Spanish National Forest Inventory (IFN) and its web application Download IFN, a basic resource for forest management, research and education. Allows easy filtering of plots for downloading.
- The Statistical Interoperability Node, which provides legal, organisational, semantic and technical coverage for the integration of the different information systems of the different levels of administrative management.
- The Open Cohesion School, an innovative educational programme of the Generalitat de Catalunya aimed at secondary school students. Students investigate publicly funded projects to analyse their impact, while developing digital skills, critical thinking and civic engagement.
- The National Publicity System for Public Subsidies and Grants, which has unveiled a completely redesigned website. It has improved its functionality with API-REST queries and downloads. More information here.
In conclusion, the infomediary sector in Spain is consolidating itself as a key driver of the economy, showing a solid evolution and steady growth. With a record number of companies and a turnover exceeding 2.6 billion euros in 2023, the sector not only generates employment but also positions itself as a benchmark for innovation. Information as a strategic resource drives a more efficient and connected economic future. Its proper use, always from an ethical perspective, promises to remain a source of progress both nationally and internationally.
In a world increasingly exposed to natural hazards and humanitarian crises, accurate and up-to-date geospatial data can make the difference between effective response and delayed reaction. The building footprints, i.e. the contours of buildings as they appear on the ground, are one of the most valuable resources in emergency contexts.
In this post we will delve deeper into this concept, including where to obtain open building footprint data, and highlight its importance in one of its many use cases: emergency management.
What are building footprints?
The building footprints are geospatial representations, usually in vector format, showing the outline of structures built on the ground. That is, they indicate the horizontal projection of a building on the ground, viewed from above, as if it were a floor plan.
These footprints can include residential buildings as well as industrial, commercial, institutional or even rural buildings. Depending on the data source, they may be accompanied by additional attributes such as height, number of floors, building use or date of construction, making them a rich source of information for multiple disciplines.
Unlike an architectural plan showing internal details, building footprints are limited to the perimeter of the building in contact with the ground. This simplicity makes them lightweight, interoperable and easily combined with other geographic information layers, such as road networks, risk areas, critical infrastructures or socio-demographic data.
Figure 1. Example of building footprints: each polygon represents the outline of a building as seen from above.
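In practical terms, a single footprint is usually handled as a polygon feature with a few attributes. The snippet below builds one such feature, entirely invented for the example, and computes its ground area with the shapely library; the property names are illustrative rather than a standard schema:

```python
from shapely.geometry import Polygon

# Outline of a fictional building in projected coordinates (metres, e.g. UTM):
# the horizontal projection of the structure on the ground.
outline = Polygon([(0, 0), (12, 0), (12, 8), (0, 8)])

# Footprints are typically distributed as vector features with attributes.
footprint = {
    "type": "Feature",
    "geometry": outline.__geo_interface__,   # GeoJSON-style geometry
    "properties": {"use": "residential", "floors": 3, "height_m": 9.5},
}

print("ground area (m2):", outline.area)     # 96.0
print(footprint["properties"])
```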
How are they obtained?
There are several ways to generate building footprints (the sketch after the list illustrates one of them in practice):
- From satellite or aerial imagery: using photo-interpretation techniques or, more recently, artificial intelligence and machine learning algorithms.
- With cadastral data or official registers: as in the case of the Cadastre in Spain, which maintains precise vector bases of all registered constructions.
- By collaborative mapping: platforms such as OpenStreetMap (OSM) allow volunteer users to manually digitise visible footprints on orthophotos.
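As an illustration of the collaborative-mapping route, the sketch below queries the public OpenStreetMap Overpass API for building outlines around an arbitrary point in Valencia. The coordinates, search radius and response handling are simplifying assumptions, and large-scale use should rely on bulk downloads (e.g. Geofabrik) rather than live queries:

```python
import requests

# Overpass API query: all OSM ways tagged as buildings within 300 m of a point.
OVERPASS_URL = "https://overpass-api.de/api/interpreter"
QUERY = """
[out:json][timeout:25];
way["building"](around:300,39.4699,-0.3763);
out geom;
"""

def fetch_building_outlines() -> list[list[tuple[float, float]]]:
    """Return each building outline as a list of (lat, lon) vertices."""
    response = requests.post(OVERPASS_URL, data={"data": QUERY}, timeout=60)
    response.raise_for_status()
    elements = response.json().get("elements", [])
    return [[(n["lat"], n["lon"]) for n in e.get("geometry", [])] for e in elements]

if __name__ == "__main__":
    outlines = fetch_building_outlines()
    print(f"{len(outlines)} building outlines retrieved")
```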
What are they for?
Building footprints are essential for:
- Urban and territorial analysis: allow the study of built density, urban sprawl or land use.
- Cadastral and real estate management: are key to calculating surface areas, applying taxes or regulating buildings.
- Planning of infrastructures and public services: they help to locate facilities, design transport networks or estimate energy demand.
- 3D modelling and smart cities: serve as a basis for generating three-dimensional urban models.
- Risk and emergency management: to identify vulnerable areas, estimate affected populations or plan evacuations.
In short, building footprints are a basic piece of the spatial data infrastructure and, when offered as open, accessible and up-to-date data, they multiply their value and usefulness for society as a whole.
Why are they key in emergency situations?
Of all the possible use cases, this article focuses on emergency management. During an emergency - such as an earthquake, flood or wildfire - responders need to know which areas are built up, how many people may inhabit those structures, how to reach them and where to concentrate resources. Building footprints make it possible to (a minimal code sketch follows the list):
- Rapidly estimate the number of people potentially affected.
- Prioritise intervention and rescue areas.
- Plan access and evacuation routes.
- Cross-reference data with other layers (social vulnerability, risk areas, etc.).
- Coordinate action between emergency services, local authorities and international cooperation.
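A minimal sketch of this kind of analysis, using the geopandas library, is shown below. The file names, layer contents and the per-building occupancy figure are assumptions for the example, not an operational workflow:

```python
import geopandas as gpd

# Illustrative inputs: building footprints and the polygon of the affected area.
buildings = gpd.read_file("building_footprints.geojson")
flood_zone = gpd.read_file("flood_zone.geojson")

# Ensure both layers share the same coordinate reference system.
flood_zone = flood_zone.to_crs(buildings.crs)

# Buildings that intersect the affected area.
affected = gpd.sjoin(buildings, flood_zone, predicate="intersects", how="inner")

# Crude population estimate based on an assumed average occupancy per building.
RESIDENTS_PER_BUILDING = 2.5
print(f"{len(affected)} buildings in the affected area")
print(f"~{len(affected) * RESIDENTS_PER_BUILDING:.0f} potentially affected residents")
```

Cross-referencing the same footprints with census or vulnerability layers refines these estimates and helps prioritise intervention areas.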
Open data available
In an emergency, it is essential to know where to locate building footprint data. One of the most relevant developments in the field of data governance is the increasing availability of building footprints as open data. This type of information, previously restricted to administrations or specialised agencies, can now be freely used by local governments, NGOs, researchers and businesses.
Some of the main sources available for emergency management and other purposes are summarised below:
- JRC - Global Human Settlement Layer (GHSL): the Joint Research Centre of the European Commission offers a number of products derived from satellite imagery analysis:
- GHS-BUILT-S: raster data on global built-up areas.
- GHS-BUILT-V: AI-generated vector building footprints for Europe.
- Downloading data: https://ghsl.jrc.ec.europa.eu/download.php
- IGN and Cadastre of Spain: the official building footprints for Spain can be obtained through the Cadastre and the Instituto Geográfico Nacional (IGN). They are extremely detailed and up to date.
- Download centre of the IGN: https://centrodedescargas.cnig.es
- Electronic Office of the Cadastre (Sede Electrónica del Catastro): https://www.sedecatastro.gob.es
- Copernicus Emergency Management Service: provides mapping products generated in record time when an activation is triggered by an emergency (earthquakes, floods, fires, etc.). They include damage maps and footprints of affected buildings.
- Download centre: https://emergency.copernicus.eu/mapping/list-of-components/EMSR
- Important: to download detailed vector data (such as footprints), you need to register on the DIAS/Copernicus EMS platform or request access on a case-by-case basis.
- OpenStreetMap (OSM): collaborative platform where users from all over the world have digitised building footprints, especially in areas not covered by official sources. It is especially useful for humanitarian projects, rural and developing areas, and cases where rapid updating or local involvement is needed.
- Downloading data: https://download.geofabrik.de
- Google Open Buildings: this Google project offers more than 2 billion building footprints in Africa, Asia and other data-scarce regions, generated with artificial intelligence models. It is especially useful for humanitarian purposes, urban development in countries of the global south, and risk exposure assessment in places where there are no official cadastres.
- Direct access to the data: https://sites.research.google/open-buildings/
- Microsoft Building Footprints: Microsoft has published sets of building footprints generated with machine learning algorithms applied to aerial and satellite imagery. Coverage: United States, Canada, Uganda, Tanzania, Nigeria and recently India. The data is open access under the ODbL licence.
- Meta (ex Facebook) AI Buildings Footprints: Meta AI has published datasets generated through deep learning in collaboration with the Humanitarian OpenStreetMap Team (HOT). They focused on African and Southeast Asian countries.
- Direct access to the data: https://dataforgood.facebook.com/dfg/tools/buildings.
Comparative table of open building footprints sources
| Source/Project | Geographic coverage | Data type | Format | Requires registration | Main use |
|---|---|---|---|---|---|
| JRC GHSL | Global (raster) / Europe (vector) | Raster and vector | GeoTIFF / GeoPackage / Shapefile | No | Urban analysis, European planning, comparative studies |
| IGN + Cadastre Spain | Spain | Official vector | GML/Shapefile/WFS/WMS | No | Cadastral data, urban planning, municipal management |
| Copernicus EMS | Europe and global (upon activation) | Vector (post-emergency) | PDF / GeoTIFF / Shapefile | Yes (for detailed vector data) | Rapid mapping, emergency management |
| OpenStreetMap | Global (varies by area) | Collaborative vector | .osm / shapefile / GeoJSON | No | Base maps, rural areas, humanitarian aid |
| Google Open Buildings | Africa, Asia, LatAm (selected areas) | Vector (AI-generated) | CSV / GeoJSON | No | Risk assessment, planning in developing countries |
| Microsoft Building Footprints | USA, Canada, India, Africa | Vector (AI) | GeoJSON | No | Massive data, urban planning, rural areas |
| Meta AI | Africa, Asia (specific areas) | Vector (AI) | GeoJSON / CSV | No | Humanitarian aid, complementing OSM in uncovered areas |
Figure 2. Comparative table of open building footprint sources.
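As a quick example of working with one of these sources, the sketch below loads a Google Open Buildings CSV tile into a GeoDataFrame. The column names used (a WKT `geometry` column and a `confidence` score) follow the project's published documentation, but they are assumptions here and should be checked against the file actually downloaded.

```python
import geopandas as gpd
import pandas as pd
from shapely import wkt

# Load a Google Open Buildings CSV tile (placeholder file name).
df = pd.read_csv("open_buildings_tile.csv")

# Parse the WKT polygons into geometries; Open Buildings data is published in WGS 84.
gdf = gpd.GeoDataFrame(df, geometry=df["geometry"].apply(wkt.loads), crs="EPSG:4326")

# Keep only detections with reasonable confidence before further analysis
# (the 0.7 threshold is an illustrative choice).
gdf = gdf[gdf["confidence"] >= 0.7]
print(len(gdf), "footprints retained")
```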
Combination and integrated use of data
One of the great advantages of these sources being open and documented is the possibility of combining them to improve the coverage, accuracy and operational utility of building footprints. Here are some recommended approaches:
1. Complementing areas without official coverage
- In regions where cadastre is not available or up to date (such as many rural areas or developing countries), it is useful to use Google Open Buildings or OpenStreetMap as a basis.
- GHSL also provides a harmonised view on a continental scale, useful for planning and comparative analysis.
2. Cross official and collaborative layers
- The Spanish cadastre footprints can be enriched with OSM data when new or modified areas are detected, especially after an event such as a catastrophe.
- This combination is ideal for small municipalities that do not have their own technical capacity, but want to keep their data up to date.
3. Integration with socio-demographic and risk data
- Footprints gain value when integrated into geographic information systems (GIS) alongside layers such as:
- Population per building (INE, WorldPop).
- Flood zones (MAPAMA, Copernicus).
- Health centres or schools.
- Critical infrastructure (electricity grid, water).
This allows modelling risk scenarios, planning evacuations or even simulating potential impacts of an emergency.
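A minimal example of this kind of integration is intersecting the footprint layer with a flood-hazard layer to flag exposed buildings. In the sketch below the file names are placeholders, and the flood layer is assumed to be a polygon dataset such as those published by MAPAMA or Copernicus.

```python
import geopandas as gpd

# Illustrative integration of footprints with a risk layer: intersect buildings
# with flood zones and export the exposed ones. File names are placeholders.
footprints = gpd.read_file("building_footprints.gpkg")
flood_zones = gpd.read_file("flood_zones_t100.gpkg").to_crs(footprints.crs)

# Geometric intersection keeps only the parts of buildings inside flood zones.
exposed = gpd.overlay(footprints, flood_zones, how="intersection")
print(len(exposed), "buildings intersect a 100-year flood zone")

# Export the result for use in an emergency-planning GIS.
exposed.to_file("exposed_buildings.gpkg", driver="GPKG")
```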
4. Combined use in actual activations
Some real-world examples of uses of this data include:
- In cases such as the eruption on La Palma, data from the Cadastre, OSM and Copernicus EMS products were used simultaneously to map damage, estimate the affected population and plan assistance.
- During the earthquake in Turkey in 2023, organisations such as UNOSAT and Copernicus combined satellite imagery with automatic algorithms to detect structural collapses and cross-reference them with existing footprints. This made it possible to quickly estimate the number of people potentially trapped.
In emergency situations, time is a critical resource. Artificial intelligence applied to satellite or aerial imagery makes it possible to generate building footprints much faster and in a more automated way than with traditional methods.
In short, the different sources are not exclusive, but complementary. Their strategic integration within a well-governed data infrastructure is what makes it possible to move from data to impact and to put geospatial knowledge at the service of security, planning and collective well-being.
Data governance and coordination
Having quality building footprints is an essential first step, but their true value is only activated when these data are well governed, coordinated between actors and prepared to be used efficiently in real-world situations. This is where data governance comes into play: the set of policies, processes and organisational structures that ensure that data is available, reliable, up-to-date and used responsibly.
Why is data governance key?
In emergency or territorial planning contexts, the lack of coordination between institutions or the existence of duplicated, incomplete or outdated data can have serious consequences: delays in decision-making, duplication of efforts or, in the worst case, erroneous decisions. Good data governance ensures that:
- Data is known and findable: it is not enough for data to exist; it must be documented, catalogued and accessible on platforms where users can easily find it.
- Standards and interoperability are in place: building footprints should follow common formats (such as GeoJSON, GML or shapefile), use consistent reference systems and be aligned with other geospatial layers (utility networks, administrative boundaries, risk zones...); a minimal harmonisation sketch follows this list.
- Data is kept up to date: especially in urban or rapidly developing areas, where new construction emerges quickly. Data from five years ago may be useless in a current crisis.
- There is coordination between levels of government: municipal, regional, national and European. Efficient sharing avoids duplication and facilitates joint responses, especially in cross-border or international contexts.
- Clear roles and responsibilities are defined: who produces the data, who validates it, who distributes it and who activates it in case of emergency?
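As a hint of what this harmonisation can look like in practice, the following sketch reprojects a municipal footprint layer to a common reference system and re-exports it in interoperable formats. The file names are placeholders; EPSG:4326 is used for the GeoJSON output because the GeoJSON specification requires it.

```python
import geopandas as gpd

# Harmonise a footprint layer before publication: common CRS and open formats.
# File names are illustrative placeholders.
footprints = gpd.read_file("municipal_footprints.shp")

# GeoJSON requires WGS 84 (EPSG:4326).
footprints_wgs84 = footprints.to_crs(epsg=4326)
footprints_wgs84.to_file("municipal_footprints.geojson", driver="GeoJSON")

# GeoPackage preserves the original projected CRS and supports large layers.
footprints.to_file("municipal_footprints.gpkg", driver="GPKG")
```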
The value of collaboration
A robust data governance ecosystem must also foster multi-sector collaboration. Public administrations, emergency services, universities, the private sector, humanitarian organisations and citizens can benefit from (and contribute to) the use and improvement of this data.
For example, in many countries, local cadastres work in collaboration with agencies such as national geographic institutes, while citizen science and collaborative mapping initiatives (such as OpenStreetMap) can complement or update official data in less covered areas.
Emergency preparedness
In crisis situations, coordination must be anticipated. It is not just about having the data, but about having clear operational plans on how to access it, who activates it, in what formats, and how it integrates with response systems (such as Emergency Coordination Centres or civil protection GIS).
Therefore, many institutions are developing protocols for activating geospatial data in emergencies, and platforms such as Copernicus Emergency Management Service already work on this principle, offering products based on well-governed data that can be activated in record time.
Conclusion
Building footprints are not just a technical resource for urban planners or cartographers: they are a critical tool for risk management, sustainable urban planning and citizen protection. In emergency situations, where time and accurate information are critical factors, having this data can make the difference between an effective intervention and an avoidable tragedy.
Advances in Earth observation technologies, the use of artificial intelligence and the commitment to open data by institutions such as the JRC and the IGN have democratised access to highly valuable geospatial information. Today it is possible for a local administration, an NGO or a group of volunteers to access building footprints to plan evacuations, estimate affected populations or design logistical routes in real time.
However, the challenge is not only technological, but also organisational and cultural. It is imperative to strengthen data governance: to ensure that these sets are well documented, updated, accessible and that their use is integrated into emergency and planning protocols. It is also essential to train key actors, promote interoperability and foster collaboration between public institutions, the private sector and civil society.
Ultimately, building footprints represent much more than geometries on a map: they are a foundation on which to build resilience, save lives and improve decision-making at critical moments. Committing to their responsible and open use means committing to smarter, more coordinated and people-centred public management.
Content prepared by Mayte Toscano, Senior Consultant in Data Economy Technologies. The contents and points of view reflected in this publication are the sole responsibility of the author.
Open data portals help municipalities to offer structured and transparent access to the data they generate in the exercise of their functions and in the provision of the services they are responsible for, while also fostering the creation of applications, services and solutions that generate value for citizens, businesses and public administrations themselves.
The report aims to provide a practical guide for municipal administrations to design, develop and maintain effective open data portals, integrating them into the overall smart city strategy. The document is structured in several sections ranging from strategic planning to technical and operational recommendations necessary for the creation and maintenance of open data portals. Some of the main keys are:
Fundamental principles
The report highlights the importance of integrating open data portals into municipal strategic plans, aligning portal objectives with local priorities and citizens' expectations. It also recommends drawing up a Plan of measures for the promotion of openness and re-use of data (RISP Plan, by its Spanish acronym), including governance models, clear licences, an open data agenda and actions to stimulate the re-use of data. Finally, it emphasises the need for trained staff in strategic, technical and functional areas, capable of managing, maintaining and promoting the reuse of open data.
General requirements
In terms of general requirements to ensure the success of the portal, the report highlights the importance of offering quality data that is consistent and up to date, in open formats such as CSV and JSON (and also XLS), of favouring interoperability with national and international platforms through open standards such as DCAT-AP, and of guaranteeing effective accessibility of the portal through an intuitive, inclusive design adapted to different devices. It also points out the obligation to comply strictly with privacy and data protection regulations, especially the General Data Protection Regulation (GDPR).
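To make the DCAT-AP recommendation more concrete, the sketch below builds a minimal dataset description with the rdflib Python library. The dataset URI, title and licence are illustrative placeholders, and a real DCAT-AP record would include many more mandatory properties.

```python
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DCAT, DCTERMS, RDF

# Minimal, illustrative DCAT description of a dataset and one CSV distribution.
# The URIs, title and licence are placeholders, not a real catalogue entry.
g = Graph()

dataset = URIRef("https://opendata.example-city.es/dataset/building-footprints")
g.add((dataset, RDF.type, DCAT.Dataset))
g.add((dataset, DCTERMS.title, Literal("Building footprints", lang="en")))
g.add((dataset, DCTERMS.license, URIRef("https://creativecommons.org/licenses/by/4.0/")))

distribution = URIRef("https://opendata.example-city.es/dataset/building-footprints/csv")
g.add((distribution, RDF.type, DCAT.Distribution))
g.add((distribution, DCAT.downloadURL, URIRef("https://opendata.example-city.es/files/building-footprints.csv")))
g.add((distribution, DCAT.mediaType, Literal("text/csv")))
g.add((dataset, DCAT.distribution, distribution))

print(g.serialize(format="turtle"))
```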
To promote re-use, the report advises fostering dynamic ecosystems through community events such as hackathons and workshops, highlighting successful examples of practical application of open data. Furthermore, it insists on the need to provide useful tools such as APIs for dynamic queries, interactive data visualisations and full documentation, as well as to implement sustainable funding and maintenance mechanisms.
Technical and functional guidelines
Regarding technical and functional guidelines, the document details the importance of building a robust and scalable technical infrastructure based on cloud technologies, using diverse storage systems such as relational databases, NoSQL and specific solutions for time series or geospatial data. It also highlights the importance of integrating advanced automation tools to ensure consistent data quality and recommends specific solutions to manage real-time data from the Internet of Things (IoT).
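As an example of the kind of automated quality control mentioned above, the following sketch runs a few basic checks on a dataset before publication. The file name, column names and rules are hypothetical; a production pipeline would typically rely on a dedicated validation framework.

```python
import pandas as pd

# Hypothetical pre-publication quality gate: file name, columns and rules are
# illustrative assumptions, not a specific municipal dataset.
df = pd.read_csv("air_quality_stations.csv")

checks = {
    "no_missing_station_id": df["station_id"].notna().all(),
    "no_duplicate_station_id": not df["station_id"].duplicated().any(),
    "valid_latitude": df["lat"].between(-90, 90).all(),
    "valid_longitude": df["lon"].between(-180, 180).all(),
}

for name, passed in checks.items():
    print(f"{name}: {'OK' if passed else 'FAILED'}")

if not all(checks.values()):
    raise SystemExit("Quality checks failed: the dataset should not be published yet.")
```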
In relation to the usability and structure of the portal, the importance of a user-centred design is emphasised, with clear navigation and a powerful search engine to facilitate quick access to data. Furthermore, it stresses the importance of complying with international accessibility standards and providing tools that simplify interaction with data, including clear graphical displays and efficient technical support mechanisms.
The report also highlights the key role of APIs as fundamental tools to facilitate automated and dynamic access to portal data, offering granular queries, clear documentation, robust security mechanisms and reusable standard formats. It also suggests a variety of tools and technical frameworks to implement these APIs efficiently.
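By way of illustration, the sketch below queries a hypothetical CKAN-based municipal portal using the standard CKAN Action API: package_search for catalogue queries and datastore_search for granular queries against a published resource. The base URL and resource identifier are placeholders.

```python
import requests

# Hypothetical CKAN-based municipal portal; the Action API endpoints are standard
# in CKAN, but the base URL and resource_id below are placeholders.
BASE = "https://opendata.example-city.es"

# Find datasets about public lighting.
r = requests.get(f"{BASE}/api/3/action/package_search",
                 params={"q": "alumbrado", "rows": 5}, timeout=30)
r.raise_for_status()
for ds in r.json()["result"]["results"]:
    print(ds["title"])

# Granular query against a published resource (first 10 rows matching a filter).
r = requests.get(
    f"{BASE}/api/3/action/datastore_search",
    params={"resource_id": "00000000-0000-0000-0000-000000000000", "q": "LED", "limit": 10},
    timeout=30,
)
print(r.json()["result"]["records"])
```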
Another critical aspect highlighted in the document is the identification and prioritisation of datasets for publication, as the progressive planning of data openness allows adjusting technical and organisational processes in an agile way, starting with the data of greatest strategic relevance and citizen demand.
Finally, the guide recommends establishing a system of metrics and indicators according to the UNE 178301:2015 standard to assess the degree of maturity and the real impact of open data portals. These metrics span strategic, legal, organisational, technical, economic and social domains, providing a holistic approach to measure both the effectiveness of data publication and its tangible impact on society and the local economy.
Conclusions
In conclusion, the report provides a strategic, technical and practical framework that serves as a reference for the deployment of municipal open data portals, so that cities can maximise their potential as drivers of economic and social development. In addition, the integration of artificial intelligence at various points in open data portal projects represents a strategic opportunity to expand their capabilities and generate a greater impact on citizens.
You can read the full report here.
The value of open satellite data in Europe
Satellites have become essential tools for understanding the planet and managing resources efficiently. The European Union (EU) has developed an advanced space infrastructure with the aim of providing real-time data on the environment, navigation and meteorology.
This satellite network is driven by four key programmes:
- Copernicus: Earth observation, environmental monitoring and climate change.
- Galileo: high-precision satellite navigation, alternative to GPS.
- EGNOS: improved positioning accuracy, key to aviation and navigation.
- Meteosat: advanced meteorological prediction and atmospheric monitoring.
Through these programmes, Europe not only ensures its technological independence, but also obtains data that is made available to citizens to drive strategic applications in agriculture, security, disaster management and urban planning.
In this article we will explore each programme, its satellites and their impact on society, including Spain's role in each of them.
Copernicus: Europe's Earth observation network
Copernicus is the EU Earth observation programme, managed by the European Commission with the technical support of the European Space Agency (ESA) and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). It aims to provide free and open data about the planet to monitor climate change, manage natural resources and respond to emergencies.
The programme is structured into three main components:
- Space component: consists of a series of satellites called Sentinel, developed specifically for the needs of Copernicus. These satellites provide high quality data for various applications, such as land, sea and atmospheric monitoring.
- In situ component: includes data collected through ground, air and sea stations. These data are essential to calibrate and validate the information obtained by the satellites, ensuring its accuracy and reliability.
- Operational Services: offers six thematic services that transform collected data into useful information for users:
- Atmospheric monitoring
- Marine monitoring
- Terrestrial monitoring
- Climate change
- Emergency management
- Security
These services provide information in areas such as air quality, ocean status, land use, climate trends, disaster response and security, supporting informed decision-making in Europe.
Spain has played a key role in the manufacture of components for the Sentinel satellites. Spanish companies have developed critical structures and sensors, and have contributed to the development of data processing software. Spain is also leading projects such as the Atlantic Constellation, which will develop small satellites for climate and oceanic monitoring.
The Sentinel satellites
| Satellite | Technical characteristics | Resolution | Coverage (capture frequency) | Uses |
|---|---|---|---|---|
| Sentinel-1 | C-band SAR radar, resolution up to 5m | Up to 5m | Every 6 days | Land and ocean monitoring, natural disasters |
| Sentinel-2 | Multispectral camera (13 bands), resolution up to 10m | 10m, 20m, 60m | Every 5 days | Agricultural management, forestry monitoring, water quality |
| Sentinel-3 | Radiometer SLSTR, Spectrometer OLCI, Altimeter SRAL | 300m (OLCI), 500m (SLSTR) | Every 1-2 days | Oceanic, climatic and terrestrial observation |
| Sentinel-5P | TROPOMI spectrometer, resolution 7x3.5 km² | 7x3.5 km² | Daily global coverage | Air quality monitoring, trace gases |
| Sentinel-6 | Altimeter Poseidon-4, vertical resolution 1 cm | 1cm | Every 10 days | Sea level measurement, climate change |
Figure 1. Table of Sentinel satellites. Source: own elaboration
Galileo: the European GPS
Galileo is the global navigation satellite system developed by the European Union, managed by the European Space Agency (ESA) and operated by the European Union Space Programme Agency (EUSPA). It aims to provide a reliable and highly accurate global positioning service, independent of other systems such as the US GPS, China's Beidou or Russia's GLONASS. Galileo is designed for civilian use and offers free and paid services for various sectors, including transport, telecommunications, energy and finance.
Spain has played a leading role in the Galileo programme. The European GNSS Service Centre (GSC), located in Torrejón de Ardoz, Madrid, acts as the main contact point for users of the Galileo system. In addition, Spanish industry has contributed to the development and manufacture of components for satellites and ground infrastructure, strengthening Spain's position in the European aerospace sector.
| Satellite | Technical characteristics | Resolution | Coverage (capture frequency) | Uses |
|---|---|---|---|---|
| Galileo FOC | Medium Earth Orbit (MEO), 24 operatives | N/A | Continuous | Precise positioning, land and maritime navigation |
| Galileo IOV | First test satellites of the Galileo system | N/A | Continuous | Initial testing of Galileo before FOC |
Figure 2. Table of Galileo satellites. Source: own elaboration
EGNOS: improving the accuracy of GPS and Galileo
The European Geostationary Navigation Overlay Service (EGNOS) is the European satellite-based augmentation system (SBAS) designed to improve the accuracy and reliability of global navigation satellite systems (GNSS), such as GPS and, in the future, Galileo. EGNOS provides corrections and integrity data that allow users in Europe to determine their position with an accuracy of up to 1.5 metres, making it suitable for safety-critical applications such as aviation and maritime navigation.
Spain has played a leading role in the development and operation of EGNOS. Through ENAIRE, Spain hosts five RIMS Reference Stations located in Santiago, Palma, Malaga, Gran Canaria and La Palma. In addition, the Madrid Air Traffic Control Centre, located in Torrejón de Ardoz, hosts one of the EGNOS Mission Control Centres (MCC), operated by ENAIRE. The Spanish space industry has contributed significantly to the development of the system, with companies participating in studies for the next generation of EGNOS.
| Satellite | Technical characteristics | Resolution | Coverage (capture frequency) | Uses |
|---|---|---|---|---|
| EGNOS Geo | Geostationary GNSS correction satellites | N/A | Real-time GNSS correction | GNSS signal correction for aviation and transportation |
Figure 3. Table of EGNOS satellites. Source: own elaboration
Meteosat: high precision weather forecasting
The Meteosat programme consists of a series of geostationary meteorological satellites initially developed by the European Space Agency (ESA) and currently operated by the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). These satellites are positioned in geostationary orbit above the Earth's equator, allowing continuous monitoring of weather conditions over Europe, Africa and the Atlantic Ocean. Their main function is to provide images and data to facilitate weather prediction and climate monitoring.
Spain has been an active participant in the Meteosat programme since its inception. Through the Agencia Estatal de Meteorología (AEMET), Spain contributes financially to EUMETSAT and participates in the programme's decision-making and operations. In addition, the Spanish space industry has played a key role in the development of the Meteosat satellites. Spanish companies have been responsible for the design and supply of critical components for third-generation satellites, including scanning and calibration mechanisms.
| Satellite | Technical characteristics | Resolution | Coverage (capture frequency) | Uses |
|---|---|---|---|---|
| Meteosat First Gen. | Initial weather satellites, low resolution | Low resolution | Every 30 min | Basic weather forecasting, images every 30 min |
| Meteosat Second Gen. | Higher spectral and temporal resolution, data every 15 min | High resolution | Every 15 min | Improved accuracy, early detection of weather events |
| Meteosat Third Gen. | High-precision weather imaging, lightning detection | High resolution | High frequency | High-precision weather imaging, lightning detection |
Figure 4. Table of Meteosat satellites. Source: own elaboration
Access to the data of each programme
Each programme has different conditions and distribution platforms in terms of access to data:
- Copernicus: provides free and open data through various platforms. Users can access satellite imagery and products through the Copernicus Data Space Ecosystem, which offers search, download and processing tools. Data can also be obtained through APIs for integration into automated systems (see the sketch after this list).
- Galileo: its Open Service (OS) allows any user with a compatible receiver to use the navigation signals free of charge. However, direct access to raw satellite data is not provided. For information on services and documentation, access is via the European GNSS Service Centre (GSC):
- Galileo Portal.
- Access to the High Accuracy Service (HAS) (registration required).
- EGNOS: This system improves navigation accuracy with GNSS correction signals. Data on service availability and status can be found on the EGNOS User Support platform.
- Meteosat: Meteosat satellite data are available through the EUMETSAT platform. There are different levels of access, including some free data and some subject to registration or payment. For imagery and meteorological products you can access the EUMETSAT Data Centre.
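As an example of automated access, the sketch below queries the Copernicus Data Space Ecosystem catalogue through its OData API for recent Sentinel-2 products. The endpoint and filter syntax follow the publicly documented interface, but they should be checked against the current documentation, and downloading the products themselves requires a free account.

```python
import requests

# Catalogue query against the Copernicus Data Space Ecosystem OData API.
# Endpoint and filter syntax follow the public documentation; verify before relying on them.
url = "https://catalogue.dataspace.copernicus.eu/odata/v1/Products"
params = {
    "$filter": (
        "Collection/Name eq 'SENTINEL-2' "
        "and ContentDate/Start gt 2024-06-01T00:00:00.000Z "
        "and ContentDate/Start lt 2024-06-10T00:00:00.000Z"
    ),
    "$top": 5,
}

resp = requests.get(url, params=params, timeout=60)
resp.raise_for_status()

# Print the names of the first matching products.
for product in resp.json().get("value", []):
    print(product["Name"])
```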
In terms of open access, Copernicus is the only programme that offers open and unrestricted data. In contrast, Galileo and EGNOS provide free services, but not access to raw satellite data, while Meteosat requires registration and in some cases payment for access to specific data.
Conclusions
The Copernicus, Galileo, EGNOS and Meteosat programmes not only reinforce Europe's space sovereignty, but also ensure access to strategic data essential for the management of the planet. Through them, Europe can monitor climate change, optimise global navigation, improve the accuracy of its positioning systems and strengthen its weather prediction capabilities, ensuring more effective responses to environmental crises and emergencies.
Spain plays a fundamental role in this space infrastructure, not only with its aerospace industry, but also with its control centres and reference stations, consolidating itself as a key player in the development and operation of these systems.
Satellite imagery and data have evolved from scientific tools to become essential resources for security, environmental management and sustainable growth. In a world increasingly dependent on real-time information, access to this data is critical for climate resilience, spatial planning, sustainable agriculture and ecosystem protection.
The future of Earth observation and satellite navigation is constantly evolving, and Europe, with its advanced space programmes, is positioning itself as a leader in the exploration, analysis and management of the planet from space.
Access to this data allows researchers, businesses and governments to make more informed and effective decisions. With these systems, Europe and Spain guarantee their technological independence and strengthen their leadership in the space sector.
Ready to explore more? Access the links for each programme and discover how this data can transform our world.
| Programme | Access link | Type of access |
|---|---|---|
| Copernicus | https://dataspace.copernicus.eu/ | Download centre |
| Meteosat | https://user.eumetsat.int/data-access/data-centre/ | Download centre |
| Galileo | https://www.gsc-europa.eu/galileo/services/galileo-high-accuracy-servic…/ | Download centre, after registration |
| EGNOS | https://egnos-user-support.essp-sas.eu/ | Project |
Figure 5. Data access links by programme. Source: own elaboration
Content prepared by Mayte Toscano, Senior Consultant in Data Economy Technologies. The contents and points of view reflected in this publication are the sole responsibility of the author.
