18 posts found
What data governance should look like in open source AI models
Open source artificial intelligence (AI) is an opportunity to democratise innovation and avoid the concentration of power in the technology industry. However, its development is highly dependent on the availability of high-quality datasets and the implementation of robust data governance framework…
Data Sandboxes: Exploring the potential of open data in a secure environment
Data sandboxes are tools that provide us with environments to test new data-related practices and technologies, making them powerful instruments for managing and using data securely and effectively. These spaces are very useful in determining whether and under what conditions it is feasibl…
Global principles of AI journalism
General ethical frameworks
The absence of a common, unified ethical framework for the use of artificial intelligence in the world is only apparent and, in a sense, a myth. There are a multitude of supranational charters, manuals and sets of standards that set out principles of ethical use, although…
Big Data Test Infrastructure: A free environment for public administrations to experiment with open data
The Big Data Test Infrastructure (BDTI) is a tool funded under the European Digital Agenda that enables public administrations to analyse open data with open-source tools in order to drive innovation.
This free-to-use, cloud-based tool was created in 2019 to accelerate d…
Artificial intelligence to improve interoperability in the European public sector
The European Union has placed the digital transformation of the public sector at the heart of its policy agenda. Through various initiatives under the Digital Decade policy programme, the EU aims to boost the efficiency of public services and provide a better experience for citizens.…
Discover IATE: the European Union's inter-institutional terminology database
IATE, which stands for Interactive Terminology for Europe, is a dynamic database designed to support the multilingual drafting of European Union texts. It aims to provide relevant, reliable and easily accessible data with a distinctive added value compared to other sources of lexical informatio…
Segment Anything Model: Key Insights from Meta's Segmentation Model Applied to Spatial Data
Image segmentation is a method that divides a digital image into subgroups (segments) to reduce its complexity, thus facilitating its processing or analysis. The purpose of segmentation is to assign labels to pixels to identify objects, people, or other elements in the image.
Image segmentation is c…
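The per-pixel labelling idea described in this excerpt can be sketched with a toy threshold segmenter (the function name and threshold value are illustrative assumptions, not taken from the article; real models such as Meta's Segment Anything produce far richer masks):

```python
import numpy as np

def threshold_segment(image, threshold):
    """Assign label 1 to pixels brighter than the threshold, 0 otherwise.

    A minimal illustration of segmentation as per-pixel labelling:
    the output mask partitions the image into 'object' and 'background'
    segments. (Illustrative only; not the article's method.)
    """
    return (np.asarray(image) > threshold).astype(np.uint8)

# A 3x3 grayscale image: a bright "object" region against a dark background.
img = np.array([[10, 200, 210],
                [12, 220, 215],
                [ 8,  11,   9]])

mask = threshold_segment(img, 128)
# Each pixel now carries a segment label (1 = object, 0 = background).
```

Learned segmenters replace the fixed threshold with a model that predicts a label (or mask membership) for every pixel, but the output has the same shape: one label per pixel.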
European Webinars: Monitoring Climate Change and Digital Development with Open Data
The "Stories of Use Cases" series, organized by the European Open Data portal (data.europa.eu), is a collection of online events focused on the use of open data to contribute to common European Union objectives such as consolidating democracy, boosting the economy, combating climate change, and driv…
Initiatives for training machine learning models with open data
Behind a voice-enabled virtual assistant, a movie recommendation on a streaming platform, or the development of some COVID-19 vaccines, there are machine learning models. This branch of artificial intelligence enables systems to learn and improve their performance.
Machine learning (ML) is one of th…
Quantifying the value of data
There is a recurring question that has been around since the beginning of the open data movement, and as efforts and investments in data collection and publication have increased, it has resonated more and more strongly: What is the value of a dataset?
This is an extremely difficult question to answ…