Federated machine learning: generating value from shared data while maintaining privacy
Data is a fundamental resource for improving our quality of life because it enables better decision-making and the creation of personalised products and services in both the public and private sectors. In contexts such as health, mobility, energy or education, the use of data facilitates more effic…
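As a purely illustrative aside (not part of the original post), the idea behind federated learning can be sketched in a few lines: each data holder trains on its own data and only the resulting model parameters are shared and averaged, so raw data never leaves its owner. All names and figures below are hypothetical.

```python
# Minimal federated averaging sketch (illustrative only, hypothetical data).
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Run a few gradient-descent steps of linear regression on local data only."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # mean-squared-error gradient
        w -= lr * grad
    return w

# Three hypothetical data holders, each with its own private dataset.
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

# Federated averaging: only weight vectors are exchanged, never the raw data.
global_w = np.zeros(2)
for _ in range(20):
    local_weights = [local_update(global_w, X, y) for X, y in clients]
    global_w = np.mean(local_weights, axis=0)

print("Recovered weights:", global_w)  # close to [2.0, -1.0]
```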
What data governance should look like in open source AI models
Open source artificial intelligence (AI) is an opportunity to democratise innovation and avoid the concentration of power in the technology industry. However, its development is highly dependent on the availability of high-quality datasets and the implementation of robust data governance framework…
The role of open data in the evolution of SLM and LLM: efficiency vs. power
Language models are at the epicentre of the technological paradigm shift that has been taking place in generative artificial intelligence (AI) over the last two years. From the tools with which we interact in natural language to generate text, images or videos and which we use to create creativ…
The unique relevance of interoperability in the Data Regulation (Data Act)
One of the main objectives of Regulation (EU) 2023/2854 of the European Parliament and of the Council of 13 December 2023 on harmonised rules for fair access to and use of data (Data Regulation) is to promote the development of interoperability criteria for data spaces, data processing services and smart cont…
The obligation to provide data to public bodies in exceptional situations in the Data Regulation (Data Act)
The recent Regulation (EU) 2023/2854 of the European Parliament and of the Council of 13 December 2023 on harmonised rules for fair access to and use of data (Data Act) introduces important new developments in European legislation to facilitate access to data generated by connected products and rela…
Linguistic corpora: the knowledge engine for AI
The transfer of human knowledge to machine learning models is the basis of all current artificial intelligence. If we want AI models to be able to solve tasks, we first have to encode solved tasks in a formal language that the models can process and transmit them in that form. We understand a solved task as informa…
How to measure carbon footprint using open data
The carbon footprint is a key indicator for understanding the environmental impact of our actions. It measures the amount of greenhouse gas emissions released into the atmosphere as a result of human activities, most notably the burning of fossil fuels such as oil, natural gas and coal. These gases,…
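As a purely illustrative aside (not part of the original post), a carbon footprint estimate typically multiplies activity data (energy consumed, fuel burned) by published emission factors and sums the results. The factors and consumption figures below are hypothetical placeholders; in practice they would come from open datasets of official emission factors.

```python
# Minimal carbon footprint sketch: activity data x emission factors (illustrative values).

# Emission factors in kg CO2-equivalent per unit of activity (hypothetical).
EMISSION_FACTORS = {
    "electricity_kwh": 0.25,   # kg CO2e per kWh
    "natural_gas_kwh": 0.18,   # kg CO2e per kWh
    "petrol_litre": 2.3,       # kg CO2e per litre
}

# Hypothetical annual activity data for a household.
activity = {
    "electricity_kwh": 3500,
    "natural_gas_kwh": 8000,
    "petrol_litre": 600,
}

def carbon_footprint(activity, factors):
    """Sum activity volumes weighted by their emission factors (kg CO2e)."""
    return sum(volume * factors[source] for source, volume in activity.items())

total_kg = carbon_footprint(activity, EMISSION_FACTORS)
print(f"Estimated footprint: {total_kg / 1000:.2f} tonnes CO2e per year")
```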
Big Data Test Infrastructure: A free environment for public administrations to experiment with open data
The Big Data Test Infrastructure (BDTI) is a tool funded by the European Digital Agenda, which enables public administrations to perform analysis with open data and open source tools in order to drive innovation.
This free-to-use, cloud-based tool was created in 2019 to accelerate d…
User access to data from connected products and related services in the new European Data Regulation (Data Act)
The adoption of Regulation (EU) 2023/2854 of the European Parliament and of the Council of 13 December 2023 on harmonised rules for fair access to and use of data (Data Act) is an important step forward in European Union legislation to facilitate data accessibility. This is an initiative already…
Challenges and uncertainties for the deployment of the Data Economy in Europe
Four years have passed since the publication of the European Commission's Communication 'A Data Strategy' (February 2020), which set out the broad outlines of…