10 posts found
AI tools for research and a new way to use language models
AI systems designed to assist us from the first dives to the final bibliography.
One of the missions of contemporary artificial intelligence is to help us find, sort and digest information, especially with the help of large language models. These systems have come at a time when we most need to mana…
Open source auto machine learning tools
The increasing complexity of machine learning models and the need to optimise their performance have been driving the development of AutoML (Automated Machine Learning) for years. This discipline seeks to automate key tasks in the model development lifecycle, such as algorithm selection, data process…
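To give a flavour of what an open-source AutoML tool looks like in practice, here is a minimal sketch using the TPOT library; TPOT is one possible choice (not necessarily among the tools the post reviews), and the dataset, split and search budget are illustrative assumptions.

```python
# Minimal AutoML sketch with TPOT, an open-source AutoML library.
# The dataset, split and search budget are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from tpot import TPOTClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# TPOT automatically searches over preprocessing steps and algorithms.
automl = TPOTClassifier(generations=5, population_size=20, random_state=42, verbosity=2)
automl.fit(X_train, y_train)

print("Hold-out accuracy:", automl.score(X_test, y_test))
automl.export("best_pipeline.py")  # export the winning scikit-learn pipeline as code
```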
Re3gistry: facilitating the semantic interoperability of data
The INSPIRE (Infrastructure for Spatial Information in Europe) Directive sets out the general rules for the establishment of an Infrastructure for Spatial Information in the European Community based on the Infrastructures of the Member States. Adopted by the European Parliament a…
Applying the UNE 0079:2023 data quality management specification to open data
We continue this second instalment of the article series with the application of the UNE specifications. First of all, let us recall that the UNE 0077, UNE 0078 and UNE 0079 Specifications introduce good practices in data governance, data management and data quality management with a…
Applying the UNE 0078:2023 Specification to open data
This article is the third and final instalment of the series devoted to applying the UNE specifications on data governance, data management and data quality management to the publication of open data. Recall that we are doing so by applying them to the fictitious case of…
Vinalod: The tool to make open datasets more accessible
Public administration is working to ensure access to open data, in order to empower citizens in their right to information. Aligned with this objective, the European open data portal (data.europa.eu) references a large volume of data on a variety of topics.
However, although the data belong to di…
Free tools to work on data quality issues
Ensuring data quality is an essential task for any open data initiative. Before publication, datasets need to be validated to check that they are free of errors, duplication, etc. In this way, their potential for re-use will grow.
Data quality is conditioned by many aspects. In this sense, the Aport…
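As a concrete illustration of the kind of pre-publication check described above, the following sketch uses pandas to flag duplicate rows and missing values before a dataset is released; the file name is a hypothetical placeholder, and the tools covered in the post may work quite differently.

```python
# Minimal pre-publication quality check with pandas.
# "dataset_to_publish.csv" is a hypothetical placeholder file name.
import pandas as pd

df = pd.read_csv("dataset_to_publish.csv")

report = {
    "rows": len(df),
    "duplicate_rows": int(df.duplicated().sum()),             # exact duplicate records
    "missing_values_per_column": df.isna().sum().to_dict(),   # gaps to review before release
}
print(report)
```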
Different approaches to identifying high-value data
Since the publication of Directive (EU) 2019/1024 on open data and the re-use of public sector information, the European Commission has been undertaking a number of actions to develop the concept of high-value data, which this directive introduced as an important novelty in June 2019.
We recall that high-value…
Technical Standards to achieve Data Quality
Transforming data into knowledge has become one of the main objectives for both public and private organizations today. But to achieve this, it is necessary to start from the premise that the data being processed is governed and of adequate quality.
In this sense, the Spanish Association for Standardi…
API Friendliness Checker. A much-needed tool in the age of data products.
Many people don't realise it, but we are surrounded by APIs. APIs are the mechanism by which services communicate on the Internet. APIs are what make it possible for us to log into our email or make a purchase online.
API stands for Application Programming Interface, which for most Internet users means no…
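To make the idea tangible, this is roughly what a programmatic API call looks like from the consumer's side; the endpoint, query parameters and response fields below are hypothetical and only illustrate the general pattern, not the API Friendliness Checker itself.

```python
# Illustrative API call using the requests library.
# The endpoint, query parameters and response fields are hypothetical.
import requests

response = requests.get(
    "https://api.example.org/v1/datasets",
    params={"q": "air quality", "page_size": 5},
    timeout=10,
)
response.raise_for_status()  # surface HTTP errors instead of failing silently

for dataset in response.json()["results"]:
    print(dataset["title"])
```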