12 posts found
AI tools for research and a new way to use language models
AI systems designed to assist us from the first explorations to the final bibliography.
One of the missions of contemporary artificial intelligence is to help us find, sort and digest information, especially with the help of large language models. These systems have come at a time when we most need to mana…
Open source automated machine learning (AutoML) tools
The increasing complexity of machine learning models and the need to optimise their performance have been driving the development of AutoML (Automated Machine Learning) for years. This discipline seeks to automate key tasks in the model development lifecycle, such as algorithm selection, data process…
What can we find in the CRUE report "Data Analytics at the University"?
In November 2023, Crue Spanish Universities published the TIC360 report "Data Analytics at the University". The report is an initiative of the Crue-Digitalisation IT Management working group and aims to show how optimising data extraction and processing is key to the generation…
The importance of the Telco sector in the deployment of a Digital Europe
Building Europe's digital infrastructure of tomorrow
As a global technology race unfolds, Europe is deploying the regulatory framework and investments needed to foster innovation and technological leadership in areas such as online platforms, artificial intelligence, data, cloud, quantum technologie…
Re3gistry: facilitating the semantic interoperability of data
The INSPIRE (Infrastructure for Spatial Information in Europe) Directive sets out the general rules for the establishment of an Infrastructure for Spatial Information in the European Community based on the Infrastructures of the Member States. Adopted by the European Parliament a…
Vinalod: The tool to make open datasets more accessible
Public administrations are working to ensure access to open data in order to empower citizens in their right to information. In line with this objective, the European open data portal (data.europa.eu) references a large volume of data on a wide variety of topics.
However, although the data belong to di…
Free tools to work on data quality issues
Ensuring data quality is an essential task for any open data initiative. Before publication, datasets need to be validated to check that they are free of errors, duplication, etc. In this way, their potential for re-use will grow.
Data quality depends on many factors. In this regard, the Aport…
The role of universities in the dissemination of open data
Spanish universities hold a wealth of quality data with great economic and social potential. For this reason, for some time now we have been witnessing a movement by universities in our country to open up their data, with the aim of promoting the use and reuse of the information they generate, as well as i…
The UK data strategy and its reform
Just over a year ago, after an extensive process of analysis, research and public consultation, the UK government announced its new national data strategy, created in response to the increasingly important role of data in all aspects of our society. The strategy builds on other related initiati…
Edge computing and its importance in real-time data management
Autonomous vehicles, smart waste management services, trainers that monitor how much we exercise... We live in an increasingly digital and connected environment, ever closer to the future we dreamed of as children. This is the so-called Internet of Things (IoT), a network of physical obj…