GeoPackage in INSPIRE: efficiency and usability for geospatial data.
In the field of geospatial data, encoding and standardisation play a key role in ensuring interoperability between systems and improving accessibility to information.
The INSPIRE Directive (Infrastructure for Spatial Information in Europe) determines the general rules for the establishment of an Inf…
SLM, LLM, RAG and Fine-tuning: Pillars of Modern Generative AI
In the fast-paced world of Generative Artificial Intelligence (AI), several concepts have become fundamental to understanding and harnessing the potential of this technology. Today we focus on four: Small Language Models (SLMs), Large Language Models (LLMs), Retrieval Augmented Generation…
GraphQL: your best ally for the creation of data products.
The era of digitalisation in which we find ourselves has filled our daily lives with data products or data-driven products. In this post we discover what they are and show you one of the key data technologies for designing and building this kind of product: GraphQL.
Introduction
Let's start at the beginni…
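As a taste of what the post covers: a GraphQL request is just a query document sent as JSON over HTTP, and the server answers with JSON shaped exactly like the query. A minimal sketch in Python, using only the standard library; the schema (`product`, `name`, `price`) and the sample response are invented for illustration:

```python
import json

# A GraphQL query asks for exactly the fields the client needs.
# The schema (product, name, price) is hypothetical, for illustration.
query = """
query GetProduct($id: ID!) {
  product(id: $id) {
    name
    price
  }
}
"""

# Over HTTP, the client POSTs a JSON payload carrying the query
# document and its variables.
payload = json.dumps({"query": query, "variables": {"id": "42"}})

# The server's JSON response mirrors the shape of the query,
# so the client never receives fields it did not ask for.
response = json.loads('{"data": {"product": {"name": "Sensor", "price": 19.9}}}')
print(response["data"]["product"]["name"])  # -> Sensor
```

This query-mirrors-response symmetry is what makes GraphQL attractive for data products: consumers fetch precisely the slice of data they need in one round trip.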
UNE specifications as a complement to ISO standards for the governance, management and quality of Information Systems and Technologies
Standardisation is essential to improve efficiency and interoperability in governance and data management. The adoption of standards provides a common framework for organising, exchanging and interpreting data, facilitating collaboration and ensuring data consistency and quality. The ISO standards,…
RAG - Retrieval Augmented Generation: The key that unlocks the door to precision language models
Teaching computers to understand how humans speak and write is a long-standing challenge in the field of artificial intelligence, known as natural language processing (NLP). However, in the last two years or so, we have seen the fall of this old stronghold with the advent of large language models (L…
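The core RAG loop can be sketched with nothing but the standard library: retrieve the passage most relevant to the question, then inject it into the prompt sent to the language model. A toy sketch; the word-overlap scoring is a stand-in for the vector-embedding similarity a real system would use, and the documents are invented:

```python
import re

def retrieve(question: str, documents: list[str]) -> str:
    """Return the document sharing the most words with the question.

    Real RAG systems rank passages by embedding similarity; plain
    word overlap keeps this sketch dependency-free.
    """
    q_words = set(re.findall(r"\w+", question.lower()))
    return max(documents, key=lambda d: len(q_words & set(re.findall(r"\w+", d.lower()))))

def build_prompt(question: str, context: str) -> str:
    # The retrieved passage is prepended to the prompt so the model
    # answers from it rather than from its parametric memory alone.
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

docs = [
    "The INSPIRE Directive sets rules for spatial data in Europe.",
    "Parquet is a columnar storage format for big data workloads.",
]
prompt = build_prompt("What is Parquet?", retrieve("What is Parquet?", docs))
print(prompt.splitlines()[1])  # the retrieved context line
```

Grounding the model on retrieved passages is what gives RAG its precision: answers can cite up-to-date, domain-specific sources the model never saw during training.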
A common language to enable interoperability between open dataset catalogs
Open data plays a relevant role in technological development for many reasons. For example, it is a fundamental component in informed decision making, in process evaluation or even in driving technological innovation. Provided they are of the highest quality, up-to-date and ethically sound, data can…
Improving efficiency in the legal sector: LegalTech and data analytics
Digital transformation affects all sectors, from agriculture to tourism and education. Among its objectives is the optimisation of processes, the improvement of the customer experience and even the promotion of new business models.
The legal sector is no exception, which is why in recent years…
We tested spaCy: much more than a library for creating real natural language processing projects
Few abilities are as characteristic of human beings as language. According to the Aristotelian school, humans are rational animals who pursue knowledge for the mere fact of knowing. Without going into deep philosophical considerations that far exceed the purpose of this space for dissemination, we c…
The UK data strategy and its reform
Just over a year ago - and after an extensive process of analysis, research and public consultation - the UK government announced its new national data strategy created in response to the increasingly important role of data in all aspects of our society. The strategy builds on other related initiati…
Why should you use Parquet files if you process a lot of data?
It's been a long time since we first heard about the Apache Hadoop ecosystem for distributed data processing. Things have changed a lot since then, and we now use higher-level tools to build solutions based on big data payloads. However, it is important to highlight some best practices related to ou…