Altruistic projects to create AI models in co-official languages
Artificial intelligence (AI) assistants are already part of our daily lives: we ask them the time, for directions to a certain place, or to play our favorite song. And although AI may offer us endless functionality in the future, we must not forget that linguistic diversity is still a pe…
GeoPackage in INSPIRE: efficiency and usability for geospatial data
In the field of geospatial data, encoding and standardisation play a key role in ensuring interoperability between systems and improving access to information.
The INSPIRE Directive (Infrastructure for Spatial Information in Europe) lays down the general rules for the establishment of an Inf…
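The excerpt above is cut short, but it helps to see why GeoPackage is convenient: it is a single SQLite-based file that can hold several vector layers together with their metadata. Below is a minimal, illustrative sketch using geopandas; the file and layer names are hypothetical, and EPSG:4258 is ETRS89, the datum INSPIRE prescribes for Europe.

```python
# A minimal sketch of working with a GeoPackage in geopandas.
# File and layer names are hypothetical placeholders.
import geopandas as gpd

# Read one vector layer from a GeoPackage (a single SQLite-based
# container that can store many layers plus their metadata).
addresses = gpd.read_file("inspire_addresses.gpkg", layer="address")

# Reproject to ETRS89 (EPSG:4258), the datum INSPIRE prescribes for
# Europe, and write the result as a new layer in another GeoPackage.
addresses.to_crs(epsg=4258).to_file(
    "output.gpkg", layer="address_etrs89", driver="GPKG"
)
```

Because everything lives in one file, the result can be shared or published as-is, with no sidecar files to lose, which is part of GeoPackage's usability appeal.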
Linguistic corpora: the knowledge engine for AI
The transfer of human knowledge to machine learning models is the basis of all current artificial intelligence. If we want AI models to be able to solve tasks, we first have to encode solved tasks in a formal language they can process and transmit those tasks to them. We understand a solved task to be informa…
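To make the idea of "solved tasks" more concrete, here is a minimal, hypothetical sketch of how input/output pairs are often serialized into a corpus file. JSON Lines is one common choice; the example tasks and the file name are invented for illustration.

```python
# A hedged sketch: encoding "solved tasks" (input/output pairs) as a
# machine-readable corpus. All content here is illustrative.
import json

solved_tasks = [
    {"input": "Translate to English: 'Bon dia'", "output": "Good morning"},
    {"input": "Sentiment of: 'Great service!'", "output": "positive"},
]

# Write one JSON object per line (the JSON Lines convention), a format
# many training pipelines can stream without loading the whole file.
with open("corpus.jsonl", "w", encoding="utf-8") as f:
    for task in solved_tasks:
        f.write(json.dumps(task, ensure_ascii=False) + "\n")
```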
A common language to enable interoperability between open dataset catalogs
Open data plays a significant role in technological development for many reasons. For example, it is a fundamental component of informed decision-making, of process evaluation, and even of driving technological innovation. Provided they are of the highest quality, up-to-date, and ethically sound, data can…
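The excerpt does not name the vocabulary, but in the open data world the usual "common language" for describing dataset catalogs is DCAT (with DCAT-AP as its European profile). Assuming that is the subject here, the sketch below describes one catalog entry with rdflib; all URIs and metadata values are hypothetical.

```python
# A minimal sketch, assuming the vocabulary in question is DCAT:
# describing one catalog entry as RDF. URIs and values are hypothetical.
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DCAT, DCTERMS, RDF

g = Graph()
dataset = URIRef("https://example.org/catalog/dataset/air-quality")

g.add((dataset, RDF.type, DCAT.Dataset))
g.add((dataset, DCTERMS.title, Literal("Air quality measurements", lang="en")))
g.add((dataset, DCTERMS.publisher, URIRef("https://example.org/org/city-council")))

# Serialize to Turtle; any DCAT-aware harvester could ingest this.
print(g.serialize(format="turtle"))
```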
Hackathons, a new way of attracting talent
Technology is now an essential component of our daily lives. It is no secret that a large number of companies worldwide have been investing heavily to digitize their processes, products, and services and thus make them more innovative.
All this has led to an increase in th…
What are the advantages of participating in a hackathon or data-related competition like the Aporta Challenge?
Hackathons, contests, and challenges related to data are a different way to test your ideas and knowledge while acquiring new skills. These competitions seek solutions to real problems, often in multidisciplinary teams that bring together diverse knowledge and points of view. In additi…
Why should you use Parquet files if you process a lot of data?
It's been a long time since we first heard about the Apache Hadoop ecosystem for distributed data processing. Things have changed a lot since then, and we now use higher-level tools to build solutions on top of big data workloads. However, it is important to highlight some best practices related to ou…
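As a quick illustration of why Parquet suits heavy processing: its columnar layout compresses well and lets you read back only the columns you need. Below is a minimal sketch with pandas (using the pyarrow engine under the hood); the data and file name are placeholders.

```python
# A minimal sketch of the Parquet round trip with pandas.
# Data and file name are illustrative placeholders.
import pandas as pd

df = pd.DataFrame({
    "sensor_id": [1, 2, 3],
    "reading": [20.5, 21.1, 19.8],
    "ts": pd.to_datetime(["2023-01-01", "2023-01-02", "2023-01-03"]),
})

# Write with column-level compression.
df.to_parquet("readings.parquet", compression="snappy")

# Read back a single column; the columnar layout means the rest of
# the file never has to be decoded.
readings = pd.read_parquet("readings.parquet", columns=["reading"])
print(readings)
```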