47 posts found
The rise of predictive commerce: open data to anticipate needs
In a world where immediacy is becoming increasingly important, predictive commerce has become a key tool for anticipating consumer behaviors, optimizing decisions, and offering personalized experiences. It's no longer just about reacting to the customer's needs; it's about predicting what they…
How to ensure the authenticity of satellite imagery
Synthetic images are visual representations artificially generated by algorithms and computational techniques, rather than being captured directly from reality with cameras or sensors. They are produced by a variety of methods, among them generative adversarial networks (Generative Adversarial…
Altruistic projects to create AI models in co-official languages
Artificial intelligence (AI) assistants are already part of our daily lives: we ask them the time, how to get to a certain place, or to play our favorite song. And although AI, in the future, may offer us infinite functionalities, we must not forget that linguistic diversity is still a pe…
Federated machine learning: generating value from shared data while maintaining privacy
Data is a fundamental resource for improving our quality of life because it enables better decision-making processes to create personalised products and services, both in the public and private sectors. In contexts such as health, mobility, energy or education, the use of data facilitates more effic…
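The core idea the teaser describes, sharing value from data without sharing the data itself, is often implemented with federated averaging: each participant trains locally and only model parameters travel. A minimal sketch of that aggregation step, where all participants, weights and sample counts are illustrative assumptions rather than anything from the post:

```python
# Minimal sketch of federated averaging (FedAvg-style aggregation).
# Each participant trains a model on its own data and shares only the
# resulting parameters; the coordinator averages them, weighted by how
# much data each participant has. All numbers below are hypothetical.

def fed_avg(local_weights, sample_counts):
    """Weighted average of local model parameters by dataset size."""
    total = sum(sample_counts)
    dim = len(local_weights[0])
    return [
        sum(w[i] * n for w, n in zip(local_weights, sample_counts)) / total
        for i in range(dim)
    ]

# Three hypothetical hospitals each fit a 2-parameter model locally;
# their raw patient data never leaves the premises.
local_weights = [[0.9, 1.1], [1.1, 0.9], [1.0, 1.0]]
sample_counts = [100, 300, 600]

global_model = fed_avg(local_weights, sample_counts)
```

The weighting by sample count means participants with more data pull the global model further toward their local solution, which is the standard design choice in federated averaging.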
ALIA and foundational models: what are they and why are they key to the future of AI?
The enormous acceleration of innovation in artificial intelligence (AI) in recent years has largely revolved around the development of so-called "foundational models". Also known as Large [X] Models (LxM) or Foundation Models, they are defined by the Center for Research on Foundation Mod…
Using Pandas for quality error reduction in data repositories
There is no doubt that data has become a strategic asset for organisations. Today, it is essential to ensure that decisions are based on quality data, whatever discipline they serve: data analytics, artificial intelligence or reporting. However, ensuring data repositories with high levels…
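A taste of the kind of quality checks the post title refers to, deduplication, missing-value detection and a business-rule validation in Pandas. The table, column names and the age rule are illustrative assumptions, not taken from the article:

```python
import pandas as pd

# Hypothetical customer table containing typical quality errors:
# an exact duplicate row, a missing email and an impossible age.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, 3, 4],
    "email": ["a@x.com", "b@x.com", "b@x.com", None, "d@x.com"],
    "age": [34, 28, 28, -5, 41],
})

# 1. Remove exact duplicate rows.
df = df.drop_duplicates()

# 2. Count missing values in a mandatory field.
missing_emails = df["email"].isna().sum()

# 3. Validate a business rule: age must be positive.
invalid_age = df[df["age"] <= 0]
```

Running checks like these before data reaches analytics or AI pipelines is what keeps downstream decisions anchored to quality data.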
Understanding Word Embeddings: how machines learn the meaning of words
Natural language processing (NLP) is a branch of artificial intelligence that allows machines to understand and manipulate human language. At the core of many modern applications, such as virtual assistants, machine translation and chatbots, are word embeddings. But what exactly are they and why are…
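The key property of word embeddings, that similarity of meaning becomes geometric closeness between vectors, can be shown with cosine similarity. The three-dimensional vectors below are hand-made toy values for illustration, not learned embeddings (real ones typically have hundreds of dimensions):

```python
import math

# Toy "embeddings": hypothetical hand-made vectors, chosen so that
# related words point in similar directions.
emb = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.85, 0.82, 0.12],
    "apple": [0.10, 0.20, 0.95],
}

def cosine(u, v):
    """Cosine similarity: 1.0 for identical directions, ~0 for unrelated."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

king_queen = cosine(emb["king"], emb["queen"])
king_apple = cosine(emb["king"], emb["apple"])
# "king" ends up far closer to "queen" than to "apple".
```

Trained embeddings (e.g. word2vec or GloVe) learn these directions from word co-occurrence in large corpora, which is how machines approximate word meaning.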
The challenge of boosting the Data Economy by deploying the EU's digital diplomacy at the global level
The growing importance of data extends beyond the economic and social spheres of individual states to a multinational dimension, raising the challenges, opportunities, threats and uncertainties surrounding the development of the Data Economy to a global scale. In the case of the European Union, the issue…
Challenges and uncertainties for the deployment of the Data Economy in Europe
Four years after the publication of the European Commission's Communication 'A Data Strategy' (February 2020), which set out the broad outlines of…
Why data spaces?
A data space is a place where value is generated around data through voluntary sharing in an environment of sovereignty, trust and security. A data space enables participants to determine who accesses which data and under what conditions, thus facilitating the deployment of different use cases to mee…