Sustainable Cities for Citizens

This exploratory tells stories about cities and the people living in them. Data scientists describe these territories by means of data, statistics and models. This allows citizens and local administrators to better understand cities and how to improve them.

Visit the VRE Public page

Migration Studies

This exploratory analyses the phenomenon of international migration with Big Data tools. We look at migration flows and stocks, migrant integration, cultural diversity, and the return of migrants.

Enter the Exploratory

Sports Data Science

This exploratory tells stories about sports analytics. Sports data scientists describe performances by means of data, statistics and models. This allows coaches, fans and practitioners to better understand and boost sports performance.

Enter the Exploratory

Societal Debates and Misinformation

By analysing discussions on social media and newspaper articles, in this exploratory we study public debates to understand which topics are most discussed. We identify themes, follow the discussions around them, and track them through time and space.

Visit the VRE Public page

Demography, Economy and Finance 2.0

This exploratory uses supermarket purchase data to investigate changes in people's behaviour after the economic crisis. The study allows us to work out an early indicator of disease. We also measure the real cost of living by studying price variation.

Furthermore, we try to correlate people's well-being with their social and mobility data, finding that these patterns change in poor areas.

Enter the Exploratory

Social Impact and Explainable AI

We are evolving, faster than expected, from a time when humans code algorithms and carry the responsibility for the resulting software's quality and correctness, to a time when sophisticated algorithms automatically learn to solve a task by observing many examples of the expected input/output behaviour.

Most of the time the internal reasoning of these algorithms is obscure even to their developers. For this reason, the last decade has witnessed the rise of a black box society. Black box AI systems for automated decision making, often based on machine learning over big data, map a user's features into a class predicting the behavioural traits of individuals, such as credit risk or health status, without exposing the reasons why.

This is troublesome not only for the lack of transparency but also because of possible biases inherited by the algorithms from human prejudices and collection artifacts hidden in the training data, which may lead to unfair or wrong decisions. It is therefore urgent to develop a set of techniques that allow users to understand why an algorithm made a decision.
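To make the idea concrete, here is a minimal, hedged sketch of one common approach: a local surrogate explanation, where a simple linear model is fitted around a single prediction of a black-box classifier so that its coefficients hint at why that decision was made. The classifier, dataset and feature names below are illustrative assumptions, not the exploratory's actual pipeline.

```python
# Sketch of a model-agnostic, local "why" explanation (local surrogate).
# Assumptions: scikit-learn is available; the black-box model and feature
# names are placeholders for illustration only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import Ridge

# A stand-in "black box": we never inspect its internal reasoning directly.
X, y = make_classification(n_samples=1000, n_features=5, random_state=0)
black_box = RandomForestClassifier(random_state=0).fit(X, y)

# The single instance whose decision we want to explain.
instance = X[0]

# Perturb the instance, query the black box on the perturbations, and fit a
# simple linear surrogate: its coefficients approximate how each feature
# pushed this particular prediction up or down.
rng = np.random.default_rng(0)
perturbations = instance + rng.normal(scale=0.5, size=(500, X.shape[1]))
black_box_scores = black_box.predict_proba(perturbations)[:, 1]
surrogate = Ridge(alpha=1.0).fit(perturbations, black_box_scores)

for name, weight in zip([f"feature_{i}" for i in range(X.shape[1])],
                        surrogate.coef_):
    print(f"{name}: {weight:+.3f}")
```

The weights printed at the end are only a local approximation around one decision; they do not describe the black box globally, which is precisely why such explanation techniques must be used with care.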

Enter the Exploratory