Our Guest Pau Garcia Discusses
Pau Garcia: How AI Is Connecting People With Their Pasts
Today on Digital Disruption, we’re joined by Pau Garcia, media designer and founder of Domestic Data Streamers.
Pau is a media designer and the founder of Domestic Data Streamers, a Barcelona-based studio that has been creating immersive “info-experiences” and generative AI projects since 2013. His work spans over 45 countries, collaborating with institutions like the United Nations, Barcelona City Hall, and Citizen Lab. Pau is the chair of the Master in Data in Design program at ELISAVA University and is a guest lecturer at institutions including the School of Visual Arts (NY), the Hong Kong Design Institute, the Royal College of Art (London), and the Barcelona School of Economics. In 2021, he founded HeyHuman!, an artist residency program that merges music, journalism, and data to drive artistic research and social justice.
Pau sits down with Geoff to talk about how his organization is unlocking some of the most human experiences through synthetic memories. This initiative uses generative AI to recreate lost visual memories, particularly for refugees, migrants, and those affected by war or displacement.
They discuss how AI is not just shaping the future but also helping to reclaim the past, the conversation that led to the creation of this technology, and the ethical considerations of working with vulnerable communities. The project is also being explored in dementia care, where AI-generated images are being tested as a tool for reminiscence therapy.
00;00;01;03 - 00;00;30;03
Geoff Nielson
I'm super excited to talk to Pau today about the work being done at Domestic Data Streamers. They are using design, they're using art, they're using data to unlock some of the most human experiences with AI. They've eviscerated the idea that AI is this, you know, kind of cold, neutral technology. And I want to hear a lot about what they're doing and how they're helping us humanize all of our experiences with AI.
00;00;30;03 - 00;00;33;18
Geoff Nielson
So it should be a great conversation.
00;00;33;20 - 00;00;54;07
Geoff Nielson
This is Digital Disruption. I'm Geoff Nielson, and joining me today is Pau Garcia of Domestic Data Streamers. Pau, I'm so excited to talk to you today about everything Domestic Data Streamers has been working on in the world of synthetic memories and beyond. So maybe we could just start off by talking a little bit about: what are synthetic memories?
00;00;54;07 - 00;00;58;10
Geoff Nielson
What is this project you've been working on, and what's the impact of it?
00;00;58;13 - 00;01;50;01
Pau Aleikum Garcia
Synthetic Memories is the reconstruction of someone's visual memory using generative AI models. It's a project we started two years ago, almost three years ago now. It came from the idea of helping people who, because of war, political persecution, natural disasters, or any of those reasons, have had to leave their country, sometimes migrating and leaving a lot behind, including photo albums, diaries, a big part of the subjective, individual, cultural visual heritage that they had. The idea evolved into a set of methodologies and tools.
00;01;50;03 - 00;01;55;23
Pau Aleikum Garcia
And now almost a whole foundation that is doing these projects all over the world.
00;01;55;26 - 00;02;18;24
Geoff Nielson
That's so amazing. It's honestly one of the coolest applications of generative AI I've ever come across. You talk about it specifically with refugees, or people who have been displaced by war. Was that the genesis of the idea? How did this come about initially?
00;02;18;26 - 00;02;49;13
Pau Aleikum Garcia
So at the studio, Domestic Data Streamers, we have been working for a very long time with data in a lot of different social contexts. One of those was in 2014: we were in Greece, in Athens, during one of the biggest refugee crises in Europe in recent decades. I think more than 3 million refugees from Syria came over the border and were placed in different locations, in Athens specifically.
00;02;49;16 - 00;03;14;12
Pau Aleikum Garcia
So we were helping them settle into different old schools, abandoned hospitals, and spaces that they had the ability to inhabit for a while. And I remember one night we were having dinner with an elderly woman, and she told me, "Well, Pau, I am not afraid of being a refugee now. What I'm afraid of is that my grandkids will be refugees for a very long time."
00;03;14;14 - 00;03;39;23
Pau Aleikum Garcia
And I said, "How come?" And she said, "Well, the thing is that our home does not exist anymore. Our neighborhoods don't exist. Our photo albums don't exist. A big part of the things that somehow build up our identity doesn't exist. And when my grandkids ask, where do I come from, there will be very few things that can answer that."
00;03;39;25 - 00;04;09;14
Pau Aleikum Garcia
And that conversation, I think, was kind of the seed, the trigger for understanding how important images and physical spaces are for cultural identity, and for subjective identity too. Then in 2020 we were one of the early studios that could actually do some testing with OpenAI, and we were already trying to figure out what we could do with this amazing technology.
00;04;09;16 - 00;04;33;19
Pau Aleikum Garcia
And we said, okay, what if we use it for that specific purpose? So we started to do very simple experiments, first here in Barcelona, our hometown. We invited several participants, elderly people, to come and talk about their memories. And from there we saw the impact it had on them, just seeing images that before existed only in their heads.
00;04;33;19 - 00;04;48;27
Pau Aleikum Garcia
Right after orally expressing their memories, they were able to actually see them in front of them. Seeing how impactful that was for them gave us enough energy to build up the project from there.
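For readers curious what this memory-to-image loop might look like in code, here is a minimal sketch. It assumes the Hugging Face diffusers library and an early Stable Diffusion checkpoint; the prompt wording and helper function are illustrative assumptions, not the studio's actual pipeline.

```python
# Minimal sketch of the memory-to-image loop described above.
# Assumes the Hugging Face `diffusers` library; the model choice and
# prompt wording are illustrative, not the studio's actual pipeline.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

def generate_memory_image(description: str, seed: int = 0):
    """Turn an orally described memory into a candidate image."""
    generator = torch.Generator("cuda").manual_seed(seed)
    prompt = f"soft-focus, dreamlike archival photo: {description}"
    return pipe(prompt, generator=generator).images[0]

# The prompter iterates with the interviewee: adjust details, reseed,
# and regenerate until the image clicks as a "vector of memory".
image = generate_memory_image(
    "two brothers playing in a wheat field with their mother, 1950s"
)
image.save("memory_candidate.png")
```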
00;04;48;29 - 00;04;57;24
Geoff Nielson
And what was that impact? How did they react to it, and how did you feel hearing their reactions?
00;04;57;26 - 00;05;27;29
Pau Aleikum Garcia
So for me it was very weird. When we first thought about the project, it felt like Minority Report, this kind of almost dystopian, futuristic artistic idea. But we had this instinct that it could be helpful, so we tried it nevertheless. The first participants were people we knew, or friends of friends, or grandparents of people that we knew.
00;05;28;01 - 00;06;00;27
Pau Aleikum Garcia
Right, so a close environment; we were comfortable there. With these participants, it was a bit different for each one. But I remember the second participant. As he explained his memory, he was talking about his brother, who had died almost ten years before. He was talking about when they were kids and they used to do things in the fields with their mother, and on their own.
00;06;00;27 - 00;06;24;04
Pau Aleikum Garcia
And then we generated an image of that specific situation. When the participant saw the image, he started to cry. It was like, "Oh, I never had an image of my brother like this, and this is a very beloved memory of mine." So just being able to tangibly, physically...
00;06;24;04 - 00;06;51;21
Pau Aleikum Garcia
...materialize this into an image was impactful. For other people it was a bit different. Some people actually wanted to create images of traumatic things, things that happened to them that were important but were not happy memories. For them, it was more about dignifying the past, about saying: that happened.
00;06;51;23 - 00;07;27;13
Pau Aleikum Garcia
"And this is another proof that this happened, that it's not only in my head. This is another medium I can now use to explain what really happened to me and how I got here." We have been doing this for two years with a lot of different people who really wanted to share traumatic events and situations that were really difficult for them, but somehow externalizing them and transforming them into an image, which is a universal language, so to speak, was also kind of liberating.
00;07;27;16 - 00;07;34;11
Pau Aleikum Garcia
I have seen a lot of people feeling a kind of release when they see the image.
00;07;34;13 - 00;07;57;20
Geoff Nielson
Yeah. I mean, it's so interesting. And as you said, it's such an emotional journey for these people, whether it's positive emotion or trauma or both. Did you ever feel, Pau, doing this, that maybe we've gone too far as a studio, or maybe there's something happening here that's beyond our level of expertise?
00;07;57;27 - 00;08;07;10
Geoff Nielson
Do we work directly with people? Do we work with other organizations? Or did you feel firmly in control of where this was going?
00;08;07;13 - 00;08;32;14
Pau Aleikum Garcia
I think at the beginning we didn't, and that's why we only worked with family members and people close to us. Then we started to see that we could create a methodology that was more controlled, where there were certain factors we could actually isolate and guide very well, and we felt much better about it.
00;08;32;17 - 00;08;55;23
Pau Aleikum Garcia
But when we started to work with, for example, migrant communities, we joined forces with different organizations. Something we always do whenever we generate synthetic memories is have three people involved: the interviewee, the interviewer, and the prompter. The interviewer has to be from the same community as the interviewee.
00;08;55;25 - 00;09;38;12
Pau Aleikum Garcia
That means they share a common background, they understand each other in their mother tongue, and so on. That's a way to really reduce the potential risks of misunderstanding and misalignment. And when I say community, I mean someone who has experienced something similar, even if sometimes that's very difficult because of the time gap and the different political moments in different parts of the world. But we always try to work with local organizations that already have this local community.
00;09;38;14 - 00;09;54;17
Geoff Nielson
Right. So you mentioned Greece and the Syrian refugee camps. In what sorts of places are you actually deploying this right now? Do you have a broader vision for where it could be, and where does this go from here?
00;09;54;19 - 00;10;24;16
Pau Aleikum Garcia
Yeah, I think for us probably one of the best moments was when we started to work with care homes and nursing homes. I had a friend who was working in a nursing home, taking care of twelve patients with dementia, and he was telling me about a methodology he was following called reminiscence therapy.
00;10;24;19 - 00;10;48;14
Pau Aleikum Garcia
Maybe you have heard of it; it's a kind of therapy that is very common, specifically in the States, where it has grown since the 60s. It's the use of, for example, music from that person's era, or a food, or a smell, or the voice of someone the person recognizes, to trigger a kind of cognitive enhancement.
00;10;48;16 - 00;11;14;11
Pau Aleikum Garcia
It's temporary; it lasts for a while. And we said, what if we cross these methodologies and do some pilot experiments? So we did a couple of pilots here in Barcelona, and the results were really, really good; the engagement of the participants was really high. Right now we are actually deploying a more clinical perspective on this.
00;11;14;13 - 00;11;50;25
Pau Aleikum Garcia
We have partnerships with the University of Toronto and the University of British Columbia, and we are exploring exactly the best way to integrate this into a clinical environment in a safe, ethical, protected, and controlled way. The first results have been really, really good. In that sense, for me, the future is very much in that direction: understanding that this could actually be used for therapy. I have seen it work, but that's subjective; it's not yet...
00;11;50;27 - 00;12;12;23
Pau Aleikum Garcia
...academic in that sense. But I think with three or four years of research with the right partners, we will get into a really good position, and this could be actively deployed and used in a lot of different centers at almost no cost. That's what is interesting: it's very accessible.
00;12;12;23 - 00;12;40;18
Pau Aleikum Garcia
You don't need really hardcore computing power; you don't need more than an internet connection and a $20-per-month subscription. So it's something that could be very accessible for a lot of people, and it could actually improve the lives not only of patients with dementia or Alzheimer's, but also family members and caregivers, as well as all the communities that have lost these images for reasons other than health.
00;12;40;20 - 00;13;03;28
Geoff Nielson
One of the things I found so amazing about this, and what caught me by surprise, is that when you actually look at the imagery, it's not picture perfect, right? It's not this high-quality, perfect-fidelity, Hollywood look, which is one of the concerns people have about AI imagery: it's got these shortcomings.
00;13;04;01 - 00;13;18;07
Geoff Nielson
But having heard you talk about it, that's not really a shortcoming, right? It almost creates an advantage that it has a dreamlike quality. How does that work, and how are you using it to your advantage?
00;13;18;09 - 00;13;50;21
Pau Aleikum Garcia
It was purely accidental. We were testing the very, very early image models of generative AI, and this is simply how those algorithms were at that point. But when we started to use more realistic models, we saw that engagement started to be poorer, lower; people started to feel less connected to the images we were generating.
00;13;50;23 - 00;14;11;20
Pau Aleikum Garcia
We stayed there for a while, wondering: what's going on? Maybe we are doing something wrong. Then one day we did the experiment of going back to one of the early models, and participants started to engage again, and we said, okay, that was the problem. And this is, of course, only a hypothesis.
00;14;11;20 - 00;14;45;05
Pau Aleikum Garcia
But the hypothesis is that it's not the factual accuracy of an image that brings this connection, but the emotional embedding. Whenever something is super realistic, without knowing it, subconsciously, you try to find the things that were not exactly like you remember. When you make something that is more blurry, less defined, a bit more abstract, it's more about the symbolic representation, and therefore you're not focused on the detail, because you understand that this is not a photo of the moment.
00;14;45;07 - 00;15;08;20
Pau Aleikum Garcia
So what we are creating here, instead of a photograph, is an image that works as a vector of memory. If I say that this ring I have here was given to me by my grandmother, this ring will not only be a ring; it will be my grandmother's ring, and it will be a vector to my memory of my grandmother, and so on.
00;15;08;20 - 00;15;14;18
Pau Aleikum Garcia
So I think these images are artificially generated vectors of memory.
00;15;14;20 - 00;15;22;27
Geoff Nielson
Right. So they're more like a vessel that people project their own memories onto, versus being perfectly accurate.
00;15;22;28 - 00;15;44;06
Pau Aleikum Garcia
And I think the blurriness and low definition also match very well the way our memory works. Unless you have a photographic memory, which is a very low percentage of society, the way most of us remember things is that we remember a couple of details of a situation.
00;15;44;06 - 00;16;12;21
Pau Aleikum Garcia
And then every time we remember, we reconstruct the rest. I think that's very similar to how these models work: they do probabilistic imagination, at some point, and when they are more blurry, they focus more on the things you specifically said were there. So I think that's one thing early models and human, biological memories have in common.
00;16;12;24 - 00;16;41;16
Geoff Nielson
Yeah, it's so interesting, and not something I would have expected. I'm sure it also makes it easier to do with less computing power or less advanced models, which is so cool. As you talk about this methodology, how you get it right, and the sensitivity of it, were there other lessons, other things you found where you said, this is something you have to get right?
00;16;41;16 - 00;16;46;14
Geoff Nielson
Something that makes a big difference for the interviewee?
00;16;46;16 - 00;17;10;22
Pau Aleikum Garcia
I think the most important thing is that everyone knows what we are going to do. When we have that interaction and say, hey, we are going to reconstruct visual memories from your past, for most people it sounds like science fiction. So something we always do is start with a bit of play.
00;17;10;25 - 00;17;32;03
Pau Aleikum Garcia
We play with the models and try to generate things. For example, we start with dreams, and we show how the models work. When they know what the models can do and they feel in control of the situation, then we can start actually asking things. And we always start with: what is your earliest memory?
00;17;32;05 - 00;17;58;16
Pau Aleikum Garcia
From there we start to find the best way to create a symbolic image of a whole story, because normally these images try to encapsulate a story that is bigger than the image itself. It's not just that you remember a specific place or a specific person; it's more about what that person or that place meant to you.
00;17;58;19 - 00;18;12;29
Pau Aleikum Garcia
Depending on that, you will build the image one way or another. So it's always about the story, and having the instinct to find the right image to represent the story.
00;18;13;01 - 00;18;25;08
Geoff Nielson
Yeah, that makes sense. You don't want to start with something traumatic. You want to ease them into what the technology can do and build up toward it. It sounds almost like a game.
00;18;25;10 - 00;18;36;03
Pau Aleikum Garcia
And what is very important, of course, is to explain that you are not creating a photographic image of that exact moment, because otherwise people can feel disappointed at that point.
00;18;36;05 - 00;18;37;20
Geoff Nielson
Setting the expectations.
00;18;37;22 - 00;19;00;18
Pau Aleikum Garcia
Yes: "Where is my mother? I was expecting to see her." So I think setting up the expectations and showing examples of things that have been done before is important. And then, whether you are an interviewer or a prompter, you have to be ready for whatever happens.
00;19;00;18 - 00;19;21;18
Pau Aleikum Garcia
Because sometimes people connect right away, and sometimes it takes time for people to go through it and say, "Oh no, it was not exactly like this. It was during the night, and the table was green, not brown." Then you need to fine-tune and find the right way to get to the image.
00;19;21;18 - 00;19;47;04
Pau Aleikum Garcia
And sometimes you don't get to the image and you need to find another memory. I remember an interview we had with a 94-year-old man, an engineer, and he remembered very well, from when he was four years old, the first time he was in a car. He remembered the model of the car, the color of the seats.
00;19;47;07 - 00;19;58;24
Pau Aleikum Garcia
He remembered everything in so much detail that it was really difficult to reconstruct that image, and we spent almost an hour and a half building up that memory.
00;19;58;26 - 00;20;00;03
Geoff Nielson
Prompting. Yeah.
00;20;00;05 - 00;20;17;10
Pau Aleikum Garcia
When he saw it at the end, he said, "Yes, that was it." And it was a joy for everyone. We had spent an hour and a half just figuring it out, but it was worth it. So sometimes you need patience with this.
00;20;17;12 - 00;20;48;01
Geoff Nielson
It's such an amazing technology. And hearing how you are using it, it's very clearly centered on being a force for good: helping people in need, whether it's refugees or people trying to reconstruct their memories because of Alzheimer's or advanced age. Have you been approached by, or are you working with, any more commercial organizations that are interested in this for corporate use or marketing or anything like that?
00;20;48;03 - 00;20;58;00
Geoff Nielson
If so, what does that look like? Does that application concern you at all, or how do you feel about it being used more broadly?
00;20;58;03 - 00;21;28;11
Pau Aleikum Garcia
I mean, we have collaborated with Google; it has been with Google Arts & Culture, which in the end is the foundation side, and it's trying to push this project to expand into other spaces. As for commercial use, I saw a company that was actually doing something similar for people who were dying and wanted to reconstruct the bulk of their life.
00;21;28;13 - 00;21;51;06
Pau Aleikum Garcia
That was kind of a commercial project, not very successful; I think the marketing was quite creepy. I guess at some point there will be commercial interest. In our case, this is part of a bigger line of research we are doing with artificial intelligence, specifically generative AI. Of course, there is this dichotomy now.
00;21;51;06 - 00;22;12;17
Pau Aleikum Garcia
There are people saying this is the end of the world, this will destroy everything we know, and so on. And we have these other people with a very naive perspective that artificial intelligence will solve all our problems: we will not need to work anymore, we will find cures to all our diseases.
00;22;12;19 - 00;22;32;13
Pau Aleikum Garcia
I think we are a bit in between, with a critical perspective on how we can use it, taking into account the responsibility of holding something like this. And I really think that for our generation it's too late to be pessimistic, so we need to build things with this.
00;22;32;18 - 00;22;59;06
Pau Aleikum Garcia
It is a really, really amazing technology that can actually change a lot of things. But it's not neutral; these kinds of technologies, artificial intelligence, are never neutral. I think it was Nicholas Carr who said that the value of a well-made tool lies not only in what it produces for us, but in what it produces in us, within us.
00;22;59;07 - 00;23;24;05
Pau Aleikum Garcia
So when we are creating systems, technologies, methodologies, tools, I always try to think more in that dimension. Maybe that's why we are not so interested in the more commercial aspect of the technology. Of course it drives the market, it drives a lot of things, but sometimes it doesn't drive a lot of impact in people.
00;23;24;08 - 00;23;55;22
Pau Aleikum Garcia
So yes, I think Synthetic Memories is a good example of this: something that started as a kind of artistic experiment and then grew from there. Now it's funded mainly with research grants, by people who are really hopeful that this could actually be used for dementia patients, which is a gigantic problem in the world right now, with a society that is getting older and older.
00;23;55;24 - 00;24;15;28
Pau Aleikum Garcia
So yeah, I think that's one way. But we have so many other projects like this. One is Skeptic Reader, which we released a month ago. It's a plugin we published online for free, and it's kind of a bullshit detector for the internet. You can use it on YouTube or any news media channel.
00;24;15;28 - 00;24;43;20
Pau Aleikum Garcia
It uses standard LLMs to find logical fallacies and ways that articles could be manipulating you, or are biased toward one side and not the other, or not giving all the information or the contextual information, so that you can actually form a critical opinion. This again is another example of how we can use this technology to engage with information in a deeper way.
00;24;43;26 - 00;25;03;00
Pau Aleikum Garcia
Synthetic Memories is more about how we can connect with each other and with our past in a deeper way using this tech; Skeptic Reader is about using the technology to be more critical. Instead of just deciding whether something is true or not, it tries to make you more critical about what you read.
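For the technically curious, here is a minimal sketch of how an LLM-based critique pass like the one Pau describes could be wired up. It assumes the OpenAI Python client; the prompt, model choice, and function name are illustrative assumptions, not Skeptic Reader's actual implementation.

```python
# Illustrative sketch of an LLM-based "bullshit detector" pass over an
# article, in the spirit of Skeptic Reader; the prompt and model choice
# are assumptions, not the plugin's actual code.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROMPT = """Analyze the article below. List:
1. Logical fallacies, quoting the relevant passage for each.
2. Signs of one-sided framing or missing context.
Answer as short bullet points.

ARTICLE:
{article}"""

def critique_article(article_text: str) -> str:
    """Return a short critical reading of one article or transcript."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user",
                   "content": PROMPT.format(article=article_text)}],
    )
    return response.choices[0].message.content

# A browser plugin would grab the page text (or a YouTube transcript)
# and render this critique alongside the original article.
```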
00;25;03;02 - 00;25;06;10
Geoff Nielson
Oh, and I missed it. What's that technology called?
00;25;06;12 - 00;25;08;13
Pau Aleikum Garcia
Skeptic Reader.
00;25;08;13 - 00;25;09;07
Geoff Nielson
Skeptic Reader, I see.
00;25;09;15 - 00;25;15;17
Pau Aleikum Garcia
Yeah, it's a Chrome and Firefox plugin, and I think it's very...
00;25;15;20 - 00;25;19;13
Geoff Nielson
Yeah. As you said, it's a bullshit detector, right?
00;25;19;15 - 00;25;24;03
Pau Aleikum Garcia
Exactly. If someone is listening to this podcast, they can actually run it on this podcast.
00;25;24;03 - 00;25;27;15
Geoff Nielson
So it also uses audio, not just text?
00;25;27;18 - 00;25;31;25
Pau Aleikum Garcia
Yes. Well, it uses the transcripts of the YouTube videos.
00;25;31;27 - 00;25;38;09
Geoff Nielson
I'm so curious to try it later and see if you're full of shit on this podcast. That's hilarious.
00;25;38;11 - 00;25;44;12
Pau Aleikum Garcia
It always finds something. Yeah, I'm sorry, it always finds a way.
00;25;44;14 - 00;25;52;29
Geoff Nielson
Yeah. So is it typically pointed at journalism websites or, I don't know, government websites? What is it typically pointed at?
00;25;53;01 - 00;26;20;12
Pau Aleikum Garcia
Mainly media outlets, so everything from Fox News to The Guardian and everything in between. But of course now a lot of people use YouTube as an information channel, and any kind of channel like BBC News, and all the daily coverage of wars, political situations, natural disasters, and so on.
00;26;20;14 - 00;26;26;29
Geoff Nielson
And I've just got to be curious: how long does it take to actually analyze an article or a piece of media?
00;26;27;02 - 00;26;29;18
Pau Aleikum Garcia
We're talking about five seconds, something like that.
00;26;29;24 - 00;26;39;00
Geoff Nielson
Yeah. I mean, that's such a powerful tool for consumers: before they even consume something, is this worth my time? What kind of slant does it have?
00;26;39;00 - 00;27;06;05
Pau Aleikum Garcia
I'm starting to doubt that. There is something we are skeptical about in our own creation, which is that people, I don't want to say are lazy in general, but a lot of people like comfort. And this plugin actually makes you uncomfortable sometimes. For example, there are journalists that I love; I love how they write and how they think, and so on.
00;27;06;11 - 00;27;25;17
Pau Aleikum Garcia
And I started to use this plugin with them, because I won't use it with journalists where I can already smell the bullshit; I will use it with the journalism that I like. And I started to see certain biases and certain situations where I said, oh, this is making me think more and more.
00;27;25;23 - 00;27;45;08
Pau Aleikum Garcia
And sometimes you need that. The people using it right now, that I know of, mainly come from academia: people already in that space of research who really want to get to the bottom of things.
00;27;45;10 - 00;28;03;04
Geoff Nielson
Well, there's a separate issue; maybe there are two separate issues. The first one is: are you actually willing to stop reading journalists that you like if there's bias? And the second one is: is there any journalism that is truly free of bias, or close to free of bias?
00;28;03;04 - 00;28;08;14
Geoff Nielson
Because if there isn't anywhere to turn to where there isn't that bias, you're kind of stuck, right?
00;28;08;16 - 00;28;14;06
Pau Aleikum Garcia
Well, I don't think so. I don't think that because something is biased you should stop reading it.
00;28;14;06 - 00;28;15;21
Geoff Nielson
Right, it's about awareness.
00;28;15;23 - 00;28;37;21
Pau Aleikum Garcia
Yes. It's just understanding that this is not neutral, that this is not the truth; it's just a perspective. And I think that is missed most of the time. Right now, in this moment of polarization in most of the Western world, you can feel that people say things as if theirs were the only truth possible.
00;28;37;23 - 00;29;01;21
Pau Aleikum Garcia
And this is very dangerous. Skeptic Reader was actually designed to kind of depolarize, to say: even when you read people that you like, you can find problems, in this sense. So yeah, it's also about showing that there are a lot of gray areas. I think simplistic messages are very fashionable now.
00;29;01;21 - 00;29;08;01
Pau Aleikum Garcia
And I think we sometimes need more gray, more nuanced spaces. Yeah, I don't know.
00;29;08;07 - 00;29;10;03
Geoff Nielson
More skepticism and nuance.
00;29;10;04 - 00;29;36;17
Pau Aleikum Garcia
Yes, and not only skepticism; it's also about creating common spaces that join people from different perspectives. I think if you can be critical of the people and journalists that you like, then you are also opening a door for people who seem totally different to share this conversation, and to become more aware that you can be similar to them in so many ways.
00;29;36;17 - 00;29;41;27
Pau Aleikum Garcia
Right. So it's more about building tools for critical thinking.
00;29;41;29 - 00;30;00;17
Geoff Nielson
I love that. To me, it comes back to what you were talking about with this mission of how you use these tools to unlock something within us, or humanize us more in some way. Am I articulating that properly in terms of your mission for the studio?
00;30;00;24 - 00;30;02;29
Geoff Nielson
Would you frame that a little bit differently?
00;30;03;01 - 00;30;34;17
Pau Aleikum Garcia
Well, the focus of the studio, Domestic Data Streamers, is to fight indifference toward data: how we can transform statistics and this very raw information landscape into something that can actually move people. Not long ago there was this huge news story here in Europe, and I think something even bigger in the States, about the cost of corruption in the country, in the whole continent.
00;30;34;20 - 00;30;53;27
Pau Aleikum Garcia
It was really hardcore, about 90 billion, something like that. And when I saw that figure, I was angry. But I was not angry about the figure; I was angry at the journalists who put the figure there, because no one can understand the difference between 1 billion and 90 billion. Only billionaires can.
00;30;53;27 - 00;31;19;21
Pau Aleikum Garcia
And that's a very small fraction of society. So I said, look, we truly need systems that can actually make us understand these situations, because the big realities we're living today are big numbers. When we talk about mass migration, climate change, the economy, all of these things have to do with big numbers and statistics, and our brains don't have the hardware to actually understand what is hiding behind those numbers.
00;31;19;29 - 00;31;47;06
Pau Aleikum Garcia
And if we don't understand that, how are we going to do anything to change it? If we don't get angry about corruption because we cannot give dimension to that corruption, how are we going to change it? So for me, transforming this raw data into something that is comprehensible, that people can interact with and understand at a better level, is a way of making change, of making people want to do things differently.
00;31;47;09 - 00;32;02;09
Pau Aleikum Garcia
And this is part of the whole ecosystem we try to tackle with the studio: transforming data, information, and information technology into systems that can help us better understand what is hidden behind them.
00;32;02;12 - 00;32;24;27
Geoff Nielson
Right. So for people out there who have data but are struggling to tell stories with it, or struggling to change behaviors or outcomes with it, do you have any principles or beliefs about the best tactics for getting people's attention, beyond just throwing a big number at them?
00;32;24;29 - 00;32;57;18
Pau Aleikum Garcia
I think the most important one is to talk about something that matters to the people you're talking to. Familiarity. That's why we use the name Domestic, because we are always trying to figure out the most domestic, everyday way to express information. I think the most common example is that whenever there is a fire, they give the size of the fire in Olympic swimming pools.
00;32;57;21 - 00;33;24;04
Pau Aleikum Garcia
And that gives you an idea of how big that area is. So it's always about bringing metaphors: we can only understand something when we can connect it to something we already understand. We need metaphors, and that's why it's so important to use art, poetry, and design to actually bring this data to life.
00;33;24;07 - 00;33;51;08
Pau Aleikum Garcia
That's one very important idea behind communicating information. The second one is to truly talk about why this is important, because sometimes we just grab information and put it out there, but we don't talk about the consequences of this information. A lot of times we forget the why: why should I care about this number?
00;33;51;10 - 00;34;11;01
Pau Aleikum Garcia
This is about contextualization: give people context, so that whenever you show the information, they say, well, okay, we need to change something. Otherwise it's just numbers, and the numbers live there in this abstract, symbolic universe.
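As a toy illustration of the "domestic units" idea from the swimming-pool example above: an Olympic pool has a standard 50-by-25-meter surface, so an abstract burned area can be restated in pools. The example figure is invented.

```python
# Toy illustration of translating an abstract figure into a familiar
# "domestic" unit, as described above. The example number is invented.
OLYMPIC_POOL_SURFACE_M2 = 50 * 25  # 1,250 m2 per Olympic pool

def fire_area_in_pools(hectares: float) -> float:
    """Express a burned area as a count of Olympic swimming pools."""
    return hectares * 10_000 / OLYMPIC_POOL_SURFACE_M2

print(f"A 300-hectare fire covers ~{fire_area_in_pools(300):,.0f} Olympic pools.")
```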
00;34;11;03 - 00;34;25;04
Geoff Nielson
So applying that back to the case of the cost of corruption, did you come to a way to represent it that gets people's attention? Was there an answer in that case, or is that one you're still figuring out?
00;34;25;07 - 00;34;47;19
Pau Aleikum Garcia
Of course, we do a lot of projects around this. The first thing we do is point out whenever there is data that is not being explained correctly. For example, banks are very good at explaining data, but they explain it in a way that is very comfortable for them. A bank will always say that they have recovered 1,400 properties that were in default.
00;34;47;22 - 00;34;57;11
Pau Aleikum Garcia
They will never say that they have left 1,400 homes uninhabited and put 1,400 families out of their homes, right?
00;34;57;13 - 00;34;58;19
Geoff Nielson
Yeah.
00;34;58;22 - 00;35;25;28
Pau Aleikum Garcia
So it's also about understanding the vocabulary you use around information, the euphemisms that are sometimes used to talk about things. Sometimes it's just about giving clarity: what are you actually saying when you say this? What are you hiding? Because data, of course, can hide a lot of information. Language, I think, not only describes the world but also brings it into existence.
00;35;26;00 - 00;36;02;18
Pau Aleikum Garcia
And if we create a correct, very clear language for information, people will understand the world and will construct a world that is closer to reality. I think right now we live in a moment where a big part of society is kind of disentangled from the other; we are so separated that we don't even use the same language of data, the same language of words, to describe the same specific thing.
00;36;02;21 - 00;36;22;06
Geoff Nielson
I'm just thinking about that now, because you mentioned earlier that AI is not a neutral tool, right? It's a tool that can be used for anything. And it sounds like you believe the same thing about data: that despite our best intentions, data is not inherently neutral. It can be manipulated.
00;36;22;09 - 00;36;40;18
Geoff Nielson
So how do you use that to depolarize? Is there a way to use this to bring people together? What do we have to do to make sure we're more informed about it and can, as you mentioned earlier, find common spaces?
00;36;40;20 - 00;37;08;11
Pau Aleikum Garcia
Yeah. Going to this idea of non-neutrality, there is a story that I love: one of the first rebellions of the Jews in Rome was over data, over the census, because they did not want to be counted. Counting something is the first form of power, because it means taking the measure of everything.
00;37;08;15 - 00;37;27;05
Pau Aleikum Garcia
It's like giving a size to something so you can control it. And this problem is still happening in some parts of South America, because being in the census means you have to be part of the military. A lot of people try to hide their kids so they are not on the census.
00;37;27;09 - 00;37;54;09
Pau Aleikum Garcia
So no one can come to the house and pick up their kids when they are needed to go and join the military. Data has always had something to do with power and control. That's why data, and specifically artificial intelligence, is never neutral: it's connected to a worldview, and someone holds it, a piece of information that can be used to someone's advantage.
00;37;54;11 - 00;38;21;13
Pau Aleikum Garcia
At the same time, data is kind of reductionist, because it's beautiful and very tempting to reduce the entire world into a single, universal code that can describe any kind of phenomenon: two hemispheres, five continents, male and female, animal and vegetable, singular and plural, right and left, four seasons. We like to classify things into categories, right?
00;38;21;13 - 00;38;45;00
Pau Aleikum Garcia
It's very comfortable because it gives you this false feeling that you control it, that you understand it. We do that, of course, to have a feeling that everything is under control and that we can comprehend it, and actually it has helped us a lot. But unfortunately, the world does not work like this. It has never worked like this.
00;38;45;03 - 00;39;12;18
Pau Aleikum Garcia
And I think a big part of the polarization we are living through right now comes from this classification bias that we have. We say, oh, this person is a liberal, or this person is whatever, and you already use the prejudices around that group to classify that person. And this is growing, because I think algorithms and artificial intelligence give more power to this idea.
00;39;12;21 - 00;39;40;08
Pau Aleikum Garcia
The idea that everything can be classified and predicted. Large language models are built on prejudices, on very strong biases that are very functional, of course, but that empower the intrinsic biases we have as human beings. And I think the way to de-escalate that is to break these borders.
00;39;40;12 - 00;39;56;22
Pau Aleikum Garcia
For example, Skeptic Reader is our way of doing that, showing the gray areas even in the people you like. Synthetic Memories, for me, is a way of doing that too, because we are doing the same across different generations of people: now you can see what other people went through, right?
00;39;56;22 - 00;40;33;05
Pau Aleikum Garcia
You can see it with your own eyes; you don't need language, it's just a visual reconstruction. But I think it also has a lot to do with the fields and professions that have always led certain tools. For example, artificial intelligence is a field that is predominantly developed by white men like you and me, from Western countries, and most of the time they come from engineering, from informatics and engineering schools.
00;40;33;08 - 00;40;37;21
Pau Aleikum Garcia
So that's a very small portion of the world. And that's a bit...
00;40;37;21 - 00;40;41;04
Geoff Nielson
Biased, right. It brings its own bias. Yeah.
00;40;41;07 - 00;41;15;24
Pau Aleikum Garcia
So I think the future of artificial intelligence, a brighter future, will of course involve a more diverse set of people thinking about it, from lawyers to journalists to poets, all kinds of people from different walks of life and different countries, trying to figure out which is the right dataset, which is the right bias, and building not one general AI but several very diverse models, biased toward different things, that can each be used for different purposes.
00;41;15;24 - 00;41;24;25
Pau Aleikum Garcia
And I think that will be very helpful against polarization: the fact that we don't have one way, but several of them.
00;41;24;27 - 00;41;45;25
Geoff Nielson
So this, to me, is one of the big trends and risks of our time, right? Can we democratize AI? Can it go down a path of being open source and for everybody, versus how much of it is controlled by the same few mega corporations that have all the data about everybody?
00;41;46;03 - 00;41;56;09
Geoff Nielson
Are you optimistic about that? Are you worried about it? How do you see it unfolding, and what role, if any, do you see your organization playing there?
00;41;56;11 - 00;42;24;22
Pau Aleikum Garcia
You know, I was very pessimistic until this week, when DeepSeek was released, and I think open source kind of got a kick. I don't know anymore, because the stage is changing so fast that it's really difficult to see whether private corporations or open source will win this kind of battle.
00;42;24;25 - 00;43;07;01
Pau Aleikum Garcia
I certainly see artificial intelligence as a kind of centralization of power, because in the end, for a big corporation it's easier to integrate into the already existing tools that we are already using. So it will concentrate more power. The other day I was talking with someone at OpenAI, and they were telling me, well, the thing is that from Google you can get a lot of information about someone, of course, but with the data you can get from one individual's ChatGPT conversations, we are going to another level
00;43;07;01 - 00;43;31;10
Pau Aleikum Garcia
of intimacy. I think we are going from the battle for attention of all the social media, the "I want you to stay the maximum amount of time on my platform," to a battle for intimacy, which is not so much about the amount of time but about the amount of trust that you give to a certain private company.
00;43;31;12 - 00;43;49;27
Pau Aleikum Garcia
For me, this is probably the most dangerous part: that we are giving a certain amount of control and trust to certain corporate institutions that, of course, even though they sometimes try their best to do the right thing...
00;43;49;29 - 00;43;51;13
Geoff Nielson
Maybe they act out of...
00;43;51;16 - 00;44;15;07
Pau Aleikum Garcia
They are corporate institutions, and they have stakeholders, and they have to make revenue out of it. So they will prioritize revenue over a certain social well-being, as we have seen with Meta and other organizations. I think that's very problematic. But of course, as long as there are other options, it's okay.
00;44;15;10 - 00;44;17;20
Pau Aleikum Garcia
Yeah, I think so.
00;44;17;23 - 00;44;35;21
Geoff Nielson
That was super enlightening, and it gave me a lot to think about; I could follow that train of thought for a very long time. But in the interest of time, I do want to come back to Domestic Data Streamers and some of the projects you're working on.
00;44;35;28 - 00;44;47;25
Geoff Nielson
Where do you see this technology going in the next few years, and what upcoming projects that you're willing to talk about do you have in the pipeline?
00;44;47;27 - 00;45;15;06
Pau Aleikum Garcia
Right now we are exploring how we can help certain organizations with what we call information entanglement. For example, here in Spain we were talking the other day with a public service institution for children who have no parents, so they are hosted in a public institution.
00;45;15;08 - 00;45;38;09
Pau Aleikum Garcia
They have a lot of kids there, and sadly they don't have enough time to report all the things that are happening, because the ministry asks them to report on every little thing that happens. So a big part of the caregivers' time is focused on reporting, on bureaucracy.
00;45;38;11 - 00;45;39;20
Geoff Nielson
Not care.
00;45;39;23 - 00;46;08;07
Pau Aleikum Garcia
Not care. What was scary to me is that one of the social workers told me that out of five days of work, there are three spent on bureaucracy and only two involving the kids. So what we are trying to do is build workflows where, for example, they can use voice notes from Telegram or Signal that are transformed directly into a report, with the right language, everything correct.
00;46;08;07 - 00;46;29;07
Pau Aleikum Garcia
Then they only need to review the draft, which takes five minutes instead of two hours. Just with that, we found we could cut two days of bureaucratic work, which is of course two more days of taking care of the kids. And that, for me, is really exciting.
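Here is a minimal sketch of such a voice-note-to-report workflow, assuming OpenAI's Whisper transcription plus a chat model; the models, prompt, and report template are illustrative assumptions, not the studio's actual system.

```python
# Illustrative sketch of the voice-note-to-report workflow described
# above; the models and report template are assumptions, not the
# studio's actual system.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def voice_note_to_report(audio_path: str) -> str:
    """Transcribe a caregiver's voice note and draft a formal report."""
    # 1. Speech-to-text on the forwarded Telegram/Signal voice note.
    with open(audio_path, "rb") as audio:
        transcript = client.audio.transcriptions.create(
            model="whisper-1", file=audio
        ).text
    # 2. Rewrite the transcript as a structured incident report.
    draft = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": "Rewrite this caregiver voice note as a formal "
                       "incident report (date, children involved, event, "
                       f"actions taken):\n\n{transcript}",
        }],
    )
    return draft.choices[0].message.content

# The caregiver then reviews and corrects the draft in minutes
# instead of writing the full report by hand.
```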
00;46;29;09 - 00;46;54;05
Pau Aleikum Garcia
Other than that, we are doing projects mainly with cultural institutions and museums. We will be releasing a new project at the Barbican in London, I think in March or April, I'm not sure when. And beyond that, we are doing several projects connected to the ways museums can use artificial intelligence in their environment.
00;46;54;10 - 00;47;12;08
Pau Aleikum Garcia
For example, how you can explain history from a more contemporary perspective, how you can personalize the narratives for the different audiences that you have. These are the kinds of exercises we really enjoy.
00;47;12;11 - 00;47;23;09
Geoff Nielson
That's so interesting. And it comes back, I guess, to the question of how you make it work for individual people versus just one kind of blanket engagement.
00;47;23;09 - 00;47;49;19
Pau Aleikum Garcia
Well, one of the last projects we did was really fun. We were using facial recognition models to detect whenever a politician was falling asleep in Congress or parliament. We created bots that connect to the streaming cameras of congresses in different countries and create heat maps of the...
00;47;49;22 - 00;47;54;27
Pau Aleikum Garcia
...sleepiest areas of the congresses and parliaments, because of course no one is watching.
00;47;54;28 - 00;47;56;13
Geoff Nielson
No, no. Yeah.
00;47;56;16 - 00;48;16;27
Pau Aleikum Garcia
But now we can, with these bots. I think that's an interesting project because it turns AI from a technology of control into a technology of accountability. The idea is to release this algorithm and make it open source for anyone who wants to use it in their own parliament, probably in a couple of months.
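A rough sketch of how such an accountability bot could be approximated with off-the-shelf tools, assuming OpenCV's stock face and eye detectors; the thresholds and logic are illustrative, and this is not the algorithm the studio plans to release.

```python
# Illustrative sketch of the "accountability bot" idea: flag likely
# dozing in a parliament stream when detected faces show no open eyes
# for many consecutive frames. The OpenCV cascades are real; the
# thresholds and logic are assumptions, not the studio's algorithm.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def scan_stream(url: str, frames_threshold: int = 150):
    """Yield frame indices where faces show closed eyes for ~5 s."""
    capture = cv2.VideoCapture(url)  # e.g. a chamber's public stream
    closed_streak = frame_idx = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, 1.3, 5)
        eyes_open = any(
            len(eye_cascade.detectMultiScale(gray[y:y + h, x:x + w]))
            for (x, y, w, h) in faces)
        closed_streak = closed_streak + 1 if len(faces) and not eyes_open else 0
        if closed_streak >= frames_threshold:  # ~5 seconds at 30 fps
            yield frame_idx  # aggregate these into a seat-by-seat heat map
            closed_streak = 0
        frame_idx += 1
```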
00;48;16;29 - 00;48;42;07
Geoff Nielson
I love that. Just taking a moment to reflect on the amazing and interesting stuff you're doing around, as you just said, accountability, but also recovering memories, democratizing AI, humanizing data: you mentioned that Domestic Data Streamers started more than ten years ago.
00;48;42;07 - 00;48;50;23
Geoff Nielson
Did you have any idea that this is where it was going? How does it compare in your mind to that original vision of what you'd be up to?
00;48;50;25 - 00;49;16;22
Pau Aleikum Garcia
No, no, there was no vision at all at the beginning. It was a group of friends, with Alexandra and Adrian and Paul and Axel, trying to figure out the right way to explore data and art, which was the thing we were curious about. There is a mythical quote from Bruce Mau, the designer.
00;49;16;25 - 00;49;57;17
Pau Aleikum Garcia
He said something like: if you have a design studio, the main job you have, as the name suggests, is to study. And I think this is something we have always tried to do, understanding that our work is not only about creating things but about understanding them. That is what has brought us here: not just solving problems, or trying to find the right solution for the specific needs of a client or a partner, but trying to understand the problem behind it, the technologies that exist, the environment.
00;49;57;17 - 00;50;14;09
Pau Aleikum Garcia
It's about having the right environment to later find solutions or ideas that can help, and sometimes they are very lateral, not what you expected at all. So in that sense, we never expected to get here and do these kinds of projects.
00;50;14;12 - 00;50;30;03
Geoff Nielson
I love that. It's so exciting, and the stuff you're doing is so cool; I can't wait to see what's coming out of the studio next. I want to say a big thank you, Pau, for joining us today. You've given me lots and lots to think about, so I really appreciate it.
00;50;30;03 - 00;50;31;07
Geoff Nielson
Thanks for being here.
00;50;31;09 - 00;50;32;10
Pau Aleikum Garcia
Thanks for having me.


