WEBVTT 1 00:00:01.160 --> 00:00:04.360 Billions of people use the Internet every day, 2 00:00:05.200 --> 00:00:09.360 but the Internet and the companies which dominate it are using them too. 3 00:00:10.520 --> 00:00:12.880 We live in an online world. 4 00:00:13.080 --> 00:00:18.360 There are 8.5 billion searches on Google daily, while for many it seems 5 00:00:18.360 --> 00:00:22.160 that smartphones have become extensions of our very selves. 6 00:00:22.320 --> 00:00:26.640 Many of the services we use every day on our phones and laptops are free, 7 00:00:27.360 --> 00:00:30.760 but we pay in another way with our personal information. 8 00:00:31.790 --> 00:00:36.630 The big move was not only to invade personal experience to turn 9 00:00:36.640 --> 00:00:40.630 it into data, but then to claim those data as the 10 00:00:40.640 --> 00:00:42.620 private property of the corporation. 11 00:00:42.630 --> 00:00:46.990 Monetizing data catapulted companies like Google from being 12 00:00:46.990 --> 00:00:50.070 unprofitable startups to multi-billion-dollar empires in 13 00:00:50.070 --> 00:00:52.060 the space of a few years. 14 00:00:52.070 --> 00:00:55.950 But how our data is being used by these companies is now a point 15 00:00:56.030 --> 00:00:57.860 of intense debate. 16 00:00:57.870 --> 00:01:00.950 Because this is so new, we have sort of an awful lot of panic. 17 00:01:02.160 --> 00:01:05.440 You know, it reminds me a lot of, you know, the days when, you know, 18 00:01:05.480 --> 00:01:07.869 Elvis Presley was a threat to democracy and, you know, 19 00:01:07.880 --> 00:01:10.830 punk was a threat to democracy and bicycles were going 20 00:01:10.840 --> 00:01:13.300 to make young women infertile. 21 00:01:13.310 --> 00:01:16.980 Surveillance capitalism or democracy? 22 00:01:17.470 --> 00:01:19.100 We can have one or we can have the other. 23 00:01:19.110 --> 00:01:20.740 We cannot have both. 24 00:01:20.750 --> 00:01:25.750 I call it a death match between these two global institutional orders. 25 00:01:25.990 --> 00:01:29.190 Will the status quo hold amid increasing legal pressure? 26 00:01:30.110 --> 00:01:33.630 And I think we will gradually move to a system where 27 00:01:33.630 --> 00:01:37.230 we don't take these companies as like heroes and, you know, 28 00:01:37.270 --> 00:01:41.190 Zuckerberg or some Google executive as like the new messiahs. 29 00:01:41.590 --> 00:01:43.259 But we'll realize they're just a company. 30 00:01:43.270 --> 00:01:44.580 They do make mistakes as well. 31 00:01:44.590 --> 00:01:47.229 They need proper regulation and we need to deal with it. 32 00:01:47.360 --> 00:01:50.400 That's coming up on this episode of Business Beyond. 33 00:01:50.870 --> 00:01:52.253 The word surveillance 34 00:01:52.253 --> 00:01:53.586 has negative connotations. 35 00:01:53.630 --> 00:01:54.926 It implies being watched 36 00:01:54.926 --> 00:01:56.460 or spied upon, usually without 37 00:01:56.460 --> 00:01:58.110 knowledge or consent. 38 00:01:58.190 --> 00:02:00.700 And it's a word that's increasingly linked to companies like 39 00:02:00.710 --> 00:02:04.510 Google and Meta through the phrase surveillance capitalism. 40 00:02:08.490 --> 00:02:11.810 The phrase was coined by the author Shoshana Zuboff in her 41 00:02:11.810 --> 00:02:14.440 book The Age of Surveillance Capitalism. 
42 00:02:14.450 --> 00:02:17.200 She believes this so-called surveillance capitalism represents a 43 00:02:17.210 --> 00:02:20.850 new economic model which claims human experience and personal 44 00:02:20.850 --> 00:02:25.250 information as a raw material to be exploited for commercial purposes. 45 00:02:25.970 --> 00:02:29.690 And she says it all started here at Google in the late 1990s and 46 00:02:29.690 --> 00:02:31.710 early 2000s. 47 00:02:32.313 --> 00:02:34.633 At the beginning of the dot-com era, 48 00:02:36.350 --> 00:02:38.317 some young men sat 49 00:02:38.490 --> 00:02:40.553 around the table trying to figure out 50 00:02:40.910 --> 00:02:43.830 how they were going to make money at Google. 51 00:02:43.840 --> 00:02:46.550 How to make money out of search. 52 00:02:47.520 --> 00:02:50.760 It seemed like everything that could be commodified 53 00:02:51.200 --> 00:02:55.880 already had been commodified, and so there was a lot of head 54 00:02:55.880 --> 00:03:01.360 scratching going on until Larry Page and his cohort, Larry Page 55 00:03:01.360 --> 00:03:03.190 being the founder of Google, 56 00:03:03.200 --> 00:03:06.240 one of the founders, along with Sergey Brin. 57 00:03:07.610 --> 00:03:12.889 He came up with the breakthrough, and the breakthrough was: 58 00:03:14.440 --> 00:03:16.240 the next virgin forest 59 00:03:17.280 --> 00:03:24.000 ready for commodification was human experience itself. 60 00:03:24.000 --> 00:03:27.200 She says Google developed increasingly sophisticated ways to use data they 61 00:03:27.200 --> 00:03:30.030 had gotten from Internet searches to make predictions about 62 00:03:30.040 --> 00:03:32.190 people's future online behaviour. 63 00:03:32.200 --> 00:03:37.040 They used the data to analyse it to predict human behaviour, 64 00:03:37.320 --> 00:03:38.990 and that's what they sell. 65 00:03:39.000 --> 00:03:42.600 And their breakthrough prediction was the click-through rate. 66 00:03:43.110 --> 00:03:46.390 That launched online targeted advertising. 67 00:03:46.430 --> 00:03:50.230 We associate Google, also known by its parent company name Alphabet, 68 00:03:50.350 --> 00:03:52.180 with search above all else. 69 00:03:52.190 --> 00:03:55.190 But Google is known for many other services such as Google Maps, 70 00:03:55.550 --> 00:03:59.630 Gmail and other companies it owns, such as YouTube, its self-driving branch 71 00:03:59.630 --> 00:04:02.710 Waymo and the fitness electronics maker Fitbit. 72 00:04:03.230 --> 00:04:06.350 Google has also been active over the years, making acquisitions from 73 00:04:06.350 --> 00:04:09.750 the likes of YouTube to hundreds of other companies, big and small, 74 00:04:09.910 --> 00:04:14.310 in a variety of fields. Despite this diversification of services, 75 00:04:15.080 --> 00:04:17.310 the vast majority of Google's huge revenues, more than 76 00:04:17.320 --> 00:04:20.260 $257 billion in 2021, 77 00:04:20.270 --> 00:04:22.940 come from advertising: more than 80%. 78 00:04:22.950 --> 00:04:25.860 Google did not respond to our request for an interview. 79 00:04:25.870 --> 00:04:29.630 However, its privacy policy does say that it uses data to improve its 80 00:04:29.630 --> 00:04:33.630 overall service and to customize the user experience. 81 00:04:34.360 --> 00:04:36.540 Let's say you search for mountain biking. 
82 00:04:36.550 --> 00:04:40.110 We use what you search for, other searches you've made, your location, 83 00:04:40.110 --> 00:04:42.380 and what other people did when they searched for mountain 84 00:04:42.390 --> 00:04:43.820 biking to find you the results 85 00:04:43.830 --> 00:04:46.380 you're looking for: mountain biking trails near you. 86 00:04:46.390 --> 00:04:51.390 What Google outlines there is a simplified explanation of how it uses 87 00:04:51.390 --> 00:04:53.990 data to optimise search results. 88 00:04:54.390 --> 00:04:56.620 But here's where the real money comes from. 89 00:04:57.610 --> 00:04:59.870 Depending on your settings, we may also use your info to show 90 00:04:59.880 --> 00:05:01.800 you personalized ads. 91 00:05:01.960 --> 00:05:05.160 So next time you're on YouTube, you might see an ad for biking gear. 92 00:05:06.080 --> 00:05:09.200 Google says it doesn't share anything with advertisers that personally 93 00:05:09.200 --> 00:05:12.950 identifies people, so your personal information is safe with us. 94 00:05:14.279 --> 00:05:17.760 But critics like Zuboff say our personal information is anything 95 00:05:17.760 --> 00:05:20.510 but safe with Google, and that they have built and nurtured 96 00:05:20.520 --> 00:05:23.480 a system as dangerous as it is lucrative. 97 00:05:24.440 --> 00:05:27.440 So the really strange thing about this is while it's called 98 00:05:27.440 --> 00:05:30.720 personalization, that is a very 99 00:05:32.440 --> 00:05:34.880 cruel euphemism 100 00:05:35.390 --> 00:05:39.750 for the fact that they want to be able to identify us, 101 00:05:39.950 --> 00:05:41.900 but not to help us. 102 00:05:41.910 --> 00:05:48.070 They want to take our data from us, but not to use it to improve our 103 00:05:48.070 --> 00:05:50.590 lives in any substantive way. 104 00:05:51.560 --> 00:05:55.600 What they simply want is to be able to identify our lives in order to 105 00:05:55.600 --> 00:05:58.640 extract the data, in order to create predictions, 106 00:05:58.640 --> 00:06:01.760 in order to sell, in order to generate revenues, 107 00:06:01.760 --> 00:06:03.710 in order to generate profit. 108 00:06:03.720 --> 00:06:06.600 That's the world that we live in. 109 00:06:07.070 --> 00:06:10.310 Not everyone agrees with the phrase surveillance capitalism. 110 00:06:10.310 --> 00:06:12.980 I can understand that that might be a good way to try 111 00:06:12.990 --> 00:06:14.270 to sell a book. 112 00:06:14.790 --> 00:06:16.820 I can understand that that's a lucrative way to 113 00:06:16.830 --> 00:06:18.460 get speaker engagements. 114 00:06:18.470 --> 00:06:19.460 I can understand it. 115 00:06:19.470 --> 00:06:21.900 It's eye-catching, it's attention-grabbing. 116 00:06:21.910 --> 00:06:27.390 It sounds very scary, but it's very difficult to 117 00:06:27.390 --> 00:06:30.470 think of a way that that connects to any kind of tangible reality. 118 00:06:30.750 --> 00:06:34.310 Tech analyst Benedict Evans takes particular issue with 119 00:06:34.310 --> 00:06:36.590 the word surveillance. 120 00:06:36.710 --> 00:06:39.950 I find surveillance actually kind of offensive as a term to use 121 00:06:40.390 --> 00:06:42.900 because to me this sort of expropriates the suffering of 122 00:06:42.910 --> 00:06:45.500 people who lived in actual surveillance states, 123 00:06:45.510 --> 00:06:47.660 the people who actually lived in East Germany. 
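[Editor's note: a minimal sketch, assuming nothing about Google's actual systems, of the idea described in the passage above: a search query, the user's location and aggregate click data are combined to rank results, and, only if personalisation is enabled, a matching ad is chosen. All names, weights and data below are hypothetical, for illustration only.]

```python
# Illustrative sketch only -- not Google's actual ranking or ad system.
from dataclasses import dataclass

@dataclass
class Result:
    title: str
    region: str
    global_click_rate: float  # how often other searchers clicked this result

def rank_results(results, query, user_region):
    """Combine the query, the user's location and aggregate click data."""
    def score(r):
        s = r.global_click_rate                  # "what other people did"
        if query.lower() in r.title.lower():     # "what you search for"
            s += 1.0
        if r.region == user_region:              # "your location"
            s += 0.5
        return s
    return sorted(results, key=score, reverse=True)

def pick_ad(ads_by_topic, interest, personalisation_enabled):
    # "Depending on your settings, we may also use your info to show you personalized ads."
    if personalisation_enabled and interest in ads_by_topic:
        return ads_by_topic[interest]
    return ads_by_topic.get("generic")

results = [
    Result("Mountain biking trails near Bonn", "DE-NW", 0.32),
    Result("History of the bicycle", "GLOBAL", 0.05),
]
print(rank_results(results, "mountain biking", "DE-NW")[0].title)
print(pick_ad({"mountain biking": "Ad: 20% off biking gear", "generic": "Ad: streaming service"},
              "mountain biking", personalisation_enabled=True))
```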
124 00:06:47.670 --> 00:06:49.500 You know, if the government is interested in 125 00:06:49.510 --> 00:06:52.220 you, they have 15 people following you around and 126 00:06:52.230 --> 00:06:55.310 they have microphones in your home and, you know, 127 00:06:55.310 --> 00:06:57.420 one of your friends is informing on you. 128 00:06:57.430 --> 00:06:59.300 That's what surveillance means. 129 00:06:59.310 --> 00:07:00.420 Surveillance is not: 130 00:07:00.430 --> 00:07:02.940 you read a car magazine and you looked at 10 car ads, 131 00:07:02.950 --> 00:07:05.150 so therefore we'll show you a car ad. 132 00:07:05.550 --> 00:07:08.750 There is an argument that the data economy is the price we pay 133 00:07:08.750 --> 00:07:11.020 for the huge leaps in innovation and technological 134 00:07:11.030 --> 00:07:13.180 progress of the last two decades. 135 00:07:13.190 --> 00:07:14.963 There was a time when 136 00:07:14.963 --> 00:07:16.718 the chemical industry, which produced 137 00:07:16.777 --> 00:07:18.270 magical things we'd 138 00:07:18.270 --> 00:07:19.510 never seen before, 139 00:07:19.510 --> 00:07:24.310 was allowed to just dump its excess pollution into 140 00:07:24.310 --> 00:07:27.910 our common waters and into our rivers, and it took a 141 00:07:27.910 --> 00:07:32.670 long time before it was forced to face up to its responsibilities. 142 00:07:32.910 --> 00:07:34.780 We're at the same stage now. 143 00:07:34.790 --> 00:07:39.030 In fact, I think right now we're in the middle of Dieselgate for tech. 144 00:07:40.110 --> 00:07:44.110 Everyone likes cars, they like how they purr, they really 145 00:07:44.110 --> 00:07:45.630 like petrol cars. 146 00:07:46.600 --> 00:07:49.840 And yet there's a moment of responsibility where you 147 00:07:49.840 --> 00:07:52.950 must acknowledge that what we have is not sustainable. 148 00:07:52.960 --> 00:07:56.880 Mass media does cause problems, you know. But you kind of have to 149 00:07:56.880 --> 00:08:00.920 stand back and think, yes, but is it really a terrible thing 150 00:08:01.000 --> 00:08:03.470 that we can have a one-hour video call from New York 151 00:08:03.480 --> 00:08:05.800 to Germany for free? 152 00:08:06.470 --> 00:08:07.340 Do we want to get rid of that? 153 00:08:07.350 --> 00:08:10.180 Really, do we want to? You know, this call would have cost 154 00:08:10.190 --> 00:08:11.900 like $200 in 1980. 155 00:08:11.910 --> 00:08:14.110 Do you think that would have been better? 156 00:08:15.080 --> 00:08:17.350 We've already touched on the complex world of 157 00:08:17.360 --> 00:08:21.040 online targeted advertising, but let's look a little deeper now. 158 00:08:21.800 --> 00:08:26.040 The Irish Council for Civil Liberties is a non-profit organization which 159 00:08:26.040 --> 00:08:28.950 focuses much of its work on privacy violations. 160 00:08:28.960 --> 00:08:32.910 Last year it released a much-publicized report on a complex 161 00:08:33.040 --> 00:08:36.550 and critical component of the online targeted advertising industry. 162 00:08:37.030 --> 00:08:38.380 Load a web page. 163 00:08:38.390 --> 00:08:41.260 There are rectangles on the web page that will contain 164 00:08:41.270 --> 00:08:47.420 ads, and often you'll notice there's a split second where the material 165 00:08:47.429 --> 00:08:51.750 you're trying to read gets bounced down the page because the ad has just 166 00:08:51.750 --> 00:08:53.429 been dropped in. 
167 00:08:53.870 --> 00:08:56.830 What's happening there is that there's an auction 168 00:08:56.840 --> 00:08:58.380 for your attention. 169 00:08:58.390 --> 00:09:01.509 This auction is known as real-time bidding. 170 00:09:02.210 --> 00:09:06.710 Known commonly as RTB, real-time bidding is one of the most 171 00:09:06.710 --> 00:09:09.830 effective and controversial forms of online advertising. 172 00:09:09.840 --> 00:09:12.540 The entirely automated process is an auction where you, 173 00:09:12.550 --> 00:09:14.300 the user, are being bid over. 174 00:09:14.309 --> 00:09:16.200 It all happens in the milliseconds between you 175 00:09:16.210 --> 00:09:18.989 clicking a link and that website opening. 176 00:09:19.150 --> 00:09:21.700 After the user clicks a certain website, 177 00:09:21.710 --> 00:09:24.460 large volumes of their personal data and browsing 178 00:09:24.470 --> 00:09:27.460 history are shared with prospective advertisers. 179 00:09:27.470 --> 00:09:30.910 Advertisers can then see how valuable it will be for that user 180 00:09:30.910 --> 00:09:32.109 to see their ad. 181 00:09:32.150 --> 00:09:34.460 Bids are placed by advertisers, with the highest and 182 00:09:34.470 --> 00:09:35.830 most relevant bid winning. 183 00:09:35.840 --> 00:09:39.139 That ad is then placed on the website the user has just clicked. 184 00:09:40.120 --> 00:09:43.640 Now, what that means is that every time you visit a commercial web page, 185 00:09:44.240 --> 00:09:47.320 nearly every single rectangle is sending information 186 00:09:47.340 --> 00:09:50.540 about what you're reading and where you are in the real world 187 00:09:50.600 --> 00:09:53.150 to a very, very large number of companies. 188 00:09:53.160 --> 00:09:55.959 And we don't know what they do with the data. 189 00:09:56.670 --> 00:09:59.990 In the 2022 report, the Irish Council for Civil Liberties 190 00:09:59.990 --> 00:10:03.300 described RTB as one of the biggest data breaches in history. 191 00:10:03.790 --> 00:10:06.910 It says that on average, a person in the US has their online activity 192 00:10:06.910 --> 00:10:12.390 and location exposed 747 times a day by the RTB industry alone. 193 00:10:12.840 --> 00:10:15.990 In Europe, it's 376 times a day. 194 00:10:16.670 --> 00:10:19.660 It said that every year RTB broadcasts user info 195 00:10:19.670 --> 00:10:24.190 about people in the US and Europe 178 trillion times. 196 00:10:24.429 --> 00:10:28.540 The industry was worth around $117 billion in 2021 alone. 197 00:10:29.550 --> 00:10:32.770 But all advertising is targeted in one way or another. 198 00:10:32.990 --> 00:10:36.420 And there's the argument that in theory, that's not a bad thing. 199 00:10:36.960 --> 00:10:38.400 Procter and Gamble want to show 200 00:10:38.400 --> 00:10:39.881 ads for nappies to people who have 201 00:10:39.910 --> 00:10:40.966 babies and not show them 202 00:10:40.966 --> 00:10:42.309 to people who don't. 203 00:10:42.309 --> 00:10:45.390 And they would kind of like to know which ads work and which ads don't. 204 00:10:45.870 --> 00:10:48.929 They don't care what your baby's name is at all. 205 00:10:49.090 --> 00:10:50.950 They don't care if your baby's circumcised. 206 00:10:50.960 --> 00:10:53.100 They don't want to know anything about you. 207 00:10:53.110 --> 00:10:55.140 They just want to show nappy ads to people who have 208 00:10:55.150 --> 00:10:57.700 babies and not show them to people who don't. 
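[Editor's note: a minimal, simplified simulation of the real-time bidding flow described above, in which data about the visitor is broadcast to prospective bidders and the highest bid wins the ad slot. Real RTB runs on standardised protocols (such as OpenRTB) across many exchanges; the bidders, field names and prices here are hypothetical.]

```python
# Simplified RTB sketch: build a bid request from user data, let bidders
# price the impression, pick the highest bid. Illustration only.
import time

def build_bid_request(user):
    # The "broadcast": information about the visitor goes to prospective bidders.
    return {
        "page": user["current_page"],
        "location": user["location"],
        "interests": user["browsing_interests"],
    }

def run_auction(bid_request, bidders):
    # Each bidder decides what showing its ad to this user is worth to it.
    bids = [(name, bidder(bid_request)) for name, bidder in bidders.items()]
    bids = [(name, price) for name, price in bids if price > 0]
    return max(bids, key=lambda b: b[1]) if bids else (None, 0.0)

# Hypothetical bidders: each returns a price in euros for the impression.
bidders = {
    "bike_shop":  lambda req: 1.20 if "mountain biking" in req["interests"] else 0.0,
    "car_dealer": lambda req: 0.40 if "cars" in req["interests"] else 0.0,
    "generic":    lambda req: 0.05,
}

user = {"current_page": "news-site.example/article",
        "location": "Frankfurt",
        "browsing_interests": ["mountain biking", "travel"]}

start = time.perf_counter()
winner, price = run_auction(build_bid_request(user), bidders)
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"Winner: {winner} at {price:.2f} EUR (auction took {elapsed_ms:.3f} ms)")
```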
209 00:10:57.710 --> 00:11:00.950 And I think we can get very kind of bound up on the idea that there's 210 00:11:00.950 --> 00:11:04.030 a sort of Stasi-like personal file on every individual and 211 00:11:04.030 --> 00:11:06.060 somebody could go and look at it and read 212 00:11:06.070 --> 00:11:06.980 about you. 213 00:11:06.990 --> 00:11:08.149 No. 214 00:11:08.590 --> 00:11:10.780 Somebody went to Facebook and said I want to show ads for 215 00:11:10.790 --> 00:11:14.429 nappies to 10,000 people in Frankfurt who've got babies. 216 00:11:15.150 --> 00:11:17.580 But critics say targeted advertising has become 217 00:11:17.590 --> 00:11:20.790 about profit maximization for Google and Meta and uses 218 00:11:20.790 --> 00:11:23.949 private data in a way which is unnecessary. 219 00:11:24.550 --> 00:11:27.300 In order to have high-quality contextual advertising 220 00:11:27.309 --> 00:11:30.260 that is useful to everyone and makes everybody money, 221 00:11:30.270 --> 00:11:32.460 we literally don't need what we built. 222 00:11:32.470 --> 00:11:33.909 But we built it. 223 00:11:34.230 --> 00:11:37.590 The Austrian lawyer and privacy activist Max Schrems has launched 224 00:11:37.590 --> 00:11:40.540 several legal campaigns against Facebook over the years for 225 00:11:40.550 --> 00:11:42.670 its privacy rights violations. 226 00:11:43.600 --> 00:11:45.236 If you buy a product and you know you 227 00:11:45.236 --> 00:11:46.280 really need the address to deliver 228 00:11:46.280 --> 00:11:47.520 the product, fair enough, 229 00:11:47.520 --> 00:11:48.932 you need the address. 230 00:11:49.360 --> 00:11:52.110 But that doesn't mean you can sell it to 10 other people and you can track 231 00:11:52.120 --> 00:11:55.100 me and try to kind of put a profile around me and so on. 232 00:11:55.110 --> 00:11:58.350 And I think what's really interesting is the profit margins. 233 00:11:58.350 --> 00:12:01.469 Like, no one doubts that they should make a profit, 234 00:12:01.510 --> 00:12:03.220 but a lot of that profit can be made 235 00:12:03.230 --> 00:12:05.820 without tracking every little bit of users' data. 236 00:12:05.830 --> 00:12:07.900 It's really about profit maximization. 237 00:12:07.910 --> 00:12:10.820 And I think that is where this surveillance capitalism 238 00:12:10.830 --> 00:12:14.710 narrative kind of works quite well, to say it's really a 239 00:12:14.710 --> 00:12:17.750 system where it's like, how can I push even more money out of 240 00:12:17.750 --> 00:12:19.660 the last bit of information? 241 00:12:19.670 --> 00:12:20.220 He says 242 00:12:20.230 --> 00:12:23.060 it is important to make a distinction between ads linked to a 243 00:12:23.070 --> 00:12:25.780 user's direct searches on Google and those of companies 244 00:12:25.790 --> 00:12:28.580 which are reliant on bundling a user's personal information 245 00:12:28.590 --> 00:12:30.709 together to build a profile. 246 00:12:31.720 --> 00:12:35.480 Google is really good at advertising because they right now 247 00:12:35.480 --> 00:12:37.350 have you right there when you want something. 248 00:12:37.360 --> 00:12:40.520 If you put in red shoes, you actually want to buy red shoes in 249 00:12:40.520 --> 00:12:42.950 that moment and you can put an advertisement there, but they 250 00:12:42.960 --> 00:12:45.030 don't need to know who you are in that situation. 
251 00:12:45.040 --> 00:12:48.080 And I think that is a type of advertisement, for example, 252 00:12:48.120 --> 00:12:51.400 that usually is very effective but not overly privacy-invasive. 253 00:12:51.559 --> 00:12:54.800 There's another huge player in the global advertising business 254 00:12:55.080 --> 00:12:58.679 that hasn't had the luxury of its own search engine to help 255 00:12:58.679 --> 00:13:00.679 build its data mountain. 256 00:13:01.110 --> 00:13:04.990 Meta Platforms controls Facebook, Instagram and WhatsApp, 257 00:13:05.030 --> 00:13:08.110 some of the biggest social networks and messaging platforms in the world. 258 00:13:09.150 --> 00:13:12.060 Facebook has found itself mired in multiple data privacy 259 00:13:12.070 --> 00:13:13.740 controversies over the years. 260 00:13:13.750 --> 00:13:16.589 One of the biggest: Cambridge Analytica. 261 00:13:16.630 --> 00:13:19.580 That scandal revealed that a political consultancy called 262 00:13:19.590 --> 00:13:22.990 Cambridge Analytica gained access to the private data of 263 00:13:22.990 --> 00:13:27.429 more than 80 million Facebook users and used that data to target them 264 00:13:27.470 --> 00:13:30.870 with political ads in advance of the 2016 Brexit 265 00:13:30.870 --> 00:13:34.630 referendum and the US presidential election that same year. 266 00:13:36.280 --> 00:13:38.630 It led to Facebook CEO Mark Zuckerberg 267 00:13:38.640 --> 00:13:41.920 being hauled before U.S. Congress in a now-infamous appearance. 268 00:13:42.590 --> 00:13:43.620 I started Facebook. 269 00:13:43.630 --> 00:13:47.510 I run it and I'm responsible for what happens here. 270 00:13:47.510 --> 00:13:51.070 Mr. Zuckerberg, would you be comfortable sharing with us the name 271 00:13:51.070 --> 00:13:53.149 of the hotel you stayed in last night? 272 00:13:59.150 --> 00:14:05.360 No. Max Schrems was on a semester abroad studying law in California 273 00:14:05.600 --> 00:14:08.319 when a Facebook lawyer spoke to his class. 274 00:14:08.360 --> 00:14:10.750 Schrems was surprised by how little the lawyer seemed to 275 00:14:10.760 --> 00:14:15.679 know about European privacy law, and so Schrems asked Facebook to show 276 00:14:15.679 --> 00:14:18.030 him all the data they had on him. 277 00:14:18.040 --> 00:14:20.630 So, for example, when I first got my data from 278 00:14:20.640 --> 00:14:22.873 Facebook, I was able to demonstrate 279 00:14:22.873 --> 00:14:24.230 that my deleted data was still there, 280 00:14:24.230 --> 00:14:26.740 and that they, for example, tried to figure out my geolocation 281 00:14:26.750 --> 00:14:29.750 without me ever sharing any location information with them. 282 00:14:30.070 --> 00:14:32.580 And that is the stuff that is very hard to find out. 283 00:14:32.590 --> 00:14:36.790 For example, users do have a right to ask for a 284 00:14:36.790 --> 00:14:39.220 copy of their data, but right now most of these 285 00:14:39.230 --> 00:14:40.940 large companies just don't comply with that. 286 00:14:40.950 --> 00:14:43.580 And the regulators also don't go into their service and 287 00:14:43.590 --> 00:14:46.180 actually check what's in the background. 288 00:14:46.190 --> 00:14:49.670 Like Google, Facebook makes the vast majority of its revenues from 289 00:14:49.670 --> 00:14:54.190 advertising. Using data to tap into the targeted advertising goldmine is 290 00:14:54.190 --> 00:14:57.750 a key reason it has outlasted so many other social media platforms. 
291 00:14:58.350 --> 00:15:01.220 For a long time, Facebook needed all that information 292 00:15:01.230 --> 00:15:03.980 because they were not with you in the search situation, 293 00:15:03.990 --> 00:15:06.500 in a situation where you actually wanted something. 294 00:15:06.510 --> 00:15:08.580 So they had to kind of figure out in other ways that 295 00:15:08.590 --> 00:15:10.940 you wanted something, and they needed more data. 296 00:15:10.950 --> 00:15:13.940 Because Meta owns both WhatsApp and Instagram, 297 00:15:13.950 --> 00:15:16.190 the question often arises: 298 00:15:16.470 --> 00:15:20.270 can the company take data from those platforms and use 299 00:15:20.270 --> 00:15:22.340 it for targeted ads on Facebook? 300 00:15:22.350 --> 00:15:25.990 The German competition authority has taken a case against 301 00:15:25.990 --> 00:15:29.670 Meta and said it is not lawful for you to combine 302 00:15:29.670 --> 00:15:33.550 data between your subsidiaries WhatsApp, Instagram, 303 00:15:33.630 --> 00:15:38.150 Facebook and, of course, their virtual reality subsidiary Oculus. 304 00:15:38.710 --> 00:15:42.190 Now that case has gone all the way up to Luxembourg, to 305 00:15:42.190 --> 00:15:45.390 Europe's highest court, and we're waiting to see what 306 00:15:45.390 --> 00:15:46.660 the answers are. 307 00:15:46.670 --> 00:15:49.830 But this issue about the combination of data 308 00:15:50.230 --> 00:15:54.590 within Facebook's or Meta's properties, including WhatsApp, 309 00:15:54.910 --> 00:15:55.900 that's a live issue. 310 00:15:55.910 --> 00:15:57.430 That's a real thing. 311 00:15:57.590 --> 00:15:59.540 For many years, WhatsApp was not end- 312 00:15:59.550 --> 00:16:02.420 to-end encrypted, but it has been since 2016. 313 00:16:02.430 --> 00:16:04.820 Encryption is a form of data scrambling which prevents 314 00:16:04.830 --> 00:16:07.869 unauthorized parties from accessing the information. 315 00:16:08.750 --> 00:16:13.270 But that doesn't mean that what we do on WhatsApp can't be used as data. 316 00:16:13.870 --> 00:16:17.390 It may even provide a clue into one of the great mysteries of 317 00:16:17.390 --> 00:16:19.180 the smartphone age. 318 00:16:19.190 --> 00:16:20.500 Are they listening to us? 319 00:16:20.510 --> 00:16:22.620 WhatsApp is actually quite a good example because it used 320 00:16:22.630 --> 00:16:25.830 to be you paid one euro a year and were left alone. 321 00:16:26.230 --> 00:16:28.620 Now that model Facebook simply took away. 322 00:16:28.630 --> 00:16:32.270 They just said you're not going to pay the €1.00 anymore, but we're going to 323 00:16:32.270 --> 00:16:35.180 get your metadata, and metadata is something no one 324 00:16:35.190 --> 00:16:37.140 really understands the first time around. 325 00:16:37.150 --> 00:16:40.100 But it basically says the content of your communication is encrypted. 326 00:16:40.110 --> 00:16:43.110 Fine, but we're going to use who writes to whom, 327 00:16:43.110 --> 00:16:45.940 at what time and how often, and that allows you to build 328 00:16:45.950 --> 00:16:47.470 this social network. 329 00:16:48.440 --> 00:16:51.030 Metadata may show that two people chat a lot in the evening 330 00:16:51.040 --> 00:16:52.110 or on weekends. 331 00:16:52.120 --> 00:16:55.800 The timing and intensity of their communication may lead Meta to make 332 00:16:55.800 --> 00:16:59.080 very accurate predictions about their relationship. 
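[Editor's note: a minimal sketch of what the passage above describes: using only metadata (who messages whom, when and how often) to build a weighted social graph and guess at relationship strength, without ever reading message content. This is not Meta's actual system; the events, weights and thresholds are invented for illustration. The same kind of graph could, in principle, be used to propagate an ad from one person's interest to their close contacts, as the next passage describes.]

```python
# Metadata-only inference sketch: message content never appears.
from collections import defaultdict
from datetime import datetime

# (sender, recipient, timestamp) -- just the metadata.
metadata = [
    ("anna", "ben",   datetime(2023, 3, 3, 22, 15)),
    ("ben",  "anna",  datetime(2023, 3, 3, 22, 18)),
    ("anna", "ben",   datetime(2023, 3, 4, 23, 5)),
    ("anna", "carla", datetime(2023, 3, 4, 14, 0)),
]

def relationship_scores(events):
    """Build a weighted 'social graph': message count per pair,
    with extra weight for evening and weekend chats."""
    scores = defaultdict(float)
    for sender, recipient, ts in events:
        pair = tuple(sorted((sender, recipient)))
        weight = 1.0
        if ts.hour >= 20 or ts.weekday() >= 5:   # evenings and weekends
            weight += 0.5
        scores[pair] += weight
    return dict(scores)

for pair, score in sorted(relationship_scores(metadata).items(), key=lambda x: -x[1]):
    label = "close contact" if score >= 3 else "casual contact"
    print(pair, round(score, 1), label)
```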
333 00:16:59.630 --> 00:17:02.460 And a good example is where people oftentimes have this feeling of, 334 00:17:02.470 --> 00:17:04.780 oh, you know, this advertisement listened to me 335 00:17:04.790 --> 00:17:07.060 because I chatted with my friend about this topic. 336 00:17:07.070 --> 00:17:10.670 And now suddenly I see the advertisement. A typical way that's really 337 00:17:10.670 --> 00:17:14.859 done is they know that your friend was really looking into this one topic 338 00:17:14.920 --> 00:17:16.740 all over the place for the last two weeks. 339 00:17:16.750 --> 00:17:18.640 Then they know there was a lot of communication. 340 00:17:18.650 --> 00:17:20.660 So they just give the advertisement to you as well, 341 00:17:20.670 --> 00:17:23.990 assuming that probably your friend was chatting about the one 342 00:17:23.990 --> 00:17:26.550 thing he's obsessed about for the last week. 343 00:17:26.790 --> 00:17:28.660 And that is quite interesting because people 344 00:17:28.670 --> 00:17:31.930 think they're being listened to or their communication is being spied on. 345 00:17:32.270 --> 00:17:34.940 But these big data analytics and the metadata that you can get, 346 00:17:34.950 --> 00:17:38.390 for example from WhatsApp, allow very accurate and kind of quite 347 00:17:38.410 --> 00:17:41.950 creepy advertisement without even going that far. 348 00:17:42.410 --> 00:17:45.580 We asked Meta for an interview. They declined, 349 00:17:45.590 --> 00:17:47.790 but they did give us this statement. 350 00:17:47.869 --> 00:17:50.700 Protecting the privacy and security of people's data is 351 00:17:50.710 --> 00:17:53.160 fundamental to how our business works. 352 00:17:53.170 --> 00:17:56.859 That's why we've invested heavily in tools like Privacy Checkup and ad 353 00:17:56.869 --> 00:18:00.100 preferences that provide more transparency and controls for 354 00:18:00.109 --> 00:18:03.389 people to understand and manage their privacy settings. 355 00:18:04.070 --> 00:18:07.100 We're committed to respecting our users' privacy and we welcome 356 00:18:07.109 --> 00:18:10.129 engagement with regulators which helps us to do this. 357 00:18:10.550 --> 00:18:13.630 Companies like Google and Meta and other tech giants such 358 00:18:13.650 --> 00:18:17.150 as Apple, Amazon and Microsoft have changed their data privacy 359 00:18:17.170 --> 00:18:20.950 practices over the years in the wake of scandals and political pressure. 360 00:18:21.170 --> 00:18:23.940 However, today they are facing arguably more legal scrutiny 361 00:18:23.950 --> 00:18:27.190 than ever before for how they handle our personal data. 362 00:18:27.920 --> 00:18:31.159 The European Union is driving the regulatory crackdown. 363 00:18:31.270 --> 00:18:35.840 Its signature legislation on privacy was implemented in 2018. 364 00:18:35.840 --> 00:18:38.710 That's the General Data Protection Regulation, 365 00:18:38.720 --> 00:18:41.119 known commonly as GDPR. 366 00:18:41.560 --> 00:18:43.750 So the GDPR does two things. 367 00:18:43.760 --> 00:18:46.880 It puts a number of obligations on the people that want 368 00:18:46.900 --> 00:18:50.590 to use your information, and it gives a number of rights to 369 00:18:50.600 --> 00:18:55.200 yourself vis-à-vis the persons who are using your data. 
370 00:18:56.760 --> 00:18:58.910 The legislation has implications for the way firms and 371 00:18:58.920 --> 00:19:03.080 organisations across Europe handle personal data, but has it impacted 372 00:19:03.080 --> 00:19:06.080 the lucrative business of online targeted ads? 373 00:19:06.410 --> 00:19:12.230 So while the GDPR does not outright ban any of the practices, 374 00:19:12.230 --> 00:19:16.340 it does regulate them and it puts high sanctions on them, 375 00:19:16.350 --> 00:19:18.300 which was not the case in the old regime. 376 00:19:18.310 --> 00:19:22.430 So it has forced the big tech companies to 377 00:19:22.670 --> 00:19:28.630 take those rules into account, and it has brought them under a high 378 00:19:28.650 --> 00:19:30.300 level of regulatory scrutiny. 379 00:19:30.310 --> 00:19:35.270 They're very much being watched for what they do. 380 00:19:36.070 --> 00:19:40.070 We discussed real-time bidding, the online auction for our attention. 381 00:19:41.230 --> 00:19:46.100 GDPR now requires that companies and websites must ask us for explicit 382 00:19:46.109 --> 00:19:50.270 permission before they can take the kind of data this process requires. 383 00:19:50.470 --> 00:19:53.470 It has resulted in something that is much more 384 00:19:53.470 --> 00:19:56.710 familiar to all of us: the famous cookie consent 385 00:19:56.710 --> 00:20:00.190 boxes you see popping up on many websites. 386 00:20:00.670 --> 00:20:04.830 And the requirement there is of course to obtain consent from 387 00:20:04.830 --> 00:20:10.750 the individual before you can put a cookie on his device, 388 00:20:10.750 --> 00:20:13.420 which is often the first step to collecting his data. 389 00:20:13.430 --> 00:20:16.149 That's often how it works technically. 390 00:20:16.160 --> 00:20:18.970 And it's not enough for companies to assume if you keep 391 00:20:18.980 --> 00:20:21.250 browsing, you're giving consent. 392 00:20:21.260 --> 00:20:24.500 Max Schrems says it's wrong that ordinary citizens are expected to 393 00:20:24.500 --> 00:20:28.300 understand the extremely complex ways in which their data is being used. 394 00:20:30.230 --> 00:20:33.270 Even if you work for Google, you probably don't fully understand 395 00:20:33.270 --> 00:20:34.460 how everything works, 396 00:20:34.470 --> 00:20:37.160 everything that Google does, because you're just working on this 397 00:20:37.170 --> 00:20:40.910 one little piece. And it is quite interesting that we just 398 00:20:40.920 --> 00:20:45.510 overload the consumer and think that that's a fair way of dealing 399 00:20:45.510 --> 00:20:48.950 with it, which, I mean, as an asterisk, the law actually doesn't allow that. 400 00:20:48.950 --> 00:20:50.660 But that's at least the narrative of what 401 00:20:50.670 --> 00:20:52.699 companies put forward there. 402 00:20:52.710 --> 00:20:55.220 Penalties for data privacy breaches have gotten more 403 00:20:55.230 --> 00:20:57.580 severe since GDPR came in. 404 00:20:57.590 --> 00:21:00.750 Fines in the €100 million range have become common. 405 00:21:01.420 --> 00:21:05.940 In January this year, Meta was fined nearly €400 million for forcing users 406 00:21:05.940 --> 00:21:09.220 to accept targeted ads, but privacy campaigners say 407 00:21:09.220 --> 00:21:11.690 the legislation is still not being properly enforced. 
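[Editor's note: a minimal sketch of the consent rule described above: a tracking cookie is only set after an explicit, recorded opt-in, and continued browsing alone never counts as consent. The function and field names are hypothetical; real websites typically delegate this to a consent-management platform.]

```python
# Consent gate sketch: no explicit opt-in, no tracking cookie.
from datetime import datetime, timezone

consent_log = {}  # user_id -> record of the explicit choice

def record_consent(user_id, purposes):
    """Store an explicit opt-in, including when and for which purposes."""
    consent_log[user_id] = {
        "purposes": set(purposes),
        "timestamp": datetime.now(timezone.utc),
    }

def may_set_tracking_cookie(user_id):
    record = consent_log.get(user_id)
    return record is not None and "advertising" in record["purposes"]

def handle_page_view(user_id, response_headers):
    if may_set_tracking_cookie(user_id):
        response_headers["Set-Cookie"] = "ad_id=abc123; SameSite=Lax"
    # No recorded consent: serve the page without a tracking cookie.
    return response_headers

print(handle_page_view("visitor-1", {}))        # {} -- browsing alone is not consent
record_consent("visitor-1", ["advertising"])    # user clicks "accept" in the banner
print(handle_page_view("visitor-1", {}))        # cookie is now set
```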
408 00:21:11.700 --> 00:21:17.609 What we've seen is almost no material enforcement. The 409 00:21:17.619 --> 00:21:21.980 GDPR still is not yet real, and we've just had 410 00:21:21.980 --> 00:21:25.740 a situation where we were able to 411 00:21:27.310 --> 00:21:30.179 manage to prevail on the European Commission 412 00:21:30.340 --> 00:21:34.780 to start monitoring all large-scale cases across Europe. 413 00:21:35.300 --> 00:21:39.220 And the reason we had to do that was because the national enforcers and 414 00:21:39.220 --> 00:21:42.820 the Länder enforcers were not producing fast enough. 415 00:21:43.060 --> 00:21:46.740 And when decisions emerged, those decisions, particularly from 416 00:21:46.740 --> 00:21:50.330 Ireland and Luxembourg, did not solve the problem of big tech. 417 00:21:50.340 --> 00:21:55.220 The Commission for a long time had kind of the story of the 418 00:21:55.220 --> 00:21:56.770 GDPR as a success story. 419 00:21:56.780 --> 00:22:00.040 Don't rain on the parade, just say it's great and move on. 420 00:22:00.550 --> 00:22:03.750 And I think we have to be honest, like as a legislative proposal, 421 00:22:03.750 --> 00:22:05.429 it's actually quite 422 00:22:05.460 --> 00:22:07.290 kind of top-notch in the world. 423 00:22:07.300 --> 00:22:10.859 It's probably not the best we'll have 100 years down the road, we'll probably 424 00:22:10.859 --> 00:22:14.220 make it better, but it's quite a significant level, and a lot of 425 00:22:14.220 --> 00:22:17.740 other countries copy it. But that doesn't mean that it's 426 00:22:17.740 --> 00:22:20.650 perfect and that doesn't mean that the enforcement works well. 427 00:22:20.660 --> 00:22:22.850 We've looked at some of the complex ways in which our 428 00:22:22.859 --> 00:22:26.060 data is used and the increasing legal challenges 429 00:22:26.060 --> 00:22:28.020 that system is facing. 430 00:22:28.060 --> 00:22:33.420 But what's ultimately at stake here? Some say democracy itself. In my 431 00:22:33.420 --> 00:22:35.540 country, in the United States, 432 00:22:35.750 --> 00:22:37.550 the head of our federal drug 433 00:22:38.100 --> 00:22:39.899 administration 434 00:22:40.380 --> 00:22:46.820 told the world in this year, in the year 2022, that the number one, 435 00:22:47.460 --> 00:22:55.470 the primary cause of death in America in 2022, 436 00:22:55.470 --> 00:22:57.510 is disinformation. 437 00:22:58.500 --> 00:23:00.139 Disinformation: 438 00:23:01.109 --> 00:23:03.310 false information intended to mislead. 439 00:23:04.300 --> 00:23:08.260 Many believe its rising power is directly tied to the 440 00:23:08.260 --> 00:23:12.260 targeted advertising models which support the modern Internet. 441 00:23:13.660 --> 00:23:17.660 It's free to use, free to post, and it is 442 00:23:17.660 --> 00:23:19.090 underwritten and supported 443 00:23:19.090 --> 00:23:20.500 by the advertising infrastructure 444 00:23:20.500 --> 00:23:24.380 of programmatic advertising, which is what we just discussed. 445 00:23:25.950 --> 00:23:29.830 Because it's free, and so there's literally no 446 00:23:29.830 --> 00:23:32.889 cost to using it from the point of view of a consumer, 447 00:23:33.430 --> 00:23:36.669 and because every consumer can be a producer, 448 00:23:37.790 --> 00:23:40.230 then malicious producers spring up. 449 00:23:41.310 --> 00:23:43.940 Disinformation is a direct consequence of 450 00:23:43.950 --> 00:23:47.030 the economic mechanisms of surveillance capitalism. 
451 00:23:47.220 --> 00:23:51.980 And that in itself has become murderous, 452 00:23:52.460 --> 00:23:56.660 the primary source of death in American society. 453 00:23:56.700 --> 00:24:00.220 So this is something that is literally 454 00:24:00.540 --> 00:24:03.340 so brazen and outrageous, 455 00:24:04.310 --> 00:24:09.030 so counter to human society, so counter to democratic aspirations. 456 00:24:10.670 --> 00:24:13.260 Meta, through Facebook and WhatsApp, has been caught up 457 00:24:13.270 --> 00:24:16.149 in multiple scandals over disinformation. 458 00:24:16.340 --> 00:24:18.690 Most recently, the whistleblower and former Facebook 459 00:24:18.700 --> 00:24:22.580 employee Frances Haugen gave damaging testimony to U.S. Congress 460 00:24:22.580 --> 00:24:25.580 about the way Meta deals with hate speech and disinformation. 461 00:24:26.460 --> 00:24:29.210 But there is plenty of disagreement over who is 462 00:24:29.220 --> 00:24:32.480 ultimately responsible for the spread of disinformation. 463 00:24:32.980 --> 00:24:35.650 It's very easy to talk about misinformation and say, well, 464 00:24:35.660 --> 00:24:37.340 Facebook should take this down. 465 00:24:38.310 --> 00:24:39.420 What does that mean? 466 00:24:39.430 --> 00:24:41.859 You know, if I send you a text message that says 467 00:24:41.869 --> 00:24:44.990 vaccines cause cancer, we don't expect Deutsche Telekom to 468 00:24:45.030 --> 00:24:47.109 intercept that text message 469 00:24:47.460 --> 00:24:49.450 and stop you from getting it. 470 00:24:49.460 --> 00:24:52.290 If I call you on the phone and I say I think 471 00:24:52.300 --> 00:24:55.010 vaccines cause cancer, we don't expect the phone company to 472 00:24:55.020 --> 00:24:57.219 pick that conversation up and cut it off. 473 00:24:58.190 --> 00:24:59.700 There is also the argument that so-called 474 00:24:59.710 --> 00:25:03.030 threats to democracy do not always materialize as such. 475 00:25:04.580 --> 00:25:07.290 There's a wonderful quote from Douglas Adams who said 476 00:25:07.300 --> 00:25:10.340 that, you know, anything that existed before you were 477 00:25:10.340 --> 00:25:13.600 20 or 10 is just the way the world has always been. 478 00:25:13.740 --> 00:25:17.180 And anything created when you're a teenager or in your 20s is amazing and 479 00:25:17.180 --> 00:25:19.770 wonderful and exciting and you can have a career in it, and 480 00:25:19.780 --> 00:25:22.919 anything created after you're about 30 is a threat to democracy. 481 00:25:23.060 --> 00:25:25.609 We've come to the end of our look into the myriad ways in 482 00:25:25.619 --> 00:25:28.850 which our personal data drives the digital economy, 483 00:25:28.859 --> 00:25:31.580 and it seems we've arrived at a fork in the road. 484 00:25:31.660 --> 00:25:33.609 A big question looms: 485 00:25:33.619 --> 00:25:35.659 where do we go from here? 486 00:25:36.580 --> 00:25:40.220 We have strong and clear law and we're now in front 487 00:25:40.220 --> 00:25:42.939 of courts in multiple jurisdictions. 488 00:25:43.100 --> 00:25:46.540 I don't think that the current situation inside 489 00:25:46.540 --> 00:25:49.660 tech will really exist for much longer. 490 00:25:50.780 --> 00:25:53.130 This is why I think we're at this Dieselgate moment. 491 00:25:53.140 --> 00:25:56.700 This is the last chapter, but it has taken far, 492 00:25:56.700 --> 00:25:58.810 far too long to get here. 
493 00:25:58.820 --> 00:26:03.020 And although the legislator did its job, we have the right law, 494 00:26:03.380 --> 00:26:05.650 the enforcers have completely failed. 495 00:26:05.660 --> 00:26:09.300 I think it's really interesting to compare the Internet with cars. 496 00:26:09.900 --> 00:26:13.300 Like here is this sort of transformative technology that 497 00:26:13.300 --> 00:26:15.930 changes how we live, that changes how society works, 498 00:26:15.940 --> 00:26:18.530 that changes how cities work. 499 00:26:18.540 --> 00:26:20.250 And it comes with a bunch of problems. 500 00:26:20.260 --> 00:26:21.659 That's complicated. 501 00:26:22.630 --> 00:26:26.430 Most of the questions that come up in technology, again, are complicated. 502 00:26:26.430 --> 00:26:29.430 And they're not complicated because they're tech, they're complicated because they're 503 00:26:29.430 --> 00:26:33.950 policy. Tech policy isn't any easier or any simpler than education 504 00:26:33.950 --> 00:26:36.940 policy or energy policy or transport policy. 505 00:26:36.950 --> 00:26:37.700 It's all complicated. 506 00:26:37.710 --> 00:26:39.270 Policy is complicated. 507 00:26:40.220 --> 00:26:43.010 Shoshana Zuboff still believes in what she calls our 508 00:26:43.020 --> 00:26:46.820 democratic digital future, despite the dystopian imagery of much 509 00:26:46.820 --> 00:26:47.899 of her work. 510 00:26:50.100 --> 00:26:56.859 We need lawmakers to join together to chart the legislative path, to create 511 00:26:56.859 --> 00:27:02.060 the scaffolding so that we can do this for the sake of 512 00:27:02.340 --> 00:27:06.859 every democratic society and every society struggling to 513 00:27:06.859 --> 00:27:12.260 become a democracy, because without that, we will cede 514 00:27:12.260 --> 00:27:17.780 the death match to these forces of surveillance and control 515 00:27:19.230 --> 00:27:21.250 that have a very different 516 00:27:21.260 --> 00:27:25.380 ambition for our future, one that involves the substitution 517 00:27:25.380 --> 00:27:27.459 of computational governance, 518 00:27:28.430 --> 00:27:33.150 where they rule, for democratic governance, where the people rule. 519 00:27:33.750 --> 00:27:36.100 That's all from this edition of Business Beyond. 520 00:27:36.109 --> 00:27:39.300 If you want to see more from us, check out our playlist. 521 00:27:39.310 --> 00:27:42.140 A good place to start would be our recent video on how the war in 522 00:27:42.150 --> 00:27:44.820 Ukraine has transformed the global economy. 523 00:27:44.830 --> 00:27:47.630 Thank you for watching, and until the next time, take care.