Privacy And Power: The Illusion Of Choice (Part 3) | NBC Nightly News

Channel: NBC News
Published: 12/10/2019 09:33 PM

Once data-collection companies have compiled a detailed profile of who you are, they can use that information to serve you targeted ads for things you'll most likely want to buy. That same information, in the wrong hands, can be used to covertly manipulate your choices. In th...

[Music] Just because you may personally feel as if you have nothing to hide, I think that potentially undermines what I would call our collective privacy. You can think of privacy as something that's completely individual and only applies to you. But at the same time, society needs privacy. We don't function as a society without some sense of agreement over what is private and what is not.

Do you think you have a moral responsibility to run a platform that protects our democracy? Yes or no?

Yes.

Have users of Facebook who were caught up in the Cambridge Analytica debacle been notified?

Yes, we're starting.

It starts with the launch of Facebook Platform, which was 2007 to 2008, the pre-apps, pre-mobile era. What Facebook Platform enabled was for third-party developers to create apps on Facebook. This is the era of FarmVille and Words with Friends.

Mark says what makes Facebook better than the rest online are its privacy features. First, you have to belong to one of its 40,000 approved networks. Second, only you control what is on your profile and, most importantly, who has access to it. And here's one of the things Facebook did when they created Platform.

They allowed third-party apps not only to access your data; they allowed these third-party apps to have access to all of your friends' profile information. So not just their names, not just their photos, but many of the same fields that apps could get about you: what you like, the groups you belong to, and, if you filled out all of those different fields, what books you liked and music you liked and, I think importantly, your political affiliation. Apps could get access to all of that. Now, they weren't supposed to keep that data forever, and they weren't supposed to sell it. But what we saw was that Aleksandr Kogan at the University of Cambridge created an app that people added to their profiles. They were paid, in essence, to do this and take a survey, and so he was able to collect data from the people who added the app to their profiles, their survey data from the survey they took, and then all of the information from all of their friends. He had something like 300,000 people add his app, and he was able to get something like 70 million Facebook users' data from the original 300,000, because he could get access to every single person on every individual app user's friend list. So from that he had a very robust data set that he then took and sold to Cambridge Analytica. It wasn't illegal, but he definitely broke his agreement with Facebook as a third-party developer: you're not supposed to be able to do that.

What is clear to me now is that I made a mistake in not appreciating how people would feel about us using their data, and for that I'm deeply sorry. We thought collecting people's data like we did was completely normal and accepted, and that people whose data was being collected and transferred knew that it was regularly happening.
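The fan-out described above, roughly 300,000 app installers yielding data on some 70 million users, is simple arithmetic once friend lists are exposed. A quick sketch using only the figures cited in the transcript (the per-installer average is derived here for illustration, not reported anywhere):

```python
# Figures cited in the transcript: ~300,000 people added the survey app,
# and ~70 million Facebook users' profile data was ultimately collected.
installers = 300_000
profiles_reached = 70_000_000

# Average number of distinct friends' profiles each installer exposed,
# after overlap between friend lists is accounted for.
avg_exposed_per_installer = profiles_reached / installers
print(round(avg_exposed_per_installer))  # 233 profiles per installer
```

The point of the arithmetic: each person who consented effectively consented on behalf of a couple of hundred friends who never saw the app at all, which is the consent gap discussed next.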

So there's clearly a big consent issue there, around the fact that I wouldn't necessarily know that you added this app to your profile and that it had my data. But the other piece of it is what Facebook has about us. The things we tell them, which is mostly what your profile data includes, are really rich. It's not just where I shopped. It's not only where I've gone. It's what I care about, what I'm thinking about. In some cases, it's much more qualitative data. So Cambridge Analytica had that data, and then, as most of us probably know, they used it to target people in the 2016 election.

Over the course of my career, I have seen a number of challenges to our democracy. The Russian government's effort to interfere in our election is among the most serious.

It kept bothering me: is it really about privacy? Is it really about the fact that all those millions of users did not click "I agree"? Was it really about that? I don't think it's about that, because there is a huge information feast. Everybody's coming, everybody's eating, everybody's getting a piece, everybody's getting a share. We agree that everybody will do that. But then one day someone crashed our party and, you know, took a bite without asking. Is that what we really care about? I don't think so.

I think what happened in the Cambridge Analytica story is a really good showcase for a problem that we're going to see more and more of, and that's the problem of manipulating people's choices, without them being aware, based on the information collected about them. We already know that information about us is being collected. We already know that this information is being analyzed to create a very specific, detailed profile of us. But the next part, beyond targeting products and services, is to understand how we think and to try to mess with our brains in a way we don't even understand or know, and to give us the feeling that we actually chose A over B when we didn't, when the choices were presented in a way that made it very likely, based on who we are, that we were going to go with A and not with B.

This power to influence people's choices is not new. In the supermarket, the bakery is always located at the very far end of the store, because they want to draw you in: you're going to smell those pastries, you're going to walk all the way in, and then on the way back you might grab something. The candies are always at the eye level of children, so they'll ask their parents. Also tipping options: they ran a study on New York taxis and found that when riders are given three options for tipping, most people choose the middle option, because we always look for something that isn't risky. We shun the extremes and go with the middle.

But what is different today is that the power to architect choices has become personalized. It's not about general human weaknesses; it's about your own unique weaknesses, and we get to architect choice on an individual level. So if you respond well to fear-mongering messages, that's what we're going to show you. Psychologists studied the search engine manipulation effect, and they found that with undecided voters they were able to shift people's preferences by, in the extreme, 80 percent, just by rearranging search results. Google can influence your choice very effectively; they've tested it. Google's algorithm is a trade secret. We don't know what goes into the algorithm. All we know is that we get search results, and we don't know whether they're being manipulated or not.
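The search engine manipulation effect described above comes down to position bias: identical results attract very different amounts of attention depending on rank, so reordering alone shifts exposure. A toy sketch of that mechanism (the click-through percentages below are assumed for illustration, not figures from the study):

```python
# Toy sketch of position bias in search rankings. The click-through
# percentages per rank are assumed values, not measured ones.
CTR_PERCENT_BY_POSITION = [30, 15, 10, 7, 5]  # share of clicks per rank

def attention_share(results, favored):
    """Total assumed click share captured by results favoring one side."""
    return sum(ctr for r, ctr in zip(results, CTR_PERCENT_BY_POSITION)
               if r == favored)

# The same five results in two orderings: "A"-favorable items on top.
neutral = ["A", "B", "A", "B", "A"]
biased  = ["A", "A", "A", "B", "B"]
print(attention_share(neutral, "A"))  # 45 (% of clicks)
print(attention_share(biased, "A"))   # 55 (% of clicks)
```

Nothing is added or removed between the two orderings; only the ranking changes, yet the favored side's share of attention grows, which is why ranking opacity matters.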

I give Google the credit that they do not do that, but they could. [Music] Jigsaw is a Google-owned think tank, and Jigsaw ran an experiment that turned into the redirect method, which is still in use. The redirect method works like this: it identifies potential ISIS recruits, because when you search on Google, a lot can be learned from your searches and what you're interested in. You look for information about ISIS, and then, after you're identified as a potential recruit, Google starts showing you sponsored content: playlists from YouTube with videos that counter ISIS propaganda.

I could never condone the acts of those who take the law into their own hands and kill civilians. He's wrong. Simple.

It illustrates the power to redirect, to architect people's choices. And even though I think this is an example of a way to personally architect choice that is done responsibly, it could definitely turn into covert manipulation. The Cambridge Analytica business model is exactly that: when you have a large number of different personality types, and you target to those different personality types information based on what will be most effective, that is manipulation. People thought they were making a choice, but they weren't. I think this is why people were so upset about Cambridge Analytica.
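At its core, the redirect method described above is query matching plus content substitution. A minimal sketch of that idea (the flagged terms and the playlist identifier are hypothetical placeholders, not anything Jigsaw or Google actually uses):

```python
# Minimal sketch of the redirect idea: flag certain search queries and
# respond with counter-narrative content instead of ordinary sponsored
# results. The terms and playlist ID below are hypothetical placeholders.
FLAGGED_TERMS = {"join isis", "isis recruitment"}
COUNTER_PLAYLIST = "counter-propaganda-playlist"  # placeholder ID

def redirect_content(query):
    """Return counter-content if the query matches a flagged term, else None."""
    q = query.lower()
    if any(term in q for term in FLAGGED_TERMS):
        return COUNTER_PLAYLIST
    return None

print(redirect_content("How to join ISIS"))   # matched: counter playlist
print(redirect_content("weather tomorrow"))   # not matched: None
```

The same mechanism, pointed at a different audience with different content, is what makes the technique double-edged: the line between "responsible choice architecture" and covert manipulation is only the intent behind the term list.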

This, to me, is the greatest concern, because once choice becomes illusory, we are no longer autonomous human beings, and at that point every democratic value we believe in is at risk. It's about the power, the power that all these companies, and the government, and everybody has, just by collecting all this information about us and knowing us so damn well. [Music]
