Speaker 1: So you are critical of Facebook,
Speaker 1: but not of the people themselves,
Speaker 1: because you actually write here that
Speaker 1: you don't think
Speaker 1: this is about the people. When somebody asked
Speaker 1: whether Zuckerberg should step down,
Speaker 1: you said, "I believe this is about the business model.
Speaker 1: If you don't change the business model,
Speaker 1: it doesn't matter who's running it."
Speaker 2: I believe that from the bottom of my heart.
Speaker 2: I had a wonderful relationship with Mark as
Speaker 2: a mentor back in 2006 to 2009.
Speaker 2: You know I had
Speaker 2: a wonderful relationship with Sheryl Sandberg,
Speaker 2: helped bring her into the company,
Speaker 2: and I have enormous respect for them
Speaker 2: just as I do for Larry and Sergey at Google,
Speaker 2: you know I think these are brilliant people,
Speaker 2: but I think the culture of Silicon Valley,
Speaker 2: the culture of the business world today in
Speaker 2: this unregulated environment and
Speaker 2: combined with the sort of brilliance
Speaker 2: of these people,
Speaker 2: and the sort of general sense of
Speaker 2: exceptionalism, this notion that
Speaker 2: there are no rules that apply
Speaker 2: to the smartest people in the Valley,
Speaker 2: that's what got us into trouble and,
Speaker 2: you know, when I look at this,
Speaker 2: it's now become a huge issue for democracy,
Speaker 2: for public health, for privacy
Speaker 2: and frankly for innovation and growth.
Speaker 1: So when do you think Facebook
Speaker 1: crossed the line in its business model?
Speaker 2: You know, I wish I knew, Hope.
Speaker 2: I mean, the truth of the matter is I'm a lifetime investor;
Speaker 2: I spent 35 years covering
Speaker 2: tech, and I was at the tail end of my career.
Speaker 2: You know, Facebook was one of the last big
Speaker 2: investments I made and I
Speaker 2: wasn't paying close enough attention
Speaker 2: so I missed the inflection.
Speaker 2: When I stopped being an insider in 2009, so a decade ago,
Speaker 2: they didn't yet have the current business model;
Speaker 2: that really sort
Speaker 2: of came in 2012, 2013.
Speaker 2: And the truth is it was invented by Google in 2002,
Speaker 2: it was really this notion that when Google was
Speaker 2: improving the search results in 2002,
Speaker 2: they noticed that they
Speaker 2: only needed about one percent of the data
Speaker 2: they gathered to do that.
Speaker 2: And so they went to see whether there was any value in
Speaker 2: the rest of the data, and they just went, wow,
Speaker 2: there's some behavioral prediction here.
Speaker 2: So then they decide, well,
Speaker 2: we need to know who these people are.
Speaker 2: So they create Gmail
Speaker 2: and then they basically tell you,
Speaker 2: we're going to look at your messages
Speaker 2: because we're trying to find out what you think.
Speaker 2: And so with Gmail,
Speaker 2: they not only find out who people are,
Speaker 2: they get behavioral prediction.
Speaker 2: E-mails are a really great way to
Speaker 2: find out what people are going to do next.
Speaker 2: Then they do Google Maps, so they're going to
Speaker 2: find out where everybody is, and
Speaker 2: they add all of these things, and then
Speaker 2: they start doing Street View, where they drive
Speaker 2: through public places and basically create
Speaker 2: a market for the data
Speaker 2: that used to be shared by all of us.
Speaker 1: Right.
Speaker 2: And then they do Google Glass to
Speaker 2: kind of get in our face and do that up close.
Speaker 2: And the problem with this model is that,
Speaker 2: they've basically taken all of
Speaker 2: this stuff without our permission.
Speaker 2: You know, yeah, we may check off a box in the thing,
Speaker 2: but we don't actually know what we're doing, right?
Speaker 2: And they've gone to great pains to hide it.
Speaker 2: And Google perfected this model;
Speaker 2: they've been doing it for a long time, to the point where
Speaker 2: first they get a behavioral prediction, then they create-
Speaker 1: Right. But your book is about Facebook.
Speaker 2: No, it's about the whole problem. It starts with
Speaker 2: Facebook [OVERLAPPING] because I was Mark's mentor.
Speaker 2: And so that's what I saw
Speaker 2: first, and plus it's a great title, right?
Speaker 1: Yes, of course.
Speaker 2: And so that's why I went,
Speaker 2: but it's really about this whole model.
Speaker 2: So it's Google, Facebook, Amazon, Microsoft, Verizon.
Speaker 3: As Mark's mentor, what's
Speaker 3: fundamentally different about the Mark that
Speaker 3: you knew and the Mark that is
Speaker 3: post-Instagram acquisition,
Speaker 3: [OVERLAPPING] post-WhatsApp acquisition?
Speaker 2: Brad, I wish I knew.
Speaker 2: I loved Mark, I thought he
Speaker 2: was a really great person to work with.
Speaker 2: And again, he had lots of
Speaker 2: mentors when I was there and eventually
Speaker 2: the business became more mature and
Speaker 2: so I was not the right guy to keep doing that.
SPEAKER 1: I think success gave everybody in
SPEAKER 1: that company a sense
SPEAKER 1: that everything they touched turned to gold.
SPEAKER 1: They became resistant to any kind of criticism or
SPEAKER 1: negative feedback and I think it got very
SPEAKER 1: hard for them to imagine that
SPEAKER 1: anybody would use their products differently than
SPEAKER 1: they intended, and so, as a consequence, when
SPEAKER 1: I went to them in October of 2016,
SPEAKER 1: I wasn't expecting them to kind of immediately roll over,
SPEAKER 1: but I was hoping they would do
SPEAKER 1: an investigation and figure out,
SPEAKER 1: was there a structural problem with the ad products?
SPEAKER 1: With the algorithms and
SPEAKER 1: the business model that let bad people
SPEAKER 1: hurt innocent people and then of course at
SPEAKER 1: the election let people change the outcome of elections.
FEMALE 1: Have you heard from Zuckerberg since?
SPEAKER 1: I haven't heard from anybody at Facebook since
SPEAKER 1: February 2017 and I understand that also.
SPEAKER 1: I was the wrong messenger for Mark and Sheryl.
SPEAKER 1: I mean I spent three months privately trying to
SPEAKER 1: persuade Facebook to do what
SPEAKER 1: Johnson and Johnson did with
SPEAKER 1: Tylenol in 1982 when somebody poisoned bottles,
SPEAKER 1: and what Boeing should have done with the 737 MAX,
SPEAKER 1: which is to recognize your first duty is to
SPEAKER 1: protect the people who use your product and you have
SPEAKER 1: to just stop doing business long
SPEAKER 1: enough to find the problems and fix them
SPEAKER 1: and you know for
SPEAKER 1: whatever reason they didn't find that helpful and so,
SPEAKER 1: that's when I became an activist.
SPEAKER 2: For Facebook, what should consent
SPEAKER 2: requirements entail for leveraging people's data?
SPEAKER 1: I think the great question to ask, Brad, is:
SPEAKER 1: why is it legal for companies to
SPEAKER 1: make a third-party market in private data?
SPEAKER 1: Why is it legal for
SPEAKER 1: credit card processors to
SPEAKER 1: sell our credit card transaction history?
SPEAKER 1: Why is it legal for
SPEAKER 1: cell phone companies to sell our location?
SPEAKER 1: Why is it legal for
SPEAKER 1: health apps to sell our wellness data?
SPEAKER 1: Why is it legal for anybody to
SPEAKER 1: sell data about where we go on the Internet?
SPEAKER 1: Why is it legal to even collect data on minors?
SPEAKER 1: I think that we have to roll all of that stuff
SPEAKER 1: back and then ask the question, when is it okay?
SPEAKER 2: Well, their argument is that, they're
SPEAKER 2: just indexing it and
SPEAKER 2: then leveraging it to put ads on top.
SPEAKER 1: No! That is
SPEAKER 1: their argument, but that's
SPEAKER 1: not what they're actually doing.
SPEAKER 1: What they're really doing is making
SPEAKER 1: behavioral predictions and then using
SPEAKER 1: filter bubbles and
SPEAKER 1: recommendation engines to make those things come
SPEAKER 1: true. Let me give you an example of
SPEAKER 1: the kind of thing we should be worried about.
SPEAKER 1: So, when you go onto
SPEAKER 1: a news site and they ask you,
SPEAKER 1: are you a robot?
SPEAKER 1: Look at these pictures, right?
SPEAKER 1: Do you see the cars or do you see the street signs?
SPEAKER 1: That's called CAPTCHA. It's a Google product.
SPEAKER 1: Well, CAPTCHA isn't really
SPEAKER 1: to figure out if you're a human.
SPEAKER 1: It's to help train
SPEAKER 1: their artificial intelligence for the cars.
SPEAKER 1: They figure out you're a human from
SPEAKER 1: the mouse movement, but here's the deal:
SPEAKER 1: if I get a little bit older and
SPEAKER 1: my mouse movement gets slower
SPEAKER 1: and, let's say, it gets shaky,
SPEAKER 1: that might be the first sign of
SPEAKER 1: something like Parkinson's disease.
SPEAKER 1: Now, they're under no legal requirement
SPEAKER 1: to alert me that I may have a neurological problem.
SPEAKER 1: In fact, they're not even governed by HIPAA.
SPEAKER 1: So, they don't even have to protect my privacy.
SPEAKER 1: They're free to sell that to
SPEAKER 1: the highest bidder. In
SPEAKER 1: the United States, that's almost certainly
SPEAKER 1: my insurance company, who wants to raise
SPEAKER 1: my rates or terminate my insurance. And my point here is,
SPEAKER 1: markets are about both sides
SPEAKER 1: having roughly equal information.
SPEAKER 1: Now, we have this situation where
SPEAKER 1: any company that goes to market has to go through Google,
SPEAKER 1: Facebook, or Amazon, who
SPEAKER 1: have perfect information on everybody, and
SPEAKER 1: all we have is what they choose to show
SPEAKER 1: us, and I just want to have that conversation.
SPEAKER 1: To me, that's what the book is about.
SPEAKER 1: That's what this whole debate is about.
SPEAKER 1: With the 2020 election coming up,
SPEAKER 1: we have the perfect time.
SPEAKER 1: Every candidate should have to
SPEAKER 1: declare what they're going to do about this stuff.
FEMALE 1: For sure. So, you have
FEMALE 1: a new Congress now, and you've got people in there
FEMALE 1: who are looking at this, who are also very tech savvy
FEMALE 1: and understand it, which is fantastic.
FEMALE 1: We know regulation is always
FEMALE 1: slow to catch up to innovation, right?
SPEAKER 1: Yeah that's true.
FEMALE 1: So maybe that's just where we are,
FEMALE 1: as a budding industry; digital media is, right?
FEMALE 1: Maybe that's it. Really quick:
FEMALE 1: I do agree with you in a lot of ways that
FEMALE 1: social media, you say, has enabled
FEMALE 1: personal views that had previously been
FEMALE 1: kept in check by social pressure.
FEMALE 1: But is it too late to roll these things back in
FEMALE 1: a social culture that we've
FEMALE 1: developed through these platforms?
SPEAKER 1: This is the really good news.
SPEAKER 1: So, the first thing is, we're
SPEAKER 1: not going to lose the things we
SPEAKER 1: like in this transition, because, one,
SPEAKER 1: these guys are not going to be
SPEAKER 1: unprofitable if we make the changes I'm
SPEAKER 1: talking about; they would be insane to
SPEAKER 1: leave the market. And how long
SPEAKER 1: do you think it would take people to
SPEAKER 1: have alternatives to these products?
SPEAKER 1: You could create alternatives. I
SPEAKER 1: was at a tech thing the other day and I asked,
SPEAKER 1: what do you think the over and
SPEAKER 1: under is on how long it would
SPEAKER 1: take to replace Facebook, and people said two weeks.
FEMALE 1: I mean the platforms and satellites and
FEMALE 1: even the behaviors that
FEMALE 1: we've adopted as a result of this, okay?
SPEAKER 1: But I'm saying a lot of those behaviors are not
SPEAKER 1: actually helping us the way we think, and here's my point:
SPEAKER 1: I don't think these companies are innovative at all.
SPEAKER 1: I think they've actually frozen
SPEAKER 1: innovation in terms of what they give us.
SPEAKER 1: Their innovations are entirely
SPEAKER 1: about how they suck profits
SPEAKER 1: out of customers, and I think that part's unhealthy.
SPEAKER 1: So my point to you here is:
SPEAKER 1: I've spent time with
SPEAKER 1: the antitrust division of the Justice Department.
SPEAKER 1: I've spent time with the FTC, and they
SPEAKER 1: are really digging into these issues because they
SPEAKER 1: matter. And the thing people know is that antitrust is
SPEAKER 1: a pro-growth form of
SPEAKER 1: intervention. In tech, the history is amazing:
SPEAKER 1: every antitrust intervention has created
SPEAKER 1: a new industry and
SPEAKER 1: massive innovation, and I
SPEAKER 1: believe the same thing would be
SPEAKER 1: true here. And you're right, Hope.
SPEAKER 1: It's going to take a little time to get all this done,
SPEAKER 1: but the important thing is for
SPEAKER 1: people to get engaged in the conversation.
SPEAKER 1: To understand what's at stake here,
SPEAKER 1: because, again, I've got nothing against these people.
SPEAKER 1: I think they're really smart, but I think they've
SPEAKER 1: acquired political power without getting
SPEAKER 1: elected and without any accountability, and
SPEAKER 1: that political power right now is
SPEAKER 1: destabilizing too many things
SPEAKER 1: in our economy and our society.
SPEAKER 2: Well, coming back to his point a little bit
SPEAKER 2: here, and you've said this in the past,
SPEAKER 2: that Facebook is terrible for America.
SPEAKER 2: It's driving a lot of how we think
SPEAKER 2: about each activity that we do on a day-in, day-out basis.
SPEAKER 2: You mentioned
SPEAKER 2: the 2020 election, and I want to
SPEAKER 2: come back to that. In your eyes,
SPEAKER 2: what should the platform be doing to
SPEAKER 2: prepare for the 2020 election, knowing that
SPEAKER 2: it's inevitable that there will be some sort
SPEAKER 2: of influence that their platform will at least have?
SPEAKER 1: I think the most impactful thing they could do
SPEAKER 1: would be to voluntarily withdraw
SPEAKER 1: from having any targeted election advertising in
SPEAKER 1: the last two to three weeks before
SPEAKER 1: the election itself.
SPEAKER 1: To do that would be an amazing thing.
SPEAKER 1: I mean, right now Mark has offered that they're going to
SPEAKER 1: withdraw from any country that
SPEAKER 1: has either a human rights violation policy or a privacy
SPEAKER 1: problem. Just withdrawing for a few weeks
SPEAKER 1: at the end of the campaign,
SPEAKER 1: because of voter suppression, which
SPEAKER 1: is so easy to do on these platforms.
SPEAKER 1: Because the country is polarized and the
SPEAKER 1: products work the way they do, you don't have to
SPEAKER 1: convince people of anything on these things.
SPEAKER 1: You just need to scare them
SPEAKER 1: and then blame the other guy, or blame
SPEAKER 1: both guys. So that would be the thing I think
SPEAKER 1: is most important, because it really bothers me.
SPEAKER 1: Facebook created that database of political ads
SPEAKER 1: and then they blocked
SPEAKER 1: ProPublica from doing any research on it,
SPEAKER 1: which is just like, guys, come on!
SPEAKER 1: ProPublica are the good guys, right?
SPEAKER 1: You've got to let them do that.
FEMALE 1: Roger, do we look at Apple and
FEMALE 1: Microsoft differently? Because they would argue
FEMALE 1: that they are not making money off of our data.
SPEAKER 1: So, Microsoft is definitely making money off our data.
SPEAKER 1: Remember, Microsoft has LinkedIn.
SPEAKER 1: They've got Bing, the search engine,
SPEAKER 1: and they are huge in A.I., okay?
SPEAKER 1: So, I would say Microsoft was late
SPEAKER 1: to the party but they want to be right there.
SPEAKER 1: I think Apple is different.
SPEAKER 1: Apple is taking a different approach.
SPEAKER 1: Salesforce is taking a different approach,
SPEAKER 1: and IBM, at least in its A.I.
SPEAKER 1: business, is attempting to be
SPEAKER 1: ethical, and again, we'll have to see the results.
SPEAKER 1: But what I'm looking
SPEAKER 1: at right now is, I think the guys you've
SPEAKER 1: got to watch out for are Google, by a mile,
SPEAKER 1: then Facebook and Amazon,
SPEAKER 1: Verizon and Microsoft. Those are the five who
SPEAKER 1: have, if you will,
SPEAKER 1: this data economy working for them in one way or another.
FEMALE 1: And just lastly, this pivot to privacy,
FEMALE 1: Mark Zuckerberg's latest post:
FEMALE 1: all right, we're going to ephemerality and encryption.
SPEAKER 1: That's a dodge. It's all about avoiding responsibility
SPEAKER 1: for the hate speech and
SPEAKER 1: all the divisive stuff that's on there.
SPEAKER 1: If it's encrypted end-to-end,
SPEAKER 1: then they can go, Hey! I can't see it.
SPEAKER 1: The truth of this is,
SPEAKER 1: I love what he said about getting out
SPEAKER 1: of the countries which have the bad policies
SPEAKER 1: on human rights. But
SPEAKER 1: the other part of what's wrong with
SPEAKER 1: that whole conversation is that
SPEAKER 1: 99 percent of the value is not the stuff you put in.
SPEAKER 1: Right? What you put in is one percent.
SPEAKER 1: The 99 is the metadata.
SPEAKER 1: It's all the data they acquire in other places.
SPEAKER 1: All the things they get from surveillance
SPEAKER 1: and he's told you
SPEAKER 1: flat out, in his Jonathan Zittrain interview, that he's
SPEAKER 1: not going to end those practices,
SPEAKER 1: and until we force them to, they're not going to.
FEMALE 1: How do we fix this business model? Because we've got to go.
SPEAKER 1: Yeah, we've got to eliminate
SPEAKER 1: the collection and the trading of that data.
SPEAKER 1: No more credit card data.
SPEAKER 1: No more personal health data.
SPEAKER 1: No more location data.
SPEAKER 1: No more Internet travel data. And they're all
SPEAKER 1: going to squawk like hell, and I go, I get it, guys.
SPEAKER 1: We're going to get rid of it; then we're going to have
SPEAKER 1: the conversation about what's legit.