Cloudflare Internet Summit (2018)

>>SPEAKER: Good morning, everybody. Welcome to
Cloudflare. If you would take your seats, we're going to start
in 5 minutes. Thank you.
>>SPEAKER: Good morning! Great
to see a bunch of familiar faces. This is the fourth year
we have done the Cloudflare Internet Summit in San
Francisco. We did it this time in London as well.
And you are going to hear the word Cloudflare today; I have
done it once, so I will say it two more times.
Because today is not us talking about us, it is about us talking
with and having conversations with the people who really
inspire us that are doing really amazing and interesting things
on the internet, or sometimes terrify us. They do things that
are really scary. Last year, for instance, we had the CTO of
Cambridge Analytica debate Larry Lessig on stage with the
question, will they destroy democracy, and that was in
September, well before Cambridge Analytica became a household
name. Those are the conversations that we have here
that we hope to have today and we brought some amazing people
in order to do that. Enjoy the day, have a great time. This is
a tight space, we get a lot of people into a tight space. So
be courteous to everyone else. We love it when people have
conversations and talk about things, but if you do it in that
back area, then everyone here on stage can hear you. So we put
arrows on the floor to direct you out; we have a back
patio that way and a refreshment and lounge area in the back. If
you want to step away from the stage, go into those places so
you are not interrupting the speakers on stage. So, with
that, I'm excited to kick off the fourth San Francisco
Cloudflare internet summit and welcome to the stage my
co-founder, Michelle Zatlyn, to talk about running a company
in the internet age.
>>SPEAKER: Well, I'm honored to
be here with you today. You have had an incredible career: you
ran a large and really important company, GE, for 16 years, and
that was a transformation. Since you left, you have chaired a
health company, and you worked in DC on some U.S. policy
councils. There are not many people with your experience of
both large and small companies.
So I wanted to start by talking about a large company, GE.
Did you wake up one day and say, we are going to go through a
digital transformation, or did it happen as an evolution? Help
frame that, and we will move on from there.
>>SPEAKER: I will start micro and then get more pervasive.
The biggest aspect of this company is the long-term equipment
relationships we had with the customers. So you always have to
be keen on what technology is going to help those assets
perform better.
Historically, that has been material science. But, starting
at about 2010, the combination of sensors, analytics, and
things like that, started to allow you to change the way you
interface with your customers. So we used to sell jet engines,
and now we sell outcomes. So I think our notion wasn't so
much that we envied Silicon Valley or anything like that; it
was the context of how you become a fantastic, productive
supplier to your customers. And that is what led a lot of
industrial companies down a digital journey. If you
broaden that, Michelle, there's a wave of technology that is
coming through the system right now, from blockchain to
big data and AI, things like that, that are going to interface with
every company in the world. You have to figure out what is your
strategy in that context. In our case, owning the analytics
around our products, that was a make strategy. We had to do
that. Things like security and things like that, that's a buy
strategy, and that is important.
Michelle Zatlyn: You have to decide what you are going to buy
and then commit.
Jeff Immelt: Think about Amazon and AWS. They recognized
they were the use case for that. And so in data, for
industrial products, and in manufacturing, we were the use
case for that. We were the biggest practitioner, we knew more,
we had more domain, so why not do it? On AI, if you compare
our computing power with Google, we're a pimple. We're like an
ant. So it is weaving through all that stuff to decide, you
know, kind of what your strategy is going to be.
Michelle Zatlyn: So you used the AI example, and how you are
an ant. Do you feel that they are the leaders, and are you
betting on them for a path forward to drive good outcomes for
your customers?
Jeff Immelt: From the customer standpoint, the first wave of
the internet passed us by. In other words, in IT, your IT
department were project managers, so you would implement an
ERP from Oracle, or a CRM from Salesforce, but you didn't know
anything about how it was going to go. Your expectations
for benefits were low. I think the next wave of IT
tools are practitioner-based, these are outcome-based.
So the metrics have to be different, the talent has to be
different. And I use this example: in the '90s and 2000s, an
IT person was a project manager and not a technologist. So we
had great materials scientists and not industrial
technologists. And that's a weakness in the world today: if
you don't have the in-house capability, you cannot be a smart
buyer of the technologies that have to be installed.
Michelle Zatlyn: Yes, you have
to know technology or how could you buy the right tools?
Jeff Immelt: So in the 1990s, Oracle would do all the
consulting for you, and it was a compliance tool and not an
outcomes tool. And now everything that you have is an
outcome tool, you need somebody that is a good practitioner.
Michelle Zatlyn: I wanted to spend a few minutes talking
about people. You said that we need to bring a different kind
of person into GE to make the right decisions and execute on
this vision that you articulated so clearly. How did you do that
and go and recruit new types of digital talent into GE, and then
I would like to hear about it — the current team that you
had, how did you get them to buy in with this change?
Jeff Immelt: You have to demonstrate that the use case is
interesting, you know. And so, in other words, you are
committed to it. In the use case of — if you actually want to
know how to build advanced models around a jet engine, that
is quite a good technical challenge, and this is the best
place to do it. And we are committed to build the right
kind of capability and talent around doing that. Unless you
can commit to being the best, people are going to think you
are pretending. They are going to think you are going through
the motions. And, you know, maybe to a fault, I brought in
the digital team and protected them. The only way to get
talent is to show you are in it to win it. And in that space,
there's C3 IoT, Uptake, a bunch of companies that are venture
based. But in that world, I would say that the biggest thing
by far, whether we are talking about security or digitization
or anything like that, is what do you do with your legacy?
What do you do with your legacy assets, with your legacy people,
do you run two camps and try to merge them over time? I think
that is one — you always run a continuum between isolating the
digital focus, giving it room to breathe, and slowly but surely
bringing all the legacy capability with it. At the end
of the day, you have to bring everybody along. So think about
security and put yourself in my shoes: 330,000 people, 185
countries, 40,000 suppliers, assets and factories around the
world; that is on the make side. On the sell side, 15-year-old
CT scanners, digital assets on utilities, jet engines,
locomotives. So you sit down one day and say, how do I keep my
institution secure from a cyber standpoint? It is a
daunting task. You have to be committed to bringing people
with you, but you have to be equally rigid on the speed with
which you allow people to embrace change.
You cannot run an institution with people who do not
understand the fundamental technology. Whether you are going
to buy or make everything, you have to have people who
understand the fundamental risk and technology.
Michelle Zatlyn: I have seen where that works and it
resonates. I wanted to talk about your leadership team and
how that changed, did you have to reconfigure your leadership
team or was it similar to the rest of the organization?
Jeff Immelt: I remember the first time my predecessor
brought Six Sigma to GE. And I said, what in the hell is he
doing? It doesn't resonate with me at all. I said, Jesus
Christ, we have lost our minds.
Michelle Zatlyn: Earlier in my career, I learned Six Sigma,
and it was a big deal.
Jeff Immelt: And everybody in the organization said it made
sense, it is going to drive productivity and make us more
productive. And I would invite, in Riyadh, all of our
customers in the Middle East, and they would sit and nod their
heads about asset performance management, predictive failure,
time on wing, going from CapEx to OpEx, outcome selling. You
would see a thousand heads nod.
So I think to a certain extent, the
leaders understood the context. But while we could tell you,
here's the study on this nickel alloy and what this ceramic
does at 2,000 degrees, nobody understood what a systems
architect was, or things like that. So we had to do a mosh pit
between the technology you are bringing in and the people you
work with every day. And some
people left, you know. The person that runs the oil and gas
business, this guy named Lorenzo Simonelli, is now a digital
native. That's what the industry dictates of you and what is
important. And people grab it in terms of age and the
relevancy of the business they are in. But if you are smart,
at any point in your career you want to grab what is next. That is
on the analytics side. On the cyber side, I chaired the cyber task force and working
group inside the company; it was the only level at which the
right risk could be taken. If I gave it to my general counsel I
would normally give it to, he would shut it down, this is too
scary. If I gave it to a division head,
he would take too much risk. Keeping a company more or less
safe can only be done at the CEO level. So I think, when we
talk about the wave that we are in right now, there are
different gradations of different opportunities as you look at
it. In cyber, we treated it more like: let's not all
experiment, do it my way. On the analytics side you can say,
let's experiment and see what each business can teach the
others, but when it comes to risk mitigation, it is a
universal march.
Michelle Zatlyn: More prescriptive.
Jeff Immelt: I cannot tell you
how scary it is to run a company from a security risk standpoint,
it is hard to run the company. You are fired two ways, one is
by not being vigilant. So you need to have a knowledge of what's happening around you, and
the other way you are fired is not being to spec. So you need
to have everybody doing what they are supposed to be doing
every day. You need to have a reflection between everything
you know, and then saying, okay, we are going to do these three
things. We are going to mitigate the install base in 18
months, do this in 7 months. Even if you know you have not
solved the problem, you have to walk around with that
knowledge.
A lot of people walk around with the happy briefcase. They
go home at night and say, I have solved every problem, and
everything is done. Today, you have — you go home
because you are tired, and you say, shit.
There is so much going on I don't know about, how will I get
on top of it?
Michelle Zatlyn: And with cybersecurity work, the question
is, is it done, are we secure? And the work is never done, we
cannot say it with certainty. So we are always in progress.
Jeff Immelt: And there's no regulatory regime for health
care yet, it is growing in financial services, but there
are so many industries we are in where there is no regime. You
are setting your own best practice as you are going
through it, and that takes whatever was hard and makes it
triply hard because you are setting the standard.
And, you know, the Deepwater Horizon, with British Petroleum
in the Gulf in 2010, basically cost the company $70 or $80
billion in costs or liability.
Michelle Zatlyn: B with a
billion.
Jeff Immelt: The day it
happened, there were 14 standard operating procedures on the
platform and now there are 5. So how do you have vigilance,
and having a standard and not meeting the standard, those are
the two modes of failure. And cybersecurity is the same way.
Michelle Zatlyn: You do the ABCs and you have a long way to
get there. What does the CIO of the future look like?
Jeff Immelt: You have to be a deep technologist and you have to
understand the user. In other words, I think the — the day of
the gatekeeper and project manager, that has to dissipate.
And you really have to get the tools in the hands of the sales
leader, the manufacturing leader, the service leader.
So you need deep technology, and you need use case. So my view
is, you have a very small central group, very small. And
you have incredible depth as it pertains to tools, right, what
is going on. The whole notion of platform versus point
solutions — I have to say, it still confuses me. To a
certain extent, your organization is going to grow
beyond the ability to do everything on an Oracle ERP, or an
Epic electronic patient record. So you really are going to
have to find ways to have very simple platforms, but get the
tools into the hands of the people that use them.
So I think Salesforce is an incredible company. I think
Marc Benioff is my hero. We were one of their first
customers, we spent $10 or $20 billion. If you asked me, did
you make any more money because of that, I would say, I don't
know really. I hope so.
[ Laughter ]. But it is really hard, it is
really hard for me to say. I know the sales team likes it,
and I think it was amazing in its context, but I don't
know. Those days are over. And now you have the — the tools
have to be in the hands of a practitioner, you have to go
from what I would call just transactional selling, you have
to guarantee outcomes. So what advantage do we have in the
digital space over a Silicon Valley start-up company? I can
go to an airline and say, I guarantee you, 90 percent of the
time, it is paid for. There is nobody out here that is going to
do that. We have the physics and the analytics, and the
notion of getting to outcomes is incredibly important as you go
forward. Michelle Zatlyn: I wanted to
spend a few minutes and then we will go to questions.
So you ran this organization for 16 years, which was
incredible, and then you moved to Silicon Valley and joined
NEA as a venture partner. That's a big change.
How is it going?
Jeff Immelt: I had never lived in Silicon Valley, and I
wanted to see what it was like, what
the start-up scene was like, I wanted to spend time around
growth companies. I joined NEA, I'm a deep domain health care
person, I'm a deep domain industrial automation person,
the NEA footprint follows the skill set. I wanted to work
with private growth companies, not with public growth
companies. And I would say that it has been — I think it has
been fascinating and fun. So there is definitely a bubble out
there. People out here do not know what is happening in the
rest of the world. [ Laughter ].
But I view that as a — I view that as a real positive. I
don't view it as a negative. There's a sense of optimism that
is so incredibly powerful in terms of the world today.
There's a lot of money, maybe too much, in terms of how and
where to invest, and there are so many
smart people out here. You have a lot of people that tell
people like you and Matthew: this is what you should do. In my
experience, I would say: this is what I would do if I were
you, which is a different kind of advice about what to do
next. So with growth companies, there's a different nuance. My
friend, the
CEO of American Express, he is doing the same thing. It is
great fun, and in healthcare, you know, on the diagnostic
side, I can say that diseases that had a 5 percent survival
rate in the 1980s or the 1990s have an 85 percent survival
rate now.
So there are certain things in a market like health care where
small companies are going to have a chance to really be
incredibly disruptive, and in health care I want to be a part
of that.
Michelle Zatlyn: Do you think you are going to start a trend?
Jeff Immelt: I wanted to do something that was interesting
and fun out here, and more importantly, I wanted to work
with CEOs and with emerging teams and help make them
successful. And I think you can really do it
here. Michelle Zatlyn: This is a
lightning round of three questions. What is the most
surprising thing, the funniest thing that has happened, and
something you would change. Jeff Immelt: The most surprising
thing, there is so much money. You know, I just —
I would say that it is like, you might want to write this down.
Not every idea is a good idea. [ Laughter ].
In other words, take it from somebody who — it is a
challenge. So I think it is hard to be discerning when you
just have so much money. So that has been surprising.
The funniest thing, the time it takes to drive from Palo Alto to
San Francisco. For somebody that has never lived here, the
variance from 30 minutes to two and a half hours takes getting
used to. And the other thing, which may not be funny, but I
think it is, is the impact of a friend of mine of 25 years,
and how that has re-created the dynamics of the venture world.
It is fun to see that level of change take place, and to see
how people with the old business model have struggled with it.
I would change, I don't think the ecosystem does a good enough
job of helping people. So at this moment in time, when money
is plentiful, just being an investor isn't enough. And so I
think there's a — I love the NEA guys, I have a bigger
purview and I think it is just — I think entrepreneurs need
something different than what they are getting every day.
Michelle Zatlyn: So you are not –.
Jeff Immelt: It is just the old institutional knowledge:
what kind of CFO do you need to go public? Do you need
somebody that comes from Wall Street? You have a good company,
I can take it public with you, like a German shepherd in a
(indiscernible).
If you have a good company, give me a call. If you don't, it is
hard. Okay? So you better have more than just a CFO. What a
good CFO does is, when you're gone, this is the person that is
going to help you run the place. They have the pieces come
together and know how to be a good operating right hand for
you. So there are certain vestiges of days gone by that
are not necessarily true in the setting that you're in, right?
Michelle Zatlyn: I will ask one more question and then we
will open it up. So if you look ahead, the Fortune 500 leaders
and the entrepreneurs: you wear a unique uniform, in a
different sense of the word. What kind of advice do you give
to the entrepreneurs and Fortune 500 leaders? And then we will
open it up for questions.
Jeff Immelt: You cannot give the government the finger,
Washington always wins. So forget the notion that you are
going to escape it; be part of that process, be a constructive
voice, know how to collaborate. In other words, I competed,
let's say, in hand-to-hand combat with Caterpillar for 35
years. We recognized that in places where we had something in
common, we could work together to set a standard. You all are
just not trained to play well together as an industry; it is
like you flunked kindergarten or something in terms of how you
work together. What would I tell industrial companies about
the future of work? Every company and
function is going to be redefined in the next five
years. I have never seen a wave of technology that is more profound than the waves
of technology that are intersecting with the world
today, and you better have a micro strategy, not a PowerPoint
but a micro strategy, of how to deal with it.
>>SPEAKER: Any questions?
Speaker: I would like to ask
about the fusion of innovation into health care. There's a
half century of research showing the medical costs of
untreated psychological issues in patients. And yet, that
hasn't really happened. In fact, medical payors are trying to
cram down the idea of paying for behavioral health care or
psychological health care, and that is an opportunity for
lowering costs if we invest in it. How can we change that?
Jeff Immelt: The whole health care
payor technology mix is antiquated. The first thing you have
to disassemble a little bit is: who knows the patient best,
how the patient views themselves, and how an employer would
value things like that. It has to be re-done, and not through
the prism of UnitedHealth, or Anthem, or anything like that.
The fact the payor is so disconnected from cause and
effect is a challenge. And, to get a little wonky, the cancer
revolution was in the '70s and '80s, and the cardiac
revolution was in the '90s and 2000s. The brain is the most
understudied; the least is known about it. And so this whole
notion of psychological impact, of aging diseases: the next 10
or 20 years of health care is about the study of the brain,
emotions, psychiatry, and how you manage the pain associated
with them. That's where the health care dollars are going to
go, and we are very early still in those days. Aging diseases,
PTSD, psychology, psychiatry, these are still emerging areas,
in my mind. (Off-mic comments.)
Speaker: Can you continue to operate –.
Jeff Immelt: I did not have a chance to read the article.
And, as I said earlier, this is a huge enterprise risk in terms
of the context of our supply chain and things like that.
And beyond that, I — this is complicated today with trade.
But let's assume that we solved all the issues with who we trade
with and things like that. The one thing the government should
be working with industry on is a unified face as it pertains to cyber. We should
collectively give our experiences, work with the U.S.
government, position ourselves in the unified way as we work
with our trade partners around the world, China and otherwise,
and try to find ways that these things are spotted, they get
called out, and they get adjudicated in public.
If you are running a multi-national company, you
should not need the U.S. government to help you sell in
China, or in Australia and Brazil. If you need that, give
me a call. We need a collective, transparent view as
it pertains to security capability and security
technology and what happens. And that just hasn't happened
yet. Beyond that, yeah, it was just a brief editorial
reference, and I wasn't given the whole context. If you put
yourself — if you are in this room today and you are 32 years
old, can you imagine a world
where the two biggest economies in the world, the two biggest
militaries in the world, are not in some way trading,
working, in some contextual way? I think it is hard to — so we
should face these problems transparently and if, in fact,
it is true, we will find a way to get through it.
Michelle Zatlyn: It was an honor to have you. Thanks so
much, Jeff. Thank you, everyone.
Doug Kramer: I'm the company's general counsel, and we will
have a
great new conversation with Julius Genachowski. To give
some background: he worked on internet and media properties
for a number of years, and in 2009 was snatched up by
President Obama to be chair of the Federal Communications
Commission, where he laid the groundwork for what they do
today. Wired Magazine named them a top-7 innovator, which is
not the easiest thing for a government agency. And, what I
love, they worked with FEMA to set up a mobile emergency alert
system; we will see if anything will come of that. And I will
get to that in a bit. He is now with The Carlyle Group as a
principal and managing director in media and things like that.
So Julius, thank you for being
here.
Julius Genachowski: Thank you.
Doug Kramer: So we will tackle what the U.S. is going to do
about technology. So, the good news and the bad news: we have
found the one area where both parties in Washington, D.C. can
agree. And the bad news is that they both think there should
be significant regulation of the internet, and they have all
sorts of ideas of how that is going to happen.
So the first thing I want to talk to you about, Julius: given
the experience and sense you had, what do you think that U.S.
regulators, Congress, the agencies, all of them, are setting
up to do when it comes to the internet, out of all of this
activity?
question. It is great to be here. I have watched Cloudflare
for a number of years, and got to know Matthew and Michelle in
the beginning.
to the first of the internet summits, my answer to this
question would have been very different. Matthew kept on scheduling these on Jewish
holidays. In that time, the shift in
Washington has been clear and stark. When I was on the Obama
campaign, when I was at the FCC at the first Cloudflare internet
summit, there was a consensus in Washington on this stuff and it
was pro-Silicon Valley and pro-tech and people that
disagreed with each other would try to go to the other side and
paint them as anti-Silicon Valley and anti-tech. And the
world has shifted, you see leaders of both parties talking
about the need to regulate tech. So what happened? Nothing
secret, really. The manipulation of the election has
been a factor, all the cyber and
privacy breaches are a factor, and parents being worried about
their kids being addicted to a mobile device is a factor, and
the growing size and success of the most successful internet
companies has been a factor. And, you know, you can tell a lot
by how people use language. For a long time now, people have
referred to Facebook and Amazon and Google and Netflix as the
digital giants.
If you are in a regulatory environment and you are referred
to as a giant, there is probably another side to that coin.
Doug Kramer: When you make that switch to being a giant, or
something that raises those concerns, in this industry, when
you have on the one hand, you know, content or transmission
or telecom and internet and tech companies, we are seeing
combinations that blur some of the lines. How would you think
about those combinations and what was problematic for you all
and to what extent do you think that Washington is on the same
path?
Julius Genachowski: I'm a student of history, though not as
much as Richard Tedlow. If you go back decades, there's a
recurring cycle that happens again and again. A new entrant
comes along and tries to intrude on the turf of the
established players, who use the leverage of government to
help restrain innovation and competition. It is a little
sloppy, but the government has been good at landing on the
side of new innovations. So think about telephone
service. Obviously you started with a monopoly, we broke that
up into regional monopolies, and some innovators
said we can deliver telephone services wirelessly and mobile.
And the incumbent said, maybe, but you should give us a couple
of licenses in markets to let us do it. And the first thing
the FCC did was just that: they let the incumbents do it, and
that did not lead to rapid innovation in mobile. Then the FCC
got smart: we will issue more licenses to get mobile service
out there, and we're not going to let the incumbents get the
licenses. Very important development.
And similarly, in that area, some innovators developed the
ability to use spectrum in an unlicensed manner. Without
getting technical, there are two basic models for how spectrum
is put on the market. Licensed for exclusive use, which is how
AT&T, Sprint, and Verizon get their licenses, and unlicensed,
a different model where nobody has exclusivity and innovators
can do what they want; it gives you Bluetooth, WiFi, and a
series of innovations. And the story was the same: when that
technology came along, if you predicted that the established
telcos would try to stop the FCC from authorizing its use, you
would have been right. But the FCC did authorize a significant
amount of unlicensed use. I will do a couple more of these.
Broadcast TV, on the media side: we grew up with broadcast TV,
and the
pioneers of the cable industry came along and said, we have a
crazy idea: we will string coax cable around the country and
you will have more choices for television. Massive capital, a
lot of risk, interesting technology, and they went ahead and
started doing that. And with the same theme, the broadcast
industry tried to slow them down at the FCC, in Congress, and
with the local franchise authorities. The FCC landed on the
side of the new cable entrants and said it is good for
innovation and competition. And the cable infrastructure,
which the FCC
helped promote to get out there, was designed for TV and
eventually became our core infrastructure for internet.
And so it was a nice, unintended consequence.
But cable has gotten larger and the next generation of
innovators came along and said, we can use this internet to
compete with the traditional TV industry and to do other things.
Well, you know, if you knew this history, we shouldn't be
shocked to discover that the cable companies and broadband
providers would push back; that is what the net neutrality
debate is about. With net neutrality, the government is saying
that there are artificial blocks to a new entrant, a new
innovator or competitor, succeeding, and we should make sure
that they have a fair chance to compete in the marketplace.
Doug Kramer: If you have the ISP providers on the one hand who
are at odds with the tech giants, and those that are
innovative behind them, what is the next wave? Is there enough
oxygen and space where the tech giants — as they get big, do
they think there is enough room and opportunity for fuel for new
disruptive innovations and where does that come from?
Julius Genachowski: I used to be more of an optimist, and
now, looking at Washington — there's a couple of points I need
to make.
If you go back 30 years and look at those who are answering these
questions, the next great innovation was something
unanticipated. If you are a government regulator looking at
government policy, the question is: do we have the right
infrastructure where innovators can potentially develop
things? That gets you to universal, high-speed broadband and
wireless. And it is an interesting time in history because of
the emergence of more than just one or two large,
well-capitalized, innovative technology companies, each doing
something where they have an advantage, and each more and more
competing with the others. So you have Google, Facebook, and
Apple, etc. If you take a step back in a broader way, you have
Alibaba, Amazon, and a number of large companies, barely
constrained by their own balance sheets, that have a ton of
engineers pushing each other. I think that is
interesting. And then, in the traditional
communications area, around the world, and in the U.S., it is an
odd thing where you have — those companies are all global,
and cable companies in the U.S. are regional, and wireless
companies are national, and the world is changing in a
significant way. If the question is: is it getting harder for
entrepreneurs and innovators to enter and compete with all of
these digital giants? I think it is.
Doug Kramer: Availability of funding is not the problem. Are
there any structural tools that you are targeting and making
sure that they are getting knocked down, or never getting
stood up in the first place? Julius Genachowski: So this gets
to the first question that you asked. If you look at each of
the different large internet tech companies, they have some
very real advantages that anyone in the marketplace
understands, whether it is Google in search or Amazon with
AWS. That's a pretty significant advantage they have; the
question is, what do we do about it? And what to do about it
is hard. At the FCC, I thought about parts of the market where
there are low barriers to entry and really robust competition.
There is a lot of literature on what you do when the industry
you are looking at is a monopoly. And then you ask the
question that I asked: what if you are dealing with markets of
imperfect competition? There are innovations, challenges, some
barriers to entry. What is a set of principles that you can
bring to ensure that new entrants can come in, compete, and
develop globally? You are seeing frustration at the size and
power, but not a lot of good ideas on the competition piece of
how to handle it.
Doug Kramer: I want to switch gears to when you were
serving on the commission. So when you were there, from 2009 to
2013, there was a virtuous approach to tech and what was
going on out here, but you had the field to yourself. Whatever
the FCC worked out with the FTC and the Hill about
how the markets should look and be regulated was, by and
large, adopted around the world. And at that time, with some
exceptions, the tech giants were U.S.-based companies. That is
not the case anymore. And in the past couple of years,
we have seen predominantly in Europe and in other countries as
well, a new and very varied group of internet companies, and
those lines and the application of those regulations is blurred.
So what sort of — obviously, this brings complications. But
how do you think that that plays out? Is there some way that this
is harmonized, or is it just a race to the bottom with all the
different jurisdictions competing against each other for
their own interests? What do you see happening in that space?
Julius Genachowski: And you have the other legs of the stool:
U.S., Europe, China, and then India.
It is a very difficult set of challenges. It is impossible, in this
world, to have a set of consistent rules that apply to everyone around the world. And
one point about Europe to start this, when I took over the
FCC in 2009, the U.S. was behind Europe on mobile.
So for those of you that are old enough to remember 3G, the U.S.
was behind in 3G and Europe was ahead. And one of the big U.S.
policy goals, and I spent a lot of time on this, was that the
U.S. should lead the world in 4G LTE. And that worked out, and
you see a debate happening in 5G.
And there were some regulatory decisions made in Europe and the
U.S. that contributed to Europe being ahead in 3G.
What I did notice when I was at the FCC in 2009 to 2013, as the
American internet companies started to grow significantly
and started to push a lot of data through a lot of pipes, there was a lot of
pushback in Europe, starting with the European telecom companies
that were struggling and blaming their problems on the U.S.-based content
companies. And rather than European companies saying, oh,
our internet consumers here really seem to like this stuff,
surely there's a business model in here somewhere, they instead
tried to convince the regulators to figure out how to
push back on the American companies. And so what we're
seeing now has been talked about and debated for a long time.
And we here in the U.S., we gave the internet companies a lot of
room to maneuver. Europe took a different approach and, to your point,
has clearly put rules in place now on privacy; Europe is leading in
that area. And it may end up being a good
thing. GDPR is not perfect, but the idea that there should be
clear rules of the road in place is not a crazy idea. And it is
not a crazy idea to have different parts of the world —
governments in different parts of the world compete a little
bit on what the best regulatory approach is.
You know, is this approach to privacy, or is it a different
approach to privacy. It is not bad to have some
experimentation, and I think we are seeing some of that now, in
Europe, and privacy in California, you know, see what
happens in the marketplace, what consumers and businesses think.
So I don't think we will get to complete uniform national rules.
And so many of the businesses see the frustrations of dealing
with different local rules, wherever they operate, that
there are people in all of the different governments saying to
the extent we can have consistency, that is better.
China is a bit different. Doug Kramer: So I feel I will be
in a lot of trouble if I don't let you talk about net neutrality.
And you talked about the approach to net neutrality,
since you left, and the commission's view on that is
changing. How do you see this as someone who lived this
and playing out? Do we see it as a pendulum, one administration
is on one set of rules and then after the election they switch
to another? Is there a consensus and how do we invest long-term,
in terms of making this work well, deal with that, and where
do you see the crystal ball going? Julius Genachowski: The pendulum
is awful because it does not allow investors to invest as
they should. I wanted to put in the first net neutrality rules,
and how we did that was as important as the fact that we
did it. And there are other areas of the FCC where
pro-competition rules over time and pro-innovation rules had
been bipartisan rules. They are healthy for all the players in
the industry, and even though net neutrality in 2009 was
highly polarized, with groups on the right saying, we will do
what we want with our networks, and groups on the left — my goal was for
there to be a consensus on what the rule should be so we can
move on. And we worked really hard, and
we ended up with a set of net neutrality rules, not particularly complicated, and we
had a really broad consensus: supported by Silicon Valley and VCs, and
by larger companies like AT&T, except for one company, Verizon.
And Verizon's decision to sue is what led to the thing just not
resolving itself.
And I think something good came out of that — getting people to
come from these different industries in a room together to
try to work out zones of agreement. I think that the —
we succeeded in shifting the debate. Because the industry,
the ISPs, even Verizon, which sued, ended up agreeing with
the principles of net neutrality.
Most of them supported actual rules, not just promises that they
wouldn't do this, this, and the other thing. And the reason I
think that's important is that when
that is the social norm, number one, and, number two, when you can
reasonably predict that the rules in place will go back and
forth depending on who is in power, you end up with the
pendulum affecting the rules, but a little bit less so
the actual practices in the marketplace. Because the
companies understand that, in this case, the Democrats will
come back to power. They want a level of caution, understand the
consumers are paying attention, and they don't want to have a
backlash. So it is better just to have a consistent resolution.
I think there is a chance that, after the midterms, that there
is legislation that is adopted that is bipartisan.
But that is net neutrality. Doug Kramer: Before we get to
everybody's question, we will get to the 10-word answer, and
that is — so 2020 comes around, and we elected a democratic
president, and they say, we will get Julius back in here, to run
the whole thing. What do you do about net neutrality on day one,
what is your approach when you are back in power, in light of everything you just said?
Julius Genachowski: I would say that we are going to put rules into
effect, and I would go to the Hill, because there's a bunch of legal
problems. Not to give away my tricks, but I would announce that
the FCC will adopt a strong set of rules, to create
conditions for sensible legislation.
Doug Kramer: We will get to questions. If you have one,
raise your hand. Speaker: We have 7 billion IoT
devices, or maybe 10, it is hard to predict.
And none of them are regulated, they are connected to a common
resource. When will the FCC step up to the plate and think
about network cleanliness and resilience, which is their
responsibility since the start of the telephone network?
Julius Genachowski: You are right, with one correction: those
devices actually all have an FCC stamp on them.
Sorry? Exactly. So all of those devices go
through the FCC. I think you are raising a really big and
important issue, you have all of these devices out there that
don't meet any set of standards, as far as, you know, what are
the basic standards you want for cyber and everything else. So I
think that's right. And if I were at the FCC now, I
would certainly press whatever authority the FCC has to
try to do something about that. I'm not sure if the authority is
just limited to regulating radio frequencies, but there are all
sorts of things you can do to bring industry together to do
some standard setting. I would be on that aggressively, I think
you are right. It is so obvious, and it is a large,
important, and scary area. Doug Kramer: There's a
certification, a Good Housekeeping seal of approval; is that an option?
Julius Genachowski: I think it is an option.
Audience member: The United States has one of the lower
speeds of broadband, and we have a monopoly, or a duopoly, of
telcos, cable, and wireless, and they are engaged in rent
seeking, maximizing the amount of money that they are
receiving on their investment.
And how do you actually increase broadband speed and make it so
that net neutrality isn't an imperative, there's enough
bandwidth out there so it doesn't matter?
Julius Genachowski: It is a great question. If you had
robust competition, you wouldn't need net neutrality regulations.
We can debate whether the U.S. should get a yellow or red
light, but it should be better and stronger. I think
a lot of people are disappointed that competition in
broadband to the home didn't develop in the way that people
had hoped. You know, Verizon had FiOS, and they pulled back
from competing with cable in some markets. There's a company that
is overbuilding cable, and I think it is the case that in
markets where there are at least two players competing, there is
better speed and service. It is a problem.
And there are some people with a great amount of hope for what
the wireless companies might be able to do with high frequency
spectrum that is being put out there.
An interesting learning for me is that a bunch of technology
truths that I was told when I was at the FCC turned out not to
be right. A lot of engineers said satellites would never be
able to have really fast, low latency, internet. And that
turned out to be wrong. And at the time, people were saying
that these high frequencies are not good for anything. That is probably
wrong, too, and they could be a solution for rural areas and dense areas. It is an important
area. Doug Kramer: I want to thank
Julius for being here today, and to thank him, on behalf of all
of us, for the texts from President Trump and everything going forward.
Thank you for everything that you do. Speaker: Welcome, everybody.
I'm happy to have Roselle Safran, president of Rosint
Labs, a cybersecurity consultancy, and she is an EIR
with Political Ventures, and they invest in cyber
companies. It is cyber all the way down. And she was also the
cybersecurity operations branch chief at the Executive Office of
the President during the Obama administration. I think what
that means is running the team protecting the actual White House
from cybersecurity attacks; we will
talk about that in a minute. And this is Adrienne Porter Felt, an
engineering manager at Google Chrome, and one of the things
she specializes in is how to make hard-to-use web
technologies easier and more understandable for people. So
we will talk about those things. One of the big challenges is how
do we make security usable and how do we — because people are
the problem mostly in cybersecurity.
Adrienne Porter Felt: Yes, we're the problem. John: So let's talk about being
the branch chief. What does it mean to defend the White
House?
Roselle Safran: It was an exciting and intense experience,
every day felt like a week. And my concern is that we would have
a breach that would be front-page news.
And that didn't happen on my watch. But it happened, months
later. It was a legitimate concern. So part of the work that I did was tactical, making sure that
any incidents were handled properly and quickly, and any
vulnerabilities were addressed properly and quickly. And then
also I was trying to divide my team's time between that and strategic
efforts to improve operations all around.
When you're working as a defender, you know that the adversaries are upping their
game, so there's a constant pressure on your end to always
be improving. So there's the 24 by 7 shop, so
there wasn't a night where I was not at least on email, if not on
the phone, checking in with the team. And during the government
shutdown, I was on the night shift. Not a big fan of that
shift, but you do what you have to do.
It was a fantastic experience all around.
John: Roselle, you were defending the White House; Adrienne, you are
securing everyone who uses Chrome. So what is the challenge?
Adrienne Porter Felt: A challenge is that security is
technical and personal. Everyone has different threat
models, and they may not even — they don't think about their
threat models the same way that security experts do. So a
challenge is trying to figure out how to build user-facing
software that can be understood and used by billions of
different people who have very different security needs as well
as different levels of understanding. And you can't
make — it is hard enough to get 5 people in a room to get
consensus. You cannot make billions of people all happy at
once. You have to figure out how to strike the right balance
between building a product that is useable by everyone. John: Does everyone have a
strong opinion on how that should work?
Adrienne Porter Felt: We changed a certificate feature in
Chrome, and not everyone knew how to use this feature, but
the people who did all seemed to know where I lived. I was at the playground
with my baby and another parent started arguing with me about
how we took away this certificate feature. So there's a lot of challenges,
and power users want a lot of richness and
functionality and the ability to control their online experience.
But sometimes that can make it harder for other people, if
there is too much information, too many options, it can become
confusing and overwhelming. We are trying to strike a balance
between giving power users the function functionality that
they want and need and being clear and simple and usable for
everyone else. John: There are parallels here,
you have people at the White House with different levels of
skills and probably very opinionated. How do you help
them secure themselves and the White House?
Roselle Safran: As a security professional, you are trying to
cover your bases for a strong security posture while still
enabling the people at the organization to do their jobs
effectively. So it is a constant give-and-take. There is,
though, an element of educating the user:
the value of adding security controls, and the risk
associated with not having them, is far greater than the little
operational hiccups they cause. So this is an issue that I see
across the board. And if you were to tell someone
that you need to make sure that you secure your car, the person
would say, yes, I lock my car, I have the alarm. And if you had
a business owner and you said, hey, you need to secure your
business, then the business owner would say, of course. I
have locks on the doors; if it is a larger business, they have
security guards and surveillance. When it comes to
an individual, though, and you say, you need to secure your
personal data, the value of doing that is less clear. And
the answer is, sure, if it is not going to cause me inconvenience.
And for the businesses, it is, well, yes, I will secure my
business to the extent that I don't have to pay too much for
it. And that is really problematic,
especially for small- and medium-sized businesses. When
small- and medium-sized businesses have a
security breach, 60 percent of them go out of business six
months later because of that. So there's an existential risk
to not having security controls in place. But that is — it is
often difficult to articulate to some people, whether they are
individuals or businesses. And I think, as security
professionals, we need to do a better job of making that clear.
It is a little more difficult to convey that message, but it is
something that has to be made clear. There's a little bit of
give-and-take. And while security teams can do their best
to minimize the amount of disruption and interruption with
security controls, there is going to be a little bit of an
issue on occasion, depending on what the control is and how it
is being implemented.
John: And there's a nudge for people to think about how they
secure websites, and you are trying to change people's behavior, especially in
the UI.
Adrienne Porter Felt: For the web's first 15 years or so, traffic went over http, which is
visible to everyone on the network. And there's a
cross-industry push to move the web to https. And when we
talked about this a few years ago, we did not think that people
understood that when they used an http website, their information
is not protected. We wanted to tell people that your connection
is not secure, your data is not secure. It turns out that would
have freaked people out if you had shown that at the time on, you
know, something like 70 percent of all page loads.
And over time, as the amount of https traffic has risen, we
flipped the paradigm, instead of rewarding websites for using
https, we told people when a website isn't secure and assumed
that a secure https website is the default. And today, it is,
and people should not have to check for a word that says secure,
or whether the internet traffic is going over https.
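The paradigm flip described here ultimately hinges on the URL's scheme. As a minimal sketch, not Chrome's actual implementation, a hypothetical `connection_warning` helper using only Python's standard library shows the logic: https is the unremarkable default, and plaintext http gets flagged.

```python
from urllib.parse import urlparse

def connection_warning(url: str) -> str:
    """Treat https as the silent default and flag plaintext http.

    A hypothetical helper sketching the flipped paradigm; real
    browsers also consider certificates, mixed content, and more.
    """
    scheme = urlparse(url).scheme
    if scheme == "https":
        return ""            # secure is the default: no badge
    if scheme == "http":
        return "Not secure"  # readable by anyone on the network path
    return "Unknown scheme"

print(connection_warning("http://example.com/login"))   # Not secure
print(connection_warning("https://example.com/login"))  # (empty string)
```

The design choice mirrors the talk: reward nothing, warn on the exception, so the common case stays quiet.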
So now what we're doing is we are telling people more and more
aggressively when their information is not secure, so
they will notice it, or maybe that website has an HTTPS
version they can switch to. John: Do you measure that, or do
people understand? Adrienne Porter Felt: That's a
good question; we have a few research methods.
We have in-browser telemetry, so we can see in
aggregate how people are responding to the changes.
And we read the help forums; my team is on rotation
to answer people in the help forums, and that's a good way to connect
with users. And we do see that people noticed this change. We
have to strike a balance with how severe the warnings are,
sometimes they see "not secure" in red, and they might turn off
the computer and walk away. You can scare somebody too much. We
wanted it to be noticeable while trying to avoid negative
reactions. We do this kind of testing, and
we will proactively do a lot of research before launching.
John: My mother got a certificate warning on her PC,
and she shut down the machine, called me, and switched to using
her iPad, which does not give her the warning. It turned out
the battery was dead in the PC, so it thought it was 1970, and her
reaction was that she had been hacked. And now you go from the White
House to consultancy. What is the
overlap between the threats, and what did you learn from your experience in government? Roselle Safran: Often the same
actors are attacking private companies and governments,
especially when you are looking at nation state actors. They
could be stealing government property, understanding policy
and how it relates to that country, any type of information
that they can use to further their agenda as a nation.
And so certainly government has that information, but many
companies have that as well. When you look at a defense
contractor, or a large financial company. So sometimes you end
up with the same types of threats and it becomes critical
that threat intelligence is shared amongst the communities
as well as possible. So before I was at the executive office of
the president, I was at the department of homeland security,
and part of its mission was to improve the posture of agencies
and critical infrastructure. There was a flow of information
both ways. Often you find attackers use the same
techniques, regardless of who they are going after. They use
the same malware and attack vectors. If that information is
shared across different entities, then it works to
immunize the larger community once one person is hit and can spread
that information quickly. John: How do they think about
that threat? How do we people think about what they should
worry about? Adrienne Porter Felt: It could
be thinking about the kind of attacker that is going for
financial gain, like your bank account or your identity. And I
had people come to me and say, I'm not worried about that. I
have a thousand dollars in my bank account, nobody wants that.
But the reality is, a thousand dollars in U.S. dollars can go a
long way in other countries. So your thousand dollars may seem
like chump change to you, but it is valuable in other places.
And besides that, there are many attackers that realize that
you start with the low-hanging fruit and work your way up. So
if you have a contact list with people — with CEOs of
corporations, you're the easy target in the chain. You have the stuff
they need to get to the bigger fish they are trying
to reach. And the common tactic is to go after the unsuspecting
family member, or friend, and once they have
access to that email account, they send an email,
and it looks like it is from a friend, and the person that is
the target is more likely to click on the link or malware.
And sometimes people do not realize they can be a target;
they think they are a little guy and nobody would be interested in
them. But there are many times where they are a bigger target
and more interesting than they think they are.
John: Did you have to explain this to President Obama and did
he have a good password? Roselle Safran: I don't talk
about anything related to –. Adrienne Porter Felt: I hope
she doesn't know about the password.
John: I want to talk about something technical, and the
issue with urls and web browsers. As an
engineer, you are saying that there's a real problem in
interpreting that string in the browser.
Adrienne Porter Felt: You know what a url is, you know the
difference between a path and a subdomain, you understand maybe
how to modify a query string. If you are on the New York Times,
you know that deleting part of the path is going to get you
something different. And urls are a powerful tool for
those who know how to read and use them. They can also help
you identify phishing attacks if you know to look at them, and
what to look for when you look at the url.
What we see through user research is that people who didn't grow
up with the desktop notion of file structures do not have the
ability to read or mentally parse urls, or to know whether they
should trust them or not. I think one of
blog.google and google.blog is a real website, and one
isn't. And that is confusing to people. We are simplifying the
ways that urls are displayed while giving power users the
information and control they want but, at the same time,
making it easier for people who are not power users to avoid
phishing, and to understand what website they are on.
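The parsing skill described above, separating the hostname (which decides what site you reach) from the path (which only picks a page on it), can be sketched with Python's standard library; `hostname_of` and the lookalike hostname below are made up for illustration.

```python
from urllib.parse import urlparse

def hostname_of(url: str) -> str:
    """Return the part of a url that decides which site you reach."""
    return urlparse(url).hostname

# blog.google is a real site on the .google top-level domain.
print(hostname_of("https://blog.google/products/chrome/"))
# -> blog.google

# A lookalike (made-up hostname): it merely *starts* with "google";
# the hostname, read right to left, is a different site entirely.
print(hostname_of("https://google.blog-login-example.com/account"))
# -> google.blog-login-example.com

# Deleting part of the path keeps you on the same site:
print(hostname_of("https://www.nytimes.com/section/technology"))
# -> www.nytimes.com
```

Note that deciding in general whether a hostname belongs to a given organization requires the Public Suffix List; naive substring checks are exactly what phishing hostnames exploit.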
And it is hard to make both of those groups happy to be honest,
but we are trying. And we are trying to understand
how we can build a UI that meets the needs of both of these groups.
John: Are we close? Adrienne Porter Felt: So we are
in the process of making small changes, but we are in the
information gathering and research space right now of this
problem. People expect that, when we are
talking about this publicly, we would have an answer.
The reason we are having this conversation right
now is that we want to hear from people what their ideas
are. I love to work with professors and graduate students
that are doing research that is helpful to us. We are trying to
get ideas from people on how to tackle this problem.
John: What is interesting in what you are doing is that there
is an underlying emotion: empathy
for the user. And I think often in security, empathy is lacking.
It is like a, do this thing, you have to use a good password.
How can we get empathy into the security space, so that security
people understand that, for users to get on board, they have to
understand the users' point of view? Roselle Safran: That's a
challenge of being a chief information security officer
today, you have to convey the risk and why we are saying what
we are saying and requesting what we are requesting. And the
flip side of it is that, as a security professional, you have
to understand that our function is to make the environment the
optimal environment for getting work done.
And there's a little give and take that is required on both
sides. But I think that the security professionals sometimes
look, or are seen, as the stumbling block, the ones that
are always saying no. And there's more that can be
done to be creative about solutions so that, at the end of
the day, both sides may have to do some compromising, but are
happy with the outcome. And part of that comes
with having these conversations. If you are on the security team,
it is easy to just be silo'd with your other security team
members and think that the users are just up to craziness again.
And having the conversation and understanding what the users are
trying to do and why they are trying to use it and why they
need something a certain way, it makes a big difference in
understanding how you can reach that compromise.
Adrienne Porter Felt: People have two reactions in the
security community. One is, we need to teach people to do
this. If you find yourself thinking, we need to teach
people to do this, think long and hard about whether that is a
problem with them not knowing enough, or a problem with a
software being too hard to use. And the second is when you see
user pain, like if you are looking in a help forum or
watching a user video, it is tempting to think that user is
just dumb, and other people get it. Don't think that. Your
users are not dumb; they are often very smart people, and sometimes
smarter than you. You have to keep those instincts in check and
sometimes it can be hard, but remembering that the problem is
your own software and being open to critical feedback, that your
software is hard to use, rather than jumping and thinking it is
the problem with the person using your software.
John: We want to give the audience a chance to ask
questions. You two have had a lot of experience in real security
situations. What are the common mistakes that people make when
they think about security, so we can help them not make those mistakes?
Roselle Safran: Plugging in thumb drives you don't know
anything about. Sometimes they give them out in gift bags at conferences and
whatnot. But there are some threat actors that will drop
thumb drives in parking lots, because they know somebody will
plug one in, and then you have an infected network.
That's a common one. And the main attack vectors are
often email and websites. So a person receives an email, it
looks legit, they click on a link or an attachment. And from
my perspective, I want the user to have to do as little scrutiny
as possible, which gets to what you were saying about whether they
need to parse out the url or not. But the reality is that
you can have many lines of defense, you can have a robust
email gateway, and there will be potentially something that slips
through the cracks and it gets to the end user, so the end user
has to have the ability to discern what looks suspicious or phishy,
and there are companies that focus on security awareness for
end users. So having an eye for what you are clicking on, or
business email compromises are becoming really popular, where
somebody in the finance department receives an email, it
looks like it is from the CEO: you have to wire $50,000
immediately. Not having a process beyond saying, okay,
where do I wire it, that is another issue that is a common
attack vector. And, yeah, installing antivirus: a lot of
people think that it will slow down their computer, but it is
worth doing, even on Macs. Those are
common attack vectors, and ways that people can take action
and improve their security posture.
John: Do you see people do things with Chrome that make you
pull your hair out? Adrienne Porter Felt: For a long
time, I was too proud to use password manage: I ment. I dint want a single
point of failure, and I created a new password and was locked
out of every accountism . I had a single point of failure
because the way I re-set them was with my Google account and
email. And now I use a password manager
through Chrome. You don't have to use Chrome's password
manager, you can use another one. And it is easier to have
many unique and different passwords that are remembered for me,
and I am not locked out of my accounts.
I am trying to encourage other people that are power users, even though it
is weird to feel like you are giving up control.
John: I had a scheme, a whole thing, and now I have a password
manager. We will get to questions, right
down in the front. Audience member: When I'm on my
desktop, I can hover my mouse over a url and I can look
at it and say, that's not what I want.
How do I do the equivalent on a portable device, like an iPad,
an iPhone, or a Google phone? I cannot see that, there is no place to
hover my finger. I cannot figure out the url, except by
clicking on it, which is too late.
Adrienne Porter Felt: So the concern is that you are hitting a
phishing site. We think that people look at the url after
they open the website, if at all.
The challenge — I mean, I will be honest. Realistically speaking, I don't
think people look at where the link leads to before they click on it.
John: I think there's a question hiding behind the pillar.
Audience member: I thought it was interesting when you
talked about the importance of not freaking people out, and it
was suggested that young people who are used to the internet go
too far the other way: they are naïve about security and they
think, it is fine. It is not an issue. Do you think that the
way that your companies and general security, the approach
will change over time based on the different level of user
experience of the internet, and that level of alarm that you
want to impart to people is going to change?
Roselle Safran: I thought by now there would be a general
understanding that cybersecurity is a concern and needs attention.
And as ransomware becomes more prevalent, then the threat is
tangible in a way. People can see that their
machine is — they make a connection between something in
the cyber realm and something affecting them personally. I thought
there would be an uptick by now in awareness that
cybersecurity needs to be considered, but I'm biased in
all of this. So I think there is definitely
still a way to go, part of my concern with the mentality of,
well, if the default is that it is good, it is not a problem
unless I hear otherwise, is that that can work for the Googles
and the Apples of the world that are paying attention to
cybersecurity. But if you look at some of these IoT devices,
they are not considering cybersecurity at all. They are
getting a product out to market and cybersecurity may be tacked
on after, or not even included in the technology roadmap. So
having this view that the default is that it is not a problem then
becomes even more of an issue, because it really is one.
Adrienne Porter Felt: One thing that people find surprising is
that — there's a perception that if you see a warning,
people are going to click through it. But most people are
actually conservative about it. Less than 20 percent of people
click through certificate warnings, and less than 10
percent click through phishing and malware warnings. I'm not
aware of generational effects or if younger people are more
likely to proceed through warnings, but it is an
interesting question. John: We are 10 seconds away
from finishing the panel, thank you to Adrienne and Roselle.
Live captioning by Lindsay @stoker_lindsay at White Coat
Captioning @whitecoatcapxg. Our next panel is about kids and
the internet, with Sara DeWitt, of PBS Kids, and Jill Murphy,
from Common Sense Media, which provides tech protections to
children. And Julius Genachowski mentioned
the backlash against large tech companies today is driven by its
effect on children. So Sara, we talk about the free and open
internet. That's an area where regulation, by parents, is
required. So how should parents think
about what kinds of content their kids should have access
to? Sara DeWitt: There are so many
ways to look at this. One thing that we should all
keep in mind is that the idea that tech is bad
for kids, the anti-screentime movement, Jill and I were
talking about this back there. It is a very upper-middle-class
issue. We do a lot of our testing with kids from Title I
schools, and those parents are desperate to get technology into
their kids' lives. So this is an interesting debate that is going
on all the time. So back to your question of how do you
choose the best content? Kids look at technology as play.
This is a game, this is something fun to do. And so we
always think, as parents, about what your kids like to play and
what you are hoping they can learn from the experience.
And so you want to look for something that kind of meets
kids where they are, if they are excited about dinosaurs, you
want to find a good dinosaur app for them to play with, read the
reviews, and who is publishing the content. And you want to
think about how technology fits into your child's life. You
want to think about what times of day are the times that kids
are going to play, and make it part of their schedule. You are
going to be able to play with the iPad on Tuesday nights while
I'm making dinner. And then after, we're going to talk about
what they did and I want you to show me what they played with.
And what we say to parents a lot is that this thing with raising
kids to be healthy with the technology they are using is not
to just hand them the device whenever you need them to be
quiet, but to think about what are the times of day and what
are the times in the family's life whether it is going to make
the most sense for your child to be able to play with technology?
Alex: We talked about being online and offline and how
technology is better when you combine the experience.
Sara DeWitt: There is interesting research out there,
one from Texas Tech, that is looking at the combination of
screen time and conversation. And they are looking at a PBS
kids show, this wasn't a study we commissioned, but it was one
of our shows, Mr. Rogers' Neighborhood — sorry, Daniel
Tiger's Neighborhood, a spin-off of Mr. Rogers'. And they found
that kids who watched the show, it is deepening their empathy.
But the gains are greater if the parent talked to the kid about
what they watched. And they tested parents talking to their
kids about empathy, and that had some gains, but not as great
as the combination of the two. Letting kids get into that world
and having the characters as context to talk about increased
the benefits of technology. So talk to kids about how they
watch and play. Alex: So Common Sense is a great
organization, it has been around for 15 years.
Jill Murphy: I'm sure that everyone is very familiar with
it and signing up for our newsletter as we speak. We're a
non-profit organization and I oversee the ratings and reviews,
we review everything from books, games, movies, TV shows, apps,
from a parent's perspective and child development perspective.
Everything from 2 to 18, we are looking from Elmo to The
Hangover, and we know that media is aging down dramatically, so
kids that are 8 are interested in the latest Marvel movie that
is out. We are conscious of the media that kids are absorbing,
and are ready for, and at what age. And we do parenting advice as
well; we take a lot of that content, we have best-of lists,
we put it into articles, like 50 books every kid should read
before they are 12 so parents and kids can make a selection
with the content that is out there.
Alex: So there's a trend on the positive side of technology
and on the negative. Are you seeing trends that are very
positive and on the flip side, if you take both sides?
Jill Murphy: We sit on both sides of too much screentime,
and we are all handing our phones and tablets to our kids.
And kids are winning in this game when it comes to TV. There
is so much TV out there, if anything, there's too much and
it is hard to make a decision. So that has been one positive
spin. I think that, on that front, it varies from the very
young ages, preschool content is strong right now, PBS and all
the way up. So that's a great thing that we have seen. And
reading has been back on the rise, so that's another positive
spin. And so, again, we cover all different types of media.
When it comes to technology, social media, 0-8 is
when the parents are in the driver's seat, and 9 and up is
damage control. Parents come up and say, what is the right age
for a cell phone, my kid is on social media, are they too young
for it, their friends are on it, it is all — we are making these
decisions on the fly, and so we are constantly encouraging
parents to take a beat, take a minute when it comes to making
those decisions, don't make them quickly. Oftentimes we hand
over a phone: we bought a new one, we will give the old one to
the kids, and the next thing you know they are exposed to content
they are not ready for, rather than easing into it.
Alex: In a report you published about the effects of social
media, it was on teens, and it was fascinating as well. Some
people expected it to say, oh, it is horrible and terrible, but
yours was balanced.
Jill Murphy: We are finding that it is positive, or a
balanced approach. We are the ones that are freaking out,
the kids are feeling like, I know these are my friends
online, but they are at school. They don't make a distinction
between this is my social media life and offline life. That
distinction does not exist. Bullying does not just happen
online and not on the playground; this is seamless for them.
And teens are sophisticated about the ways
they navigate social media, self-managing their own habits,
multi-tasking, and that is not all teens. But I would also say
that one thing that has come up again and again in the last few
years of our research is parent modeling, which I'm guilty of
myself. But being a role model and trying to show your
kids the way that you want them to behave with their media,
whether it is at the dinner table, you know, not jumping for
a notification, that has a lot of impact on teens.
Sara DeWitt: Have you followed the idea of narrating what you
are doing on your phone? So if you think about, when we were
kids, if you were invited to a birthday party, you got an
invitation in the mail or at school. You had a physical
thing, and you would go with your parents to get the gift,
RSVP, you looked at the map together and you would go to the
birthday party. And now it all can happen by the parent on the
phone. And the kid does not see any of it, doesn't know that now
my mom is ordering the gift online, and now looking up where
we're going to go. And the idea of saying to the child, I'm
looking at my email because you received an invitation to a
birthday party. And just including kids in what is
happening on the device is a great idea.
Jill Murphy: And this is the mantra to parents, talk to your
kids, talk to your kids. And when it is showtime, now I have
to engage with my kids. And this is my shameless plug, we
have conversation starters with all of our ratings
and reviews. It feels daunting to take on this task and it is
easier to hand over a device. There's something that
kids do that I don't come home and ask them about. So engaging
with the same framing, they are doing something online, they are
watching a kid unbox something on YouTube, or show a room tour,
which drives me crazy. My kids love those.
And asking why do you like that, those are essentially
commercials in my opinion. And having them understand what they
just watched, why they are watching it, is it real or fake,
advertising in that, are those kids getting paid, they are all
teachable moments and you don't have to take all of it on at
once, but using a little bit of it as leverage for conversation can be valuable.
Sara DeWitt: Instead of asking about reading and math, if
you ask what was on the show, kids love to talk about
media. They will go on and on about a show, at least my kids.
And a mom last week I met on the edge of a soccer field, she is a
learning specialist. She was clear: I don't let them watch much
screen time, but when I let my kids watch, after they watch, I
have paper and crayons out for them to draw what they watched. So
the next Saturday, I tried this. So they were watching a show and
when they were done, I was like, guys, come to the table, I have
crayons and paper out, and draw what is your favorite moment.
They did it. And I don't think that works all the time.
Jill Murphy: You still hate that mom for telling you, but
that was a good idea. Alex: And talking about the
content that kids are getting: a lot of it is made by
for-profit companies, and they are motivated by
different metrics that are not necessarily tied to what is best
for kids. How do you work through that?
Jill Murphy: The majority of content for kids is in app
stores, on Google Play, or Amazon. And the success metrics
for those stores are what drives revenue is time spent, how long
you are in the app, how many ads you see, how many in-app
purchases you download. And this is trickling into the kid
space. It is hard for a kid-content product to break
through without it being successful on this revenue side
and that revenue, those success metrics are not actually good
for kids. This is a real — it is a real problem to be
designing something that is just sucking kids in for long periods
of time. If you watch different games for kids, you can see, at
the end of every game level, it hooks you into the next one. It is a problem for
kids that don't have the executive functioning skills.
So this is an area that needs to be disrupted, we need
models that are focused on how do we use this technology to
help kids learn and to help them gain some, like, pro-social
knowledge to understand how to put it down? And it is possible
to do those things with the technology. But right now,
that's not being rewarded. There are people that are
trying, but they cannot break through.
Sara DeWitt: These are challenges we hear from parents,
my child has a temper tantrum when I told him to turn off the
iPad. It is a constant battle, and it doesn't stop. Using
the technology in a way that sets up boundaries, with purpose,
not just for the sake of it being the right thing to do, but as
a teachable moment in the platform, is valuable, and it will
start to eliminate this generation of parents that are hovering
over our kids, setting a timer. One of the biggest increases in
questions that has come our way is about parental controls. They
just say, I have to shut down the internet, I have to shut it
down. I think they are doing it for their spouses, but
they are using them on their kids. But we are not teaching them, we are
not helping them get to that executive function.
Alex: Are you seeing good tools? Jill Murphy: It is not
something that we promote, third-party tools, we talk about
engaging with your kids, play online for
a half hour and a half hour outside. There's a lot of
companies out there that are creating parental controls,
and they are in the platforms you are already using. You don't
always have to get a third-party product. Your tablet likely
has controls, and YouTube has parental controls. They are not easy to
find all the time, but they are there.
Alex: And that falls to the parents, they are not regulated.
Sara DeWitt: And it will be interesting to see if that
changes anything, but the experience that I had in the
kids media production by talking to other developers and the
things that we've done as well, parents will tell us over
and over that they want these controls, and they will tell us
exactly what they want it to do, and then they don't use them.
And so many folks accept this. They like
the idea of it, and then ultimately it is a little too
complicated to do, a little too hard, and they forget to do it.
Jill Murphy: And I will add, some people need that. It
depends on what is going on with the kid. It is not just for
fun that I want to see every image my kid is texting. You
may need more insight. So it is great that they are available,
but it is not the first best place to go.
Alex: And people in the tech industry clamp down the most, or
let their kids watch the least amount of screen time and
interact with it the least, which is ironic, given that they
are the people in this space creating it.
Jill Murphy: I can't tell you how often I hear that. If you
are in the industry that is trying to build new technology,
I want you to be thinking about how your kids use it.
The truth is, kids are in a world where they are seeing the
technology in the room all the time, they are increasingly
seeing it at school, and they are over at friends'
houses and they sometimes get to play on a friend's tablet. It
is important for the parent to say, this is how we use
it in the house: we don't bring it to the dinner table, we
charge it outside the bedroom. And things like that.
And we are trying to kind of set up some healthy habits at home
early as opposed to letting kids navigate their own way. Alex: We talk about the great
things that technology can do for learning. What are good examples that you see out there?
Sara DeWitt: I don't see anything yet that is fantastic, I
think that everybody is messing with it. I think there's
incredible positive potential here. So again, I work with
kids who are between the ages of 2 and 8. When I started working
on content for kids, we were dealing with the keyboard and
the mouse. That was how the kids are
navigating. If you think about a three-year-old, clicking is a
problem, they don't know how to read and write. And the
technology is easier for them to use. The touch screen is a
revolution in the kid's learning space for sure. And thou we
have an opportunity for kids to interact and learn content just
talking and asking questions. That is pretty huge, it means
they can navigate into more information about that dinosaur.
And it is early days; Alexa and Google are not answering the right
sets of questions to get in there, and parents are terrified
about this. They should be thinking about privacy here.
And I think there's a great opportunity for us to begin to
help kids on the learning journey by allowing them to talk
and listen. Jill Murphy: We focus our
privacy work on the education space; there are so many schools
adopting technology, and data privacy is certainly a concern.
Data privacy at home is also a concern and, as we were talking,
I don't know if it has reached that critical mass yet where
parents are clear what the real issue is or what is being taken.
And so everyone is, eh, but we are moving forward with the
technology. We are looking at it a little bit more next year
for our consumer parenting site. But I would say that I would
agree. Privacy is certainly an issue that needs to be
considered and it creates challenges when you are building
a platform, and you are looking at privacy, and want to create
something — we are all expecting things to function a
certain way. We want to purchase it, swipe for it, we
want the answer quickly. But there may be a little bit of a
tradeoff when you want to guarantee certain privacy
regulations and policies with the technology that you are
getting. So I think that's a little bit of why parents or
consumers at large are like, eh. I think we are willing to hand
over our privacy in more ways than we realize for ease of use.
Alex: Interesting. I wanted to ask, as we are winding down,
one thing that folks in the audience, and I'm sure there's a
lot of parents in the audience, that they can take away from
this talk and your knowledge about how to best promote
technology and screens with your kids. What would that
be? One thing, two things, or five things. Sara DeWitt: I would say talk to
your kids. Media and technology is something that doesn't have
to happen in a black box, it is something that we talk about,
like our school day, and as our kids are older, we get into
critical thinking questions. What do we think made that game
better, or do you like how that character was treated? And
getting kids into those conversations.
And what is really important, the other thing is that there's
no one size fits all solution. I'm asked all the time, what is
the right amount of time for kids to be on a screen? There is
no answer to that. You have to know your own kid and how well
they are able to regulate and handle media.
So I have two kids; one of them would watch and play all day if I
let him have the opportunity. So I have specific limits, and
the other one is just playing for 30 minutes and then wants to
put it down and do something else.
So we have to approach it a little bit differently, you have
to know your kid to be thinking about what is going to make the
most sense for them. Jill Murphy: I would agree with
both of those things. We are talking about how everybody
wants a number associated with it; we all have our Fitbits,
our 10,000 steps, a number that we are going towards for everything. And the AAP said
zero screen time for kids under two, and now they are saying 18
months and quality is what matters.
Knowing your kid is vitally important and having
conversations with the parents of your kids' friends is so
important. It will get you to that place that we are all
dealing with this together. And that is personal and in your own
community and not something that — we struggle a lot with social
media and getting a presence online for ourselves.
So, let's have people share their cyberbullying story? Nobody
wants to share that; that is insane.
So these are personal decisions by child and family. And really
being aware of what your kid is — again, not just time, but
what are their fears, concerns, what do you want them to learn?
There's a ton of opportunities with technology and media, and
also a great way to get your kids talking, as I mentioned.
And really engaging with them about their media choices, it is
just — you can use that as a great skillbuilding opportunity.
And, you know, media literacy, I personally feel that it is at
the core of so much of what kids are going to be dealing with
moving forward. So helping them to start, from a
very early age, as young as 4 and 5, understanding commercials
and advertising, what is targeting them, and really
trying to crack open the messages that are in front of
them in a pretty simple way. It is a great way to start that
critical decision-making. Don't feel too guilty.
Alex: We have time for one or two questions from the audience.
Audience member: What do you think about kids growing up and
they are exposed to news everywhere and
they always get answers to what questions they have and not
necessarily the right answers? Jill Murphy: We prioritize media
literacy, we do a curriculum with schools, and we incorporated
news literacy into our curriculum as well.
That is important. The old
parenting advice was to turn off the news when the kids are in
the room. They are getting notifications on their phone, or
they are hearing about something on the playground. So being
ready to have conversations. And unfortunately, it does fall
back on the parents to be ready to have conversations with their
kids and get a good sense of what they are exposed to.
And, of course, protecting our kids. And that
train left the station as far as notifications are concerned. So
arming yourself a little bit and being comfortable, and I also
think culturally in this country, we're fearful of the
news and the news is fearful. And kids much younger — young
kids in other countries are much more —
and we are trying to keep our kids protected. So opening up
that dialogue early on is important.
Audience member: Thank you, I use Common Sense Media and PBS
kids. I have a preschooler. And my question is on parental
controls, I wanted to give my kids a choice, but do you have
recommendations of how to curate — I want to give them a safe
playground where they can choose content. Once they have
internet connection, they can get access to everything. So
how do you create a curated space for them?
Sara DeWitt: Are you asking about YouTube?
Audience member: That is one of the places. But if you go to
Amazon, and on the television, you go to one screen, the next
show comes up and you scroll through everything and you don't
necessarily know — I would love to be in front of the
television, but you cannot always be monitoring everything.
Sara DeWitt: You can only watch
so many episodes of Paw Patrol. But Daniel Tiger I could
watch all day. And on YouTube, subscribing to
channels is a great place to go. YouTube kids is available for
that experience. You can go in and block channels, they have
given you control as the user to block things if you are not
interested in, like unboxing videos or toy videos.
Those are two examples. And there are other platforms out
there, there are other apps that have created this for you.
And those are a few things off the top of my head.
Alex: We are out of time. Thank you to Jill and Sara for a
great discussion. [ Applause ] >> All right, I have with me Dr.
Dan Boneh, who started the Center for Blockchain Research.
We will talk about that, that is something that I'm mildly
suspicious about. And before we get started, on
that blockchain: when I was growing up, the word hacker
meant somebody who did something clever with computers or
electronics, and it veered into meaning somebody who broke
into things.
And what happened to the word crypto?
Dan Boneh: So crypto, in this audience, meant cryptography,
that is the science of protecting your information and
more. And the word crypto morphed into the word
cryptocurrency. When you use the word crypto now, you have to
clarify whether you mean cryptography or cryptocurrency. Crypto
means cryptography, but blockchains are an exciting
area. They're an exciting
application of cryptography, and we can use the word blockchain
instead of cryptocurrency to be clear about what we are discussing.
We started the Center for Blockchain Research because I'm
excited about the science behind blockchains; for researchers in
cryptography like me, every time I ask questions,
I walk away with more questions, and we are excited about the applications for blockchains.
John: So you said that there's a great interest in security and
education. Dan Boneh: Yes, if I were to ask
you, what is the most popular area in computer science for
undergraduates? I imagine you would know the answer, it is
machine learning. It is actually the most popular area.
But security is the second most popular area among
undergraduates, and our security and crypto courses are
packed. That is well justified by the
jobs that are open in the industry; we are graduating
students in the area as quickly as we can, and all of you are hiring.
John: We are getting machines more and more involved.
And there are some areas that you are doing work on. I was interested
in the idea of aggregating information, and how you deal
with information in a secure way.
Dan Boneh: So that's an interesting area to talk about.
So cryptography is the science of protecting information, so
how do we protect information? It goes way beyond just
encryption, which is what the public thinks of. And I want to
think of one example of how we use cryptography for more
than just encrypting data: it is aggregating data. And there's a
lot of companies that are interested in the same problem,
which is the following. They put their product out there and
they want to understand how their customers and users use
those products. So today the way that is often
done is you send telemetry back to the headquarters and you
build statistics based on telemetry. So this is built in
many verticals. Car manufacturers want to know how
customers use their cars, what features of the radio did they
use, did they use the windows and so on. What features did
they use, how did they drive the cars; all of that information is
collected from the car to build usage statistics.
Cell phone companies want to know how
customers use their phones, browser vendors want to know how people use their
browsers, and so on. And this is done with telemetry, and
there's a huge issue of privacy in collecting information about
customers' use. So say
you're a browser vendor, or you're a cell phone provider and you want
to know how many of your cell phones are infected with a
particular malware. So we would have the phones report back
whether they are infected or not and you would aggregate the
information overall of the data. All that you are interested in
is how many people are infected. You are not interested in who is
infected. And by collecting telemetry in a
naïve way, you learn more than you wanted to learn. And the
best way not to lose your customer data is to not collect
it in the first place. This is a good mantra to live by. And how
do we aggregate information from the customers without learning
that data, how do we learn statistics about that data
without learning about the data. That's where cryptography can
help a lot. We built a system, Prio, that
does exactly that. You can figure out how many people are
infected with a particular malware, how many people
have their home page set to a particular page, and how many people are on the Bay
Bridge. All of that information you collect in aggregate form
without learning about the underlying data. That's what
the system allows you to do. And I have to say, it is
fascinating, and this is getting deployed now, and you will be
hearing about it soon; there's a lot of interest in this from many
different verticals. And the
way that Prio works, the data is sent in secret-shared form. And you
can think of it that the user sends the information to the
cloud, and the cloud does not really get to see the
information in the clear. And nevertheless, it is able to
aggregate the information from all of the customers, and once
everybody has contributed their data, the aggregate
becomes available. There's an interesting problem
that comes up, since I cannot see the original contribution,
who is to say that you are not sending me junk data? Data that
is way out of bounds. Say I'm a maps
provider, and you want to know how many people are on the Bay
Bridge; you might have an incentive to
report an unusually large number from your phone, so
the system thinks that there's a large number of cars on the Bay
Bridge. And so it is important to make sure that the data that
people contribute is, in fact, in the right range,
so people are not contributing junk data just to throw off
the computation. And how do we do that?
So the cloud provider does not see the data in the clear, and
they need to make sure that the data is in range.
So this is exactly where a beautiful application of
cryptography comes in, an area called zero-knowledge proofs.
And you can very efficiently convince the cloud provider that
you are sending valid data without telling the cloud
provider what the data is. That's the magical aspect of
cryptography that everybody should be aware of: it
is possible for someone to send you data without you learning
what the data is, while convincing you that the data has
certain properties. So whether I'm on the Bay Bridge
or not, I will send you an encryption of zero or one,
without telling you if it is a zero or one. And once the
service is convinced that the data has integrity and validity,
it can aggregate it in private form. And once everyone sent
the data, the aggregate becomes available. So the
system that does this is called Prio; the property that it is
robust against malicious contributions is called
robustness, and it is scalable, efficient, and getting deployed.
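The two-server aggregation Dan describes can be sketched with additive secret sharing. This is a toy illustration, not Prio itself: it assumes two non-colluding servers and omits the zero-knowledge validity proofs that give Prio its robustness against out-of-range contributions.

```python
import secrets

P = 2**61 - 1  # modulus; each share is uniform mod P and reveals nothing alone

def share(value):
    """Split a value into two additive shares that sum to value mod P."""
    r = secrets.randbelow(P)
    return r, (value - r) % P

# Each client holds a private bit: 1 = "on the Bay Bridge", 0 = not.
client_bits = [1, 0, 1, 1, 0, 0, 1]

server_a_total = 0
server_b_total = 0
for bit in client_bits:
    a, b = share(bit)
    server_a_total = (server_a_total + a) % P  # server A sees only random-looking shares
    server_b_total = (server_b_total + b) % P  # server B sees only the complements

# Only the sum of the two servers' totals reveals anything: the aggregate count.
aggregate = (server_a_total + server_b_total) % P
print(aggregate)  # 4
```

Neither server alone learns any client's bit; the aggregate count appears only when the servers combine their totals at the end, which is the privacy property Dan describes.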
John: It is one of those moments where you think, how is
that possible? For example, there are zero-knowledge
password proofs where you can prove to a system that you know a
password without revealing anything about the password,
even if the other end was malicious. So if the other end
was taken over — how is that possible?
Dan Boneh: I will go over how that works.
What you said is a good point. It is important to understand
that with password management systems, when you log into a
website, you send the website your password. What if it is
the wrong website? You have sent your password to a phishing
site. There are systems where I can prove knowledge of
the password without sending it to the other side.
And now if you are on a phishing site, the other
side gets nothing. So cryptography, you have to
understand, is not just about encrypting data; it does a bunch
of things for us and it is a remarkable tool to put to use.
And to be honest, this is why I'm excited about the blockchain
projects. They are at the cutting edge of what
cryptography is going to provide. And they are asking
questions that we, as a research community, never considered
before. And so they are coming to us and saying, can we do
X, Y, Z, and that's a pretty good question, and then we go
and work on it.
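The proof-of-knowledge idea behind such password systems can be sketched with a toy Schnorr identification protocol: the server stores only g^x for a password-derived secret x, and the prover convinces it of knowing x without ever transmitting x. This is an illustration with insecure demo-sized parameters, not the full password-authenticated protocols Dan alludes to.

```python
import hashlib
import secrets

# Toy group parameters (NOT secure sizes): p = 2q + 1, g has prime order q.
q = 89
p = 179   # 179 = 2*89 + 1, both prime
g = 4     # generates the order-89 subgroup mod 179

def derive_secret(password):
    """Derive the secret exponent x from the password."""
    return int.from_bytes(hashlib.sha256(password.encode()).digest(), "big") % q

def register(password):
    """Server stores only y = g^x mod p, never the password itself."""
    return pow(g, derive_secret(password), p)

def prove(password):
    """Prover's side: commitment t = g^k, then response s = k + c*x mod q."""
    x = derive_secret(password)
    k = secrets.randbelow(q)
    t = pow(g, k, p)                 # sent first, before seeing the challenge
    return t, lambda c: (k + c * x) % q

def verify(y, t, c, s):
    """Server checks g^s == t * y^c mod p without learning x."""
    return pow(g, s, p) == (t * pow(y, c, p)) % p

y = register("hunter2")
t, respond = prove("hunter2")
c = secrets.randbelow(q)             # server's random challenge
print(verify(y, t, c, respond(c)))   # True
```

The check works because g^s = g^(k + c*x) = g^k * (g^x)^c = t * y^c, while the response s alone reveals nothing about x since k is fresh and random.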
John: I wanted to talk about —
in the news today, there's a scary story about a hardware
attack, and SGX is hardware. Take us through what SGX is.
Dan Boneh: This is something that everybody needs to
know about. How many of you have heard of
Intel SGX? So let me do a quick recap of
it. I think this is something that everybody needs to know about.
It is not a separate hardware component, it is on the main
processor of the system. You probably already have it built
in. And it is a technology that allows you to create what is
called a hardware enclave. You can run code on your main
processor in a way that the code and the data that the code acts
on is isolated from the rest of the system.
So, in your processor, you can cut off part
of the processor and have it run a job that nobody can see from
the outside, not even the operating system; the malicious
entities on your processor cannot see it, at least in theory.
That's what SGX allows you to do.
And there have been recent attacks on the hardware, and
Intel SGX right now is not as secure as we would like. But we
hope that, over time, it actually will become better.
And there are actually many hardware enclaves and
architectures out there. Intel SGX is one. And we will talk
about an application in the cloud and an application on the
end user machine. So one thing that people worry
about: if I outsource all of my computing resources to the cloud,
including all of my data, then perhaps a corrupt
administrator can get a hold of that data and do something with
it. We are trusting the cloud to keep our data and our code
intact. Well, with hardware enclaves,
you can reduce the trust in the cloud because all — your code
will run inside of the hardware enclave, and it will have a
secret key built into it, so in the enclave, the data is
available in the clear, and outside, everything is
encrypted. So if anyone in the cloud exfiltrates your data,
they see ciphertext. The only place where the data is in the
clear is inside of the enclave. We are not there yet, we are far
away from making this a reality. But that's the long-term vision.
And on the end user side, I want to describe something we did
recently. It is useful and something that I wanted
myself. So the problem that was really
frustrating to me: every time I
log into a remote system, I type in my password, or my Social Security number, or my bank
number, or tax information. When I type that information into
the laptop, maybe there's a key logger recording
everything that I type and sending it to who knows where.
Maybe the operating system is compromised and I
cannot trust it anymore to tell me what is running on the
machine. I wanted a way to guarantee that
whatever I typed on my laptop is not visible to malware, no
matter how deeply it is embedded on the system. Only the remote
website can see what I typed in, and nothing else on the system
can. So hardware enclaves are a
perfect match for this type of problem. And the way that the
system works, it is called Fidelius, if you are a Harry
Potter fan, hopefully you will recognize it.
And what the system does is the following.
It is not ready for deployment, but I will explain how it works.
Everything I type on my keyboard goes through an
encryption engine inside the keyboard. So every little click
is encrypted on the way to the main processor. And the way we
do that is basically, we hack hardware these days, we use
Raspberry Pis, the keyboard is connected to the Raspberry Pis
and the Raspberry Pi to the machine. So everything is
encrypted by the Raspberry Pi on the way to the machine.
So now my keystrokes are encrypted. Who can
decrypt that information? Only the code inside of the
hardware enclave. And as I type, nothing on the
system can see what I'm typing, other than the code running
inside of the hardware enclave. So that is step number one, and
number two, how do I see on the screen what I typed?
So if the hardware enclave sent the keys I entered
to the graphics card, malware could intercept and steal them. So instead we have a Raspberry Pi
in front of the display with two HDMI inputs: one coming from the
graphics card and one from the hardware enclave. The main
page is rendered by the processor onto the main HDMI
stream, and whatever I type is rendered by the hardware
enclave and sent to the Raspberry Pi. So of the two
HDMI streams going into the Raspberry Pi, one is encrypted;
the Pi decrypts it, overlays it on the regular HDMI stream, and sends
the result to the display. Because it is an overlay,
you can see the keys you just typed on the screen, so you can see
what you just entered. And the amazing thing is, if I run a screen capture, I see
what the system sees on the screen, and the typed data is not there.
So this is a way for me to type.
I can see what I typed, but nothing on the system is
seeing what I'm typing. And what I'm typing is packaged into
a secure HTTPS request and encrypted to the website's public key, so
only the remote website can see the data that I typed.
And eventually these components disappear: the keyboard Raspberry Pi is
embedded into the keyboard, the display Raspberry Pi is embedded
into the display, and you just buy these keyboards and displays. And when you
are typing on your keyboard, nothing on the system sees
what you are typing other than the remote website and the
hardware enclave which is isolated from everything else.
So this gives me peace of mind. I am much more
comfortable doing my tax return on my computer knowing that
nothing on my computer can steal what I'm typing. So it is an
interesting application for hardware enclaves. And I feel
that there are many, many others, and it is quite promising.
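The keyboard-to-enclave channel described above can be sketched in a few lines. This is a toy illustration, not the actual Fidelius protocol: the hash-based stream cipher, the per-keystroke counter, and all names here are assumptions made for the sketch.

```python
import hashlib

def _keystream(key: bytes, counter: int, length: int) -> bytes:
    # Derive a fresh per-keystroke keystream from the shared key and a counter.
    return hashlib.sha256(key + counter.to_bytes(8, "big")).digest()[:length]

def encrypt_keystroke(key: bytes, counter: int, ch: str) -> bytes:
    # The keyboard-side device encrypts each key press before it crosses
    # the untrusted OS; only ciphertext is visible in between.
    data = ch.encode()
    return bytes(a ^ b for a, b in zip(data, _keystream(key, counter, len(data))))

def decrypt_keystroke(key: bytes, counter: int, ct: bytes) -> str:
    # Only code holding the shared key (the enclave) can recover the key press.
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, counter, len(ct)))).decode()

# Hypothetical key provisioned between the keyboard device and the enclave.
shared_key = b"provisioned between keyboard and enclave"
ct = encrypt_keystroke(shared_key, 0, "s")
assert decrypt_keystroke(shared_key, 0, ct) == "s"
```

A real design would also authenticate each message and negotiate the key with the enclave via remote attestation; this sketch only shows why the operating system in the middle sees nothing useful.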
John: Tell me about the Center for Blockchain Research.
Dan Boneh: All right, so blockchain.
So maybe you have noticed, blockchain is an area that is a
little bit overhyped, just a tiny bit.
I think the hype around it is causing damage to the field.
When you ignore the hype, there is really interesting science in
the world of blockchain and fascinating questions. When you
look at what areas of technology and beyond do blockchains
affect, it is mind-boggling. They impact distributed systems
and programming languages: when you design languages for smart
contracts, if you write bugs into your smart contracts, it is
not somebody's computer crashing, it is $50 million
locked up and nobody can get access to their money. So there
are huge amounts of money at stake as a result of these bugs.
We need verification to make sure the code is correct. We
need new cryptography, which is what I'm excited about. We need
game theory and mechanism design for correctly
incentivizing distributed systems in blockchains. And economists are
fascinated about what blockchain enables. And then there's a
huge aspect, legally, to cryptocurrencies and crypto
tokens. In fact, cryptocurrencies and
assets are a large sub-discipline of the law now.
There are academics and professionals working in this
area. It is hard to think of one technology
that impacts all of computer science, economics, and law to
the extent that blockchain does. We have massive
ideas that are generated here that are impacting many fields
of technology. And there's need for research, there's
fundamental questions that have never been asked before. So
researchers across the campus are waking up to these problems,
and there's a lot of work on blockchain, and the Center for Blockchain
Research brings it all under one umbrella. We run a lot of
events for the blockchain space; go to the CBR, the Center for
Blockchain Research, and please join its events. In January,
there's a conference we are running on technology and
new developments in blockchain; it is open to everyone.
Everything is open to the public. If you want to speak,
submit proposals. So it is an exciting area.
And new developments happen every day. We are teaching a
course on blockchain, it is very popular. We have almost 250
students, a very large number. This is the third year we are
teaching it, and the number of students who register for the
course is correlated with the price of Bitcoin. So I hope it
goes up and next year we have 500 students.
John: Is it –. Dan Boneh: So it goes both ways,
unfortunately. So within the space of
blockchain, I'm interested in the area of crypto research
because of all of the new questions they are asking. I
will give you an example of something we did recently. It
has to do with privacy on the blockchain. And I don't know if
any of you know how cryptocurrencies work, but the
Bitcoin currency, the largest one out there, works by saying
that every time I want to pay someone, I have an address, and I write to
the blockchain saying, pay John 5 Bitcoin, and that transaction is
recorded on the blockchain. And if you think about that, there
is something funny, so for the whole world, the blockchain is
public. So the whole world actually gets to see that I just
paid John 5 Bitcoins. Well, I'm not sure that's something that I
want the whole world to see. In particular, if Stanford wanted
to pay my salary in Bitcoin, the whole world would see what my
salary is. If you are a vendor that is buying equipment from a
supplier, but you have a supply chain, you are paying in
Bitcoin, the whole world will see what you are paying the
supplier. This is in conflict with business needs. Can we add
privacy to the blockchain? And this is a fascinating area. Can
we do cryptocurrencies with privacy? So there are
cryptocurrencies that provide complete privacy, things like
Zcash and Monero, where you cannot tell who is paying whom and in what
amounts; it is a private system. We were interested in a system
that goes halfway, and only hides the amounts.
So everybody will know that Stanford pays my salary, but you
should not know what the amount is or what the salary is.
So the way we do it is, effectively, the value that is
written in the blockchain says Stanford pays Dan Boneh, but the
amount is encrypted. We use a cryptographic commitment; you can think
of it as an encryption. The amount is encrypted. So we
have these amounts that are encrypted. And the fundamental
problem, or the guarantee of what Bitcoin provides, is that it
guarantees every transaction is valid. And one part of
validity is that the sum of the money coming into the
transaction has to be at least as big as the sum coming out of
the transaction. Money cannot be created out of thin air. And
everybody has to verify that publicly. But I just told you
all of the amounts are encrypted.
So how do you verify that no money is created, that the sum of the
inputs is greater than or equal to the sum
of the outputs? How do we do it? Zero knowledge. And to
show you this is a new area, the challenge here is that
putting data on the blockchain is extremely expensive; it is
replicated all over the world. So, per transaction,
we want to minimize what is on the blockchain. The question is,
what is the shortest possible zero-knowledge proof that allows
us to prove the transaction is valid? We are looking for
short proofs. So we designed a system, Bulletproofs, that
gives the shortest zero-knowledge proofs that we know of.
And actually, that is something that has been
adopted again and again. This is why I love the space: you invent
something and products deploy it.
So I will give you one example.
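As an illustration of hiding amounts, here is a toy sketch of an additively homomorphic, Pedersen-style commitment, the kind of primitive such systems build on. The modulus and generators are illustrative assumptions, not a secure instantiation, and a real system also needs range proofs (which is where Bulletproofs come in) to rule out negative amounts.

```python
# Toy Pedersen-style commitments over Z_p (illustrative only; real
# confidential-transaction schemes work in elliptic-curve groups).
p = 2**61 - 1      # a Mersenne prime, used as a toy modulus
g, h = 3, 7        # assumed generators with an unknown discrete-log relation

def commit(amount: int, blinding: int) -> int:
    # C = g^amount * h^blinding mod p hides the amount behind the blinding.
    return (pow(g, amount, p) * pow(h, blinding, p)) % p

def product(commitments) -> int:
    out = 1
    for c in commitments:
        out = (out * c) % p
    return out

# A transaction with hidden amounts: one input of 5, outputs of 3 and 2.
# Blinding factors are chosen so the input blinding equals the output sum.
inputs  = [(5, 111)]
outputs = [(3, 60), (2, 51)]

lhs = product(commit(a, r) for a, r in inputs)
rhs = product(commit(a, r) for a, r in outputs)
# Commitments are additively homomorphic: because 5 == 3 + 2 and the
# blindings balance, the products match without revealing any amounts.
assert lhs == rhs
```

Anyone can run this product check on the public commitments; the zero-knowledge range proof is the extra piece that shows each hidden amount is non-negative.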
John: So let's let the audience ask a question.
Audience member: Can you talk about multi-party computation,
secret sharing, and the applications of that?
Dan Boneh: I think you started with homomorphic encryption.
So we have a minute to talk about that.
So 30 seconds. So fully homomorphic encryption is a development from
one of my former students that allows you to compute
on encrypted data. And, I don't know, there are
many applications for that technology. But maybe instead of
getting into the applications, I will give a caveat. We know how to
do it now, in polynomial time, but it is too
slow to be deployed in the real world. But the technology
and the systems are going to get better.
Hopefully we will have better and better homomorphic encryption.
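To make "computing on encrypted data" concrete, here is a toy additively homomorphic scheme in the style of Paillier. This is only the additive special case, with deliberately tiny, insecure parameters; fully homomorphic encryption, which supports arbitrary computation, is far more involved.

```python
from math import gcd

# Toy Paillier cryptosystem: additively homomorphic encryption.
# The primes are tiny and insecure; this only illustrates the idea.
p, q = 101, 113
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)

def encrypt(m: int, r: int) -> int:
    # Enc(m, r) = (1 + n)^m * r^n mod n^2, with r coprime to n.
    assert 0 <= m < n and gcd(r, n) == 1
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    # L(c^lam mod n^2) * lam^-1 mod n recovers the plaintext.
    l = (pow(c, lam, n2) - 1) // n
    return (l * pow(lam, -1, n)) % n

c1 = encrypt(20, 17)
c2 = encrypt(22, 29)
# Multiplying ciphertexts adds the hidden plaintexts: 20 + 22 = 42.
assert decrypt((c1 * c2) % n2) == 42
```

The point is that the party doing the multiplication never sees 20, 22, or 42; only the holder of the decryption key does.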
John: So lunch is on the roof and in the basement. You can
choose if you want to be in the dark or in the sun.
And we will be back at 1:00. I will be back at 1:00 with Sophie Wilson.
And she is going to talk about the genesis of ARM. Thank you
very much for joining us. [ Applause ]. Live captioning by Lindsay
@stoker_lindsay at White Coat Captioning @whitecoatcapxg. Next up: Inventing the future
in 1983 by Sophie Wilson! >> Good afternoon, welcome to — CloudFlare.
I'm incredibly proud to have Sophie Wilson with me. Sophie, it says that you are a designer?
So you are the genesis of something, an ARM processor, and
what is an ARM processor and why are there so
many of them? Sophie Wilson: In 1983, we could
not find a processor we wanted, so we made our own. We made a
processor that was used for desktop computers, that's what
the company sold, and it became a deeply embedded processor.
There are 120 billion in the world, and nobody knows exactly how many.
You can use it as many times as you want on a chip, so ARM
doesn't know, but it's a lot.
John: You had an 8-bit processor and you wanted to make
a business machine. And instead of buying a different processor,
you made one yourselves. So why were you able to make that leap?
Sophie Wilson: It is a big stretch. So the first thing
that we did was we were using the 6502 microprocessor, the
same one that is in an Apple II. I don't think anybody sold a BBC
Micro for as much as an Apple II. So you want to do
better with your next system, move on from the 6502 to
something more powerful. And we had the ability to add a second
processor to it. That means we could quickly build a board. We
took the best of what was available from Intel, the 80286
at the time. We looked at the best that was available from
Motorola, still a functioning company that didn't make cell
phones, and that was the 68000. We looked at an up-and-coming
processor from National Semiconductor, the NS16032, and along
the way, marketing decided they would rename it the NS32016,
which marketing does. And so on. And we began to
think, these processors did not live up to the claims that were
made by Motorola, etc. When we built them and
normalized everything to our systems, because it is a second
processor, everything is normalized to the same system,
we can see that the performance is very much the same. If you
normalize these processors, which ran without instruction
and data caches, to bandwidth, you can see that the performance
is close to identical. No matter what. That's fun.
It is Fleet week in San Francisco, so there will be some
Navy flyers doing fancy stuff over the bay. So we had all of these
graphs: the performance a processor delivers is determined by its
bandwidth. And inside the BBC Micro, we built a fast memory
system, 4 bits per transfer wide. We know that we can build
a high-speed memory system. We could make a 4- or 8-transfer
memory system, and we could make it 8, 16, or 32 bits wide. So we could give a
lot of bandwidth to anything that we designed; that would be
easy. And all of these processors
could not make use of it. When we normalized the bandwidth
against the 6502, we found the most advanced processors they
were trying to sell us were the same.
So how do we get out of that hole? Well, by chance, as it turns out.
So a professor at Cambridge dumped on to my desk the first
research papers on a new technology, RISC, and there's a
misconception about these machines. RISC machines are not
about the number of instructions; they have loads of
them. What is reduced is the complexity of the instructions and of
carrying out the instructions. So RISC machines have all the
instructions that you need, they just put them together in a
different way than competing processors.
So that was interesting, and these things were being made by proper
researchers; they had many years of experience designing stuff. Some
of these early papers were by IBM.
So they showed us a way that one might do it, build your
own processor, while also seeming to set the bar
formidably high. The people that built the 6502
had started to design a new processor, and we thought, well,
we like the 6502 a lot. So we should go out and visit
them, the Western Design Center. Having seen the other
places that made microprocessors before, we knew what to expect:
a massive factory, loads of people. And the Western
Design Center wasn't like that. It was on the outskirts of
Phoenix, a couple of bungalows with some electronics
engineers and a bunch of grad students.
They were designing the 65C816, the processor we went on to use.
And we came away with a lesson, a feeling that if a couple of
electronics engineers and some graduate students could design
microprocessors, then we could do it too. So
we started playing with instruction sets in our heads.
And it went from a maybe to: we can do this
thing. We can design a processor. A processor is a
complicated set of digital logic, and inside a BBC Micro
was a complicated set of digital logic. And it is all
about designing the digital logic and getting it right. We
can do that, so we did.
John: You did, and the first chips that were produced were
produced perfectly the first time. That's an achievement.
What did you do to make that possible?
Sophie Wilson: And here is where we seemed to diverge from everybody else. The NS32016 was a big case
in point. We followed the processors we built the
evaluation systems for. In fact, Charles Fork came to visit
us; we put it together and made it work faster than anybody
else. And that processor, it was
revision H, I think, before anything remotely worked
properly. We stopped at revision K, and they still
hadn't got it right. So making sure that the thing
that we imagined was fit for purpose, validation, and making
sure that the thing that we implemented did what we designed
it to do, verification, were both important steps, and you
have to do both at once. And for validation, we had
to be really sure that the processor we built ran the
programs we wanted it to. We did
that by building instruction set simulators.
So we spent quite a lot of time writing instruction
set simulators and running machine code, for the
6502, for the 32016, and for other processors. Those were
high-speed instruction set simulators; they did not model
what was going on inside, they took the abstract architecture
and ran the instructions. And then, using those
instruction set simulators, you can do two
things. For the validation bit, you can run compilers
and programs, etc., that output the machine code, and make sure
that this was fit for purpose and did what we wanted.
So we went on and rapidly brought up
compilers: BCPL, though we didn't bring up C at that time. We
brought up Lisp, BBC BASIC, and a whole bunch of stuff,
all brought up to prove the processor worked.
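A high-speed instruction set simulator of the kind described, one that executes the abstract architecture without modeling buses or timing, can be sketched as a simple fetch-decode-execute loop. The three-register ISA below is invented for illustration and is not the ARM instruction set.

```python
# Toy instruction set simulator: executes an abstract architecture
# (a made-up three-register ISA) with no modeling of buses or cycles.
def simulate(program, regs=None):
    regs = regs or {"r0": 0, "r1": 0, "r2": 0}
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "mov":            # mov rd, imm: load an immediate
            regs[args[0]] = args[1]
        elif op == "add":          # add rd, rn, rm: rd = rn + rm
            regs[args[0]] = regs[args[1]] + regs[args[2]]
        elif op == "bne":          # bne rd, target: branch if rd != 0
            if regs[args[0]] != 0:
                pc = args[1]
                continue
        pc += 1
    return regs

# Sum 5 + 4 + 3 + 2 + 1 with a loop: r1 accumulates, r0 counts down.
prog = [
    ("mov", "r0", 5),
    ("mov", "r1", 0),
    ("mov", "r2", -1),
    ("add", "r1", "r1", "r0"),   # r1 += r0
    ("add", "r0", "r0", "r2"),   # r0 -= 1
    ("bne", "r0", 3),            # loop back while r0 != 0
]
assert simulate(prog)["r1"] == 15
```

Because the loop does only abstract register updates, a simulator like this runs orders of magnitude faster than an RTL or transistor-level model, which is exactly the trade-off described in the talk.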
John: When you say we, how many people were doing this?
Sophie Wilson: So ARM was designed by me, Steve, Allison,
Robert, and Dana. John: So you have six people
that completely changed the way we do it.
Sophie Wilson: And the validation bit spread out a bit
further, so it was more like 15 people. But yeah, that many
people. And the second bit, the
verification bit, we wrote the self-testing programs, I wrote
the data path verification suite for the first ARM processor. So
that was written in ARM assembly language, and we were checking
to get the right answer. It was a block of code, you run it on
an ARM with a perfectly-working data path, lovely. And other people
wrote ones that tested the load/store unit
and so on. And these are just running on the instruction set
simulator. And meanwhile, we made a ton of
test programs. So Steve Furber wrote, on the
BBC computer, a register transfer level model of the ARM
processor. Like the instruction set
simulator, it carries out the actions of an ARM processor, but
it does it with high fidelity, modeling all the things
that the logic of the processor will do. It is timing accurate,
uses the same buses, there can be no mistakes in it.
So after we had all this stuff running on the high-speed
simulators, we then had the accurate model of the processor carry it
out. And that took a lot longer.
Today these look like low figures: we had BBC Micros,
powered by 2 megahertz 6502s, and the simulators running on
those machines did 100,000 to 200,000 instructions per second.
That was fast enough that I could write an editor in ARM machine
code and run it on the instruction set simulator and
have a useful, usable editor. A lot of people used it in
preference to the other editors, which is
quite funny. So the RTL model of the
processor ran much slower, at about 5 to 20
instructions per second.
So we had a bank of BBC Micros, each running a part of the test programs,
getting the whole volume of test programs through the
accurate model. And meanwhile, we built the
processor out of transistors, and we used the transistor
simulator to prove that the behavior of the transistors, when running the
test programs, was exactly the same.
And now that seems really slow. We were running on Apollo
Domain computers, powered by 68000s, and the transistor-level
model of the ARM, at 25,000 transistors, ran at sub-one
instruction per second. So that's really, really
tiresome. We could not run the whole test suite on the extracted
transistor model; we had to select the bits of the test suites we thought
were important, which you can do by code coverage, and run those on
the extracted transistor model. So we had three models that we could
show were identical, and the high-speed model proved
that we were designing what we wanted.
And we did that not only for the processor; we were
actually building four chips at the same time: the processor, a memory controller, a
video controller, and an I/O controller. We had four chips,
and everything worked. The chips came back from the
fab, we put them in, ran BBC BASIC on the processor,
printed hi, opened a bottle of champagne, and wrote April 26 on the bottle
to signify the date that the first ARM worked.
John: It was an overnight success?
Sophie Wilson: An overnight success takes 25 years. We did
not sell the first ARM-powered devices until 2008. Yeah, we
were slow. So while ARM was captive inside Acorn, only Acorn
machines and a couple of others that wanted to license the
technology were using it. So we were in a world of maybe
100,000 processors a year, and it takes a long
time to get to 130 billion like that.
And so Acorn decided to set ARM free.
And so in 1990, the third time around, a consortium of Acorn,
VLSI Technology, and a company you might have heard of, Apple,
decided to set up and fund a company named after the
microprocessor: the microprocessor was the Acorn RISC Machine, and
the company was called Advanced RISC Machines, ARM. And
now it stands just for ARM. So that's kind of fun.
And –. John: The interesting thing
about this, you stopped making chips and you licensed
intellectual property. Sophie Wilson: So we realized we
didn't know anything about running a chip company. We were
engineers. So we went out and we found a
chip industry expert, a salesman as it happened, Robin Saxby.
And he had a brilliant idea, absolutely brilliant.
And, just to prove that it wasn't a mistake, there was another
brilliant idea. So the first brilliant idea was,
instead of the company being a conventional silicon technology
company selling chips, the company ARM would be founded and set up to
sell to people who liked to make chips.
So it would sell intellectual property, the
first IP licensing company. So that was brilliant idea
number one. And number two, ARM was a small company, and it would
remain, for most of its existence, quite a small company.
So if it was going to be successful, it would do it in
partnership with its customers. So ARM built this world where
everybody is in partnership with everybody else. You may be
competitors in real life, but when you enter the hallowed
doors of ARM, you are friends and pals; you can trust secrets
to ARM, and ARM will keep your secrets from your competitors.
And together, you will grow a happier world. And that
partnership model turned out to be even more brilliant than the
technology and the IP model that were introduced.
And that made ARM extremely successful, eventually. An
overnight success of 25 years. John: This was not just a
processor, it was a system on a chip.
How did that come about? Sophie Wilson: So while ARM was
inside Acorn, we designed a processor, a video controller,
and an I/O controller and Acorn was about making machines
cheaper, and making more powerful machines, large
sections of Acorn market base were into education, because
people didn't really like to pay a lot for a computer.
And they wanted it to last. And the slogan for ARM was MIPS for
the masses; MIPS is millions of instructions per second.
And so that kept us on the straight and narrow. Everything
that we put into the processor design was aimed at producing
stuff for the consumer mass market for Acorn's machines.
So in 1987, the first ARM-powered processor was launched,
and computers were launched, from Acorn. And Acorn was on a
treadmill all the time to produce new and better machines.
So we needed to produce high-performance ARMs for the
top-end machines and the cheaper ones for low-end machines.
And so what do you do for the higher-end stuff? You
build better processors out of more transistors.
And for the low end, to get cheaper, you put all of those
things onto one chip. We were the first company to make a system
on chip. And that was made by GC Semiconductors, and they helped
us do that.
John: That was because the ARM core that you created was much
smaller than the other RISCs that remained.
Sophie Wilson: Yes, one of the themes that kept us on track was
to build a processor not so that it was the most powerful processor,
and not to prove a point that RISC was the best way to build
things, but to make something that we understood how to build
that would be low-power, high-performance, and work well.
And so we ended up using 25,000 transistors, which was
quite a low number at the time. That is only about 6
times more transistors than the 6502 used.
So a very small processor, similarly because we are a small
team, the memory controller, the video controller, and the I/O
controller were also small chips.
So we had small things that we understood, we combined them
together, and they worked as one. And
then we could make the world's first SoC.
John: So if I take my iPhone and drop it in acid with the ARM
core exposed, do you recognize something that you worked on in
'83? Is there anything left? Sophie Wilson: So Apple is a
special case. If you take an iPhone apart, ARM has different
sorts of licensing. ARM can license to somebody like Qualcomm
an ARM core designed by ARM. So Qualcomm uses cores with
imaginative names, like the Cortex-A75. Their marketing people
don't like it, so they rename it Kryo 260 or 280, depending on
what they have done with it. And Apple is a special case,
they invested in ARM and they demanded stuff that is made for
them. ARM cannot afford to make too much fully custom for
people. So it introduced a more
ethereal license than licensing IP: Apple has the right to
build processors that execute the ARM instruction set.
And that is an architecture license. It just licenses
making ARM-architecture machines; Apple does not get anything
directly from ARM apart from help and test programs and
compilers and all that sort of thing.
So if you took the A12 Bionic out of your iPhone, whatever the
number is, XS, 10S, then the A12, designed on the 7-nanometer
process, contains no ARM IP.
It also contains no Imagination Technologies IP; Imagination, another UK
company, provided Apple with its GPUs for a long time. Apple
extended out from designing processors.
So if I look at the Apple A12 Bionic chip, I can see clusters
of stuff, where the big cores are. Apple's big core in the A12
Bionic is Vortex. And it is probably the
most complex implementation of a processor architecture being
mass marketed on the planet. Everybody thinks it is Intel.
But as far as we can tell, the Vortex processor does
more work per cycle, more power efficiently, than anything else
that exists. And there's also an Apple-designed
processor, Tempest, in there. It is a little core as opposed
to a big core. The idea is that you have
big cores that are not completely power efficient, and
small cores that are, and the two execute the same instruction set. So Apple
designed not only their own superlative big core, but they
have the little core. They have been designing little cores for a while.
John: In '83, you and a small team of people made this thing
that had a wide-ranging effect. Who is the equivalent team today
and what are they doing that I can invest in as well?
[ Laughter ]. Sophie Wilson: It is difficult.
By its nature, you don't know. We designed ARM in total secret,
until we put out the first press release. There are quite a few
companies working on innovative projects.
And who is going to have the most success is hard to say. You can
look at ideas on how to build a processor in a more modern way, and take
your chance. But it will be 20 or 25 years
before you get your money out. John: I'm willing to wait, I
need a retirement plan. I wanted to give the audience a
chance to ask a question. This is fascinating, a question down
here at the front. Audience member: At some point,
Apple decided to use power PC. Was there a competition between
power PC in the transition of 16K to power PC? I assume that
they evaluated ARM?
Sophie Wilson: ARM was not ready to be in the Apple desktop
machines. PowerPC, they put a lot of
effort into making a high-end implementation. With ARM, we
started at the low end and built up. So there was Apple's use
of ARM, and then Steve Jobs killed it.
Audience member: The efficiency of ARM,
was that intentional? Sophie Wilson: Well, the
modeling tools were not good. When we put the first chip into
the socket, and it worked and we opened the champagne, we looked
at the board and the power line was broken. There was no power
going into the chip. The processor was working, but it
was running on the power it was stealing from the chips
surrounding it. John: So thank you for flying
all the way here for one day to do this. You must be exhausted.
And also, I wanted to tell you, the BBC Micro was the first
machine I ever owned, my parents could barely afford it, which
causes me to sit here talking with you about it. So thank you
for building it and thank you for your time. [ Applause ] Next up, we are talking about
China's role in the internet economy. We are here
with Joe Han and Jeffrey Ma. China is big in the news, with
presidential tweets about it. And I think there's certainly a
different perception in the United States and in the west as compared to China about how
to do business. And Joe, given your position in
one of the largest Chinese companies and infrastructure companies, you
have an interesting perspective on how to think about how to do
business in China if you are a U.S. or European company trying
to enter the market, and thinking of the infrastructure side.
Joe Han: Sure, I will share my thoughts. I will say that it is not much
different than a Chinese company doing business in the U.S. Today
we are here because CloudFlare is doing business in
China. And you have huge investments,
and you have to understand the policy. So myself, I lived in the U.S.
for 8 years. When I come to a different country, you have to
understand that there are rules. So that's my thinking: for
a U.S. company, if you want to do business in China, it
is the same thing. The rules, the laws, the regulations, those
can be different. But the process is the same
thing. If you want to be successful, you need to
understand the local market, you need to find the right partner
and the right people. And so it can take time.
But any company can be successful, I would say that it
goes through the same process.
Alex: To follow up on your
answer, there is a greater
requirement to partner, or form a joint venture, when doing business in
China than there is in other parts of the world.
So in China there are specific differences, you would agree,
for companies to enter the market. Maybe that is changing.
Joe Han: It would depend on what you compare to. In this industry,
that is true. So in China, there is a different
licensing process, and in some industries, China requires a
license where in the U.S. you do not need a license.
But if you look at other countries around the world, we
run the operation in 35 countries around the world. A
lot of those countries need a license and a joint venture if
you want to get that type of license.
So I would say, yes, from some perspectives, compared to the
U.S., it is unique.
Alex: What do you see changing, given the current climate and
transition? What do you see changing on the business side over the last
6 or 12 months? That is in China as it
relates to the U.S. Joe Han: I would say any
country, and I don't speak for the government.
But my feeling is any country, if you want to be successful,
you need to attract investment. So you have to make it easier for money to come from
other parts of the world, if you want to make it easy to do
business in your country. So tariffs on one side, regulations
on the other, there are a lot of things that you can improve.
So I think China is doing the same thing: they are trying
to make it attractive. You have to attract money. The U.S. is doing the same
thing, you want the money to come back into the U.S., you
want to have investment, and China is no different.
Jeffrey Ma: I have some ideas to share.
And China is so different from the U.S. You know, we are
talking about — there are so many issues. And in
my company, we hated regulatory issues,
too. It is so different. In the
Chinese way, if there's a new business, a new thing, it comes
out, at first there are no regulations. You can do almost
anything. And then the government tries to
regulate more and more.
Sometimes, the competition in China is so hard that sometimes you
need more regulations to help with that.
So it is very difficult to say whether it is good or not.
But we need to understand the difference. In my
understanding, in the U.S., we talk a lot before doing
something; we would make a new law to regulate it.
And, at first, in China, you can do almost everything.
And then the government will learn from the practice, and
then — it is just a different way.
Alex: Do you see companies in that situation having an easier
time getting up to speed?
Jeffrey Ma: When they are in the early stages, there are almost no
regulations. You can do any kind of testing.
So it is just a different way, I think.
Alex: At what point does it become a robust, fast-growing
entrepreneurial ecosystem in China?
There are fighter jets going by. At what point does it hit
against the regulators?
I'm thinking of Alibaba, which grew very quickly,
with very little regulation, maybe too quickly.
And with other companies, the government catches up to them
quickly and starts to regulate them.
Joe Han: We use that a lot in China; we use it for
everything. We use Alipay. And this is a good example: look at how
the regulations came. Private tech
companies were not allowed to do a payment business.
They said, let's just do it. And then the regulators try to decide
who can do this, who cannot do this, and what kind of license
you need to get. This is step by step.
So honestly, from some perspective, I envy private
companies. For us there is more regulation,
more requirements. There is no choice
for us; we need to follow our path. And so both of us can
have our unique value propositions and we can be
successful. Alex: That is interesting.
There is another big movement in China, government-sponsored and
also a little bit organic, around Chinese companies
reaching out beyond China's
borders in a significant way; you see Alibaba doing that. Three of the five most
downloaded apps globally last year were Chinese. You are seeing
that also on the infrastructure side.
Can you talk about that? Joe Han: Yes, whether it is
national or government-sponsored, it is
definitely market-driven first; then the government
says, this is the right thing to do for the overall economy.
For example, we came to the U.S. around 2000, to Los
Angeles. And it was not really driven by
government sponsorship. The government didn't ask us to
go overseas; it was driven by the internet, because we needed to
connect to the U.S. carriers, and we have the internet traffic,
too. So nowadays, it has
become natural for almost any medium or large Chinese
enterprise. They have overseas operations, and for us, we operate
in more than 30 or 35 countries, and we keep investing
around the world.
And that is all driven by market demand; a lot of our
customers, Chinese manufacturers, mostly
private companies, are going abroad. We need to
connect them to China. And the internet traffic in China is
huge, because of the population. And we are keeping — this is
all market-driven.
Alex: 800 million people in China are connected to the internet;
more than the U.S. and Europe combined, it is staggering.
And what people don't realize is that the internet works
differently inside of China than anywhere else in the world. We
are talking about connecting telecom outside of
China globally. There are three large carriers, and we cooperate,
not always perfectly.
Joe Han: The internet in China is not that different from the
internet in the U.S. It is the same: we have the same
protocols, the same agreements, and in certain ways
you can get your service. And you are right, the internet
world in China is different:
the regulation is different, and it is probably more
regulated than the internet in the U.S.
And we have three main carriers: China Telecom, China Mobile,
and China Unicom, and each has a huge customer base. As you mentioned, the
number of broadband users in China is around 350 million; mobile
users, 1.5 billion, and most of them, 75 percent, are
already on LTE. So that is a huge base, and a lot of
internet companies are very successful; for a lot of
that, you have to give credit to the huge population,
and also the infrastructure that we already have in the country.
And so the internet connection between the carriers is not
always that smooth. All of the
carriers are state-owned, and we compete with each other
fiercely. We have the requirement: you have to
compete. So eventually that benefits the users; the speed
goes up every year, and the infrastructure improves every year. And
90 percent of the broadband users are at home. And even in
rural areas, you have to provide universal service.
So in that part of the country, you have to upgrade the
infrastructure, even in rural areas, in the midwest and
mountain areas. And that's a very powerful —
but the result is there. Jeffrey Ma: It is interesting; I
travel a lot in the U.S. and in China. I notice, from my
personal experience, that in rural areas in
China, the carrier is much better. It is much better than in
the U.S. That is probably because China
Telecom is a state-owned company. They have to do all the jobs.
Joe Han: This is part of the program; the government wanted
to lead those rural areas out of poverty.
So rather than giving them money, they are going to provide
infrastructure: you have to pave the road, you have to get the fiber there. And the benefit — think about
e-commerce: without
infrastructure, it is very hard for them to do something like that.
And, by the way, we do have that requirement from the city
council. This is like a KPI for us.
Alex: Interesting. You have the entrepreneurship, and at the
same time you have a regulated internet, with the benefits and
drawbacks of that. And at the same time, you have this incredible
entrepreneurial ecosystem and there's a ton of venture
money, inside of China and from elsewhere, pouring in.
Jeffrey, you started a business that you sold.
What is different? Why do you think that is, the large
resurgence in entrepreneurship that is happening over there?
Jeffrey Ma: It has been an interesting 10 years.
It is nearly 10 years. Ten years ago — actually,
it was 9 years ago — if I wanted to start a new
company, that was very difficult for me.
At that stage, there were very few early-stage venture
capital firms in China, and they gave out low valuations at the time.
It was difficult, especially for early-stage companies.
And in 2009, a former executive of Microsoft and
Google started a venture capital firm in China. And
a very interesting thing happened after that.
He started a really early-stage venture capital firm for the young guys.
And after he got the first success, hundreds of
early-stage venture capital firms appeared. It is very, like,
Chinese-style: when someone gets even partially successful, everybody
comes in. But for founders,
the competition among venture capitalists is very good.
Now, when you want to start a new company,
they will rush to you.
They want to invest in you. And you get a higher and higher
valuation. And the interesting thing is, 10 years ago, I
probably would not come to Silicon Valley or to the bay
area to ask venture capitalists to invest. And now my friends
here go back to China to talk to venture capitalists there
for higher valuations. So that's what is
happening in China. So when we talk about a lot of
things that are good or not good in China, we need to get a
better understanding of what really happened there and the
patterns of the things that are happening — the
venture capital versus the different internet companies.
And by 2014, the government comes out, and they
notice that. And the prime minister says everybody should
do more innovation. We encourage innovation, we
encourage start-ups — but it is already 2014. All the start-ups
are already there. So they come out with some helpful regulations.
And the market grows by itself, very quickly.
And so in the last 10 years, venture capital and start-ups in
China have changed so much. It is like — sometimes
we call it Chinese speed.
You know, we are always
saying, okay, there is someone in China
smarter than you, who works harder than you. It is
the way that we are working.
>>SPEAKER: People used to work for a company at higher pay, and
now they start their own businesses; it is just so easy to
start a business now. This is the internet world, and a lot of the venture capital firms give you all of
this for free.
Jeffrey Ma: Not everything that has happened is good.
We need to understand that. Alex: So Amazon rebooted their
China business, and now there is talk about Google trying to
reenter the market. What do you think about Google and the press
reports around that? And do you think they are seeing something
that makes it easier to enter the market?
Jeffrey Ma: We would be happy to see more Google products. And
my friends say, oh, Google coming back to China — that
will help something. But I don't agree with that.
We do a lot of things. For example, 10 years ago, Google
Maps was the best map product.
Now I use local map apps and Google Maps,
and they are quite different products now.
So Google — probably, yeah. I did invite friends to use Google
Maps, but less so in the last 6 or 7 years.
The new products have a very different style.
When you use Google Maps, it is like a friend. The local one is like your
mom: do this, do this, don't do that, do this.
It is a very different style.
So it is a different situation: in an
environment that you are not familiar with, that really
helps, and you can always go that route. So when you are
used to those maps, then you use Google Maps.
The products in China also innovate and develop very quickly.
So we really welcome Google coming back and
bringing more of the best products here, so we can really help each other.
Competitor products help, and we make our products better.
That's the way we love it. Alex: So we're out of time.
That went quickly. Thank you for being here, and
thank you for travelling all of this way to participate in our
summit. Thank you very much. [ Applause ] Next: The reality of U.S.
privacy law: Does it exist? By Eric Goldman and
Terrell McSweeny.
Alissa Starzak: I'm here with Eric Goldman and Terrell
McSweeny, and we're going to talk about privacy. And privacy
kept jumping up this morning and everyone wanted to pre-empt us.
And we are seeing conversations about privacy here and
everywhere. I'm the head of public policy, in DC, and we are
seeing this come up with a privacy framework, with the
Department of Commerce weighing in on privacy standards. We
are seeing different entities coming up with privacy
standards. So the question I want to start with is: What is
going on with privacy? Why are we seeing so much activity, and are we
actually going to see something productive in the United States
on privacy? I will start with Terrell.
Terrell McSweeny: I appreciate the invitation;
this is my first Cloudflare internet summit. We are seeing
talk on privacy because of the impact of the GDPR in May,
because of the passage of the California law here in
California, and partly because I think that people are really
coming to terms with the level of
connectivity. It is having an impact across the board and
there's a vibrant conversation in Washington right now. It is
not settled. There are something like 10 bills pending
in congress. There's the NIST efforts, the NTIA efforts,
there's the association and companies coming up with their
own principles. It is a dynamic space. Eric Goldman: We are in the
midst of a structural change in the economy, and so much of the
value that is created from our companies is coming from the
data that they know about their customers. We see the largest
companies in the globe that ever existed in history powering a
lot of their economic model on data. And so it is inevitable
that when there is that much power and money in an asset,
everybody is interested in how that asset is used.
Alissa Starzak: That comes up with the question, should it be
regulated? What are the policies from the United States
standpoint? Terrell McSweeny: Well, we will
start with benefits. So one obvious benefit is that
addressing what I think of as the hodgepodge of American
privacy law, many of you in the room are pretty familiar with
what I'm talking about. The more complimentary way to
talk about it is a sector-based approach, which used to make a
lot of sense when data lived in a sector and didn't move out of
it, and it is increasingly making less sense.
So we have an alphabet soup of regulation at the federal level,
we have 37 or 38 different state laws, we have a major new
privacy law here in California, and some other states following
suit, we have different laws around biometrics and other
things in different states. So that's a hodgepodge.
So one benefit is articulating a consistent
approach. Another benefit is that we really need to have a
better way to articulate what the U.S. approaches to the rest
of the world. The fact is that the GDPR, which is easy to
understand, has been now looked to by major markets around the
world as a potential model. And we kind of lost the argument
that our approach here in the U.S. is a better approach,
because it is hard to explain what it is and it was hard to
push back on the notion that the U.S. is the wild west. I don't
happen to believe that it is, as a former FTC commissioner, but
it is hard to win that conversation globally. That is
another huge benefit. And a third benefit is coming up with
a predictable framework that reduces costs and it reduces
conflicts as well.
Eric Goldman: The incumbent companies know the
expected rules and what they can do; for small- and
medium-sized companies, it is not as clear. So it creates
predictability and incentives for people to understand what
the rules are and to build off of them.
And there are some costs. Alissa Starzak: So the benefits
of understanding the rules of the road, GDPR, we are seeing
this pop up around the world of privacy rules. Is there a set
of rules of the road, or will there be a lot of different
privacy laws around the world, depending on — who knows
— where you are operating, where you are targeting, and what's happening?
Terrell McSweeny: We want to get to a place where we have an
interoperable approach. And we are reflecting an approach on
these issues that really expresses our values. Right now
what we're seeing are some pretty major marketplaces really
engage in privacy overcompensation, and data
protectionism and localization makes it harder to move data
around. And the big companies can deal with some of the
impacts of that kind of regulation, because they are
enormous. But it has a real consequence, and I think it has
a consequence for consumers.
They want data managed consistent with expectations, and
they want to move it around, they want to be able to port it
around, they want these controls. So I think we're
going to get to a framework where we have that.
Eric Goldman: I like the idea of interoperability. The idea
that we have one uniform law across all jurisdictions, and
don't have to build different systems or rule
sets within the same company — that is not achievable. The
GDPR cannot be implemented in the U.S. as it is structured;
there are legal limits in the U.S.
that make it impossible to do a GDPR equivalent. And the
GDPR takes a one-size-fits-all approach. With a sectoral
approach, there are benefits to optimizing the privacy rules for
different contexts and industry niches where we want more or
less protection, or different protection.
And trying to keep up with a one-size-fits-all approach,
which is what the GDPR does, is not an optimal outcome. So the
U.S. will have limits on what it can do similar to the GDPR because
of its framework, and it will probably try to optimize the law
for the edge cases in different niches. And so I think the best
we can hope for is interoperability. If we don't achieve
that, we have major barriers for transborder operations.
Alissa Starzak: You identified different laws in
different places; it is complex from a regulatory environment.
I want to touch on something else that gets to that, too.
What information are we trying to protect?
So the question on privacy: we are
having all of these debates, and everybody cares about privacy and wants to talk
about it. But we do not talk about what people care about
from a privacy standpoint, and what the point is. What are we
trying to accomplish with it? Do consumers care?
Terrell McSweeny: Why is everybody looking at me all the
time? Eric Goldman: You're the best.
Terrell McSweeny: Why don't you take this one.
Eric Goldman: Privacy means different things to different
people and different things to different contexts.
So when we talk about privacy, usually it is a melange of
privacy interests and consumer considerations, grouped together. And peeling
them apart into different layers reduces the fun of the
conversation: we will talk about this, this, and
this, and all of a sudden, our time is up.
So when we talk about privacy, consumers want control of their
data. They want the ability to
say, do this, not that. But consumers don't want to take the time to
manage their data; they don't want to invest in controlling
it. They want companies to read their minds, to figure out what
they think the deal should have been, and do that, and if they
change their mind, to give them an option to change it.
And so in that sense, what Charles said is pretty much right:
what we really want, as consumers, is for companies to do the
right thing. We don't know how to regulate that, or how to give us
control. And I don't know how we're going to get that. Alissa Starzak: We like to think
about things in terms of harms and risks. And what is
challenging about privacy law across the world is: what
are the harms, what is the language we use to talk
about the harms, what are we trying to avoid happening to
people with the data that we are collecting and using.
Some of the harms are pretty clear. There are some harms
that are economic, when data is lost, or mishandled, and there
are harms that are a little bit harder to understand, turning a
camera on, when they didn't know that it was going to happen,
that can be a harm. There are emotional harms if data is lost
or used inconsistently with expectations. So there are known
universes of harms, and use-case harms that are pretty
extreme cases. And then there's an intangible
concern that I think people have. I
don't understand what is happening to my data and I'm a
little bit worried it is used against me in ways that I don't
understand. How do I navigate these waters? This is the
challenging question that I don't have an answer to.
And I think we're going to agree on this, or
disagree, for a second: getting to that question requires a broader
lens than privacy. Privacy is very important, don't misunderstand
me. But it is really just one framework, and it is too narrow a
framework to think through all the consequences of powerful
technology and uses of data on people and society, that's why
we have other law and frameworks.
We have civil rights laws, and we don't always import them into
the digital world. And we were at this inflection point where
we really need to start thinking through how do we import those
values and redlines that we understand in the brick and
mortar world into the digital world.
Eric Goldman: On the whole idea that we should — I am
cautious. But if I had one wish on this front,
in that sense, I agree with you: let's talk about all the
different things that we're trying to cover. Privacy is just
such a loose word that it keeps us from understanding what we are talking about.
Alissa Starzak: I want to clarify this concept of laws in
the digital world. Things like civil rights laws: if people's data
is used against them for a discriminatory purpose, or with a
disparate impact, and they don't see advertisements for credit
opportunities, for employment opportunities, for housing
opportunities, then we actually have a whole legal framework
about how to protect groups of people and how to protect
opportunities and access to opportunities.
And we also have consumer protection laws, like the Fair
Credit Reporting Act and the Equal Credit Opportunity Act
that I think apply in this space, and enforcers have
applied in this space, but enforcers continue to really
have trouble detecting conduct and knowing when to act on it.
So this raises some really interesting questions, because
it seems to me that there are all of these efforts going on,
that everything is about privacy, privacy, privacy. And
what we are saying is, we are couching things in the term
privacy that are not about private data. Maybe we are okay
sharing with Facebook, and then we have to figure out: is it the
consequences of what they are using against us, or, you know,
what do we care about there? And how does that fit with
what's happening in the regulatory space?
Are we misguided as we look at these efforts, the Commerce
Department's efforts? Are there valuable things that can come
from those efforts, and what do they look like?
Terrell McSweeny: I don't think they are misguided. And yes,
circling back to the privacy world for a minute: I think we
see an enormous amount of innovation in privacy frameworks
at the moment. As we were talking about at the outset of
this conversation, we have several processes happening at
the federal level, innovation at the state level, there's a ton
of discussion about what are the right policies and how do we get
the right balance here, and the cost/benefit analysis. And there
are — I see some consistency, which is a glimmer of hope here.
I see a lot of coalescing around four — I think of
them as control values, or consumer rights: access,
correction, deletion potentially — and an American
version of that is more focused around portability and
interoperability as well. We will come together. And there's
an acknowledgement that we need a regulator, like the Federal
Trade Commission, and a number of differences, based on the
frameworks that are on offer.
Eric Goldman: This goes back to what Terrell talked about. If
we are not specific about which harms we are trying to fix,
then it is easy enough to do something about privacy but not actually address the
harm that's the underlying motivation. And the constant
semantic conflation of privacy and security is accelerating
this. So a lot of people think that they are labelling a
privacy initiative, but it is cybersecurity. That may be okay
in terms of outcomes, but are we addressing the privacy
harms or some other thing? Alissa Starzak: In terms of
addressing harms, we think about enforcement mechanisms. So what
are the appropriate enforcement mechanisms, who are the
appropriate enforcers for whatever harms that you see? So
if you are trying to look for redress
for harms, how do we do that? Is it a federal enforcer, a state
enforcer, a civil action? You can imagine a variety of
different things that you do to address harm.
Terrell McSweeny: I think this is a big source of the debate in
Washington and at the state level. So for my money I would
say, if we think about consistency as a goal, then we
think about what we call preemption in law, which is the
federal government passes a law and then that is the law and all
the other laws are preempted by that or replaced by that.
And that is simply closing access to one
whole branch of government, which is a fairly big thing and ought
not be done lightly — and ditto access to the courts for
plaintiffs. The strength of the framework
will matter in order to justify that as the law of the land.
And then, when we think about harms, there's a variety of
different frameworks out there. There's penalties for failure to
comply, right, which are nice and predictable.
There's a robust conversation, how do we value data, are there
models that we can look to in trying to understand the value
of data that will help us be more predictive in what is the
liability around using the data, or having that data breached, or
having a privacy violation. And I think that's very challenging,
because there's, you know, the value of it
on the dark web, the legal value, the value a company might
place on it. Everyone is trying to figure out what is the value
of their data and there's a bunch of models on how to do
that. So I think that we are still really thinking through a
lot of those questions.
Eric Goldman: The answer to this question is the biggest lever
we have. The question of who the enforcement
mechanism is tells us who is best at enforcing.
And we often answer this question before we define the rights that fall under what
you call privacy. And so we also know that
there can be turf wars. The battle is not about who is
the best, but who is passionate about trying to get their turf
defined for themselves. And I do think that federal
agencies like the FTC would be a good place because of their
expertise with dealing with consumer-related matters. And I
also think that the worst mechanism of enforcement is
private rights of action, which we see abused in a wide variety
of contexts. Any security breach can lead to
lawsuits; that is accelerated under the new California law, and
those lawsuits do not redress consumer harm or improve business
practices, they are about allocating money. That's not a
way to solve the problems. And the other thing that is
interesting that was a little bit different than the question
asked, but I think it is central to the question here: do we want
regulation at the federal level, or federal and state
regulation, or state-level regulation, at least
on a general basis, or on a sectoral basis where there is no
federal law.
I have strong views about that. I think that it has worked
poorly to date; the state legislators do not have the
expertise to figure out what needs to be done. They create
variations among states, and thickets for any company trying
to figure out what to do, and a lot of times we see a lot of
regulatory capture at the state level that we don't see in
congress — I know that sounds weird, but as much as congress is in the pockets of
the lobbyists, in congress the lobbyists push
back on each other. At the state level, we see weird distortions where the
lobbying efforts are co-opted by very small voices.
So for me, the best outcome, this question is that we have a
federal law, it pre-empts state law, and the FTC is the
enforcement mechanism for it. Terrell McSweeny: I want to
heckle a little bit. In law school, you hear that the states
are the laboratories for innovation.
Alissa Starzak: They are the laboratory for democracy.
Terrell McSweeny: Even better. And in a world on privacy, we
are trying to figure it out. So shouldn't the states have a
role? So there's a way of rethinking privacy, and
California may not be the best example here.
Eric Goldman: You don't get to take away the story of bad
examples. I have a talk on why states are
bad at manufacturing privacy law. And they don't actually
run them as scientific experiments, and what tends to
happen is, once a state passes a law, we have regulatory
cascades, and other states copy the law before we get to
test it. And the experiments taint each other; so many people cross
borders that it is impossible for a state experiment to be
corralled without tainting the experiment of another state.
So when it comes to states and privacy, we had horrible
results. Alissa Starzak: I am heckling
here, because Terrell has strong views on states in this space.
Terrell McSweeny: You are making valid criticisms of
states in this space. And I see, at the federal level,
the job is to reconcile some of that. It is going to be a big
debate, and whether the framework is strong enough to
justify preempting state level protections that are stronger,
that's a real policy decision that people are going to need to
wrestle with. Alissa Starzak: I will ask one
more question, and that goes back to the federal level. If
we go back to the FTC and assume that we will have federal and
not state regulation, does the FTC have the capacity to do it?
So as you ask the question, who is the appropriate regulator,
the FTC is not a big agency. Terrell McSweeny: It is my
former agency, it is terrific and strong and it doesn't have
the resources to do this job. It has huge gaps in authority,
it doesn't have jurisdiction over common carriers and
nonprofits. And something we want is a consistent system:
we want everybody, whether they are in the analog world or are digital
leaders, to follow the same rules. So we have to fix the
gaps in the authority. We have to resource the agency, and one
of the things that the agency has been doing which is terrific
is bringing more technologists into its work, but there's not
enough of them in the work. You need a bureau of technology to
bring in the technical expertise. And it may need rulemaking authority to
do the right thing. Alissa Starzak: We will open it
up to audience questions. We have a bunch.
Audience member: We discussed privacy in the context of
consumers and business, but what about the government, which is more pervasive? How does
that mix into it? Eric Goldman: This is my
frustration with the privacy discussion: we get
frustrated that Google, Facebook, and Cloudflare know
everything about us. But if we worry about the private sector
and not about how the government is
using us and abusing the data, I feel like we miss
the forest for the trees. I don't have a conception of
how the government is putting handcuffs on itself, but we
should be demanding that. We are telling the government that
they need to respect us, and I don't feel that they do.
Terrell McSweeny: I'm for the government
respecting that, and I'm a proponent of strong encryption,
without back doors, as well. That is one thing
I will say about privacy policy. And when I talk about
privacy overcompensation in major markets, resulting in data
localization and bad innovation policy, from my perspective, one
thing that happens is it gives citizens in those markets a
false sense of their privacy rights and controls vis-a-vis
their governments. We see that privacy policy happening in
markets where people are heavily surveilled by the government.
And I think it feeds into a false sense of what their real privacy is.
Audience member: Do you think the government has the capacity to keep up? The regulators
aside, the thought of politicians understanding the
nuances of Google's search function, for example, is pretty
far-fetched. So to address this,
doesn't it fall to companies like Cloudflare to give a tiny bit of
privacy back, to actually invent and distribute the technologies?
Terrell McSweeny: Technology is going to outpace the law and
the government. We are seeing Moore's law in that respect; it
is exponentially outpacing. How do we get ahead of the risks
that are having other consequences — not just legal,
but reputational consequences that are harmful to business as
well? So I think it should
require a collective approach. That is central to policy making, and it has to acknowledge that it is lagging.
And it has to think that competition is a huge factor in
generating innovation and keeping markets functioning
properly. So when it is making regulation, what is a
competitive impact of that and is this a regulation that will
introduce more regulation into the marketplace or not? In
privacy, there's a tension there. If we are locking down
the data and creating walled gardens and real barriers to
moving the data around, YOENGE I don't think we are solving that problem.
Alissa Starzak: We are at time. I'm sorry we cannot take more
questions, but thank you very much. [ Applause ]
Next up: Cryptocurrencies:
What are they good for? By Nathan Wilcox and Adam Ludwin.
Jen Taylor: Thank you for joining us. I'm joined by
Nathan, the CTO of Zcash, and Adam Ludwin, the CEO of
Interstellar. Our topic is
cryptocurrencies: what are they good for? Without further ado,
we will cut to the chase. What are they good for? Actually, let's step
back: how do we define cryptocurrencies? We will start there.
Speaker: So the first example is Bitcoin, and a distinguishing
characteristic of Bitcoin is that anybody can participate in
the network by running a node.
There is no central point of coordination, but it has a
well-defined set of rules that the network enforces. And following on Bitcoin —
(speaker far from mic). Speaker: So cryptocurrencies are
an asset class that enables something new, something we
haven't had until the last few years:
censorship-resistant networks that layer on top of
the internet. The point of those networks is to allow everything
from money transfer to a form of cloud computing, albeit an encrypted one, to file
storage. So it is an asset class that gives rise to a
decentralized model of software, and the jury is still out on
whether it adds value. The world is wondering: we
can do that in a different way, but do we want to enable it, and
how is a decentralized piece of software helpful to users?
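The mechanics described above (anyone can participate by running a node, no central coordinator, a well-defined set of rules the network itself enforces) can be sketched with a toy proof-of-work check. This is an illustrative simplification, not Bitcoin's actual consensus code; the names `DIFFICULTY`, `mine`, and `is_valid_block` are invented for the sketch.

```python
import hashlib

# Toy stand-in for a network rule: a block is valid only if its
# hash starts with DIFFICULTY zero hex digits (proof of work).
DIFFICULTY = 4

def block_hash(prev_hash: str, payload: str, nonce: int) -> str:
    data = f"{prev_hash}|{payload}|{nonce}".encode()
    return hashlib.sha256(data).hexdigest()

def is_valid_block(prev_hash: str, payload: str, nonce: int) -> bool:
    # Any node, anywhere, can run this same cheap check: no central
    # coordinator decides validity, the rule itself does.
    return block_hash(prev_hash, payload, nonce).startswith("0" * DIFFICULTY)

def mine(prev_hash: str, payload: str) -> int:
    # Finding a valid nonce takes work; verifying it is trivial.
    nonce = 0
    while not is_valid_block(prev_hash, payload, nonce):
        nonce += 1
    return nonce

nonce = mine("00" * 32, "alice pays bob 1 coin")
assert is_valid_block("00" * 32, "alice pays bob 1 coin", nonce)
```

The asymmetry (expensive to produce, cheap for everyone to check) is what lets a network of strangers agree on validity without a central point of coordination.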
Jen: So what do you think cryptocurrencies are uniquely good at?
Speaker: Ensuring that people are following a set of rules,
given adversarial conditions. So people can opt into the
rules, and a really deep question is who defines the
rules and what should they be. And we see a bunch of different
projects exploring that space. And the key thing is — (speaker far
from mic). Speaker: Yes, building on that,
no one can stop me from sending Bitcoin to someone anywhere in the world. No one can stop me from writing
the program and having it executed on Ethereum. Whether
it is allowed is irrelevant. And that line of thinking leads to
criminal scenarios in our minds. But the reality is that this
notion of not being able to stop someone from doing something is
another way of just saying open innovation. And a lot of the
cryptocurrency community is optimistic fundamentally because
developers are iterating in a cycle time around financial
services that is two to three orders of magnitude faster than
the Wells Fargo innovation lab. And so often you will hear it is
about the potential, which is unsatisfying, especially when
asset prices level out and how do you justify it based on
potential. That's the reality: it is very early. And a
technology this early does not normally get this much attention;
we would not be on stage talking about any other technology this
nascent. And the only reason we are up here is because it
happens to be money, and so it tracks this, you know, this
basic human reality that people are like, well, maybe there is
something here. We can trade it, and a capital markets
phenomenon forms around it. And the jury is out on whether
that is going to stunt or fuel the growth of the idea of
decentralization; it is doing both, in cycles. A bunch
of developers rush in because of the boom in the asset price of
these tokens; when there's a bear market, people get
dispirited and they leave. So it is hard to say.
But it is in fits and starts, and as of next month, it is the
10-year anniversary of the Bitcoin white paper. So there
are serious retrospectives that are coming out that all of the
journalists in the room will put out on what we have to show for
it. Jen: If you were to feed the
journalists a piece of information, what would you want
them to highlight as: this is a great use case for
cryptocurrency, and this is what we accomplished in the last 10 years?
Speaker: A great use case is enabling internet applications
that involve payments or automation of financial
behaviors that act globally, where you want clarity
about the operation of those transactions regardless of
where they execute around the globe.
So there's this clarity, potentially, that the automation
can bring that might streamline applications that would
otherwise be more piecemeal around the globe.
Speaker: I'm sitting here listening to the military –. Speaker: Flying overhead –.
Speaker: That's the U.S. dollar.
So if I were writing a retrospective, I would be honest about
the fact that no one in the audience here is using Bitcoin
to buy lunch. No one is writing a program to run on Ethereum because it is better than AWS.
But, at the same time, we now have examples, or proofs of concept, of viable
alternative networks for payments and exchanges. Stellar
is a global open exchange; any asset can be exchanged on that
network, and Ethereum serves similar purposes.
There are more tokens for Ethereum than stocks traded on
the NASDAQ. And the point is we have an alternative. Even when
we are looking at Bitcoin, there is some comfort in knowing that
you can allocate a percentage of assets to something that is
similar in spirit to gold, it has a type of direct ownership
or control, there is no right price for it, it is a function
of supply, demand, and belief. You can store your value in that
instrument. And that's a meaningful back stop in a very
uncertain geopolitical world. So you don't have to be a true
believer to think that is powerful. And the world is
definitely better for having those alternatives and
the open innovation that is happening,
and for being able to exchange assets in a new way –. Speaker: Should you tell the cafe
serving lunch that they should add support for cryptocurrencies?
Would you advise them to do that, or not, and why?
Speaker: It will take a while to get there.
One conundrum: these different networks have
different rules, and there's an exploration going on about what the
governance for those rules should be. If you look at the Bitcoin
culture and ethos, it is very conservative: don't ever change
the rules, and that is what gives it the gold-like behavior. If you
look at something like Ethereum, the community is much more
interested in figuring out how to evolve the rules in a way that makes
sense and adapts to the future. And so there's a conundrum: if
you are using these for payments, those are different currencies.
And it is odd for consumers today to have
different currencies on their phone.
So there needs to be a step that somehow integrates these for
consumers to use them. Also, I think the really hard part
will be allowing consumers to understand the
tradeoffs between these rule sets, which strengths and weaknesses they have, and how
they relate or not to the dollar, or whatever local
currency they are using.
Speaker: Merchants pay 2 to 3 percent in fees to accept cards;
accepting payment is no longer free. That doesn't make sense in the age of the internet, where everything is in
an internet format, and tokens are that format for value.
Bitcoin is a token format, but it is the wrong medium for payment; we
will see dollar-denominated tokens that will make sense in
the merchant context. Can I survey the audience?
This just occurred to me, based on the question. I'm curious
what percent of the audience would want to remove
cryptocurrencies from the world, and how many of you are happy
that we have them. So, if you could snap your fingers and remove
cryptocurrency from the world, going back to pre-cryptocurrency days, would you? Anyone?
Who is happy that they exist, despite the mixed results
so far? That is compelling to me. There's the radical middle –.
Speaker: Yes, there's the thrill of cryptocurrency and the defeat,
and people don't understand what is going on. And we can move to
questions earlier than we typically do. For
the people in that middle place: I don't know where it
is going, and I'm curious to see us explore different ways of
contracting and negotiating things, even if it is not clear.
Where are we heading in cryptocurrency? If we fast-forwarded,
what is written in those articles?
Speaker: I want to say where cryptocurrencies are used today.
The first one is incestuous, but cryptocurrency
allows you to — (speaker far from mic).
There are problems there: a lot of the
behavior looks like behavior in the stock market, and
a lot of the behavior actually has been securities behavior.
But the fundraising mechanisms have allowed these open source
projects to bootstrap, projects that are global and have a
self-sustaining funding model. I have been in open source for
a long time, and in the past it was either a work of passion or it was
basically sponsored by commercial companies, like
Google, who may then drive it in certain directions.
And this is, like, the first time that I have seen where we
can have these international organizations, some of which
aren't even companies, many are nonprofits, or
companies, and they have this means to allocate capital for
their vision. That seems novel to me, and I
think that is pretty interesting. In other cases,
you can find other pockets of cases where censorship
resistance matters. One example: in Venezuela, where the
economy is terrible, there are some cases where some people are
able to mine cryptocurrencies, and so they are basically
providing a service for the cryptocurrency network and in
exchange for that, they get these assets. And these assets
actually are valuable to them even though they are volatile,
but less volatile than the local currency. And that juxtaposes
to dollars, which they want, but are difficult to get into the
country. The cryptocurrency, they get in immediately. So we
are seeing some of these edge cases where the value
proposition is starting to come through.
Speaker: In markets and places, too, where there are
fundamental or structural challenges in the economy or
financial system that prevent the reallocation of, and access
to, capital, these networks play by a different set of rules.
Speaker: I will take the 25-year anniversary.
So I do think that the internet — (drop in audio). Speaker: There is no
corporate structure that makes sense for that, but you can
easily do it in these types of mediums.
So that is where it goes first. We need more
design and product thinking in crypto to get there, because
most people in crypto don't talk to humans. If you want to
find those markets, you have to talk to humans to really understand where
they are, and it is about time for that. People are largely invested
in infrastructure, and now they are going to start getting on
–. Speaker: And Nathan and I were
talking in the green room about what the product management needs
of a cryptocurrency are. We talked about
where we think it is going and things like that. What are
two or three things that you need from the people in this room,
or this community, to unlock the next 10 years? What are
the things you would ask of this esteemed group of people?
Speaker: Help people manage their private keys better.
Speaker: I'm all in on that one.
Speaker: If you have thought deeply about passwords, authentication,
and identity, you are probably among the most important figures that
crypto can really use. The whole thing kind of falls apart when you build a
wallet: you have a password and a recovery phrase, and what's
the point of it? So we need to figure out how the endpoints
can be secure
and also have a good user experience. That balance
is the biggest product challenge right now.
Speaker: Creating the opportunity for –.
Speaker: Good security and ease of use is easy.
Speaker: And there's a fundamental challenge, this is a
new paradigm where users who want to participate have the
option of controlling their own private keys, and users are not
very used to that. We need tools to help them manage that
to protect themselves and to sort of understand what is going
on. Speaker: It is like a safe deposit
box system in the sky: you give keys to people, you
have assets in those boxes, you move them around, and they are
decentralized. As soon as you give the keys to someone else,
it is just an inefficient cloud service.
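The safe-deposit-box analogy comes down to one fact: whoever knows the private key controls the asset. A minimal sketch of that, plus one naive way to make backups less fragile (a 2-of-2 XOR split); the function names are invented and this is not any real wallet format.

```python
import hashlib
import secrets

def new_private_key():
    # 32 bytes of strong randomness; knowledge of this IS ownership.
    return secrets.token_bytes(32)

def address(priv):
    # Real chains derive addresses from a *public* key; hashing the
    # private key directly is a simplification for this sketch.
    return hashlib.sha256(priv).hexdigest()[:40]

def split_backup(priv):
    # Naive 2-of-2 XOR split: either share alone reveals nothing,
    # both shares together reconstruct the key. One way to avoid the
    # "single slip of paper" failure mode.
    share1 = secrets.token_bytes(len(priv))
    share2 = bytes(a ^ b for a, b in zip(priv, share1))
    return share1, share2

key = new_private_key()
s1, s2 = split_backup(key)
recovered = bytes(a ^ b for a, b in zip(s1, s2))
assert recovered == key
assert address(recovered) == address(key)
```

Lose both shares and the asset is gone forever; hand both shares to someone else and they own it. That all-or-nothing property is exactly the user-experience problem the panel is describing.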
Speaker: You need the Napster experience to drive
cryptocurrency. What led to Napster was that, all of a sudden, you
had a community of people who were like, I can go and
actually exchange these MP3s and I'm getting a lot of value. And
it appealed to technical people and to my mom. So there's
a question: what are the first one or two things that unlock
the demand to participate in a
community like that at scale? So with that, I will open it up
to questions. I will start right here.
Speaker: In the book Attack of the 50 Foot Blockchain, putting
your cryptocurrency in an exchange is likened to putting your
money in a sock under somebody else's bed. I gather you are not
in favor of exchanges, based on what you said. Can you comment?
Speaker: It is like putting your money in a sock under somebody
else's bed and then getting it back, if you are going to
exchange it for something and take it out. If you leave your
money in the exchange, that is true.
And that comes back to private key management, because if you
hold your own private keys and you use a
decentralized exchange (they are becoming more mainstream now),
then you don't have that problem. It is more like a
marketplace where you are transacting with your counterparty,
and neither side has to let go of what they have until
the network confirms the transaction and then it swaps.
Like in the movies: give me the girl, give me the money,
give me the girl. It never works, because how do you do both
at the same time? Blockchains solve that problem: you can
swap at the exact same time. The first exchanges didn't
work that way; they are a central counterparty that does
everything for you. And it takes a lot of product work to
make a decentralized exchange as good as a centralized one.
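The "give me the girl, give me the money" problem described above is commonly solved with a hashlock: both sides lock value against the same hash, and the secret that claims one side necessarily becomes public and unlocks the other. This is a toy simulation with invented class names, not any chain's real API; real hashed-timelock contracts also add timeouts so either party can reclaim a refund if the swap stalls.

```python
import hashlib
import secrets

def h(secret: bytes) -> str:
    return hashlib.sha256(secret).hexdigest()

class HashLockedBox:
    """Toy escrow: funds locked until someone presents the hash preimage."""
    def __init__(self, asset: str, lock_hash: str):
        self.asset, self.lock_hash, self.claimed = asset, lock_hash, False

    def claim(self, secret: bytes) -> str:
        # The network rule: only the preimage of the lock hash releases funds.
        assert h(secret) == self.lock_hash and not self.claimed
        self.claimed = True
        return self.asset

# Alice generates the secret and shares only its hash with Bob.
secret = secrets.token_bytes(32)
lock = h(secret)

alice_side = HashLockedBox("1 BTC from Alice", lock)  # Bob will claim this
bob_side = HashLockedBox("20 ETH from Bob", lock)     # Alice will claim this

# Alice claims Bob's side, which publishes the secret on-chain...
got_eth = bob_side.claim(secret)
# ...so Bob can now use the same, now-public secret on the other side.
got_btc = alice_side.claim(secret)
assert got_eth == "20 ETH from Bob" and got_btc == "1 BTC from Alice"
```

Neither party ever trusts the other or a middleman; the rule enforced by both networks makes the two claims stand or fall together.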
And also, with the early exchanges, anybody could install the
software and connect to an exchange, wherever they were in
the world. And as the sophisticated
innovations push forward, decentralized exchanges that
never place your funds at risk of theft as you execute
the trade are possible; they have been researched, and they are
deployed now. But meanwhile, the existing financial system is
adopting and integrating blockchain, and there are
exchanges that are well-regulated and maintain
licenses. So if you store your funds on
those, then it is very much like a brokerage or a bank: they hold
your stock certificates, or whatever brokerages do for you.
But you have the option to pull the assets out directly, if
you know how; right now, that means being a computer expert.
Speaker: How do you see this evolving, over 25 years,
to be compatible with regulation? Bank
regulations are pretty strict. It is very interesting, and I
love what is behind this, but it creates problems because there
are certain cases where you cannot see the transactions.
Speaker: Yeah, I think that people believe that a lot, but
it is — I don't think it is true.
So one of the exchanges that I was thinking of is called Gemini. It has
permission from one of the main financial regulators in the
U.S., including permission to trade Zcash. I have not
studied KYC or AML regulations that closely, but
they were created in a pre-blockchain world that had
cash and these other forms of transactions and money
transmission. Those regulations still apply here as well. And the
privacy in Zcash, the main
distinguishing feature, is that it allows you and the counterparty
to reveal whatever you want to
third parties. You can reveal transactions to your tax
accountant, or a bank, or whatever. The privacy protects
you from everyone else. Bitcoin, by comparison, is not
private, even compared to the banking system:
random people in Russia can see which transactions you are
issuing, or if you are a business, they can learn about
your internal finances and things like that.
So the privacy in Zcash is intended to re-establish what we
are used to in the real world with cash, or dollars.
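The selective-disclosure idea described here (the chain records something opaque, and the sender chooses which third parties get to see the underlying transaction) can be modeled with a simple hash commitment. Zcash actually uses zero-knowledge proofs and viewing keys; this commit-and-reveal sketch, with invented function names, is only a stand-in for the concept.

```python
import hashlib
import secrets

def commit(details: str, r: bytes) -> str:
    # The public ledger stores only this hash; the randomness r hides
    # the details even from guessing attacks on common amounts.
    return hashlib.sha256(details.encode() + r).hexdigest()

def verify_disclosure(public_commitment: str, details: str, r: bytes) -> bool:
    # A chosen third party (accountant, bank) recomputes the hash;
    # a match proves these are the real details behind the public record.
    return commit(details, r) == public_commitment

r = secrets.token_bytes(32)
details = "paid 500 USD-equivalent to supplier X on 2018-10-01"
on_chain = commit(details, r)  # everyone sees only this opaque hash

# The chosen auditor, given (details, r), can verify; everyone else learns nothing.
assert verify_disclosure(on_chain, details, r)
assert not verify_disclosure(on_chain, "paid 9999 to someone else", r)
```

The point of the sketch: privacy by default, with the *sender* holding the ability to prove what happened to parties of their choosing, which mirrors how cash works today.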
Speaker: I want a question from the radical middle.
Speaker: Given how much you disparage the traditional
banking system, what does it take for cryptocurrencies to be as easy to use as the
money in my wallet? Speaker: They are easy to use
now; we have Bitcoin wallets, and they can be easier than a credit
card. In many ways, sending and receiving Bitcoin is easier than
using your credit card, or as easy as Apple Pay or something.
The question is a deeper product question than interface ease of
use. When does it make sense? And that is the big, open
product question. Speaker: With that, we are at
time. So thank you guys very much, we
really enjoyed the conversation. Thank you. Speaker: We have a 15-minute
break; we are keeping it short because we are a minute behind schedule.
Grab coffee and snacks, take a restroom break, and we will see you
shortly. Thank you. Next: Weaving a common thread
through Silicon Valley's history, by Adam Fisher and Richard Tedlow.
>>SPEAKER: There was a magazine called
Wired; those were the most notable businesses, and they
were there because it was cheap. And they were actually there
because of technology. That we could have a magazine culture
in San Francisco is because of the Mac and desktop
publishing; you could make these underground magazines look like
magazines out of New York. That's what was here.
Speaker: This was the home of Monster Cable, which turned into
Beats Audio. So you can see the labelling for that. Both of
you have had an opportunity to interview a lot of the great
business leaders through the history of Silicon Valley. I'm
wondering if there is a common thread or theme that you have
seen across them, and I will start with you: what have you
seen that is common among many of the leaders that you have
gotten a chance to know? Speaker: I think, in Tim Cook's
words, it is that they have rhinoceros skin. And the second is that they have
an ability to think different. It is those
two things that the people who made history in Silicon Valley
share, and I find that quite impressive. Speaker: The book reads like a
screenplay, and in the process of interviewing all of those
people, were there things that stood out as things that made
sense? Speaker: Yeah, like with what
Richard said for sure, the big surprise when I collected all of
these stories from all the companies you put of, and should
have heard of, is in the various early — the earliest days,
people knew that they were creating history and they were
having an enormous amount of fun. There was an incredible
spirit of creativity that was going on that — and that's what
caused people to end up working all night, not because of the flash of, I'm
going to work harder. And it is almost a leading
indicator. And if everybody said, oh my
God, we are having so much fun, it was great because we are
creating something new every day.
That was go really well for the future of these companies.
Speaker: Your whole book is telling the story of the people
who won, although you tell the story of General Magic, which had an
incredible cast of characters pass through and
didn't turn into anything. Did you see a difference?
Did you talk to many of the people who didn't win, or
to the 100,000 people who left San Francisco in 2000 when the
bubble burst? Speaker: The modern history of
Silicon Valley is basically Atari in the early '70s to '84, when the
Mac came out, and then you have the internet from '95 on. So
there are essentially three main acts: the early
Atari/Apple act, the middle act from '84 to '95, which ended
with the Netscape IPO, and the internet act. Almost everything Silicon Valley
tried to do in that middle act failed. That's the General Magic chapter, VPL, which invented
virtual reality, and so on. There's a lot of failure. And
they were having fun, too, I admit it.
But, as it turned out, General Magic was maybe a failure as a
company, but they invented the iPhone, to a first
approximation. Speaker: It is almost like, I
don't know if you watched the TV show Halt and Catch Fire; it goes from
Compaq, to AOL, to video games.
And you see the story along the way and you have characters
that feel like they kind of continue to repeat through that.
How much of the story of modern Silicon Valley is written by
those people? Speaker: It is like 2,000 people
made Silicon Valley, in a way. And that's because a company like General
Magic was a failure in that a bunch of VCs lost their money,
but so what? It wasn't a failure to –.
Speaker: All the VCs in the audience just shuddered –.
Speaker: It wasn't a failure for the people who were
creating the technology, moving on, and getting incredible
experience, which they then almost literally brought to, you
know, Apple later. And so you see this kind of vector, a human
vector, that goes through all of these companies.
The names of the companies change, but the ideas are
associated with people, and at some point the
technology and the market are ready, you get product-market fit,
and you get another billionaire. Speaker: And Silicon Valley —
you were talking about the genius of it; there is not much silicon
made in Silicon Valley anymore. But you spent time studying
and getting to know Andy Grove and the team at Intel.
Can you take us back to when that team was forming –.
Speaker: It was when he was there. When I was writing
the book, among the people there was Arthur Rock. We had dinner
at the Lion & Compass, I don't know if you know that place. I
remember, when the dinner was organized, there were four of us
there, and I was instructed by the person who invited me to meet
Arthur to pick up the check. And now, this is strictly
confidential, but Arthur is wealthier than I am.
Speaker: He basically invented venture investing; he was an early
investor in Intel, Apple, and this, that, and the
other.
And it was at that dinner that he said to me that Intel needed Noyce —
(indiscernible) — and they needed him at that level.
And that helped me to understand why he was so much wealthier
than I am. It is true, Noyce was a magic man. You
talk about the beginnings of Silicon Valley: Shockley came out
from Bell Labs and wanted to be near his mother, and the weather was
better out here. Eight people went to work for Shockley
Semiconductor; Arthur knew these people. Shockley was impossible,
so they moved to Fairchild Semiconductor and were
labelled by Shockley as the traitorous eight. Noyce was not
chosen to be CEO there, and in June of 1968 he went to Gordon Moore of Moore's law.
And Noyce was the inventor of the integrated circuit,
Moore is the Moore of Moore's law, and Andy Grove is in many
ways a very unlikely character. He spent the first 20 years
of his life in Hungary, running first
from the Nazis and then from the
communists. During the Hungarian uprising he escaped
through Austria and managed to get to
New York, got a degree at the City College of New York, where
the engineering school is now the Andy Grove engineering
school, came to Berkeley, got a PhD in chemical engineering, and
had two choices: Bell Labs, or
to work at Fairchild with Gordon. And he was so enchanted
that Gordon Moore understood his thesis that he said, you're my
guy. A fascinating thing about Grove is that he thought he was
going to a start-up. He thought he was Intel employee number
three; he was number four. And it was Andy who helped
engineer the very important change, which is studied in
business schools, from a focus on dynamic random access memory
to a commitment to the microprocessor. And there is
one important moment in that story in which Andy and Gordon are
depressed. This is the difference between business
strategy and brain weight: these guys are smart, but
thinking about strategy can be different from just brain
weight. At one point, Andy says to Gordon: if the board
fired us and brought in a new CEO, what do you think he would
do? The assumption was that it was going to be a he; it was 1985
when the discussion took place. And Moore said, he would get us
out of memories. And Andy said, why don't we walk out the door,
come back in, and do it ourselves?
That thought experiment was a way of erasing history. Intel
went public on the 1103 memory chip; they were a memory company. When
that business started, they had 100 percent market share.
When they were overwhelmed by the Japanese, they couldn't believe it.
They were out of the business when they had three percent market share.
Andy became CEO in 1987, they focused on the microprocessor,
and the rest is Intel, or history, so to speak.
Speaker: Why did you need them in that order? You talked about the
value that Grove brought; why did you need Noyce,
and then Moore? Speaker: Noyce was the man and
the magnet, and he was well-known; Noyce knew the traitorous eight and
was one of them. And Moore was one of the greatest
technologists of the 20th century.
As for the 1103, everybody said it was a
miserable device. You needed somebody with a large brain to
go to customers and show them how to use it. Andy was a born marketer,
and it was under Andy that Intel Inside was born. As
a historian, I can say it is extraordinarily rare that a
component supplier becomes a channel commander, and they were
only able to do that because they branded the chip: Intel
Inside. Back in the day... you're young. You are young, for
example. And so are you. I used to be young,
and now I'm old. That's how this works. Back
when I was young like you, people would go into places like
CompUSA, which they cannot go into now because it is not
around, and they wouldn't ask for a Compaq, a Dell, or an
Osborne, but for a 386 machine. The microprocessor, the component,
came to define the product. That is extremely rare, and
that was Andy. So that's Noyce, Moore, and Grove.
Speaker: You have written a lot about the history of Silicon
Valley. We have listened to panels this
morning talking about how the worm has turned: technology used to be the
panacea, the thing that was going to solve the world's
problems, and now we are seeing increasing regulation and
negative attention. Are there times in the
past that you have studied that are instructive for what
we're seeing now? Or maybe: what is the advice? You had Sheryl Sandberg as a student. What is
the advice you would give right now?
Speaker: Sheryl Sandberg was a scholar.
Speaker: And a recovering lawyer, too.
Speaker: I just saw an article before coming on
stage; I didn't get a chance to read it, so maybe one of you can
help me out, but I understand that Facebook has not discovered the
cause of the latest data breach.
And the advice I would give her is to keep control of your
product. It seems to be out of their hands at the moment.
And as we have just learned with the Supreme Court hearings,
what teenagers and young people do isn't going to disappear;
it is going to matter a lot in their future lives.
There's a heavy ethical component to technological
advances, especially with the coming of artificial intelligence,
and people have to take ownership of
that. So that's the advice I would give.
Speaker: Adam, any advice? Speaker: I'm a historian, so I
have to take the long view, take a step back, and
what I see is that the worm has definitely turned. When I started
making my book four or five years ago, Silicon Valley could
do no wrong; it pulled us out of the 2008 recession and, you
know, they were great. And now I look at the mainstream media
and the discourse, and everybody in Silicon Valley is bad.
What I see is just another in a long series of boom and bust cycles.
Atari in the
'70s was making more money than all the Hollywood studios combined,
and a few years later it collapsed. There was another
collapse in '84, then the long dead period that we talked about
before the internet was opened up, then a huge collapse in
2000 and 2001. That's why –. Speaker: That's why I'm a
recovering lawyer and not a securities lawyer.
Speaker: Exactly. And now I see another big collapse, although
companies like Facebook and Google are too dug in to
actually collapse the way Atari did, for
example. But there's this rhyme; history
doesn't repeat, it rhymes. And we have a reputational
collapse. What we've learned from these
prior collapses is that they really push out a lot of the
people who are here for the wrong reason,
the people who want to get rich quick, the Harvard MBA students.
Speaker: There's no BS like HBS.
Speaker: Exactly. And you get back to the ideas that don't
need venture money and a business plan, just time
and a basement, coding and, you know, creating stuff.
So what am I talking about? I'm talking about Twitter, I'm talking about Facebook and
Napster. I know Napster was a failure, but we have Spotify now.
Speaker: Can I hitchhike on what Adam has been saying? You
mentioned failure more than once, and one thing that
needs to be emphasized is that sometimes failure happens in
Silicon Valley because people don't know what they have. The
classic example is Xerox, and another interesting example is
AltaVista. Can you tell the AltaVista story?
Speaker: In 1998, it was the leading search engine, built by
Digital Equipment Corporation. It could have been an advertising platform,
but there was only one ad on all of AltaVista; they built a search
engine in order to sell mainframes. What you needed to make
a search engine was a mainframe, and there were kids getting
their PhDs at Stanford who could not afford one. So they
wrote software that could link cheap computers together,
and it became Google. And you see, through your book, the
future is out there already. It just hasn't completely
materialized. I'm going to let people ask
questions in a second, but I had one more question, maybe for both
of you. If you were sitting in General Magic,
you saw the people who went on to build the iPhone. Where would
you sit today, and where would you be looking for those people
and those ideas to build whatever it is that comes next?
Speaker: You are asking me? I would sit in Cupertino. The
company has a lot of imagination; there are a lot of
people working at that company. I spent eight years there; I'm not
employed there anymore, and I'm not in their PR department. But I was simply impressed, not just by the vice
presidents: they have about as many vice presidents as Harvard
Business School has professors, and you were much more likely to hit somebody
smart on the Apple VP list than the HBS list, because there is
something called tenure: you can stay, no matter how dumb you
grow as you age. At Apple, you have to deliver
results or you cannot keep working there. You put that
together, and inside the organization, the people who
built the camera here are the best camera people in the phone
world on earth. And the way they work together,
the fact that this company is vertically integrated from
silicon to retail, all these different functions having
different schedules and the fact that they can still make it
work, and that they have resources: I would sit in Cupertino. Speaker: They became the IBM
that they sought to disrupt. Speaker: And they didn't want to
wind up in the same place. Speaker: Adam, where do you look
for innovation? Speaker: I would give the
opposite advice; I have never been a Harvard Business School
professor. I would stay away from Apple, Google,
Facebook, any established corporation, and I would go to
the underground. What do the engineers coming out of
Stanford think is cool, what is the coolest project possible? It
would be, oh, a cryptocurrency, a flying car, a
rocket company. The things that really end up succeeding are the things that
turn the young engineers on, almost by definition.
My favorite example of this is 3D printers.
No one needs a 3D printer.
But it is just so cool, and they figured out how to do it, and literally they — the first 3D
printer was using a bed of cat litter and glue to make things,
it was refined, and 5 years later, it is under everybody's
Christmas tree, or a drone, something like that.
I think it is about the passions of young technical people that
define what happened. And then, yeah, the market has
to find an application for a drone or for a 3D printer. But
they will, sooner or later. Speaker: Some questions, here?
Hold on, wait for a microphone. Speaker: Is the pace of
technology, and the business models built on it,
beginning to accelerate? What are your thoughts on how
companies and people that are playing with new technologies
can weave ethics and mindfulness not only into the
technologies, but into the business processes, the models, and the
way they connect with people? Speaker: I will be really quick.
The companies that don't are going to be punished for not
doing so. The biggest difference between
today, when we have our first trillion-dollar company, and
1901, when we had the first billion-dollar company, U.S.
Steel, is that today's companies (and books, we were discussing
this before) are platforms. They are two-sided markets;
a platform is where someone else builds. And if, as someone who owns
the platform, you say what happens on it is not your business, people are
going to wind up hating you, and that's not good.
When the whole world hates you, it is bad.
And when you came to Cloudflare, you made a decision. You
decided on a platform, you could have made a freedom of speech
decision that it wasn't a good idea, but I'm glad you
didn't. Speaker: It probably wasn't, but
we illustrated a point on why it
was dangerous to do something like that. But we needed to
have that conversation. Speaker: Okay.
Well, to put it this way: People who work in these companies, big
or small, are working 24-hour days, 7 days a week, and
sometimes they think they don't have time for ethics. And the
fact is they don't have time to skip it.
If you skip it, you pay a price.
Tim Cook was asked, what would you do in Mark Zuckerberg's
position? Does anyone know what his response was?
(Speaker far from mic).
For those on the live stream, Michelle yelled out, "I would
never be in this position." So you can agree with Cook or
not, but having one CEO of a company that is worth a trillion
dollars, and another worth a lot of money, that publicly is not
common around here. And that is because he believes privacy is a
fundamental human right, and long-term, that may or may not
work to Apple's benefit. Apple and Google have different
ideas about privacy, and that means that their internet
services businesses are very different. I should shut up.
Speaker: We are out of time. I appreciate you guys coming and
sharing this, and Adam is going to stick around to sign some
books. Thank you, I encourage you to read both of their books. (Applause).
Coming up next: Location has never mattered more, by Erik Gunderson. Speaker: I'm Jen Taylor and I'm
happy to be joined by Shawna Wolverton and Erik Gunderson.
Shawna Wolverton: When you travel with a satellite, it is fun; it looks like a
weapon. So you have some explaining to do, going through
security with a satellite. Jen Taylor: Explain what
you are doing with satellites today.
Shawna Wolverton: So we have built and launched 298 satellites; Planet has the
largest number of satellites in orbit, and we are imaging the earth in
a new way. There is a tremendous amount of action you can take when
you understand things in the world and what is happening
there. That's a lot of images.
We have 800 images of every place on earth over the last
three years. Jen Taylor: That is crazy.
And Erik, you are thinking about stitching together the
fabric of location. Can you talk a little bit about how you
have thought about that and how you can approach that problem?
Erik Gunderson: Yeah, when we first got started, we — I
started in DC and we were working on election monitoring,
and we had opportunities to help agencies, like the
State Department, to have context. And like you
said, location is more important now than ever. It has always
been important. It is just changing a lot.
So back in — the first time I went to Afghanistan was in 2009, and I
was there to map an election. And we landed, and at the time,
literally the map itself was just the crossing of two major highways. How are you going
to map an election when you don't have a map? And bit by
bit, we are making a map and the APIs of tech to make it easy for
developers for enterprise to put location and stuff into the
apps, and the cool part is we have a lot more data now. And
whether it is data coming off the cell phone (we have 400
million monthly active users) and the
traffic and imagery coming back. We can make our own data,
index a lot of data, and buy a lot of data. The more sensors
coming online, the richer the map is.
Jen Taylor: And we're talking about location, and the internet
is this virtual thing, and location is theoretically a very
physical thing. But how do you guys see those
things coming together? Why is location so important on the
internet?
Erik Gunderson: Just open up your phones, look at the start
screen, how many of those apps are adding context via location,
and back when we started, the map was a canvas to communicate
about something. Reality now is so much of that power of
location is starting to fade into the background. Knowing
exactly where you are. And pick an industry: in terms of
logistics, if I can get more things in more cars faster,
you just change the margins of the entire industry.
And the next thing you know, you get all of
these different modality classifiers for how you move.
On a fundamental level, this is about changing how people move
and now all of that is happening because we are connected.
Jen Taylor: You talk about being a sensor consumer, and
Shawna, you are a sensor producer. How do you think about
the sensor producer part of the problem?
Shawna Wolverton: Yes, they are adorable, my satellites, but
they are my data center in the sky, a really fun and cool one
with a rocket. At the end of the day, what is important is
the interesting data that comes out of the sensors that are in
space and on the ground today. When those things come together
and you can understand the dimensionality of any place on
earth, not only do you know where you are, you know all of
the different kinds of things that are happening about that
space that you bring together, you bring together place and the
kinds of things you can find through machine learning and
computer vision in satellite imagery, it paints a picture of
every place on earth. Erik Gunderson: You need a lot
of data sources to open that context. Look at Snapchat, pull
it up and look at the map. On average, one of our maps is made
from over 130 different sources of data. Whether those are
roads, or understanding part classification from satellite
imagery, all the way down to the visual component of the imagery.
Sometimes you don't need to see it, you just want to know the
context. These are the places, this is my perfect beach line here, and
all of this data is constantly feeding back. You all have some
incredible stats on how often the world is changing. But it
is the live feedback loop that has changed the mapping space
for a lot of us. Jen Taylor: When you think about
the data we have, we are drowning in pixels. We need
the tools to gain that insight. There's a whole number
of industries that can be served by the information that comes
from all of these sensors, but we need to
democratize the access to it. We need to pull all of the data
coming from all of these sensors and help our customers make
sense of it. Jen Taylor: You touched on the
sheer volume of data and aggregating and stitching
together the data and processing together the data. How do you
do that? It is a lot.
It is a lot. Shawna Wolverton: We bring down
6 terabytes of data a day from the satellites, and thankfully
the earth is 50 percent cloudy every day, so we don't bother
bringing those images down; a big fat image of clouds is not that
interesting to very many people. So it is a tremendous data
pipeline that we operate to bring down that kind of data and
to process it. It is really intensive.
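The downlink triage Shawna describes, skipping scenes that are all cloud before spending bandwidth on them, can be sketched like this. This is a hypothetical illustration, not Planet's actual pipeline; the scene records, the "cloud_fraction" field, and the threshold are invented:

```python
# Hypothetical downlink triage: keep only scenes whose estimated cloud
# cover is low enough to be useful. Scene records, the "cloud_fraction"
# field, and the 0.5 threshold are all invented for illustration.

def scenes_to_downlink(scenes, max_cloud_fraction=0.5):
    """Filter out scenes that are mostly cloud before downlinking."""
    return [s for s in scenes if s["cloud_fraction"] <= max_cloud_fraction]

captured = [
    {"id": "scene-001", "cloud_fraction": 0.05},  # mostly clear: keep
    {"id": "scene-002", "cloud_fraction": 0.95},  # a big fat image of clouds: skip
    {"id": "scene-003", "cloud_fraction": 0.40},  # partly cloudy: keep
]

print([s["id"] for s in scenes_to_downlink(captured)])  # ['scene-001', 'scene-003']
```

If roughly half of all captures fail a filter like this, the same daily downlink budget carries about twice as many usable scenes.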
Erik Gunderson: On the satellite side, a lot of us in this room
are working with big sets of data and they are changing in a
timely manner. The sheer size: nobody talks about how fat this
data is. Working with large sets of data, at volume,
that are heavy to transport is crazy. It is really
complicated to move this stuff around to do processing.
Shawna Wolverton: In the beginning of Planet, we were
making deliveries by shipping hard drives.
Erik Gunderson: And for our first shipment of imagery, FedEx
was our API. And no matter how much faster this stuff gets,
there are fundamental constraints. The biggest change
is probably in the last two years, where a lot of our data
processing now is truly happening on the edges.
So three months ago, we partnered with ARM and launched
our vision SDK, which is, instead of making apps from GPS
sensor data, we can turn on the camera now. And on the device,
you are able to do the classification, segmentation, and
detection on the chipset. So instead of putting hundreds of
thousands of dollars of equipment in the trunk of the
car, what can you do by empowering other developers to
make interfaces through the phone, or through really cheap
hardware? And for us, to be able to process that data on the edge
of it and pass up the changes and the stuff, I think that's
where things are about to get interesting.
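The edge idea Erik sketches, classify on the device and send up only what changed, can be illustrated with a toy example. The classify() stub stands in for a real on-chip vision model; nothing here is the actual vision SDK:

```python
# Toy sketch of on-device processing: run a classifier on every frame,
# but upload only the frames where the detected label *changes*.
# classify() is an invented stand-in for a real on-chip vision model.

def classify(frame):
    """Pretend model: label a frame from its (string) contents."""
    return "construction" if "cone" in frame else "clear-road"

def changes_to_upload(frames):
    """Return (frame, label) pairs only where the label changed."""
    uploads, last = [], None
    for frame in frames:
        label = classify(frame)
        if label != last:  # the on-device "diff": skip redundant frames
            uploads.append((frame, label))
            last = label
    return uploads

frames = ["road", "road", "road with cone", "road with cone", "road"]
print(changes_to_upload(frames))
# [('road', 'clear-road'), ('road with cone', 'construction'), ('road', 'clear-road')]
```

The point of the design is bandwidth: five frames of video become two uploaded changes, which is what makes cheap hardware in a car trunk viable.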
Shawna Wolverton: Being able to run a diff on a visual file would
be really cool. I will throw that out to the audience: put
that together by the end of the day.
Jen Taylor: So it is a vast amount of data, and location
is incredibly powerful; you think about mapping elections,
people talk about planting crops and things like that. Location
is an incredibly powerful substrate for humans and the way that
we engage in the world. And one thing that is interesting,
looking at the work that your organizations have done, is how
you are able to aggregate and democratize that data. Can you
talk about that location data and the access to it?
Shawna Wolverton: There's a relationship for people doing
science; we have a relationship with the academic
community, who are bringing in climate change insights. We have
people that are using our archive data to understand what
is happening in a region with devastating consequences, and
being able to use the information that is in the
satellites to change policy and
decision making. Erik Gunderson: When people
said, are you going to democratize this data, the idea was just
to open the data. And opening the data is only the
first step, you have to make it accessible and usable.
And so look at Florence, from the hurricane side.
So right after a disaster, they use our programmatic upload API,
process the imagery on the fly, and expose it to everybody.
And another partner, an aerial imagery company
in the U.S., goes after the
insurance market: they process the data and expose it via an API,
and it is in the iPads of insurance adjusters, and it has
been processed to detect where the storm paths went, to look for
fraud. If you are in a storm path area
and you are reporting something, your claim is put to the top.
So they are able to get the money out to the people whose
property was actually damaged faster, while reducing fraud.
That's an interesting point in terms of democratizing and
making it accessible and it comes to an API processing case.
Jen Taylor: You don't need a geo location expert to say that
there is probably roof damage. That's the really interesting
thing: the idea that you can take something that was in the hands
of specialists, with special tools. And you didn't even
have imagery; so much of the imagery in the world is held by
governments. Within our lifetime, speaking of governments,
satellite imagery was shot out the back of satellites. And now we are
all getting imagery. So the idea now that not only do governments
have access to data and not just the U.S.
government, one of our core foundational policies is that it is
accessible to everyone. So we think the more people that
have information, the more secure the world is.
And there's an inherent tension when we talk about data, access,
and stuff like that, and securing it. At the same time,
we have the notion of privacy. How do you think about that,
the proliferation of global sensors, which most people are not
aware is happening?
Shawna Wolverton: Sensors come in various degrees. Our sensors are three meters; every
pixel on the screen is three meters on the ground, and people
do not usually show up there. So we ameliorate a lot of
privacy concerns. And yes, I think it is a
critical part. And the commercial space is getting
closer and closer to the ability to identify things, and there
will be interesting questions that we have to answer.
Erik Gunderson: There's a great new book, Never Lost
Again. The CMO of Keyhole, the company that Google bought
to start their mapping initiative, tells the behind-the-scenes
stories of what it was like getting that off the ground. The first
time that Keyhole, or Google Earth, went live and people started
zooming in (I forget what the resolution was), they
started getting calls like, wait, everybody can see my property
now? And that's just 10 years ago.
So I think all of us are starting to have a new
appreciation of what is this amount of data out there?
Whether you are contribute to the data, and there's a lot of
talk about data, if I'm putting it into the platform, how is
that being used? You are making data and being part of a world,
you are contributing to that, how is that used?
For us, on the resolution side, it is
much more detailed. So you hear things like PII, personally identifiable
information, as defined by the U.S. government. That is things like
Social Security and credit card numbers.
With location, you know where somebody was last night. And anything associated with
collecting data, with PII, it is sensitive data. And the only
way to protect folks is to anonymize from the start: don't
actually store personally identifiable information. Trim sets of data
by default so you cannot, even in aggregate, bring it back.
And we happen to be in the business of making maps; we
don't do advertising, we don't collect a temp ID, and the
whole idea is to get data in aggregate. If you design a pipeline
where you lose a lot of data, you still have so much in mass that you
can design something that is secure by default. That is our
SDK. Everybody building an app: are you building in those levels of
protection, are you appreciative of that? I think this is going to be a
conversation that all enterprises and developers are
going to be having over the next couple of years.
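The "trim by default" idea Erik describes can be sketched in a few lines: drop the ends of every trip before anything is stored, so the stored trace cannot pinpoint where a person started or ended. The trim count and the coordinates are invented for illustration; a real system would trim by distance and do far more:

```python
# Minimal "trim by default" sketch: discard the first and last few GPS
# fixes of a trip before storage, so stored traces cannot reveal trip
# endpoints (home, work). trim_count and the coordinates are invented.

def trim_trace(points, trim_count=3):
    """Drop the first and last trim_count points of a trip."""
    if len(points) <= 2 * trim_count:
        return []  # trip too short to store safely at all
    return points[trim_count:-trim_count]

trip = [(37.77, -122.41 + i * 0.001) for i in range(10)]  # 10 GPS fixes
print(len(trim_trace(trip)))  # 4: the 3 fixes at each end are gone
```

Discarding data at ingestion, rather than protecting it after the fact, is what makes the "secure by default" claim possible: what was never stored cannot leak.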
Jen Taylor: We will open it up in a couple minutes for
questions, but I like to ask folks in your position,
as you are surrounded by thought leaders: what request do
you make to people in the room? What do you want them to do for
you, for location, or for the industry?
Shawna Wolverton: The amazing opportunity in the room is to
just let your brain think about what you could do with
understanding change in the world. And I think there is
tremendous opportunity; Planet
cannot think of all of the possible use cases there are in
the world. Be creative, we have APIs, experiment, think about
how the world can change. Erik Gunderson: Everybody here
should ask: what part of my business is impacted by
location, and what efficiency can I get from that?
And you nailed it, we have APIs; we put out the LEGO building
blocks to allow you, as a business, to configure an
experience that transforms your efficiency into something that
is measurable. There's a whole new era of
efficiency you can bring there. You do the math on this, and you
say 15 seconds here, 15 seconds here, on every single ride; how
much is that? Wait, that's 4 of them, I'm at a minute. Can I fit
another ride in? Can you imagine
what you can fit in over the whole drive? If you are doing a
million rides a day, and you improve say only 10 percent, you are talking
$200,000 a day. This stuff adds up and people in this room have
that power to do adjustments to make things much more efficient.
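That back-of-the-envelope math can be written out; only the ride counts and percentages come from the conversation, and the dollar value per saved minute below is an assumption chosen to land near the $200,000 figure, not a number from the talk:

```python
# Worked version of the routing-efficiency arithmetic. rides_per_day and
# share_improved come from the conversation; value_per_minute is an
# assumed figure for what a minute of saved drive time is worth.

seconds_saved_per_ride = 15 * 4   # four 15-second improvements = 1 minute
rides_per_day = 1_000_000
share_improved = 0.10             # "say only 10 percent" of rides
value_per_minute = 2.00           # assumed dollars per saved minute

minutes_saved = seconds_saved_per_ride / 60 * rides_per_day * share_improved
print(f"${minutes_saved * value_per_minute:,.0f} per day")  # $200,000 per day
```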
And that is not only good for your business, that is good for
the planet. Jen Taylor: And as I'm sitting
next to you and listening to your asks, I'm struck by the
fact that you built these really robust platforms that services
in different ways and again opening it up to the people in
the room to take the platforms and run with it.
Erik Gunderson: We came at it because we needed these
tools to do our job. We were bootstrapped; we did not take a
dime of outside funding until 5 years ago this week.
So I was like, wait, what tools didn't we have access to?
So if we expose it via an API, a lot of people need the same
functionality. Jen Taylor: It is amazing the
work your organizations are doing. So we will turn it over
to the audience. Questions?
Speaker: Do you use vehicle license data? There are —
(speaker far from mic) — to record license plates all over
the U.S. and they have sites all over now. Since we cannot opt
out of displaying the license plate, it is a difficult
situation for the consumer.
Shawna Wolverton: We are not using license plate sightings.
There's interesting technology around license plates for boats.
It is interesting when you put it together with imagery data:
people can take their license plate into the middle of the ocean
and do illegal fishing, transshipments that are
embargoed. There is interesting data when you put that together,
but no. Erik Gunderson: We don't either.
You are hitting on the point that you just brought up.
We are now creating data trails that did not exist before.
And I think that there is some fascinating conversations about
that, because we here could be having the conversation about
security and best practices and collective data. And are the
right folks in the government thinking about that, and are the
right security measures brought in on that? Where is that going
to go? These are important questions and we are starting to
get a vocabulary to have that conversation.
Jen Taylor: Other questions? What are you navigating today in
the work you are doing, and how are you participating in the
creation and the work of those standards?
Shawna Wolverton: We are regulated by the U.S.
government, NOAA regulates our work in space.
Jen Taylor: I cannot just fire a rocket into space!?
Shawna Wolverton: It is amazing the liberty we have to operate
in the world. There are no limits to imaging the earth, we
subscribe to the space treaty around space junk; we were
talking about deorbiting our satellites responsibly. But for
the most part, our satellites, because of their resolution, are
not highly regulated.
Erik Gunderson: We respect imagery resolution requirements
in places like Israel, and we are fully operational, legally, in
China. That means we have acquired legal data through a
partnership, run through local infrastructure, and we had our
cartography signed off by the survey department. We even had an ISBN number on the bottom of
our maps. So depending on the region, there's a lot of local
details to get right. Jen Taylor: That must be
challenging, how you can comply with those at scale as you are
stitching together vast quantities of data.
Erik Gunderson: We allow developers to pick what they
agree with and you can expose that world depending on the geo
IP to your users, and those are shifting country boundaries,
those are different places, and this is — it is really raw
stuff. Jen Taylor: Yeah.
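Serving a different boundary "worldview" by viewer geo-IP, as Erik describes, can be sketched as a simple lookup. This is an illustrative sketch, not Mapbox's actual API; the layer names and country codes are invented:

```python
# Hypothetical worldview selection: pick which disputed-boundary layer
# to serve based on the viewer's geo-IP country code. Layer names and
# the country list are invented for illustration.

WORLDVIEWS = {
    "CN": "boundaries-cn",        # viewers in China see one set of lines
    "IN": "boundaries-in",        # viewers in India see another
    "default": "boundaries-intl"  # everyone else gets a neutral rendering
}

def boundary_layer(viewer_country):
    """Return the boundary layer for a viewer's geo-IP country code."""
    return WORLDVIEWS.get(viewer_country, WORLDVIEWS["default"])

print(boundary_layer("IN"))  # boundaries-in
print(boundary_layer("US"))  # boundaries-intl
```

The design choice is to push the decision to the developer: the platform ships every worldview, and the app picks which one each user sees.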
Shawna Wolverton: It used to be that NOAA wanted us to send
them every image we took from space. After a while, they
said, no thanks, it's cool. Jen Taylor: Yeah, my 800 photos
of the earth, except when it is cloudy. Thank you very much.
Thanks. [ Applause ]. Coming up next: Stopping the
global spread of disinformation by John Scott-Railton and Julie Owono.
>> We are here with John Scott-Railton
and Julie Owono. We are talking about the spread
of disinformation in the United States, and there are stories
internationally that we need to talk about as we think about
disinformation. I will turn it to you to get some examples of
what we are seeing.
Julie Owono: Thank you, John and Alissa, for the chance to discuss this
important issue. So Internet Without Borders works on cyber
development in the world, particularly in emerging markets
and more specifically Africa. I must say that when you talk
about disinformation, you have to remember that there are
different forms that disinformation can take. We
think in terms of the 2016 disinformation campaign, which
is sponsored by a foreign state. But we tend to forget in the
conversation the campaigns that are sponsored locally and
nationally by government to push certain messages and certain government information
that are not grounded in facts about the country.
I will give an example that is old but makes sense with the
rest of the conversation that we had. In 2011, we worked a lot
on a country located in central Africa,
and at the time nobody was talking
about bots. But that government was hiring a European —
I can't remember exactly, but it was a private sector company that
was selling certain products to influence opinion and
influence the image that we may have of certain
countries, or of certain information, on social media.
And the social media platform that was used at the time was
Twitter. There were tons of messages spread by
civil society about what was happening, and they were flooded
with messages from bots, accounts from people from India or the
Philippines that had nothing to do with Gabon, tweeting a lot
and flooding the hashtag with false messages. This is a
campaign that comes to my mind. We also tend to forget, in
the conversation, the campaigns that are not opposing
governments, but that happen in very tense political contexts,
for instance a civil war, which opposes two polarized
communities, in which the disinformation campaign will
come on top of other very dangerous hate speech
and trigger harmful consequences in the real world.
These are some of the — the ideas that came up to my mind
when we first discussed the issue.
John Scott-Railton: The
Citizen Lab works across groups, and sometimes they are phishing
or malware campaigns that end up in disinformation. And to give
an entry perspective here, those at the Oxford Internet Institute have
been tracking disinformation and misinformation online. And in
2017, they reported that 48 countries showed some evidence
of systematic, organized, disinformation.
30 of those cases were around elections and just to put a big
highlighter on Julie's point, most of that is domestic-facing.
When we think about disinformation (I'm an American), I think about 2016,
and the narrative is that something foreign came in and
messed with our stuff. That leads to wanting to seek
authority, like the government, to help us regulate it. The
problem is, what happens when the government is the entity
doing the disinformation? And, by volume, I think the work from
the Oxford Internet Institute will tell you, it is
governments doing it themselves, often around elections.
Speaker: The reason that the 2016 campaign got so much press
is it was happening in the United States and people were
not expecting it to happen here. It seems like — we talk about
disinformation, we're now talking about it in the United
States and it sounds like this is a systematic problem, and
maybe we're not actually in that new of a place in some ways.
Julie Owono: Yeah, this is very important. What we think at
internet Without Borders is that we are operating on a global
platform which is accessible virtually everywhere, but by
everyone. It is important when you are in Silicon Valley and
developing a product, we need to pay attention to signals that
are 10,000 kilometers from where we are. Products, and the
reaction to those products, get tested before being brought
back home. And we learned afterwards that there had been
tests by Cambridge Analytica in countries that had nothing to do
with the American or the British contexts, but they were
taken back home by certain companies and used by certain
governments. So it is important to — yes, it
is shocking when it happens in the United States because, well,
everybody would imagine that our institutions are strong and
everything is set to avoid these types of things from happening.
But no, nobody is protected against this; it is a global threat and it
should be perceived that way. To that extent, it is important
to pay attention to signals even if they are located 10,000
kilometers from home.
Alissa Starzak: What is at stake when we think about what
we are trying to do? John Scott-Railton: I'm nervous
about disinformation, and everyone else is more likely
to be subjected to disinformation than you. And what scares me is
actually the first response to this. We have a disruptive set
of technologies, some of it is new. The scale of bots,
probably new. Microtargeting, probably new, on a historical
timeframe. There are other things like that that feel new,
different kinds of anonymity, new. In my book, disinformation
is about selling ideas. So I cut it like this.
There's a kind of disinformation that is about gumming up the
conversation, a lot of automated stuff, that makes it hard for
voices to come out. And that feels like a much more
understandable problem. And there is another half of it, I
look at it like this. And disinformation is marketing, the
product is feelings. And the profit is behavior.
Ultimately, we are working in an environment where much of our
behavior is happening on essentially marketing platforms,
platforms designed to deliver behavior to advertisers to buy a
product. And my concern is that, as long as we're inhabiting
those platforms, the imagination of people who are going to
manipulate us through those platforms is always going to be
moving faster than the companies that are providing those platforms.
Alissa Starzak: That raises difficult questions of how to
deal with this. The question is: what are you
trying to do, and what are you trying to accomplish?
As you think about solutions and things you can do, how can you get around a
ploy that plays on what you are feeling?
Julie Owono: Propaganda is not something new; it is behind
disinformation, and we are human beings, and it has been here for
quite a long time. So what we think is that it is
very important to be, as I was saying, paying attention to certain
signals, to remember that history repeats itself, unfortunately,
and that, well, when there is innovation,
there are risks; we need to be aware of this and not
think only about how constructive we are, only reacting
when there's a problem. It costs more, and it makes us
waste a lot of time. Instead, it is important to be proactive
and to understand that, well, there are threats,
and that human beings, faced with innovation, react the same
way. So being proactive is quite an interesting — when I
say being proactive, I'm talking here to product makers who are
located here, for instance, in Silicon Valley. Somebody was
saying that in Silicon Valley, people live in a bubble. It is
time to break the bubble and understand that the global
availability of the tool and the effects that it has, and
to anticipate a bit more. In the tech
sector as well, when you are going to launch a product,
you should be thinking about the impact that you can have from a
human rights perspective, a social
perspective; these are some of the ideas
that are flowing around about impact on products. The
benefits of information products are also tangible. So
you end up in a world where you have to ask, how do you assess the negative
impacts while also weighing them against the potential positive
impacts of disseminating information, which is a powerful
tool? So the internet, as a tool, is an important piece of
what we are saying.
John Scott-Railton: It is a reality, in the U.S. there's a
big push to have a conversation about fake news, and this is an
oversimplification of the problem set. The idea is that the
real problem is that people are saying false things, a lot of
wrong things being said, and we need to telegraph to people that
there are credibility problems, and maybe that things are fake.
To me, this is like the easiest possible solutionism and it is
not going to work. Disinformation is about
marketing, if you want to understand the future of
disinformation, you want to understand the future of
advertising. Disinformation, as it existed in the U.S. in 2016, is
not what it looks like today; it is not what it looked like last
year. When we try to block the proliferation of
stories, it brings us back to a scary regulation of
speech. So this is a country that had
problem with disinformation. Cambodia has a disinformation
law. It is new. In 2007, they got rid
of their defamation law, under pressure from the government.
It was used to investigate the state to limit corruption. And
the first use was against a person investigating corruption.
So my concern is that states are moving very quickly to talk
about trying to block the fake news. Over 30 governments have
some kind of a regulatory thing in play, or a law they are
working to pass, or have already passed about fake news. And to
me, this is ultimately extremely dangerous. When we talk about
that with respect to technology, too, we have risks. So more and
more of what is called disinformation now is happening
in darker places, like on a secure chat platform.
And my concern, if the stated concern is about blocking the
proliferation of fake stories, we are going to erode encryption
and a lot of flexibility that users have right now, and do it
in a way that is fighting a war that is already old. Julie Owono: And there's the
important step that governments are taking, to censor access
to the internet itself, justifying it by
the fact that, well, it is harming
informational security. So the stakes are high, and it
is about connectivity being a stake here and specifically
connectivity in zones where we are saying that people are yet
to be connected. And it is quite frightening.
And on the — what you just said on, you know, regulating speech
itself, it is not only in, you know, more or less repressive
country. Even in democracies, this is — there is debate on
that. I'm thinking specifically, I live in France
and some of you may have followed this summer, there was
a big scandal because a body guard of the president was
accused of molesting protesters. And there was a study that was
published which alleged that it was initially a campaign that
was put out in the French public debate by foreign states, or
actors concerted by toren government and specifically the
(indiscernible) government. And it was a way to introduce
the idea it may have — and although the story was true, the
way it was put out may have been used to destabilize the
government and the institution. Whereas, no, it was actually a
very important public debate that triggered an investigation,
a nationm investigation, and that makes the democracy and the
French democracy healthier. So on that idea, regulating speech
can have even democracies doing what repressive countries were
more familiar with in that sensorship.
Alissa Starzak: How do we break it down? You have foreign
interference, which comes in robust ways, and disinformation,
which you kind of want people to know about. You have the
potential for the government itself to get involved to
manipulate public opinion, which you also probably want your
people to know about. And then you have the things on the
margin, generating unrest for whatever purpose. Do you deal with
them all in the same way? How do you think about solutions?
John Scott-Railton: I think one place where we are right now is
that we are very, very early. We have connected much faster than
we could secure, and we have connected a lot faster than our
norms and social institutions know how to regulate behavior. A
lot of these are old problems in new digital clothing. We made a
big mistake: we have forgotten how to talk about sociology and
how it influences the relationships that people have with
technology. Every time I think about this, the first step is
media literacy, if we could just teach people that. But then I
stop myself, because there are a couple of problems with this.
Media literacy is what you teach to other people who you think
don't get it, because you think that you actually know what is
really going on, right? It is like, how do we get those people
there? And I will highlight something interesting, inspired by
reading someone like danah boyd, a critic of society: for a long
time, Russia's tagline was question everything. And I think the
challenge is, if we start talking too much about educating
individuals: individuals and their thinking should be the last
resort, in a way. In security, if we try to teach people better
security behaviors, it doesn't really work. These are
public-health-scale problems, and they have to be addressed that
way. I don't know if you have anything else?
Julie Owono: I disagree a bit with you. That's the first time we
disagree on something. You have to consider that there are parts
of the world where, up until very recently, the only source of
information was a national state media, and suddenly they are
flooded with information, located on one platform, with sources
that come from almost everywhere in the world. And you also have
to think of these same citizens: we were born with the internet
and evolved with it, whereas they are suddenly faced with a tool
that allows them to speak with anyone. We have to put ourselves
in the place of the individual, not say that we know better. I
don't think the issue necessarily has to be seen that way;
rather, it is a challenge to receive information in the 21st
century. So how do we deal with that? Probably education is not
the right word, because obviously we can all educate each other
on receiving information, as you rightly said. But at least we
can have more conversation on the media themselves; there are
conversations here on conspiracies, and it should be the same in
other parts of the world. And the initial utopia of the internet
was to enable individuals to have access to information and to
be able to make the most of the information they received. So
that's probably not a perfect answer; it is complex, and there
is not only one solution.
John Scott-Railton: On this thing that I said about marketing:
the profit is the key, if you think about it. Not all
disinformation campaigns work; a lot of them don't. In my work,
I come across crummy disinformation all the time, a lot of it
from states, and a lot of the time it falls flat. If you think
about it in product language, the market research is really bad,
and the authoritarian comic-book conception is not going to
work. That said, the other part of this is that if you are
selling a product, you are selling it to a market that is
interested, and a lot of the stuff that was sold in 2016 was
product with market appeal: racism, class differences, this sort
of stuff, which pre-existed not only recent Russia but even the
Soviet Union's efforts in the U.S. These were bigger problems,
and so to me the solutions are at societal scale; a lot of them
have to come through education and where people are taught.
Where they are not going to come from, though, is the next
couple of years.
I think it is a fact of the matter that there are elements of
addressing misinformation, some of them technical, some of them
societal, and there is no reason to expect that this is a
solvable problem. In part, remember that we were talking about
elections at the beginning of this. There are going to be
elections every couple of years, and if you look at a lot of the
countries where people are rolling out disinformation campaigns,
who is running them, who is paying for them? Typically it is
parties. And who do they pay? They pay political consultants and
marketing firms. These are not mysterious disinformation
operators; we are talking about the manipulation and shaping of
human opinion. A lot of times, I think, people in the U.S. and
North America are so exercised about it because it feels foreign
and scary. The story of manipulating public opinion to achieve
ends was always happening. So what is the job of the casino? The
cheap food, do whatever you want, just don't walk out the door.
Where is the door? Social media is the same way: the objective
of a large company is to get you into the casino and keep you
there. People are not happy, for the most part, with what they
are dealing with, and they are not happy when they spend a lot
of time there. But they stick around. If you think about who has
really good market research, it is at the platform level, not at
the level of the individual marketing firm selling stuff. And
the platform has much more subtle ways of shaping behavior, and
yet we know that feelings and sentiments can shape things like
electoral behavior. So what to fear is not the disinformation,
but the subtle affective manipulation over time. This is much
scarier, because the scale is greater than any specific sale of
a bad fact.
Julie Owono: I was going to end my comment on a positive note,
but we are still going to take questions. I'm going to turn to
the audience.
Audience member: (speaker far from mic).
Julie Owono: There is hope, that's a very important question. So
China, as you know, has a relationship with certain countries in
the world, and with a specific continent. We know that China is
very proud, they seem to be proud, of having built an internet
which is controlled but that nevertheless allows big companies
to make a lot of money. We have talked about some of them this
morning.
And that idea is starting to flow into the discussions between
the Chinese government and its partners in Africa. We have
recently heard about AI, with China a leading country
researching this specific issue. And we have learned that, to
counter the problem of diversity and bias in artificial
intelligence, what Chinese companies are doing is going to
African countries to get data sets on populations which are more
diverse compared to China and Europe. And on a continent where
there is no privacy law, where there is almost no regulation
when it comes to internet issues, it is very easy to imagine
that it is possible to build the kind of social credit or
surveillance state we see in certain parts of China today. It is
a source of worry, and it is why it is urgent for companies in
the U.S. or Europe which are selling products there to talk
about this, to anticipate the threats and to anticipate the
issues. Basically, we are at a crossroads: certain countries
that are not yet connected are choosing an open and free
internet, or they are choosing an internet they can control. And
you can imagine, for a repressive country, which choice they are
going to make. So it is a crossroads.
Alissa Starzak: We're going to take one more question.
Speaker: How would you rate the effectiveness of platforms like
Facebook in countering fake news? They will have a widget on
something controversial, but a lot of the disinformation is
spreading to secure messaging platforms, and that's a bigger
technical challenge. At that point you are passing notes.
John Scott-Railton: It is not going to change much; it is just
like having a cigarette ad with a black-and-white warning in
text. And there are a lot of websites that are not living on
platforms, and that supports speech beyond what the platforms
mold.
Alissa Starzak: Thank you for doing the work, and thank you all
for coming.
Coming up next: Cyber: The New Frontier in State Warfare, with
Lisa Monaco.
Speaker: Lisa Monaco was an assistant to President Obama for
counterterrorism, which means we are going out with a bang.
C'mon. We need something like that, we will have some good war
stories. So Lisa is here to talk to us about the cybersecurity
threat, how we evaluate it in the present day, and what the
state of it is. Lisa had a 10-year career at the DOJ and FBI
before she was nominated assistant attorney general for national
security in 2011, and in 2013 she became an assistant to
President Obama. Since the end of the Obama administration, she
has become a guide to all of the smartest people in all the best
places thinking about these issues. She's a distinguished fellow
at the NYU law school and at Harvard, chairs the Aspen
Institute's cybersecurity group, and is a principal at WestExec
Advisors.
Let's welcome her. So I want to do a deeper dive into her role
as homeland security advisor to the president. She was the one
to deliver the hard news, to be the point of the spear to the
president when something went wrong, and to lead the White
House's effort to coordinate a response to it: things like the
Ebola crisis, or the Boston Marathon bombing, which was your
first week on the job.
Lisa Monaco: The third week.
Doug Kramer: And she would look at the best and most current
information on risks, foreign and domestic, in cyber and
everything else, and distill it into how the president should
think about these things. We will talk about using those skills,
and the access you had to all of that classified information.
What is your take on the threat here? What is the state of that,
and are we perceiving it the right way?
Lisa Monaco: Thank you for having me. I'm aware that I'm
standing between you and cocktail hour, so thanks for that. I
want to thank Matthew, Michelle, Doug, and others for having me.
It is great. You described my role; it was my role and a job
that, in addition to giving me the longest title in the history
of the world, earned me the nickname Dr. Doom.
Doug Kramer: Yeah, the call was never to celebrate someone's
birthday.
Lisa Monaco: With respect to the cyber threat, it is more
diffuse than it has ever been, with a range of actors, including
nation states, non-state actors, basic garden-variety criminal
actors, and politically-motivated hacktivists, displaying more
dangerous and destructive tools and techniques, and having a
more destructive impact, than ever before. And I see the nation
states today emerging front and center as the most concerning
element of that diffuse threat. In that category: Russia, China,
Iran, and North Korea.
Doug Kramer: We have eyes all over here. A couple of years ago,
we were focused on terrorist organizations and what they do in
cyberspace. You don't see that discussion anymore; it is much
more focused on nation-state actors and all of that. Do you
think that's a right-sizing, a correct focus we should have had
all along, or did we lose focus on state actors? Where does that
balance lie?
Lisa Monaco: The person who got nicknamed Dr. Doom is not going
to tell you that I'm not worried about terrorists having the
means and ability to do us harm. I got into a debate with David
Petraeus on this; his concern is the cyber weapon of mass
destruction, by which he means a non-state or terrorist actor
using cyber as a means of destruction. He acknowledged that the
likelihood of that is remote; that is his characterization. I am
worried about that, but I'm more worried, on a day-to-day basis,
about the more silent impact. So what do I mean by that?
I mean the cyber attack without the visible impact, or the
kinetic effect. I am worried about the attack that we don't
see, but then ultimately shakes our confidence in the
information that we need to structure and go about our daily
lives. There was a lot of talk in the last half hour about
financial operations, the trades that whiz around the world
every day. The ability of nation-state cyber actors to
manipulate information, to shake the integrity of that
information and our confidence in it, is something that I'm
worried about.
Doug Kramer: On the different actors involved: definitely nation
states, and we will talk about that; the terrorism threat is out
there. What about rogue actors acting at the behest of a nation
state? Is there a significant threat there? Is the focus in the
right place, or are there rogue actors or other groups we are
leaving out here?
Lisa Monaco: A threat I would highlight in that category, one
that does not rise to the level of the nation-state actor at
this point but that we should be cognizant of and not lose sight
of, is the terrorist actor using cyber means in what many
experts have described as the blended threat. So ISIS, in a case
where this actually happened, got information about our service
members, put it out on the internet, and then exhorted their
followers to take violent action against those individuals. The
mixture of non-state actors doxing and basically using
mercenaries to do that is something that is real.
Doug Kramer: So if you had a platform to stand up and say, gosh,
everyone pay attention to this, we have to focus on this issue,
what is that one issue in the current threat?
Lisa Monaco: Beyond the point that I made about the integrity of
information: at the enterprise level, Internet of Things
security, or the lack thereof. Everyone has heard the statistic:
the conservative estimate is 20 billion Internet of Things
devices connected by 2020, and that's a low-end estimate. Half
of all businesses by 2020 will be running on Internet of Things
devices. So that is an expanding attack surface for the whole
range of malicious actors we talked about. And the problem is
getting bigger, because we are not building in security at the
front end. So the challenge that I think folks are not focusing
on, and it is a really hard challenge, is how to drive us to a
culture of building in security by design, and what incentives
we are going to have for doing that. I don't think we're looking
at a mandate any time soon. So what are the incentives and
standards we are going to agree on? That's a big, big challenge
that we are not tackling right now.
Doug Kramer: You are in sessions and conversations with people
who are thoughtful in this area. Is there something you are sick
of hearing about, where you wish we would move on? Yes, it is a
threat, but it has taken up outsized oxygen in this
conversation.
Lisa Monaco: It should not be a surprise to this audience, but
the outsized focus, thinking as a company or organization, is on
looking for a perfect defense. There are any number of vendors
out there trying to sell you a product, maybe to some of you in
the audience, a product that is going to give you the secret key
to a perfect defense. The reality is that we need to focus on
risk mitigation and on measuring the response to the incident:
how long have they been in the network, and how long does it
take to get them off? This is not a question of if, but when.
Doug Kramer: I want to shift the focus, and it revolves around
this. There are clearly established norms and laws on the use of
force by one nation. One example I have used before: there is a
law about the authorized use of military force. If you send any
kinetic element into a foreign country, within very short order,
36 hours or something, you need to make a report to Congress.
When those things happen, they are clear, they trigger a
response, and they are viewed as a very significant infringement
on a country's sovereignty or interests. You can have, however,
it seems, very large, coordinated, impactful cyber events that
are intentionally aimed at another country and have impacts
inside that country, and we don't seem to know what to think
about that. To some extent, that is treated as just
spy-versus-spy, or something that happens over there, and we
don't take the same sets of, umbrage is a woefully insufficient
word, but it doesn't carry the same attachment. Are we going to
get to a place where that starts to be thought about in the same
way as a more physical attack?
Lisa Monaco: You have examples where folks have tried to apply
that same language, and it is governed by international law. If
I sent my army into your country, that's a breach of
sovereignty, of the norms around the sovereignty of nation
states, right? And that isn't to say there's no disagreement,
but you have a big body of law and multilateral agreement on
that norm and on what constitutes a breach of it. We don't have
the same set of norms around cyber activity. So there is, I
think, an agreed level of cyber activity, including malicious
cyber activity, that operates below the line; that is espionage,
that is spying. And then you have, all the way up here, the
malicious use of cyber activity to have an actual kinetic
effect. Particularly if that is deployed in peacetime, most
nations would agree it constitutes an armed attack, equivalent
to dropping a bomb on that country. The problem in cyberspace is
that the distance between the two is a very, very large gap, and
we have not decided, as an international community or
domestically, how to treat that space. There's a robust dialogue
about it, but we don't have an agreement. We tried in the Obama
administration to establish a set of international norms around
it, not so you can have a nice thing on a piece of parchment,
but so you can isolate the malicious actors and nation states
that operate outside those norms, impose costs on them for
violating those norms, and drive a set of behaviors.
Doug Kramer: You are meeting in the Situation Room today,
chairing a principals meeting; we wave a magic wand and put you
back in. You see what happened with Russia, you hear the reports
out of China this morning, and then there is also North Korea,
and your agenda is now that. This is the news of the day; you
have the principals there, and you are saying, okay, how are we
as the United States government going to respond to this? What
are we going to recommend the president do in response to these
actions? We have three different nation states. Or you can shrug
your shoulders and say it is just another day's news.
Lisa Monaco: No shrugging your shoulders, and it doesn't have to
be hypothetical. In 2011, I was the assistant attorney general
for national security in the Justice Department, and I was sick
of intelligence reports showing that Chinese actors, members of
the People's Liberation Army deployed on behalf of the nation
state, were stealing intellectual property. I began an
investigation that ended up, in 2014 or 2015, in indictments. We
ended up identifying these guys, in their uniforms, at the
keyboard, stealing from a range of U.S. companies. We had
meetings and discussions about how to make it public and how to
make it a diplomatic message; it roiled folks in Beijing and
caused consternation in our government. And the companies were
thankful that our government was standing up for them, much like
any victim of a crime. They felt like, you know what, I'm glad
the prosecutor is bringing charges. And that was the first step
in a philosophy, an approach, a strategy that says we're going
to identify the actors, use all of our tools, the intelligence
community, etc., to understand who did this, make it public, and
then impose costs.
The same thing happened with the Sony attack. We identified that
it was North Korea; the FBI worked closely with Sony Pictures,
the executives there, and the corporation; and we ultimately
decided to impose sanctions against North Korea for doing that.
My point is that there's a whole range, and our approach, much
as with the terrorism threat, was to treat this as an
intelligence-led, threat-driven problem and put all the tools on
the table. Say all elements of national power are going to be at
your disposal to choose from, with a philosophy that we are
going to impose costs on the malicious actor. Sometimes it is
law enforcement, diplomacy, financial sanctions, military, or
intelligence action. In my view, all of the tools should be on
the table, and you ought to have a policy discussion about what
is in the best interest of the United States as to which tool
you choose.
Doug Kramer: I will reward those of you who have been here all
day: if you were here with Jeff Immelt this morning, he got a
question on the China news, the Bloomberg report that Chinese
spies implanted chips and microprocessors into the hardware of a
lot of significant American tech companies. And the question
really was: boy, can you ever have a global supply chain again?
Does this mean we just have to seal off the United States and
China from each other, at least for a couple of decades? What is
your prognosis, and how do you manage that situation? Do we send
each side to opposite corners, or is there a way to go forward?
Lisa Monaco: It is not feasible, and not advisable, to seal us
off; that is not in the interest of the United States. And there
are efforts by nation states and others to get into our supply
chain; the news today is sobering, and an example of the threat.
Nation states are using cyber as a tool of geopolitical
statecraft: the Chinese going into our supply chain, the
indictments against the Russian military intelligence unit, the
story about the North Koreans stealing billions from the
international trading system because the sanctions we have been
imposing on them have left their coffers destitute. So that is
my approach: come together as an international community to
isolate the bad actors for violating the norms, and try to drive
a change in behavior.
Doug Kramer: And is North Korea a special case there?
Lisa Monaco: We can do more in terms of sanctions, particularly
vis-a-vis North Korea: the banks in China that are providing
them a lifeline, and the way the Chinese are looking the other
way, which provides North Korea a lifeline for its finances.
Doug Kramer: We have time for a couple questions.
Speaker: (Speaker far from mic).
Lisa Monaco: Yeah.
Speaker: He is helping us end on a high note.
Lisa Monaco: We have been beating the drum for a long time for
the private sector and companies to treat cybersecurity as an
enterprise threat, not just an IT problem, and a number of
sectors have taken that on board. It is going to cascade down.
The short answer is that appreciation of the threat has grown;
there's a lot of work to do on what the standards and best
practices are.
Doug Kramer: Any other questions?
Speaker: When acid rain floats from the U.S. to Canada, Canada
complains to us. When (speaker far from mic) we have no one to
go to; we cannot say, China, your IP blocks are polluting the
network, fix it. And if it doesn't stop, we have a cleanliness
problem. When do we get to the point that we have international
treaties?
Lisa Monaco: There are people advocating a digital Geneva
convention, focused on international rules of the road. All of
this is in line with what I talked about earlier: coming up with
a set of norms to drive behavior.
Speaker: Cyber warfare is statecraft, and our institutions are
not up to the task; they do not have the right kinds of
capabilities in social cognition, or the right intergovernmental
institutions. Can we change our capabilities in line with the
asymmetric threats?
Lisa Monaco: It starts with recognizing that the use of cyber
tools is an asymmetric threat, and coming at it that way. In
your response as a nation state, you are going to be mindful of
the dangers of escalation. That's one reason the space I talked
about is so hard to operate in. The danger is miscalculation and
misinterpretation: when you are dealing with an asymmetric
threat and you make a cyber response, the target of that
response could miscalculate whether you are there to spy on the
battlefield or to execute on that battlefield, and that's the
quintessential challenge.
Doug Kramer: I always felt better knowing that people like you
are in these positions and continue to take on these challenges
so we can all sleep at night, even though you don't get a good
night's sleep. Thank you for that, and for being here today.
[ Applause ]
Matthew: Thank you. First of all, I want to get a round of
applause for everyone: it takes a ton of people to pull an event
like this together. The events team, the communications team,
the marketing team, Vanessa who put together the lineup, the
graphic designer, and those running the live stream and Twitter
feed. If you wouldn't mind, a round of applause for everyone.
[ Applause ]
The other thing: I said last year that we were proud of the
number of women who participated on panels. This year, again, 9
of our 21 speakers were women. What I would ask is that, as you
go to other tech events and they tell you that you cannot do a
cybersecurity panel with women on stage: both of the speakers on
our cybersecurity panel were women. Demand from the tech events
you attend that they have representation from people who are not
traditionally represented. The smartest people in the industry
are often the ones you do not typically hear from on stage. So
we encourage you to demand that, here and at the events you go
to.
That's a wrap; we have drinks on the back patio and on the roof.
If we are lucky, the Blue Angels might fly over. Thank you for
being here.
[ Applause ]
