
Microsoft President Brad Smith on Kara Swisher podcast Recode Decode, Recode


Microsoft President Brad Smith, who joined the company just as it was becoming an antitrust target in the 1990s, has some advice for Google and others in the crosshairs: Step up and take responsibility now.

“We had to look at ourselves in the mirror and see not what we wanted to see but what other people saw in us,” Smith said on the latest episode of Recode Decode with Kara Swisher. “When you create technology that changes the world, you have to accept some responsibility to address the world you have created.”

In his new book Tools and Weapons, Smith and his co-author Carol Ann Browne discuss “the promise and the peril of the digital age,” examining issues such as social media, facial recognition and cyberwar, and critiquing the “move fast and break things” mentality that has hurt so many people in recent years.

“It’s been important to move fast, and we should celebrate the spirit of innovation that enabled the industry to move fast,” Smith said. “But there comes a time when you realize, you know what? We shouldn’t go faster than the speed of thought. We need to think about what is happening, and that’s part of the message in this book. Let’s look around and see what’s happening.”

You can listen to Recode Decode wherever you get your podcasts, including Apple Podcasts, Spotify, Google Podcasts, and TuneIn.

He also stressed that fixing problems is not exclusively the responsibility of scrutinized companies such as Google and Facebook. Instead, Smith criticized the wait-until-we-get-regulated attitude of leaders like Amazon’s Andy Jassy and said the whole tech community needs to proactively step up – even when they are unequally responsible for problems.

“I like to remind people, when the United States Congress passed banking laws in the 1930s to regulate the nation’s banks, they did not create an exception for the banks they liked,” Smith said. “They applied to all the banks, and I think that we across the tech sector actually would serve ourselves well to focus on how we can solve problems even if we don’t feel we helped cause them. I worry that across the industry there are too many days when people say, ‘This is somebody else’s problem. I didn’t create it. I don’t have any responsibility to help solve it.’”

Below, we’ve shared a lightly edited full transcript of Kara’s conversation with Brad. Listen to the full interview by subscribing to Recode Decode with Kara Swisher wherever you get your podcasts, including Apple Podcasts, Spotify, Google Podcasts, and TuneIn.


Kara Swisher: Hi. I’m Kara Swisher, editor-at-large of Recode. You may know me as someone who is bracing for Russia to weaponize Snapchat filters in 2020, but in my spare time I talk tech, and you’re listening to Recode Decode from the Vox Media Podcast Network.

Today in the red chair is someone I’ve known a very long time, Brad Smith, the president of Microsoft. He’s been at the company for almost 26 years and has been president for the past four. Brad has also co-authored a new book with Carol Ann Browne called Tools and Weapons: The Promise and the Peril of the Digital Age. Brad, welcome to Recode Decode.

Brad Smith: Thank you, Kara. Great to be here.

So, I’m excited about this book, because it’s a lot of stuff you’ve been talking about for a long time, which is something I’ve obviously been interested in. So, I want to talk about the book and what you’ve been doing at Microsoft and sort of the state of play between tech companies and government. Steve Case many years ago wrote The Third Wave, where he was talking about government regulation coming to tech, the new tech. This kind of is what has happened since, I think, as sort of the result of it. This is my take, so I wanted to sort of … Let’s get people to know who you are, your background. You’ve been at Microsoft for, what, since you were born, or …

Just even earlier than that. No.

Right. Okay. Right.

No, but I’ve been at Microsoft since 1993, as you said, 26 years. I was a lawyer with a law firm. I worked in London and DC before that, really started my career representing the PC software industry as it was expanding into Europe in the late ’80s and early ’90s, and then made the jump across the English Channel.

So, why did you want to work for Microsoft? This was the time, I guess it was still small. Smallish.

It was definitely …

The most dominant software player.

To put it in perspective, when I joined Microsoft, the company had about 4,000 employees. Today it has 140,000 employees. I wanted to work for Microsoft in part because I was an avid user and fan of the company’s products. I bought Microsoft Word version 1.0 the day it came out, and I’ve been using versions of Word ever since. I could see what the operating system business was doing, in terms of how it was changing the face of computing. I was impressed at the time because I felt Microsoft was a company with a long-term vision, and indeed, if you compare the Microsoft at the time to the other companies in the industry – Lotus, WordPerfect, Novell – there’s a common theme. Microsoft is still here. Those were great companies as well, but it turned out Microsoft did have the right long-term plan.

But you got there right during its sort of start-to-be-in-trouble days, the era of the monopoly lawsuits and the struggles that it had.

There is some truth to that, and I had done work as a lawyer, including in the antitrust field, and the company’s first antitrust investigation actually started in 1990, and the first really big negotiation with governments was in 1994. So I’d only been at the company several months, and it was a joint negotiation between Microsoft and the Department of Justice and the European Commission. I was part of that process.

The reason I want to stress it is because of what’s happening today: Microsoft was the first to deal with government interest in technology, because there hadn’t per se been much regulation of technology in any way whatsoever.

I think that actually is a fair characterization. When I look back, it really was almost what I would describe as the first collision between information technology and the world. It was a big international set of issues. Of course, it ended up exploding. It ended up with the Department of Justice and 20 states seeking and initially obtaining an order to break up the company from a district court. It led to antitrust cases in 26 other countries. We had to adjust. We had to learn. We had to go through a lot of what I think the tech sector as a whole is increasingly confronting today.

So, let people know what happened in the end of that, because a lot of people don’t. Microsoft did not get broken up. It didn’t turn out quite the way as the headlines were at the time, but explain what happened.

I think that there was an evolution. A court of appeals decided that the company wouldn’t necessarily be broken up, but the company was found to be liable. There was a consent decree that was issued. There were further cases in Europe. There were a number of regulatory rules, in effect, imposed on Microsoft, and the whole decade of the 2000s was really, in many ways, spent adjusting to all of that and then seeking to emerge with what I’ll say is a sense of responsibility, perhaps more maturity, but also the kind of innovative spirit that I think we really feel you can see at Microsoft today.

Absolutely. It has changed, but what I want to get to is, what did you learn from that period that … See, most of it was Microsoft versus all of tech. That was sort of the mentality. Microsoft, this big, powerful company attacking Netscape, attacking various companies, and what happened in the wake of that was the creation of all these companies, the Googles, the Ubers. Everybody came … It wasn’t necessarily a direct line. You couldn’t draw a direct line, but tech underwent another period of renovation, essentially, during that time, and Microsoft was seen as a very powerful player, people were very frightened of it, and then it wasn’t. Tell me, what did you all learn there? How to behave? Because a lot of people are worried about this current period of regulation, which we’ll get into in a second.

We learned a lot of things, but there’s one in particular that always stands out to me. We had to look at ourselves in the mirror and see not what we wanted to see but what other people saw in us. We had to accept a higher level of responsibility. We had to appreciate the concerns that other people had. I think we had to instill in ourselves a commitment. As we say in this book, look, when you create technology that changes the world, you have to accept some responsibility to address the world you have created. That is what we learned in part.

How did the culture change then? Because you were then becoming part of the leadership. How do you shift a culture like that? Because it was still run by Steve Ballmer, who was not, I would say, a shrinking violet of nonaggression.

I think every company goes through a series of cultural stages. You don’t have one cultural evolution and then stop. There was first a definite cultural evolution in the 2000s, when Steve was the CEO, Bill Gates was the chairman, and it was called, “We have to learn to get along.” We have to build some bridges. We have to make peace with governments. We’re going to have to agree to some restraints, and that’s going to require processes and controls. Then, in some ways, that set the foundation for what I think of as the cultural evolution of this decade, led by Satya.

This is Satya Nadella, just for people who don’t know.

Yeah, yeah.

He’s the new CEO. He’s been there forever also.

Yeah. Yeah, and Satya became CEO in 2014, and in a very interesting way, I think Satya took the cultural evolution of the 2000s and said, “Let’s sustain this level of responsibility. Let’s be committed to trust with customers as a core principle for the company, but let’s add to this,” what he describes as a growth mindset, a real focus on a learn-it-all rather than know-it-all culture at Microsoft, and let’s use that to unleash innovation, especially with more employees, and perhaps most importantly, younger employees.

Most people can agree, you kind of became what you were. He kind of defined what the company was rather than everything, because you all were in everything. You were in MSN. You were in cable at one point. It was just sort of aggression everywhere rather than not sticking to your knitting particularly, but doing what you guys do best, what Microsoft does best.

I think that’s a really interesting point. There absolutely was a time in the late ’90s when, as you pointed out, people looked at Microsoft, is Microsoft going to be a bank? Is Microsoft going to be a cable company?

Yeah, cable company. Yeah.

You could just go on and on and on. We did eventually learn that it’s really hard to do the big things well if you’re trying to do everything at the same time. We’re still a diversified tech company, I would hasten to point out, but I think people rightly think of us as being more enterprise-focused, as well as consumer, but more enterprise.

Well, everything is adjacent to everything else. It feels more adjacent.

A fair point. But I do think we do focus more on trying to do well a certain number of things.

Absolutely. The reason I want to go into this is because I want you to give sort of … As you look at the landscape, you’re one of the leaders in policy and regulation, and certainly Microsoft’s been the company subject to the most regulations so far. Except for fines, which we’ll get into in a second, most tech companies have had almost no regulation to speak of at all, except in Europe and other places.

I was going to say, Europe is the place where we’ve all been regulated, and the United States is, in some ways, the country where we’ve been least regulated.

Or at all. I can’t even think of a regulation at this point, just regular laws, just …

Regular laws. I mean, you certainly have issues of spectrum and the like that impact our industry, but one of the interesting things about digital technology is it is arguably the technology that has gone the longest period of time without regulation in the history of almost any technology that has truly changed the world.

For being powerful and being a powerful industry, because if you think about it, Wall Street, everything has a regulation, almost every manufacturing … They all have some level of regulation.

So, let’s set the table right now for where we are, from your perspective. What prompted you to write the book itself? It’s called Tools and Weapons: The Promise and the Peril of the Digital Age. Explain that headline, for people who have not read it yet.

Any tool can be a weapon, as we point out in the book. A broom can be used to sweep a floor or hit somebody over the head.

Do you often hit people over the head with a broom?

I don’t, but hopefully most people don’t.

Who does? Fire. Let’s go for fire.

But it can be.

Fire, it’ll warm the house or burn it down.

Yeah. Basically, you name it, tools can be used as weapons.

Knives.

Think about digital technology. It’s an incredible tool. I mean, I like to say it will help the world cure cancer. We need it to help address our environmental challenges. But it has been weaponized. It has been weaponized by other governments. We see cyber attacks. We see disinformation campaigns. We see issues around privacy. We see the potentially unintended consequences as we move quickly to artificial intelligence.

Think about technologies like facial recognition. Think about just the gaps that are created when some people have access to broadband and skills and other people do not. So, in short, we’re at a critical inflection point, as we argue in the book, where technology is creating both opportunity and huge challenges. As an industry, we need to acknowledge these challenges. We need to help address them. Just as Microsoft, when it had to go through a cultural change as a company 20 years ago, we believe the tech sector needs to go through a cultural change over the next decade to find a way to keep moving fast, to innovate, but to do it with guardrails that will protect the public interest.

Right. So, let’s talk about these tools and weapons. When it was created, most people had this hopeful idea around technology, that it would be the panacea for all our problems, would bring us together, and things like that. Do a landscape of where the years have gone, from your perspective.

Well, I think of the 2000s as sort of this transition to the internet era. That’s when we saw search explode. It’s when we saw the advent of touch-based computing and mobile explode. That is what further enabled social media to explode. I think especially mobile, basically, was a huge step in making computing ubiquitous in all of our lives.

One hundred percent, yeah.

Now let’s look at this decade. I think the first transition point this decade was 2013, because that’s the year when Edward Snowden shared a number of secrets with the world.

Yes, critical point.

Yeah, and it did a number of things. I mean, first it opened people’s eyes to how much data there was about themselves and how that could be accessed and used by governments. It created a bit of a split between the tech sector and the government, including the Obama White House, with whom the tech sector …

Distrust.

… had a very friendly relationship. But interestingly enough, what we also saw was this continuing collection of more and more data, and then I think we hit the next real inflection point in 2018, and it was Cambridge Analytica. In effect, the questions that had been asked about the government five years earlier were now being put forward against tech companies themselves. We’re now grappling with this wide array of issues, and the halo is gone.

Right. So, I’m glad you mentioned Snowden, because a lot of people forget that, because I think that’s where the relationship between tech and government, which had been cooperative, you all had a certain level of cooperation, really fell apart, because of revelations, the amount of spying, the back doors and everything else. What should have happened then? Because it seems as if it was handled badly in terms of … or was there going to be irreparable damage to the relationship once tech understood how much the government was spying on tech?

Well, the first thing I would say is it is actually unnatural to have an extraordinarily close relationship that is entirely friendly between any …

Right, unless you live in China.

Yeah, yeah. But between any major industry and a government in a democratic country, there is usually a healthy tension that involves the government interest in wanting the industry to succeed, but also wanting the industry to succeed in a way that ensures that, in effect, no industry or technology is above the law, just as no person is supposed to be above the law.

So, I don’t necessarily look at what happened in the years that followed 2013 as necessarily the wrong path. I actually think a different question is more interesting. Why did it take so long for regulators, leaders, the public to start asking questions about things like privacy? Because I think those were predictable.

Well, Scott McNealy said it. I just interviewed him about that because it was the anniversary of that, saying, “You have no privacy. Get used to it.”

Yeah, and that was a prevailing view in the industry, that privacy was dead, get over it. But to me, one of the interesting stories we share in the book is the White House meeting in December of 2013, when a number of tech leaders were there. We were pushing the Obama White House. We were pushing President Obama to do more, to put checks and balances on the NSA.

There was a point in this meeting when he looked at us and said, “I have a suspicion the guns will turn. Your companies, collectively, have far more information about people than the government itself.” I always thought that privacy would be quiet until the day it became loud, that there would come a day when, in effect, we might face what I’ve referred to, what we talk about in the book as the equivalent of Three Mile Island. Three Mile Island changed the face of the nuclear power sector …

Absolutely.

… in a day in 1979, and I think Cambridge Analytica did the same thing. It took until 2018, but I frankly always thought that day was going to come.

There had been a number of incursions. I would think the North Korea hacking into Sony … There were all kinds of different things that brought to mind hacking versus privacy, but the fact that so much data was available, people’s emails, things like that, and then the same was the case with the hacking of Hillary Clinton’s emails.

Yeah. Interestingly, you have this confluence of events between 2015 and 2018. In 2015, the Sony hack showed that a foreign government could bring …

Or was doing it.

Yeah, it could bring a company to its knees through a cyber attack. 2017 showed, with WannaCry and NotPetya, that foreign governments could attack the world, could attack a country like Ukraine. We had these mounting concerns around issues like privacy. We saw it flaring up between the European Union and the United States, and ultimately, I think, the questions people were asking reached a tipping point where somebody gave it a name. The Economist called it “techlash.”

Techlash, right.

That’s what we’ve been talking about ever since.

We’ll get to that. So, why do you imagine it took so long? Why do you think these incidents … Because these are all major incidents. I mean, I wrote about all of them, and people seemed concerned, but also, that’s the price of technology, and the tech industry seemed relatively unconcerned, I would say.

I think one of the lessons that one learns, we learned it at Microsoft, I think perhaps we’re learning it as an industry, whenever things are going well, it’s pretty easy to think that they will always keep going well. And let’s face it, our industry prospered and did great things for the world, often with the explicit slogan that the best way to develop technology was to move fast and break things.

Break things. That’s Facebook. That’s just Facebook. That’s how they think.

I think that as an entire industry, we all …

Well, they made posters.

Yeah. Oh, yeah. Well, some people are better at posters than others!

You don’t have posters at Microsoft, do you?

They’re not as good. But the reality is it’s been important to move fast, and we should celebrate the spirit of innovation that enabled the industry to move fast. But there comes a time when you realize, you know what? We shouldn’t go faster than the speed of thought. We need to think about what is happening, and that’s part of the message in this book. Let’s look around and see what’s happening.

Right. I think it’s an important one, and you’re one of the few people saying that, which is kind of interesting. We’ll get into that in the next section. But one of the things, to finish up this one, is that idea of move fast and break things … when I saw it for the first time, I’m like, “Break? Is that the word you …” I said it, too. I was at Facebook and I said, “Is that the word you … I got the move fast part, but is break what you actually want to say, not disrupt or change or whatever?”

Well, I think if you want to broaden it a little bit, I do think it’s fair to say that there have been many times in many parts of our industry where disruption has been considered an end in and of itself, and I personally think that one should step back and even think about that a little more broadly. Certainly, if there is a single vision or principle that Satya has articulated at Microsoft since his first year as CEO, it’s that there are certain values that are timeless. There are actually certain values that are more important than the development of technology and we need technology to serve these values. We shouldn’t think that disrupting everything is actually a good goal in and of itself.

We’re here with Brad Smith. He’s the president of Microsoft, and he’s the co-author of a book called Tools and Weapons: The Promise and the Peril of the Digital Age. This is something Brad has been talking about a lot. This has been a topic which not many people were listening to early on, for sure. I have been banging the drum, you have, some others. So, why do you think Cambridge Analytica did that? Why do you think that was the moment? Because people at Facebook still to this day … I was there the other day, and they’re like, “It wasn’t that big a deal,” and I’m like, “Oh, God. You have to stop.”

I think it’s a great question, Kara. When I try to think about it, everything in life is a big deal and not a big deal at the same time, and certain things take off. When they do, you look back and you ask, “Why is that the thing that took off?” I think the reason that Cambridge Analytica took off is because fundamentally, it was about the use of people’s data in a political campaign to support a candidate for president, Donald Trump, who many people were not prepared to support.

So, when they learned that information that they were sharing, including about their friends and the like, was being used for a political campaign in this way, they got upset, and because it was about politics, Congress got upset. So, you had this making for a bit of a tinderbox on a particular privacy issue that was different from what we’d seen in the past.

So, it put privacy with politics, with the Russians, with all kinds of things, a plot, a scandal.

Yeah. No. It’s a pretty powerful stew, if you think about it that way. So, at one level, one can debate forever what impact did it actually have on anything? The reality is it doesn’t necessarily matter because what does matter is how people thought about it. It was as if a light bulb had been turned on, and people looked around and they saw the room, and they said, “Wow. Look at all this data. Look at how it can be used. We should be paying more attention to what this means for our privacy.”

So, one of the things that I think a lot of people have been talking about is the focus had been on Facebook and Google, essentially, which are two companies that have most of the data. Microsoft doesn’t traffic in as much data, for sure, and neither does Apple, and neither do lots of companies. I mean, they use data, and it’s certainly important, but it was focused on just two companies who did this.

When you’re thinking about these things, what do you think … Because you’re all up in Seattle, Amazon’s in … Amazon has a lot of information on people and has yet to have a very big privacy scandal, but pending. As with all of them. People think, when you think of the techlash, that it’s one tech industry, but it’s just not. And there’s different levels of tech and who’s involved and who’s responsible, but you all get glommed together.

So how do you organize an industry where certain players are causing most of the mess and the others have to live with it?

I really think about two things. The first is there’s definitely differences between companies. Some have more data and some have more consumer data than others. Some have business models that really encourage more use of that data for monetization and advertising, and others less so. But I like to remind people, when the United States Congress passed banking laws in the 1930s to regulate the nation’s banks, they did not create an exception for the banks they liked. They applied to all the banks and I think that we across the tech sector actually would serve ourselves well to focus on how we can solve problems even if we don’t feel we helped cause them.

I worry that across the industry there are too many days when people say, “This is somebody else’s problem. I didn’t create it. I don’t have any responsibility to help solve it.”

That said, I mean, if a newspaper was doing a crappy job, I wouldn’t be like, those people are ruining it for the rest of us kind of thing. Is that an attitude that you think is pervasive around tech?

I think that there is an attitude in tech sometimes that people say, “You know what? I don’t want to go work on that because I don’t want to stand next to that other company. They’re in the firing line.” They’ve got a virus, so to speak, politically. “If I stand next to them, I’m going to do something worse than catch a cold.”

Right.

I think that’s a mistake. I think one of the most interesting examples of, in my view, a company doing the right thing as we describe in this book, was in the wake of the Christchurch terrorist attack. Jacinda Ardern, the prime minister of New Zealand, got on the phone, made calls.

I had one of the first, actually the first meeting with her because I coincidentally was in New Zealand’s capital shortly after the attack. But she asked people to help. Google, Facebook, Twitter, all said yes. I think they felt understandably, logically, that they needed to be involved, given that their platforms were used by the Christchurch terrorist.

This is, to explain for people: they broadcast a lot of the attack, and these platforms were very slow to take it down. They say fast, but most people think it was slow.

Yeah. The company we should applaud is Amazon because Jeff Bezos took the call, Amazon decided to get involved. They could’ve said, “You know what, we didn’t contribute to this problem. Our services weren’t used. We’re not going to show up, we’re not going to help.” We at Microsoft, obviously we’re involved from the very first day. But we need a bit more of that kind of civic spirit. And I’d love to see more companies doing that.

So, why isn’t it there? And then I want to get into the individual problems – from inequality to social media to disinformation – in a second. But how do you get that to happen? How do you get … Because my experience with tech is they can’t decide on lunch, essentially. Like, there’s a lot of issues that they come together every now and again, but it’s rather rare and I know it’s onerous, from what I can hear from people dealing with it. Like, having agreement on anything.

I think that’s a fair comment. It’s an accurate observation and I think it is a reflection of our need to build a more civic spirit across the industry. It’s also a reflection of the need for companies, perhaps especially large companies, to develop the capacity to make decisions. Because whenever you get into these questions of should we or should we not do this, you really set the stage for what can be endless debates. And unless you have the capacity to be decisive, you just have endless debates, which results in inaction, and in effect, the answer is “no,” whether people intended it to be or not. And I think there’s a muscle that the industry needs to develop as well as a culture.

Mm-hmm. And is it because it’s young or because it’s …

I think it is a reflection of the industry’s youth, which is sort of an interesting thing to say in and of itself because …

You’re not so young anymore though.

Yeah. I mean, Microsoft’s looking to our 50th anniversary in just a few years, but you have other companies obviously that are both extremely successful and much younger. I think the real message for all of us is, you know what? You have to grow up faster than maybe you used to.

All right. So let’s talk about it, because I think the main problem coming is regulation and how it’s going to be written and how the industry is going to … Is that wrong?

I think one should hesitate to use the word “problem.” I would wholeheartedly agree that it is going to be a principal phenomenon and I think this is a challenge for the industry. It is an opportunity. And the real question we should ask is, what are the problems that laws and regulations need to help us solve? It won’t be, in my view, one regulation that is so sweeping that it covers every issue.

Of course not, yeah.

So what we really strive to do in the book, you see it in the table of contents, is let’s look at the different problems and let’s talk about some potential solutions.

So let’s go through some. One that I obviously spent a lot of time on recently is surveillance, facial … I’ll put facial recognition, sensors, cameras, everything else. Where are we on that right now from your perspective?

Yeah, having been in this industry for a quarter of a century, I think the facial recognition issue is one of the most unusual issues we have ever seen. We at Microsoft, literally just 16 months ago, put out a blog that said, this is technology that is subject to abuse. It’s going to need to be regulated.

And there were people in Silicon Valley who looked at us and said, what are you talking about? Why are you saying this? And here we are and it is exploding around the world. We’re sitting here today in a city, San Francisco, that has banned its use for the public. And the issue is changing on almost a monthly basis.

What do you imagine the regulation needs to be? Because I’m interviewing the people who do police cameras, they don’t think it’s ready for primetime. The only people who think it’s ready for primetime are some of the creators of it. I had Andy Jassy from Microsoft, which makes Rekognition.

From Amazon. Yes.

Amazon. Yes. Amazon. Right. And he was talking about like, “We don’t have responsibility for people misusing it.” And I was like, “Well, you do. But okay, sure.” That’s an interesting attitude, but it’s not going to hold for very long. Where do you imagine the regulations going here in the United States and globally?

I think we are going to need and we are going to see new laws that will address the risk of bias and discrimination and I think there’s particular steps that a good regulation can take to do that. I think we are going to see laws that address commercial privacy.

In other words, if we go into a shopping mall and our face is being identified and our images are being captured, I think we’re going to see laws, too, that give us as consumers some control over that kind of data. And I think perhaps most importantly, we need and we’ll see laws that address the use of facial recognition by public authorities, especially law enforcement.

Like police?

Yes. So as to ensure that we don’t have people arrested or taken downtown in the back of a police car for an incorrect identification – or worse, ongoing or even mass surveillance.

So I think we’re going to see all of these things addressed and I think we need governments to step forward, but I actually wholeheartedly agree with what you also said. This is not an issue where a tech company should be permitted to say, “We may create this technology, but we only have one responsibility. And that’s to follow some law when some government passes it somewhere.”

If we want to be an industry that is respected by the public for doing more than selling whatever we can create to anybody who is prepared to buy it, then we darn well better say we have some principles we’re going to apply to ourselves, even when the law is not yet in place.

Which doesn’t seem to be the case. Is there a country that’s more stringent on these laws than others? Or is it state by state here in the United States? It’s city by city.

Well, interestingly, we’re seeing two phenomena. One is we’re seeing at the national level, some governments start to move faster, and I don’t think people will be surprised to hear that that is likely to be in Europe, where already, they have more privacy protections because of the nature of European law.

But the other phenomenon that is interesting, that should actually speak to any company that wants to be in the facial recognition business, is we’re starting to see court decisions here in the United States. Facebook lost at least a preliminary decision based on Illinois law that could go quite far in actually making it difficult to develop and use facial recognition, even for very beneficial purposes. I think it’s the classic case of, if you want good things to prosper, you better be thoughtful and responsible, because if you’re not, you’re going to find that people are going to end up throwing out the baby and the bathwater.

The whole thing. What are the beneficial uses of facial recognition?

Well, let me share with you one thing that I was very …

Getting on a plane faster?

I was fascinated … I was in Brazil three months ago and there is a nonprofit organization that works to find missing children and missing adults who are family members, and in some cases they are missing adults that have mental health issues. They might be separated from their families, they’re homeless.

The family, in a lawful way, can provide photographs of their loved one. And then if somebody ends up in a hospital emergency room or arrested by the police, they can take a photo in real time and run it against the database and a probability appears of whether this is the person.

Okay. There’s one.

Yeah, there is one.

One.

And I’ll say it is one that has reunited missing people. Now I’ll give you another one. There is an effort in Washington, DC, under the auspices of the National Institutes of Health that has realized that there is a certain genetic syndrome that tends to manifest itself more among people from Asia, Latin America, and Africa where there are certain facial characteristics …

Tics or whatever.

… that then are related to more serious health hazards, including kidney damage and the like. And by using this kind of technology, physicians are able to augment their natural abilities with this kind of technology tool.

For that limited use?

Yes. For that limited use.

Yeah. I think the problem is it won’t be for that limited use. It’ll be for … You’ll do it for getting … I mean, years ago I did Clear and I regret it, but it’s too late. Right? What can I do? It’s out of the bag, essentially. I liked Clear too, by the way. This is a way to get on planes faster.

But if we step back and just think about the evolution of technology, I think facial recognition is an incredible illustration of the degree to which we should want people to have the opportunity to be creative, to be innovative.

If you asked about any technology two years into its development, is that the only thing it will ever do? You would have killed the smartphone five years before the iPhone was invented. So we don’t want to stifle technology, at least in my view, but we’ve got to have strong guard rails. Especially for this technology.

And we should be prepared to say that there are some uses that we should never permit and there are some uses that we should not allow until the technology is more mature than it is today.

All right. Related to surveillance is privacy. The idea of privacy, I mean, because that’s part of privacy. It seems as if it’s all over the map across the world. But Europe is obviously more stringent. The United States has privacy bills all over the place. There’s one coming online in California. Where do you imagine that ending up? Whoever is the strongest global one will be the winner here?

Well, I think there’s an interesting aspect of the story around privacy, which we really sought to bring to life in our book. And in a sense the chapter on privacy is the story of, as we describe it, the two unlikely individuals who have most influenced privacy protection in the United States.

One, an Austrian student named Max Schrems, who persuaded the European Court of Justice to strike down a safe harbor and pushed the US government to strengthen privacy protection. And the other, a San Francisco-based real estate developer, Alastair Mactaggart. And there’s an interesting lesson in both of their stories. One is the opportunity for the European Union to use its influence, as it did with the United States, to strengthen privacy around the world. And then the other, and I think it’s the really interesting lesson from Alastair Mactaggart’s experience, is to use …

Explain what …

Basically what Alastair did, and I just … You won’t find perhaps everybody in the tech sector applauding what he did, but I think it is absolutely worthy of applause. He said, we need some laws in this country that better protect privacy. We have gridlock at the national level. He looked at the situation in Sacramento. He said, we’re going to have gridlock there. But California, as do some other western states, has this thing called a ballot initiative.

So he used his own money to collect more than twice the number of signatures needed to put a strong privacy initiative on the ballot. It was going to go before the public. Once it was on the ballot, it brought people to the negotiating table and we saw California adopt a privacy law.

That goes into force in 2020.

Exactly.

Will that be the de facto national privacy standard?

Because one in every eight Americans lives in California. If California is the only state that enacts a privacy law, I do believe it will be the …

There’s about 12 others, right? In the states?

There are some others. I think ultimately, we are a country that needs a national privacy law. I gave a speech in 2005 in Washington, DC, calling for a national privacy law and everybody …

They’re moving fast, Brad.

Yeah. I was clearly …

There’s nothing.

I was very influential, Kara, can’t you tell?

They have some congressperson who doesn’t know anything about it. It’s like, “She’s just getting to it.”

It will come.

Really? When?

I think in the 2020s.

Okay.

It’s not going to come this year.

(No.)

But I think your point is nonetheless on the mark. California’s large enough to create a national standard in the United States.

Well, it seems that’s happening with a lot of stuff. It seems like California could do that on a lot of things, like emissions and everything else.

All right. Let’s get to one more thing in this section. So privacy and consumer protection of their data. That will be in the same privacy bill, correct?

(Absolutely.)

Like hacking of their data and things like that?

The interesting thing about privacy is it’s actually a two-sided issue, so to speak. One side of it is the protection of consumer data. That’s what people like Alastair Mactaggart have focused on. The other is the protection of what you would call citizen data. That’s the government surveillance side of the coin and that is this issue that keeps evolving.

We share some of the big decisions that have been made, not just by governments, but in the tech sector over the last five years, the debates between government and tech. But these two things are going to continue to evolve, I think in a very robust, and at times, even dramatic way over the next five years.

So do you predict a national privacy bill in the next five years? I’ll hold you to it, Brad.

I will predict a national privacy bill in the next five years and I’ll be back, Kara, in four years and 364 days.

We’ll see, we’ll see. If Mitch McConnell is still there, I wouldn’t bank on it.

Let’s move on to social media and protections of democracy. They all sort of fall into the same thing. Obviously the big bit of information has been about what social media does. One, disinformation. Two, addiction. Three, fake news and things like that. They’re all sort of wrapped up into one unpleasant pile of crap. That’s how I put it, in my legal point of view. How do you look at what’s going to happen in this area? Because we just did a big long thing on Section 230 and protections, and that’s probably not going to … It gives immunity, broad immunity, to lots of publishers. Talk a little bit about where we are right now.

I think 2019 has been an inflection point year for this issue. As you point out, Section 230 was created in the 1990s. The idea was to give broad immunity to interactive computer services. The notion was the internet was young, it needed to grow up.

Couldn’t be sued out of existence.

Exactly. And what we’ve seen in 2019 is two factors or forces really come together. One is concern about nation-state disinformation campaigns, principally Russian campaigns. Now well-documented, starting in 2016 in the United States.

But then we saw a second development emerge as well. It was the Christchurch terrorist attack, where a terrorist used social media and the internet as a stage. And these are two very different things, but they’ve reinforced concerns among governments to impose more responsibilities on tech companies. And, I should hasten to add, it’s easy to think about YouTube and Facebook and Twitter, but a company like Microsoft has LinkedIn. We have services like Xbox Live. This is broader than those.

Yes, there are. More minor.

Yeah. Clearly.

I don’t think a terrorist is going to be doing a LinkedIn broadcast, for some reason.

We would clearly hope not.

No.

But what we saw, and this is the really interesting and important point, is all of a sudden things changed. The Christchurch attack was on March 15th. Jacinda Ardern, the prime minister of New Zealand, within 10 days said, “We need tech companies to take a different approach.” Within three weeks after that, the Australian government passed a new law that imposes criminal penalties – not just fines but prison sentences – on tech executives if they don’t expeditiously remove violent terrorist or extremist content.

Two weeks after that, we saw the British government introduce a new proposal. We’ve seen the French and German governments use this as an opportunity to advance their proposals as well. If you almost envision a map of the world, it’s easy in the United States to think that the United States itself is the center of the world, we have this statute. It hasn’t changed yet.

These are US companies.

Yeah. It hasn’t changed yet, but if you envision the world that starts with New Zealand to Australia, up to the United Kingdom, France, and Germany. You fill in Canada where Prime Minister Trudeau has raised more issues. I would say the law is changing very quickly, and even if Section 230 itself does not change, all of us in the tech sector are going to need to because of these other countries.

I actually believe that we are seeing a wave and the wave is going to come back to the shores of the United States. We actually probablywillsee some change in this country as well, but the US is …

Well, it’s been chipped away at, around sex trafficking, but it still hasn’t had a full frontal attack. I mean, Josh Hawley is the first one that’s … But that’s over conservative speech too.

The US will be a late mover, not a first mover, which is actually, I think, unfortunate because it means that people in the US government are less influential in the global conversation.

Will be acted upon.

And the real question that we should be asking ourselves in my view in the tech sector, the question that we address in this book, is not whether we want to continue to live in the past. Because I think the past is now over. The real question we should be asking ourselves is, what is the right future? And I would argue it needs to be a future that preserves fundamental benefits in Section 230. It’s what makes the social media model possible, but there are some new responsibilities and we need to take them on, and we would be better served to develop and advance our own ideas.

So how do we do that? Because I think whenever I mention it to people, YouTube, they just turn white, because they feel like their businesses would be sued out of existence, and they’re probably right in many ways. It can’t be touched, is their feeling. I’m like, “It’s going to be touched. There’s a lot of touching about to happen.”

Yeah. Looking at the world globally, it’s being touched all over the place. I think what one has to move towards is a conversation that says, “Look, no one should want this abolished.” If it were abolished tomorrow, then there are just fundamental technological services that would be put at great risk.

Right.

If you want to avoid that result, put a little bit of thought into what’s the right way to touch it in a balanced and thoughtful way. What is a good model for the next 20 years?

From your perspective, what is that?

I think it is to identify certain areas where certain responsibilities can be assumed, and indeed because of the events of Christchurch and what’s called the Christchurch Call, there are certain responsibilities that tech companies are stepping up to. I do think as we think about all these issues around fake news and the like, we should all …

Which is related.

Yeah, they are. We should recognize you can’t impose on a tech company, because you can’t impose on anybody, the responsibility to be the ultimate arbiter of what news is true and what news is false. But you can start to impose some responsibilities to identify who is speaking.

That’s a really good point.

If content is being posted, what country is it coming from? If content is spreading, is it a human who is doing the speaking or is it a bot? If you think about the nature of political advertising in the United States, there’s a really interesting analogy in my view, we point to it in the book. We don’t try to say, “Oh, this politician can say this in an ad or not,” but you always do know at the end of the day …

Where it’s coming from.

Yeah. Who is speaking, let the public know who is speaking.

So you don’t feel like tech companies should have any role in monitoring hate speech or getting rid of people, or the health of the conversation. Right now they’re all talking about these “healthy conversations,” which I think is impossible. You can’t have healthy … You can have some healthy conversations on it, but it tends towards an unhealthy conversation.

I think that we live in a world where the American conception of the First Amendment and free expression is actually at one end of the political spectrum. We are having to adapt and adjust to certain standards from other countries that address certain aspects of hate speech, but I also agree it’s a very delicate balance. It is a balance where there are human rights interests that are important to advance, free expression. But it’s a big world and we do have to work, especially with other democratic governments across Europe and elsewhere, about how to address this.

Is it an impossible thing? Every week there’s something else. Recently there was one where a lot of people who pushed free speech then don’t like it when they’re insulted, and it’s like, “Oh, you can’t call a manager to fix it.” There’s no fixing any of it, people can say what they want. Then people become indignant when they themselves are subject to being attacked.

I think we need to move from a situation where we say we can’t do anything because it’s so difficult to figure out what to do, to a conversation where we identify at least some smart things that can be done, but recognize the limits on them and recognize the importance of this balance. There have been columnists at the New York Times that I think have said pretty persuasively, “A world where tech companies absolve themselves of all responsibility in the name of free expression is not actually creating the world, at least the world of technology, we want.”

That’s me, but go ahead.

And others, yeah.

Yeah, yeah. At the same time, do we want them making decisions?

Exactly.

(Right.)

Exactly.

My feeling is the architecture is thus, that it will only degenerate into crappy discussions.

Well, that’s one of the reasons I think we’re starting to see governments seeking to understand better how algorithms work in promoting certain kinds of conversations.

Related to hate speech and others is the press from the conservative side, like Senator Hawley and others, that there needs to be … and President Trump. We’re looking into this. I mean, he’s not looking into anything, but that’s beside the point. But the idea that it has to be even-handed, like a fairness doctrine.

I think it’s difficult to imagine how you import a fairness doctrine, and at the same time I think most of the people in the tech sector, at least at the leadership level, do strive for something that is not putting a thumb on the scale for one party or the other.

Nobody believes them. I agree with you. I think they are, but I think nobody believes them. Then related to that is the concept of whether these companies should be broken up. Now all the Democratic candidates, I’ve noticed lately, suddenly Pete Buttigieg is saying things. Elizabeth Warren, obviously, is the best known one of breaking them up or regulation. But all the Democrats have jumped on that particular bandwagon.

Do you see breakup as … Or will it be a combination of fines, which many people find inadequate. I find the recent YouTube one to be inadequate, the recent Facebook one to be inadequate. Is it a combination of things like fines and certain regulations? Or is breakup inevitable?

Well, I’m not one who believes that breakup is inevitable, nor am I an advocate for breaking up companies. In fact, if you are forced to spend as much time studying the history of antitrust law as I was 20 years ago, what you really see is that breakups are talked about all the time and are almost never imposed.

I think the real question should be, what problem do people want to solve? What is the best way to solve that problem? If one thinks it’s an antitrust problem, there are a variety of antitrust solutions, not potentially fines for all the reasons you mentioned, but more, other kinds of constraints that are imposed. But I think some of the problems that people worry about the most, in fact, may be problems that call for action outside the antitrust sphere, or we may see a third development.

And in fact, I think it’s fair to say it is a development that is emerging. Historically, antitrust law was used to go after solely economic issues and economic harms, and now we’re seeing a new school of antitrust thinking, which says no, let’s look at the harms to democracy, let’s look at the harms to privacy, or let’s think about the impact of data.

It’s not a consumer harm.

Yeah. The current head of the antitrust division at the Justice Department has endorsed this view.

Makan Delrahim has said this.

Exactly.

Which I think is the direction a lot of people are going.

I think that is probably an accurate observation and prediction.

Mm-hmm. All right, when you think of it that way, do you imagine that these companies will be broken up? You just don’t know, because here you haven’t had a new social media company since 2011. You haven’t had a search engine since the beginning of time. The beginning of search, there’s one. There’s just simply … I know you guys are trying real hard over at Bing, but sorry. They seem to dominate. Because a lot of people feel that there can’t be innovation without … why bother going into several different sectors? E-commerce is another one.

I’ll again say, I just don’t think it’s likely that breakup is the remedy …

That occurs.

… that typically is pursued, but we’ll see antitrust cases move forward. We’ll see sectoral regulations that are aimed at addressing specific parts of technology. I think we may see more of a focus on particular business models. Certainly, the advertising-based business model is increasingly in the crosshairs of regulators.

Well, there’s two of those.

Yeah. We’ll see more of that. Again, that’s not a statement of advocacy. That’s just the observation of somebody who’s lived with these issues for a quarter of a century.

Okay. Let’s finish up talking about rural broadband and sort of the talent gap of where people are. Because I think those go together: there’s talent everywhere, and I think it’s just a question of opportunity, not a question of talent. But we remain stubbornly non-diverse as a tech community, only because it’s a question of opportunity, it seems like.

There was a really good tweet yesterday of people jumping from jump to jump and someone called it “white guys funding each other.” You know what I mean? Because it was all like … and then they patted themselves on the back. Talk about what we need to do to bring in rural broadband to take advantage of the talent across the country. Microsoft has been doing some of this, making some investments in this area. Finish up talking about where that’s headed and when do we need a government program to do that.

Yeah, no, these are issues about which we have a lot of passion. I would say there are some big technology gaps in this country today, as well as in the world. The first gap is the rural gap, and we share the story of going to a particular county in eastern Washington state, the county that has the highest unemployment rate, and what you find is they’re still living in the 1990s, in the dial-up era. You see how difficult it is to attract jobs.

Right.

You see the impact on culture. You actually see this interesting phenomenon where almost … so much of the government data about rural communities is just inaccurate. And one argument we make in this book is that we need a national cause to bring broadband to every part of the country. We’ve done this. We did it with the telephone. Franklin Roosevelt did it with electricity. We need to do it in some ways with new technology. That’s what we’re focused on at Microsoft. We do need government funding for the right kind of targeted public matching investments and we should do it and say we’re going to close and eliminate this gap over the next five years. It can be done.

And you know, we highlight in the book the connection between that issue, which is a technology issue, and the politics of our day. We have a populist in the White House in part because rural America feels that the government doesn’t understand it. And we point out that in important respects, people in these communities are right. The government doesn’t understand it.

Then we have another gap. It’s the skills gap. And it is, I think, very disconcerting as we point out that if you look at who has access, say, in high school to a computer science class, the people who have access to these new skills are more male, more white, more affluent, and more urban than the United States as a whole. So here is this fantastic skill that can propel people into greater prosperity, a brighter future, and it is not being made available equally. And that is a critical public policy issue. Let’s face it, if the government does one thing almost entirely as a public matter, it’s education.

And it is letting us down, so we need more there. So when you look at these two gaps and you see them together and then you connect it with all the issues that we …

Opiates. Everything.

Yeah. Every single aspect of American society. You realize this is part of a technology issue and a technology skills issue, and we in the tech sector, and across the country more broadly, need to peel the layers of the onion and think about this technology dimension and think about, in part, what we as an industry can do to help solve it.

What are those? Just finish up. What would be the biggest solution to do that? Because a lot have been tried, this whole Silicon Holler. I just think it’s just … The jobs are … They’re not going to … A coal miner’s not going to … most of them are not going to become coders. They’re just not. And some of these programs aren’t very good. A lot of it is locationally based, but what do you think the most important thing to do, of all the things to try here?

Well, I’ll at least start with let’s eliminate the broadband gap.

Yes.

We’ve met young, promising African American high school students in rural Virginia, but they can’t excel in computer science because they can’t access the internet from home. And what we’ve pointed out is anytime you can move from a wired to a wireless technology, you just look at the history of technology. It spreads into rural areas much more rapidly. And yet, even here in 2019, we’re seeing most of the presidential candidates that have broadband plans talking about only one thing: fiber optic cables.

Right.

The hardest and most expensive solution to a problem that can be solved much faster, if we think about it with the benefit of some additional technology, and we would argue a smarter way to spend public money and to think about what companies can do. It’s what we’re doing. We have this Airband Initiative. We’re committed to working with telecommunications partners to bring broadband to 3 million people by the year 2022, but even more, what we’re really trying to do is build a market so the market can take off and help close this gap.

So broadband access. Training skills. How useful are those programs?

I am a big believer in training skills. One of the stories we share is something that we’ve championed, Microsoft and Boeing in Washington state together with the state government, the Opportunity Scholarship Program. And yeah, it’s now enabling thousands of low-income, often minority, typically first-generation college students to get a college degree in a high-demand STEM field.

And the thing that gives me the most enthusiasm about it, I’ve chaired the board for this since it was created in 2011, is that our studies are showing that five years after graduation, the average income of the students who have gone through this program is 50 percent higher than their entire family’s income was when they started going to college.

And you know, just think about our country, we so often worry – with great justification, in my view – that economic mobility is dead. People can’t rise up the way other generations did before. Here is an example, in my view, where with the right investment, public-private partnership, business leadership, you can make a difference.

I want to finish on how much the government should be involved. Other countries, like China, are moving ahead of us because we don’t have that anymore, that idea that the government is responsible for this.

Well, that is the other part of the equation that we describe in the book. The tech sector needs to do more, but we need governments to do more. We need governments to move faster. We need government to catch up with technology. We need not just government acting by itself, but we need a new era of multi-stakeholder efforts where governments and businesses, tech companies and nonprofits, work together. And if you look at a problem like broadband or you look at a problem like the skills gap, I think those are readily identifiable as issues where this is the right formula.

But even when it comes to something like cybersecurity, if you want to protect cyberspace, which is often owned and operated by private companies, we need a new era where we’re working together in new ways.

Do you see that coming?

(I do.)

You do?

I am realistic enough, as we say in the book, to recognize we have huge headwinds. It starts with gridlock at home and nationalism around the world. But every day, we have opportunities to build coalitions of the willing. And that is what we’re doing. We’ve done it to address some of the cybersecurity threats. We’ve done it to start to address some of these skills-gap issues. You look at the different nonprofits that are playing such an important role in this space. We’ve done it now to start to make some progress around rural broadband.

If we could just take the great energy and enthusiasm that is so clearly evident across the tech sector and, instead of saying we’re going to move fast and break things, say we’re going to work together and fix some things. We’re not going to solve every problem in the world. That would be naive. But there’s a lot more good that we can do than what we’re doing right now.

I actually have one more quick question. With all the upcoming technologies, which one are you most nervous about in terms of weapons and which one are you most positive about will be a tool?

I’m probably the most nervous about facial recognition because of its potential for abuse. And I’m probably the most optimistic about artificial intelligence more broadly, simply because of the impact it actually is starting to have in helping us solve some of the most fundamental societal challenges of our time.

Including health and disease and everywhere.

Yes. Yeah. Medicine, disaster relief, protection of human rights. I think about something like Microsoft Seeing AI. My gosh. Give a person who is blind the ability to use the camera in their phone to identify what is in the world around them, you change a person’s life.

All right, Brad Smith, thank you so much for coming. We really appreciate it.

Good. Thank you.

Can’t believe I’m saying Microsoft’s good for the world. It’s been a long journey for us, hasn’t it been? That’s Frank Shaw laughing in the background. Anyway, thank you for coming on the show.
