Ethics in the Digital Age

Internet DNA Podcast

Who should regulate the tech industry? Ethics is not black and white. Silicon Valley should 'do no evil', but who defines evil? It is a minefield and a polarising topic: why are the vast majority of people so vocal against a company's wrongdoing, yet don't actually change their habits? Should we be offered what we want even if it is not what we need, and who should decide? We discuss the differences between government and self-regulation, morals versus ethics, and how an ethical framework needs to be put in place… soon.



(this transcription is written by robots… so don’t be surprised!)

Hello, welcome to this week's episode. It's Internet DNA with me, Abi. This week we're going to discuss ethics in the digital age.

Excellent. I feel slightly overawed by the enormity of it and the unknown of it. I'm quite binary. I like yes, no. Whereas this is such a gray area from start to finish. I was even going to say at least our morals are okay, but they're not. Everyone's in uproar about Facebook and all the data it steals, but they keep on using Facebook. They don't really give two hoots. They just shout about it. Which is, I'm jumping the gun a bit here as I normally do, but that's where to me the complication starts: what everyone shouts about and what they do are two completely different things.

So who's responsible at the end of the day? I think there are a number of things. There are unintended consequences, there are intended consequences, and then there's how people react to those. So I guess an unintended one is if we look at something like Airbnb in Spain: I think you now need a license to use Airbnb. You can't just rent out any property, because what they found was happening was it was forcing up rents, especially in places like Barcelona, where suddenly people weren't able to rent anywhere in the city because everyone was just Airbnb-ing it. Probably Airbnb, when they set out, didn't set out to raise rents in popular destinations, but because it's a disruptive technology, suddenly it was easy for everybody to rent out any property fairly cheaply. You probably know more about this than I do. It has knock-on consequences, and then it's: should the company regulate that? Should Airbnb look at what it's renting and say, actually, this is just someone's house?

Okay. You're saying, should Airbnb look at it and say you can't do it, or should it be a government law? Yes. Now let's turn that on its head. For some people it's bad because they could no longer afford to rent. For other people, who are perhaps struggling to live or needed an income, it's amazing, because suddenly they have an income that they were never able to have before. Maybe they're at home looking after children. There are so many reasons why Airbnb has improved so many people's lives. So you have to look at both sides. So by Airbnb saying, well, you can't rent that because it could actually be a normal rental and therefore it's ethically wrong, they may have sorted out one person's life, who could now go and rent it, but they may have destroyed the other person's life, who was trying to rent it out on Airbnb.

Okay, so now let's say it's done by government, like in Spain where you now need a license to rent out property. How would that change that person's life? Because you're still going to have to go and get a license.

This is why I sort of struggle with it, and it's such a gray area, and possibly why ethics has not caught up with the digital disruption at all. Because what's the phrase? Rob Peter to pay Paul. You're helping one person by doing one thing and hurting the other person by doing the other thing. I don't know which is right, so I don't know what to do. I mean, in the town near where we live, there are hundreds of Airbnbs, and possibly they could be rented long term instead. I don't know what the solution is.

So now you've got to weigh up: is it more ethical to ensure that everybody has somewhere to live, or that some people have an income?

Let's look at unintended consequences. Take Barcelona, or wherever it is that this has become law. By removing the ability to rent these houses out on Airbnb, because it had pushed rents up by 40%, has it also pushed down the amount that the town or city earns through tourism by no longer having that number of people there? So are there other areas of the town that are going to be affected by this law as well, negatively?

This is the question. So obviously it's not an Abi-focused yes/no answer. Actually, what it has to be is: we have to regulate this, because if it's left unregulated, then basically no one can actually live here and it just turns into a holiday resort, which we don't want. But Barcelona is a very touristy place; that is a large part of its income. And so therefore we're saying, okay, we now need to regulate, because Airbnb are not going to regulate, because it's not in their business interest to do so, and also it would be very difficult for them to do so, whereas the government has probably got more data on people: where they live, where workers live, where holidaymakers stay. Yeah.

I mean, it is very difficult and I don't know the answer, but it does bring the question of who regulates it to every other large tech company these days. Before these tech giants, was it ever heard of that a company would self-regulate, that it would bring laws against itself?

Well, there's an interesting question there. Was it last month? I think the UK Gambling Commission were looking at loot boxes. They ruled that under the current regulations they're not a form of gambling, but that issue probably lay with the definition of gambling rather than the fact that this was not a bad thing and should be considered gambling. Probably we should change the law to make sure that it is, because this is gambling. It's games of chance.

Okay, and have you ever heard of a gambling company self-regulating itself, or a cigarette company self-regulating itself? 'We're not going to sell cigarettes because they're bad for people'? No. I don't think it is the job of commercial businesses to regulate themselves, because they can't, because it's not in their self-interest.

Okay, so let's talk about Cambridge Analytica and Facebook. Yeah. So do you think that Facebook should have self-regulated the access it has to people's data?

At the time that they decided to allow all data to be easily accessible and free, this was very much a time of the internet when everybody was going: let's share, let's make the internet free, let's be transparent, let's not be Microsoft and lock it all down. We're part of the community, we're all in it together, this is a brave new world. And that's what they did, and that's what everything about the internet on that side was: everyone was sharing, everyone had access to everyone else's code. It was where the advent of being able to create apps for other people's systems came from. So at the time it made perfect sense. The knock-on consequence was that everybody would use things to create a business from, to make money. So do I think they should have self-regulated? At the time it was the right thing to do, and we all in the tech world felt that that was a great way to be: free and open and transparent.

But that's not what they were really doing, was it? They were selling your data, basically, to companies for marketing. The deep data was sold on the basis of: we will give you deeper access to our data than the normal Facebook API so that you can use that data. Yeah, let's put it another way. Does Facebook have any responsibility to look after their audience?

Yes. Right. I'm playing devil's advocate. Yeah. Take it back to when Mark Zuckerberg was probably 22 or 23. At that time, nobody even thought of your name and your gender and your email address and your likes as some valuable currency.

Well, Facebook did, because that's why they were valued at that sort of money. They knew that where the money was, was in the data. That's why it was worth so much. It wasn't worth so much because people were posting pictures of their cat.

I don't say Zuckerberg was inherently evil. I think he was naive. I think he is still very young as a CEO of something that's bigger than a global government.

So it's all right to be naive?

No, no, it's not all right. But for you and I, if it was giving its users a service that they thought was absolutely fantastic, they had to pay for it in some way. Facebook was free, but the system and the money spent to build that system was far from free. How were they going to pay to give all these users this free service that they wanted?

Yeah, but maybe then they should have explained: we are going to track everything that you do and everything that you say, and we're going to sell it to third parties for financial gain.

Well, didn't you know that anyway?

I still think very few people truly understand that if you're not paying for it, you're the product.

Yeah, and even though they do know, and it's been very broadly advertised that this is what they do, nobody has left Facebook. Not the users, not the advertisers.

No, and that's... so there is a part of this which says if you're going to be stupid about it, then you should expect what you get. But there is another part of it where you need to say: some of this, if it was done in any other domain and not in a digital domain, would be seen as unethical. It would be seen as: you were saying you were doing one thing, but actually behind the scenes you were doing something very, very different. So when we look at AI, for example, which is fed by all this data, one of the beautiful things about all this data is we can do things with AI and machine learning, and those things can be done for the good or for the bad. I think we've all discussed that technology itself is not inherently good or bad; it's how you wish to use it. That's where the ethics come into it. So you need to start to think to yourself, well, is this really ethical? Is using massively addictive techniques to keep people on the screen all the time ethical? At what point does that become unethical? At what point do micropayments to children become unethical? These are all questions that we need to ask now.

But it's not only done in the digital world. Take any child's craze throughout history and it's daylight robbery. You know those horrible little smelly fairy things that all have names, and everyone wants to swap them, and they cost a fortune and you have to buy more and more? It's actually the same. Yeah.

For example, the government says you can't sell sweets at the checkout anymore in supermarkets, because they've decided that that was a point where it was very difficult for a parent who's trying to check out with their kid grabbing sweets.

Okay, so that was the government. And also, people were selling their phone lists and their mailing lists before technology, and that was also ruled against.

When you start crunching vast amounts of data, you can start to make very, very accurate predictions about people without actually knowing those things. So you can say, well, we don't store their race or their gender or their religion in our databases, but we can infer it from the data. They did it with Uber, where they could work out pretty much what race and gender people were from the postcodes. So all this stuff that you're not allowed to hold, that you're not allowed to share, you can still get at via the back door. And how would you regulate that by ethics now? How would you regulate the machine learning, the fact that these biases are in the data even if you don't put them in there, because there will be other bits of data that point to them?

And this AI and massive data crunching can also be used for good. Yeah. The needle in the haystack that can be found can also be used to save lives: through medicine, through food, through climate. So how do you regulate it? How do you create the ethical law? Because if you say, right, this sort of data crunching is wrong, then you've stopped a whole area where that data crunching was doing a lot of good.

But don't you regulate not that this kind of data crunching is wrong, but when it's used for a particular purpose? So when it's used for saving lives, that's fine. When it's used to make sure that people who have genetic markers for diseases can't get medical insurance, when at the end of the day they're the people that are really going to need it, then you might say, well, you can't use this kind of AI predictive stuff for things like that, and then you'd have to obviously list out things. I think the problem with it is that the technology is moving so fast it's going to be very difficult to regulate. It was going to be very difficult to regulate for Airbnb before Airbnb came along.

Well, exactly. And if you look at the porn industry, it basically pushed forward and enhanced the online industry; it was the one that helped video streaming become much more advanced. There are many ways the web was just perfect for porn. Now, most people would say, well, porn's bad, porn shouldn't have been on there, but if it hadn't done these things... there are lots of forces for good, whether you call YouTube good or bad; there are lots of things that may not have happened if the porn industry hadn't first had the idea and put the resources in. So if people are going to be blocked in everything they do, you are not going to have these brilliant but perhaps less socially aware minds pushing the boundaries, so you're not going to get the good stuff as well as not getting the bad stuff. So my worry is, if you limit everything in its infancy, it's still not going to be a better world.

Okay, and I totally agree. This is almost what we were saying, which is: it's not that you say you can't do X, but you say you can't do X if the explicit purpose of it is to do Y. It's not that you can't do data crunching, you can't do profiling, you can't do segmenting, but you can't do it in the following situations or industries.

But as we just said, Airbnb didn't know what they were going to be doing. Airbnb thought people were renting out sofas. Yeah. So if you're saying before you even start a business you've got to be careful of this enormous number of things, then you're not even going to get that far, because you feel like you're trying to start something new with your hands tied behind your back. Facebook didn't really know what it was doing. Uber probably did know what it was doing. The food delivery industry is enormous, and they didn't know what they should be doing about the delivery for the restaurants; they thought that they'd be leaving restaurants to do the delivery. So people's plan changes as it grows.

You're confusing things where you're saying you can't do this thing. That's not the answer. All of this regulation is going to be, by its very nature, lagging behind the technology, because as you say, we're not necessarily going to know what the downsides of these technologies are till they happen. I think you can quite clearly say: you are not allowed to use people's inferred data in order to change the way you price things for people, especially in the areas of medical insurance or housing. So you can't say the rent for you on this property, because of the person that you are, is X or Y. What you can say is: you don't pass the following criteria, like a credit score. But if you say, well, because you're a person who works in this industry, and you've listened to this sort of music, and we've looked at Facebook data, and we think that you're probably not the sort of person we want, then you're into a world of: this is not really where we want to go, is it?

I totally agree, and I think that it's a government or a global government issue, not a company issue, and I think that once they see something happening that's bad, all the countries should come together and look at how it can be regulated. What I'm worried about is that if you try and slap on too many regulations, then those people with the brilliant minds that are coming up with these things... you're not stopping them innovating and being creative, but you are slightly dampening their genius. If they know that they're just going to come up against barriers in every area, then they're not going to have the enthusiasm to try as hard from the beginning. Whereas if you let people run and then regulate it...

Once the damage is done. I think you're saying: if you're doing something that you know is ethically questionable, do it anyway, make some money until they close the door. Which is basically saying we should not regulate anything at all until we know how it's damaging.

Okay, well, this is where your morals should come in. And we did talk about the difference between ethics and morals. So if you morally know that something is wrong, then you shouldn't do it, but that's a lot harder to regulate. You shouldn't be in the position that you started off a business to do something wrong.

And also, morals are very dangerous, because obviously if you're some sort of outlier, your morality may be very, very different from other people's morality. So actually you do need a set of ethical guidelines that say you should not be disadvantaging people, and if you are disadvantaging them, you should be mitigating it. I understand what you're saying is: yeah, but now I'm not even going to start, because where am I here? But if you could say, well, there was no way of knowing that this would arise, or when we did our risk analysis this was not a thing that came up, then I think that's perfectly fair. You know, this was an unintended consequence that no one foresaw.

Yeah, my worry is two guys at uni have come up with this great idea and they go, let's just try it, let's do a proof of concept. They don't really have a clue about where it might go. No, but as [inaudible] as it goes... I don't want to stifle that spark of creativity.

There is a big difference between two guys at a university with a thousand people on there and a $7 billion company. There is a gap between those two places.

Don't get me wrong, I 100% think that Facebook should be regulated, but I think that it is the government's business to regulate. I still don't believe that in the past companies regulated themselves. Yes, they morally may have done good, but I don't think that self-regulation is something that has happened before, and if it is something that's going to happen in the future, then it needs to be laid out in law.

Well, there are some things that we can do. As we know, in the food or the medical industry, you have to at least say, well, we can say that this isn't going to kill you. In the medical industry you have to say it's at least as good as the other ones. You don't have to say it's better, but you have to say it's as good, and it won't kill you, or it kills you in a different way to the other ones. So for some people it will be a better drug and for other people it will be a worse drug. There is a framework that says there are some basic things that you need to clear. Like with food: it should probably not be poisonous. It should probably not...

But that's regulation already. I mean, take it to vaping. Should they be self-regulating or should there be a law against it? No, right, they shouldn't be self-regulating. Okay, but they're not. I don't think anybody should be self-regulating, because I think that's just not... You're saying that the tech companies should be self-regulating, that Facebook should have stopped itself doing bad stuff? They should have a regulatory framework within which they operate, and that simply says, like with food: don't put poisons in food. Real simple stuff. Yeah.

Vaping has been really interesting to me, because obviously America is a big tobacco-producing country, and they've said it's absolutely fine to vape as long as you're only vaping menthol or tobacco flavors. But what you can't do is have fruit flavors, because children like fruit. But smoking is still fine. Don't worry about smoking, because that's okay. But fruit flavors in vapes? We need to ban this immediately, this is a danger to society. And this is my worry about regulation from governments, which is that they're swayed by lobbying groups who have vested interests. Very obviously, in America the tobacco industry has gone: we want young people to smoke cigarettes, we don't want them to vape the strawberries and cream. So if they've got to start by vaping, then we want them to vape tobacco flavors, so that when they move to cigarettes it's a much easier transition for them. And I think that's a really, really worrying bit of information.

Oh, but people have died from vaping, by mixing their own stuff.

By buying stuff from some very weird people and trying to put their own THC mixtures into it. Compared to the number of people that die on a daily basis from smoking, the number is just vanishingly small.

That's the thing, isn't it? People can cope with small numbers; they can't cope with big ones. You know, smoking is like too much for your brain to fathom, but a few people dying from vaping, that's personal, I think.

So it's the fact that it doesn't happen very often, and so it's newsworthy. Whereas people dying in car crashes happens all the time, and therefore it doesn't reach the news unless they are Diana [inaudible].

Google's mission statement used to be 'do no evil', which they removed the moment they started doing evil. But if the tech industry had lived by that statement, then perhaps the world would have been a better place. Perhaps if Facebook and Google and everybody else, as they grew, had lived by the 'do no evil': does my new technology or my new algorithm fit with this? Is the answer yes, it does no evil? Then let's carry on. If the answer is no, it does evil, then we shouldn't do it. So it's almost like shining, it's my binary again, the yes/no light on it. Is that evil? Yes: stop. No evil: carry on.

But define evil for me.

But someone's going to have to make that decision, aren't they? And it's going to have to be turned into law.

And when they turn it into law, that is the point where it goes wrong. Take the great idea of GDPR, which was: someone's data is their own, and you should ask their permission to use that data for whatever purpose you need to use it for. And if you've asked them for that permission, then that's fine. But when it got turned into law, what actually happened was: stick up a big button that everyone presses, 'I agree', and then you can do whatever the hell you want. And worse, now you've actually even got their consent, so they can't even come back and go, well, I didn't know you were doing that. You can go, well, actually, if you'd read through the details page, you'd have seen the 7,000 cookies and tracking devices we're placing on your browser that you agreed to. This is where the difficulty of law becomes a problem. We all know what we're trying to achieve, but how do you actually frame that legally so that the outcome is what you're intending?

So who should take responsibility? And I don't know where we are between you and me, but, and I sound really evil myself, a company that is coming up with ideas, that's running fast, that then gets bigger and is making money for its shareholders and giving its users what they want, even if it's bad for the users. So there's another thing. Let's say I love my Facebook. I don't pay for it, and I love using it, and I love everything about it, but it's bad for me: either I'm becoming addicted or it's stealing all my data. Yup. Has the company done a good thing or a bad thing?

Here we go. Has the company purposefully created a system that does those things to you? Has that company spent thousands and thousands and millions of pounds on ensuring that you do become addicted to it? As with alcohol and sugar? Absolutely. Which is why, for example, they banned cigarette advertising. It's inherently not a good product; as an ex-smoker, I can tell you tobacco is inherently not a good product. And therefore they said, actually, we probably shouldn't be associating smoking in people's minds with cool things.

So you shouldn't give me what I want because it isn't what I need. And who is the person to decide?

You shouldn't create a want in someone that they absolutely do not need.

But that's the whole of capitalist society. I mean, clothes. I don't need clothes. I mean, obviously I need clothes.

Sure, but that's a conversation about what's the difference between consumerism and capitalism. Because I think what you're about to talk about is consumerism: the mindless buying of stuff that you don't need.

Yeah. So you're telling me that it's fine for the shops to sell me all those really cheap clothes made by people who are in bad working conditions? And that's another thing in itself, because if those people didn't have the work in the bad conditions, they wouldn't work at all and they wouldn't have the money. So there's a whole heap of difficulty there as well. But you're telling me that's different, that me buying too many clothes is different from me being on Facebook.

Well, what I would maybe argue there is that the clothes are not inherently doing you any harm. They may be damaging your wallet, but that's a choice that you're prepared to make.

They may be damaging the people that are making them.

Like, as you said, it may not be, it may be their only form of income.

Exactly. But it also might be damaging the environment.

It may well be. And then, yeah. So it is a minefield, and I think we're all agreeing it's a minefield. But where you say, are you selling something that you know actively damages someone's life... I don't know if you'd say clothes are actively damaging your life. I mean, maybe if they're too small or something. Yeah, exactly, or shoes that are way too small. If you're promoting something that you know actively damages someone's life, and not only are you promoting it, you know that it does that, and you're engineering it so that people want to do it: smoking is a perfect example. They made it very, very cool. Everybody used to smoke; it was seen as the thing. Every sports event was advertised on it. It was everywhere. They were actively promoting something that killed you, not just potentially killed you.

It will kill you if you do it for long enough. So is that ethical? No, I think that's totally unethical. And I think it's right that governments say: you should be able to smoke if you definitely want to smoke; if that's something that you absolutely want to do, feel free. But what we're not going to do is promote it. And I know some people go, oh no, who's the big government, it's personal choice. But the fact is, people are very malleable, generally. That's how marketing works: we all want things, we're programmed for status, because we're basically a pack animal, social people. It's very difficult to create a law before the thing you're really against has happened, but you can do a light-touch framework. Even if it just says: if you know, or it could be reasonably deduced, that what you're doing is actively harming people, then you should definitely have to do a study. Yes.

There needs to be a governing body on the internet that only deals with ethics. It's an enormous industry that is currently unregulated because it's so new, and there are people who, day in, day out, are looking at this. Ethics by its nature is difficult to deal with. I think you're always going to be on the back foot, because what you're ruling about can't be ruled about until it's happening.

But I do think, with no law degree at all, that there could be parameters put in place, guidelines, so that when I'm coming up with this exciting new idea, I can go through the list, a bit like when you get on an airplane and there are all these things you're not allowed to put in your hand luggage. I can go through the list and go: no, it doesn't; no, no, no, that's fine; oh, I'd better look into that. So something that is clear as you're coming up with these ideas, so that you can work around them, not illegally work around them, but make sure that you're not going headlong into something that is going to be dangerous, if that's the right word, for another human being. Then I agree that that is a really good idea. But we've both come down, again, to the fact that it is a governing body or governments, not the company itself. The company itself needs to have a moral obligation, but, and maybe I'm wrong, we have come to the conclusion that they are not any good at regulating themselves.

They're not good at regulating themselves, and the general public, for some reason, are totally unbothered by that. You'd hope that the downside of not doing those sorts of things would be that you would suffer massive reputational damage, and you'd think, well, isn't that exactly what happened to Facebook? But it didn't. It didn't affect their [inaudible].

No. And Facebook was in court, and they said, do something about it or we'll fine you. They got fined. It didn't really matter. It proves that it isn't a workable thing. It's a bit like putting a whole heap of sweets in a child's room and telling them not to have them.

Whereas if you'd said to them: right, you have to delete all of these sets of data, and you are not allowed to share your data with anybody at all without going through a regulator, because we believe that you have basically crossed the Rubicon, then they may have.

And that it's a federal offense, so you may end up in prison. Because then they would have had a set of rules to understand what it is they had to do. I think the problem with self-regulating was, in the case of Facebook, the court didn't really know what to tell them to do, so they didn't really know what to do, so they didn't really do anything. Whereas if they're told, here is a law, this is what you must do, I think it's easier to abide by it. I seem to be standing up for Facebook here, but that's not my intention. Yeah.

No, I understand what you're saying, which is you limit the progress by creating a legal framework that basically means that no one can even be bothered because it's just too much of an ask. I get that. But there also needs to be a thing which says: there are a set of guidelines, here they are, they're very clear, and you need to revisit them at these points, when you start to do this, when you reach this size, so that you're not over-regulating the tiny guy. Because when you've only got a thousand followers, what damage are you really doing?

And you're doing it on a shoestring.

You're still finding your feet, and there will be things that are well beyond your ability to even do a study on; you're too busy coding or whatever you're doing. But as you grow and you go through certain different stages, you should say: okay, at this point you should do a study, you should consider the following factors, and if your score is over 32 you should refer it to the regulatory body for an independent study. Something along those lines. And here I am asking for more government, which is really not me at all.

We're the wrong way around this time: I started off very animated and I haven't stopped. I think it's the podcast where we've talked over each other most, but we are going to have to leave it there. I'm sure we could carry on for ages, but we have to go. Speak to you next week. Bye.

Dan & Abi work, talk & dream in tech. If you would like to discuss any speaking opportunity, contact us.