Emily Gorcenski: The Ethics of the Internet of Things | JSConf EU 2017



Thank you all very much. I'm going to talk about the ethics of the Internet of Things, and I promise that I'm not going to lecture you too much. I'm Emily Gorcenski, I'm on Twitter, I say things there sometimes, but I'm really interested in the Internet of Things and the landscape that it's creating.

So why am I talking about the Internet of Things at a JavaScript conference? Why am I talking about ethics at a JavaScript conference? It's because I've given this talk a few times, and I joke that every time I give it, I could give a brand new talk: there are so many issues, so many failures and bugs and security problems coming up so frequently that if I just focused on case studies, it would be a brand new talk every time. This time I've decided that I don't want to give this talk anymore; I want you all to be able to give this talk. So I want to talk a little more generally about why ethics matters, why it should matter to you as a JavaScript developer, and what we can do as we head into a future of putting JavaScript, putting technology, into all sorts of devices and services where it doesn't normally belong.

It's almost impossible to do an ethics talk without getting into some really heavy stuff, so there are some content warnings for this talk: we're going to talk frankly about some incidents that resulted in injury and death, there's going to be a discussion of a specific incident of sexual assault, and I'm going to show an image of raw meat, so if that kind of squicks you out like it does me, that comes about ten minutes into the talk.

So who am I? I have a little bit of a confession: I'm kind of an impostor here, because I'm not a JavaScript developer. The last time I wrote JavaScript seriously, Dojo Toolkit was still the future. I'm a data scientist, and I'm trained as a mathematician but also as an engineer: I went to school for aeronautical and mechanical engineering, and over my career I've worked in the aerospace, biotech, and now finance industries. What these industries have in common is that they're all heavily regulated, and most of the people working in them subscribe to a professional code of ethics, through an independent society or some other organization that guides what ethical conduct means. So here I am: I've worked in defense, I've worked in health care, now I'm working in banking, and I'm going to talk to you about ethics. Buckle in.

When I talk about the Internet of Things, what do I mean? It's kind of a wishy-washy definition. We might think of smart fridges or smart cars, that sort of thing; I like to think of it as putting the internet where it doesn't normally belong. So it could be smart appliances, but I also think of something like Uber as an IoT taxi. When we look at the ethics of this, we have to look at the entire scope of what we're doing with our technology, of what we're connecting. And the difference is not that these devices, products, and services haven't been computerized before; it's that we're letting the consumer have connectivity to what's going on. If you're a JavaScript developer, maybe you want an IoT bread machine so that you can keep hacking on your code while your bread bakes. This matters because IoT products are the next level of convenience optimization: we've spent the last 30 years optimizing products for convenience, and there's not much more competitive advantage you can get out of a refrigerator nowadays. So if you don't have a competitive advantage with a non-connected device, you have to go connected.
This is also important for people whose lives are affected by disability. You might be concerned about the surveillance capabilities of IoT devices, or about the horrible things that Uber has been accused of doing, but if you're not able to get around, or if you don't live in a place where there's easy taxi service and you have other needs, something like Uber is a lifesaver; it really changes your life. So we can't just write it off as an absurdity. We can't just say that IoT is frivolous, or a toy. Yes, there's the Internet of Shit Twitter account, it's hilarious, and there are a lot of misses out there, but there's a lot of good that comes out of IoT as well.

And when I talk about ethics, what do I mean by that word? The trolley problem is really popular right now because of self-driving cars, and you've probably seen this diagram. The framing goes: there's one Nobel laureate tied to one set of tracks, there are five normal people tied to another, and somehow you have been put in the position of pulling the lever. This is a really popular problem on the internet right now because, one, it feels like something we can solve with category theory if we just abstract it enough, and two, it makes for some really dank memes. The thing about the trolley problem is that it wasn't even a problem for trolleys, so why do we think it's going to be a problem for self-driving cars? We love to frame things as ethical dilemmas that are puzzles to solve; that's our nature as developers and engineers. The thing is, in tech we don't actually face ethical dilemmas that often. Ethical dilemmas happen when there are two competing ethical frameworks and any action you take cannot avoid violating at least one of them. And what I think is fascinating about JavaScript is that the JavaScript community is responsible for what I consider the most fascinating true ethical dilemma in a decade of technology. I'll get to that a little later; some of you probably know exactly what I'm talking about already.

The issue with technology is that we often just don't act with ethics. I don't mean this as an indictment; I don't mean to say you're bad, unethical, immoral human beings, though there are some companies out there that are going to get a side-eye right now. What I mean is that our industry doesn't have a professional code. There are some societies you can join (raise your hand if you're a member of the ACM or IEEE; there are a few of you out there, but it's not the majority). In practice, ethics is about two things: the analysis of harm and the mitigation of risk. When we talk about acting ethically, especially in something like research ethics, we're not trying to eliminate the possibility of somebody getting hurt. We're trying to understand all of the ways that somebody might be hurt by our technology, and we're looking for the actions we can take to mitigate the chance of that happening, to mitigate the severity when it does happen, and to provide remediation when it inevitably does. This is what we need to build up as our ethical framework when we're developing technology, particularly for IoT.

So harm can happen in three ways. The first is through malfeasance. This is the most common topic in IoT: this is security, this is people talking about hacking.
When the Mirai botnet launched its DDoS attack last fall, that was a huge issue; it was the biggest DDoS attack ever witnessed, and it happened through IoT devices that were unsecured. You probably know that IoT security is in a pretty abysmal state right now. When this happened, the timing of it and the way it was structured gave a lot of people concerns that it was a precursor to an attack on the U.S. presidential election, an attempt to influence the outcome of that election. As it turns out, those fears were unfounded; we managed to screw that one up all by ourselves. But I don't want to talk about security in this talk, first because I can't cover everything, and second because the other two ways that harm can happen are through failures and edge cases, and if we address those, we also address many of the security issues. Failures are things like bugs in software; edge cases happen when a device is acting under nominal operating circumstances but gets put into a condition that we did not predict as developers. It's also worth mentioning that we sometimes like to treat bugs as edge cases and edge cases as bugs, so there's not really a lot of difference there except semantics.

Here's a great example, from Twitter. This poor gentleman, Andrew, had an IoT water cooler, and its TLS certificate expired, which led to some blocking code, which meant that a hardware interlock failed, and he had water all over his house. This is a real issue. If a TLS certificate expires in a web service, like a forum in the web space, and we forgot to renew it and we have blocking code, that's an ops problem; we have ops people to deal with that. But we can't treat IoT devices like cattle anymore. We have to treat them like pets: pets that live in people's homes and get very, very angry when they don't get fed one day. If we're not careful, we're going to put JavaScript into, I don't know, an IoT tea kettle, and then it's going to light somebody's house on fire because undefined is not a function. I stole that joke from Laurie Voss, by the way, which is total payback because he didn't send me npm socks.
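What would the non-catastrophic version of that water cooler look like? Here is a minimal sketch in JavaScript; the `valve` and `cloud` objects and their methods are hypothetical stand-ins, not any real device API. The principle is just that network calls are advisory and time-boxed, and the safety decision never waits on the network:

```js
// Hypothetical sketch: never let connectivity block a hardware interlock.
async function fillCup(valve, cloud) {
  valve.open();

  // Local fail-safe: close the valve after at most 10 seconds,
  // no matter what the network is doing.
  const shutoff = setTimeout(() => valve.close(), 10000);

  try {
    // Time-boxed check-in. An expired TLS certificate makes this
    // reject with an error; it must never wedge the device.
    await withTimeout(cloud.logDispense(), 2000);
  } catch (err) {
    console.warn('check-in failed, continuing offline:', err.message);
  }

  // The local flow sensor, not the cloud, decides when to stop.
  await valve.waitForTargetVolume();
  clearTimeout(shutoff);
  valve.close();
}

// Helper: reject any promise that takes longer than ms milliseconds.
function withTimeout(promise, ms) {
  return Promise.race([
    promise,
    new Promise((_, reject) =>
      setTimeout(() => reject(new Error('timed out')), ms),
    ),
  ]);
}
```

The design choice worth noticing is that the happy path and the offline path converge: the device degrades into a dumb water cooler instead of escalating into a flood.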
Here's something I did a couple of years ago that I'm very proud of. Let me go full screen on this, if I can figure out where my mouse is. There we go. This is a Microsoft Band, and I'm not picking on Microsoft at all here, and this is a piece of raw chicken. I didn't do anything horrible to a chicken; there's no zombie chicken out there, this is just a piece of meat that I bought from the grocery store. And as you can see, it's reading a heart rate of about 120 beats per minute. In the real world, sensors are messy, they're noisy, they're imperfect, and when we're designing for IoT we have to take this into consideration.
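One small, concrete thing a developer can do with a noisy sensor is refuse to report numbers the device can't vouch for. This is a hypothetical sketch (the `sensor` object, its `signalQuality()` method, and the thresholds are all made up for illustration); the point is that a reading should carry a confidence, not just a number:

```js
// Hypothetical plausibility gate for an optical heart-rate sensor.
function readHeartRate(sensor) {
  const bpm = sensor.sample();
  const quality = sensor.signalQuality(); // 0..1: how clean the optical signal is

  // A piece of raw chicken will happily produce "120 bpm", but with a
  // weak, incoherent signal. Gate on signal quality and physiological
  // range before treating the number as fact.
  if (quality < 0.8 || bpm < 30 || bpm > 220) {
    return { bpm: null, confidence: quality, status: 'unreliable' };
  }
  return { bpm, confidence: quality, status: 'ok' };
}
```

If devices stored that status alongside every reading, "120 beats per minute" could never be presented downstream as an unqualified fact.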
It's absurd that you can read a heart rate off of a piece of chicken breast, but this actually has deep, deep ramifications. For one, there are colleges out there that are mandating that students wear Fitbits. There are employers out there with health insurance incentive programs for doing this. And if you're following what's happening with American health care right now, we have this issue where we have surveillance devices that are monitoring our health and can report on pre-existing conditions. This is not just hypothetical; this is something that's really happened. Let me get out of full-screen mode if I can.

In 2015, a woman was visiting a co-worker in Lancaster, Pennsylvania. She called police to report a sexual assault. When police investigated, they found her Fitbit, and with her permission they analyzed the data. And when they analyzed the data, not only did they drop the investigation into her claims, but they turned around and charged her with making false statements to the police. Last year she pled guilty to those charges, was convicted, and was put on probation. The prosecuting attorney said that the Fitbit data "sealed the deal." I can pull 120 beats per minute off of a piece of raw chicken, and a woman's life is ruined because nobody at Fitbit stood up and said: no, our devices are not that accurate, you can't do that. Our devices bear false witness against us, or they can, and the problem is that there's no regulation, there's no quality assurance, there are no standards for how we build them. We just ship code, we just ship hardware, we make the next thing, we innovate fast, fast, fast, and we don't ask ourselves what kinds of harm can happen when this goes wrong.

And this is happening increasingly often; these devices are being used in criminal and civil investigations. Just last week, CNN reported that a man is being charged with the murder of his wife based on her Fitbit data, which said that she travelled a certain distance, and that distance did not correlate with his story. But anybody who has knitted while wearing one, for example, will know that it will record steps while you're sitting on your couch. How can we do this? How can we let this happen? How can we let this information affect people's lives? In another incident, a smart water meter was used in a murder investigation last year, and in that same investigation they also filed a warrant for Amazon Echo data. The question is: who's going to go to jail, who's going to get put on probation, when a device makes false statements to the police? Moreover, say something happens, say something breaks, and somebody gets hurt or somebody gets killed. Who's going to be liable if that device causes an accident? Is it the owner? Is it the developer? The company that made it? It seems like it should be a settled question, but it's actually not.

And this has already happened. In this frame, the white vehicle on the right is a Google self-driving car, and this image is a screen capture from a video taken from the dashcam of a municipal bus in Mountain View, California, right before a historic moment: that Google SUV is about to pull out in front of the bus and get in an accident. Thankfully nobody was hurt; there were no injuries, it was just a fender bender. But this is the first time that a self-driving car has ever been found responsible for causing an accident. Google fessed up to this. They said, you know what, our bad, we'll take care of the damages. And they investigated what happened, and they concluded that the car "predicted that the bus would yield to us because we were ahead of it." Now, Google is in a position right now where they want to ship self-driving cars, so of course they're going to assume liability for this, because they don't want to test it in court. But we can't rely on that as we go into the IoT future; we cannot rely on benevolent corporations to assume liability once this goes out at scale. By the way, even if Google was right, this was still going to be a historic moment, because if that bus had yielded to that vehicle, it would be the first time in history that a municipal bus has ever yielded.

A few years ago, a judge in San Francisco, as part of a research project, not part of a case, looked into the question of whether autonomous systems fall under existing theories of liability. And in looking into it, he found that vehicles that make their own decisions, that use things like neural nets and adaptive, self-adjusting control systems, smart systems if you will, devising their own means of accomplishing tasks, may not be subject to liability under any existing theories of tort. This has huge implications, because if you buy a normal refrigerator and it breaks, you can say: hey, manufacturer, you're responsible for that. Or if you buy a coffee maker and it burns down your house because of a defective unit, you get out safely, you recover damages from the company, and your insurance company takes care of it. There's a whole ethical framework built up around this, and there's a legal structure there as well. Obviously self-driving cars are going to be safer; they're going to save lives, and that's a very important thing. We want to save lives, we want the roads to be better. But the number of lives saved is not the only term in our ethical calculus. We also have to look at what happens when people get injured: how are they taken care of, how are they able to pay their medical bills, get back to work, or miss work while they're recovering and still be able to pay rent and afford food?

So the question is: what does this mean for us as developers? Does this give us a free pass? We're not liable for IoT devices, so that means we can ship, right? We can do whatever, we can innovate, let's just innovate the hell out of everything until something breaks, and there's a precedent. Is that really the legacy that we want to leave behind? Do we want to leave behind the legacy that we did it because we could, and we didn't give a damn about who we hurt? Some companies are doing this; some companies are actually still working in a space where they just want to innovate, they just want to build things and ship things, and they'll deal with the consequences later. But you have to ask yourself: do I want to be responsible for that? And that's what ethics is all about.

Now, I said that the JavaScript community had one of the most fascinating ethical incidents in technology in a while, and that's the left-pad incident. (I don't know why the slide is pink, but whatever.) When Ashley talked about left-pad last night, it was really fascinating, because she focused on a lot of the backlash. Like she said, the internet blew up when left-pad happened, and people were angry about a lot of things. People were angry about the way the JavaScript community has developed a small-module ecosystem, and maybe it's wrong or maybe it's right, and there was a lot of argument back and forth, and friendships were lost or damaged in this incident. What people didn't realize was that the reason for all of this anger and acrimony was that the left-pad incident actually exposed a true ethical dilemma, and we just didn't see the forest for the trees in the moment. Left-pad had two competing ethical positions. The first is the hacker-culture ethic: that openness is the most important thing, that openness is a virtue, and that the ability to control your code is tantamount to being a hacker, to being an open-source developer. Sure, other people can fork it, but you can choose where you're going to put it.
And so the left-pad developer pulled all of his modules and broke a bunch of stuff on the internet. npm had a competing ethical framework: they have a responsibility to the people that use their product, they have a responsibility as engineers, and they also value openness; they're an open-source community. So this was a very hard decision, and that's why there were so many heads being butted over the decision that was made. But could you imagine if this hadn't happened in 2016, when it mostly affected the web, but in, I don't know, 2018 or 2020, when npm is running on people's cars, on people's refrigerators? Somebody pulls down a module, and all of a sudden you're driving down the highway at 70 miles an hour and your car shuts off, or something goes wrong. And if you think, oh, that could never happen, nobody would actually do live deployments on a car or on an IoT device that's running in the field: please. We're doing it in production systems right now. IoT security is a mess, we're moving at this rapid innovation pace, and of course there are going to be issues. You don't want to be the person responsible for somebody's refrigerator going out so that they lose all their food, or maybe lose important medicine that needs refrigeration. You don't want to be responsible for that. Or maybe you do; maybe you think that the virtue of openness is the more important ethic. And that is a true ethical dilemma.
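Wherever you land on that dilemma, there is a concrete engineering mitigation: a device in the field should never resolve its dependencies live against a registry. As a sketch of one way to do that with npm (the commands are real; the workflow is just one option among several):

```
# Freeze the exact dependency tree into npm-shrinkwrap.json, which is
# honored at install time, so a later publish or unpublish can't change
# what ships to the device.
npm shrinkwrap

# Better still for firmware builds: install from a local mirror or from
# vendored packages, so the public registry isn't on the critical path
# of a device that's already in somebody's kitchen.
```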
So what do we do as engineers? What are the takeaways when we talk about ethics? This is why I said I don't want to give this talk anymore but would rather empower you to give this talk: there are actionable things that we can do as engineers, as developers, to make our workplaces better, to act with more ethics.

The first one is: set expectations with your boss. Know what pressures your boss is under. If they ask you to do something that you don't feel comfortable with, you need to know: can I go to my manager and say I don't feel comfortable with this, I have concerns about this? And do you know what process will happen if you do? That's an important thing.

You also have to be prepared to say no. If somebody comes to you and says, hey, I need you to build in this method that sends tracking data, somebody's heart rate, back to our server in real time: are you comfortable doing that? Maybe you're not. But do you know how to refuse an order? Are you willing to refuse an order, to put your career in jeopardy by doing so, if it goes against something that you believe in?

You need to be able to hold frank discussions with your co-workers about what this means. I work in finance and I'm a data scientist, so we have a vast amount of data on people and a vast amount of capability to do things with that data. My team talks often about the implications of what we're doing when we record customer data, when we record information about their finances. We talk all the time: what are you not willing to do, and what are we legally obligated to do? Because in finance we have legal obligations, in terms of reporting fraud and looking for money laundering, for example. We have to talk about these things with each other, and as engineers you should be able to talk frankly with your co-workers: I don't like where this is going, or, I have concerns. How do we make sure that it doesn't go there? How do we make sure we stay on the safe side and not the dangerous side?

And the most important thing is to know your limits, to know when you're willing to walk away. Tech is really lucrative, and we have a lot of privilege in tech. Just look around the space we're in: this is a remarkable conference in a remarkable space, there are amenities and all sorts of decadence here, and not every industry is like this. So what is your limit, the point at which you would be willing to say, I can no longer in good conscience continue to do this, and go do something different? Because if you don't know what that limit is, you're not going to discover that you're past it until it's too late.

That's all I have. Thank you very much. [Applause]
