Hypothetical: if you modify the software in a Tesla, and your modification allows a hacker to remote control it, and this power is used to commit vehicular homicide, how much of the burden of responsibility do you have?
I’m not a lawyer, so while I can throw out phrases like “endangering others through negligence”, I don’t know what the threshold is or even if that’s the right concept in this case.
With regards to game consoles, I tend to agree with you — no good reason for them to be allowed to prevent side-loading.
With regards to phones however, I explicitly want the devices heavily locked down by law, to the extent that other people should not be allowed to put random software on them. This is because they are powerful sensor packages that almost everyone carries almost all the time, so the mere possibility of malware turning them into blackmail spy boxes is a problem for all of society — everyone, given what I’ve heard about “three felonies per day” [0] — and not just the person who installs the dodgy app.
I say this despite not liking that American cultural hegemony in the app stores means those stores have a bigger problem with nipples than with violence.
And likewise, I have a problem with the law that requires me (a British citizen working in Germany) to keep the U.S. government informed about my app’s use of encryption in case I “export” something they can’t break, which is insane in my opinion.
Security is important and app stores help, but trust can be chained and delegated, and it is possible to have alternative App Stores which Apple/Google audit for security.
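Chained trust of that kind can be sketched in a few lines. Here's a toy Python model (a sketch only: HMACs stand in for real asymmetric signatures, and all the key and function names are hypothetical) of a platform vouching for an alternative store, which in turn vouches for an app:

```python
import hmac
import hashlib

def sign(secret: bytes, message: bytes) -> bytes:
    """Toy 'signature': HMAC stands in for a real asymmetric signature."""
    return hmac.new(secret, message, hashlib.sha256).digest()

def verify(secret: bytes, message: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(secret, message), sig)

# Root of trust: the platform vendor's key (hypothetical).
platform_key = b"platform-root-key"

# The platform delegates: it signs the alternative store's key,
# vouching that the store's review process has been audited.
store_key = b"alt-store-key"
delegation = sign(platform_key, store_key)

# The store signs an app it has reviewed.
app = b"some-app-binary"
app_sig = sign(store_key, app)

def device_accepts(platform_key, store_key, delegation, app, app_sig):
    # A device that trusts only the platform key verifies the chain:
    # first the delegation, then the app signature.
    return (verify(platform_key, store_key, delegation)
            and verify(store_key, app, app_sig))

print(device_accepts(platform_key, store_key, delegation, app, app_sig))
```

In a real system the device would hold only public keys and the chain would use certificates (as in X.509 chain validation), but the shape of the delegation is the same.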
> Hypothetical: if you modify the software in a Tesla, and your modification allows a hacker to remote control it, and this power is used to commit vehicular homicide, how much of the burden of responsibility do you have?
If you were negligent, I'd say you were responsible, just like if you hit someone because you took off the brakes and couldn't stop.
Although... a question in response: Let's say in ten years, Tesla stops providing security updates for their old cars. If you continue driving the car anyway, and it gets hacked and kills someone, who is responsible? Which is to say, I'm not super into this whole internet-connected-car thing. ;)
> With regards to phones however, I explicitly want the devices heavily locked down by law, to the extent that other people should not be allowed to put random software on them. This is because they are powerful sensor packages that almost everyone carries almost all the time.
What if I decide to carry around a Raspberry Pi, or wear a wire? I understand the concern, but I don't think controlling what everyone else uses is a reasonable solution in a free society.
> Which is to say, I'm not super into this whole internet-connected-car thing. ;)
Likewise, similar reasons.
> What if I decide to carry around a Raspberry Pi, or wear a wire?
Smartphone-software-based attacks cast a wide net over unknowing victims; wearing a wire (or knowingly running a sousveillance app, which I’m all in favour of) targets only those you choose to target.
For the same reason, I regard state-mandated encryption backdoors as a much bigger privacy risk than the state giving the police authority to target specific suspects with wiretaps, or even authority to replace all the target’s lightbulbs with ones that have built-in webcams — dragnet versus spear-phishing.
> Hypothetical: if you modify the software in a Tesla, and your modification allows a hacker to remote control it, and this power is used to commit vehicular homicide, how much of the burden of responsibility do you have?
Hypothetically: a friend borrows your car, but you didn't fix the brake fluid leak, and he runs over someone.
Who's at fault?
> With regards to phones however, I explicitly want the devices heavily locked down by law, to the extent that other people should not be allowed to put random software on them
That's the definition of a phone.
IMO Apple is to blame for creating the "smart" phones that can run arbitrary software.
They have created the very problem that they are allegedly fixing by imposing a walled garden on their users.
It is like handing guns to kids and then locking them up in their rooms so they are safe.
> Hypothetical: if you modify the software in a Tesla, and your modification allows a hacker to remote control it, and this power is used to commit vehicular homicide, how much of the burden of responsibility do you have?
If you modified your keyless-start car to not need a nearby key, and left it unlocked in a parking lot, and someone got into it, started it, and ran over several pedestrians, would you be at fault?
Interesting, I had to think about this for a moment! It's not entirely intuitive, but I think the answer is yes, and only because there's no one else to hold liable.
What if the unlocked car was in your driveway, and the person who got in was your seven-year-old daughter? Now who's responsible, and why? The child is a minor, so legally she can't be at fault, but the parents should have taken appropriate precautions, like locking the car doors.
Likewise, if a bunch of people are killed by a remote driver in North Korea, you can't arrest the hacker. And if there's no one to arrest, our society has no way to discourage the crime. So responsibility moves up the chain.
[0] https://kottke.org/13/06/you-commit-three-felonies-a-day