@ItsjustJules I didn't deactivate. I plan not to, though my usage does tend to wax and wane. I got on Twitter in 2009, and I'm sure there are many months since then when I didn't log in once. If it went away, I wouldn't miss it.
@PeaceMob @ItsjustJules My concern would be from a security standpoint. As they cut staff in that area, expect a black hat to copy the Twitter database of users.
Also, for those who think deleting works, I have a git repo that wants to talk to you. 😅
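The git point in a nutshell: "deleting" a file only removes it from the current snapshot, while every prior commit still carries the full content. A minimal sketch (the repo path and file name are just illustrative):

```shell
set -e
rm -rf /tmp/demo-repo
git init -q /tmp/demo-repo
cd /tmp/demo-repo
git config user.email demo@example.com
git config user.name demo

# Commit a file, then "delete" it in a follow-up commit.
echo "my deleted tweet" > tweet.txt
git add tweet.txt
git commit -qm "add tweet"
git rm -q tweet.txt
git commit -qm "delete tweet"

# Gone from the working tree, but any earlier commit still has it:
git show HEAD~1:tweet.txt   # prints "my deleted tweet"
```

The same logic applies to any system that keeps history or backups: a delete in the latest view doesn't purge the copies upstream.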
@feloneouscat @ItsjustJules I'm not too worried about it from a security standpoint. I understand the risk (I am an infosec pro, actually, so this is my area of expertise). I'm sure most of the controls that protect confidentiality of data are technical, i.e. they don't require people to "do" things (applying encryption, for instance), but they may end up light on the threat hunting/response side of the house, and could be slower to respond to potentially malicious events.
@feloneouscat @ItsjustJules Despite how reckless the current owner may seem, I doubt he's going to expose himself to legal jeopardy over the privacy of consumer data. He knows how expensive that could be.
@PeaceMob @ItsjustJules Tesla Autopilot has killed 11 people.
This isn’t even killing people.
You are personally shielded by it being a corporation.
The WORST that could happen to Twitter is it shutters a little earlier.
@feloneouscat @PeaceMob @ItsjustJules 11 people killed while autopilot was on is quite different than autopilot killing 11 people.
Autopilot saves lives all the time; you just can't document an accident prevented as easily, but the overall numbers back that up.
@TomHarriss @PeaceMob @ItsjustJules The point was exposure to legal jeopardy. Tesla should never have put untested software on the road. I never signed an agreement to participate.
Sadly, Tesla will never be held responsible.
@feloneouscat @PeaceMob @ItsjustJules Sorry, that point of view is just insupportable.
The car is still driven by a licensed driver who is legally and directly responsible. The software is tested extensively.
@TomHarriss @feloneouscat @ItsjustJules it has always been, and I suppose always will be, true that it's difficult to hold to account a company that produces a very deadly product that depends mainly on the consumer to use it in a non-deadly way. In that regard, software is no different than any other "design" that becomes a part of a product. Humans write them, and humans make mistakes.
@PeaceMob @feloneouscat @ItsjustJules to be fair, if you swallow an egg whole and choke, is that the chicken's fault for producing a deadly product?
I don't disagree that companies often cut ethical and safety (and environmental) corners and consumers suffer, and I believe we need strong regulation and oversight from government to combat that.
I just don't see that as being the core issue with Tesla autopilot, since the numbers clearly show it is reducing accidents and saving lives.
@TomHarriss @PeaceMob @ItsjustJules People argue the same point with guns: they save lives.
My point was that companies, including Tesla (see the latest 300,000-vehicle recall), are shipping vehicles with undertested software and calling it golden.
Worse, they spend an inordinate amount of time "proving" that usage of Autopilot is always the driver's fault.
Autopilot needs to be removed from Tesla vehicles until we have a commission that evaluates and tests said auto-drive software.
@feloneouscat @TomHarriss @PeaceMob @ItsjustJules
Tesla recall, push undertested software update to fix windshield wiper calibration, next day the road is full of Christines but without the cool Plymouth Fury vibe.
@Cosmichomicide @feloneouscat @PeaceMob @ItsjustJules There are more examples of this process working well and making people safer than it failing and causing harm in the case of Tesla.
So I side with safer. I side with lives saved and accidents prevented and problems fixed quickly.
I side with actively working towards a future that is better using a present that is also better.
@Cosmichomicide @TomHarriss @feloneouscat @ItsjustJules We want a Utopian solution here, but it's probably not realistic. We are just not the risk-taking society we used to be when the world was actually a lot more dangerous.
@Cosmichomicide @TomHarriss @feloneouscat @ItsjustJules Right. Eventually, a happy-medium is reached where most parties involved agree "enough" caution has been put into the design. i.e. trigger safety locks on firearms.
@Cosmichomicide @PeaceMob @feloneouscat @ItsjustJules In the case of Tesla Autopilot, it will disable features when it detects issues with cameras, or bad weather, or driver inattention (both not holding the wheel and not watching the road... it uses a camera to track the driver's attention).
@TomHarriss @PeaceMob @feloneouscat @ItsjustJules Fair enough, but I'd add residential areas, city surface streets, and school/hospital/construction zones, etc., where unpredictable hazards like people are more likely.
Then again, I'd advocate for cellphones that turned off once a car starts moving, but most of Cosmic Jr's friends would not be able to navigate to the mini-mart without them. 😉
There's a balance, we just aren't there yet.
@PeaceMob @TomHarriss @feloneouscat @ItsjustJules I'm not even arguing for perfection, 'cause I know better where people are involved. But if we have the ability to deactivate "helper" tech in less-than-optimal situations, we should use it. We also need to accept the fact that people are lazy (idiots who use cruise control in the rain, I'm looking at you) and, to a degree, the tech has to account for that with more than a "bad idea, don't do it" warning.