Jailbreaking automated equipment introduces a ton of risk. I’m generally a supporter of being able to do whatever you want with things you own, but when the tinkering involves a heavy object on wheels where a glitch can kill people, I’d be fine with some amount of regulation here, probably from NHTSA. We really don’t want some random person running a 1-click tool to hack their car and install buggy self-driving software that may or may not still have the safety overrides working. Imagine someone thinks they’re installing Tesla’s FSD beta, but what they’re really installing is something infected with spyware that corrupts the safety overrides.
I don’t know much about how driver takeover and brake-pedal takeover work on Teslas, so maybe these fears are unfounded. Just my 2 cents.