Your figures (though I doubt them) don't account for the skew between human-driven and driverless vehicles. Of course driverless cars are involved in 92% fewer accidents with animals: driverless cars account for less than 1% of the entire vehicle population.
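The fleet-share point above is an arithmetic one: raw accident counts from a tiny fleet can't be compared directly with counts from the whole vehicle population; only per-vehicle (or per-mile) rates support a "less likely" claim. A minimal sketch, with entirely made-up fleet sizes and accident counts chosen for illustration:

```python
# Hypothetical illustration of the fleet-share skew.
# All numbers below are invented for demonstration only.

human_fleet = 990_000       # assumed human-driven vehicles
driverless_fleet = 10_000   # assumed driverless vehicles (~1% of total)

human_accidents = 9_900     # assumed animal-strike counts (raw)
driverless_accidents = 8

# Normalizing by fleet size gives a comparable per-vehicle rate.
human_rate = human_accidents / human_fleet            # 0.01
driverless_rate = driverless_accidents / driverless_fleet  # 0.0008

reduction = 1 - driverless_rate / human_rate
print(f"per-vehicle reduction: {reduction:.0%}")
```

With these invented inputs the normalized comparison happens to land on a 92% reduction, but the point is that the headline percentage is meaningless unless it was computed from rates like these rather than from raw counts dominated by the much larger human-driven fleet.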
AI doesn't drink, get distracted, or smoke meth like the ml mods.
Here, you dropped this:
Your figures (though I doubt them) don't account for the skew between human-driven and driverless vehicles. Of course driverless cars are involved in 92% fewer accidents with animals: driverless cars account for less than 1% of the entire vehicle population.
But it does randomly hallucinate.
The AI in cars is not the same as the AI in LLMs; it isn't programmed to guess its way to a conclusion. That said, it's still far from perfect and shouldn't be on the road yet.
Teslas make some pretty crazy assumptions (hallucinations). Ever see the one where it detects pedestrians in a cemetery? Or how about the accidents where they veer off the road because the lane lines were missing?
Removed by mod
Removed by mod
Removed by mod
Removed by mod
Removed by mod
The vote question would be better suited for the admin team rather than the mods, as we only moderate this community.
Too many people are being nasty in this thread, so we've removed the negative comments under rule 3.
Removed by mod
Removed by mod