Well, when I said AI, I meant leaving the decision-making to it as well.
As to the outcome, I agree that a computer performing millions of computations per second will likely make fewer mistakes than a human; however, there is the psychological impact of such an engagement, especially when life and death are at stake.
It's not too different from self-driving cars. Would you let your car drive itself even if, say, some scientist showed you analysis and statistics saying it's 10x safer than human drivers?
And to be honest, self-driving trucks and cars strike me as rather insane too. Machines have no "compassion," and compassion goes beyond avoiding a collision with something; it's about avoiding taking another life, a child or even a dog, and that is what makes a human operator so much more willing to sacrifice themselves rather than "take a life." There's no way you can "program" that into a machine; in fact, there's no way you can instill it in another human being who doesn't already have it.
Look at ISIS: cold-hearted, ruthless, less than humane in all their thinking, with less than human values. Yet even cold-hearted people will take extreme measures to avoid taking a life for selfish reasons.
As I said, it's an extremely intense ethical issue, beyond the average person's ability to "see the possible consequences." That's why the world is full of folks who buy into "junk thinking," failing to fully "flesh out" the eventualities that will follow from just understanding the human motor.
We've gone far, far afield, gentlemen; end of off-topic. (Mods, feel free to move or delete this or any other post I've made that strays so far off the beaten path. I wanted to respond to something I find extremely troubling in its totality.)