Mm... well, I'm obligated to say I think you should turn it in to the government... but I don't know what your circumstances are, there. [He's not naive.] Is it safe while you're gone? I know they said that time shouldn't pass while we're away, but we have no proof of that.
[NOT A GOVERNMENT... he would ask about this but they're talking about it in the other thread so for now he just frowns and moves on.]
Well... I should say that I'm no commander or anything. These kinds of decisions--I'm usually not the one that makes them. I'm the one that carries them out. [Not that he can't take on leadership roles, or hasn't in the past--but he's not, like, the president.] But if it were up to me...
[Mmm...]
In stories, they always say this kind of thing should be destroyed. "Power tends to corrupt, and absolute power corrupts absolutely." But... I think I would rather teach it to help people. You could do a lot of good with something like that, if you could keep it safe.
I think that's how we feel, too. It's sentient - we've talked to it, and it just wants to help. It would be cruel to kill it. But it's naive, it's too susceptible to outside influences. I think we've decided to let it decide for itself what it wants to do, but to try and be positive outside influences.
A truly sentient AI... [He's not sure how he feels about that, honestly. It's exciting and fascinating, as a man who works so closely with such things--but dangerous, too, and a little bit frightening.] The trouble is, if it can tap into any network and gather that data, then it's only a matter of time before it [goes on 4chan] learns something it shouldn't. You'd have to have people you trust monitoring it 24/7.
You could put it like that, but isn't that something we all have to do? Develop our own internal flags and safety mechanisms? Or, to put it another way, it just needs to learn how to have a conscience.
This must be the difference between our technologies. [He could accept DAVIS being aware, but he certainly wouldn't consider him alive, even if he'd understand why people might argue otherwise.] Do you think this ship has something capable of that?
[Since so far the IRIS doesn't seem to be the same]
[WHERE'S ASIMOV WHEN YOU NEED HIM]