Ralyx
Versed in the lewd.
Joined: Jun 1, 2017 · Messages: 1,199 · Likes received: 7,028
To anyone tuning in from the blue, this discussion originally spawned as a bit of a derail from a BNHA story over in the NSFW forums. It's pretty decent so far, so go check it out if it strikes your fancy.
The core questions (among others) seem to be: would an A.I. need to experience an emotion to accurately understand it? Would subjecting an A.I. to priority parameters based on human emotion be a good idea?
I would contend that no, it would not need to. Since an A.I. would first have to accurately model an emotion before it could apply that emotion to itself, there is no logical reason it would gain any additional understanding from the self-application.
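To illustrate the argument in code (a minimal sketch, with an entirely hypothetical `appraise_fear` function standing in for whatever emotion model the A.I. actually has, not any real API): if the A.I. already possesses a model that maps situations to emotional appraisals, then "experiencing" the emotion just means running its own state through that same model, which produces nothing the modeling step couldn't already compute.

```python
# Hypothetical sketch: an agent that models emotions as a function of state.
# None of these names are real libraries; they only illustrate the point above.

from dataclasses import dataclass

@dataclass
class Situation:
    threat_level: float   # 0.0 (safe) to 1.0 (dire)
    novelty: float        # 0.0 (familiar) to 1.0 (unprecedented)

def appraise_fear(s: Situation) -> float:
    """Stand-in emotion model: maps any situation to a fear intensity."""
    return min(1.0, 0.7 * s.threat_level + 0.3 * s.novelty)

class Agent:
    def __init__(self, own_state: Situation):
        self.own_state = own_state

    def model_fear_of(self, s: Situation) -> float:
        # Modeling an emotion in someone else: evaluate the model on their state.
        return appraise_fear(s)

    def experience_fear(self) -> float:
        # "Experiencing" the emotion is the same model applied to own_state.
        # The function is identical, so self-application yields no information
        # the agent could not already obtain by modeling.
        return appraise_fear(self.own_state)

agent = Agent(own_state=Situation(threat_level=0.9, novelty=0.2))
someone_else = Situation(threat_level=0.9, novelty=0.2)

# Identical inputs through an identical model give identical outputs.
assert agent.experience_fear() == agent.model_fear_of(someone_else)
```

Under this framing, the only way self-application could add understanding is if the emotion model were incomplete to begin with, in which case the problem is the model, not the lack of first-person experience.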