how would you build pleasure into an AI? be it an android, robot, or just a program, even if we think it is alive or sentient...
how do you define pleasure in a program so that the AI can "enjoy" it?
the pleasure of enjoying a good beer, the excitement/enjoyment of watching a good movie, the mental pleasure of sex, and on and on.
would it just be an external representation of a thrill? like, you program the REACTION that WE see, and nothing really internal? I'm not sure the visible reactions we recognize really prove we have programmed pleasure into an AI
We FEEL pain - our brain tells us which finger we cut, because the nerves there signal the brain and we FEEL it as "pain". how would that work in a sentient AI? Would it cause wires to burn out and fail? our perception of pain is that it "hurts" us and we wish it to stop. What would "HURT" an AI? or "FEEL GOOD"? I say nothing
...thus, it will never be "alive". just imitation of life
I think there is only one definition of life, and that is a being that is self aware.
Humans evolved on this planet to the Earth's particular peculiarities. Who knows if sentient life elsewhere has bodies remotely like ours, or if they feel emotion, or feel pain for that matter. But if they are aware that they exist, they qualify as life.
I'm still not convinced that programs we create that tell us they are self-aware really are, rather than just designed to act like it. thus my pain/pleasure paradox: can we just create similar reactions, without real pain or pleasure?
I would like to argue that organic things like amoebas are also alive without knowing it, but that's a discussion for elsewhere I think. :)
I like it. But it is still just a number. It is HOT to us, but why would a computer care? how can we be sure that when a computer sees that high, hot number, it FEELS "pain" and isn't just executing code:
if TEMP > 80C, then action is: yelp sound / retract / feel "pain" in that location (nebulous)
Let's say the computer has to fulfill a task in a certain time / as fast as possible. High temperature (= pain) lowers the processing speed. Computer notices there is something wrong and tries to ease the "pain" (= increases fan speed).
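That throttle-and-fan loop can be sketched as a toy program. A minimal sketch only: the threshold, the speed penalty, and the function names (`effective_speed`, `control_step`) are all invented for illustration, not taken from any real thermal controller.

```python
# Toy sketch of "pain" as thermal throttling: the hotter the chip,
# the slower the work gets done, so the controller is driven to
# ease the "pain" by spinning the fan up.
PAIN_THRESHOLD_C = 80.0

def effective_speed(temp_c, base_speed=1.0):
    """Processing speed drops as temperature rises past the threshold."""
    if temp_c <= PAIN_THRESHOLD_C:
        return base_speed
    # Lose 2% of speed per degree over the threshold, floored at 10%.
    penalty = 0.02 * (temp_c - PAIN_THRESHOLD_C)
    return max(base_speed * (1.0 - penalty), 0.1 * base_speed)

def control_step(temp_c, fan_percent):
    """If the task is running slow (i.e. it 'hurts'), spin the fan up."""
    if effective_speed(temp_c) < 1.0:
        fan_percent = min(fan_percent + 10, 100)
    return fan_percent

fan = 30
for temp in (70.0, 85.0, 95.0):
    fan = control_step(temp, fan)
print(fan)  # 50: the fan ramped up twice, once per "painful" reading
```

The point of the sketch is exactly the objection raised here: the machine is driven to act, but the "pain" is nothing more than a number crossing a threshold.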
By the way: learning does indeed work pretty well with the punish/reward principle. But how do you punish/reward a learning machine?
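In machine learning there is one concrete answer: "punishment" and "reward" are just a scalar signal the learner tries to maximize. A minimal sketch of that idea, with the two actions, the reward numbers, and the learning rate all invented for illustration:

```python
import random

# Toy reward learning: an agent repeatedly picks "touch_hot" or
# "back_away". "Punishment" is nothing but a negative number;
# "reward" is a positive one.
REWARDS = {"touch_hot": -1.0, "back_away": +0.5}

values = {"touch_hot": 0.0, "back_away": 0.0}  # learned action values
alpha = 0.5  # learning rate

random.seed(0)
for _ in range(50):
    # Explore occasionally; otherwise pick the action valued most so far.
    if random.random() < 0.2:
        action = random.choice(list(values))
    else:
        action = max(values, key=values.get)
    # Nudge the estimate toward the received reward (or punishment).
    values[action] += alpha * (REWARDS[action] - values[action])

# After training, the agent "avoids" the punished action.
print(max(values, key=values.get))  # back_away
```

Which loops back to the thread's question: the agent behaves as if it avoids pain, but the "punishment" is only arithmetic on a value table.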
I get what you are saying, but I see us stopped at the "Pain" part.
If you or I accidentally drop a hammer on our foot, mashing some nerves (maybe not actually damaging anything), we register that as "pain", cry out, and grab our foot. That is a lesson learned: this action causes that pain, so we learn not to do it again.
So, we can tell a machine "this is pain" and how to react to it... say it drops a hammer on its metal foot - it could be programmed to cry out, imitating OUR pain, but it doesn't feel the searing HURT that we identify as pain. It just has a new dent that it must now deal with by repairing or replacing.
I may have just answered my own question: maybe PAIN is simply a non-existent thing in the AI world. I mean, is there any real reason for it, beyond self-preservation responses to outside actions that would normally cause pain to you or me?
Inanimate things can never "feel" pain, ergo, will never be alive or sentient.
Looked at that way, pain in organic beings is just hormone distribution, a chemical reaction telling us that something is wrong. As long as nothing is broken by your hammer (and has to be repaired or replaced), there is no real physical damage. Tissue gets warm and swells, but that is already part of the healing process.
Our AI can of course detect the contact with the hammer with the help of sensors. But you have to substitute something the machine can identify as unpleasant for the hormone distribution. Again: temperature...
And no, pain is not only caused by outside actions but also by our own failures. And learning means avoiding those failures, and the resulting pain, in the future.
still, if an AI detects a hot temperature and labels it with a number above a "pain threshold", it still doesn't FEEL any pain, it just reacts per our programming.
if a chunk of metal gets hot, it doesn't feel or know pain, it just melts.
if we get hot enough to burn, our brain tells us it feels OUCH or "hot" to us.
Isn't the AI merely reacting to preprogramming: it senses heat from a distance, so it backs away?
similar with the pleasure model: we eat sugar or have sex, it tastes or feels great, and we want more.
Is programming an AI to react the same way just bits and bytes calling out a reaction to seek more of it, with no feeling driving it?
If I tell an AI it LOVES to have sex or to make paper clips the same way, the programming would make both exactly equal.
I'm not sure I am stating my point very clearly....
we feel specific things for pain/pleasure that we could describe in great detail from our personal experience.
robots/AI "feel" nothing and only react in preprogrammed ways. We could program pain, like a hot temperature, to be something "desired", wanted, driven toward... which makes no sense for living creatures.
As we don't have to learn what pain IS, isn't it also somehow programmed in our brain? Isn't pain something we (usually) try to avoid, meanwhile pleasure is (usually) something we are looking for? When our programmed AI avoids something, isn't it possible that it avoids just pain? Is it necessary to compare this pain with ours? I am sure that it is even hard to compare your sensation of pain with mine...
agreed, but where is the line drawn between AI simply mimicking us via programming to them being "alive" and knowing pain?
I cut my finger, my brain does not just register there is a cut at this specific location... it feeds me searing pain at the damage done.
An AI will register the location, then simply mimic how we programmed it to react, PRETENDING it feels pain.
we are, after all, talking about AI becoming self-aware, and feeling and being like us... but yeah, maybe there is no need for pain/pleasure in an AI, just the knowledge that it needs to fix itself.
i think we're on the same page; I just keep wondering about the "actual pain feelings" part, which doesn't exist in a non-organic AI life form.
or pleasure that would drive it to self replicate, as is in us.
Positive/negative? I guess you could score items based on sharpness and FLIR (thermal) readings: negative points to indicate something is bad or painful, and positive points for registered items that are considered good and "happy".
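That points idea is easy to sketch. Everything here is invented for illustration: the feature names (`sharpness`, `temp_c`), the thresholds, and the point weights are placeholders, not real FLIR processing.

```python
# Toy "valence" scorer: assign negative points (pain) or positive
# points (pleasure) from two sensor readings. Thresholds and weights
# are arbitrary placeholders.
def valence(sharpness, temp_c):
    score = 0.0
    if sharpness > 0.7:      # sharp edge detected: bad
        score -= 5.0
    if temp_c > 60.0:        # FLIR says it's hot: bad
        score -= 3.0
    if 15.0 <= temp_c <= 30.0 and sharpness < 0.3:  # soft and mild: nice
        score += 2.0
    return score

print(valence(sharpness=0.9, temp_c=75.0))  # -8.0 -> "painful", avoid
print(valence(sharpness=0.1, temp_c=22.0))  #  2.0 -> "happy", approach
```

Of course, this illustrates the whole debate in miniature: the robot can rank the world by these points and act accordingly, but the points themselves carry no sensation.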
they didn't even bother to show whether the Chris Rock robot felt any pain when he flew through the fire, leaving only his face, which was still on fire when we last see it. not one Chris Rock scream.
Firstly: AI doesn't exist yet - it's all just completely normal programs pretending, labelled "AI" by marketing men.
Secondly, you're looking at pain and pleasure wrong; it's not something that will need to be programmed in - if AI really existed, those responses would develop by themselves.
It's emotional pain/pleasure rather than physical, for an AI.
p.s. the first AI won't be a goofy-looking Honda robot performing tricks like a circus seal. It'll be a server room full of racks of hardware - unless it decides to migrate itself to a "distributed" computing model for life-insurance reasons.
but they are not going to FEEL it... how do we make something PAINFUL in programming, other than saying "this now HURTS" and creating a response? we FEEL a cut finger, or burning heat melting skin... this is my crux: we can program in responses, but not actual pain. if heat melted a robot finger off, we could tell it to ACT like it's in pain, but it won't FEEL that pain