Hatrack River Forum
Author Topic: Feeling algorithm
Raymond Arnold
Member
Member # 11712

http://www.itnews.com.au/News/232971,researcher-builds-machines-that-daydream.aspx

Some guy says he built a machine that can daydream and free-associate, or some such. The article doesn't go into a whole lot of detail, so I don't know whether this actually means anything significant. Does anyone know what this means, or whether it means anything at all?

Posts: 4136 | Registered: Aug 2008
The White Whale
Member
Member # 6594

Meh.

It looks to me like the 'emotions' this machine is 'feeling' are just precisely defined algorithms. This machine isn't feeling anything. It's just simulating a (fairly crude) approximation of the emotions that we humans can feel.

I'm all for better Netflix recommendations. But I don't see this as groundbreaking, or particularly exciting, really.

Posts: 1711 | Registered: Jun 2004
Tresopax
Member
Member # 1063

White Whale is right, I think. It seems like this article has misinterpreted what this machine is intended to do. It doesn't dream and feel. It simulates dreaming and feeling.
Posts: 8120 | Registered: Jul 2000
MattP
Member
Member # 10495

quote:
It doesn't dream and feel. It simulates dreaming and feeling.
When we talk about simulation in other areas, such as flight simulation or virtual material stress testing, we are often making a distinction between a physical process which produces a given outcome and a computational analysis of that physical process which predicts the outcome. Given this, if the phenomenon being simulated *is* a computational analysis, then what is the distinction between the phenomenon and a simulation of the phenomenon, provided both produce the same outcome for the same inputs?
Posts: 3275 | Registered: May 2007
Tresopax
Member
Member # 1063

Dreaming and feeling are not a computational analysis though. They are experiences. If I make a program that responds to "I hate you" by outputting "I am sad and angry", that doesn't imply the computer actually felt anything.
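For instance, here's a minimal sketch of the kind of program I mean (everything in it is hypothetical, just for illustration) - a pure lookup table with no internal state at all:

```python
# A hypothetical stimulus-response program: it reports an emotion
# without representing, let alone experiencing, any internal state.
RESPONSES = {
    "I hate you": "I am sad and angry",
    "I love you": "I am happy",
}

def respond(stimulus: str) -> str:
    # A plain dictionary lookup; nothing here could plausibly "feel".
    return RESPONSES.get(stimulus, "I do not understand")

print(respond("I hate you"))  # -> I am sad and angry
```

Whatever it prints, there is nothing inside it that the output is "about".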
Posts: 8120 | Registered: Jul 2000
MattP
Member
Member # 10495

quote:
Dreaming and feeling are not a computational analysis though. They are experiences.
Many people would argue that experiences *are* computational analysis. Very complex analysis, but analysis nonetheless. It seems like this may be a philosophical argument.

quote:
If I make a program that responds to "I hate you" by outputting "I am sad and angry", that doesn't imply the computer actually felt anything.
And yet, given certain preconditions (Person B cares what Person A thinks about them, etc.), we can predict a response roughly equivalent to "I am sad and angry" when a person is provided with the same stimulus. Again, is this a matter of degree, or are you saying that phenomena produced by a brain cannot, in principle, be replicated by an entity without a biological brain?
Posts: 3275 | Registered: May 2007
The White Whale
Member
Member # 6594

quote:
Originally posted by MattP:
quote:
It doesn't dream and feel. It simulates dreaming and feeling.
When we talk about simulation in other areas, such as flight simulation or virtual material stress testing, we are often making a distinction between a physical process which produces a given outcome and a computational analysis of that physical process which predicts the outcome. Given this, if the phenomenon being simulated *is* a computational analysis, then what is the distinction between the phenomenon and a simulation of the phenomenon, provided both produce the same outcome for the same inputs?
The simulation can only simulate what is known of the system. It can only be based on previous algorithms, or parametrization, or statistical models. So if this computer said "I felt sad for the bird," it didn't actually feel sad. It ran through a bunch of code and determined that {sad} best matched the given conditions of {the story}, and therefore output "I felt {sad} for {the bird}."

There can be no surprises, no unexpected outcomes. The article states that they "disallowed mutually exclusive states - like joy and sadness - from being experienced simultaneously." And, from my own experience, I know this is inaccurate. This computer cannot do what I know that my brain can do, and as a result, I have severe doubts that the output of this machine can be called "feeling."
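To make concrete what I mean by "ran through a bunch of code": here's a hypothetical sketch of that kind of matching (not the researcher's actual code - the story, the keyword lists, and the scoring are all invented). It scores each canned emotion against the words of the story and outputs whichever scores highest:

```python
# Hypothetical sketch: pick the canned emotion whose keyword list best
# overlaps the story. All names and keyword lists are invented.
STORY = "the little bird fell from the nest and died in the cold"

EMOTION_KEYWORDS = {
    "sad": {"died", "fell", "cold", "lost"},
    "joy": {"sunny", "won", "laughed", "found"},
}

def best_match(story: str) -> str:
    words = set(story.split())
    # Score each emotion by keyword overlap; return the highest-scoring one.
    return max(EMOTION_KEYWORDS, key=lambda e: len(EMOTION_KEYWORDS[e] & words))

print(f"I felt {best_match(STORY)} for the bird.")  # -> I felt sad for the bird.
```

Note that given the same story, this can only ever output one of the labels it was handed in advance.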

I feel like you'd need something more like Mike from Heinlein's 'The Moon is a Harsh Mistress', which has a large number of digital neurons that are connected but not understood. That kind of system comes closer to my understanding of a human brain, and so when Mike had feelings, I could actually imagine that they might be actual feelings.

Edit: I don't think outputted is a word.

Posts: 1711 | Registered: Jun 2004
MattP
Member
Member # 10495

quote:
The simulation can only simulate what is known of the system.
quote:
There can be no surprises, no unexpected outcomes.
quote:
a large number of digital neurons that are connected but not understood.
Is the primary issue then one of complexity, predictability, determinism? Does it only become "true" feeling once we can no longer predict the response or determine specifically how it was generated? Can a non-biological entity achieve this or is that ruled out in principle?
Posts: 3275 | Registered: May 2007
The White Whale
Member
Member # 6594

No. From my experience with modeling, it's fairly easy to get something to give expected outputs for a given situation. With enough tweaking and adjusting, I don't see it as necessarily groundbreaking that a machine can say "I felt bad for the bird." It's fairly easy to over-fit a model to a very specific situation, especially under such limited conditions.

Maybe it's that I can't see this machine as being aware of itself, and that I don't think something that is not aware of itself can have feelings and emotions. Are emotions tied to consciousness and awareness? What is the smallest organizational structure that can feel? I don't know the answers to these questions, but I simply do not accept that this computer model is aware and actually generating emotions. I think it's emulating.

Posts: 1711 | Registered: Jun 2004
MattP
Member
Member # 10495

I'm not really talking about this particular model at this point. I'm just curious what people think about the potential of actually synthesizing feelings, and was responding to the claim that it was "only simulating feelings". Surely the first device that actually feels will be dismissed by some as only simulating feelings, and I'm wondering if there is any distinction between a really good simulation of a complex computational process and the process itself. Does this all boil down to an argument over the nature of the mind - materialism and/or determinism vs dualism/spiritualism?
Posts: 3275 | Registered: May 2007
Raymond Arnold
Member
Member # 11712

I think that feelings and emotions ARE the result of a deterministic process that will always produce the same output given the exact same stimuli. But the feelings and emotions are not the process itself, they are something else that results if you accomplish the process a certain (so far unknown) way.
Posts: 4136 | Registered: Aug 2008
Xavier
Member
Member # 405

quote:
or are you saying that phenomena produced by a brain cannot, in principle, be replicated by an entity without a biological brain?
Tresopax believes that "experience" is something that a soul is required for, not just a physical brain.
Posts: 5656 | Registered: Oct 1999
MattP
Member
Member # 10495

quote:
Tresopax believes that "experience" is something that a soul is required for, not just a physical brain.
In that case, we just need to simulate a soul. [Smile]
Posts: 3275 | Registered: May 2007
Tresopax
Member
Member # 1063

Xavier is right about my view (although I will admit that whether a "soul" is a separate entity, or a property of brains, or a property of any sufficiently complicated system is an open question).

---

Beyond that, though, the problem here is that the experiences we are talking about (feelings, dreams, etc.) are internal things. They are not equivalent to inputs and outputs, and you can't directly observe them as an outside viewer. You can't know for sure, as an outside person, exactly how or what I am feeling right now - although you can use inputs and outputs as clues to make an educated prediction.

So the real problem is that there's no way for this researcher to investigate whether or not his computer is really feeling things or really dreaming things. Maybe it is, but since those are internal things, we can't know for sure. All the researcher can look at is inputs and outputs, or speculate from the way he's set up the internal mechanics of his machine. I think the experience of dreams and feelings simply doesn't reduce to those things.

We can't observe the thing this article suggests this researcher is trying to recreate.

Posts: 8120 | Registered: Jul 2000
MattP
Member
Member # 10495

quote:
although I will admit that whether a "soul" is a separate entity, or a property of brains, or a property of any sufficiently complicated system is an open question
This is a pretty significant caveat. How do you evaluate whether an artificial entity has a soul if you are unsure if it's a separate entity vs an emergent property?
Posts: 3275 | Registered: May 2007
Raymond Arnold
Member
Member # 11712

I think if you're going to make a serious attempt to make a computer that can feel, you shouldn't be trying to simulate human emotion, because that gives you barometers that are easy to fake.
Posts: 4136 | Registered: Aug 2008
scifibum
Member
Member # 7625

I tend to think that this approach to simulation won't come close enough to how our nervous systems work to make any result truly comparable to how we think and feel.

However, I think at some point we'll be able to physically model an actual brain, and there will actually be some serious ethical implications to doing so, because we won't be able to be sure that the simulation doesn't feel exactly as we do, and enslaving it in a controlled computational environment for our own purposes may be a horrible thing to do.

There was a pretty good story in one of the short sci fi mags last year about this very thing.

Posts: 4287 | Registered: Mar 2005
Raymond Arnold
Member
Member # 11712

One of the more interesting ethical dilemmas involved in creating any kind of superintelligent AI is that, in its attempt to accurately predict the actions of real people under hypothetical conditions, it might end up producing simulations of people so complex and accurate that they count as real people. Literally billions of simulated people might live brief, suffering lives and then be killed.
Posts: 4136 | Registered: Aug 2008
Godric 2.0
Member
Member # 11443

quote:
His algorithm was based on Plutchick's Wheel of Emotions, which illustrated emotions as a colour wheel and disallowed mutually exclusive states - like joy and sadness - from being experienced simultaneously.
I'm not familiar with Plutchick's Wheel of Emotions, but I'm pretty sure people can feel joy and sadness at the same time. Hence the word "bittersweet."
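For what it's worth, the constraint the article describes could be read as something like this sketch (hypothetical - not the researcher's actual code - using Plutchik's usual opposite pairs). By construction it makes "bittersweet" impossible:

```python
# Hypothetical sketch of the "mutually exclusive states" constraint:
# activating an emotion evicts its opposite, so the two never co-occur.
OPPOSITES = {
    "joy": "sadness", "sadness": "joy",
    "trust": "disgust", "disgust": "trust",
    "fear": "anger", "anger": "fear",
    "surprise": "anticipation", "anticipation": "surprise",
}

def activate(state: set, emotion: str) -> set:
    """Add an emotion to the state, removing its opposite if present."""
    state.discard(OPPOSITES.get(emotion, ""))
    state.add(emotion)
    return state

state = activate(set(), "joy")
state = activate(state, "sadness")  # joy is evicted; no "bittersweet"
print(state)  # -> {'sadness'}
```

Which is exactly the objection: the rule is baked in, rather than discovered about how feelings actually work.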
Posts: 382 | Registered: Jan 2008
   

Copyright © 2008 Hatrack River Enterprises Inc. All rights reserved.
Reproduction in whole or in part without permission is prohibited.

