
Determinism vs Free Will

Status
Not open for further replies.

Dre89

Smash Hero
Joined
Oct 29, 2009
Messages
6,158
Location
Australia
NNID
Dre4789
GofG- Just pointing out that you said evolution developed consciousness as an argument for determinism. This doesn't suggest determinism at all, unless you're saying evolution developed determined consciousness, in which case you're begging the question.

And if you believe robots are just like us, and not merely simulations, then you must believe they have thoughts and feelings too.

The problem is defining at what level of complexity a robot possesses these things. There are animals less complex than currently existing robots that have thoughts and feelings, so it's not a measure of complexity. However, this appears to be what materialists are saying: that complexity is the only thing holding robots back from being exactly like us.

:phone:
 

Orboknown

Smash Hero
Joined
Aug 3, 2011
Messages
5,097
Location
SatShelter
Dre beat me to it.
I was going to say that robots do not feel emotion, which all living things with a consciousness do. Robots can be programmed to react in X ways to Y stimulus, but not every human emotionally reacts the same way to each stimulus.

:phone:
 

GofG

Smash Champion
Joined
Jul 6, 2005
Messages
2,001
Location
Raleigh, NC
That is because each human is programmed differently! But we are still programmed; the brain's behavior is entirely determined by the laws of physics.

You guys keep talking about 'robots not having emotions'. Robots which are programmed to walk around a room and make a map of it obviously do not have emotions. Neither do toasters or programs being worked on in the field of AI or programs which govern nuclear silos or anything. I agree with you there.

Our consciousness is not some amazing thing. Evolution would not care if we had it or didn't have it. However, being conscious happens to have advantages in the way of being able to adapt long-term plans from intelligence (rather than just executing our adaptations; in essence, consciousness is the meta-adaptation). That is why we have consciousness.

I understand all of these things. I say: "Consciousness is a purely physical process. We could replace all of the neurons in the brain with tiny computers which did the exact same thing as a neuron and the brain would be exactly the same as before, or we could replace the brain with huge clouds of gas or planets moving around or anything, and as long as the behavior of the neural net was isomorphic to the neural net of a human brain it would be conscious."

You say back: "ROBOTS HAVE NO CONSCIOUSNESS MAN WHAT ARE YOU TALKING ABOUT"

Can you read what I have to say and understand it before trying to argue against it?

e: Dre: A robot whose brain was isomorphic to our brain, except based on transistors instead of neurons, would have emotions and feelings and be conscious. Current robots are not because they lack a human-like brain.
 

Orboknown

Smash Hero
Joined
Aug 3, 2011
Messages
5,097
Location
SatShelter
So, what programming gives the robot the same ability to adapt without needing to be reprogrammed?
I understand human adaptation can be seen as the same thing, but by your analogy the circuits/chips/whatever you are replacing neurons with would have to be replaced, whereas our brain does not.

:phone:
 

GofG

Smash Champion
Joined
Jul 6, 2005
Messages
2,001
Location
Raleigh, NC
So, what programming gives the robot the same ability to adapt without needing to be reprogrammed?
What programming gives us the ability to adapt without needing to reprogram? Probably some kind of strange recursive algorithm; I doubt evolution comments its code so we are kind of at a loss for this, which is why there are entire fields dedicated to researching general AI.

I understand human adaptation can be seen as the same thing, but by your analogy the circuits/chips/whatever you are replacing neurons with would have to be replaced, whereas our brain does not.

:phone:
What does this mean?

You are saying that since the brain is already conscious, it is better? It is 'more conscious'?

A brain is a piece of hardware, made out of neurons, which is conscious due to its structure. If you take the structure of a brain and, instead, map it onto silicon transistors, it will perform exactly the same as the old brain, consciousness and all. (We have not done this because we do not have the computational power to model an entire brain using computers yet, but this isn't a hard limitation, and it will go away soon, just as we did not use to have the computational power to track an electron in real time across a room.)

Neurons are just one way to program consciousness. Javascript is another. Machine code is another. NOR gates are another. Any medium which has some basic properties could be used to run a human brain, and any human brain run on any medium would be exactly as conscious as a human brain which happened to be run on the medium of neurons.
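A minimal sketch of that "any medium" claim, in Python: the same logical behavior (XOR) computed once directly by the host language and once out of nothing but NOR gates. The function names are illustrative only and are not from the original post.

```python
def nor(a: bool, b: bool) -> bool:
    """A single NOR gate: true only when both inputs are false."""
    return not (a or b)

def xor_direct(a: bool, b: bool) -> bool:
    """XOR computed 'natively' by the host language."""
    return a != b

def xor_from_nor(a: bool, b: bool) -> bool:
    """The same XOR built purely from NOR gates (NOR is functionally complete)."""
    n1 = nor(a, b)
    n2 = nor(a, n1)
    n3 = nor(b, n1)
    xnor = nor(n2, n3)      # this 4-gate circuit yields XNOR
    return nor(xnor, xnor)  # a fifth NOR inverts it into XOR

# Both "media" behave identically on every input.
for a in (False, True):
    for b in (False, True):
        assert xor_direct(a, b) == xor_from_nor(a, b)
```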
 

Orboknown

Smash Hero
Joined
Aug 3, 2011
Messages
5,097
Location
SatShelter
I understand human adaptation can be seen as the same thing, but by your analogy the circuits/chips/whatever you are replacing neurons with would have to be replaced, whereas our brain does not.

:phone:
What I mean is that the human brain is flexible enough that it can adapt to things without needing to be replaced/updated. No AI can do this.
As for the whole carbon/silicon medium, do you have a link listing the specifics? I don't believe it would be exactly equivalent, due to what I just said.

:phone:
 

Holder of the Heel

Fiat justitia, pereat mundus
Joined
Dec 3, 2011
Messages
8,850
Location
Alabama
NNID
Roarfang
3DS FC
1332-7720-7283
Switch FC
6734-2078-8990
The 'mental' realm is not opposed to the 'physical' realm; it is a subset of it. Consciousness is not some thing which is separate from the physical processes of the mind: it is created by the neurons just like every other aspect of the mind.

You say robots will never be conscious simply because they are not conscious right now. This is a limitation of our ability to write general artificial intelligence, and is not evidence against the conceptual existence of intelligently designed consciousness.
I did not say there was a discrepancy between mental and physical. I agree with what Dre has stated; it isn't a matter of computing power, not one bit.

What? My Dolphin emulator runs SSBM just as well as my gamecube does. They operate on the same model.
I figured that is how you got your definition. Please look up the word, do not put a device that was named to imitate systems as your reference... oh wait, you'd still be wrong.

Uhm, no? Consciousness is an essential part of our decision making process. Without it, our brains would make different decisions. Saying that a human could be a 'zombie', that is, look exactly like John Smith and behave exactly like John Smith and make the same decisions as John Smith except without being 'conscious', is like saying that you could have John Smith who made the same decisions and behaviors except without having 'eyes'. Consciousness is just as real a physical process and just as important to our brains as any other thing. (Actually many times more important.)
You are mistaken about what the zombie argument is. It isn't a "You couldn't tell your friend John Smith is a zombie if he suddenly changed sometime this week!" It isn't emulating a specific person, it is emulating consciousness. You wouldn't be able to tell whether the zombie was conscious or not simply by looking at the signs. That is the point.


e: TL;DR If consciousness were some kind of 'magic' process, as opposed to being Turing complete, then I could understand thinking that it possessed 'magical' qualities, but there is absolutely nothing magical about consciousness, and therefore it is a physical process, and therefore it, as an algorithm, is compatible with any hardware including carbon-based neurons, silicon-based transistors, or any other medium.

That is because each human is programmed differently! But we are still programmed; the brain's behavior is entirely determined by the laws of physics.
You are still equating the biochemical process of evolution with us making robots, as if they were the same thing; they aren't. We humans fashion robots for a purpose, after determined goals that are to be pre-programmed in. Evolution makes things without trying to do anything set, only trying to adapt to the environment. We'll never be able to do that; we can of course let nature do its thing in a lab, but then we didn't make a robot, nor did we really make anything, evolution did.

Our consciousness is not some amazing thing. Evolution would not care if we had it or didn't have it. However, being conscious happens to have advantages in the way of being able to adapt long-term plans from intelligence (rather than just executing our adaptations; in essence, consciousness is the meta-adaptation). That is why we have consciousness.
We do not have consciousness to gain benefit from it... just like you said, evolution doesn't give two cells about whether we have it. We have it only consequently, like we have both been saying at this point.

What programming gives us the ability to adapt without needing to reprogram? Probably some kind of strange recursive algorithm; I doubt evolution comments its code so we are kind of at a loss for this, which is why there are entire fields dedicated to researching general AI.
Think about this: How does the brain adapt without being re-programmed?
Your answer: By being pre-programmed to reprogram.

Now ask yourself why we are mentioning this reprogramming nonsense. Because consciousness does not need to be programmed. So trying to have some program to program doesn't solve your problem. Not even having a program that programs, which then programs.


A brain is a piece of hardware, made out of neurons, which is conscious due to its structure. If you take the structure of a brain and, instead, map it onto silicon transistors, it will perform exactly the same as the old brain, consciousness and all. (We have not done this because we do not have the computational power to model an entire brain using computers yet, but this isn't a hard limitation, and it will go away soon, just as we did not use to have the computational power to track an electron in real time across a room.)

Neurons are just one way to program consciousness. Javascript is another. Machine code is another. NOR gates are another. Any medium which has some basic properties could be used to run a human brain, and any human brain run on any medium would be exactly as conscious as a human brain which happened to be run on the medium of neurons.
Again, we are not hardware, we are not programmed to do anything, and we do not simply emulate consciousness or do anything that merely imitates being conscious. Technology is not capable of making conscious robots, or even computers out of neurons (the reverse?). Even if you could prove it theoretically, you cannot do it; you just assume we can make that jump at the crucial moment in technology. You don't even know, so that would be two marks against your argument. Emotions and imagination, decision weighing: nothing I can think of even theoretically makes sense for a computer to be capable of doing. It isn't even a question of strength, it is the fact that they are programmed.
 

GofG

Smash Champion
Joined
Jul 6, 2005
Messages
2,001
Location
Raleigh, NC
I did not say there was a discrepancy between mental and physical. I agree with what Dre has stated; it isn't a matter of computing power, not one bit.



I figured that is how you got your definition. Please look up the word, do not put a device that was named to imitate systems as your reference... oh wait, you'd still be wrong.
Semantics.

You are mistaken about what the zombie argument is. It isn't a "You couldn't tell your friend John Smith is a zombie if he suddenly changed sometime this week!" It isn't emulating a specific person, it is emulating consciousness. You wouldn't be able to tell whether the zombie was conscious or not simply by looking at the signs. That is the point.
Why not? Consciousness is a physical aspect of the brain, so if the zombie's brain contained the same elements which made a non-zombie conscious, then the zombie would be conscious.


You are still equating the biochemical process of evolution with us making robots, as if they were the same thing; they aren't. We humans fashion robots for a purpose, after determined goals that are to be pre-programmed in. Evolution makes things without trying to do anything set, only trying to adapt to the environment. We'll never be able to do that; we can of course let nature do its thing in a lab, but then we didn't make a robot, nor did we really make anything, evolution did.
This is the distinction between adaptation-executors and utility-maximizers. We are adaptation-executors; we execute our adaptations, and that works because evolution dictated that our adaptations would increase the chances of procreation, but obviously a utility-maximizing intelligence would be preferable. A utility-maximizing intelligence will (likely) never be created by evolution, it must be designed by another intelligence (us). Such an intelligence would undoubtedly have goals which were pre-programmed in; this does not mean that it cannot be conscious. Consciousness is not something special: it is just another process of our brain which has to do with vetoing impulse decisions and long-term planning. Admittedly, the codebase for such a process is probably HUGE (and so it appears to us to have elements of randomness), but it is still a turing-complete and calculable algorithm. Do you disagree?

Think about this: How does the brain adapt without being re-programmed?
Your answer: By being pre-programmed to reprogram.

Now ask yourself why we are mentioning this reprogramming nonsense. Because consciousness does not need to be programmed.
Yes it does. At some point in the past, a mammalian brain developed some kind of adaptation which resembled consciousness and this adaptation led it to have more children. That is how evolution goes about programming things.

So trying to have some program to program doesn't solve your problem. Not even having a program that programs, which then programs.
Why not?


Again, we are not hardware, we are not programmed to do anything, and we do not simply emulate consciousness or do anything that merely imitates being conscious. Technology is not capable of making conscious robots, or even computers out of neurons (the reverse?). Even if you could prove it theoretically, you cannot do it; you just assume we can make that jump at the crucial moment in technology. You don't even know, so that would be two marks against your argument.
There is no 'crucial jump'. We could do it right now, if we dedicated almost all of the world's resources to producing top-of-the-line ATI graphics cards and then simply used them to model a human brain atom-by-atom.

Emotions and imagination, decision weighing: nothing I can think of even theoretically makes sense for a computer to be capable of doing. It isn't even a question of strength, it is the fact that they are programmed.
Why can't a computer have emotions? Emotions are a purely physical construct; they do not emerge from the neurons, they are the neurons. If it's a purely physical construct then we can reconstruct it.
 

Orboknown

Smash Hero
Joined
Aug 3, 2011
Messages
5,097
Location
SatShelter
gofg said:
Why can't a computer have emotions? Emotions are a purely physical construct; they do not emerge from the neurons, they are the neurons. If it's a purely physical construct then we can reconstruct it.
Computers could react to a situation with a programmed reaction. There is no universal reaction to a single event. Every single person reacts with a different emotional mindset. Like how the Joker finds his actions hilarious while Batman is disgusted by them.
Definition of emotion: A natural instinctive state of mind deriving from one's circumstances, mood, or relationships with others.
Any of the particular feelings that characterize such a state of mind, such as joy, anger, love, hate, horror, etc.

:phone:
 

GofG

Smash Champion
Joined
Jul 6, 2005
Messages
2,001
Location
Raleigh, NC
A computer made of silicon neurons arranged in the structure of a human brain would also base its decisions on its emotional circumstance.
 

GofG

Smash Champion
Joined
Jul 6, 2005
Messages
2,001
Location
Raleigh, NC
Do you suggest that the action of 'feeling' emotions is nonphysical? That it cannot be described in physical terms? That's dualism.

If we, with our neuron-based computational devices called brains, can feel emotions, then another life form with a silicon-based computational device can feel emotions also.
 

Holder of the Heel

Fiat justitia, pereat mundus
Joined
Dec 3, 2011
Messages
8,850
Location
Alabama
NNID
Roarfang
3DS FC
1332-7720-7283
Switch FC
6734-2078-8990
Semantics.
I'm just informing you what people mean when they say emulate. I've never heard it used with any other meaning.



Why not? Consciousness is a physical aspect of the brain, so if the zombie's brain contained the same elements which made a non-zombie conscious, then the zombie would be conscious.
You still missed the point; it was about emulating consciousness, and when people reference that, they do not speak of brain construction or activity, for that matter.

This is the distinction between adaptation-executors and utility-maximizers. We are adaptation-executors; we execute our adaptations, and that works because evolution dictated that our adaptations would increase the chances of procreation, but obviously a utility-maximizing intelligence would be preferable. A utility-maximizing intelligence will (likely) never be created by evolution, it must be designed by another intelligence (us). Such an intelligence would undoubtedly have goals which were pre-programmed in; this does not mean that it cannot be conscious. Consciousness is not something special: it is just another process of our brain which has to do with vetoing impulse decisions and long-term planning. Admittedly, the codebase for such a process is probably HUGE (and so it appears to us to have elements of randomness), but it is still a turing-complete and calculable algorithm. Do you disagree?
I do disagree; you have no idea if it is as you say it is, you are merely going on an assumption. Like I said, you can't even theoretically speak of it working, because we have no clue if we can do it. You say there is no crucial jump, yet you don't know this. Not to mention, we weren't made with intelligence, we were made to survive, even when we were just a bunch of cells grouping together and growing and expanding into more complex organisms. For the tenth time, we weren't programmed at all; there is nothing program-like about us. We do not have the power of some sort of God to make consciousness out of materials we find. We can make a machine that runs very well and can react to data, but you saying "map" our brain with silicon transistors is very easy to say, though it doesn't really make much sense; not even you have any sort of inkling as to what that means.

Yes it does. At some point in the past, a mammalian brain developed some kind of adaptation which resembled consciousness and this adaptation led it to have more children. That is how evolution goes about programming things.
A brain already has consciousness; it didn't develop it ex post facto from adaptation, and it wasn't made to procreate as you think. It is just there, because we can experience and reflect on our experiences, and thus imagine and conceptualize, and I could go on a tangent and explain my personal philosophy on how emotions emerge from these conceptions in our mind, but I digress. All a computer may ever do is what it was programmed to do, that is it. It can upload data, but it won't interpret it; it will keep it in its memory, and if programmed it could arbitrarily forget it like we do, but that would be because of not how it is but what it was designed to do. It would decide to grow attached to things if it met a set prerequisite of programmed criteria, and it wouldn't be real, nor an experience, just a change in the factors and probabilities of its actions. I could go on and on, but it would just be reiterating everything I have expressed thus far: emulation.


Why can't a computer have emotions? Emotions are a purely physical construct; they do not emerge from the neurons, they are the neurons. If it's a purely physical construct then we can reconstruct it.
Again, emotions are not real emotions if they were designed to be "taken into account", just as we wouldn't assume someone truly loved another if they were forced to try to do it; it's not a perfectly parallel example, but I hope the point comes across. I would not be conscious if I was involuntarily being controlled by something higher than myself, and fortunately, being a part of what created me, evolution, my consciousness reigns on top. A robot would unfortunately always play second fiddle to the program made by what created it, even if it was somehow pre-programmed to never magically need programming for everything.





All in all, we are generally repeating ourselves, since it isn't so much about emulation anymore that keeps this going; it's the fact that one of us believes that with machines we can make it so a robot doesn't emulate but experiences, and Orbo and I do not feel that way. Unfortunately, now that I think about it, nothing much can be said to change our minds at this point in time with our present knowledge.
 

GofG

Smash Champion
Joined
Jul 6, 2005
Messages
2,001
Location
Raleigh, NC
You have to be a dualist to believe that consciousness can't be implemented by hardware other than a human brain based on neurons.
 

Holder of the Heel

Fiat justitia, pereat mundus
Joined
Dec 3, 2011
Messages
8,850
Location
Alabama
NNID
Roarfang
3DS FC
1332-7720-7283
Switch FC
6734-2078-8990
No, that isn't necessarily the case. I don't think there is something non-physical about the brain, but that doesn't mean I have to believe that programming does the job as well.
 

GofG

Smash Champion
Joined
Jul 6, 2005
Messages
2,001
Location
Raleigh, NC
...what? I am not suggesting programming a human intelligence, I am suggesting running a copy of a human brain on a medium other than neurons. What is so special about neurons that lets consciousness emerge on them rather than transistors?
 

Holder of the Heel

Fiat justitia, pereat mundus
Joined
Dec 3, 2011
Messages
8,850
Location
Alabama
NNID
Roarfang
3DS FC
1332-7720-7283
Switch FC
6734-2078-8990
For the reasons I have been repeating ad nauseam, my last rebuttal post being another reiteration of them. The end of that post, which you seem to be only addressing, if anything, is simply an acknowledgement of the wall we have now hit over who is willing to think this or that. We can word the same things in all sorts of manners but it won't change anything, as it seems. That is all I was saying I recognize.
 

Dre89

Smash Hero
Joined
Oct 29, 2009
Messages
6,158
Location
Australia
NNID
Dre4789
GofG- Saying we could emulate the brain with transistors and it would have consciousness is begging the question.

The problem is consciousness can be perfectly simulated without actually having consciousness, so your point is hard to prove.

It's not as if you can look inside someone's head or a robot's system and see consciousness in a physical location.

Also, your point that transistors can emulate neurons and create consciousness assumes materialism, because it is only under a materialist framework that neurons are the only necessary ingredient to create consciousness.

:phone:
 

ElvenKing

Smash Apprentice
Joined
Aug 2, 2008
Messages
98
Location
Melbourne, Australia
Do you suggest that the action of 'feeling' emotions is nonphysical? That it cannot be described in physical terms? That's dualism.

If we, with our neuron-based computational devices called brains, can feel emotions, then another life form with a silicon-based computational device can feel emotions also.
That would only work if whatever is required to feel emotions were present. There are hormones and different parts of the brain involved in producing an emotional state. Simply having neurons is not really enough to conclude that there will be emotions as in humans.
 

AltF4

BRoomer
BRoomer
Joined
Dec 13, 2005
Messages
5,042
Location
2.412 – 2.462 GHz
The problem is consciousness can be perfectly simulated without actually having consciousness, so your point is hard to prove.
In a materialist world "a perfect simulation" is a synonym for "actually having" anything. And that's the whole point. If there's no physical difference, and all that exists is physical, then there is no difference.

That's not a proof of materialism. You won't ever find one, as it's not possible to disprove things like extra planes of non-physical existence. But the materialist position is absolutely internally consistent.


Also, GofG was throwing around some computer science terms like "Turing Complete" that I suspect nobody was familiar with, so I'll take the opportunity to elaborate. (As I'm a computer scientist myself.) This is what might be called "the argument from combinatorics".

So Alan Turing made many breakthroughs in the area of theoretical computer science, but one of the major ones was the discovery of what later became known as a "Turing Machine". A Turing Machine is a hypothetical machine (IE: a mathematical construct) that consists of four simple parts:

a) A length of writable memory tape of arbitrary length. (Not infinite. It's finite, but assumed to be sufficiently long for the given problem)
b) A read/write head that writes symbols to the tape above
c) A table of instructions for the read/write head
d) A store of the current state of the machine

Such a machine is capable of computation. Meaning that it can answer questions that you ask it. Trivial examples are of course mathematical questions like "What is 2+3?". But with the right program (the table in component c) you can ask it literally any question.

Alan Turing then went on to prove that this machine is what would later be called "Turing Complete", which means that it can solve any problem which is capable of being solved by computation at all. It has the greatest problem-solving ability that can be attained. And the computer in front of you right now is one of these.

So what does this mean? (The point GofG was trying to make.) It means that there is no problem that the brain can solve that a computer cannot. The computer is Turing complete; it can solve any solvable problem. Therefore we can in principle build a full brain out of any material. (Since a computer can be built from any material.)
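A toy sketch of the four parts just described, in Python; the dictionary-based instruction table and the bit-flipping example machine are an illustration of the idea, not anything AltF4 posted.

```python
def run_turing_machine(table, tape, state="start", blank="_", max_steps=10_000):
    """Run the machine described by `table` on `tape` (a list of symbols).

    `table` maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is 'R' or 'L'. The machine stops in the state 'halt' or
    when no instruction matches.
    """
    head = 0
    for _ in range(max_steps):
        if state == "halt" or (state, tape[head]) not in table:
            break
        write, move, state = table[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
        if head == len(tape):      # the "arbitrarily long" tape: extend on demand
            tape.append(blank)
    return tape, state

# Example instruction table: flip every bit, moving right; halt on a blank cell.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

tape, final_state = run_turing_machine(flip_bits, list("10110_"))
print("".join(tape), final_state)  # -> 01001__ halt
```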

Now this argument doesn't automatically preclude the existence of a non-physical mind. (Nothing ever can) But it sure makes it pointless. There is no (solvable) mental task that a brain is incapable of. There's no need to appeal to a non-physical entity since everything is provably capable of being handled right there in the brain.
 

Orboknown

Smash Hero
Joined
Aug 3, 2011
Messages
5,097
Location
SatShelter
It may be able to solve any problem, but can it interact with and feel for another person? Regardless of its mathematical capabilities a brain can process emotion, and act upon it accordingly.
Thanks for the info though alt.

:phone:
 

Theftz22

Smash Lord
Joined
Mar 21, 2008
Messages
1,030
Location
Hopewell, NJ
Now this argument doesn't automatically preclude the existence of a non-physical mind. (Nothing ever can) But it sure makes it pointless. There is no (solvable) mental task that a brain is incapable of. There's no need to appeal to a non-physical entity since everything is provably capable of being handled right there in the brain.
Who's positing the existence of a non-physical mind in order to solve equations?
 

AltF4

BRoomer
BRoomer
Joined
Dec 13, 2005
Messages
5,042
Location
2.412 – 2.462 GHz
Orboknown:

That's a good question. A "problem" in this sense is any decision problem. In short, it's a question with a yes or no response. (Note that you can boil essentially any question down to a decision problem by rephrasing it.)

So let's take the example of empathy, a human's ability to put oneself in another's shoes and "feel what they're feeling". (Animals have it too; it's not uniquely human.) We can view this ability as a series of decision problems. The brain stores a great deal of past experience from your childhood and upbringing, and then applies these experiences to the present.

So you might see a person being assaulted, for instance, and empathically feel an emotion. How does your brain decide what emotion to feel? It's a straightforward question: "Should I feel sad at this image?" Your brain has all the past data needed to come to a conclusion, and it has all the necessary computational ability to do the processing.
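A toy sketch of that framing, in Python: the empathic reaction phrased as a single yes/no decision computed from stored past experience. The scene names, scores, and threshold are made up purely for illustration.

```python
# Hypothetical "past experience": how distressing similar scenes were, 0.0 - 1.0.
past_experience = {
    "someone being assaulted": 0.9,
    "someone stubbing a toe": 0.3,
    "someone winning a prize": 0.0,
}

def should_i_feel_sad(scene: str, threshold: float = 0.5) -> bool:
    """The decision problem: answer yes or no from remembered data."""
    return past_experience.get(scene, 0.0) >= threshold

print(should_i_feel_sad("someone being assaulted"))  # True
print(should_i_feel_sad("someone winning a prize"))  # False
```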
 

Orboknown

Smash Hero
Joined
Aug 3, 2011
Messages
5,097
Location
SatShelter
Humans don't function on yes and no answers though. If we did, humanity might never have gotten to this point. What would have motivated us to go beyond slaying animals and living in caves if everything boiled down to "yes" or "no"?

:phone:
 

AltF4

BRoomer
BRoomer
Joined
Dec 13, 2005
Messages
5,042
Location
2.412 – 2.462 GHz
Who's positing the existence of a non-physical mind in order to solve equations?
Well, now, I'm just clarifying and expanding on the point that GofG was trying to make.

But don't try to pigeonhole this into a narrow matter of "solving equations". What's so important is that it's NOT just about crunching numbers. Any mental process applies. There is no mental process a computer (and a brain) is incapable of. So that rather diminishes the need for a non-physical mind, since there are many people who say precisely the kind of thing Orboknown just said: people who are under the impression that the brain is incapable of performing certain mental tasks, and that therefore a non-physical mind must exist to do them.
 

Orboknown

Smash Hero
Joined
Aug 3, 2011
Messages
5,097
Location
SatShelter
I never said the brain was incapable of performing all mental tasks. The very adjective mental explains that it is the mind doing it.

:phone:
 

Holder of the Heel

Fiat justitia, pereat mundus
Joined
Dec 3, 2011
Messages
8,850
Location
Alabama
NNID
Roarfang
3DS FC
1332-7720-7283
Switch FC
6734-2078-8990
But we wouldn't ask ourselves if we should be sad about it; you either are or you aren't. Just like how you can't control your emotions at will: you can't infuriate yourself arbitrarily. You can think of things that anger you, but it wouldn't be your consciousness willing it, and it wouldn't work as strongly or smoothly as actually coming across sense-data that enforce these feelings upon you. Unfortunately, everything is not reduced to questions, and I don't know if this is what Dre. would call a metaphysical difference or something, but regardless, that doesn't imply a non-physical mind; it simply means there is more to it than computing.
 

AltF4

BRoomer
BRoomer
Joined
Dec 13, 2005
Messages
5,042
Location
2.412 – 2.462 GHz
It's all automatically done by your brain. Orboknown asked how something like empathy can be accomplished under the framework of decision problems, and I explained how. Any mental or intellectual task can be re-phrased as a decision problem. And a computer (and a brain) is capable of solving any solvable decision problem. Thus, there are no missing gaps in brain ability.

You don't do it "manually" as it were. There is a subconscious portion of your brain that does tons of really complex computational tasks "automatically" without you even knowing it.

Consider the blind spot in your vision. You might not have even known about it, but there is a fairly large area of your vision, slightly to the sides of the center, that all humans are blind in. You cannot see anything there. So why don't you see large black spots in your vision? Because your brain fills the empty spots in. It fills them with whatever it processes should be there. Of course, sometimes it's wrong. (As in the example on the Wikipedia page.)

Or consider facial recognition. A very computationally difficult processing task. Our computer algorithms are still only getting the hang of it. But humans are remarkably good at immediately finding faces. (Though we also have a remarkably high false positive rate.) Detecting faces is something that your brain just does automatically. (The ability to quickly detect faces has a clear evolutionary advantage. Being able to see a tiger's face in 1 second as opposed to 2 seconds can mean life or death. But a false positive doesn't kill you. So that's what our brains tend toward.)
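For comparison, a minimal sketch of the machine side of face detection using OpenCV's bundled Haar cascade. OpenCV is a real library, but the image path is a placeholder and the snippet is an illustration added here, not part of the original thread.

```python
import cv2

# Load the stock frontal-face Haar cascade shipped with opencv-python.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("photo.jpg")                 # placeholder input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)  # the detector works on grayscale

# Returns (x, y, w, h) boxes; like the brain, it trades accuracy for speed
# and produces its share of false positives.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Detected {len(faces)} face(s)")
```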
 

Orboknown

Smash Hero
Joined
Aug 3, 2011
Messages
5,097
Location
SatShelter
Alt, you keep referring to how a computer can manually sort through yes/no decisions and recognize images/reactions already in the brain's memory/database. Humans, however, do not simplify everything into a yes/no decision. "What do I want to wear?" is not a yes/no decision.
The ever-present question of Why that we try to answer is not a yes/no question.

:phone:
 

AltF4

BRoomer
BRoomer
Joined
Dec 13, 2005
Messages
5,042
Location
2.412 – 2.462 GHz
Decision problems (IE: with a yes/no answer) are equivalent to function problems (IE: a question with many answers). See here. I'm sorry for bombarding you with high-level theoretical computer science material, but that's just how it is. So don't get so hung up on this thing about yes/no questions. It's really referring to ANY (solvable) question.
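A small sketch, in Python, of how a question with many possible answers can be recovered from nothing but yes/no questions; the hidden-number game and function names are made up for illustration.

```python
def is_at_most(secret: int, m: int) -> bool:
    """One decision problem: 'Is the answer at most m?'"""
    return secret <= m

def find_secret(secret: int, lo: int = 0, hi: int = 1000) -> int:
    """Answer the function problem by binary search over yes/no queries."""
    while lo < hi:
        mid = (lo + hi) // 2
        if is_at_most(secret, mid):
            hi = mid
        else:
            lo = mid + 1
    return lo

print(find_secret(742))  # -> 742, recovered entirely from yes/no answers
```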
 

Orboknown

Smash Hero
Joined
Aug 3, 2011
Messages
5,097
Location
SatShelter
I get that you can simplify it. But does our brain go through "am I feeling this emotion / to what degree do I feel it?" every time we feel an emotion? And if so, that still doesn't explain how a computer could actually feel that emotion instead of merely figuring out what emotion it should feel.

:phone:
 

GofG

Smash Champion
Joined
Jul 6, 2005
Messages
2,001
Location
Raleigh, NC
You are splitting hairs on small inconsistencies in the analogies we use to get you to understand, and claiming that these inconsistencies are points against the argument which you clearly have yet to understand.
 

GofG

Smash Champion
Joined
Jul 6, 2005
Messages
2,001
Location
Raleigh, NC
How do we feel an emotion? How does it happen on the physical level? My point is, you don't have to know how it happens. If you know for a fact that it does happen, then you know that it happens on a physical level, because you aren't a dualist. If it happens physically on one medium (neurons) then it can happen on any medium capable of performing similar calculations. So to be arguing your point, you must either say that neurons are special kinds of computational devices which cannot be emulated, or that emotions happen in a nonphysical plane.
 

Orboknown

Smash Hero
Joined
Aug 3, 2011
Messages
5,097
Location
SatShelter
Splitting hairs can be the downfall of many a great thing.
I think Holder summarized it best when he said
All in all, we are generally repeating ourselves, since it isn't so much about emulation anymore that keeps this going; it's the fact that one of us believes that with machines we can make it so a robot doesn't emulate but experiences, and Orbo and I do not feel that way. Unfortunately, now that I think about it, nothing much can be said to change our minds at this point in time with our present knowledge.
And GofG, did it ever occur to you that maybe I do understand, but don't agree? Or that by continuing this there is a possibility for me to learn more?

:phone:
 

Theftz22

Smash Lord
Joined
Mar 21, 2008
Messages
1,030
Location
Hopewell, NJ
Well, now, I'm just clarifying and expanding on the point that GofG was trying to make.

But don't try to pigeonhole this into a narrow matter of "solving equations". What's so important is that it's NOT just about crunching numbers. Any mental process applies. There is no mental process a computer (and a brain) is incapable of. So that rather diminishes the need for a non-physical mind, since there are many people who say precisely the kind of thing Orboknown just said: people who are under the impression that the brain is incapable of performing certain mental tasks, and that therefore a non-physical mind must exist to do them.
But again, who's positing the existence of a non-physical mind in order to answer decision problems? We posit the immaterial mind because it has properties, stemming from the raw, qualitative feel of mental states (that there is something it feels like to experience them) and their reference to a content, that physical things do not share, even if there is no decision problem that the mind can solve that the brain cannot.
 

AltF4

BRoomer
BRoomer
Joined
Dec 13, 2005
Messages
5,042
Location
2.412 – 2.462 GHz
But again, who's positing the existence of a non-physical mind in order to answer decision problems? We posit the immaterial mind because it has properties, stemming from the raw, qualitative feel of mental states (that there is something it feels like to experience them) and their reference to a content, that physical things do not share, even if there is no decision problem that the mind can solve that the brain cannot.
No, Underdogs, that's why you believe in it. Don't try to insinuate that everyone else does too. If this argument doesn't apply to you, then just ignore it. Because it sure does apply to others. Such as in this very thread:

It may be able to solve any problem, but can it interact with and feel for another person? Regardless of its mathematical capabilities a brain can process emotion, and act upon it accordingly.
Clearly claiming that there are some intellectual tasks that a brain can do but a computer cannot. I aimed to show that this is false, and also to go a step further and show that there is nothing that the mind does that the brain is incapable of.
 

Dre89

Smash Hero
Joined
Oct 29, 2009
Messages
6,158
Location
Australia
NNID
Dre4789
In a materialist world "a perfect simulation" is a synonym for "actually having" anything. And that's the whole point. If there's no physical difference, and all that exists is physical, then there is no difference.

That's not a proof of materialism. You won't ever find one, as it's not possible to disprove things like extra planes of non-physical existence. But the materialist position is absolutely internally consistent.
The point was that something could imitate the activity of a conscious being without being conscious itself, without having thoughts, feelings, etc.

That means that consciousness is an unnecessary existence, in that humans could function the way they do now without it. That has problematic implications for materialism.
 