
Free Will

Status
Not open for further replies.

yossarian22

Smash Journeyman
Joined
Mar 31, 2008
Messages
204
Hmmm.... I grossly misrepresented Kantian morality in the last few points.

Anyhow, I will forgo arguing what morality is and instead focus on whether morality exists.

By changing our criterion for existence, we can prove the existence of morality but not make any definitive statements about what it is. This differs only slightly from the empirical approach which would force us to nail down a definition of morality and then prove that it exists. I don't need to do that.

We can say a subject exists if a world without the subject becomes unworkable. Without a concept of morality, we cannot make definitive statements about anything. We cannot say 'murder is bad' or 'altruism is good'. Our society could therefore not function without the concept of morality.
 

ComradeSAL

Smash Journeyman
Joined
Nov 27, 2001
Messages
223
Location
Ft. Collins, CO
Sure we could! I can say "altruism is good" just like I can say "ice cream is good." It's just that both will be asserted as reflections of my feelings and not as an assertion of some abstract form of morality (or tastiness) somewhere.

And society could still function by using ethics, which are nothing more than rules for the self-propagation of society. See my post about peeing dogs.
 

yossarian22

Smash Journeyman
Joined
Mar 31, 2008
Messages
204
Sure we could! I can say "altruism is good" just like I can say "ice cream is good." It's just that both will be asserted as reflections of my feelings and not as an assertion of some abstract form of morality (or tastiness) somewhere.

And society could still function by using ethics, which are nothing more than rules for the self-propagation of society. See my post about peeing dogs.
Morality is merely ethics applied, so by suggesting that we operate on ethics rather than morality, you are simply delaying the question. Your definition of ethics is vastly different from any I have seen, but it still falls into the same pitfall.
You can tell me that altruism is good because it raises the effectiveness of propagation (or any other category you pick; it does not matter), but why should I be altruistic? Why should I do what you feel is good?

edit: I should have been more clear. We cannot make and enforce laws if morality does not exist.
 

ComradeSAL

Smash Journeyman
Joined
Nov 27, 2001
Messages
223
Location
Ft. Collins, CO
The rules of society (be altruistic, etc.) have been pounded into you from such an early age that they are now almost hardwired into your brain. As such, I don't have to tell you to be altruistic, because you already want to be from the continued conditioning that society has put you through.
 

yossarian22

Smash Journeyman
Joined
Mar 31, 2008
Messages
204
The rules of society (be altruistic, etc.) have been pounded into you from such an early age that they are now almost hardwired into your brain. As such, I don't have to tell you to be altruistic, because you already want to be from the continued conditioning that society has put you through.
And the person who has not been hardwired with our set of moral codes? How do we justify our beliefs to him?
Also, you are not rejecting morality, you are positing an evolutionary/sociological cause for morality's existence. That does not refute my statement at all. They can be compatible.
 

ComradeSAL

Smash Journeyman
Joined
Nov 27, 2001
Messages
223
Location
Ft. Collins, CO
And the person who has not been hardwired with our set of moral codes? How do we justify our beliefs to him?
Also, you are not rejecting morality, you are positing an evolutionary/sociological cause for morality's existence. That does not refute my statement at all. They can be compatible.
A person who has not been hardwired with our set of "moral codes" (I would prefer to call it a code of ethics) cannot even truly communicate with someone inside our society, unless he implicitly signs the social contract and joins our society (at which point conditioning can begin). Fortunately, as we approach a "global village" this is becoming less of a problem.

And if you think my position asserts the existence of morality, then assuming morality cannot be equivalent to assuming free will. It is entirely possible for someone to be following my definition of "morality" while not having free will.

This is why I am calling it "ethics" - to differentiate it from the "morality" that you are pushing for in the original argument for free will.
 

yossarian22

Smash Journeyman
Joined
Mar 31, 2008
Messages
204
A person who has not been hardwired with our set of "moral codes" (I would prefer to call it a code of ethics) cannot even truly communicate with someone inside our society, unless he implicitly signs the social contract and joins our society (at which point conditioning can begin). Fortunately, as we approach a "global village" this is becoming less of a problem.
So there are people inside our society who have not been conditioned.
Yet these people still obey our laws, why?
Because they want the benefits of our society?
Why do they want the benefits of our society?
And if you think my position asserts the existence of morality, then assuming morality cannot be equivalent to assuming free will. It is entirely possible for someone to be following my definition of "morality" while not having free will.
This is why I am calling it "ethics" - to differentiate it from the "morality" that you are pushing for in the original argument for free will.
It is totally compatible.
Moral responsibility is all that is needed from free will. Your view of morality ("ethics" whatever you want to call it) still allows for moral responsibility in that somebody can still perform a 'wrong' action and be held responsible for it. If your society views murder as bad, and I murder while in your society, I am morally responsible for what I did. Your justification of morality is different. You go so far as to call it a different name, but there is still moral responsibility.
 

ComradeSAL

Smash Journeyman
Joined
Nov 27, 2001
Messages
223
Location
Ft. Collins, CO
So there are people inside our society who have not been conditioned.
Yet these people still obey our laws, why?
Because they want the benefits of our society?
Why do they want the benefits of our society?
They obey our laws because they are afraid of punishment. They don't obey our ethical code. If a person comes from a culture where constant lying is acceptable (unlikely given the way societal evolution works, but not impossible), they will lie to you any time it is legal to do so.

It is totally compatible.
Moral responsibility is all that is needed from free will. Your view of morality ("ethics" whatever you want to call it) still allows for moral responsibility in that somebody can still perform a 'wrong' action and be held responsible for it. If your society views murder as bad, and I murder while in your society, I am morally responsible for what I did. Your justification of morality is different. You go so far as to call it a different name, but there is still moral responsibility.
No, my system is no different from a person trying to program a variety of imperfect robots. Some of these robots are harder to program than others, so more work is required for the "problem" ones. Some are absolutely impossible to program at all (one of them runs on Windows ME), so the only choice is to put them somewhere where they do not harm the other robots. These robots do not have free will, and I am not holding them morally responsible.
 

yossarian22

Smash Journeyman
Joined
Mar 31, 2008
Messages
204
They obey our laws because they are afraid of punishment. They don't obey our ethical code. If a person comes from a culture where constant lying is acceptable (unlikely given the way societal evolution works, but not impossible), they will lie to you any time it is legal to do so.
And why is punishment a bad thing?
No, my system is no different from a person trying to program a variety of imperfect robots. Some of these robots are harder to program than others, so more work is required for the "problem" ones. Some are absolutely impossible to program at all, so the only choice is to put them somewhere where they do not harm the other robots. These robots do not have free will, and I am not holding them morally responsible.
You are still holding a 'robot' responsible for its actions. If robot A shoots robot B, you remove robot A from society. Robot A has been punished for its actions, even if the robot could not have acted otherwise.
 

AltF4

BRoomer
BRoomer
Joined
Dec 13, 2005
Messages
5,042
Location
2.412 – 2.462 GHz
This talk of morality is inane. It is completely beside the point.

I have already presented a couple of arguments as to why Free Will is impossible. They have yet to be addressed. They are:

1) Free Will violates causality
2) The problem of emergence

A greater elaboration on these can be found on my original Free Will thread.
 

ComradeSAL

Smash Journeyman
Joined
Nov 27, 2001
Messages
223
Location
Ft. Collins, CO
Explain further on the "why is punishment a bad thing?" line of thought. I don't really see where you're going with it. Punishment is an undesirable thing by definition.

And, sure, robot A has been punished. Or not, I don't really care. They're just robots. The point is that these robots don't have free will.
 

yossarian22

Smash Journeyman
Joined
Mar 31, 2008
Messages
204
This talk of morality is inane. It is completely beside the point.
It is entirely the point.
I am suggesting that the entire 'causality vs free will' debate has been miscast.
1) Free Will violates causality
Addressed. Moral responsibility for one's actions has no quantitative difference from free will. It is all we need from free will.
I have already shown how we can be held morally responsible for our actions even if we had no alternative possibilities.
2) The problem of emergence
A greater elaboration on these can be found on my original Free Will thread.
Emergence can be addressed with our apparent uniqueness in consciousness and our ability to symbolize and abstract reality.
Explain further on the "why is punishment a bad thing?" line of thought. I don't really see where you're going with it. Punishment is an undesirable thing by definition.
I do not mean the word 'punishment' so much as what punishment entails. Why do we view the restriction of freedoms (going to jail) as bad?
I am specifically not referring to death because you already addressed that problem
And, sure, robot A has been punished. Or not, I don't really care. They're just robots. The point is that these robots don't have free will.
In the sense that they could not make a choice, yes. But they are morally responsible for their actions. Moral responsibility is all we need from free will, so there is no quantitative difference between our two definitions.
 

ComradeSAL

Smash Journeyman
Joined
Nov 27, 2001
Messages
223
Location
Ft. Collins, CO
I do not mean the word 'punishment' so much as what punishment entails. Why do we view the restriction of freedoms (going to jail) as bad?
I am specifically not referring to death because you already addressed that problem

In the sense that they could not make a choice, yes. But they are morally responsible for their actions. Moral responsibility is all we need from free will, so there is no quantitative difference between our two definitions.
1. Punishment is nothing but negative feedback. If you use enough negative feedback on an animal, you can eventually force it to do basically anything you want (as long as the action is within its physical and psychological limitations). Again, morals don't have anything to do with it.

2. No, the robots are not morally responsible for their actions any more than a deck of cards is morally responsible for giving me a bad hand. Let's say I cheat in the next hand by secretly taking away the Ace of spades and burning it. I'm still not holding the deck morally responsible even though the deck has been "punished."
 

yossarian22

Smash Journeyman
Joined
Mar 31, 2008
Messages
204
1. Punishment is nothing but negative feedback. If you use enough negative feedback on an animal, you can eventually force it to do basically anything you want (as long as the action is within its physical and psychological limitations). Again, morals don't have anything to do with it.
They have plenty to do with it.
Why are you giving them negative feedback for their actions? Every single law we have comes back to some vague feeling of good and bad.

And why is this particular feedback (jail) 'negative'?
Why do they have a will to avoid being confined?
2. No, the robots are not morally responsible for their actions any more than a deck of cards is morally responsible for giving me a bad hand. Let's say I cheat in the next hand by secretly taking away the Ace of spades and burning it. I'm still not holding the deck morally responsible even though the deck has been "punished."
You are holding them responsible for their actions, even if they had no choice. Robot A had no other option but to shoot Robot B. Yet you punish Robot A. It is responsible for its actions. You view the killing of Robot B as 'bad'. You can substitute 'bad' for anything else, but you are simply delaying the inevitable problem.
Morality can be programmed into us by evolution and sociological pressure. That does not take away from morality in the slightest.

And you have shifted the analogy. A deck cannot perform an action; it is perfectly inanimate. We consciously perform an action upon it. Your analogy is irrelevant.
 

yossarian22

Smash Journeyman
Joined
Mar 31, 2008
Messages
204
Here is a crude summary of my position

Free will is not the ability to do as one chooses, but the ability to be morally responsible for one's actions. Morality must exist, because a world without morality would not function as it does right now. ComradeSAL, your objections are irrelevant because they only state that 'good' and 'bad' are due to sociological and biological pressure. The exact nature of morality is irrelevant to my position; all I need is the existence of morality. Moral relativism, moral absolutism, and any other system of morality are perfectly acceptable so long as one does not reject morality outright. Your system still has 'good' and 'bad'; they have only been renamed.
 

AltF4

BRoomer
BRoomer
Joined
Dec 13, 2005
Messages
5,042
Location
2.412 – 2.462 GHz
This is just dumb. You're choosing your own definition for Free Will rather than the one everyone else is using, then asserting that it must then exist.

Well woopdey ****! I can do that too. I choose god to be defined as my computer monitor right in front of me. Look! God exists!


Your definition of Free Will is not the definition we have all been going on. You're arguing some lame strawman you ripped from the cold dead hands of Immanuel Kant.

Free Will is the ability for an agent to choose their actions and not be confined only to that dictated by the laws of nature. It has nothing to do with moral responsibility.
 

ComradeSAL

Smash Journeyman
Joined
Nov 27, 2001
Messages
223
Location
Ft. Collins, CO
They have plenty to do with it.
Why are you giving them negative feedback for their actions? Every single law we have comes back to some vague feeling of good and bad.

And why is this particular feedback (jail) 'negative'?
Why do they have a will to avoid being confined?
This is a biological question, not a moral one. Biological imperatives do not imply morality.

You are holding them responsible for their actions, even if they had no choice. Robot A had no other option but to shoot Robot B. Yet you punish Robot A. It is responsible for its actions. You view the killing of Robot B as 'bad'. You can substitute 'bad' for anything else, but you are simply delaying the inevitable problem.
Morality can be programmed into us by evolution and sociological pressure. That does not take away from morality in the slightest.

And you have shifted the analogy. A deck cannot perform an action; it is perfectly inanimate. We consciously perform an action upon it. Your analogy is irrelevant.
A robot is not any less inanimate than a windmill. It's just more complex.

I agree with AltF4. You have not only redefined everyone's definition of free will to be equivalent to moral responsibility, but you seem to have redefined moral responsibility to something that doesn't even make sense.
 

AltF4

BRoomer
BRoomer
Joined
Dec 13, 2005
Messages
5,042
Location
2.412 – 2.462 GHz
Oh, and I almost forgot, Yossarian. You in no way addressed either of my arguments for why Free Will cannot exist.

-In my previous thread (which was far more relevant than this one) I gave a lengthy description as to exactly how Free Will violates causality. In order to counter this argument, you have to point out what part of it is incorrect. You have yet to do this.

-Emergence cannot be explained by apparent complexity! That's exactly what makes it "the problem of emergence"!

For example, a computer has only three operations: AND, OR, and NOT. Everything a computer does is some combination of these three things. No matter how complex you make a computer it is completely deterministic. It has no Free Will, despite how much you make it LOOK like it does. Period.

It does not matter how much "apparent" uniqueness we have, the problem of emergence remains.
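AltF4's gate argument can be illustrated with a short sketch (the function names here are illustrative, not from the thread): any circuit composed solely of AND, OR, and NOT is a pure function of its inputs, so identical inputs always produce identical outputs, no matter how many gates are stacked together.

```python
# Sketch: a circuit built only from AND/OR/NOT gates is fully deterministic.

def NOT(a):
    return 1 - a

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def half_adder(a, b):
    """A small circuit composed only of the three gates above."""
    s = OR(AND(a, NOT(b)), AND(NOT(a), b))  # XOR, built from AND/OR/NOT
    carry = AND(a, b)
    return s, carry

# Same inputs always yield the same outputs -- there is no "choice" anywhere
# in the composition, only function application.
for a in (0, 1):
    for b in (0, 1):
        assert half_adder(a, b) == half_adder(a, b)
```

However complex the composition becomes, the output remains a fixed function of the input, which is exactly the determinism the post appeals to.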
 

yossarian22

Smash Journeyman
Joined
Mar 31, 2008
Messages
204
This is just dumb. You're choosing your own definition for Free Will rather than the one everyone else is using, then asserting that it must then exist.

Well woopdey ****! I can do that too. I choose god to be defined as my computer monitor right in front of me. Look! God exists!

Your definition of Free Will is not the definition we have all been going on. You're arguing some lame strawman you ripped from the cold dead hands of Immanuel Kant.

Free Will is the ability for an agent to choose their actions and not be confined only to that dictated by the laws of nature. It has nothing to do with moral responsibility.
For starters, Kant merely suggested that we are acting autonomously only when we act morally.
The idea that free will is identical to moral responsibility is far from unique to Kant anyhow; most philosophers hold that view.

Moreover, the version of free will I am using is far from being unestablished.

You claim that free will is freedom of action. That definition is idiotic and pointless.
My definition of free will is freedom of will. There is no difference in practice between the two, but my definition is far more flexible and useful.

This is a biological question, not a moral one. Biological imperatives do not imply morality.
Again, you have simply shifted 'good' to be 'evolutionarily/sociologically advantageous'. I couldn't care less about what morality is, merely that it exists. My method of proving existence may clash with traditional empiricism, but it is far from being a new and radical method. At the very least, I have established that we have a perception of 'good' and 'bad'.

A robot is not any less inanimate than a windmill. It's just more complex.
I am not differentiating between a robot and a windmill. I am differentiating between a robot and a deck of cards. And it can be stated that you do hold the deck morally responsible for its actions, just not to the extent that you do with a robot. You would not use a deck that lacks an ace of spades.

I agree with AltF4. You have not only redefined everyone's definition of free will to be equivalent to moral responsibility, but you seem to have redefined moral responsibility to something that doesn't even make sense.
Freedom of will is a perfectly valid definition of free will.
And moral responsibility is nothing more than the ability to be judged for one's actions. We do this all the time.


Oh, and I almost forgot, Yossarian. You in no way addressed either of my arguments for why Free Will cannot exist.

-In my previous thread (which was far more relevant than this one) I gave a lengthy description as to exactly how Free Will violates causality. In order to counter this argument, you have to point out what part of it is incorrect. You have yet to do this.
You have a faulty definition.
You have shown that freedom of action is impossible, but failed to show how freedom of will is invalid.

-Emergence cannot be explained by apparent complexity! That's exactly what makes it "the problem of emergence"!
For example, a computer has only three operations: AND, OR, and NOT. Everything a computer does is some combination of these three things. No matter how complex you make a computer it is completely deterministic. It has no Free Will, despite how much you make it LOOK like it does. Period.
It does not matter how much "apparent" uniqueness we have, the problem of emergence remains.
Fair enough. If you refuse to accept the argument that we are unique compared to anything else we have encountered....

We still hold animals responsible for their actions, albeit not to the extent that we hold each other. When a pitbull kills a child, we kill the pitbull. So the pitbull does have free will. We simply perform more interactions than a pitbull can and have superior reasoning, hence we are held more responsible for our actions. [So we are not unique in the slightest.]

And again, you are using freedom of action, not freedom of will.

To reiterate, moral responsibility and free will are intrinsically linked, as we cannot have the former without the latter. The idea that in order to be morally responsible for an action one needs alternatives led to the definition of free will you are using. But one does not need an alternative to be held morally responsible for one's actions, as I have shown earlier in the thread. If one does not need alternatives to be morally responsible, then it stands that one does not need alternatives to have freedom of will.
 

AltF4

BRoomer
BRoomer
Joined
Dec 13, 2005
Messages
5,042
Location
2.412 – 2.462 GHz
The problem you're having is that you assume that there IS a will, that humans have "intentions".

Does a rock falling down a hill have an "intention"? Does it have a "will"? Is it morally responsible for the things it crushes? Of course not. It is a rock. A piece of matter which has no freedom of action, as you choose to put it.

What I have shown is that nothing can have freedom of action. Thus everything is just a hunk of matter, rolling along and following the laws of nature.


In effect, a human is no more morally responsible for murder than an avalanche is for killing a witless skier. However, actions such as murder are a detriment to society and are naturally counterbalanced. So we are left with a biochemical emotion that instructs most humans to not do certain things, like murder.

There is no need to describe anything in terms of "Will" in some kind of idealized form. I don't suppose you're trying to be a dualist, Yossarian? In a materialist world, there is no such thing. Only the actions of moving particles which are bound to the laws of physics. Nothing more.
 

yossarian22

Smash Journeyman
Joined
Mar 31, 2008
Messages
204
The problem you're having is that you assume that there IS a will, that humans have "intentions".

Does a rock falling down a hill have an "intention"? Does it have a "will"? Is it morally responsible for the things it crushes? Of course not. It is a rock. A piece of matter which has no freedom of action, as you choose to put it.

What I have shown is that nothing can have freedom of action. Thus everything is just a hunk of matter, rolling along and following the laws of nature.
And I don't need freedom of action to have freedom of will.
You are arguing that will is purely mechanistic (correct me if I am wrong).
In effect, a human is no more morally responsible for murder than an avalanche is for killing a witless skier. However, actions such as murder are a detriment to society and are naturally counterbalanced. So we are left with a biochemical emotion that instructs most humans to not do certain things, like murder.

There is no need to describe anything in terms of "Will" in some kind of idealized form. I don't suppose you're trying to be a dualist, Yossarian? In a materialist world, there is no such thing. Only the actions of moving particles which are bound to the laws of physics. Nothing more.
I am not trying to be a dualist. It is a thoroughly useless view, although some have attempted to abuse quantum mechanics to argue that consciousness cannot be an emergent property. Dualism has mutated to a form of mysticism, most of which I dislike.

Will is inseparable from moral responsibility.

Your argument hinges on the principle of alternative possibilities: that one needs to be able to do differently in order to be held morally responsible. I have already provided a counterexample to this. The other problem with your claim is that it still allows for 'good' and 'bad' to exist; it shifts them into biological imperatives, but they still exist. There will also be some problems when it comes to crimes such as fraud, ****, and theft. I am interested in hearing your particular workarounds.
 

AltF4

BRoomer
BRoomer
Joined
Dec 13, 2005
Messages
5,042
Location
2.412 – 2.462 GHz
Well, you've already conceded that there is no Freedom of Action, which is what this topic was always about until you got here. So as far as I'm concerned, this conversation is over. But now I'm just intrigued at what you could possibly be getting at by imposing a notion of "will" into a materialistic world without Freedom of Action.

How are you trying to define "will" in your Free Will? Is it intention? So the world you are envisioning is one where we cannot control our actions... but can have freedom of intention? It just doesn't make sense. And even if it were somehow the case, the point would be useless. The entire universe is still bound by physical law, and these "intentions" or "wills" that are somehow free can have no effect on the real world, which keeps on going according to the laws of physics.
 

adumbrodeus

Smash Legend
Joined
Aug 21, 2007
Messages
11,321
Location
Tri-state area
Insofar as I can tell, you do not disagree on the facts, nor on the implications of said facts; this is merely a question of defining terms.

In that sense, neither of you is really wrong, since "free will" is inherently an ambiguous term. Once you agree upon an acceptable meaning for "free will", there is no debate here.


However, since the meaning of the term is not a substantive difference, merely an inadequacy of communication, I see no reason for you two to debate, since there does not seem to be a conceptual difference between your ideas.
 