
ouinon
Supporting Member

Joined: 10 Jul 2007
Age: 61
Gender: Female
Posts: 5,939
Location: Europe

01 Oct 2008, 3:35 am

The more I think about it the more I think that it sounds like a back to front/inside out way of saying " humans do not have free will", :) :wink:

Because if it is not to involve people being your servants/slaves/objects available/ready and waiting to carry out/express your every conscious or unconscious desire/whim/"choice", it has to involve people whose own needs/"choices" are exactly matched up at that moment in time with yours.

The man driving the car needed/wanted/ had "chosen" to have an accident, etc.

Like a machine in which one cog meets up with another at exactly the right angle etc etc etc, ... ... no free will... ... each of us part of one immense machine! 8)

.



lau
Veteran

Joined: 17 Jun 2006
Age: 75
Gender: Male
Posts: 9,791
Location: Somerset UK

01 Oct 2008, 9:03 am

Sorry again, ouinon, but you have started ascribing "beliefs" to me, which I do not have.

I stated that I take responsibility. I mentioned no beliefs.

If you wish to elaborate this in your own way, to imply things that I have not said...

If others also take responsibility for events, I am pleased.

I see no conflict between freedom of choice and acceptance of responsibility.

I find your last comment quite useful(?):

ouinon wrote:
Like a machine in which one cog meets up with another at exactly the right angle etc etc etc, ... ... no free will... ... each of us part of one immense machine!

It seems, to me, that consciousness sits at a mid-point, of sorts. The machine has to be complex, before we get "I", but currently, the larger the machine becomes, the less likely it is to be conscious. A question of scale.

Hence your reference to "cogs" would imply systems too simple to exhibit consciousness.

Conversely, your "each of us part of one immense machine" is targeting attention at a level that has insufficient cohesion to match up with the concept of consciousness. The same problem crops up with trees - a giant sequoia is a hugely complex machine, but really doesn't seem to compete in the self-awareness stakes (or maybe we're wrong there, and it's just so slow).

Nope. Evolution has come up with some pretty smart things... neurons, which are versatile and small. On a scale that balances nicely between complexity and speed. Until nanotechnology comes up with a more compact substitute/replacement (which I can see no reason not to expect), they're the best thing around.


_________________
"Striking up conversations with strangers is an autistic person's version of extreme sports." Kamran Nazeer


chever
Veteran

Joined: 21 Aug 2008
Age: 36
Gender: Male
Posts: 1,291
Location: Earth

01 Oct 2008, 11:49 pm

lau wrote:
chever wrote:
The definition comes from the Norvig text I mentioned earlier, which very neatly identifies a rational agent as a mapping from percepts (various sensory inputs) to rational actions (based on these percepts).

How very behaviourist of him - a definition of thinking as a list of reactions to inputs.


I guess the mapping can define itself.

Of course, coming up with definitions in this field that will satisfy everyone is impossible. One of the reasons that the Norvig text is widely considered the best undergrad AI text is that it does the best it can, and entertains other viewpoints at least briefly.
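
As a concrete (and entirely made-up) illustration of that "mapping" idea, here is a minimal Python sketch of the simplest, table-driven case; it is not code from the Norvig text, and the percept names, actions and table entries are invented purely for the example:

Code:
# A toy "agent as a mapping from percepts to actions".
# Everything here (percept names, actions, table entries) is hypothetical.

def table_driven_agent(percept_sequence, table):
    """Return the action the table associates with this percept history."""
    return table.get(tuple(percept_sequence), "do_nothing")

# Hypothetical lookup table: each percept history maps to exactly one action.
table = {
    ("light_on",): "stay",
    ("light_off",): "flip_switch",
    ("light_off", "light_on"): "stay",
}

percepts = []
for new_percept in ["light_off", "light_on"]:
    percepts.append(new_percept)
    print(table_driven_agent(percepts, table))
# prints: flip_switch, then stay

The behaviourist flavour lau points out is visible right there: the whole "mind" of such an agent is the lookup table, and anything smarter (learning, goals, utilities) has to be layered on top of, or folded into, that mapping.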

lau wrote:
Again, I dropped the word, because I felt that it was carrying exactly that baggage: the implication that there is "something else" that distinguishes a "genuine subjective experience" from a plain "subjective experience". As a machine, I refuse to have any truck with this baseless otherness.


I used to follow the same line of reasoning, but I couldn't anymore. Doesn't mean I felt the need to believe in an afterlife like many other people who believe in some kind of spirit or god(s); I'd rather not have to worry about an afterlife, so this just gives me even more of a headache.


_________________
"You can take me, but you cannot take my bunghole! For I have no bunghole! I am the Great Cornholio!"


slowmutant
Veteran

Joined: 13 Feb 2008
Age: 45
Gender: Male
Posts: 8,430
Location: Ontario, Canada

01 Oct 2008, 11:52 pm

ouinon wrote:
The more I think about it the more I think that it sounds like a back to front/inside out way of saying " humans do not have free will", :) :wink:

Because if it is not to involve people being your servants/slaves/objects available/ready and waiting to carry out/express your every conscious or unconscious desire/whim/"choice", it has to involve people whose own needs/"choices" are exactly matched up at that moment in time with yours.

The man driving the car needed/wanted/ had "chosen" to have an accident, etc.

Like a machine in which one cog meets up with another at exactly the right angle etc etc etc, ... ... no free will... ... each of us part of one immense machine! 8)

.


Congratulations, I think you've won.



chever
Veteran

Joined: 21 Aug 2008
Age: 36
Gender: Male
Posts: 1,291
Location: Earth

02 Oct 2008, 12:24 am

In the monists' defense: having no soul isn't a bar to free will since sufficiently complex systems (e.g., weather, the stock market, the insides of your head) are really not deterministic, at least not in any immediately obvious way.

Now, it would be very interesting if, everything else being equal, a brain reacted the same way under the same circumstances every time...
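
To make both halves of that concrete: whether or not such systems are ultimately deterministic, even a fully deterministic toy rule can behave in a way that is "not deterministic in any immediately obvious way". Below is a small Python sketch using the logistic map (a standard textbook example, nothing to do with brains specifically); run it from exactly the same starting state and you get exactly the same trajectory every time, but change the start by one part in a million and the two runs soon disagree completely:

Code:
# The logistic map x -> r*x*(1-x) is completely deterministic, yet for r = 3.9
# its behaviour looks erratic, and tiny differences in the starting state grow
# until two runs no longer resemble each other.

def logistic_trajectory(x, steps, r=3.9):
    """Iterate x -> r*x*(1-x) and return the list of visited values."""
    values = []
    for _ in range(steps):
        x = r * x * (1.0 - x)
        values.append(x)
    return values

run_a = logistic_trajectory(0.200000, 50)
run_b = logistic_trajectory(0.200000, 50)   # identical start: identical trajectory
run_c = logistic_trajectory(0.200001, 50)   # start differs by one part in a million

print(run_a == run_b)                       # True: same circumstances, same result
print(abs(run_a[-1] - run_c[-1]))           # typically vastly larger than 0.000001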


_________________
"You can take me, but you cannot take my bunghole! For I have no bunghole! I am the Great Cornholio!"


ouinon
Supporting Member

Joined: 10 Jul 2007
Age: 61
Gender: Female
Posts: 5,939
Location: Europe

02 Oct 2008, 7:29 am

lau wrote:
Sorry again, ouinon, but you have started ascribing "beliefs" to me, which I do not have.

I see that we misunderstand each other almost completely, as you seem to have totally misunderstood most of my last couple of posts as well. :? :wink:

Quote:
I stated that I take responsibility. I mentioned no beliefs.

If by "taking responsibility for something" you do not mean that you believe that you in some way caused it to happen, what do you mean by the statement? :? :?: What does "taking responsibility" mean if not "believing" that you caused it to happen, ( which I "believe" you did say in fact )?

Quote:
If others also take responsibility for events, I am pleased.

My question is how do you divide up responsibility for events in that case, if you believe that you are responsible for everything that happens to you?

How could anyone else be responsible for anything at all concerning/in their relation to you, unless in fact you are not responsible for everything that happens to you? (Or do you mean by "you" some higher self of which you are an indivisible part, along with everyone else ?)

Quote:
The machine has to be complex, before we get "I", but currently, the larger the machine becomes, the less likely it is to be conscious. Hence your reference to "cogs" would imply systems too simple to exhibit consciousness.

I meant this metaphorically, as a visual analogy, not as an exact description. I am well aware that the kind of machine which has cogs is too large, and not complex enough, to have consciousness.

I just meant that each of us may be as interlinked, as automatically connected in functioning, as some cogs connecting/interlocking in a machine. Each on their ( relatively, but always inter-relating ) "set" path. No free will at all. Otherwise your responsibility for everything that happens to you would mean that others' wills were subject to yours.

.



Last edited by ouinon on 02 Oct 2008, 8:22 am, edited 1 time in total.

ouinon
Supporting Member

Joined: 10 Jul 2007
Age: 61
Gender: Female
Posts: 5,939
Location: Europe

02 Oct 2008, 8:20 am

chever wrote:
It would be very interesting [ ed. to see/find out whether ...], everything else being equal, a brain reacted the same way under the same circumstances every time...

Exactly. And until we know, one way or the other, belief in contra-causal free-will, ( as opposed to merely a descriptive term for our subjective experience of making decisions), is a question of faith.

And unless some hitherto undetected/unproven "force" is discovered to be interfering in our brain activity, the probability is that the brain would react the same way, ( all other things being equal; though even the passage of time would presumably have some effect, and so circumstances could never be said to have remained the same :? :) ) .

.



slowmutant
Veteran

Joined: 13 Feb 2008
Age: 45
Gender: Male
Posts: 8,430
Location: Ontario, Canada

02 Oct 2008, 8:26 am

If we were all cogs in the same machine, you'd think there'd be more cooperation and peaceful coexistence among the human race. But there isn't much of either, really.



ouinon
Supporting Member

Joined: 10 Jul 2007
Age: 61
Gender: Female
Posts: 5,939
Location: Europe

02 Oct 2008, 9:37 am

slowmutant wrote:
If we were all cogs in the same machine, you'd think there'd be more cooperation and peaceful coexistence among the human race. But there isn't much of either, really.

Most humans behave in a mostly cooperative and peaceful way. It is only a very small minority who do not. In fact civilisation is a huge and generally successful cooperative effort. :)
.



lau
Veteran

Joined: 17 Jun 2006
Age: 75
Gender: Male
Posts: 9,791
Location: Somerset UK

02 Oct 2008, 9:47 am

ouinon wrote:
I see that we misunderstand each other almost completely, as you seem to have totally misunderstood most of my last couple of posts aswell. :? :wink:
I haven't.

ouinon wrote:
Quote:
I stated that I take responsibility. I mentioned no beliefs.

If by "taking responsibility for something" you do not mean that you believe that you in some way caused it to happen, what do you mean by the statement? :? :?: What does "taking responsibility" mean if not "believing" that you caused it to happen, ( which I "believe" you did say in fact )?
Your belief. If you check, you will find you were wrong to think I had said this.

You equate responsibility with active cause. I don't. I can happily take responsibility for effects that have occurred that I have not actively sought to prevent.

ouinon wrote:
Quote:
If others also take responsibility for events, I am pleased.

My question is how do you divide up responsibility for events in that case, if you believe that you are responsible for everything that happens to you?

How could anyone else be responsible for anything at all concerning/in their relation to you, unless in fact you are not responsible for everything that happens to you? (Or do you mean by "you" some higher self of which you are an indivisible part, along with everyone else ?)
I did not say anything about dividing responsibility. I do not do that. I see no reason why two people cannot each take full responsibility for a single event, even though I am a mathematician.

ouinon wrote:
Quote:
The machine has to be complex, before we get "I", but currently, the larger the machine becomes, the less likely it is to be conscious. Hence your reference to "cogs" would imply systems too simple to exhibit consciousness.

I meant this metaphorically, as a visual analogy, not as an exact description. I am well aware that the kind of machine which has cogs is too large, and not complex enough, to have consciousness.

I just meant that each of us may be as interlinked, as automatically connected in functioning, as some cogs connecting/interlocking in a machine. Each on their ( relatively, but always inter-relating ) "set" path. No free will at all. Otherwise your responsibility for everything that happens to you would mean that others' wills were subject to yours.

.
I find most metaphors to be ultimately misleading, which is why I took apart the one you used. I tried (it would appear unsuccessfully) to show where you were introducing, what I consider to be, two fallacies of scale.


_________________
"Striking up conversations with strangers is an autistic person's version of extreme sports." Kamran Nazeer


ouinon
Supporting Member

Joined: 10 Jul 2007
Age: 61
Gender: Female
Posts: 5,939
Location: Europe

02 Oct 2008, 10:00 am

lau wrote:
ouinon wrote:
I see that we misunderstand each other almost completely, as you seem to have totally misunderstood most of my last posts as well.
I haven't.

You have, or that is the impression produced by your replies anyway. I still have no idea what you mean/are trying to express either; it is as if despite both using English we are speaking different languages.

.



lau
Veteran

Joined: 17 Jun 2006
Age: 75
Gender: Male
Posts: 9,791
Location: Somerset UK

02 Oct 2008, 10:18 am

ouinon wrote:
lau wrote:
ouinon wrote:
I see that we misunderstand each other almost completely, as you seem to have totally misunderstood most of my last posts as well.
I haven't.

You have. And I still have no idea what you mean/are trying to express either. I think that despite both using English we are speaking different languages.

.

I haven't misunderstood you.

That is, other than finding it difficult to see what "responsibility", as a word, and "free will", as a concept, mean to you. I use both in what I feel is close to the "conventional" meanings. I suppose in both cases, I use them utterly subjectively.

You do seem to believe in absolute determinism, at times, then start drifting off into the "each of us ... some cogs connecting/interlocking in a machine" realm, at others. I "believe" in neither. I see no very strong evidence for either.


_________________
"Striking up conversations with strangers is an autistic person's version of extreme sports." Kamran Nazeer


Amitiel
Yellow-bellied Woodpecker

Joined: 13 Sep 2008
Gender: Female
Posts: 55

02 Oct 2008, 3:08 pm

In regard to the OP's question.

My thoughts: Our consciousness is due to the intricacy and extensive development of our neurological wiring. Computers are pretty straightforward in comparison.



slowmutant
Veteran

Joined: 13 Feb 2008
Age: 45
Gender: Male
Posts: 8,430
Location: Ontario, Canada

02 Oct 2008, 3:10 pm

Amitiel wrote:
In regard to the OP's question.

My thoughts: Our consciousness is due to the intricacy and extensive development of our neurological wiring. Computers are pretty straightforward in comparison.


I agree. I doubt that any man-made machine will attain or even approach the level of complexity found in the human brain. I think the human brain is one-of-a-kind in this respect.



lau
Veteran

Joined: 17 Jun 2006
Age: 75
Gender: Male
Posts: 9,791
Location: Somerset UK

02 Oct 2008, 5:26 pm

slowmutant wrote:
Amitiel wrote:
In regard to the OP's question.

My thoughts: Our consciousness is due to the intricacy and extensive development of our neurological wiring. Computers are pretty straightforward in comparison.


I agree. I doubt that any man-made machine will attain or even approach the level of complexity found in the human brain. I think the human brain is one-of-a-kind in this respect.

I disagree. I am sure that man-made machines will attain and exceed the level of complexity found in the human brain. I think the human brain is a minor transitional phase in this respect.


_________________
"Striking up conversations with strangers is an autistic person's version of extreme sports." Kamran Nazeer


slowmutant
Veteran

Joined: 13 Feb 2008
Age: 45
Gender: Male
Posts: 8,430
Location: Ontario, Canada

02 Oct 2008, 5:29 pm

lau wrote:
slowmutant wrote:
Amitiel wrote:
In regard to the OP's question.

My thoughts: Our consciousness is due to the intricacy and extensive development of our neurological wiring. Computers are pretty straightforward in comparison.


I agree. I doubt that any man-made machine will attain or even approach the level of complexity found in the human brain. I think the human brain is one-of-a-kind in this respect.

I disagree. I am sure that man-made machines will attain and exceed the level of complexity found in the human brain. I think the human brain is a minor transitional phase in this respect.


I think you underestimate the brain, good sir.