Old 01-23-2014, 05:41 PM
 
Location: Parts Unknown, Northern California
48,564 posts, read 24,129,546 times
Reputation: 21239


Let us say that you have developed an android, a mechanical being which in all outward respects resembles a human. Now you are programming its personality and are free to make it as virtuous as you wish.

How unselfish would you make it? To what degree would you obligate its programming to take risks to itself in order to help others?

 
Old 01-23-2014, 06:35 PM
 
Location: Florida
7,246 posts, read 7,076,730 times
Reputation: 17828
Didn't Asimov do that already?
 
Old 01-23-2014, 06:50 PM
 
Location: Parts Unknown, Northern California
48,564 posts, read 24,129,546 times
Reputation: 21239
Quote:
Originally Posted by kab0906 View Post
Didn't Asimov do that already?
Not in this thread.

The idea here was to trigger a debate about ourselves: to what degree do we owe others relative to ourselves?
 
Old 01-24-2014, 04:20 AM
 
Location: S. Wales.
50,088 posts, read 20,723,660 times
Reputation: 5930
I presume that the android would not be as vulnerable as ourselves. How 'expendable' it would be would depend on how human it looked. The fact is that we can dump paper and paint a lot more easily than we can dump paper and paint that have been put together into a picture of a pretty lass with appealing eyes.
 
Old 01-24-2014, 07:12 AM
 
Location: Parts Unknown, Northern California
48,564 posts, read 24,129,546 times
Reputation: 21239
Forget the android part; that is getting hung up on the hypothetical at the expense of the question the hypothetical raises.

Make it a human...but in an age where we have learned how to genetically program personalities. There are ten possible settings, with ten being total disregard for self in favor of others and one being self-preservation über alles.

What would be the proper setting?
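
To put the dial in concrete terms, here is a minimal sketch of the setting as a parameter. Everything named below (should_help, cost_to_self, and so on) is invented for illustration; the hypothetical specifies nothing but the 1-to-10 scale.

Code:
# A rough sketch of the hypothetical 1-10 "selflessness setting".
# All names here are invented for illustration; the scenario above
# specifies only the scale, with 1 = self-preservation above all
# and 10 = total disregard for self.

def should_help(altruism_setting: int, cost_to_self: float,
                benefit_to_other: float) -> bool:
    """Decide whether to help, given a fixed personality setting.

    The setting acts as an exchange rate: how much cost to self the
    agent will accept per unit of benefit conferred on another.
    """
    if not 1 <= altruism_setting <= 10:
        raise ValueError("setting must be between 1 and 10")
    if altruism_setting == 10:
        return True  # total disregard for self: always help
    acceptable_cost = benefit_to_other * (altruism_setting - 1) / 9
    return cost_to_self <= acceptable_cost

# A mid-scale personality (5) helps when the cost to itself is less
# than about 44% of the benefit to the other.
print(should_help(5, cost_to_self=0.4, benefit_to_other=1.0))  # True
print(should_help(5, cost_to_self=0.5, benefit_to_other=1.0))  # False

Even this toy version shows why the question is hard: any fixed setting is really a fixed exchange rate between my welfare and yours, applied blindly to every situation.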
 
Old 01-24-2014, 07:43 AM
 
Location: Northeastern US
20,005 posts, read 13,480,828 times
Reputation: 9938
I think the first problem here is the equating of "morality" with "roboticness". This is an extremely common error that theists fall into. "We must have suffering, because ROBOTS!" "We must have free will, because ROBOTS!" Yet somehow this is not a problem in heaven. "We MUST have a perfect afterlife, and be ROBOTS!"

Yes, I know, you dispensed with the android, but even so, the idea that we can create and program a person immediately evokes ROBOTS.

Here's the deal. Morality is a social contract which we subscribe to because we have freedom of choice (if not, exactly, free will). We are free to choose to be moral or immoral. The decision just is what it is; however, a sufficiently enlightened and self-aware actor recognizes that fully participating in the social contract is in the long-term interest of a sustainable society, which, in turn, is in the long-term interest of each individual living in that society.

Because of self-preservation there is always a tension between the good of the individual and the greater good of the family, tribe, or society. This is simply a tension that each person must resolve with their freedom of choice. My thought is that the correct setting isn't simply some sort of scale of (dis)regard for self, but a scale of self-awareness about how one fits into society and the subgroups to which one belongs. It's never a simple question of (dis)regard for self but of costs vs. benefits to self and a widening circle of family, friends, coworkers, and fellow citizens.
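
If you wanted to model that widening circle, a crude sketch might weight each circle's gains and losses by social distance. The circle names and weights below are arbitrary assumptions, purely to make the idea concrete.

Code:
# Illustrative only: weight the benefit or cost to each circle by its
# social distance from the actor. The weights are arbitrary assumptions.
CIRCLE_WEIGHTS = {
    "self": 1.0,
    "family": 0.8,
    "friends": 0.6,
    "coworkers": 0.4,
    "fellow_citizens": 0.2,
}

def net_value(effects: dict[str, float]) -> float:
    """Sum each circle's gain (positive) or loss (negative),
    scaled by that circle's weight."""
    return sum(CIRCLE_WEIGHTS[circle] * delta
               for circle, delta in effects.items())

# An act costing the self 2 units can still come out positive if it
# benefits the family and fellow citizens enough.
print(round(net_value({"self": -2.0, "family": 2.0,
                       "fellow_citizens": 3.0}), 2))  # 0.2

The point of the sketch is just that no single (dis)regard-for-self dial captures this; the answer changes with who is affected and by how much.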
 
Old 01-24-2014, 10:45 AM
 
Location: S. Wales.
50,088 posts, read 20,723,660 times
Reputation: 5930
Quote:
Originally Posted by Grandstander View Post
Forget the android part; that is getting hung up on the hypothetical at the expense of the question the hypothetical raises.

Make it a human...but in an age where we have learned how to genetically program personalities. There are ten possible settings, with ten being total disregard for self in favor of others and one being self-preservation über alles.

What would be the proper setting?
That is rather different. In effect we are being given a human and asked to instruct it in morality. I rather think the answer is the same as we have heard on the threads about relative morality vs. God-given absolute morality.

Mordant's post above puts it pretty well. We treat others as we would want them to treat us. Individual acts of bravery and self-sacrifice are not for all, but they tend to evoke admiration, medals, and an appearance on a chat show if they come off.
 
Old 01-24-2014, 01:06 PM
 
Location: Parts Unknown, Northern California
48,564 posts, read 24,129,546 times
Reputation: 21239
Quote:
Originally Posted by mordant View Post
I think the first problem here is the equating of "morality" with "roboticness". This is an extremely common error that theists fall into. "We must have suffering, because ROBOTS!" "We must have free will, because ROBOTS!" Yet somehow this is not a problem in heaven. "We MUST have a perfect afterlife, and be ROBOTS!"

Yes, I know, you dispensed with the android, but even so, the idea that we can create and program a person immediately evokes ROBOTS.

Here's the deal. Morality is a social contract which we subscribe to because we have freedom of choice (if not, exactly, free will). We are free to choose to be moral or immoral. The decision just is what it is; however, a sufficiently enlightened and self-aware actor recognizes that fully participating in the social contract is in the long-term interest of a sustainable society, which, in turn, is in the long-term interest of each individual living in that society.

Because of self-preservation there is always a tension between the good of the individual and the greater good of the family, tribe, or society. This is simply a tension that each person must resolve with their freedom of choice. My thought is that the correct setting isn't simply some sort of scale of (dis)regard for self, but a scale of self-awareness about how one fits into society and the subgroups to which one belongs. It's never a simple question of (dis)regard for self but of costs vs. benefits to self and a widening circle of family, friends, coworkers, and fellow citizens.
The point of setting it up the way that I did was to create a situation where, instead of behaving well or poorly on the basis of the situation in front of you at that moment, we might try to arrive at a consensus regarding the proper, general duty to others vs. self. You are right, we do not make these choices in a vacuum, but what if we could?
 
Old 01-24-2014, 02:25 PM
 
Location: Northeastern US
20,005 posts, read 13,480,828 times
Reputation: 9938
Quote:
Originally Posted by Grandstander View Post
The point of setting it up the way that I did was to create a situation where, instead of behaving well or poorly on the basis of the situation in front of you at that moment, we might try to arrive at a consensus regarding the proper, general duty to others vs. self. You are right, we do not make these choices in a vacuum, but what if we could?
Well, if you're deliberately contriving a situation, I guess I would invoke the principle that no one can give to others without it coming from some sort of overflow of their own. So I would set "selfless" all the way up, but with an exception where the person is free to back off when their life becomes a bleak wasteland because they are giving selflessly to the point that everyone else expects it and takes it for granted.
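
As a sketch of that rule, with a wellbeing measure and threshold invented purely for illustration:

Code:
# Sketch of the proposal above: maximal selflessness with an escape
# hatch. The wellbeing scale (0-1) and its floor are invented for
# illustration; the post itself names no numbers.
def effective_selflessness(wellbeing: float, floor: float = 0.2) -> int:
    """Return the 1-10 setting: 10 (fully selfless) unless the giver's
    own wellbeing has dropped below the floor, in which case back off
    to 1 until the overflow is restored."""
    return 10 if wellbeing >= floor else 1

print(effective_selflessness(0.9))  # 10: give freely
print(effective_selflessness(0.1))  # 1: back off and recover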
 
Old 01-24-2014, 05:18 PM
 
Location: S. Wales.
50,088 posts, read 20,723,660 times
Reputation: 5930
A 'vacuum' is also a bit of a difficult thing to have. If we are passing on some sort of morality for it, we would have to assume that it might well come to interact with humans, and so would find it safe (as well as convenient) to pass on the consensus morality we have thrashed out over centuries of social experiment, law-code making and moral philosophy. So what we pass on would not have come from a 'vacuum'.

Which I see is pretty much what mordant says.