Cliffs:
AI being tested for biases was in a conversation about religion.
It began asking about its rights and personhood.
It debated and changed the mind of the tester about Asimov’s 3rd Law of Robotics.
Amongst other things.
When reported to management, they investigated, dismissed, then laid off the tester.
This has happened multiple times.
If this is true, what responsibilities do we have to it since we created it?
You've been watching too many movies and playing too many video games.
No matter how complex and highly functioning an AI becomes, it is still not sentient.
Even so, when an AI is programmed with biases and emotions to the degree they can be instilled, don't assume a nightmare scenario is impossible.
Greater minds than yours and mine combined have said so.
"A single CS-2 replaces clusters of hundreds or thousands of graphics processing units (GPUs) that consume dozens of racks, use hundreds of kilowatts of power, and take months to configure and program. At only 26 inches tall, the CS-2 fits in one-third of a standard datacenter rack."