Go Back   City-Data Forum > General Forums > Politics and Other Controversies
Old 06-26-2016, 05:44 PM
 
1,423 posts, read 1,050,663 times
Reputation: 532


Well, I work in this area, and I can tell you that many of the "automatically learned" models still depend on what features you pick.

For example, if you use "race" as a feature in crime prediction, in the US I am pretty sure "black" people will appear more likely to commit crimes, and the statistical model will learn that. However, humans decide what features to use to begin with.

In the camera example, if you use both white people and black people in the training data set, of course black people will form a tight cluster, which leads to problems. But if you train the model with only black people's data, the result will be totally different.
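To make that concrete, here's a toy sketch (entirely synthetic data, not any real system): a "model" that just learns per-group arrest rates from historical records will reproduce whatever bias those records carry — the feature list and the data are both human choices.

```python
# Toy illustration: the "model" here is just per-group arrest rates learned
# from (synthetic) historical data. Whatever bias the data carries, the
# model reproduces. Group labels and records are made up.

from collections import defaultdict

# Hypothetical historical records: (group, arrest_made)
records = [
    ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", True),
]

def fit_rates(data):
    counts = defaultdict(lambda: [0, 0])  # group -> [arrests, total]
    for group, arrested in data:
        counts[group][0] += arrested
        counts[group][1] += 1
    return {g: a / n for g, (a, n) in counts.items()}

rates = fit_rates(records)
# The "prediction" for a group is just its historical rate: heavier past
# policing of group A shows up as a higher predicted rate for A.
```

The point isn't that anyone's production system is this crude — it's that any model fit to skewed historical data inherits the skew.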

 
Old 06-26-2016, 06:09 PM
 
Location: Texas
37,949 posts, read 17,870,209 times
Reputation: 10371
Quote:
Originally Posted by The Dark Enlightenment View Post
Geniuses like Stephen Hawking and Elon Musk warn of the dangers of Terminator-like AIs running amok, but this NYT writer wants you to know that's just silly "hand wringing". The real problem with artificial intelligence is that it's mostly being created by white men. And since white men are racist and sexist, AI will be too.

An even more serious problem is that, when AIs crunch all the crime data, they give racist advice to the police.
Police departments across the United States are also deploying data-driven risk-assessment tools in “predictive policing” crime prevention efforts. In many cities, including New York, Los Angeles, Chicago and Miami, software analyses of large sets of historical crime data are used to forecast where crime hot spots are most likely to emerge; the police are then directed to those areas... this could result in more surveillance in traditionally poorer, nonwhite neighborhoods, while wealthy, whiter neighborhoods are scrutinized even less.

If racist trees weren't bad enough now we have racist robots.


NYT: “Artificial Intelligence
The police need no help in being racist. The biggest organized crime unit in America is also the most racist.
 
Old 06-26-2016, 06:10 PM
 
Location: Minnesota
1,548 posts, read 913,607 times
Reputation: 1413
"Google’s photo app, which applies automatic labels to pictures in digital photo albums, was classifying images of black people as gorillas."

LOL
 
Old 06-26-2016, 06:15 PM
 
34,619 posts, read 21,621,539 times
Reputation: 22232
We need to counter police racism by keeping the police out of poor black neighborhoods.

Please deploy these police to my neighborhood.
 
Old 06-26-2016, 06:48 PM
 
Location: Suburb of Chicago
31,848 posts, read 17,615,406 times
Reputation: 29385
Quote:
Originally Posted by blanker View Post
"Google’s photo app, which applies automatic labels to pictures in digital photo albums, was classifying images of black people as gorillas."

LOL

You seriously find that funny?
 
Old 06-26-2016, 07:05 PM
 
Location: Japan
15,292 posts, read 7,761,514 times
Reputation: 10006
Quote:
Originally Posted by yueng-ling View Post
Well, I work in this area, and I can tell you that many of the "automatically learned" models still depend on what features you pick.

For example, if you use "race" as a feature in crime prediction, in the US I am pretty sure "black" people will appear more likely to commit crimes, and the statistical model will learn that. However, humans decide what features to use to begin with.
If the machine's purpose is to predict criminal activity and advise police on where to deploy, I doubt the results would depend on it knowing about race. Dozens of other categories of data would lead it to predict more crime in black neighborhoods.

Quote:
In the camera example, if you use both white people and black people in the training data set to train the model, of course black people will form a tight cluster, which leads to problems. But if you train the model with only black people's data, it will be totally different.
I imagine clearly distinguishing people with dark skin, hair and eyes is just a fundamentally harder task for the software, but doable in the long run.
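A quick illustration of that point (synthetic data, made-up zip codes): drop "race" from the features entirely, and a correlated stand-in like location recovers the same signal anyway.

```python
# Toy proxy-feature demo: even with the sensitive attribute removed, a
# correlated stand-in (here a fabricated zip code) lets a rate-based
# predictor reach the same conclusions. Data is purely illustrative.

records = [
    # (zip_code, group, arrest_made) -- group is never shown to the model
    ("11111", "X", True), ("11111", "X", True), ("11111", "X", False),
    ("22222", "Y", False), ("22222", "Y", False), ("22222", "Y", True),
]

def rate_by(key_index, data):
    totals = {}
    for rec in data:
        key, arrested = rec[key_index], rec[2]
        a, n = totals.get(key, (0, 0))
        totals[key] = (a + arrested, n + 1)
    return {k: a / n for k, (a, n) in totals.items()}

by_zip = rate_by(0, records)    # the model sees only zip codes
by_group = rate_by(1, records)  # the hidden attribute it never saw
# by_zip["11111"] equals by_group["X"]: the proxy recovers the group signal.
```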
 
Old 06-26-2016, 07:30 PM
 
12,270 posts, read 11,331,859 times
Reputation: 8066
Quote:
Originally Posted by andywire View Post
Can we just have a list of the things that are not racist? It might simplify things a bit...
Trees in National Parks won't be on that list. Neither will most National Parks and their Rangers... Trees are Racist: National Parks are Racist Because They Have Trees | Frontpage Mag
 
Old 06-26-2016, 07:54 PM
 
Location: Just over the horizon
18,461 posts, read 7,092,496 times
Reputation: 11707
Quote:
Originally Posted by yueng-ling View Post
Well, I work in this area, and I can tell you that many of the "automatically learned" models still depend on what features you pick.

For example, if you use "race" as a feature in crime prediction, in the US I am pretty sure "black" people will appear more likely to commit crimes, and the statistical model will learn that. However, humans decide what features to use to begin with.

In the camera example, if you use both white people and black people in the training data set, of course black people will form a tight cluster, which leads to problems. But if you train the model with only black people's data, the result will be totally different.



In other words:

Reality is racist.
 
Old 06-27-2016, 07:00 AM
 
16,212 posts, read 10,826,104 times
Reputation: 8442
Quote:
Originally Posted by The Dark Enlightenment View Post
Geniuses like Stephen Hawking and Elon Musk warn of the dangers of Terminator-like AIs running amok, but this NYT writer wants you to know that's just silly "hand wringing". The real problem with artificial intelligence is that it's mostly being created by white men. And since white men are racist and sexist, AI will be too.

An even more serious problem is that, when AIs crunch all the crime data, they give racist advice to the police.

If racist trees weren't bad enough now we have racist robots.


NYT: “Artificial Intelligence
First off, I find it odd that your OP link is to a pseudo-racist site instead of the actual NYT op-ed piece...

Artificial Intelligence's White Guy Problem

You seem to frequent a lot of white supremacy sites...

On the op-ed itself, it mentioned both racism and sexism in regard to technology. It also mentioned more than just black people in the photo issues with AI, and how internet job postings automatically show female browsers lower-paying jobs than they show male browsers looking for jobs.

IMO, as a couple of people have shared, this actually is a more important issue than "The Terminator" scenario of killer robots, because Google especially is a part of nearly everyone's lives today.

Also FWIW, the op-ed piece did not mention Stephen Hawking...


Quote:
Originally Posted by pknopp View Post
Most of this is simply the fact that new technology takes time to work out the bugs. It is a good argument for keeping it out of the hands of law enforcement until it is more precise, though.
I agree with this. I also feel that it is a good thing to review the kinks in technology and to ensure that developers are creating apps and websites that can handle a broad array of individuals.

Also, on the OP's comments that darker-skinned people are more difficult to discern in pictures... that is silly. You should also realize that the majority of the world's population is brown- to dark-skinned, and that all sorts of non-white people use these apps and technologies.
 
Old 06-27-2016, 07:49 AM
 
Location: Philadelphia, Pennsylvania
5,281 posts, read 6,590,770 times
Reputation: 4405
This is a common data science problem: "data is biased," but that has always been an issue. This isn't really "artificial intelligence"; it's just a predictive model. It uses simple regressions and fits to the data to make predictions about what decisions to make next. I don't think "crime prevention" is a useful application for machine learning. But if more black people like yours truly learn machine learning, we can counteract these with more sophisticated models of our own. I like the idea of "racist big data". It means there is an actual market for black people who want to go into the vast field of machine learning and data science.

These are popular articles, and I'm still somewhat learning about machine learning. But it sounds to me like this is an example of "supervised learning": they take historical data, run some sort of regression, and come up with a prediction. It is among the crudest applications of machine learning. It would be interesting to see what the best model is. I may need to chase down the methodology. I've been looking to cut my teeth on some machine learning problems, and I may have just found one I find interesting.
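For anyone curious what the crudest version of that looks like, here's a minimal supervised-learning sketch (synthetic numbers, plain Python, not the methodology of any real system): fit ordinary least squares to historical counts, then forecast the next period.

```python
# Minimal supervised-learning sketch: ordinary least squares on synthetic
# "historical incident counts" per month, then a one-step-ahead forecast.
# Entirely illustrative -- the numbers are made up.

def fit_line(xs, ys):
    """Least-squares fit y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

months = [1, 2, 3, 4, 5]
incidents = [10, 12, 14, 16, 18]   # synthetic, perfectly linear on purpose
m, b = fit_line(months, incidents)
forecast = m * 6 + b               # predict month 6
```

Real systems layer on spatial grids and fancier estimators, but the shape is the same: historical data in, extrapolation out — which is exactly why biased history yields biased forecasts.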

© 2005-2024, Advameg, Inc. · Please obey Forum Rules · Terms of Use and Privacy Policy · Bug Bounty
