21 December 2009

Maybe it's not racist but just sexist?

Seriously, that's very stupid.
posted by jouke 21 December | 04:10
That's a weird bug - is it a bug?
posted by dabitch 21 December | 05:17
In twelve years in the software business, I've never met an African-American software developer, so accidental racism could be in play here. They just forgot to test on darker skin. On the other hand, there would certainly be Indian programmers around who would be as dark as the man in the video.

As a software tester by profession, I find this a pretty cool bug. I love stuff that only happens in a certain environment, like time-sensitive bugs.
posted by octothorpe 21 December | 07:58
How bizarre.
posted by typewriter 21 December | 09:31
On the other hand, I think it'd be cool to be invisible to computers. "Hey, computer! Look at this pencil floating around! Whoooo...."
posted by BoringPostcards 21 December | 10:30
"They just forgot to test on darker skin. On the other hand, there would certainly be Indian programmers around who would be as dark as the man in the video."

Considering the enormous number of man-hours required to make an algorithm like this work at all, it's pretty much impossible that no one thought to try.

More likely the algorithm just doesn't work very well with darker skin, or isn't optimised for it, or even more likely it isn't able to cope with this combination of skin colour and lighting.
posted by cillit bang 21 December | 10:46
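To make that point concrete: the common Haar-cascade style of face detector (the kind OpenCV ships; whether HP's tracking software works this way isn't public) scores image windows on local light/dark contrast, so an underexposed or low-contrast face can simply fall below the detection threshold. A minimal sketch of how you could see that effect yourself, assuming OpenCV is installed and using a hypothetical test photo:

    # Rough illustration only, not HP's actual tracking code.
    # Runs OpenCV's stock Haar-cascade face detector on the same image
    # at several brightness levels; detections drop out as contrast falls.
    # "test_face.jpg" is a hypothetical local photo.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    img = cv2.imread("test_face.jpg", cv2.IMREAD_GRAYSCALE)

    for gain in (1.0, 0.7, 0.4, 0.2):  # progressively darker renditions
        darker = cv2.convertScaleAbs(img, alpha=gain, beta=0)
        faces = cascade.detectMultiScale(darker, scaleFactor=1.1, minNeighbors=5)
        print(f"gain {gain}: {len(faces)} face(s) detected")

A detector that finds a face at full exposure will routinely miss it once the pixels are crushed toward black, which is consistent with the skin-tone-plus-lighting explanation rather than anything deliberate.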
I will remain the skeptic and call it a hoax until I see some substantiation.
posted by Ardiril 21 December | 12:03
HP responds.
posted by arse_hat 21 December | 12:36
OK, the possibility exists. Still, I would like to see some data on ease of replication.
posted by Ardiril 21 December | 12:57
Please note: although my skepticism is real, it does not run deep. At one consultancy where I worked, a senior consultant's full-time job was damage control for HP's QC screw-ups.
posted by Ardiril 21 December | 13:14
This happened on Better Off Ted. That show is an oracle of science.
posted by ethylene 21 December | 15:53
It's much more likely that the algorithm could not cope with store lighting, where the background is lit up and the faces aren't. My webcam can't track my face when the foreground lighting is dim. Last time I moved, it fixated on my dog in the background and refused to turn back to me. But I doubt that my webcam is animist ...
posted by Susurration 21 December | 18:11
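The backlighting theory is easy to poke at with the same stock detector: when the background is bright and the face is underexposed, a contrast-based detector loses its signal, and a simple histogram-equalization pass will sometimes bring the face back. Another rough sketch along the same lines, again assuming OpenCV and a hypothetical capture file:

    # Sketch of the backlighting idea: compare detection on a raw
    # underexposed frame vs. the same frame after histogram equalization.
    # "backlit_frame.jpg" is a hypothetical webcam capture.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    frame = cv2.imread("backlit_frame.jpg", cv2.IMREAD_GRAYSCALE)

    raw = cascade.detectMultiScale(frame, scaleFactor=1.1, minNeighbors=5)
    equalized = cv2.equalizeHist(frame)  # stretch the crushed shadow detail
    fixed = cascade.detectMultiScale(equalized, scaleFactor=1.1, minNeighbors=5)

    print(f"raw frame: {len(raw)} face(s), equalized: {len(fixed)} face(s)")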
"It's much more likely that the algorithm could not cope with store lighting, where the background is lit up and the faces aren't." Yeah it's just bad software.
posted by arse_hat 21 December | 18:52
"It's much more likely that the algorithm could not cope with store lighting, where the background is lit up and the faces aren't."

It works on the white woman's face despite the bad lighting.
posted by Miko 21 December | 23:05