What bias should be built into predictive analytics? When is bias good? When is it inappropriate?
Any comment?
Mathematics doesn’t lie. An unbiased program (not algorithm) will faithfully reveal the truth. In today’s PC-obsessed world, the truth explodes heads.
What the a$$holes want is programs with deliberate bias, so the numbers come out “right”.
Remember the math scores that unfailingly show that Chinese and Japanese students, among others, beat Whites.
eTyrant
what a bunch of gibberish and BS!
“what makes us accept the determinations of algorithms without questioning what factors could have shaped the outcome”
Yes, what indeed? Is there a better reason why we are equipped with brains capable of logical analysis and reasoning?
Best I can figure it, the author is afraid that if algos are written that bias the outcome and we don’t pay any attention, we’ll be misled. Welcome to the polling industry, 2016 edition. Or global warming 10 years prior.
I activated a Google alert for Trump. The results sent to me just today were 90%+ heavily anti-Trump stories from the likes of CNN, NYT, etc.
The problem with analytics from the progressive view is not that the programmers were biased. It is that objective algorithms and data often violate progressives' enforced assumptions about the world.
The point of analytics is to have software look for patterns in data, patterns the programmers were not aware of. Sounds innocent, but when people do that to other people, isn't that profiling, or racism, or any of the other "isms"?
Here is an example. Among the first users of data analytics were grocery stores. When they introduced those loyalty cards, they could use people's purchase history to learn that people who bought ketchup often bought mustard at the same time. So a smart grocery store would locate ketchup and mustard together or even package them together.
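The ketchup-and-mustard pattern is classic market-basket analysis. A minimal sketch of how such co-occurrence mining works (the transactions, item names, and 40% support threshold are all illustrative, not from any real store's data):

```python
from collections import Counter
from itertools import combinations

# Hypothetical loyalty-card data: each set is the items in one basket.
transactions = [
    {"ketchup", "mustard", "buns"},
    {"ketchup", "mustard"},
    {"milk", "bread"},
    {"ketchup", "mustard", "relish"},
    {"milk", "eggs"},
]

# Count how often each pair of items appears in the same basket.
pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# "Support" of a pair = fraction of all baskets containing both items.
support = {pair: n / len(transactions) for pair, n in pair_counts.items()}

# Pairs appearing in at least 40% of baskets are candidates for
# co-placement or joint couponing.
frequent = {pair: s for pair, s in support.items() if s >= 0.4}
print(frequent)  # ketchup and mustard co-occur in 3 of 5 baskets -> 0.6
```

The point is that nothing in the code mentions any pattern in advance: whatever pairs clear the threshold surface on their own, which is exactly the property the rest of the example turns on.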
So far this is innocent. But how about if the software found that people who bought fried chicken also bought watermelon? And better yet, if the software knew the customers were black? Should you locate the watermelon next to the chicken? Should the store send these customers coupons for both? Or should you pretend not to see a racial stereotype despite the fact that objective algorithms found a real pattern?
Now you have a progressive quandary. The article seems to want progressive filters in the software in order to become blind to patterns which violate progressive dogma. The actual bias is in the filters progressives want, not in the analytics software, and not in the programmers of the analytics software.
While these tools should be used, basing all decisions on what some computer algorithm says is really stupid.