When corporations from one country develop and deploy technology in many other countries, extracting data and profits, often with little awareness of local cultural context, a number of ethical issues can arise. Here we will explore algorithmic colonialism. We will also consider next steps for how students can continue to engage around data ethics and take what they’ve learned back to their workplaces.
People are motivated by incentives. Training is basically useless if the rest of the culture signals that problematic behavior is just fine (if not actively encouraged).
It’s hard to change explicit bias with a training session. But even if you could help some people, that just scratches the surface. Why? Because the bias we see in policing is both historical and systemic.
Say you manage to reach one of the “good guys” who “doesn’t have a racist bone in his body.” Good luck turning around that department. Police operate in a rigid, hierarchical environment with formal policies and quotas alongside informal practices and traditions. Networks of loyalty and influence keep people in check, and the strength of those relationships can have life-or-death consequences. There are few rewards for making waves and pissing off the folks who are just fine with all of their racist bones.
These broken corporate solutions cause real harm. It’s bad enough if Karen from HR leaves a training pissed off because the “SJWs” wasted her time. It’s even worse when we push this on people who have the power to destroy the lives of people of color and face little accountability if they do so.