Archuleta A. Chisolm

Technology and Racism Don't Mix


I was in high school when we got our first family computer. I don’t even remember what brand it was, but I was excited. All I could think about was finally being able to type up all of my poems. We’re talking monochrome screen, heavy CPU, and having to save your work on a floppy disk. And let’s not forget the good ol’ days of no Internet.


Since it was supposed to be a family computer, it sat on a wood desk in the family room. Beside it was a matching dot matrix printer. That thing was so loud! Nonetheless, it did its job. For the most part, I was the only one who used the computer. In fact, my dad asked on more than one occasion, “Don’t you have something else to do?” My answer was always no.


Technology has evolved (thank God) since then. But I can’t help but see it as a blessing and a curse. I can’t help but see it as an undercurrent of coronavirus and police brutality. Both cause physical and psychological violence. Both disproportionately kill and weaken Black and brown people. And both are propelled by technology that we design, repurpose, and implement.


We call on technology to help solve problems. But when society defines and frames people of color as “the problem,” those solutions do more harm than good. We’ve designed facial recognition technologies that target criminal suspects on the basis of skin color. We’ve trained automated profiling systems that disproportionately identify Latinx people as illegal immigrants. We’ve developed credit scoring algorithms that disproportionately identify Black people as risks and prevent them from buying homes, getting loans, or finding jobs.


When you hear people, and not just older people, say they don’t want anything to do with technology, can you really blame them? It gets to be too much. Too complicated. Too difficult. Too, well, racist.

Black people continue to be labeled as the culprit behind all our nation’s problems. When contact tracing first showed up at the beginning of the pandemic, it was easy to see it as a necessary health surveillance tool. The coronavirus was the problem, so we began to design new surveillance technologies in the form of contact tracing, temperature monitoring, and threat mapping applications to help address it. Fancy.


But something both interesting and tragic happened. We discovered that Black people, Latinx people, and Indigenous populations were disproportionately infected and affected. Suddenly, we too became a national problem; we disproportionately threatened to spread the virus. That was compounded by the murder of George Floyd and the reckoning over racial injustice.


It makes you wonder how long it will take for law enforcement to deploy the same technologies designed to fight the pandemic to suppress the threat that Black people supposedly pose to the nation’s safety. If we don’t want our technology to be used to perpetuate racism, then we must make sure that we don’t conflate social problems with Black and brown people. When we do that, we risk turning those people into the problems that we deploy our technology to solve.



