Tech

The humanities can’t save Big Tech from itself


The problem with technology, many claim, is its penchant for quantification: a “hard” math imposed on the softer human world. Tech is Mark Zuckerberg: turning pretty girls into numbers and raving about the social wonders of the metaverse while being so clumsy in every human interaction that he is instantly memorable. The human world contains Zuck, but it is also everything he fails at so spectacularly. That failure, his lack of social and ethical rigor, is one many believe he shares with the industry to which he is so attached.

And so, because Big Tech is failing to understand people, we often hear that its workforce simply needs to hire more people who do understand them. Headlines like “Liberal arts majors are the future of the tech industry” and “Why computer science needs the humanities” have been a regular feature of tech and business articles over the past few years. It has been suggested that social workers and librarians could help the tech industry curb social media’s harm to Black youth and the spread of misinformation, respectively. Many anthropologists, sociologists, and philosophers, especially those with advanced degrees who are feeling a financial squeeze because of academia’s preference for STEM, are rushing to demonstrate their utility to tech giants whose starting salaries would make the average humanities professor blush.

I have been researching non-technical workers in the technology and media industries for the past few years. The arguments for “bringing in” sociocultural experts overlook the fact that these roles and workers already exist in the tech industry and, in various ways, always have. For example, many current UX researchers have advanced degrees in sociology, anthropology, or library and information science. And EDI (equality, diversity, and inclusion) educators and professionals often fill roles in tech HR departments.

Recently, however, the technology industry has been exploring where non-technical expertise might solve some of the social problems associated with its products. Increasingly, tech companies look to law and philosophy professors to help them navigate the legal and ethical complexities of platform governance, to activists and critical scholars to help protect marginalized users, and to other specialists to assist with platform challenges such as algorithmic oppression, disinformation, community governance, user well-being, and digital activism and revolutions. These data-driven industries are trying to augment their technical know-how and troves of data with social, cultural, and ethical expertise, or what I often call “soft” data.

But you can add all the soft-data staff you want, and very little will change unless the industry values that kind of data and expertise. In fact, many academics, policy experts, and other sociocultural experts in the AI and tech ethics space are noticing a disturbing trend of tech companies seeking out their expertise and then disregarding it in favor of more technical work and workers.

Such experiences particularly highlight this fraught moment in the burgeoning field of AI ethics, in which the tech industry may claim to be incorporating non-technical roles while actually folding ethical and sociocultural framings into job titles that are ultimately held by the same old technologists. More importantly, in our affection for these often undervalued “soft” professions, we must not overlook their limitations when it comes to achieving such lofty goals.

While it is important to defend the vital work done by these under-resourced and undervalued professions, there is no reason to believe that their members are inherently better equipped to be the arbiters of what is ethical. These individuals have very real and important social and cultural expertise, but their fields are all reckoning with structural dilemmas and areas of weakness.

Consider anthropology, a discipline that emerged as part of the Western colonial project. Although cultural anthropology now generally espouses the goal of social justice, there is no guarantee that an anthropologist (85% of whom are white in the US) will direct or deploy algorithms in a less biased way than, say, a computer scientist. Perhaps the most notorious example is PredPol, the multimillion-dollar predictive policing company that Ruha Benjamin calls part of the New Jim Code. PredPol was created by Jeff Brantingham, an anthropology professor at UCLA.

Other academic communities championed by soft-data proponents are similarly conflicted. Sociology’s early surveillance and quantification of Black populations played a role in today’s surveillance technologies, which disproportionately monitor Black communities. My own research area, critical internet studies, skews very white and has struggled to center concerns around race and racism. Indeed, I am often one of the few Black and brown researchers attending conferences in our field. There have been times when I was surrounded by more diversity in tech-industry spaces than in the academic spaces from which the major critiques of Big Tech stem.

Social workers could add some much-needed diversity to tech. Social work is overwhelmingly performed by women and is a fairly diverse profession: more than 22% of its members are Black and 14% Hispanic/Latino. However, social workers have also been implicated in state violence against marginalized communities. For example, a social worker co-authored a controversial paper with Brantingham extending his predictive policing work to automated gang classification.
