
The case of the creepy algorithm that "predicted" teenage pregnancy


To read this article in Spanish, click here.

In 2018, while the Argentine Congress was hotly debating a bill to decriminalize abortion, the Ministry of Early Childhood in the northern province of Salta and the American technology giant Microsoft presented an algorithmic system to predict teenage pregnancy. They called it the Technology Platform for Social Intervention.

"With technology, based on first name, last name, and address, you can foresee five or six years ahead which girl, or future teenager, is 86 percent predestined to have a teenage pregnancy," Juan Manuel Urtubey, then the governor of the province, proudly announced on national television. The stated goal was to use the algorithm to predict which girls from low-income areas would become pregnant within the next five years. It was never made clear what would happen once a girl or young woman was labeled as "predestined" for motherhood, or how this information would help prevent teenage pregnancy. The social theories informing the AI system, like its algorithms, are opaque.

The system was based on data, including age, ethnicity, country of origin, disability, and whether the subject's home had hot water in the bathroom, from 200,000 residents of the city of Salta, including 12,000 women and girls between the ages of 10 and 19. Though there is no official documentation, we know from reviewing media articles and two technical reviews that "territorial agents" visited the homes of the girls and women in question, asked survey questions, took photos, and recorded GPS locations. What did the people subjected to this close surveillance have in common? They were poor, some were migrants from Bolivia and other South American countries, and others came from the Indigenous Wichí, Qulla, and Guaraní communities.

While a Microsoft spokesperson proudly described the technology in Salta as "one of the pioneering cases in the use of AI data" in state programs, it presents little that is new. Rather, it is an extension of a long-standing Argentine tradition: controlling the population through surveillance and force. And the reaction to it shows how grassroots Argentine feminists were able to confront this misuse of artificial intelligence.

In the 19th and early 20th centuries, successive Argentine governments carried out a genocide of Indigenous communities and promoted immigration policies based on ideologies designed to attract European settlement, all in the hope of blanquismo, or "whitening" the country. Over time, a national identity was constructed along social, cultural, and above all racial lines.

This type of eugenic thinking has a way of reshaping itself and adapting to new scientific paradigms and political circumstances, according to historian Marisa Miranda, who tracks Argentina's attempts to control the population through science and technology. Take the case of immigration. Throughout Argentina's history, opinion has swung between celebrating immigration as a means of "improving" the population and seeing immigrants as undesirable and a political threat to be carefully monitored and managed.

More recently, the Argentine military dictatorship of 1976 to 1983 exerted population control through systematic political violence. During the dictatorship, women had the "patriotic duty" of populating the country, and contraception was prohibited by a 1977 law. The cruelest expression of the dictatorship's interest in motherhood was its practice of abducting pregnant women it considered politically subversive. Most of these women were murdered after giving birth, and many of their children were illegally adopted by the military to be raised by "patriotic, Catholic families."

Although Salta's AI system to "predict pregnancy" was hailed as futuristic, it can only be understood in light of this long history, particularly, in Miranda's words, the persistent eugenic impulse that always "contains a reference to the future" and assumes that reproduction "should be managed by the powerful."

Because of the complete lack of national AI regulation, the Technology Platform for Social Intervention was never subjected to formal review, and no assessment of its impact on girls and women has been carried out. There is no official published data on its accuracy or outcomes. Like most AI systems around the world, including those used in sensitive contexts, it lacks transparency and accountability.

While it is unclear whether the technology program was ultimately suspended, everything we know about the system comes from the efforts of feminist activists and journalists who led what amounted to a grassroots audit of a flawed and harmful AI system. By quickly activating a well-oiled machine of community organizing, these activists brought national media attention to how an untested, unregulated technology was being used to violate the rights of girls and women.

As feminist scholars Paz Peña and Joana Varon write: "The idea that algorithms can predict teenage pregnancy before it happens is the perfect excuse for activists against women's sexual and reproductive rights to claim that abortion laws are unnecessary." Indeed, it was soon revealed that an Argentine nonprofit called the Conin Foundation, run by Dr. Abel Albino, a vocal opponent of abortion rights, was behind the technology, along with Microsoft.


