Tech

If technology is not designed for the most vulnerable, it will damage us all


What do Russian protesters have in common with Twitter users worried about Elon Musk reading their direct messages, and with people worried about the criminalization of abortion? It would serve them all to be protected by a stronger set of design practices from the companies developing these technologies.

Let's back up. Last month, Russian police coerced protesters into unlocking their phones to look for evidence of dissent, leading to arrests and fines. What's worse, Telegram, one of the main chat-based apps used in Russia, is vulnerable to these searches. Merely having the Telegram app on a personal device can imply that its owner does not support the Kremlin's war. But Telegram's builders have failed to design the app with personal safety in mind for high-risk environments, and not only in the Russian context. Telegram can thus be weaponized against its users.

Likewise, amid the back-and-forth over Elon Musk's plan to acquire Twitter, many of the platform's users have expressed concern about his bid to steer algorithmic content moderation and other design changes on his $44 billion whim. Rolling out recommendations from someone with no framework for the risks and harms to highly marginalized people leads to proclamations like "authenticating all humans." This appears to be a push to eliminate online anonymity, something I have written about very personally. It is ill-considered, harmful to those most at risk, and unbacked by any methodology or evidence. Beyond these impromptu pronouncements, Musk's previous actions, combined with the existing harms of Twitter's current structures, have made it clear that we are headed toward further impacts on vulnerable groups, such as Black and POC Twitter users and trans people.

Meanwhile, the lack of safety infrastructure is hitting people hard in the US, since the leak of a draft Supreme Court opinion in Dobbs v. Jackson shows that the protections provided under Roe v. Wade are under serious threat. With the expected criminalization of those seeking or providing abortions, it has become increasingly clear that the tools and technologies most used to access critical health care data are unsafe and dangerous.

Similar steps can protect users in all of these contexts. If the people building these tools designed their apps with a focus on safety in high-risk environments — for the people typically treated as more "extreme" or "edge" cases and therefore ignored — the weaponization that users fear would not be possible, or users would at least have the tools to manage their risk.

The reality is that building better, safer, less harmful technology requires design grounded in the lived realities of those who are most marginalized. These "edge cases" are often dismissed because they fall outside a typical user's likely experiences. Yet they are powerful indicators for understanding the flaws in our technologies. The decentered are the most marginalized and often the most criminalized. By understanding and identifying who is most impacted by distinct social, political, and legal frameworks, we can understand who is most likely to become a victim of the weaponization of certain technologies. And, as an added benefit, tech built around these extreme cases will always generalize to a broader range of users.

From 2016 to the beginning of this year, I led research projects at the human rights organization Article 19, in conjunction with local organizations in Iran, Lebanon, and Egypt and with the support of international experts. We explored the lived experiences of queer people who faced police persecution because of their use of specific personal technologies. Take the experience of a queer Syrian refugee in Lebanon who is stopped at a police or army checkpoint. Their phone is searched arbitrarily. The icon for a queer app, Grindr, is spotted, and the officer identifies the person as gay. Other areas of the refugee's phone are then examined, revealing what is deemed "queer content." The refugee is taken in for further interrogation and subjected to verbal and physical abuse. They now face sentencing under Article 534 of the Lebanese Penal Code and could be subject to imprisonment, fines, and/or the revocation of their immigration status in Lebanon. This is one case among many.

But what if that icon were hidden, and no app visibly marked the individual as queer, while still allowing them to keep their apps and connect with other queer people? Based on this research, and in partnership with the Guardian Project, Grindr worked on implementing a stealth mode in its product.

The company also followed our other recommendations with similar success. Changes such as the discreet app icon allow users to have the app appear as a common utility, such as a calendar or a calculator. So, in an initial police search, users can avoid the risk of being outed by the content or imagery of the apps they own. Although the feature was built solely on the findings from extreme cases, such as that of the queer Syrian refugee, it proved popular with users around the globe. Indeed, it became so popular that it went from being available only in "high-risk" countries to being offered free internationally in 2020, alongside the similarly popular PIN feature that came out of the same project. This was the first time a dating app had taken such radical security measures for its users; many of Grindr's competitors have since followed suit.




