Law in the Internet Society

Clifton Martin, L6160, Law in the Internet Society
 
Grindr: A Revolutionary App or A Disease to the LGBT Community?
 

Introduction:

Not long ago, a gay or bisexual man could consider himself lucky to meet anyone at a club or bar; the LGBT community had no clear way for men to meet one another. Today's phone apps, however, have revolutionized dating for the general public. Grindr, a dating app built to connect male-identifying members of the LGBT community, lets men locate other Grindr users who are nearby. According to the app's creator, Joel Simkhai, Grindr is for "guys meeting guys" and is meant to help gay men establish relationships, whether friendship, dates, or sex. Despite the creator's intentions, most men use Grindr for casual sex. Grindr's culture of casual sex is therefore problematic: it reinforces an inaccurate, generalized view, commonly held by those outside the LGBT community, that queer men are more sexually promiscuous.
 
Origin, Function, and Use of Grindr:
 
Grindr is a smartphone application that uses GPS technology to locate other gay men nearby, wherever the user happens to be. Since launching in 2009, the app has been downloaded over 10 million times, is available in 192 countries, and has more than 2.6 million users who have collectively exchanged more than 70 million chat messages. Over the past 15 years, Grindr has grown into the world's largest social networking app for gay, bisexual, trans, and queer people. The app is not limited to men who are "out of the closet"; men who are questioning their sexuality or who identify as "discreet" or "closeted" can use it as well.
 
Each Grindr user has a profile with personal information, focused on physical features such as height, weight, ethnicity, and body type. A profile also displays the user's relationship status, current HIV status, and "tribe", a filter that lets users identify themselves with a specific group within the gay community, such as clean-cut, twink, bear, or geek. These preferences let users narrow their searches and find their preferred type of man. Such features make it easy for men to find what they are looking for, but they also contribute to the app's overtly sexual nature, since the filtering is done primarily by physical preference.
 
How Grindr Perpetuates Gay Stereotypes:
 
Both outside and even within the LGBT community, there is an inaccurate but entrenched stereotype that queer men are more promiscuous and more active in today's "hook-up culture", which encourages and normalizes sexual encounters without long-term commitment or emotional attachment. Grindr and its users have created their own culture of hooking up. Individuals outside the LGBT community are already apt to believe that gay men have higher levels of casual sex, especially after the 1980s HIV/AIDS epidemic, which gave rise to many of the sexual stereotypes about gay men that exist today.
 
However, not every gay or bisexual man is sexually active, let alone sexually promiscuous, which disproves the central misconception behind these stereotypes. Yet the way gay, bisexual, and queer men actually use Grindr pushes the inaccurate stereotype further, when the app's societal influence could instead be used to shatter the myth. Grindr also has features that seem to inadvertently encourage casual sex among its users. The instant messaging feature, for example, helps create Grindr's hook-up culture: users can send pictures that tend to go beyond the typical selfie and are often sexually explicit. The slang popularized by Grindr messaging has also shaped this culture, with terms like "host", asking whether an individual can host his sex partner(s) at home, or "safe", a way of asking whether the person wants to use a condom or another safe-sex method. Ultimately, the frequent and popular use of Grindr and its features has allowed a culture of hooking up to permeate and thrive in the LGBT community, and the fact that the app is used largely for casual sex inaccurately implies that homosexual men are more promiscuous.
 
The Issue and What the App Should Do:
 
While Grindr has made it easy to meet gay men in the area, it has simultaneously made finding a long-term relationship more challenging. The possibility of a relationship typically seems promising to users, since the app provides such easy access to other men nearby. However, because Grindr is so commonly used to find casual sex, a great many men have found that these meetings do not really go anywhere and that the app is an inefficient means of finding a relationship, leaving those craving a long-term relationship deeply disappointed.
 
Grindr has certainly revolutionized physical interaction among gay men, allowing them to easily filter through and find sexual partners. Although Grindr serves to connect gay men with one another, its actual use goes beyond a networking outlet to an app with a thriving culture of casual sex. That reality strengthens the belief, held both inside and outside the LGBT community, that homosexual men are hypersexual and promiscuous. To a certain extent, Grindr does benefit the gay community, as it truly does connect gay, bisexual, and queer men with one another. But the negative social impact and stigma associated with the LGBT community persist in part because of Grindr's use and popularity, and that makes the app problematic. Even though Simkhai cannot control all of Grindr's consumers and their intentions for using it, he can shape the impact the app creates for the rest of the public, and in doing so he should consider its implications for the LGBT community. The long-established stereotypes about gay men need to be eliminated, not perpetuated.
 
Sources: "About Grindr." Grindr, www.grindr.com/about/.
 
Beck, Julie. “The Rise of Dating-App Fatigue.” The Atlantic, Atlantic Media Company, 27 Oct. 2016, www.theatlantic.com/health/archive/2016/10/the-unbearable-exhaustion-of-dating-apps/505184/.
 
Engle, Clyde. "10 Things I Learned About Gay Hook-Up Culture From My Day On Grindr." Elite Daily, 17 Dec. 2018, www.elitedaily.com/dating/gay-hook-up-culture-grindr/1354315.
 
Salemo, Robert. "Twenty Questions for Grindr Creator Joel Simkhai." Xtra, 28 July 2011, www.dailyxtra.com/twenty-questions-for-grindr-creator-joel-simkhai-33729.
 
Tadich, Paul. "The IPhone Revolutionized Gay Hookup Culture." Motherboard, VICE, 27 June 2017, www.motherboard.vice.com/en_us/article/bj84b8/iphone-anniversary-grindr-gay-hookup-culture.

Revision 36r36 - 24 Oct 2024 - 21:42:47 - CliftonMartin