<!DOCTYPE html>
<html lang="" xml:lang="" xmlns="http://www.w3.org/1999/xhtml"><head>
<meta charset="utf-8"/>
<meta content="pandoc" name="generator"/>
<meta content="width=device-width, initial-scale=1.0, user-scalable=yes" name="viewport"/>
<title>Daily-Dose: 14 June, 2023</title>
<style>
code{white-space: pre-wrap;}
span.smallcaps{font-variant: small-caps;}
span.underline{text-decoration: underline;}
div.column{display: inline-block; vertical-align: top; width: 50%;}
div.hanging-indent{margin-left: 1.5em; text-indent: -1.5em;}
ul.task-list{list-style: none;}
</style>
<link href="styles/simple.css" rel="stylesheet"/><link href="../styles/simple.css" rel="stylesheet"/><style>*{overflow-x:hidden;}</style><link href="https://unpkg.com/aos@2.3.1/dist/aos.css" rel="stylesheet"/><script src="https://unpkg.com/aos@2.3.1/dist/aos.js"></script></head>
<body>
<h1 data-aos="fade-down" id="daily-dose">Daily-Dose</h1>
<h1 data-aos="fade-right" data-aos-anchor-placement="top-bottom" id="contents">Contents</h1>
<ul>
<li><a href="#from-new-yorker">From New Yorker</a></li>
<li><a href="#from-vox">From Vox</a></li>
<li><a href="#from-the-hindu-sports">From The Hindu: Sports</a></li>
<li><a href="#from-the-hindu-national-news">From The Hindu: National News</a></li>
<li><a href="#from-bbc-europe">From BBC: Europe</a></li>
<li><a href="#from-ars-technica">From Ars Technica</a></li>
<li><a href="#from-jokes-subreddit">From Jokes Subreddit</a></li>
</ul>
<h1 data-aos="fade-right" id="from-new-yorker">From New Yorker</h1>
<ul>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>The Case Against Trump Is Strong, but There Are Problems Ahead</strong> - It gives Trump a compelling reason to persevere in his campaign, and to sow doubt about the criminal process. - <a href="https://www.newyorker.com/news/daily-comment/the-case-against-trump-is-strong-but-there-are-problems-ahead">link</a></p></li>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>How Elon Musk Could Affect the 2024 Election</strong> - The personal politics of Twitter’s owner wouldn’t matter so much if he hadn’t also demonstrated an extraordinary capacity for pettiness. - <a href="https://www.newyorker.com/news/annals-of-communications/how-elon-musk-could-affect-the-2024-election">link</a></p></li>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>What Was Nate Silver’s Data Revolution?</strong> - Silver, a former professional poker player, was in the business of measuring probabilities. Many readers mistook him for an oracle. - <a href="https://www.newyorker.com/news/our-columnists/what-was-nate-silvers-data-revolution">link</a></p></li>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>The Trial of the Malibu Shooter</strong> - Anthony Rauda, who was accused of terrorizing residents of Malibu, one of California’s wealthiest and safest communities, has been convicted of killing a man sleeping in a tent with his two young daughters. - <a href="https://www.newyorker.com/news/california-chronicles/the-trial-of-the-malibu-shooter">link</a></p></li>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Is Donald Trump Scared?</strong> - At the former President’s indictment in Miami on Tuesday, it was impossible to say whether his fate was more likely to be a return to the White House—or prison. - <a href="https://www.newyorker.com/news/dispatch/is-donald-trump-scared">link</a></p></li>
</ul>
<h1 data-aos="fade-right" id="from-vox">From Vox</h1>
<ul>
<li><strong>Will limiting background checks make housing fairer?</strong> -
<figure>
<img alt="An “apartment for rent” sign on row of brownstone townhouses." src="https://cdn.vox-cdn.com/thumbor/pQ4ZQcm5kgZvZKLE5Fg9MkNxS1U=/531x0:4778x3185/1310x983/cdn.vox-cdn.com/uploads/chorus_image/image/72368763/542768874.0.jpg"/>
<figcaption>
A sign advertises an apartment for rent in Brooklyn, New York, in June 2016. | Drew Angerer/Getty Images
</figcaption>
</figure>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom">
Why some cities are restricting landlords’ reviews of criminal history.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="mdUdww">
Every year, more than 600,000 people <a href="https://aspe.hhs.gov/topics/human-services/incarceration-reentry-0#:~:text=Each%20year%2C%20more%20than%20600%2C000,million%20cycle%20through%20local%20jails.">leave US state and federal prisons</a>. Then they need to find a place to live.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="EVmcRE">
Researchers have found that formerly incarcerated individuals are <a href="https://www.prisonpolicy.org/reports/housing.html">far more likely</a> to be <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4762459/">homeless</a> than the general public. Many landlords simply refuse to rent to applicants who’ve been to jail or prison — and given that one in three US adults has a criminal record, this creates a significant housing crisis.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="x1pnC2">
But those <a href="https://static1.squarespace.com/static/5935ee95893fc011586f1304/t/59c08915197aeab158677454/1505790234297/Washington+State+Housing+Voucher+Evaluation.pdf">released with stable housing</a> are more likely to <a href="https://www.ojp.gov/ncjrs/virtual-library/abstracts/homelessness-and-reentry-multisite-outcome-evaluation-washington">reintegrate into their communities</a> and less likely to end up back in prison than their formerly incarcerated peers in more precarious housing situations.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="sma5li">
Enter “fair chance” laws: legislation that limits how landlords can use criminal records when screening prospective tenants. While the ordinances vary from place to place — some cover all rental housing while others just apply to subsidized housing — the goal is to limit how criminal histories can be used and ensure due process for prospective tenants when applying.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="vW7E58">
Think of it <a href="https://www.vox.com/2015/11/2/9660282/obama-ban-the-box">as a “ban-the-box”</a> policy, which prohibits employers from asking about criminal records, but for landlords. The movement has picked up steam in liberal localities over the last decade, first in cities like Oakland, Berkeley, Seattle, and Portland. In <a href="https://richmondconfidential.org/2017/12/11/after-almost-a-year-richmonds-fair-housing-ordinance-begins-implementation/">Richmond, California</a>, landlords who accept Section 8 vouchers are barred from rejecting applicants based on criminal history alone. <a href="https://spokesman-recorder.com/2019/11/06/minneapolis-expands-renter-protections/">Minneapolis</a> restricts the use of background checks, eviction history, and credit history in rental applications, and <a href="https://www.nytimes.com/2021/06/04/nyregion/nj-housing-bill-ban-the-box-bill.html">New Jersey</a> restricts how far back in time a specific crime can be considered.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="db8VhP">
Early research suggests fair chance ordinances may have some unintended consequences: <a href="https://www.minneapolisfed.org/research/institute-working-papers/the-impact-of-renter-protection-policies-on-rental-housing-discrimination">One study found landlords in Minneapolis</a> became more likely to discriminate by race after the policy took effect. But by and large, there hasn’t been much research into how fair chance laws are working, as proponents have been focused on raising awareness about the new protections and implementing them.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="T8NFTx">
“It’s been just two years since New Jersey’s passage, and in full transparency, a bill like this does take time,” said James Williams, the director of racial justice policy at the New Jersey-based Fair Share Housing Center. “There’s a tremendous amount of education required and the education piece is still something that’s being actively done.”
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="RCCG6e">
For now, most advocates have their eye on a pending legal battle in Seattle, which in 2017 passed <a href="https://library.municode.com/wa/seattle/codes/municipal_code?nodeId=TIT14HURI_CH14.09USSCREHO_14.09.005SHTI">the most progressive fair chance ordinance</a> in the country, prohibiting landlords from asking about “any arrest record, conviction record or criminal history” or refusing to rent to them because of that history. Landlords <a href="https://www.seattletimes.com/seattle-news/politics/seattle-sued-over-law-meant-to-help-renters-with-criminal-records-get-housing/">sued in 2018</a>, arguing the statute violated their free speech and due process rights, and this past March a panel of the Ninth Circuit Court of Appeals decided the part of the law banning landlords from asking about criminal <a href="https://cdn.ca9.uscourts.gov/datastore/opinions/2023/03/21/21-35567.pdf">histories was unconstitutional</a>. The Court upheld other aspects of the law, though, and both sides have filed for an appeal.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="E1UsUo">
“I think the results from that case will have far-reaching implications,” said Marie Claire Tran-Leung, a senior attorney at the National Housing Law Project, <a href="https://www.nhlp.org/wp-content/uploads/021320_NHLP_FairChance_Final.pdf">which has promoted fair chance ordinances</a> around the country.
</p>
<h3 id="rK3eob">
What we know about fair chance ordinances in practice
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="G1sVs6">
For now, the only rigorous study on fair chance housing ordinances comes from the Minneapolis Federal Reserve, where two economists <a href="https://www.minneapolisfed.org/research/institute-working-papers/the-impact-of-renter-protection-policies-on-rental-housing-discrimination">looked at the effects of a law</a> the Minneapolis city council passed in 2019.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="tGC7AJ">
The local law caps security deposits at one month’s rent, bans the use of credit scores in rental applications, and restricts landlords’ ability to reject people based on evictions that occurred more than three years prior. For criminal records, landlords can no longer reject applicants due to misdemeanors older than three years, felonies older than seven years, and certain more serious convictions older than 10 years.
</p>
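<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom">To make those lookback windows concrete, here is a minimal Python sketch. The category names and cutoffs follow the summary above; the function is purely illustrative, not the ordinance’s legal text.</p>
<pre><code># Illustrative only: the lookback windows summarized above
# (misdemeanors: 3 years, felonies: 7 years, certain serious
# convictions: 10 years). Not legal advice, not the statute's text.
LOOKBACK_YEARS = {"misdemeanor": 3, "felony": 7, "serious": 10}

def may_consider(record_type, years_since_conviction):
    """True if a landlord may still weigh this conviction."""
    return years_since_conviction &lt;= LOOKBACK_YEARS[record_type]

print(may_consider("felony", 8))  # False: outside the seven-year window
</code></pre>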
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="XYQFKQ">
The economists submitted fake email inquiries to publicly listed rental ads using names chosen to sound like Black, white, and Somali people. (Minnesota has the largest Somali population in the US.)
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="luw2uM">
The researchers found that after Minneapolis’s fair chance ordinance took effect, discrimination against Black and Somali applicants increased by over 10 percentage points for both groups, relative to those in neighboring St. Paul, which did not have such a law. Differences were largest for emails sent from Black and Somali male-sounding names, for apartments that were at least two bedrooms, and for units in historically Black neighborhoods. (The researchers couldn’t identify individual companies that discriminated, but could observe discrimination based on overall contact rates to randomized emails sent to large groups of properties.)
</p>
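<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom">The headline numbers in a correspondence audit like this come from simple arithmetic on response rates. A sketch with invented counts (the study’s actual data is not reproduced here):</p>
<pre><code># Hypothetical counts for illustration: compare the share of rental
# inquiries that get a reply across name groups.
def response_rate(replies, inquiries):
    return replies / inquiries

white_rate = response_rate(620, 1000)  # invented: 62.0%
black_rate = response_rate(510, 1000)  # invented: 51.0%

gap = (white_rate - black_rate) * 100
print(f"Response gap: {gap:.1f} percentage points")  # 11.0
</code></pre>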
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="qw4xXw">
Marina Mileo Gorzig, one of the economists, told Vox that their study helps show the causal impact of the fair chance ordinance, though it’s impossible to tell which aspect of the law — be it limiting eviction history, credit history, or criminal records — might be causing the effect.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="9aCr04">
Similar unintended consequences have been found for ban-the-box policies in the employment context. <a href="https://www.vox.com/2016/6/3/11842950/ban-the-box-race">Research published in 2016</a> found employers were actually more likely to discriminate based on race following the passage of ban-the-box, thus increasing racial disparities in job interviews. More recent studies suggest the policies seem to have <a href="https://www.annualreviews.org/doi/abs/10.1146/annurev-criminol-061020-022137">done little</a> to increase employment for ex-offenders in the private sector.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="ziqXjy">
Deborah Rho, the other economist to study Minneapolis’s fair chance ordinance, suggested outcomes might have been different if Minneapolis had a greater supply of housing, or if the city removed certain barriers to new housing development. “Economic theory would tell us landlords would have less room to discriminate if they were competing with more landlords,” she said.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="5znePJ">
Jeremiah Ellison, the Minneapolis council member who led the push for the city’s law, largely dismissed the conclusion that a tight housing market might be a factor. “That’s a free market solution, like saying the free market will solve racism,” he said.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="p0xr67">
Ellison told Vox he was reviewing the study and planned to meet with the researchers to ask questions, but felt their findings didn’t detract from the policy’s necessity. “From my vantage point, I don’t think they analyzed how the policy works at all,” he said. “And it’s a relatively young policy … it could take many, many years until tenants learn their rights.”
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="cqx8Ou">
Meanwhile, in Seattle, city officials have argued that researchers have found no empirical basis for the claim that a criminal record might indicate a future problematic tenancy or threat. Landlords, for their part, tend to argue that such a relationship exists and that they need to screen tenants’ criminal backgrounds.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="mJHLkP">
Seattle points to two large-scale, rigorous studies that evaluated the efficacy of supportive housing programs that helped people at risk of homelessness, including tenants with criminal histories. One <a href="https://pubmed.ncbi.nlm.nih.gov/15605755/">study found no significant difference</a> between those formerly incarcerated and those never incarcerated in terms of supportive housing program outcomes. Another <a href="https://pubmed.ncbi.nlm.nih.gov/19176417/">found a criminal record was not statistically predictive</a> of failure in supportive housing. The researcher looked at detailed data like a program participant’s specific criminal history, time elapsed since their last conviction, number of prior offenses, and the seriousness of their past offenses, and found none were statistically predictive.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="1fmde0">
Coupled with the research showing how crucial stable housing is for successful re-entry, <a href="http://clerk.seattle.gov/~cfpics/cf_320351e.pdf">advocates have argued</a> these studies “raise important questions about the validity of standards of risk estimation, screening practices and admissions policies related to criminal records in the general rental housing context.”
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="gjrrQ0">
But lawyers representing landlords say the fact that Seattle can only point to relevant studies in the supportive housing context matters. “This is at the heart of our appeal,” Brian Hodges, an attorney with the Pacific Legal Foundation, said. “Seattle is not relying on studies that look at the private rental market, they’re looking at public supportive housing, which are either government-run or NGO housing that provides not just affordable housing but also drug and occupational counseling.”
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="scLP1e">
Some Seattle landlords argue their experience dramatically changed following the passage of the fair chance law, and that denying them the ability to screen applicants makes it impossible to protect other residents and the property itself.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="wEBxe2">
In one amicus brief, owners of a federally assisted building said that after the law took effect, conditions rapidly declined. The number of 911 calls more than doubled, more fights broke out in the lobby, used needles, trash, and feces were left in stairways, and fire alarms were repeatedly set off at night. They cited <a href="https://cdn.vox-cdn.com/uploads/chorus_asset/file/24724730/Pleading_Template.pdf">increased negative reviews</a> online and declines in average occupancy.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="gFIp0C">
When one tenant stabbed his guest in the chest during an argument in November 2019, it was only after the tenant was arrested that managers learned he had several outstanding arrest warrants.
</p>
<h3 id="IHVjhD">
New Jersey has a model that landlords say is reasonable
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="Of8xbk">
In 2021, New Jersey <a href="https://www.njoag.gov/about/divisions-and-offices/division-on-civil-rights-home/fcha/">passed a statewide fair chance housing law</a> with bipartisan support, and with backing from landlord groups. It doesn’t go as far as Seattle’s ordinance in restricting how criminal histories can ultimately be used, but it comes with a strong enforcement mechanism.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="XXqo1r">
The New Jersey Apartment Association, an industry group that represents landlords and housing managers, originally opposed the bill, but eventually endorsed it following a series of amendments. The original version, for example, proposed fines up to $25,000 for a first offense, and the final version landed on $10,000.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="dZBPHi">
David Brogan, the group’s executive director, told Vox that since the law was passed, the real estate industry has had to train staff, reprogram systems, and update old paperwork, materials, and online data. “It’s a process,” he said, “but I have been impressed by how quickly the industry has moved to comply.”
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="Dbg1jV">
With the exception of convictions related to producing methamphetamine and being listed on a sex offender registry, landlords can never ask about an applicant’s criminal history in the first round of applications, and they can only evaluate a criminal record after a conditional housing offer has been made. If a landlord finds a serious crime committed relatively recently, they can withdraw the offer, explaining to the applicant in detail why, and the applicant has the right to appeal it or file a complaint with the state. A housing provider can never rely on arrests that didn’t result in convictions to reject an applicant.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="5x051s">
Brogan said his members think the bill is “reasonable” and “balanced” because people should not be punished for the rest of their lives for something they did years ago, but at the same time, landlords have an obligation to provide safe housing. The balance, he said, was struck by providing liability protection, creating reasonable penalties, and “banning the box” from an initial renter application but allowing it in later inquiries.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="6BZw2I">
“Some fair chance in housing acts in other areas of the country don’t acknowledge the severity of the crime [and] simply ban background checks altogether,” Brogan said. “We felt that was unfair and unsafe.”
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="Zv2MDy">
Williams, of the Fair Share Housing Center, said he’s most proud of the fact that the law puts responsibility for enforcing the rules within the state attorney general’s office, bringing more serious investigative powers than other states and cities had thus far embraced. He thinks his state’s law would be less likely to face the kind of constitutional challenge Seattle is dealing with because it doesn’t abolish the practice of landlords reviewing criminal records entirely; it just moves those reviews to the back end of the process.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="pHEsto">
“There’s no bulletproof piece of legislation, but if it gets challenged, we’re ready,” he said.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="1Q1Vqy">
Hodges, from the Pacific Legal Foundation, said the Seattle landlords he represents are willing to rent to people with criminal records, so long as they’re not violent, cooking meth, or past sex offenders. He suggested the government should provide housing for them, and excluding those kinds of applicants is not discrimination but a “business and property” decision.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="whZgvK">
“Being a criminal is not an inherent characteristic, this is not like race and religion or gender, it’s not a protected class,” he argued, and pointed <a href="http://courts.mrsc.org/appellate/097wnapp/097wnapp0557.htm">to past</a> <a href="https://law.justia.com/cases/washington/supreme-court/1992/57903-2-1.html">court decisions</a> that established landlords’ duty to other tenants to screen for violent crimes. Yet without more ample public supportive options, people with those kinds of backgrounds have nowhere to live.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="WTzBqC">
As for potential unintended consequences, national advocates think that the existence of a housing shortage is not a reason to avoid pursuing more fair chance laws around the country and that the broader fight against racism will need to continue.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="FObG6s">
“Taking away what many landlords are using as a proxy for race helps reveal the underlying race discrimination,” said Tran-Leung, of the National Housing Law Project. “But I don’t think there’s any notion that taking away problematic screening criteria is going to cure it.”
</p></li>
<li><strong>Discrimination everywhere</strong> -
<figure>
<img alt="A black woman’s image is fragmented and distorted in the reflection of a broken mirror." src="https://cdn.vox-cdn.com/thumbor/jf466ff9M2lYwyi7XRHbtrHNuEQ=/240x0:1680x1080/1310x983/cdn.vox-cdn.com/uploads/chorus_image/image/72361875/XiaGordon_cover.0.jpg"/>
<figcaption>
<a class="ql-link" href="https://xiagordon.com/" target="_blank">Xia Gordon</a> for Vox and <a class="ql-link" href="https://capitalbnews.org/" target="_blank">Capital B</a>
</figcaption>
</figure>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom">
Interrogating the true toll of pervasive racism.
</p>
<div class="c-float-left">
<figure class="e-image">
<img alt=" " src="https://cdn.vox-cdn.com/thumbor/0ttVKnNL7z4KLNwe92qagDl_ecE=/800x0/filters:no_upscale()/cdn.vox-cdn.com/uploads/chorus_asset/file/24696943/JuneHighlight_PartnershipLogo.png"/>
</figure>
</div>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="AyE1DX">
For this month’s issue of The Highlight, Vox teamed up with Black-led nonprofit newsroom <a href="https://capitalbnews.org/">Capital B</a> to explore the insidious effects of discrimination on <a href="https://www.vox.com/race">Black Americans</a>.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="bhCyeG">
The collaboration — part of an ongoing partnership with Capital B — was prompted in part by the work of researchers at the University of Chicago, who compiled nearly 50 years’ worth of studies examining bias against Black people in nearly every aspect of modern life, which they shared exclusively with Vox and Capital B.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="xXHhDj">
The researchers, led by Sendhil Mullainathan — one of the scholars behind the seminal resumé discrimination study <a href="https://www.nber.org/papers/w9873">“Are Emily and Greg More Employable than Lakisha and Jamal?”</a> — reviewed decades of studies. They found that discrimination impacts consequential endeavors and mundane tasks alike, from buying a home and applying to <a href="https://www.vox.com/labor-jobs">jobs</a> to looking for a new church and using rideshare apps. Simply put, discrimination is everywhere.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="8M7j1W">
That isn’t news to the tens of millions of Black Americans who experience it daily. But as <a href="https://www.usnews.com/news/health-news/articles/2022-11-16/poll-many-americans-dont-believe-systemic-racism-exists">many white Americans</a> continue to question whether systemic racism exists, these myriad studies offer affirmation, borne out with data in no uncertain terms, of its existence and pervasiveness. This, in turn, has provided us with opportunities to ask: What happens to Black lives when you endure racist acts day in and day out? What solutions exist? And what care can we provide?
</p>
<hr class="p-entry-hr" id="Uu9wp6"/>
<figure class="e-image">
<img alt="A black woman is holding her face in her hands. She is blurry and disoriented, and the background is warped." src="https://cdn.vox-cdn.com/thumbor/iLqNTiGoVbQtF7Ot5aeFgflkBjc=/800x0/filters:no_upscale()/cdn.vox-cdn.com/uploads/chorus_asset/file/24693742/XiaGordon_Lede1.jpg"/> <cite>Illustration by <a class="ql-link" href="https://xiagordon.com/" target="_blank">Xia Gordon</a> for Vox and <a class="ql-link" href="https://capitalbnews.org/" target="_blank">Capital B</a>.</cite>
</figure>
<h3 id="1dnKm8">
<a href="https://www.vox.com/race/23739082/discrimination-racism-black-people-time-juneteenth"><strong>Discrimination isn’t just infuriating. It steals Black people’s time.</strong></a>
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="uesKtY">
Vox analyzed dozens of studies and found that racism adds up in insidious ways.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="LBwnLB">
<em>By Sean Collins and Izzie Ramirez</em>
</p>
<hr class="p-entry-hr" id="yhFydq"/>
<figure class="e-image">
<img alt=" " src="https://cdn.vox-cdn.com/thumbor/HN3SRUURfRHXI0_FakX8DCLRd8c=/800x0/filters:no_upscale()/cdn.vox-cdn.com/uploads/chorus_asset/file/24694918/MeikaE_Cover_05.jpg"/> <cite><a class="ql-link" href="https://www.meikaejiasi.com/" target="_blank">Meika Ejiasi</a> for Vox and <a class="ql-link" href="https://capitalbnews.org/" target="_blank">Capital B</a></cite>
</figure>
<h3 id="mvs1mE">
<a href="https://www.vox.com/health/23744329/weathering-racism-black-health-chronic-stress-cancer-discrimination"><strong>A racist society is detrimental to your health</strong></a>
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="hdnuVm">
From chronic stress to cancer, racial discrimination weathers Black Americans’ lives over time.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="VNosin">
<em>By Margo Snipe</em>
</p>
<hr class="p-entry-hr" id="GEpp7D"/>
<figure class="e-image">
<img alt="The shadow of a mysterious figure appears on the side of a car. Police car lights can be seen in the reflection of the car window." src="https://cdn.vox-cdn.com/thumbor/REl3mntqahfa7yBTSCZy4V709gw=/800x0/filters:no_upscale()/cdn.vox-cdn.com/uploads/chorus_asset/file/24694059/XiaGordon_Lede2.jpg"/> <cite><a class="ql-link" href="https://xiagordon.com/" target="_blank">Xia Gordon</a> for Vox and <a class="ql-link" href="https://capitalbnews.org/" target="_blank">Capital B</a></cite>
</figure>
<h3 id="dvfX5o">
<a href="https://www.vox.com/23735896/racism-car-ownership-driving-violence-traffic-violations"><strong>How cars fuel racial inequality</strong></a>
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="cQAKFG">
Cars can be a source of freedom. They also drive discrimination.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="aucfpr">
<em>By Marin Cogan</em>
</p>
<hr class="p-entry-hr" id="qNYuYh"/>
<figure class="e-image">
<img alt="A Black child and a White child play with blocks. A teacher approaches, clearly focusing on the Black child." src="https://cdn.vox-cdn.com/thumbor/wMRdHaHXMrc3q9iDUH1V7J3XkGg=/800x0/filters:no_upscale()/cdn.vox-cdn.com/uploads/chorus_asset/file/24696806/XiaGordon_Lede3.jpg"/> <cite><a class="ql-link" href="https://xiagordon.com/" target="_blank">Xia Gordon</a> for Vox and <a class="ql-link" href="https://capitalbnews.org/" target="_blank">Capital B</a></cite>
</figure>
<h3 id="E8C68u">
<a href="https://www.vox.com/race/23745609/school-discipline-racial-discrimination-achievement-gap"><strong>We need to rethink discipline in schools</strong></a>
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="42Y3tt">
How school reinforces inequalities between Black children and their peers.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="0VowkU">
<em>By Jonquilyn Hill</em>
</p>
<hr class="p-entry-hr" id="odWxP2"/>
<figure class="e-image">
<img alt="A drawing of a woman looking at a computer with a warning message on the screen." src="https://cdn.vox-cdn.com/thumbor/BsJSO1yjNxwLsQm5wpvKba_ClQI=/800x0/filters:no_upscale()/cdn.vox-cdn.com/uploads/chorus_asset/file/24693608/XiaGordon_Lede5_2.jpg"/> <cite><a class="ql-link" href="https://xiagordon.com/" target="_blank">Xia Gordon</a> for Vox and <a class="ql-link" href="https://capitalbnews.org/" target="_blank">Capital B</a></cite>
</figure>
<h3 id="vYSfU0">
<a href="https://www.vox.com/technology/23738987/racism-ai-automated-bias-discrimination-algorithm"><strong>AI automated discrimination. Here’s how to spot it.</strong></a>
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="SthxHq">
The next generation of AI comes with a familiar bias problem.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="NshQln">
<em>By Abby Ohlheiser</em>
</p>
<hr class="p-entry-hr" id="3djv7Y"/>
<figure class="e-image">
<img alt="A Black woman sits in a hospital bed with a confused look on her face. Mysterious shapes surround her." src="https://cdn.vox-cdn.com/thumbor/cnvTkD_seWV_TezbNEnwauZfrrI=/800x0/filters:no_upscale()/cdn.vox-cdn.com/uploads/chorus_asset/file/24694824/XiaGordon_Lede4.jpg"/> <cite><a class="ql-link" href="https://xiagordon.com/" target="_blank">Xia Gordon</a> for Vox and <a class="ql-link" href="https://capitalbnews.org/" target="_blank">Capital B</a></cite>
</figure>
<h3 id="a9TzxT">
<strong>What’s behind Black women’s excessive rate of fibroids? </strong><em><strong>(coming Thursday)</strong></em>
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="m1hC0b">
Is it chemicals? Diet? Stress?
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="eKA7C2">
<em>By Akilah Wise</em>
</p>
<hr class="p-entry-hr" id="EbAu65"/>
<figure class="e-image">
<img alt="A Black person with their arms crossed and eyes closed, as a police office stands behind them and menacing figures are on either side of them." src="https://cdn.vox-cdn.com/thumbor/1YnRmyxnWcWmp2mB8KfHKVJfz38=/800x0/filters:no_upscale()/cdn.vox-cdn.com/uploads/chorus_asset/file/24698567/Trauma.jpg"/> <cite><a class="ql-link" href="https://www.foursixsix.com/" target="_blank">Carlos Basabe</a> for Vox and <a class="ql-link" href="https://capitalbnews.org/" target="_blank">Capital B</a></cite>
</figure>
<h3 id="1c2ozv">
<strong>How to deal with racial trauma, according to Black experts </strong><em><strong>(coming Friday)</strong></em>
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="x0xLHd">
There’s no cure for the effects of pervasive discrimination, but there are steps you can take to help heal.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="ZdBqdG">
<em>By Kenya Hunter</em>
</p>
<hr class="p-entry-hr" id="ankMeD"/>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="jW6hDf">
<strong>CREDITS</strong>
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="otp9Bk">
<strong>Editors: </strong><em>Vox</em>: Adam Estes, Libby Nelson, Lavanya Ramanathan, Julia Rubin. <em>Capital B</em>: Gavin Godfrey, Dalila Johari-Paul, Simone Sebastian
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="g4FMq5">
<strong>Copy editors/fact-checkers:</strong> <em>Vox</em>: Elizabeth Crane, Kim Eggleston, Tanya Pai, Caitlin PenzeyMoog. <em>Capital B</em>: Neil Cornish
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="l184Rs">
<strong>Additional fact-checking: </strong>Anouck Dussaud, Kelsey Lannin
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="bc91j9">
<strong>Art direction: </strong>Dion Lee, Paige Vickers
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="h6VziQ">
<strong>Audience:</strong> Gabriela Fernandez, Shira Tarlo, Agnes Mazur. <em>Capital B:</em> Charity Scott, Alexandra Watts
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="AcXClY">
<strong>Production/project editors:</strong> Lauren Katz, Nathan Hall
</p>
</li>
</ul>
<ul>
<li><strong>AI automated discrimination. Here’s how to spot it.</strong> -
<figure>
<img alt="A drawing of a woman looking at a computer with a warning message on the screen." src="https://cdn.vox-cdn.com/thumbor/TxuhF6kIyzQMfkbFDk_glBB4JZ4=/240x0:1680x1080/1310x983/cdn.vox-cdn.com/uploads/chorus_image/image/72341902/XiaGordon_Lede5_2.0.jpg"/>
<figcaption>
<a class="ql-link" href="https://xiagordon.com/" target="_blank">Xia Gordon</a> for Vox and <a class="ql-link" href="https://capitalbnews.org/" target="_blank">Capital B</a>
</figcaption>
</figure>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom">
The next generation of AI comes with a familiar bias problem.
</p>
<div class="c-float-left">
<figure class="e-image">
<img alt=" " src="https://cdn.vox-cdn.com/thumbor/uAgAb8-tqMf0lYwl3rPOZldmPq4=/800x0/filters:no_upscale()/cdn.vox-cdn.com/uploads/chorus_asset/file/24691810/JuneHighlight_PartnershipLogo.png"/>
</figure>
</div>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="btNnBX">
<em>Part of the </em><a href="https://www.vox.com/race/23745799/discrimination-racism-university-chicago-studies"><em>discrimination issue</em></a><em> of </em><a href="https://www.vox.com/the-highlight"><em>The Highlight</em></a><em>. This story was produced in partnership with </em><a href="https://capitalbnews.org/"><em>Capital B</em></a><em>.</em>
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="OWKNUh">
Say a computer and a human were pitted against each other in a battle for neutrality. Who do you think would win? Plenty of people would bet on the machine. But this is the wrong question.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="rDrKcl">
Humans created computers, and they design and train the systems that make modern technology work. As these systems are created, the biases of their human creators are reflected in them. When people refer to <a href="https://www.vox.com/2023/4/28/23702644/artificial-intelligence-machine-learning-technology">artificial intelligence</a> bias, this is, in essence, what they are talking about. Like human bias, AI bias, when translated into decisions or actions, becomes discrimination. Like many forms of discrimination, AI bias disproportionately impacts communities that historically or presently face oppression.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="g2yQ9N">
Facial recognition software has a long history of <a href="https://www.vox.com/future-perfect/2019/4/19/18412674/ai-bias-facial-recognition-black-gay-transgender">failing to recognize Black faces</a>. Researchers and users have identified anti-Black biases in AI applications ranging from <a href="https://hbr.org/2019/05/all-the-ways-hiring-algorithms-can-introduce-bias">hiring</a> to <a href="https://www.washingtonpost.com/technology/2022/07/16/racist-robots-ai/">robots</a> to <a href="https://apnews.com/article/lifestyle-technology-business-race-and-ethnicity-mortgages-2d3d40d5751f933a88c1e17063657586">loans</a>. AI systems can determine whether you <a href="https://cdt.org/wp-content/uploads/2023/03/2023-03-17-Civic-Tech-Technology-in-Public-Housing-Agencies-final.pdf">find public housing</a> or <a href="https://www.wired.com/story/algorithms-allegedly-penalized-black-renters-the-us-government-is-watching/">whether a landlord rents to you</a>. Generative AI technology is <a href="https://www.npr.org/sections/health-shots/2023/04/05/1167993888/chatgpt-medicine-artificial-intelligence-healthcare">being pitched as a cure for the paperwork onslaught</a> that contributes to medical professional burnout.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="u8KYuc">
As the capabilities of generative AI tools like <a href="https://www.vox.com/future-perfect/23674696/chatgpt-ai-creativity-originality-homogenization'">ChatGPT</a> and <a href="https://www.vox.com/technology/2023/3/22/23651093/google-bard-ai-chatbot-microsoft-chat-gpt-generative">Google Bard</a> enter the mainstream, the unfair preferences or prejudices that have long plagued AI have remained in place. The effect is all around us, in apps and software you encounter daily, from the automatic sorting of your social media feeds to the chatbots you use for customer service. AI bias also can creep into some of the biggest decisions companies might make about you: whether to hire you for a job, to lend you money for a home, or to cover the cost of your <a href="https://www.vox.com/health-care">health care</a>.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="G85DnV">
The terminology of this technology — AI, algorithms, large language models — can make the examinations of its effects feel highly technical. In some ways, AI bias is a technical issue, one with no easy solution. Yet the questions at the heart of fighting AI bias require little specialized knowledge to understand: Why does bias creep into these systems? Who is harmed by AI bias? Who is responsible for addressing it and the harms it generates in practice? Can we trust AI to handle important tasks that have an impact on human lives?
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="wmrzra">
Here’s a guide to help you sort through these concerns, and figure out where we go from here.
</p>
<h3 id="tVA2zX">
What is AI? What’s an algorithm?
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="I5wHXU">
A lot of definitions of artificial intelligence rely on a comparison to human reasoning: AI, these definitions go, is advanced technology designed to replicate human intelligence, and able to perform tasks that have previously required human intervention. But really, AI is software that can learn, make decisions, complete tasks, and problem-solve.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="iMb5Ez">
AI learns how to do this from a data set, often referred to as its training data. An AI system trained to recognize faces would learn to do that on a data set composed of a bunch of photos. One that creates text would learn how to write from existing writing fed into the system. Most of the AI you’ve heard about in 2023 is generative AI, which is AI that can, from large data sets, learn how to make new content, like photos, audio clips, and text. Think the image generator <a href="https://www.vox.com/technology/2023/3/30/23662292/ai-image-dalle-openai-midjourney-pope-jacket">DALL-E</a> or the chatbot <a href="https://www.vox.com/recode/2023/1/5/23539055/generative-ai-chatgpt-stable-diffusion-lensa-dall-e">ChatGPT</a>.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="hAFdny">
In order to work, AI needs algorithms, which are <a href="https://www.wired.com/insights/2014/09/artificial-intelligence-algorithms-2/">basically mathematical recipes</a>, instructions for a piece of software to follow in order to complete tasks. In AI, they provide the basis for how a program will learn, and what it will do.
</p>
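<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom">As a toy illustration of an algorithm as a recipe, here is a hypothetical scoring rule. The features and weights are invented; the point is only that the software follows a fixed set of instructions.</p>
<pre><code># A made-up "recipe": weigh two features, combine them, rank by the result.
def score(income, debt):
    return 0.7 * income - 0.3 * debt

applicants = [("A", 50_000, 10_000), ("B", 45_000, 2_000)]
ranked = sorted(applicants, key=lambda a: score(a[1], a[2]), reverse=True)
print([name for name, _, _ in ranked])  # ['A', 'B']
</code></pre>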
<h3 id="rSb4zD">
Okay, so then what is AI bias, and how does it get into an AI system?
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="XqAvik">
AI bias is like any other bias: It’s an unfair prejudice or practice present in or executed by the system. It disproportionately impacts some communities over others, and is creeping into more and more corners of daily life. People might encounter bias from a social media filter that doesn’t work properly on darker skin, or in test proctoring software that doesn’t account for the behavior of neurodivergent students. Biased AI systems might determine the care someone receives at the doctor or how they’re treated by the <a href="https://www.vox.com/criminal-justice">criminal justice</a> system.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="WIu2nY">
Bias finds its way into AI in a lot of ways. Broadly speaking, however, to understand what’s happening when an AI system is biased, you just need to know that AI is fundamentally trained to recognize patterns and complete tasks based on those patterns, according to Sasha Luccioni, a researcher on the Machine Learning Ethics and Society team at the open source AI startup Hugging Face. Because of this, she said, AI systems “will home in on the dominant patterns, whatever they are.”
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="jj8dbG">
Those dominant patterns might show up in the training data an AI system learns from, in the tasks it is asked to complete, and in the algorithms that power its learning process. Let’s start with the first of these.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="yitypu">
AI-powered systems are trained on sets of existing data, like photos, videos, audio recordings, or text. This data can be skewed in an endless number of ways. For instance, facial recognition software needs photos to learn how to spot faces, but if the data set it’s trained on contains photographs that depict mostly white people, the system might not work as well on nonwhite faces. An AI-powered captioning program might not be able to accurately transcribe somebody speaking English with a slight foreign accent if that accent isn’t represented in the audio clips in its training database. AI can only learn from what it’s been given.
</p>
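<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom">A minimal, runnable sketch of that failure mode, using synthetic numbers rather than real faces or audio: a single decision threshold learned from data that is 95 percent group A fits group A and misfires on group B. Everything here is invented for illustration.</p>
<pre><code>import random
random.seed(0)

# Two groups produce the same labels, but their feature values sit in
# different ranges. The training set underrepresents group B.
def sample(group, label):
    base = {"A": 0.0, "B": 3.0}[group]          # group-specific shift
    return base + label + random.gauss(0, 0.4)  # label is 0 or 1

train = [(sample("A", y), y) for y in (0, 1) for _ in range(95)]
train += [(sample("B", y), y) for y in (0, 1) for _ in range(5)]

# Learn one global threshold: midpoint of the two class means. Group A
# dominates the data, so the threshold lands where A's labels separate.
zeros = [x for x, y in train if y == 0]
ones = [x for x, y in train if y == 1]
threshold = (sum(zeros) / len(zeros) + sum(ones) / len(ones)) / 2

def predict(x):
    return 1 if x > threshold else 0

for group in ("A", "B"):
    tests = [(sample(group, y), y) for y in (0, 1) for _ in range(500)]
    acc = sum(predict(x) == y for x, y in tests) / len(tests)
    print(group, round(acc, 2))  # roughly 0.88 for A, near 0.50 for B
</code></pre>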
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="oNIrIT">
The data set’s bias might itself merely be a reflection of larger systemic biases. As Karen Hao has explained in <a href="https://www.technologyreview.com/2019/02/04/137602/this-is-how-ai-bias-really-happensand-why-its-so-hard-to-fix/">MIT Technology Review</a>, unrepresentative training data prompts AI systems to identify unrepresentative patterns. A system designed to automate a decision-making process trained on historical data may simply learn how to perpetuate the prejudices already represented in that history.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="ysx7WO">
Even when an AI system’s creator tries to remove bias introduced by a data set, the methods used to reduce it can create problems of their own. Making an algorithm “blind” to an attribute like race or gender doesn’t mean that the AI won’t find other ways to introduce biases into its decision-making process — and perhaps to identify the same attributes it was supposed to ignore, as the Brookings Institution explained <a href="https://www.brookings.edu/research/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/">in a 2019 report</a>. For example, a system that is designed to assess applications for a job might be rendered “blind” to an applicant’s gender but learn to distinguish male-sounding and female-sounding names, or look for other indicators in their CV, like a degree from an all-women’s college, if the data set it’s trained on favors male applicants.
</p>
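<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom">A tiny sketch of that proxy problem, with invented records: the “blind” model never sees the gender field, yet a correlated field lets the historical pattern back in.</p>
<pre><code># Invented history: (college_flag, skill, gender, hired_in_past).
applicants = [
    (0, 7, "M", 1), (0, 5, "M", 1), (0, 6, "M", 1),
    (1, 8, "F", 0), (1, 6, "F", 0), (0, 9, "F", 1),
]

# "Blind" training data: gender removed, but the proxy flag kept.
blind = [(c, s, h) for c, s, _, h in applicants]

def learned_rule(college_flag, skill):
    # A degenerate learned rule: majority vote of past outcomes among
    # records sharing the flag. "skill" ends up playing no role at all.
    hires = [h for c, _, h in blind if c == college_flag]
    return round(sum(hires) / len(hires))

print(learned_rule(1, 9))  # 0: a strong candidate is still screened out
</code></pre>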
<h3 id="jWqwRh">
Have I encountered AI bias?
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="LKonrD">
Probably, yes.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="WqbmpK">
For many Americans, AI-powered algorithms are already part of their daily routines, from recommendation algorithms driving their online shopping to the posts they see on social media. Vincent Conitzer, a professor of computer science at Carnegie Mellon University, notes that the rise of chatbots like ChatGPT provides more opportunities for these algorithms to produce bias. Meanwhile, companies like Google and <a href="https://www.vox.com/microsoft">Microsoft</a> are looking to generative AI <a href="https://www.vox.com/recode/2023/3/4/23624033/openai-bing-bard-microsoft-generative-ai-explained">to power the search engines of the future</a>, where users will be able to ask conversational questions and get clear, simple answers.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="7uX7nc">
“One use of chat might be, ‘Okay, well, I’m going to visit this city. What are some of the sites that I should see? What is a good neighborhood to stay in?’ That could have real business implications for real people,” Conitzer said.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="BJaYrZ">
Although generative AI is just beginning to show up in quotidian technologies, conversational search is already a part of many people’s lives. Voice-activated assistants have already shifted our relationship to searching for information and staying organized, making routine tasks — compiling a grocery list, setting a timer, or managing a schedule — as simple as talking. The assistant will do the rest of the work. But there’s an established bias in tools like Siri, Alexa, and Google Assistant.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="lT4NAG">
Speech recognition technologies have an established history of failing in certain scenarios. They might not recognize requests from people who do not speak English as a first language, or they may fail to properly understand Black speakers. While some people may choose to avoid these problems by not using these technologies, these failures can be particularly devastating for those with disabilities <a href="https://www.scientificamerican.com/article/speech-recognition-tech-is-yet-another-example-of-bias/">who may rely on voice-activated technologies</a>.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="NjrjAq">
This form of bias is creeping into generative AI, too. <a href="https://arxiv.org/abs/2304.02819">One recent study</a> of tools meant to detect the use of ChatGPT in any given writing sample found that these detectors might falsely and unfairly flag writing done by non-native English speakers as AI-generated. Right now, ChatGPT feels like a novelty to many of its users. But as companies rush to incorporate generative AI into their products, Conitzer said, “these techniques will increasingly be integrated into products in various ways that have real consequences on people.”
</p>
<h3 id="j7vqH7">
Who is hurt most by AI bias?
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="uUViIZ">
For a stark glimpse of how AI bias impacts human lives, we can look at the criminal justice system. Courts have used <a href="https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing">algorithms with biases against Black people to create risk scores</a> meant to predict how likely an individual is to commit another crime. These scores influence sentences and prisoners’ ability to get parole. Police departments have even incorporated facial recognition, <a href="https://www.nytimes.com/2020/01/12/technology/facial-recognition-police.html">along with the technology’s well-documented biases</a>, into their daily policing.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="knzFd0">
An algorithm designed to assess whether an arrestee should be detained would use data derived from the US criminal justice system. That data would contain false convictions, and it would fail to capture people who commit crimes but are never caught, according to Conitzer.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="LBZcJR">
“Some communities are policed far more heavily than other communities. It’s going to look like the other community isn’t committing a whole lot of crimes, but that might just be a consequence of the fact that they’re not policed as heavily,” Conitzer explained. An algorithm trained on this data would pick up on these biases within the criminal justice system, recognize it as a pattern, and produce biased decisions based on that data.
</p>
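<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom">Conitzer’s point can be shown with back-of-the-envelope arithmetic. In the invented numbers below, two areas offend at exactly the same rate, but different policing intensity produces very different arrest records, and an algorithm trained on arrests inherits the difference.</p>
<pre><code># Invented rates for illustration only.
offense_rate = 0.05  # 5% of residents offend in BOTH areas
detection = {"heavily_policed": 0.60, "lightly_policed": 0.15}

for area, catch_prob in detection.items():
    arrests_per_1000 = 1000 * offense_rate * catch_prob
    print(area, arrests_per_1000)
# heavily_policed 30.0 vs. lightly_policed 7.5: arrest data "shows" a
# 4x risk gap that does not exist in the underlying behavior.
</code></pre>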
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="HjIH7m">
Obvious AI bias is hardly limited to one institution. At the start of the Covid-19 pandemic, schools relied on anti-cheating software to monitor virtual test takers. That type of software often uses video analysis and facial recognition to watch for specific behaviors it’s been trained to see as potential signs of cheating. Students soon found that virtual proctoring software, intended to enforce academic fairness, didn’t work equally well for all of them. Some popular proctoring programs were <a href="https://www.wired.com/story/student-exam-software-bias-proctorio/">failing to detect Black faces</a> and <a href="https://www.washingtonpost.com/technology/2020/11/12/test-monitoring-student-revolt/">penalizing students </a>who were unable to find a <a href="https://www.theverge.com/2020/10/22/21526792/proctorio-online-test-proctoring-lawsuit-universities-students-coronavirus">stable internet connection</a> and quiet, private space in their home for test-taking. Proctoring software was particularly biased against students with a wide range of disabilities, and it could <a href="https://www.insidehighered.com/news/2021/02/01/u-illinois-says-goodbye-proctorio">spike the anxiety</a> of test takers with some <a href="https://www.vox.com/mental-health">mental health</a> conditions.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="0Mdy4f">
As the <a href="https://cdt.org/insights/how-automated-test-proctoring-software-discriminates-against-disabled-students/">Center for Democracy and Technology</a> has noted, proctoring software could incorrectly flag students requiring a screen reader, those with visual impairment or other disabilities that might cause irregular eye movements, and neurodivergent students who might pace or fidget while taking a test. Some proctoring services do <a href="https://abovethelaw.com/2020/08/law-students-forced-to-urinate-while-being-watched-by-proctors-during-remote-ethics-exam/">not allow for bathroom breaks</a>.
</p>
<h3 id="IBVAv3">
This sounds pretty bad! Is there any solution?
</h3>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="qP6Nyl">
The good news is that AI bias is a problem many people are talking about and working to reduce. The bad news is that not everyone agrees on how to fix this increasingly pressing issue.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="jANmwp">
Sam Altman, the CEO of OpenAI, recently told Rest of World that <a href="https://restofworld.org/2023/3-minutes-with-sam-altman/">he believes these systems will eventually be able to fix themselves</a>: “I’m optimistic that we will get to a world where these models can be a force to reduce bias in society, not reinforce it,” he said. “Even though the early systems before people figured out these techniques certainly reinforced bias, I think we can now explain that we want a model to be unbiased, and it’s pretty good at that.”
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="2zWOaz">
Altman’s solution essentially asks the world to trust the technology to fix itself, in a process driven by the people who created it. For a lot of AI and ethics experts, that’s not enough.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="P3AfWF">
Luccioni, the Hugging Face ethics researcher, used the example of generative AI tools that <a href="https://www.npr.org/sections/health-shots/2023/04/05/1167993888/chatgpt-medicine-artificial-intelligence-healthcare">are supposed to speed up medical paperwork</a>, arguing that we should be questioning whether AI belongs in this space at all. “Say that ChatGPT writes down the wrong prescription, and someone dies,” she says. Note-taking may not take a decade of education to master, but assuming you can simply swap out a medical doctor for an AI bot to speed up the paperwork process removes vital oversight from the equation.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="ddRajE">
An even deeper problem, Luccioni notes, is that there are no mechanisms for accountability when an AI tool integrated into vital care makes mistakes. Companies promising to replace or work in tandem with highly specialized professionals do not need to seek any sort of certification before, for instance, launching a bot that’s supposed to serve as a virtual therapist.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="jjrT17">
Timnit Gebru, a computer scientist with deep expertise in AI bias and the founder of the <a href="https://www.dair-institute.org/">Distributed AI Research Institute</a>, argued recently that the companies behind the push to incorporate AI into more and more aspects of our lives have already proven that they do not deserve this trust. “Unless there is external pressure to do something different, companies are not just going to self-regulate. We need regulation and we need something better than just a profit motive,” she <a href="https://www.theguardian.com/lifeandstyle/2023/may/22/there-was-all-sorts-of-toxic-behaviour-timnit-gebru-on-her-sacking-by-google-ais-dangers-and-big-techs-biases">told the Guardian</a>.
</p>
<p class="c-end-para" data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="dWUVYL">
Conitzer says addressing AI bias requires auditing and transparency in AI systems, particularly those tasked with important decisions. At present, many of these systems are proprietary or otherwise closed to public scrutiny. As the novelty of generative AI tools like ChatGPT fuels a rush to incorporate new systems into more and more of our lives, understanding how to identify AI bias is the first step toward systemic change.
</p>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom" id="vkE7JG">
<a href="https://www.awohlheiser.com/"><em>Abby Ohlheiser</em></a><em> is a freelance reporter and editor who writes about technology, religion, and culture.</em>
</p></li>
</ul>
<h1 data-aos="fade-right" id="from-the-hindu-sports">From The Hindu: Sports</h1>
<ul>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Beach volleyballers Raju and Naresh chase an elusive medal</strong> -</p></li>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>World Cup triumph and Dronacharya award will be the crowning glory for Indian men’s junior hockey coach C.R Kumar</strong> - ‘I have been given a free hand by Hockey India and I have a strong belief that the juniors can win the world event; it has been a recurring dream for me and I will make it happen’</p></li>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>World Test Championship | India to start 2023-25 cycle with WI tour, will also play Australia, England</strong> - India have been finalists in the first two editions of the WTC, losing to New Zealand (2021) and to Australia (2023) in the final</p></li>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Squash World Cup: India gets off to a rousing start by blanking Hong Kong</strong> -</p></li>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Rahane, Shardul rise in ICC Test rankings; Ashwin maintains top spot among bowlers</strong> - Labuschagne continues to hold on to his number-one position with 903 rating points while Smith has advanced one place to second position after scores of 121 and 34</p></li>
</ul>
<h1 data-aos="fade-right" id="from-the-hindu-national-news">From The Hindu: National News</h1>
<ul>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>ICAR-CTCRI implementing ‘rainbow diet’ campaign in Attappady</strong> - Initiative aimed at tackling malnutrition among tribal population</p></li>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>A milestone event for Jewish-Indian Heritage in Israel as foundation stone laid for the Heritage and Cultural Center of Indian Jews</strong> - The multi-million-dollar project, spread across three acres of land, will have a museum, conference and events hall, and an Indian tropical garden, encapsulated in a lotus-stem shaped complex, designed by leading Israeli architects.</p></li>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Here are the big stories from Karnataka today</strong> - Welcome to the Karnataka Today newsletter, your guide from The Hindu on the major news stories to follow today. Curated and written by Nalme Nachiyar.</p></li>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>ED action on T.N. Minister Senthilbalaji as per Supreme Court direction: BJP MLA Vanathi</strong> - BJP spokesperson Syed Zafar Islam also reiterated that the arrest was based on evidence and following the Supreme Court’s critical observation about the case involving the former AIADMK leader, now in the ruling DMK.</p></li>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Khap members, farmers block Rohtak-Delhi NH in Haryana’s Bahadurgarh</strong> - Twenty-five demands had been raised, including justice for wrestlers, a farmers’ loan waiver, a legal guarantee of crop MSP, and enhanced compensation for land.</p></li>
</ul>
<h1 data-aos="fade-right" id="from-bbc-europe">From BBC: Europe</h1>
<ul>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Valery Zaluzhny, the man behind Ukraine’s counter-offensive</strong> - Gen Valery Zaluzhny is Ukraine’s handpicked army boss and mastermind of the unlikely successes over Russia.</p></li>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Dozens killed as migrant boat capsizes off Greece</strong> - At least 59 people drown after their fishing vessel capsizes off the coast of southern Greece.</p></li>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Italy mourns and asks who will succeed Berlusconi</strong> - Italians say goodbye to the four-time PM and ask who will lead his party and run his businesses.</p></li>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>International drugs ring smuggled cocaine in surfboards</strong> - Police disband a gang which stashed surfboards full of cocaine and sent them from Uruguay to Europe.</p></li>
<li data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Austria: Mother accused of attempted murder after locking child in dog cage</strong> - The Austrian mother is being investigated for attempted murder and torture of her 12-year-old son.</p></li>
</ul>
<h1 data-aos="fade-right" id="from-ars-technica">From Ars Technica</h1>
<ul>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Dealmaster: Last-minute savings to celebrate dads and grads</strong> - Gift ideas and deals for Father’s Day and graduation. - <a href="https://arstechnica.com/?p=1947435">link</a></p></li>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>3D muscle reconstruction shows 3.2 million-year-old “Lucy” walked upright</strong> - “Lucy’s muscles suggest that she was as proficient at bipedalism as we are.” - <a href="https://arstechnica.com/?p=1947108">link</a></p></li>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>OpenAI rolls out big chatbot API upgrades for developers</strong> - API updates include 4x larger conversation memory for GPT-3.5 and function calling. - <a href="https://arstechnica.com/?p=1947545">link</a></p></li>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Lead admin of child sex abuse website pleads guilty, faces 20 years to life</strong> - DOJ: Website had “section devoted to the sexual abuse of infants and toddlers.” - <a href="https://arstechnica.com/?p=1947631">link</a></p></li>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Reddit CEO assures employees that API pricing protests haven’t hurt revenue</strong> - Furor “will pass,” Huffman says in internal memo reportedly viewed by The Verge. - <a href="https://arstechnica.com/?p=1947571">link</a></p></li>
</ul>
<h1 data-aos="fade-right" id="from-jokes-subreddit">From Jokes Subreddit</h1>
<ul>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Reddit is killing third-party applications (and itself). Read more in the comments.</strong> - submitted by <a href="https://www.reddit.com/user/JokeSentinel"> /u/JokeSentinel </a> <br/> <span><a href="https://i.redd.it/1j5nee06kx5b1.png">[link]</a></span> <span><a href="https://www.reddit.com/r/Jokes/comments/1490rmv/reddit_is_killing_thirdparty_applications_and/">[comments]</a></span></p></li>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>When you go to church in the morning you say, “Amen.”</strong> -</p>
<div class="md">
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom">
When you go to church in the afternoon you say, “Pmen.”
</p>
</div>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"> submitted by <a href="https://www.reddit.com/user/vedicsun"> /u/vedicsun </a> <br/> <span><a href="https://www.reddit.com/r/Jokes/comments/146sgvx/when_you_go_to_church_in_the_morning_you_say_amen/">[link]</a></span> <span><a href="https://www.reddit.com/r/Jokes/comments/146sgvx/when_you_go_to_church_in_the_morning_you_say_amen/">[comments]</a></span></p></li>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>Did you hear about Apple’s new VR headset?</strong> -</p>
<div class="md">
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom">
They’re called the iGlasses
</p>
</div>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"> submitted by <a href="https://www.reddit.com/user/sheeeeeez"> /u/sheeeeeez </a> <br/> <span><a href="https://www.reddit.com/r/Jokes/comments/1470v9z/did_you_hear_about_apples_new_vr_headset/">[link]</a></span> <span><a href="https://www.reddit.com/r/Jokes/comments/1470v9z/did_you_hear_about_apples_new_vr_headset/">[comments]</a></span></p></li>
<li><p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"><strong>If women want a guy who is taller than them…</strong> -</p>
<div class="md">
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom">
why do they care if he has hair on top of his head?
</p>
</div>
<p data-aos="fade-left" data-aos-anchor-placement="bottom-bottom"> submitted by <a href="https://www.reddit.com/user/2Agile2Furious"> /u/2Agile2Furious </a> <br/> <span><a href="https://www.reddit.com/r/Jokes/comments/146hxfi/if_women_want_a_guy_who_is_taller_than_them/">[link]</a></span> <span><a href="https://www.reddit.com/r/Jokes/comments/146hxfi/if_women_want_a_guy_who_is_taller_than_them/">[comments]</a></span></p></li>
</ul>
<script>AOS.init();</script></body></html>