AHOSKIE, N.C. — The railroad tracks cut through Weyling White’s boyhood backyard like an invisible fence. He would play there on sweltering afternoons, stacking rocks along the rails under the watch of his grandfather, who established a firm rule: Weyling wasn’t to cross the right of way into the white part of town.
The other side had nicer homes and parks, all the medical offices, and the town’s only hospital. As a consequence, White said, his family mostly got by without regular care, relying on home remedies and the healing hands of the Baptist church. “There were no health care resources whatsoever,” said White, 34. “You would see tons of worse health outcomes for people on those streets.”
The hard lines of segregation have faded in Ahoskie, a town of 5,000 people in the northeastern corner of the state. But in health care, a new force is redrawing those barriers: algorithms that blindly soak up and perpetuate historical imbalances in access to medical resources.
A STAT investigation found that a common method of using analytics software to target medical services to patients who need them most is infusing racial bias into decision-making about who should receive stepped-up care. While a study published last year documented bias in the use of an algorithm in one health system, STAT found the problems arise from multiple algorithms used in hospitals across the country. The bias is not intentional, but it reinforces deeply rooted inequities in the American health care system, effectively walling off low-income Black and Hispanic patients from services that less sick white patients routinely receive.
These algorithms are running in the background of most Americans’ interactions with the health care system. They sift data on patients’ medical problems, prior health costs, medication use, lab results, and other information to predict which patients are likely to need the most care in the future, so that extra services can be directed to them.
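The mechanism behind the bias can be made concrete with a toy simulation. The sketch below is purely illustrative, not any vendor's actual model: it assumes two patient groups that are equally sick, but one has had less access to care and therefore generated lower historical costs. When a program enrolls the top slice of patients ranked by cost, a common proxy for medical need, the under-served group is selected far less often despite identical illness.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical population: two groups with identical underlying illness.
illness = rng.normal(5.0, 1.0, n)
group = rng.integers(0, 2, n)            # 1 = historically under-served
access = np.where(group == 1, 0.5, 1.0)  # assumed access gap

# Unequal access means equal sickness produces unequal spending.
cost = illness * access * 1000 + rng.normal(0.0, 100.0, n)

# Care-management programs often enroll the top decile of
# predicted cost, treating spending as a stand-in for need.
cutoff = np.quantile(cost, 0.9)
selected = cost >= cutoff

# Same illness, very different odds of getting stepped-up care.
print("mean illness, served vs. under-served:",
      round(illness[group == 0].mean(), 2),
      round(illness[group == 1].mean(), 2))
print("selection rate, served vs. under-served:",
      round(selected[group == 0].mean(), 3),
      round(selected[group == 1].mean(), 3))
```

The numbers here are invented, but the structure mirrors the documented problem: because the label being predicted (cost) already encodes unequal access, the ranking reproduces that inequality even though no racial variable appears anywhere in the model.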