(ANTIMEDIA) Women recently expressed concern on social media after learning their iPhones sort some of their pictures into a “brassiere” category. The AI setting is intended to detect a variety of things, including food, dogs, weddings, and cars, but a simple search in the photo app for “brassiere” made some women feel violated.
The image detection feature has been active on iPhones for over a year and recognizes 4,432 keywords, but it garnered attention after a Twitter user posted about it at the end of October:
ATTENTION ALL GIRLS ALL GIRLS!!! Go to your photos and type in the ‘Brassiere’ why are apple saving these and made it a folder!!?!!?😱😱😱😱
— ell (@ellieeewbu) October 30, 2017
The tweet received thousands of retweets and likes and left many women concerned about their privacy.
Despite the creepy implications of artificial intelligence categorizing pictures of women in their bras (I checked my phone, and the results were a bit unsettling) — and the fact that you can’t turn the AI setting off — the photos aren’t automatically shared with Apple.
Unless your phone is set to upload images to iCloud, the photos and their categorizations remain strictly on the individual’s device.
Further, the cataloging of bra shots was not universal. Quartz reported on the story and had its employees check their phones:
“Tests by the Quartz newsroom and others on Twitter confirm that ‘brassiere’ is searchable in Photos—however, results were mixed. One Quartz reporter’s search yielded only an image of her in a dress, skipping over photos of friends at the beach. Another wore a bra as a part of a costume, but the AI didn’t surface those pictures. The AI often included photos of dresses with skinny straps, or sports bras. Others confirmed that the folder had—disconcertingly—worked as intended.”
“For another Quartz reporter, the Photos app catalogued an image of a t-shirt (featured in a past story), which seems to confirm our working theory: It’s looking for shapes that resemble bra straps.”
Even so, the brassiere category on my phone included a picture of me in a tube top, which had no straps at all.
Regardless, The Verge noted an interesting disparity between women’s and men’s undergarments:
“One thing to note here is that while women’s undergarments like ‘bra’ are listed as categories, there’s no mention of men’s boxers or briefs. Clearly someone had to have made a conscious decision to include (or not include) certain categories. Even ‘corset’ and ‘girdle’ are on the list. Where is the same attention to detail for mens’ clothing?”
The Verge also pointed out that Google offers the same feature, but in that case the photos are automatically uploaded to the cloud and stored on Google’s servers. Google’s machine learning photo detection has been around since 2015. As the outlet observed:
“Should the fact that ‘brassiere’ is a category at all be concerning? Or is it more alarming that most people didn’t know that image categorization was a feature at all?”