The internet has changed how kids learn about sex, but sex ed in the classroom still sucks. In Sex Ed 2.0, Mashable explores the state of sex ed and imagines a future where digital innovations are used to teach consent, sex positivity, respect, and responsibility.
The algorithms that drive products like YouTube, Facebook, and Apple's iOS software share a common challenge: They can't seem to consistently distinguish between pornography and sexual and reproductive health content.
That's because the code engineered to prevent "adult" material from popping up in your timeline or search results can also easily block educational content meant to offer internet users candid, factual information about sex, sexuality, and health.
Critics say the algorithmic confusion may reflect lazy engineering and tech's infamous diversity problem. When the engineers who write code meant to push nudity or porn to the web's margins don't understand or care about the importance of accessing sexual and reproductive health content, especially for LGBTQ youth and other users who've been historically marginalized online, of course algorithms will block the widest possible swath of content. Critics also believe a straightforward solution to this problem exists, but say tech companies aren't interested in addressing their concerns.
SEE ALSO: Sex ed is missing something key for kids who've endured sexual trauma

The online sexual health company O.school reported in October how the iPhone's new software, with the parental control setting enabled, blocked not just its website but numerous entertainment sites and health resources for teens and adolescents. While the filter restricted sites like Teen Vogue and Scarleteen, it didn't deny users access to websites like the neo-Nazi Daily Stormer or the anti-gay Westboro Baptist Church.
That shocking contrast convinced O.school founder Andrea Barrica that Apple's algorithm might just be blocking certain terms, like teen, wholesale in order to prevent any clicks that might possibly send a user to prohibited content (i.e. "teen porn"). Yet Barrica couldn't confirm or dispel her suspicions — or learn anything about Apple's algorithm.
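If that suspicion is right, the failure mode is easy to reproduce. Here's a minimal sketch of a naive keyword filter (the blocklist and matching logic are hypothetical illustrations, not Apple's undisclosed implementation) showing how blocking a term like "teen" by simple substring matching sweeps up Teen Vogue and Scarleteen along with searches for "teen porn":

```python
# A minimal sketch of a naive keyword-based content filter.
# The blocklist and matching logic are hypothetical; Apple has never
# disclosed how its parental-control filter actually works.

BLOCKED_TERMS = {"porn", "xxx", "teen"}  # blocking "teen" wholesale is the problem

def is_blocked(url: str) -> bool:
    """Return True if any blocked term appears anywhere in the URL."""
    normalized = url.lower()
    return any(term in normalized for term in BLOCKED_TERMS)

# Substring matching over-blocks: education sites trip the same rule
# that a "teen porn" query would.
print(is_blocked("https://www.teenvogue.com/"))     # True -- blocked
print(is_blocked("https://www.scarleteen.com/"))    # True -- blocked
print(is_blocked("https://example-hate-site.com"))  # False -- sails through
```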
Barrica used her own network to reach Apple employees in hopes of discussing the situation but was met with silence. Then she published a blog post entitled "Censorship and Sex Ed" with pointed questions for Apple: Who designed the filter? Were parents consulted? Conservatives and religious groups? Doctors and sex educators? What non-porn sites are being blocked?
She never heard from Apple. The company did not comment to Mashable about its algorithms or Barrica's post.
"They’re writing the policies in the most conservative way to avoid the problem."
"They're not targeting sex ed; they’re writing the policies in the most conservative way to avoid the problem," claims Barrica. "Apple’s conservative views on sexuality have so many far-reaching effects."
Barrica isn't the only one who's written an open letter to the platform controlling whether her company's content is seen online. In May 2017, a writer for the menstrual tracking app Clue detailed in a blog post how Facebook blocked the company's ads boosting its sexual and reproductive health content. Educational illustrations that featured vulvas, breasts, and penises were blocked. Ads to promote posts about underwear, birth stories, and puberty advice were also rejected.
A representative for Clue declined to comment on the issue but said the company stands by its 2017 post. A spokesperson for Facebook said Clue's ads ran afoul of its advertising policy on "adult content," which, among other things, forbids ads that include nudity or images focused on individual body parts. Facebook's advertising policies are applied globally and are stricter than its community standards. Clue continues to publish content and advertise on Facebook.
AMAZE, a sex ed video series for adolescents and teens, has faced a similar problem on YouTube. Since its channel launched nearly three years ago, several of its 84 videos have been rejected for advertising because they were deemed to be "adult" content. Those include videos about female and male biological anatomy.
"YouTube advertising is critical to our work at AMAZE because it allows us to reach young people all over the world who are searching for guidance around sex, mental health, and more," Lincoln Mondy, a spokesperson for AMAZE, said in an email.
Though AMAZE is not considered adult content, its videos do include accurate depictions of genitalia and discussions of sexual health. That forthrightness, which is sometimes graphic, could be perceived by an algorithm and human reviewer as violating the platform's policy against advertising adult content that's "non-family safe." YouTube declined to comment on the policies and practices that inform its algorithms.
Tech companies might argue that their algorithms are actually working as designed by flagging content that violates their policies. Yet the fact that a benign illustration of a breast in an educational context, for example, is deemed objectionable gets at a bigger issue.
Part of the challenge facing engineers and tech companies is the reality that sexual health material produced for the internet today is often free of the stigma and shame traditionally associated with talking about sex. Instead of staid explainers that use vague terms and descriptions, this new generation of content asks and answers potentially embarrassing questions, sensitively addresses the diverse concerns of marginalized readers, and is unafraid to use accurate depictions of genitalia, making what once were awkward conversations sound pretty fun along the way.
So engineers who aren't paying attention to this trend, or don't even realize it's happening, are likely to write code that assumes any explicit word or image on the internet is a gateway to porn.
"One of the dynamics is they're not thinking about this as a case at all," says Jon Pincus, a software engineer and entrepreneur who is an adviser to O.school. "Whether it's lazy or overly simplified, my guess is they’re not actually trying to measure if they’re letting legitimate [sexual and reproductive health] stuff in while keeping other stuff out."
Pincus says designing algorithms that perform substantially better than they do today wouldn't be hard. Engineers and the companies that employ them could embrace fairness, accountability, and transparency as guiding principles, particularly because the availability of accurate sex ed information online is a public health issue.
Ideally, companies using machine learning algorithms would train them with words, images, and descriptions of valid sexual and reproductive health information they want to accept, as well as the adult content or pornography they want to reject.
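As a rough sketch of that approach (using scikit-learn, with a tiny hypothetical toy dataset rather than any company's real training data), a classifier shown labeled examples of both categories learns the boundary from context instead of banning individual words outright:

```python
# A toy illustration of training a text classifier on both the sexual health
# content to accept and the adult content to reject. The examples and labels
# are hypothetical; a production system would need large, expert-reviewed
# datasets and would likely combine text, image, and metadata signals.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "illustrated guide to puberty and reproductive anatomy",   # allow
    "how to talk to your doctor about birth control options",  # allow
    "explicit adult videos, hardcore clips",                   # block
    "free xxx pornography site",                               # block
]
labels = ["allow", "allow", "block", "block"]

# Learn word weights from both classes, rather than hard-coding a blocklist.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(texts, labels)

# Classify new material in context.
print(classifier.predict(["educational illustration of breast anatomy"]))
```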
Beyond their philosophical approach, Pincus says tech companies could invite sexual and reproductive health experts to provide feedback on how algorithms are designed, or even hire them to consult. Pincus says that's common practice in the industry when there are no subject matter experts on staff.
Mondy, of AMAZE, agrees with such an approach.
"To us, the only solution involves an intentional partnership between tech giants and sexual health experts when they’re creating algorithms and content blockers," he said. "Tech giants aren’t sexual health experts and shouldn’t make such consequential decisions on what is and isn’t 'age appropriate' when it comes to online information."
"Tech giants aren’t sexual health experts."
Those companies, however, are reluctant to surrender that power and give outsiders influence over their product. When Tumblr announced last week that it would ban adult content, a spokesperson for the company declined to explain the criteria by which its algorithms and human reviewers would distinguish sex ed from nudity or porn but instead noted that "health-related situations" would still be allowed on the platform.
Though the resistance to transparency makes sense given the ruthless competition in Silicon Valley, Barrica believes tech companies have no incentive to endanger major advertising or a spot in Apple's app store by writing more nuanced algorithms that could maximize access to sexual and reproductive health information but potentially let pornographic content slip through the cracks.
"It's really fear-based," she says. "It goes back to lack of inclusion and diversity, and back to stigma."
"There’s so much power to control what people do and don’t see."