Facebook megathread

https://www.theverge.com/2019/1/30/18203551/apple-facebook-blocked-internal-ios-apps
Apple has shut down Facebook’s ability to distribute internal iOS apps, from early releases of the Facebook app to basic tools like a lunch menu. A person familiar with the situation tells The Verge that early versions of Facebook, Instagram, Messenger, and other pre-release “dogfood” (beta) apps have stopped working, as have other employee apps, like one for transportation. Facebook is treating this as a critical problem internally, we’re told, as the affected apps simply don’t launch on employees’ phones anymore.

The shutdown comes in response to news that Facebook has been using Apple’s program for internal app distribution to track teenage customers with a “research” app.

That app, revealed yesterday by TechCrunch, was distributed outside of the App Store using Apple’s enterprise program, which allows developers to use special certificates to install more powerful apps onto iPhones. Those apps are only supposed to be used by a company’s employees, however, and Facebook had been distributing its tracking app to customers. Facebook later said it would shut down the app.

This poses a huge issue for Facebook. While Apple provides other tools a company can use to install apps internally, Apple’s enterprise program is the main solution for widely distributing internal apps and services. In an email, a Facebook spokesperson said “I can confirm that this affects our internal apps.”

In a statement given to Recode, Apple said that Facebook was in “clear breach of their agreement with Apple.” Any developer that breaches that agreement, Apple said, has their distribution certificates revoked, “which is what we did in this case to protect our users and their data.” Apple declined to comment on shutting down all of Facebook’s internal apps in an email to The Verge.

Revoking a certificate not only stops apps from being distributed on iOS, but it also stops apps from working. And because internal apps from the same organization or developer may be tied to a single certificate, revocation can lead to immense headaches like the one Facebook now finds itself in, where a multitude of internal apps have been shut down at once.

Apple and Facebook have already been bickering over privacy, but this is the first instance of Apple taking an action that directly shuts down some of Facebook’s activities. Last March, Apple CEO Tim Cook criticized Facebook’s handling of the Cambridge Analytica data sharing scandal, saying, “I wouldn’t be in this situation” if he were running the company. Facebook CEO Mark Zuckerberg later said the comments were “extremely glib” and spoke of Apple as a company that “work hard to charge you more.”
 
So let me get this straight. Zuckerberg is daring the law to fuck him in the ass so hard his offspring's offspring will be born with prolapsed anuses?

He's pulling a Producers. That has to be it. He can't possibly be that fucking dumb.

EDIT: Wait, wasn't he one of the people clamoring for the government to get involved more heavily in Internet shit?
You know what I would do if I owned Facebook and wanted more government censorship of the internet?

Why, I'd force the issue using the company that I own.

Given that Zuck is a sociopath, he's probably got this all planned out
 
Man, Silicon Valley really thinks they can stop Trump a second time. Can't wait for Donnie to fuck em anally, no lube.
 
At first I thought this was fake news because no way this could be real, right? Welp:
"market knowledge of news event"
lmao what the fuck does this even mean
Jesus christ that literally says "it's okay to call for violence against people the Facebook mods don't like".
 
Yeah, I didn't think that policy was going to last long.
 

Facebook updated its community standards to allow users to call for "high-severity violence" against sexual offenders, including death threats.

In the "Do not post" section of its website, Facebook changed its standards in a July update to allow an exception to its "Violence and Incitement" standard for individuals "described as having carried out violent crimes or sexual offenses, wherein criminal/predator status has been established by media reports, market knowledge of news event, etc."
The exception allows users to make "Threats that could lead to death" against alleged violent and sexual offenders. Facebook does not require that the threats be directed at people who have been convicted under criminal law.

Facebook did not return the Washington Examiner's request for comment at the time of publication. In May, the company said: “We’ve always banned individuals or organizations that promote or engage in violence and hate, regardless of ideology."
"The process for evaluating potential violators is extensive, and it is what led us to our decision to remove these accounts," the Facebook spokesperson added.
The standards change by Facebook comes as federal prosecutors charge financier Jeffrey Epstein with sex trafficking and conspiracy.

Southern District of New York prosecutors said Epstein “enticed and recruited, and caused to be enticed and recruited, minor girls" as young as 14 in order to "engage in sex acts with him."
UPDATE:
"We don’t allow credible threats of violence against anyone. We do allow some speech that calls for certain forms of violence, such as calls for the death penalty for criminals or support for military action against terrorists. We have updated our Community Standards to be more clear about this," a Facebook spokesperson said in a statement to the Washington Examiner.
Facebook updated its community standards again, saying the language it added earlier in July was "imprecise."
"The language we previously used to describe our policies against violence and incitement was imprecise. We have since replaced it to more clearly explain the policy and underlying rationale," Facebook said in an update to its "Violence and Incitement" community standards.

"In some cases, we see aspirational or conditional threats directed at terrorists and other violent actors (e.g. Terrorists deserve to be killed), and we deem those non credible absent specific evidence to the contrary," the company said.
 
WTF I love facebook now?

Okay, serious talk, we all know this is going to be used to send death threats to anybody that isn't kissing the left's ass.
 
WTF I love facebook now?

Okay, serious talk, we all know this is going to be used to send death threats to anybody that isn't kissing the left's ass.
It might also give some more incentive to accuse someone of being a sex offender. After all, if you're not gonna get punished for it, why not go for it?
 
"In some cases, we see aspirational or conditional threats directed at terrorists and other violent actors (e.g. Terrorists deserve to be killed), and we deem those non credible absent specific evidence to the contrary," the company said.
Wtf I love Facebook now.
 
I can't help but wonder if this is part of "playing the long game" to get the public complaining to the government about regulation even more than they already are. I see Facebook surviving just fine under a regulated system, but I really don't see them lasting as long otherwise.

Food for thought.
 