After announcing last March that it had banned Cambridge Analytica for misusing user data obtained via the app ‘ThisIsYourDigitalLife,’ Facebook has now banned the myPersonality app.
Used by 4 million people.
The myPersonality app, which was active prior to 2012, was used by around 4 million people, according to Facebook. It is the first app the company has banned since the Cambridge Analytica crisis.
Facebook says there is no evidence that the app accessed users’ friends lists, so it will notify only the users themselves, not their friends.
If that changes, however, and it discovers the app was able to access data beyond the people who used it, it will notify those users’ Facebook friends.
App investigation ongoing, says Facebook.
Beyond banning the myPersonality app, Facebook has suspended more apps as part of its investigations. In May, it announced it had suspended 200 apps for misuse of user data after auditing more than 1,000. That figure has since risen to more than 400 suspended apps.
Facebook’s VP of product partnerships, Ime Archibong, said:
“Since launching our investigation in March, we have investigated thousands of apps. And we have suspended more than 400, due to concerns around the developers who built them or how the information people chose to share with the app may have been used — which we are now investigating in much greater depth.”
Pressure to clean up app platform.
The Cambridge Analytica scandal earlier this year revealed how the ThisIsYourDigitalLife app had been used to exploit users’ data. Since then, Facebook has been under pressure to clean up its app platform. Initially, the company paused all app reviews; it then released a new app review process.
It has also reduced the amount of data available to app developers and given users tools to remove apps from their profiles in bulk.
User privacy and GDPR.
With the EU’s new data protection regulation, the GDPR, coming into effect last May, Facebook has had to re-examine the way it manages user data.
The Media Trust’s CEO, Chris Olsen, says the company’s ongoing investigation into how third parties are using its platform shows that Facebook is responding to consumer outrage over its failure to create a safe environment for users.
“Facebook’s decision to suspend 400+ apps from its platform sends a strong message that app developers and their third parties must meet new security and privacy thresholds and will be subject to stringent audits,” says Olsen. “As data becomes increasingly regulated, companies will need to put together a robust digital vendor risk management program that will enable them to pay close attention to their direct and indirect digital third parties’ activities, ensure third parties align with policies, and terminate activities that violate policies.”
Eyal Katz, a senior marketing manager at Namogoo, agrees with Olsen and says all digital platforms should pay attention to the public scrutiny Facebook has faced over its handling of user data.
“Now, with the implementation of GDPR and increased focus on the protection of personal information, these companies must have a complete understanding of which third-party services are operating on their platforms and more importantly, what those services are doing with user data,” says Katz.
New safety measures.
Facebook has had to significantly increase its user safety and privacy efforts. Earlier this year, it launched a number of related initiatives, including its first ever transparency report and new policies for political and issue-based ads.
The company also said it planned to grow its safety and security teams to 20,000 employees this year; 15,000 have already been hired.
The company has tried to be more transparent as well, sharing how it reviews content and its process for removing posts and accounts from the platform. After taking down 32 Pages for coordinated inauthentic behaviour in July, Facebook announced this week it had removed another 652 Pages, groups and accounts originating in Iran.