Facebook has been under intense scrutiny since the Cambridge Analytica scandal, forcing CEO Mark Zuckerberg and his team to take a closer look at how the company handles user data. Although the social media giant doesn’t plan to adopt the General Data Protection Regulation (GDPR) anywhere outside the EU, it has begun making an effort to amend its wrongdoings, most recently by cutting off some APIs from guzzling user data.
Following Zuckerberg’s statement to Congress that the platform would review its data handling practices, Facebook will begin revoking app permissions that allow third-party companies to post on behalf of users unless the application has been approved by Facebook itself. The exact process hasn’t been outlined, but the platform will conduct more robust formal app reviews for those seeking such permission.
Previously, apps could customise the permission prompts that nudge users into handing over access to their data. Facebook will now strip this back to a standardised set of text, making it clearer and more consistent what each application wants from the user.
Users who comment on other people’s content will also be protected, as changes to the Instagram Graph API will prevent developers from grabbing the name and bio of users engaging on the platform.
Developers have until August 1st, when the new policies come into effect, to adjust. Not everyone is happy with the changes, however: leading internet academics have begun warning of the impact these new security measures will have on research and oversight of the platform.
A total of 27 researchers signed an open letter, published on Wednesday 25th April, stating that the changes are “likely to compound the real problem, further diminishing transparency and opportunities for independent oversight.”
“The net effect of the new API restrictions is to lock out third parties and consolidate Facebook’s position as the main analytics and advertising broker,” explains the open letter. “Contrary to popular belief, these changes are as much about strengthening Facebook’s business model of data control as they are about actually improving data privacy for users.”
Alternatively, the experts propose that Facebook work more with the research community to help detect emerging issues that might appear on the platform, as this has proved useful in the past, particularly with the controversy surrounding fake news.
“Unlike the platforms and commercial research companies, universities can be trusted to take an independent perspective and to manage research ethics with great care and nuance: incorrect assessments, overt bias towards the platforms and unethical engagement with social media data would seriously damage their public standing and destroy future careers,” explains Axel Bruns, a signatory of the open letter.
Academic experts won’t go entirely without, as Facebook has invited many to form a commission that will develop a research agenda on the impact of social media on society. Still, there are concerns about just how much harm the unavailability of critical data will do to the research sector.
KitGuru Says: It’s a valid concern, especially given what the researchers do for a living, but Facebook will more than likely prioritise pleasing the public over academics for now. Still, there’s every chance a system could be put in place to ensure that the research sector doesn’t suffer too much from such a large change. Do you think the new policies are enough to protect user data?