Facebook breached Canadian privacy law by failing to protect user data and allowing it to be disclosed to British consulting firm Cambridge Analytica without obtaining meaningful consent from users, the Federal Court of Appeal ruled Monday.
The unanimous decision overturned a federal court ruling from last year, in which the lower court said it did not have enough evidence to conclude that Facebook violated the Personal Information Protection and Electronic Documents Act.
“The federal court erred when it premised its conclusion exclusively or in large part on the absence of expert and subjective evidence given the objective inquiry,” the FCA said, adding that the lower court also “failed to inquire into the existence or adequacy of the consent given by friends of users who downloaded third-party apps, separate from the installing users of those apps.
“These are over-arching errors which permeate the analysis with the result that the appeal should be allowed,” the court said.
In a statement, Privacy Commissioner of Canada Philippe Dufresne called the FCA’s decision a landmark ruling that recognizes “that international data giants, whose business models rely on users’ data, must respect Canadian privacy law and protect individuals’ fundamental right to privacy.
“Facebook operates the world’s largest social media network and collects a vast amount of personal information and data about its users,” Dufresne said. “The issues at the heart of this matter are critically important to Canadians and their ability to participate with trust in our digital society.”
A Meta spokesperson told Canadian Lawyer on Tuesday, “We’re disappointed with the court’s ruling. As was confirmed by the federal court’s ruling last year, there is no evidence that Canadians’ data was shared with Cambridge Analytica.”
Justice Donald Rennie authored the decision. Justices Mary Gleason and Nathalie Goyette concurred.
Between November 2013 and December 2015, University of Cambridge professor Aleksandr Kogan ran “thisisyourdigitallife,” a personality quiz app on Facebook’s platform. Through the app, Kogan was able to access the Facebook profile information of every user who installed TYDL, along with their Facebook friends’ profile information. More than 600,000 Canadians had their data exposed to disclosure through TYDL.
In 2015, media reports found that Facebook user data collected through the app had been sold to Cambridge Analytica and a related entity. That data was subsequently used to create “psychographic” models to tailor political messages to Facebook users ahead of the 2016 US presidential election.
Facebook removed TYDL from its platform that year and asked Cambridge Analytica to delete the data it had received. In 2018, media reports found that Cambridge Analytica had not deleted the data as requested, and Facebook suspended the company and Kogan from its platform.
The Privacy Commissioner of Canada filed its federal lawsuit against Facebook in 2020, after concluding in an investigation that Facebook had failed to safeguard user information or obtain valid consent for disclosing the data to third-party apps.
In its decision, the lower court said it could not conclude that Facebook had committed either breach, noting that the privacy commissioner had not used its powers to obtain evidence from Facebook and failed to provide any expert evidence on how Facebook could have acted differently. The court also said there was a lack of subjective evidence about Facebook users’ expectations and understandings of privacy. This led to the court finding “itself in an evidentiary vacuum.”
But the FCA disagreed. “There was, respectfully, considerable probative evidence that bore on the questions before the [lower] court,” Rennie wrote. This included Facebook’s terms of service and data policy, as well as a transcript of testimony from Meta’s chief executive officer stating that he imagined most people did not read or understand either.
Rennie also pointed to evidence showing that nearly half of the app developers who launched their apps on Facebook’s platform had not read the platform policy or terms of service and that Facebook allowed them to continue accessing user data even after the company became aware that their apps were not complying with Facebook’s policies.
Rennie wrote the lower court also erred when it found that subjective evidence was necessary for determining whether Facebook users provided meaningful consent to the company to disclose their information. However, the meaningful consent clauses in PIPEDA “pivot on the perspective of the reasonable person,” Rennie wrote, and “subjective evidence does not play a role in an analysis focused on the perspective of the reasonable person.”
“It was the responsibility of the [lower] court to define an objective, reasonable expectation of meaningful consent. To decline to do so in the absence of subjective and expert evidence was an error,” the FCA said.
The Facebook friends of users who downloaded the TYDL app were never given the opportunity to consent to their information being disclosed to third parties, Rennie noted, so the only conclusion the lower court could have reasonably made was that Facebook failed to get consent from these friends, violating PIPEDA in the process. “To the extent this evidence was acknowledged by the federal court, it made a palpable and overriding error in its conclusion that there was no breach of PIPEDA,” Rennie said.
However, even the Facebook users who installed the TYDL app did not give the company meaningful consent to disclose their data, the FCA said. Likening the size of Facebook’s terms of service and data policy to “the length of an Alice Munro short story,” Rennie said “apparent clarity can be lost or obscured in the length and miasma of the document and the complexity of its terms.”
The policy, therefore, does not “amount to meaningful consent to the disclosures at issue in this case,” Rennie wrote.
Noting that “an organization can be perfectly compliant with PIPEDA and still suffer a data breach,” Rennie concluded that Facebook nonetheless failed to safeguard user data. “The unauthorized disclosures here were a direct result of Facebook’s policy and user design choices,” he wrote.
“Facebook invited millions of apps onto its platform and failed to adequately supervise them. The federal court failed to engage with the relevant evidence on this point, and this was an error of law.”