According to ‘futurist’ Chris Riddell, the tech world is experiencing a major ‘trust crisis’. At a recent conference, he warned that big brands must rebuild trust with consumers on data safety and use technology to create transparency.
In an interview with CNBC in the US, Riddell, who describes his job as “fundamentally seeing how technology is changing humanity,” said the world is currently experiencing a severe “trust crisis.” “People now are more willing to share data than ever before,” but data breaches at major companies break “trust and confidence,” he stated.
“We’re in an era of category killers, where one organization is dominating in an industry,” Riddell said, pointing to Facebook and Google as examples. The challenge for those businesses is to rebuild trust and use technology to create transparency, he continued.
Social media platform Facebook is currently under fire amid allegations that Cambridge Analytica, a UK private data firm, lied about deleting user data it had improperly obtained in 2015. The firm was later hired by Donald Trump’s 2016 presidential campaign.
In a blog post, Facebook says Cambridge Analytica used data passed to it by the maker of a psychology app, against Facebook guidelines.
Cambridge Analytica had told Facebook it deleted that data, but Facebook says it recently got reports that the data had not been fully deleted. The Trump campaign paid Cambridge Analytica more than $6 million to help it target voters through ads on Facebook.
Facebook said it is investigating an employee’s ties to Cambridge Analytica, the firm accused of violating its terms of service. Joseph Chancellor, a former director of Global Science Research (GSR), currently works as a virtual reality researcher at the social media giant, a Facebook spokesperson told CBS News on Sunday.
The spokesperson said Chancellor’s past work has no bearing on the current work he does at Facebook, but the company is examining the situation.
Cambridge Analytica was reportedly involved in the harvesting of personal data from more than 50 million Facebook users. On Saturday, the company said that it contracted GSR to take on a “large scale research project” in the U.S., but said “no data from GSR was used” as part of services it provided to the Trump campaign.
Facebook also announced Saturday that it was suspending Cambridge Analytica and University of Cambridge professor Aleksandr Kogan for violating the social media company’s standards and practices. Kogan served as a director of GSR during the same time as Chancellor.
A Cambridge Analytica spokesperson said in a statement, “there was no recollection of any interactions or emails” with Chancellor.
According to Facebook, Kogan had built a ‘personality quiz’ app for Facebook users and passed the data it collected on to Cambridge Analytica, which then used it to build ‘psychographic profiles’ of voters. Facebook says that it demanded the acquired data be destroyed, but that several days ago it learned this had not happened.
Lawmakers in both the U.S. and the U.K. are now demanding that Facebook CEO Mark Zuckerberg explain how the data theft occurred and how the company plans to protect consumers.
“A lot of the spotlight is going to be on Cambridge Analytica because it seems like they were being a little deceitful here, but I think we have to look equally critically at Facebook,” Wired senior reporter Issie Lapowsky told CBSN on Saturday.
She continued, “This is just emblematic of such a crucial underlying issue with Facebook: they’ve created this incredibly powerful data operation and sell really robust data to their clients but they have very few mechanisms in place to ensure people aren’t going to abuse that data.”
Cambridge Analytica is the brainchild of its now-28-year-old co-founder turned whistleblower, Christopher Wylie. “All of these pieces of information, put together, create a digital portrait of who you are,” Wylie said.
It was reported that Cambridge Analytica worked for Senator Ted Cruz and President Trump’s 2016 campaigns.
In 2014, a company called Global Science Research (GSR) used Facebook to distribute a personality quiz to analyze whether users were extroverted or neurotic. The company said it was doing it just for research purposes, but it actually harvested the psychological data from all the users and – with their permission – got access to some data on their Facebook friends.
It then sold the data to Cambridge Analytica, which used it to create targeted political advertising. In total, some 50 million Americans may have been impacted.
“It scaled really quickly. We spent over $1 million on it, so it wasn’t cheap, but in terms of the amount of data that was collected, and the quality of that data, it was a rare example of where something was fast, relatively cheap, but high-quality,” Wylie said.
Wylie also told CBS News’ Charlie D’Agata that he’s “taking responsibility” and “owning up” about Cambridge Analytica.
“I take a share of responsibility in this because I was the research director and I worked on this program so I’m going to start by saying I’m taking responsibility and I’m owning up,” Wylie said. “In terms of who else needs to take responsibility: Cambridge Analytica — it funded the program, it approved the program — as an entity this is what ultimately became the foundation of what Cambridge Analytica is.”
Wylie added: “Last week I offered to help Facebook and work with Facebook and their lawyers confirmed that they wanted to work in a collaborative manner — when all of this came out I got banned [from Facebook] — they decided that actually the whistleblower is the person they want to apparently go after.”
Facebook said the use of the data was unauthorised. Since 2015, the company has banned third-party developers from collecting data on users’ friends.
David Carroll, an associate professor at Parsons School of Design, filed a lawsuit against Cambridge Analytica after he said he discovered it had collected data on him. Carroll said he followed the practices of all the presidential campaigns in 2016 and requested his data from the London-based company last year. He believes Facebook should bear some of the responsibility, in line with other Facebook critics who say the company should have done more.
“If they did let data get collected in an illicit manner and didn’t adequately protect it when they learned how it was used, then yes, they are responsible. That’s the deal they make with us, that they protect our data in order for us to use the service,” Carroll said.
Wylie calls the data a “political gold mine.” “If you’re trying to influence an American election, that’s a one-stop shop,” Wylie said.
The more information a political campaign has about you, the more precisely it can target ads. It knows your personality and your interests. It knows your religion and how strongly you believe in it, and it can target ads against those beliefs.