There was a time when new technology automatically equated with progress. Today, that inherent trust has been permanently fractured. As we increasingly see our lives as an assemblage of data points, collected and disseminated by invisible actors behind the scenes of the products we use and the experiences we enjoy, our willingness to believe that Big Tech always has our best interests at heart is dying a swift death.
Now that the warm, fuzzy glow the tech industry once enjoyed has dimmed considerably, people will continue to reassess their relationships with their products and devices, and companies will have to work to earn the trust of savvier, more engaged consumers. For example, ‘always-on’ services, once the height of convenience, are now seen as an addiction and a curse. Global tech giants are being scrutinized as monopolists. Social media is no longer seen as a mirror to society, but as the very reason for its downfall. The same companies we thought would set us free over the last twenty to thirty years are now fundamentally changing the very social fabric of our worlds, and not necessarily for the better. As people become increasingly mindful of how technology affects mental health at both an individual and a societal level, the current economies around Big Tech will require reassessment.
Consumers are already speaking out with their purchases. Eighty-one percent of those surveyed say trust deeply informs their purchasing decisions: they must be able to trust a brand to do what’s right before buying (Edelman). Consumers are also taking back control by reconsidering which platforms they are willing to trust, and the responsibilities placed on those platforms to be trustworthy. In 2019, Facebook CEO Mark Zuckerberg faced criticism for asserting that political ads, even ones containing false information, are protected free speech on the platform (Digital Trends). With 2020 another election year in the U.S., all eyes will be on how social media companies manage the spread of misinformation on their channels (frog Voices).
While the early part of this decade began to surface the darker side of the tech industry, that was just the beginning. Looking forward, we will investigate the dangers of over-reliance on, and under-regulation of, technology even further.
One way we will see this shifting relationship between people and technology manifest is through an emphasis on data management. In the last decade, consumers were trained to exchange their personal data for so-called “free” services like Facebook or a Gmail account. Today, the terms of this trade have broken down because of how complex and pervasive data collection and exchange have become among Big Tech and advertisers. As a result, consumers care about their digital privacy more than ever. For several years, consumer data in the European Union has been tightly regulated, but now that this trend is expanding worldwide, many markets are scrambling to comply. For example, the California Consumer Privacy Act goes into effect in January 2020, but only eight percent of U.S. businesses report being ready (eMarketer).
Now that digital privacy is top of mind for consumers and regulators alike, companies will need to take it seriously, too. Just as businesses have been called out for ‘greenwashing,’ or hiding behind eco-conscious messaging while in reality simply protecting their bottom line, companies will soon be accused of ‘privacy-washing’ if their efforts to protect consumer privacy are revealed to be hollow or meaningless.
In this new era, the currency of trust will also shed light on data ethics. We have long seen people’s willingness to share information in the name of personalized, convenient interactions. Everything from the foods we like to eat to the components of our genetic makeup is now a matter of public data discourse. While our data unlocks innovation in domains ranging from consumer goods to financial services to healthcare, failing to safeguard this information puts people in real danger. Moreover, designing for some and not all based on data profiles alone risks ostracizing people from different cultural and socioeconomic backgrounds. Moving into the 2020s and beyond, this will increasingly be a matter of serious concern.
To avoid exposing or alienating individuals, manipulating the truth, or contributing to the rampant spread of misinformation that the use of our data can enable, companies will need to be thoughtful about how and why they use this information. They’ll need to define and practice real data ethics, or be prepared to face serious societal backlash (Design Mind). Companies, Big Tech and startups alike, will have to design new services and perhaps even entirely new business models that account for the currency of trust. While it may seem difficult to break away from giants that have become ubiquitous in our lives (who goes through a day without using a service from Google, Apple, Facebook or Amazon?), these technologies are but tools. It’s up to people to design the systems and business models those tools work within. And we’re optimistic that a new era of trust will usher in technology and systems that work for people.