As data speeds us forward into the future, is the smarter use of data our best hope for a more prosperous society and for healthier humans and the planet? What kinds of roles will humans have in that future? Are we our own best hope or the weakest link as more and more organizations “think” with data at machine speed?
The start of a new decade prompts a critical question: what role will expanded data collection and data creation, especially from observed data, play in improving products, services, health treatments, and urban solutions, and in solving the world's economic and well-being problems? Increasingly, businesses and governments seem to be at once at odds and in partnership with each other. Driven by the growth of machine learning algorithms and the expansion of artificial intelligence programs in companies and governments, some claim there is too much data collection and creation, leaving consumers feeling vulnerable about their data and their privacy: too much ‘surveillance’. In fact, a majority of Americans report being concerned about the way their data is being used by companies (79%) or the government (64%). Most also feel they have little or no control over how these entities use their personal information, according to a survey of U.S. adults by Pew Research Center that explores how Americans feel about the state of privacy in the nation [1].
It seems we’ve become a bit cynical about data uses. Nearly 75% of consumers think companies are tracking all or nearly all of their behavior online and their movements in the physical world; few, just 18%, think they have control over their personal data or understand why it’s being used [2]. Concerns like these are the root of the California Consumer Privacy Act and of similar efforts across the U.S., inspired by the European General Data Protection Regulation. A novel virtuous cycle of data privacy regulation is emerging, from the EU to California to possible U.S. federal privacy legislation, and then to the rest of the world. We can count on more rules challenging digital workflows and the digital data flows that accompany them.
The benefits of responsible and ethical data use are profound and broad, from community health to inclusive growth. And yet something, often just under the surface, makes us nervous. More than nervous: it’s a crisis of trust. Roughly 80% of Americans think the risks of companies collecting data about them outweigh the benefits [3]. Transparency is central, but so is a public commitment to ethical data practices, tools, and data governance. As a starting point, businesses, or more specifically the people in them, from data analysts to chief data officers, need tools to analyze the data in their own databases, minimize sprawl, and reduce the risk of breach or misuse. We should expect data governance at machine speed. Just 21% of adults say they are very (3%) or somewhat (18%) confident that companies will publicly admit mistakes and take responsibility when they misuse or compromise their users’ personal data [4].
We need to hold companies to account and to a high standard for transparent and ethical uses of data, so that we can gain the benefits of new and novel services, treatments, and products provided through smart data collection and analysis.
It starts with us and our people here at Looker, committed to data ethics and to responsible data tools that reduce risk and enhance trust. People are the ultimate key to data responsibility and data opportunity. They’re the ones making decisions about what to observe and collect, how to analyze it, what inferences to draw, and what actions to take or not take, enabled by state-of-the-art tools and governance.
“At Looker, we believe people are at the core of everything we do. Our commitment to being an ethical and trusted steward of the data entrusted to us is core to who we are.” — Frank Bien, Looker CEO
[1] Pew Research Center, November 2019, “Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information”