Navigating the 'Fog of Data Ignorance'

Click here to watch the TEDTalk that inspired this post.

"Part lies, part heart, part truth, part garbage" -- this quip from R.E.M. guitarist Peter Buck back in the '80s says a lot about the Big Data anxieties Jennifer Golbeck discusses in her new TED Talk. The conflicting feelings people have about big data breed confusion. With apologies to Churchill, it's a conundrum, wrapped in a dilemma, agitated by distrust and evidently correlated with how much you like curly fries.

Having spent a number of years leading a global initiative to strengthen trust, transparency and individual empowerment in the use of personal data, I couldn't agree more with Jennifer about the overwhelming complexity of the personal data landscape. The conundrums she points to are emerging attributes of a hyperconnected world. Whining about the complexity of modern times won't help.

As Senior TED Fellow Eric Berlow notes, we need to get to the other side of complexity -- to simplicity. But getting there will be no simple task. The personal data narrative is conflated, the uncertainties are expanding, and we are all shrouded in what author David Brin calls a "fog of data ignorance." Anxieties are high. The trust deficit is deep. Many have resigned themselves to saying (to crib another R.E.M. lyric), "It's the end of the world as we know it."

And I feel fine. Jennifer's TED Talk is an example of a richer, more nuanced and globally informed dialogue on personal data that is quickly gaining momentum. The energy fueling this ascendant narrative comes from multiple sources: the voices of the most vulnerable and of the global south are being heard in the open data movement; the risks of data harms (at the individual, community and societal levels) are being evaluated and managed in increasingly pragmatic ways; and global research initiatives (PDF) on innovative personal data management systems -- systems that granularly assess the context of use to ensure individuals' consent -- are taking root.

Lots of good stuff is happening, and Jennifer's talk provides a front-line perspective on pragmatic ways to move forward. Most importantly, her talk highlights the need for effective transparency, particularly with respect to how the high priests of data science can glean insights from seemingly innocuous bits of digital exhaust. But as everyone knows, transparency is itself a paradox: too much transparency produces a haze of opacity. Those 30-page end-user agreements we all accept but never read are just one example of opacity through transparency.

Effective transparency is not a one-way mirror that reduces individuals to being spectators on how their data is used. Instead, meaningful transparency requires both inbound and outbound information flows. It requires institutions (commercial and governmental) to listen and act upon the wants and needs of individuals. We need better feedback loops and tools to strengthen the choice and control individuals have over how data about them is used in a variety of contexts. And the burden and risks can't just rest on the shoulders of individuals. The ecosystem needs to pivot even further from being user-centric to being holistically user-centered.

Additionally, beyond the insights Jennifer draws from things individuals actively and intentionally like (such as curly fries) on Facebook, we also need to be aware of the growing volumes of passively observed and inferred data that are continually generated about us -- data that are arguably even more powerful when combined with other data. Beneath our awareness, billions of connected sensors and advanced algorithms are building highly detailed profiles of us and enabling increasingly accurate inferences about our future intentions. Being in the dark about all the raw data that's being collected about us is one thing; being blind to the inferences and probabilities drawn about our own personal futures is another. As scholars Mireille Hildebrandt, Tarleton Gillespie, Nicholas Diakopoulos and Ryan Calo have each separately argued, we need to focus on the accountability of algorithms and on sharing their intended consequences with individuals.

I know that last sentence was a mouthful. Have another curly fry, watch Jen's talk one more time and think about it.
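And for readers who like to see the mechanics, here is a toy sketch of the kind of inference at play. It is not the model from Jennifer's research -- just a generic classifier trained on synthetic data, with every page, person and number invented -- but it shows how a handful of innocuous "like" signals can yield a probability for a trait nobody disclosed.

```python
# Toy sketch only: a generic classifier over binary "like" signals, trained on
# synthetic data. This is NOT the model from Jennifer Golbeck's research; the
# pages, people and numbers are all invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# 1,000 imaginary people and 5 imaginary pages they either liked (1) or not (0).
# Pretend column 0 is "curly fries".
likes = rng.integers(0, 2, size=(1000, 5))

# Invent a hidden trait that happens to co-occur with liking page 0.
trait = ((likes[:, 0] == 1) & (rng.random(1000) < 0.8)).astype(int)

model = LogisticRegression().fit(likes, trait)

# For a new person who liked only "curly fries", the model outputs a probability
# for a trait they never disclosed -- the kind of inference described above.
new_person = np.array([[1, 0, 0, 0, 0]])
print("inferred probability of trait:", round(model.predict_proba(new_person)[0, 1], 2))
```

Swap the synthetic matrix for millions of real profiles and a few hundred thousand pages, and this is roughly the shape of what gleaning insights from digital exhaust looks like in practice.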

We want to know what you think. Join the discussion by posting a comment below or tweeting #TEDWeekends. Interested in blogging for a future edition of TED Weekends? Email us at tedweekends@huffingtonpost.com.

