07 May, 2017

Privacy, bubbles, and being an expert.

Privacy is not the issue.

Much has been said about the risk of losing our privacy in this era of microwaves that can turn into cameras and other awful "internet of things" things. It seems that today there is nothing you can buy that does not both intentionally track your behavior and remain insecure enough to let third parties spy on it as well.

It's the wrong problem.

Especially when dealing with big, reputable companies, privacy is taken quite seriously: there is virtually zero chance of anyone spying on you as an individual. Even with anonymized data, care is taken to avoid singling out individuals. It might be unflattering, but big companies do not really care about you.

What they care about is targeting: being able to statistically know what various groups of people prefer, in order to serve them better and to sell them stuff.

(Image: CV Dazzle)

The dangers of targeting.

Algorithmic targeting has two faces. There is a positive side to it, certainly. Why would a company not want to make its customers happier? If I know what you like, I can help you find more things that you will like, and yes, that will drive sales, but it's driving sales by effectively providing a better service, a sort of digital concierge. Isn't that wonderful? Why would anyone not opt into such amazing technology...

But there is a dark side to this mechanism too: the ease with which algorithms can tune into the cheapest ways to keep us engaged, to provide happiness and rewards. We're running giant optimizers attached to human minds, and these optimizers have access to tons of data and can run experiments on a huge (and quite real) population of samples. No wonder they can quickly converge towards a maximum of the gratification landscape.

Is it right to do so? Is it ethical? Are we really improving the quality of life, or are we just giving out quick jolts of pleasure and engagement? Who can say?
Where is the line, for example, between keeping a player in a game because it's great (for some definition of great), and doing so because it provides compulsion loops that tap into basic brain chemistry the same way slot machines do?

Will we all end up living senseless lives attached to machines that provide for our needs, farmed like the humans in the Matrix?

(Illustration: Ellen Porteus)

I don't know, and to be honest there are good reasons not to be a pessimist. Even a quick look at the history behind us shows that we have had similar fears about many different technologies, and so far we have always come out on top.

We're smarter, more literate, more creative, more productive, happier, healthier, more peaceful, and richer than we have ever been, globally. It is true that technology is quickly making leaps and opening options that were unthinkable even just a decade ago, but it's also true that there is not much reason to think we cannot adapt.

And I think if we look at newer generations, we can already see this adaptation taking place. Even observing product trends, it seems to be becoming harder and harder to engage people with the most basic compulsion loops and cheap content; acquiring users is increasingly hard, and the products that in practice make it onto the market do so by truly offering some positive innovation.

Struggling with bubbles.

Even if I'm not a pessimist though, there is something I still struggle with: the apparent emergence of radicalization, echo chambers, bubbles. I have to admit, this is something hard to quantify on a global scale, especially when it comes to placing the phenomenon in a historical perspective, but it just bothers me personally, and I think it's something we have to be aware of.

I think we are at a peculiar intersection today. 

On one hand, we have increasingly risen out of ignorance and begun to concern ourselves more with the matters of the world. This might not seem to be the case looking at Trump and so on, but it's certainly true if we look at the trajectory of humanity with a bit more long-term historical perspective.

On the other hand, the kinds of problems and concerns we are presented with have increased in complexity exponentially. We are exposed to the matters of the world, and the world we live in is this enormous, interconnected beast where cause and effect get lost in the chaotic nature of interactions.

Even experts don't have easy answers, and I think we know that because many of us are experts in a field or two ourselves, and most big questions, I believe, would be answered with "it depends".
There is a myriad of local optima in the kinds of problems we deal with today, and which way to go is more about what can work in a given environment, with given people, than about what can be demonstrated to be the best direction.

(Image: Echo Chambers)

The issue.

And this is where a big monster rears its head. In a world with lots of content and information, with systems that let us quickly connect to huge groups of similar-minded people, and algorithms that feed us content agreeing with our views, seeking instant satisfaction over exploration, true knowledge, and serendipity, how attractive does the dark side of confirmation bias become when we face increasingly complex issues?

We have mechanisms built into all of us, regardless of how smart we are, that were designed not to seek the truth but to be effective when navigating the world and its social interactions. Cognitive biases are there because they serve us; they are tools shaped by evolution. But is our world changing faster than our brain's ability to evolve?

Pragmatically again though, I don't intend to look too far into the future (which I believe is generally futile, as you're trying to peek past a chaotic horizon). What annoys me is that even when you are aware of all this, and of all these risks, today it's becoming hard to fight the system.
There is simply too much content out there, and so many algorithms around you (even if you try to isolate yourself from them) tuned to spread it to different groups, that finding good information is becoming genuinely difficult.

Then again, I am not sure of the scale of this issue, because, again, if we look at things historically, we are probably still on average better informed today and less likely to be deceived than even just a few decades ago, when most people were not informed at all and it was much easier to control the few means of mass communication available.

Yet, it unavoidably irks me to look around and be surrounded by untrustworthy content and, even worse, content made to myopically serve a narrow worldview instead of trying to capture a phenomenon in all its complexity (either with malice, or just because it's simpler and gets clicks).
Getting accurate, global data is incredibly hard, as it's increasingly valuable and thus kept hidden for competitive advantage.

(Illustration: John W. Tomac)

Being an expert, or just decent.

I find that similar mechanisms and balances affect our professional lives, unsurprisingly. I often say that experience is a variance reduction technique: we become less likely to make mistakes, more effective, more knowledgeable, and able to quickly dismiss dangerous directions, but we also risk becoming less flexible, rooted in beliefs and principles that might no longer be relevant.

I find no better example of these risks than in the trajectory of certain big corporations and how they managed to become irrelevant, not due to any lack of smart, talented people, but because at a certain size a company risks developing a gravity of its own, truly believing in a snapshot of a world that has meanwhile moved on. It is remarkable how so many smart people can manage to be blinded.

Experience is a trade-off. We can be more effective even if we might be more wrong. Maybe, more importantly, we risk losing the ability to discover more revolutionary ideas.
How much should we be open to exploration, and how much should we focus on what we do best? How much should we seek diversity in a team, and how much should we value cohesion and unity of vision? I find these to be quite hard questions.

I don't have answers, but I do have some principles I believe might be useful. The first has to do with ego, and here it helps not to have a well-developed one to begin with, because my suggestion is to go out and seek critique: "kill your darlings".
I was taught this at an early age by an artist friend of mine who was always critical of my work and who, when I protested, made me notice that the only way to get better is to find people willing to trash what you do.

In practice, I think we should be more severe, critical, and doubtful of what we love and believe than of anything else. We should hold our own ideas and social groups to a higher standard of scrutiny than we do things that are alien to us.

The second principle that I believe can help is to encourage exploration, discovery, experimentation, and failure. Going outside our comfort zones is always hard, but facing failure is even harder; we don't like to fail, obviously and for good reasons.
So one cannot achieve these goals without setting up some small, safe spaces where exploration is easier and not burdened by too much early judgment (I would say unconstrained, but certain other constraints actually do help).

Lastly, beware that however much you know about all this, and however willing you are to act, many times you will not. I don't always follow my own principles, and I think that's normal. I try to be aware of these mechanisms, though. And even there, the keyword is "try".

Epilogue: affecting change.

I believe that polarizing, blinding, myopic forces are at work everywhere, in our personal and professional lives and in society at large, and being aware of them is important even just to navigate our world.

But if instead of just navigating the world, one wants to actually affect change, then it's imperative to understand the fight that lies ahead.

The worst thing one can do is feed the polarizing forces: cater to our own and scare away people who might have been willing to consider our ideals. It does not help; it damages.

Catering to our own enclaves, rallying our people, is easy, tempting, and fulfilling. It's not useless, certainly: there is value in reaffirming people who are already inclined to be on our side. But there is much more to be gained in reaching out to someone on the opposite side, or undecided in the middle, and instilling even just a doubt, than in solidifying the beliefs of people who already share ours.

You can even look at current events, elections and the way they are won.

Understanding people who think differently than us, applying empathy, extending reach, is so much harder. But it's also the only smart choice.
