I am one of those people who delete web cookies, clear the browser cache, and make sure I am not logged into anything while I search or shop online. For someone working in digital marketing, one would assume I should support these technologies, knowing how useful they are for delivering the right content to the right person. From that perspective I do support them; however, I question the level of targeting currently being deployed and automated to the point of limiting what we get to see and become aware of, even though it is publicly available online.

Relevance

Digital marketing adopted all the techniques we know from traditional marketing, but it raised the bar to new levels when it comes to reporting and measurement. The level of detail we receive from EDMs, online advertisements, and socially integrated digital campaigns is overwhelming. On the other hand, those details, formally known as consumer insights, have helped tailor the Web to be more relevant to casual browsers and online shoppers alike. The question I am asking is: when does relevance become censorship and control over what online users get to see and interact with?

Eli Pariser talks on TED.com about his book "The Filter Bubble," which addresses this very issue. He starts the discussion about relevance from Mark Zuckerberg's point of view and then moves on to why two people in two different locations get completely different result pages when searching for the same topic on Google.com. A marketer would argue that this is smart and produces higher ROI; freedom advocates would say that we are no longer able to choose what we see online and are instead driven by what our search patterns, click habits, web cookies, and geolocation tell the systems of the likes of Google, Facebook, and Amazon to automatically load on our screens.
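To make that idea a bit more concrete, here is a minimal, hypothetical sketch of how signals like click habits and location might be folded into a result-ranking score. The field names, weights, and data are illustrative assumptions on my part, not how Google, Facebook, or Amazon actually personalize results.

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    # Hypothetical signals a personalization system might track.
    clicked_topics: dict = field(default_factory=dict)  # topic -> click count
    location: str = ""

@dataclass
class Result:
    title: str
    topic: str
    region: str
    base_score: float  # non-personalized relevance

def personalized_score(result: Result, user: UserProfile) -> float:
    """Illustrative scoring: boost results matching past clicks and location."""
    score = result.base_score
    score += 0.5 * user.clicked_topics.get(result.topic, 0)  # click-habit boost
    if result.region == user.location:                       # geolocation boost
        score += 1.0
    return score

# Two users issue the same query but see different orderings.
results = [
    Result("Best beach resorts this summer", "travel", "AE", 1.0),
    Result("Election coverage and analysis", "news", "US", 1.0),
]
user_a = UserProfile(clicked_topics={"travel": 4}, location="AE")
user_b = UserProfile(clicked_topics={"news": 4}, location="US")

for user in (user_a, user_b):
    ranked = sorted(results, key=lambda r: personalized_score(r, user), reverse=True)
    print([r.title for r in ranked])
```

The point of the sketch is simply that the same query, filtered through two different behavior profiles, yields two different "relevant" pages, and the user never chose either ordering.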

Behavioral Targeting

I have read a statistical report from eMarketer.com showing that women welcome targeting more than men, and I even wrote a blog piece about being a sitting duck for online advertisements. Back then, what I did not account for is that there is little we can do when the censorship and filtering described in "The Filter Bubble" extend to surrounding us with information we only sort of opted in for. Our online behavior on sites like Facebook.com enables Facebook's content-ranking system to show only posts from friends we usually interact with and hide all the others. Given the number of posts available to an average social person on Facebook, this automated relevance mechanism does not really serve the need to be social, and so even those who welcome online targeting would agree there is a need to draw a line between content relevance and plain content censorship!
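As a rough illustration of the kind of filtering described above, here is a minimal, hypothetical feed filter that hides posts from friends you rarely interact with. The threshold and the interaction counts are assumptions for illustration only; they are not Facebook's actual algorithm.

```python
# Hypothetical sketch: hide posts from friends below an interaction threshold.
# This is an illustration of the idea, not Facebook's real ranking system.

INTERACTION_THRESHOLD = 3  # assumed cutoff: minimum interactions to stay visible

def filter_feed(posts, interactions):
    """Keep only posts whose author the user has interacted with enough.

    posts: list of dicts like {"author": "alice", "text": "..."}
    interactions: dict mapping friend name -> count of likes/comments/clicks
    """
    visible = []
    for post in posts:
        if interactions.get(post["author"], 0) >= INTERACTION_THRESHOLD:
            visible.append(post)
        # Everything else is silently hidden; the user never knows it existed.
    return visible

feed = [
    {"author": "alice", "text": "Holiday photos"},
    {"author": "bob", "text": "New job announcement"},
    {"author": "carol", "text": "Event invitation"},
]
my_interactions = {"alice": 12, "bob": 1}  # carol: no recorded interactions

print(filter_feed(feed, my_interactions))
# Only alice's post survives; bob's and carol's are filtered out.
```

The unsettling part is the silent branch: the posts that get hidden simply never appear, which is exactly where relevance starts to feel like censorship.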