
Monetizing Social Media: The conditions for sharing

November 18, 2011

A presentation for the Critical and Cultural Studies Division, “Voices for Sale: Monetizing Social Media,” National Communication Association, New Orleans, LA (November 2011), organized by Christopher M. Boulton, University of Massachusetts, Amherst; with Kathleen M. Kuehn, Christopher Newport University; and James Hamilton, University of Georgia.

Monetizing Social Media: The conditions for sharing by Robert Bodle, College of Mount St. Joseph

I have been researching, writing, and talking about the conditions for sharing on Google, Facebook, and other online spaces and services for some time,

Mostly interrogating the tacit agreement between Internet companies and users: that we get wonderful applications for free, but not really for free; they come at a cost.

This cost, as many of us already know, is the disclosure of our personal information (and the sale of that information to advertisers).

But so what? Why does this matter?

I was worried about this question, especially when I first began to cover this tacit agreement in my new media and society classes.

After covering at length how our information is shared online, many of my students were somewhat concerned but mostly appreciative of the benefits of Google, Facebook, and other online services, regardless of how they function.

Today, I am happy to see that my students are becoming more aware of the conditions of sharing on SNSs without my help: a freshman recently told me that she was repelled by the “creep factor” of FB’s “Real-Time Activity” feature in the upper right-hand corner of her Facebook wall. Hooray!

Recent polls conducted by the Pew Internet and American Life Project indicate that people across all demographic groups are very concerned about their privacy, bearing out my anecdote.

Perhaps this is why there are rumors circulating in the industry press that Facebook will start to make its privacy changes opt-in, instead of opt-out by default.

Yet, while we’re becoming more concerned about how our information may be used, it is still difficult to get the full picture, and of course, this is intentional.

To get a fuller picture of the conditions of our sharing, I ask: How are we sharing? What is being shared? What are the market incentives, and what are the consequences?

I cover “the how” at length in another article, Regimes of Sharing.

But quickly: in order to take part in the main currents of social and political life, many of us are participating on social media sites, whether we actually are on these sites or not (e.g., “Liking” an op-ed on NYTimes.com).

And while we are sharing with one another, and moving from site to site, we are also sharing with our social media services, advertisers, and other third-party websites (e.g., NYTimes.com), within and outside of social media spaces. (Technically, this is done with the help of cookies and Open APIs.)
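
To make that concrete, here is a minimal sketch, assuming a hypothetical plug-in request (the function and field names are mine, not Facebook’s actual API): because the Like button is served from the social network’s own domain, the browser attaches that domain’s identity cookie to the request from every page that embeds the button, whether or not the button is ever clicked.

```python
# Illustrative sketch only: the function and field names are hypothetical,
# not Facebook's actual API. It models how an embedded social plug-in can
# pair a third-party page visit with a logged-in identity via the cookie
# the browser automatically sends to the plug-in's home domain.

def load_like_button(page_url, browser_cookies):
    """Simulate what a browser sends when a page embeds a 'Like' button."""
    return {
        "plugin": "like_button",
        "referring_page": page_url,  # the third-party page being read
        "identity_cookie": browser_cookies.get("facebook.com"),  # ties the visit to a user
    }

# A reader who is logged in to the social network visits a news article;
# no click on the button is needed for this request to be made.
cookies = {"facebook.com": "session_token_for_user_12345"}
print(load_like_button("https://www.nytimes.com/some-op-ed", cookies))
```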

What was once the surreptitious gathering of our data, beneath the awareness of users and without our consent (e.g., Beacon),

is now freely given through social plug-ins, such as the “Like” button.

Whenever we “Like” something using FB’s Like button on a website outside of Facebook.com, we share this preference,

which is combined with much more personally identifiable data, including a running log of our web browsing session (“date, time and web address of the webpage you’ve clicked to . . . IP address, screen resolution, operating system and browser version”) that is kept for 90 days.
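
As a rough sketch of the kind of log record that quote describes (the class and field names below are my own; only the listed attributes and the 90-day window come from the quoted policy language):

```python
# Hypothetical record layout; the attributes and the 90-day retention window
# come from the policy language quoted above, everything else is illustrative.
from dataclasses import dataclass
from datetime import datetime, timedelta

RETENTION_WINDOW = timedelta(days=90)

@dataclass
class PluginImpression:
    timestamp: datetime        # date and time of the page view
    page_url: str              # "web address of the webpage you've clicked to"
    ip_address: str
    screen_resolution: str
    operating_system: str
    browser_version: str

    def expired(self, now: datetime) -> bool:
        """True once the record falls outside the stated 90-day window."""
        return now - self.timestamp > RETENTION_WINDOW
```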

What is not clearly known is how much of this information is being shared with Facebook’s 300 million third-party websites, including powerful yet little-known marketing and data-processing firms like Acxiom Corporation (the largest data-mining company in the world).

Again, the obvious answer as to why this information is shared is the profit motive: our social labor is being converted into financial value for companies. Some have suggested that we should not only get the use of online spaces but also get a cut. I would argue that we should at least have the choice, because

The condition for use on social media sites and cloud services is that we submit to surveillance, monitoring, and targeted advertising for “personalized web experiences.”

My central claim is that we need to change the conditions for sharing (granular control, transparency about how our information is used, and security measures to protect our data).

Yet, we might still ask, “what’s the big deal?”

There are intended and unintended consequences.

Among the intended consequences is that our online history is used to serve relevant and interesting news and information, advertisements, and money-saving deals that coincide with our past interests.

Another intended consequence is that in turning our information into a product (where we are the product), Facebook takes away what Tavani calls our informational privacy, or “control over one’s daily activities, personal lifestyle, finances, medical history, and academic achievement stored and transmitted over ICTs.”

This loss of control over one’s own information deprives us of the freedom to make informed decisions on our own behalf.

When we lose the ability to control our information, we lose our autonomy and self-determination.

In political economy terms, this loss of control over our information reconfigures social relations between social media sites and individuals

establishing a power imbalance,

where we become more dependent on, and vulnerable to, the SNS, and to intended and unintended consequences such as government access to our data and the implications of a personalized Web . . .

As our information is shared with third parties to serve ads inside and outside of Facebook, it helps to construct

what Eli Pariser calls the Filter Bubble, or the personalization of content.

Through a human/algorithmic hybridization, our past clicks help rank most of what we see online, including:

our friends’ feeds, news stories, and search queries.
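
To make that ranking idea concrete, here is a toy sketch, assuming a simple topic-matching heuristic (my own illustration, not any platform’s actual algorithm): items whose topics match what a user has clicked before float to the top, which is how a feed can narrow toward past interests.

```python
# Toy sketch only, not any platform's actual algorithm: rank candidate feed
# items by how often their topics appear in the user's click history.
from collections import Counter

def rank_feed(candidates, click_history):
    affinity = Counter(t for item in click_history for t in item["topics"])
    return sorted(candidates,
                  key=lambda item: sum(affinity[t] for t in item["topics"]),
                  reverse=True)

# A user who has mostly clicked on one political viewpoint...
history = [{"topics": ["politics", "viewpoint_a"]}, {"topics": ["viewpoint_a"]}]
feed = [
    {"title": "Op-ed from viewpoint B", "topics": ["politics", "viewpoint_b"]},
    {"title": "Op-ed from viewpoint A", "topics": ["politics", "viewpoint_a"]},
    {"title": "Local news", "topics": ["local"]},
]
# ...sees viewpoint A ranked first and viewpoint B pushed down the feed.
print([item["title"] for item in rank_feed(feed, history)])
```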

This personalization makes us four times more likely to click on a link (FastCompany), which is the intended consequence, but

it also helps “to shape the information diets of its users” (Novey, July 1st, 2011).

The problem is that this process is invisible; we’re not aware this is going on (because we don’t see it happening), and ultimately,

it limits our exposure to different points of view, enabling entrenched political polarization and preventing real consensus, critical thinking, tolerance of diversity, and appreciation of our irreducible differences in society.

My question, then, is this: by the time we all come around to what is happening, when navigating the Web feels like exploring our own subconscious (as in Being John Malkovich), will the conditions for sharing matter to us?

Thank you.
