
IR13 Session: “The Ubiquitous Internet”

October 13, 2012


Salford, Greater Manchester, UK · October 18-21, 2012

Session 001: “The Ubiquitous Internet” Friday, 19/Oct/2012: 4:40pm – 6:10pm, Location: 0.11. Session Chair: Anja Bechmann

Looking forward to our session, “The Ubiquitous Internet,” which will include: Anja Bechmann¹, Stine Lomborg², William Dutton³, Grant Blank³, Christine von Seelen Schou⁴, Robert Bodle⁵, Laura DeNardis⁶

¹Aarhus University (DK); ²University of Copenhagen (DK); ³Oxford Internet Institute (UK); ⁴University of Copenhagen (DK); ⁵College of Mount St. Joseph (US); ⁶American University (US)

My presentation is below (full session overview here):

Monetizing Social Media: The conditions for sharing 

Robert Bodle, College of Mount St. Joseph (US)

This presentation looks at the conditions for sharing on Facebook and its more than 300 million partner websites by identifying the tacit agreement between internet companies and users: we get useful and interesting online services in exchange for the disclosure of our personal information. Increased advertising revenue provides powerful incentives for mining user data obtained via social network sites and services. Although people are increasingly concerned about how their information may be used, it is still difficult to get the full picture, which, I argue, is intentional.

To get a fuller picture of the conditions for sharing, I analyze the relationships between Facebook and its third-party advertising ecosystem, drawing on extensive internet industry press coverage, public comments on Facebook’s Developer Blog and by its management team, and Facebook’s public communications in interviews and at trade conferences. I apply a political economy approach (Terranova 2000; Mosco 2009; Wasko & Erickson 2009) to evaluate the conflict of interest between market logic and user needs. Additionally, I apply the progressive and humanistic ideals of liberalism (McChesney 2007) and cross-cultural communication ethics (Ess 2009) to assess the social, cultural, and political implications of personalization.

I provide a current appraisal of Facebook’s human/algorithmic hybridization practices used to personalize the web experience for social advertising revenue. I then look at the intended and unintended consequences of personalization, which include limiting our exposure to different points of view, enabling entrenched political polarization, and discouraging consensus, critical thinking, tolerance of diversity, and appreciation of people’s irreducible differences. Ultimately, this presentation argues for the need to change the conditions for sharing on social network sites (granular control, transparency about how information is used, and regulated security measures to protect our data), and suggests that opt-in defaults become the Internet standard for data-driven advertising practices.


Monetizing Social Media: The conditions for sharing

November 18, 2011

A presentation for the Critical and Cultural Studies Division panel “Voices for Sale: Monetizing Social Media,” National Communication Association, New Orleans, LA (November 2011), organized by Christopher M. Boulton, University of Massachusetts Amherst; with Kathleen M. Kuehn, Christopher Newport University; and James Hamilton, University of Georgia.

Monetizing Social Media: The conditions for sharing, by Robert Bodle, College of Mount St. Joseph

I have been researching, writing, and talking about the conditions for sharing on Google, Facebook, and other online spaces and services for some time,

Mostly interrogating the tacit agreement between Internet companies and users: that we get wonderful applications for free, but not really for free; rather, at a cost.

This cost, as many of us already know, is the disclosure of our personal information (and the sale of that information to advertisers).

But so what? Why does this matter?

I was worried about this question, especially when I first began to cover this tacit agreement in my new media and society classes.

After I covered at length how our information is shared online, many of my students were somewhat concerned but mostly appreciative of the benefits of Google, Facebook, and other online services, regardless of how they function.

Today, I am happy to see that my students are becoming more aware of the conditions of sharing on SNSs without my help: a freshman recently told me that she was repelled by the “creep factor” of FB’s “Real-Time Activity” feature in the upper right-hand corner of her Facebook wall. Hooray!

Recent polls conducted by the Pew Internet and American Life Project indicate that people from all demographic groups are very concerned about their privacy, bearing out my anecdote.

Perhaps this is why rumors are circulating in the industry press that Facebook will start to make its privacy changes opt-in, instead of opt-out by default.

Yet, while we’re becoming more concerned about how our information may be used, it is still difficult to get the full picture, and of course, this is intentional.

To get a fuller picture of the conditions of our sharing, I ask: How are we sharing? What is being shared? What are the market incentives and the consequences?

I cover “the how” at length in another article, Regimes of Sharing.

But quickly: in order to take part in the main currents of social and political life, many of us are participating on social media sites, whether we actually are on these sites or not (e.g., “Liking” an op-ed on NYTimes.com).

And while we are sharing with one another, and moving from site to site, we are also sharing with our social media services, advertisers, and other third-party websites (e.g., NYTimes.com), within and outside of social media spaces. (Technically, this is done with the help of cookies and open APIs.)
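As a rough illustration of that mechanism (a minimal sketch under my own assumptions, with hypothetical names rather than Facebook’s actual code): a partner page embeds a widget served from the social platform’s domain, the browser attaches the platform’s own cookie to that request, and the platform can then tie the page visit back to a logged-in account.

```python
# Illustrative sketch only: hypothetical names, not Facebook's actual code.
# A partner page embeds a social plugin; when the browser fetches it, the
# platform's own session cookie rides along, linking the visit to an account.

SESSION_COOKIES = {"abc123": "user_42"}  # platform session cookie -> account id


def handle_plugin_request(referrer_url: str, cookies: dict) -> dict:
    """Simulate the platform's handler for a plugin loaded on a partner page."""
    account = SESSION_COOKIES.get(cookies.get("session_id"))
    # In practice this visit record would be logged and joined with other data.
    return {"account": account, "page": referrer_url}


# A logged-in user merely views an op-ed on a partner news site:
print(handle_plugin_request("https://news.example.com/op-ed",
                            {"session_id": "abc123"}))
# -> {'account': 'user_42', 'page': 'https://news.example.com/op-ed'}
```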

What was once a surreptitious gathering of our data, without users’ awareness or consent (e.g., Beacon),

is now freely given through social plug-ins, such as the “Like” button.

Whenever we “Like” something using FB’s Like button on websites outside of Facebook.com, we share this preference,

Which is combined with much more personally identifiable data, including a running log that tracks our web browsing session with “date, time and web address of the webpage you’ve clicked to . . . IP address, screen resolution, operating system and browser version” for 90 days.
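To make those quoted fields concrete, here is what a single entry in such a 90-day log might look like (a minimal sketch in Python; the field names and retention check are my own illustration, not Facebook’s actual schema).

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


# Hypothetical record for one entry in the kind of 90-day browsing log
# quoted above; field names are illustrative, not Facebook's actual schema.
@dataclass
class BrowsingLogEntry:
    timestamp: datetime       # date and time of the page view
    url: str                  # web address of the webpage clicked to
    ip_address: str
    screen_resolution: str
    operating_system: str
    browser_version: str

    def expired(self, now: datetime, retention_days: int = 90) -> bool:
        """Entries older than the stated 90-day window would be purged."""
        return now - self.timestamp > timedelta(days=retention_days)


entry = BrowsingLogEntry(
    timestamp=datetime(2011, 11, 1, 14, 30),
    url="https://news.example.com/op-ed",
    ip_address="203.0.113.7",
    screen_resolution="1280x800",
    operating_system="Mac OS X 10.6",
    browser_version="Firefox 7.0",
)
print(entry.expired(now=datetime(2012, 3, 1)))  # True: more than 90 days old
```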

What is not clearly known is how much of this information is being shared with Facebook’s 300 million third-party websites, including powerful yet little-known marketing and data-processing firms like Acxiom Corporation (the largest data-mining company in the world).

Again, the obvious answer as to why this information is shared is the profit motive: our social labor is being converted into financial value for companies, and some have suggested that we should not only get the use of online spaces but also get a cut. I would argue that we should at least have the choice, because

The condition for use on social media sites and cloud services is that we submit to surveillance, monitoring, and targeted advertising for “personalized web experiences.”

My central claim is that we need to change the conditions for sharing (granular control, transparency about how our information is used, and security measures to protect our data).

Yet, we might still ask, “what’s the big deal?”

There are intended and unintended consequences.

Among the intended consequences is that our online history is used to serve relevant and interesting news and information, advertisements, and money-saving deals that coincide with our past interests.

Another intended consequence is that, in turning our information into a product (where we are the product), Facebook takes away what Tavani calls our informational privacy, or “control over one’s daily activities, personal lifestyle, finances, medical history, and academic achievement stored and transmitted over ICTs.”

This loss of control over one’s own information deprives us of the freedom to make informed decisions on our own behalf.

When we lose the ability to control our information, we lose our autonomy and self-determination.

In political economy terms, this loss of control over our information reconfigures social relations between social media sites and individuals

establishing a power imbalance,

where we become more dependent on, and vulnerable to, the SNS, and to intended and unintended consequences such as government access to our data and the implications of a personalized Web . . .

As our information is shared with third parties to serve ads inside and outside of Facebook, it helps to construct

what Eli Pariser calls the Filter Bubble, or the personalization of content,

Through a human/algorithmic hybridization, our past clicks help rank most of what we see online, including:

our friends’ feeds, news stories, and search queries.
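As a toy example of how that ranking could work (my own simplified assumption, not Facebook’s actual algorithm), imagine scoring each feed item by how much its topics overlap with the topics you have already clicked; whatever resembles your past behavior floats to the top.

```python
from collections import Counter


# Toy personalization sketch (my own assumption, not Facebook's algorithm):
# boost feed items whose topics overlap with topics the user already clicked.
def rank_feed(items: list[dict], click_history: list[str]) -> list[dict]:
    clicked_topics = Counter(click_history)

    def score(item: dict) -> int:
        return sum(clicked_topics[topic] for topic in item["topics"])

    return sorted(items, key=score, reverse=True)


feed = [
    {"title": "Op-ed from a viewpoint you rarely click", "topics": ["politics-left"]},
    {"title": "Story like the ones you always click", "topics": ["politics-right", "sports"]},
]
history = ["politics-right", "politics-right", "sports"]

for item in rank_feed(feed, history):
    print(item["title"])
# The familiar story ranks first; the dissenting view sinks out of sight.
```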

This personalization makes us four times more likely to click on a link (FastCompany), which is the intended consequence, but

it also helps “to shape the information diets of its users” (Novey, July 1st, 2011).

The problem is that this process is invisible: we’re not aware it is going on (because we don’t see it happening), and ultimately,

it limits our exposure to different points of view, enabling entrenched political polarization and preventing real consensus, critical thinking, tolerance of diversity, and appreciation of our irreducible differences in society.

My question, then, is this: by the time we all come around to what is happening, when navigating the Web feels like exploring our own subconscious, as in Being John Malkovich, will the conditions for sharing still matter to us?

Thank you.