Emerging Media Episode 7: Cultural Analytics

Last Wednesday I delivered a presentation titled "Surplus Futures: Repurposing Archives via New Media Arts" (presentation slides here) to an auditorium a third full of undergraduate students from Loyola University's Art + Design department, as part of a monthly Design Forum speaker series in which local professionals present on their work philosophies or approaches to medium, in order to help students "paint an attainable picture of working as a designer in the South."

The goal of my presentation was to engage a new audience in understanding how the skill sets they were building could be matched with rich historic graphic data: namely, premium-quality, high-resolution scans of newspapers containing text and graphics that can be interpreted, reused, and programmed not only for commercial projects but eventually for data extraction and digital image processing.
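To make that last point concrete, here is a minimal sketch of what such data extraction might look like, assuming the scans are local TIFF files and that the open-source Tesseract OCR engine is installed and available via pytesseract. The directory name is a hypothetical placeholder, not a real archive path, and this is an illustration rather than any pipeline from the presentation.

```python
from pathlib import Path

from PIL import Image
import pytesseract  # requires the Tesseract OCR engine to be installed


def extract_text_from_scans(scan_dir: str) -> dict[str, str]:
    """OCR every TIFF newspaper scan in a directory.

    Returns a mapping of filename -> extracted text, which could then
    feed downstream analysis or creative reuse.
    """
    results = {}
    for path in sorted(Path(scan_dir).glob("*.tif")):
        page = Image.open(path).convert("L")  # grayscale tends to help OCR on old print
        results[path.name] = pytesseract.image_to_string(page)
    return results


# Hypothetical usage: point it at a folder of archive scans.
if __name__ == "__main__":
    texts = extract_text_from_scans("scans/1923_newspapers")
    for name, text in texts.items():
        print(name, len(text.split()), "words extracted")
```

Even a toy pipeline like this hints at why high-resolution scans matter: OCR quality on century-old print rises sharply with the quality of the source image.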

It was very clear to me, looking around the room, that the group was not lacking in cultural diversity, at least as far as the general make-up of the student population appeared. (I was quite impressed by my initial impressions of the group, as I actually had their attention - tools down and all!) I asked the design students what types of materials and techniques they wanted for future projects, and what they saw as inadequate now. The consensus was that they wanted more access to usable historic materials that they could incorporate into their projects. My expectations were exceeded: this younger generation was REALLY INTO the idea of media archaeology, as the practice of creating New Media Art is already fully embedded in their own design practices.

This got me thinking about this week’s readings, and a realization set in: the most imperative step forward is building a new set of metrics for inclusion within the milieu of cultural diversity - a social environment that works in direct opposition to how an echo chamber functions, because inclusion means everybody is at the table in the same discussion, not homophilic enclaves of the same, usual policy makers deciding for all.

In his paper “Critical technocultural discourse analysis” for the journal new media & society, André Brock states that his approach to examining the specific discourses going on via Black Twitter was “born of my frustrations with digital divide research, which operates from the technodeterministic premise that access to the ‘digital’ improves the lives of underrepresented groups.” This is where the rubber meets the road: applying his CTDA to highlight the technoculture of an understudied group via a semiotic approach to understanding meaning. I’m encouraged by this as a reader and investigator of Roland Barthes’ semiotics in decoding early 20th-century imagery from American newspapers (an approach that is just as valid today as it was then).

I took the time to listen to Katrin Tiidenberg’s lecture on Vimeo, “What should cultural analytics be?”, as I have also asked myself this question in dealing, conceptually, with how digital image processing might inform new types of cultural analytics. Even though this is NOT a peer-reviewed journal article, Tiidenberg makes some incredible points and provocations regarding how these concepts might be employed. First she addresses the implications of the politics involved; second, the power dynamics of how that works via methodologies; and finally how all of that expects, or suggests, positivist and inclusive visions. But if there is anything we might all agree on in studying technoculture, it is that the biggest players dominate the digital environment, a dominance laid bare in the so-called “API-apocalypse.” So it’s not as if we have just a Big Brother anymore; big mother, brother, and sister have all taken a stake.

Tiidenberg references GAFA (Google, Apple, Facebook, and Amazon) as the stakeholders here who ultimately have the most to gain or profit via our current system. While “many professionals and scholars see datafication as a revolutionary research opportunity to investigate human conduct,” more critique is needed to fully unpack how “data capital is convertible, in certain conditions, to economic capital” (as Jathan Sadowski argues). While Tiidenberg may be sounding a proverbial alarm, the writing is already on the wall: we want to think of social media companies as neutral providers of platforms, when in fact they are essentially farms for our data.
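Since I keep asking how digital image processing might inform new types of cultural analytics, here is a minimal sketch of one direction: computing simple visual features (mean brightness and saturation) across a corpus of images so that broad visual patterns can be charted. The folder name is again a hypothetical placeholder, and the features are deliberately crude; this is my own illustration, not a method from Tiidenberg's lecture.

```python
import colorsys
from pathlib import Path

from PIL import Image


def visual_features(image_path: Path) -> tuple[float, float]:
    """Return (mean brightness, mean saturation) for one image, each in 0..1."""
    img = Image.open(image_path).convert("RGB").resize((64, 64))  # downsample for speed
    pixels = list(img.getdata())
    brightness = saturation = 0.0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        brightness += v
        saturation += s
    n = len(pixels)
    return brightness / n, saturation / n


# Hypothetical usage: summarize a small corpus of newspaper scans.
if __name__ == "__main__":
    for path in sorted(Path("scans/1923_newspapers").glob("*.tif")):
        b, s = visual_features(path)
        print(f"{path.name}: brightness={b:.2f} saturation={s:.2f}")
```

Crude as they are, features like these are exactly the kind of metric whose politics Tiidenberg asks us to interrogate: what gets counted shapes what gets seen.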

Perhaps the most understated piece of her lecture is the examination of the metaphors and symbolism surrounding the phrase “data is the new oil” - or “data is the new ______,” whatever the placeholder may be. She says: “If the comparison sticks, and everyone starts calling data the new oil, as they kind of have, it will work under the surface not only to reflect, but to influence how we think about data.” I find this observation salient, as I approach much of my reading with the creative mind of a poet - seeing metaphor in much of the way language is used to define the terms in which scholars discuss their respective topics - and I often have problems with how these terms tend to become commonplace, or take on overlapping meanings, as in the case of filter bubbles and echo chambers. Furthermore, the oil metaphor is extended via liquid terms, and thus the implication comes through that data is uncontrollable, when it is not.

Setting our sights on new methods will help us create new environments in which new research can proceed. We grow the onion only to peel apart its layers of information to reveal new insights (with the repercussions that come with the metaphoric tears that may fall while doing so). But the ethical questions must arrive within a more comprehensive and inclusive understanding of how we ultimately thicken that data, and what the thickening does for how we cook it down, or process it, for visual insight. In a way, Tiidenberg is calling for an alternate set of metrics, and while her lecture may not present all the answers, it introduces an idea that simply cannot be ignored: “a truly critical, contextual, and ethical version of data analytics.”

The most exciting thing is that the sun hasn’t set on the possibilities – a new horizon can be set for future endeavors, hopefully aligned with the hopes and dreams for application via the wildest projects possible, conducted by students like the ones I presented to last week at Loyola.

Afterwards:

In searching for tools that might help me better understand the inclusive nature of Media Literacy, I stumbled upon a small list of “Horizon Reports” produced by the now-defunct New Media Consortium (which WAS “an ongoing research project designed to identify and describe emerging technologies likely to have an impact on learning, teaching, and creative inquiry in education”). Although NMC is no more, there is still MUCH to learn from these reports about what is at stake for Media Literacy in various educational settings.

Here is the general 2017 report.

Including a Museum Edition for 2010-2016.

An edition focusing on Academic & Research Libraries for 2014-2017.

And one focusing on K-12 for 2009-2017.

And, an aside:

I didn’t want to ignore the idea of Fake News altogether, and I want to dig deeper (later) into Jack Hamilton’s four pernicious factors of Fake News, as mentioned in his guest column for The Baton Rouge Advocate here.
