Connectivity Cognizance: A Social Media Strategy

(To WRT 232 readers: Sorry, but this final post for class is going to be unsatisfying to read. I have been distracted by my 16-page historiography on the Black Death. However, I enjoyed having class with you all! Best of luck.)

https://sites.google.com/a/oakland.edu/jason-harris/home

As the final requirement for a course I took on Writing for New Media, I was asked to develop a forward-looking strategy for my online social media presence. Writing for New Media was essentially an introductory course in Web authoring, though not in the technical sense of the term. Since the course was designed by the Writing and Rhetoric department at Oakland University, the emphasis was on social media theory and digital literacy, not desktop publishing (such as HTML coding or graphic design). The intention, in the end, was to understand how to develop and maintain an academic or professional Web presence. My Google Sites website above was made for that express purpose. While I am aware that some of the interests and cultural references linked through this site may not be agreeable to all parties, it is my hope that I project an honest digital representation of who I am and aspire to be. Part of accomplishing this is realizing that my social networks need to be appropriate for everyone I can possibly imagine viewing them.

In addition to introducing new concepts, Writing for New Media made me more sensitive to the fact that we are all interconnected through new media. New media platforms are trendy and multiform in today's world, and they allow for social networking and instant user feedback. I realized that, if I were on even just one social media website, my digital footprint would extend far and wide across the Internet, making it easy for others to find me and view what I publish online. Crucial to this understanding is that whatever I do or say on social media is bound to make an impression on others, including people who could play a pivotal role in my future. Therefore, maintaining a respectable image is very important. It is my goal, then, to reflect an academic, professional, and productive image of myself wherever a digital representation of me exists online.

Here (on Google Sites), all of my social media accounts are joined together into one central hub. Having all of them in one place underscores my confidence regarding my Web presence. I have not been worried about my online presence in the past, but before taking Writing for New Media, I was not as cognizant of how many connections social media facilitates. Based on how I present myself, these connections can be fruitful, or they can inhibit my progress in the long run. I have decided that the former option is more desirable.

On the sidebar of my Google Sites page are all of my social media accounts that are currently in use. They are all popular and likely to reach a large number of people. Here, I will outline my intended use for each platform and suggest ways to maximize its potential:

  1. Facebook – From the start, I have intended for Facebook to be a social network where I connect and share with anyone who is present in my day-to-day life. These are people with whom I hold an informal relationship: family, friends, or acquaintances.
  2. GrizzOrgs – GrizzOrgs is Oakland University's network for student organizations, and I am hoping it enriches the rest of my experience at OU. Having a way to connect with other students and faculty will hopefully lead to new activities, opportunities for leadership and personal development, as well as meaningful, lasting friendships.
  3. LinkedIn – LinkedIn, obviously, is for professional use. Wherever my degree takes me, I am hoping to connect with my supervisors, colleagues, and references through LinkedIn. There is no better way to be endorsed and readily reviewed by prospective employers.
  4. Twitter – Twitter was a site that I never anticipated using. But through my perfunctory use of Twitter during my time in Writing for New Media, I realized how effective a simple "tweet" could be when it came to sending out reminders and notices for things like events, cancellations, etc.
  5. WordPress – WordPress is where the bulk of my substantive content will go, including essays, papers, and other serious intellectual endeavors. Several of my peers have encouraged me to keep up my blog, Digital Téchnē, and I am grateful to them for this encouragement and interest in my writing.
  6. YouTube – With YouTube, I can create video presentations in the future for either school or work. YouTube gives me the opportunity to push my creative capacity, and perhaps to learn some video editing software down the road in order to showcase a media-rich portfolio.

Changing Literacies: A “Manifesto” in Five Narrowly Construed Parts

[1] Essentialism:

In Plato's Phaedrus, Socrates declares that speech bears a closer relation to thought than writing does. Like any uncompromising tutor, Socrates insists that Phaedrus recount his morning conversation with the orator Lysias on the subject of love without any textual aid. When Phaedrus makes to produce his written notes from that morning's conversation, Socrates admonishes him for doing so. Such an admonishment may have been disconcerting to Phaedrus, but as Socrates reasons, recounting things as they happened could only benefit his working memory.

Later in the same dialogue, Socrates relates to the young Phaedrus a curious legend regarding the Egyptian king Thamus. The god Theuth had come to Thamus with the invention of writing, so that the Egyptian people could enjoy this gift from their beneficent king. However, Thamus concluded that his people would be much better off without writing, because such an invention would not improve their memory but do the opposite. As Thamus responds: "The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth." Socrates thus famously rejected the "technological" innovation of writing. Writing was to remain secondary to the methods and insights of face-to-face dialogue (i.e., the Socratic method).

Ironically, Western civilization has taken its literate heritage from Greek philosophy, even though Plato intended writing to be secondary to logically constructed thought and speech. For both Plato and Socrates, genuine knowledge consisted of what was unchanging and eternal. For instance, there could be no "true" knowledge of a particular river, but only of the pure Idea (or eidos) of "river." From this literary heritage – written down in words no less – models and laws were affixed to the processes of learning, a process later aided by the Enlightenment. This has done much to separate man from the natural world (it has been argued that Plato's conception of eidos, as popularly received, has contributed profoundly to civilization's distrust of nature as well as of bodily, sensorial experience; see the quotation below), but it has also led (perhaps unintentionally) to an essentialist view of literacy in education. This is the idea that there is only one meaning (one eidos, if you will) and that the goal of education is to convey that meaning across cultures.

“. . . the process of learning to read and to write with the alphabet engenders a new, profoundly reflexive, sense of self. The capacity to view and even to dialogue with one’s own words after writing them down, or even in the process of writing them down, enables a new sense of autonomy and independence from others, and even from the sensuous surroundings that had earlier been one’s constant interlocutor. The fact that one’s scripted words can be returned to and pondered at any time that one chooses, regardless of when, or in what situation, they were first recorded, grants a timeless quality to this new reflective self, a sense of the relative independence of one’s verbal, speaking self from the breathing body with its shifting needs. The literate self cannot help but feel its own transcendence and timelessness relative to the fleeting world of corporeal experience.” From The Spell of the Sensuous by David Abram

[2] A word on pedagogy:

Traditional pedagogy adopted essentialism not because of its philosophical antecedents alone, but also because essentialism went hand in hand with the whole idea of national progress. In Ken Robinson's words, education was "conceived in the intellectual culture of the Enlightenment and in the economic circumstances of the Industrial Revolution." Public education was no doubt revolutionary for its time and place, but society and culture have changed dramatically in the last sixty years, and education has lagged behind. Even today, education is based on league tables, standardized test scores, and examinations. The standardization of public education is done to achieve a kind of capital outlay, so to speak (the proper term is performance pay), with returns so poor that a teacher's salary often suffers. With advancements in technology, this drive to maximize efficiency has been given a new edge. Indeed, the emphasis on standard expectations readily identifies individuals who are not performing "efficiently" in comparison with their peers. Furthermore, if the fundamental purpose of education is "to ensure that all students benefit from learning in ways that allow them to participate fully in public, community, and economic life," then even the New London Group is distracted from the current reality. It is impossible to predict with accuracy what things will look like publicly, communally, and especially economically in the future:

“The purpose of education is often said to be to equip young people with skills ‘for the real world’, as if there were – in some dimension of genuine reality – such a thing, to be known so indubitably that we could stipulate what skills might be required to deal with it, rather than, inevitably, interpretations of and perspectives on it.” In “Poststructuralism, Postmodernism, and Education” by Richard Smith

[3] Singularity:

Marc Prensky is responsible for coining the term "digital native," which has been used in the work of John Palfrey and Urs Gasser (see my previous post). While laments over the decline of the US education system issue forth from teachers, political pundits, school board members, etc., Prensky steps back to make an obvious claim: "Today's students are no longer the people our educational system was designed to teach." The rapid takeover of digital technology in every area of life has led to a "singularity": an event after which new trends are no longer simply reformulations of the old. In understanding nascent technologies, it is often the young who are leading the way, while adults anxiously trail behind. In education, models of learning remain immutable because curricula have been devised by digital immigrants (teachers with little experience of computers beyond email clients or academic domains) who "know better." Indeed, the student who grew up on technology can no longer maintain interest in static lesson plans with no imagery or graphics. We are often disparaged by digital-immigrant instructors who converse in an academic argot that is not in tune with our memes, colloquialisms, and digital lexicon. This often leads to a "those kids" mentality on the part of instructors, who give up on helping students because students "just don't care to engage."

Still, there may be good reason for teachers to be so disparaging. Prensky notes that "today's students think and process information fundamentally differently from their predecessors." There is nothing wrong with processing information differently, per se. But there is evidence to suggest that digital natives are negatively shaped by emerging technologies, both biologically and neurologically. For instance, the renowned neuroscientist Susan Greenfield believes that the "modern" world is altering our human identity. With the influence of emerging technologies, Greenfield notes that "[a]ttention spans are shorter, personal communication skills are reduced and there's a marked reduction in the ability to think abstractly." More startling is the prevalence of technology use among those just beginning to develop into social, reciprocating persons. The effects of technology on developing brains evidently go deep. It is no wonder, then, that digital natives engage so masterfully with new media, even if it is at the expense of their memory and cognitive capacity:

“We could be raising a hedonistic generation who live only in the thrill of the computer-generated moment, and are in distinct danger of detaching themselves from what the rest of us would consider the real world.” – Susan Greenfield

[4] Post-structuralism:

In 1979, Jean-François Lyotard published The Postmodern Condition. In it, Lyotard states that "the miniaturisation and commercialisation of machines is already changing the way in which learning is acquired, classified, made available, and exploited. . .  The old principle that the acquisition of knowledge is indissociable from the training (Bildung) of minds, or even of individuals, is becoming obsolete and will become ever more so." These are, I think, rather prescient comments about the influence of computers on learning and knowledge in a postindustrial and postmodern society. We have already seen how schooling was predicated on industrialization (and, before that, on a classical Greek variant of essentialism), but when new digital media takes over the classroom, the first shift in any pedagogical organization comes in the form of a linguistic paradigm of mediation. The research team at the Upper Yarra Community House has demonstrated in its New Literacies Project that being literate now means being able to "decode written text, understand and compose meaningful texts, use texts functionally and analyze texts critically." Essentially, digitally written communications are now characterized by morphing sets of graphemes, spellings, hypertexts, etc. In other words, new semiotic systems are evolving every day, and it is becoming increasingly important to rein them in so as to analyze and interpret their content.

In her book Life on the Screen, Sherry Turkle notes that "the mechanical engines of computers have been grounding the radically nonmechanical philosophy of postmodernism." She relates an anecdote in which a student of hers did not understand Jacques Derrida's theory of deconstruction until an early Macintosh computer showed him stacks of hypertext. With hypertext, the student was suddenly made aware of the multiplicity of possible meanings hidden behind a single text. For the purposes of this essay, it will be best not to get too caught up in post-structural concepts. But the mission of education in the 21st century, as outlined by The New London Group and others, is to strip away the monocultural and monolingual monopoly on texts. This makes room for diversity, creativity, and individuality. As Richard Smith puts it, "Postmodern knowledge refines our sensitivity to differences and reinforces our ability to tolerate the incommensurable." These new measures, however, are still interconnected with the technological.

In some theoretical circles, there is an interesting discussion about the end of postmodernism, which I view as very positive but have no hope of seeing realized, because the discussion involves radical ethics that do not comport well with current political reality. Still, what I think schools lack is a proper critical distancing from technology, not just in the Philosophy department but in all departments (I am not even going to broach K-12; suffice it to say that I think primary school should be based on creative play and fundamental learning, whereas secondary school should be more focused on abstract thinking, with neither mediated by technology). My class on digital writing began with figures like Howard Rheingold, Guy Debord, Michel Foucault, and Jean Baudrillard, so for my purposes I am relatively pacified. But I still perceive an implicit acceptance of technology in computer-oriented classrooms, despite some critical engagement with theorists.

“The ideology of those who desire to be wired and who see the Internet as the experimental grounds for allegedly heterogeneous experiments with alternative subject-positions is integral to the political economy of bio-technological capitalism.” – Rosi Braidotti

[5] Psychê:

I have earnestly written about my interaction with new digital media elsewhere, so I don't think I need to say much more about my personal feelings toward emerging technologies. I am cautious about, if not downright dismissive of, most technology. This is mainly due to my interest and reading in the philosophy of self, continental phenomenology, and deep ecology. I view the techno-capitalist world in which we live as inherently destructive, and I have a hard time reconciling myself to the idea that technology is meant to be an integral part of human experience. When it comes to social media, I echo figures like Noam Chomsky, who maintain that social media conditions people to think and act superficially, with little or no serious thought or reflection. Still, I am not ignorant of the benefits technology has brought to society. Participatory new media websites are quite different from the top-down mass media of which intellectuals like Chomsky disapprove. For instance, the pressure put on Mubarak during the Arab Spring offers a counterexample to media critics about the power of social networking. However, restrictive regimes are still finding ways to "pull the plug," so to speak, on a united and democratic citizenry. This returns the entire conversation to critical theory, in which whatever is panoptic in any governmental system needs to be defined and checked for abuses.

In roughly 370 BC, Plato defined the soul as the core essence of any living being. The process of writing can, at best, capture only elements of one's own "soul" or personhood. What one puts down in writing today, unless reflected on ad infinitum, will mostly be lost or fragmented in the days that follow. Indeed, what is written down will have to be continually revisited by the writer (the rare savant excepted) in order to grasp its full weight and meaning. If this is true, then Socrates was not all that insane. In fact, I think he was correct to suggest that writing is only an aid to reminiscence. In other words, writing is highly self-reflexive and has nothing to do with essences or what makes a person who they really are. I think that is why the great writers have often been lost to themselves. Franz Kafka, for instance, burned 90 percent of his manuscripts. Virginia Woolf, Ernest Hemingway, and Hunter S. Thompson all chose to kill themselves. At any rate, writing is, of course, a skill that has a lot of value and virtue in the arts. Digital writing can have the same value, so long as digital literacy is taught as an adjunct to traditional forms of literacy. But to foster a truly enlightened citizenry, individuals need to find happiness in the skills they learn, and those skills need to be relevant socially, publicly, and economically. That is why I like Ken Robinson so much: he understands the need for creativity in developing a productive, vivacious society.

So much of the University experience is not productive or vivacious. It is rote and unappealing. As an undergraduate in the arts, I am often lectured on Enlightenment ideals. This is no surprise, since I am a history major. But the lectures my professors give are not the problem; it is the delivery of those lectures that leaves something to be desired. It may seem, based on what I have written here, that I am all for purely lecture- and discussion-driven teaching. However, I think this really only works in close-knit discussion groups on subjects where individuals share a mutual passion. Given the preceding discussion on digital natives, University teachers should give visual lectures just as much as verbal ones. Only one of my four history professors thus far has decided to do this. The others have disparaged media in the classroom (and the students who have dared to use it). As a digital native (albeit a strange one), I feel that I can do better teaching myself, and so I focus as much as I can on reading what interests me. Still, the standards are set, and the drudgery continues. Maybe a little bit of video games to take my mind off things?

“I have maintained a passionate interest in education, which leads me occasionally to make foolish and ill-considered remarks alleging that not everything is well in our schools. My main concern is that an over-emphasis on testing and league tables has led to a lack of time and freedom for a true, imaginative and humane engagement with literature.” – Philip Pullman

A Pedagogy of Multiliteracies: Designing Social Futures

"A strong sense of citizenship seems to be giving way to local fragmentation, and communities are breaking into ever more diverse and subculturally defined groupings. The changing technological and organizational shape of working life provides some with access to lifestyles of unprecedented affluence, while excluding others in ways that are increasingly related to the outcomes of education and training." In A Pedagogy of Multiliteracies: Designing Social Futures

The central issue raised by The New London Group in A Pedagogy of Multiliteracies (published in the Harvard Educational Review in 1996) is the idea that teaching traditional literacy is no longer relevant in today's world (even as far back as '96) because of changing cultural, institutional, and global realities. Traditionally, a monolingual and monocultural pedagogy was taught, based on canonical English. This sort of curriculum was championed because it enforced the status quo. When The New London Group published their "programmatic manifesto," computerized networks had not yet spread across the various levels of social strata (the middle-class citizen was not yet a Granovetterian "node" interconnected with a widening social network). Individuals were taught a precise curriculum that was very limited in scope, breadth, and knowledge. This was done in order to prepare individuals for a disciplined work life in which hierarchical structure predominated. It is easy to entertain here images of those one-room schoolhouses where kids were chastised by ruler-slapping pedagogues. But since the development of "fast capitalism," the workplace has moderated, becoming more centered on the concept of "a workplace culture in which the members of an organization identify with its vision, mission, and corporate values."

If we recall Benjamin Barber's insights in his seminal article "Jihad vs. McWorld," we can see that this notion of "fast capitalism" has ballooned into the global marketplace. The very issues that The New London Group malign are reawakened in the commercial sphere. So now we have a global capitalist model that is both monocultural and monolingual (through branding slogans, advertisements, etc.). I think this may be what the authors refer to as "new systems of mind control or exploitation" in the new workplace discourse. Therefore, the new school pedagogy of multiliteracies should be "authentically democratic" and focused on "meaningful success for all. . . success that is not defined exclusively in economic terms and that has embedded within it a critique of hierarchy and economic injustice." This feeds into the twin goals of student participation and critical thinking.

On the subject of changing public and private lives: The New London Group's idea is that ever-increasing local diversity necessitates diversifying public education. It used to be that children from ethnic or multilingual households would learn English at primary school while speaking their native language at home. As far as I'm concerned, this worked fairly well, and it placed these children in a good position to be cultural mediators later in life. But this call for a pedagogy of multiliteracies seems problematic to me, simply because of the way it deconstructs the language-learning process. I can understand the desire to promote cultural pluralism, so as to reduce the likelihood and dangers of xenophobia in schools and society. But this goal of creating a pedagogy of multiliteracies will – in my estimation – create a linguistic hodgepodge that ends up destroying cultural and historical identities. The New London Group bring up this issue themselves, noting that "differences [appear] as evidence of distressing fragmentation of the social fabric." However, as far as I can tell, they fail to answer the crucial question of how to prevent that fragmentation, so I am having a hard time seeing how The New London Group planned to implement their strategy.

Don't get me wrong, I think the post-structural aspect of literacy pedagogy is important. Indeed, literacy should not be a static discipline, because it is always changing with the differing realities of communities' social structures, which now engage with new literacy practices on social media that are often subcultural and diverse. In the past, literacy practices were characterized by strict grammar rules that every school child had to adhere to, and this is a microcosm of what curriculum in general was traditionally designed to achieve: enforcing monolingual and monocultural discourse. However, I still have trouble coming to terms with the thought that multiliteracy could morph into a linguistic potpourri of different cultures and diverse communities. I don't think we should marginalize any ethnic groups in our communities, but I don't think we should meld them into one single discourse either. I mean, I think Guillermo Gómez-Peña is great as a performance artist, but his use of language, which is a mash-up of English-Spanish-Chicano-whatever, would be hard for me to accept as the standard lingua franca.

So, how do schools adapt to changing cultural, institutional, and global realities? That's the main question here and, well… at a certain point I just gave up on the dense program that The New London Group was putting forward. Instead, I watched an old TED talk by Ken Robinson (probably the funniest TED talk ever), which may have been a better use of my time to begin with. But it is obvious that literacy pedagogy should no longer be a narrowly conceptualized exercise of learning a standardized language and applying it to the traditional mode of reading and writing. The textual, for instance, no longer means simply reading something and having a cognitive reaction to it. In this new pedagogy, the textual is related to the visual, the audio, the spatial, the behavioral, and so on; it encompasses a lot more than it used to. That is why design elements are necessary, and after a cursory reading of the article's section on design, I felt relatively positive about the approach because it appeared to be multimodal, which is more conducive to creativity. One of Robinson's main points is that "we are educating people out of their creative capacities." And what happens when students no longer feel they have a creative or meaningful role in their educational process? They become disillusioned, apathetic, and cynical on the whole. The New London Group determined in 1996 that disparities in educational outcomes did not seem to be improving, and they certainly have not improved today, as almost all evidence suggests that the American educational system is mediocre at best.

Constructing an Identity: New Media and “Becoming-other”

New Media platforms are trendy and multiform in today's world. No matter where you go on the Web, if you stumble upon something newsworthy (read: current), you are likely able to share it on a variety of linked social media websites. Here I have already invoked the term 'social media,' which I think can be used synonymously with new media, although the term 'New Media' can mean different things to different people, depending on the media's function. For example, individuals in jobs like graphic design, marketing, advertising, or interactive branding may all use specialized new media platforms to achieve certain goals. But for the casual majority of Internet users, New Media technology is simply an apparatus for engaging in social experiments on the Web. I say 'experiments' because it is not necessary to interact with people you know on social media sites. With this type of new media, you can engage with just about anyone, from writers and pop-culture fans to politicians and celebrities. This has tremendous implications with respect to the nature and purpose of the public sphere, where Internet technology and social media networks have brought people together in dynamic new ways.

I personally came to New Media as a teenager, eleven years ago. In 2002, engaging with electronic communications media was not yet the mentally fraying experience I will argue it later became. I started out innocently enough, creating a chat handle on AOL Instant Messenger. Now, while I don't characterize stand-alone client software like AOL, Yahoo, or even Skype as social media (because they are a two-way street and not subject to the whims of an interactive public), AOL Instant Messenger (AIM) was my first step into the virtual world of identity construction. My screen name was redragon720, a lame portmanteau of sorts followed by my birth month and day. My AIM icon was a logo of a recurring dragon motif used in the artwork of my favorite band at the time. Believe it or not, this was my first experience on a strictly communications-based network. Of course, I had communicated with people online before, but this was usually in game rooms and multiplayer channels for online computer games. AIM was great because I could communicate with newly formed friends at a new high school, which perhaps facilitated the process of forming friendships at a tender age. However, it wasn't long until I realized that being on the Web didn't mean simply chatting with friends. With AIM, you could craft a very primitive profile that allowed chat "buddies" to see how you described yourself. I remember doing all of the typical things: listing my favorite bands and linking to my favorite websites. Indeed, this primitive profile-tweaking made me realize that one could showcase one's interests online in neatly formatted rows and columns that appealed to my sense of symmetry.

I don't recall if I was hungry for online identity construction, or if I came upon blogging by pure chance, but as a sophomore in high school I created a profile on the social networking site Xanga. Immediately – as if real-world friends weren't reliable enough sieves to filter my more antsy, existential problems of youth – I sifted through Xanga to find rapport on the Web, joining groups self-described as "Bookish" or trying to find sympathizers in groups with titles like "I Hate my Hometown." That last should reflect my once-restless status as a marginalized, socially choked teenager. However, my Xanga never saw any substantive content, even though I did spend considerable time changing the appearance of my profile, picking up some rudimentary web design skills along the way. After sophomore year, I realized that a lot of my confidants at school were using LiveJournal, so I switched platforms, as I was more interested in partaking in a close-knit social group.

Junior year of high school was an interesting one. As quickly as the year started, I began to exhibit signs of major depressive disorder, and I plummeted during this very inopportune time. Junior year, of course, is when kids start to amp up their school performance in preparation for college, either on their own or under pressure from their superiors. With little interest in following suit, I began to compulsively use LiveJournal in an attempt to come to self-knowledge in a world that did not make any sense to me. My LiveJournal account has since become a hideous curiosity, as it is the only testament to my teenage years that still exists online. That year, I sacrificed my academic cares, my inhibitions, my beliefs, my real-life relationships, and a lot of body weight to the twin demons of emotional turmoil and dysphoria. My output on LiveJournal was both excessive and insane, as I devoted myself to introspection online through a process of creative writing, poetry, and self-loathing. At the same time, however, while I was trying hard to carve out what has recently been called an "(e)dentity" by Stephanie Vie (which for me was an explicit search for my own soul), I was also posting on several message boards related to my favorite music groups. Furthermore, I was active on a ProBoards forum a friend set up, where a group of twenty or so of us talked about books, religion, and philosophy, and engaged in a nerdy role-play in which each of us added a fantasy narrative onto a friend's preceding post (I shouldn't have to mention this, but the goal was literary, not sexual…). However, this was a peripheral activity for me, overshadowed by how much time I spent on LiveJournal.

Activity on LiveJournal died out heading into my senior year of high school. I think this was due to the advent of MySpace, though I couldn't understand why friends were making the transition. MySpace looked stupid to me, and I didn't engage with it until – if I remember correctly – my homecoming date persuaded me to make a profile. It is perhaps important to note that, up until this time (as well as during and after), I was hopelessly inept when it came to forging meaningful relationships, let alone romantic ones. But a new girl had come to my school during my year subsumed into LiveJournal, and thanks to some fortuitous events at a mutual friend's graduation party that summer, we both looked forward to each other's presence the following year at school. However, a month of school went by with me completely self-absorbed and this girl growing impatient with my taciturn behavior. As homecoming loomed closer, I talked to my date via MySpace and by revisiting AOL Instant Messenger (this time with a new screen name). Throughout, I was becoming acutely aware of how different online socialization was from real-life socialization. It was easy to create a digital persona, and to feel safe in that persona both as a crafted image and as a vessel for communication, but it was radically different from the threat of communicating in person. Therefore, MySpace (as LiveJournal had been before it) was just as existentially jarring to me as it was liberating. Here was somewhere I could be myself without the fear of rejection or alienation. I could experiment and channel ideas through my mental framework, which felt empowering, but it was all essentially a facade for my undeveloped real self. Needless to say, homecoming was an abysmal failure, and so, for the most part, was dropping out of school (even though I graduated at the same time as my peers).

LiveJournal comment
By now, the point of my story with social media should be obvious. Life on the screen as an inwardly dispossessed teenager had its consequences. After high school, it was documented that I had post-traumatic stress disorder, moderate-to-severe depression, and severe social and generalized anxiety (among other things). External factors largely determined these results, but I maintain that obsessive preoccupation with social media – used for introspective purposes – should be given equal weight. For instance, on LiveJournal, a "friend" whom I didn't even know told me, "You have to be the most introverted person I know." For me, this epitomizes one of the defining traits of social media. Oftentimes, comments from "Others" serve as feedback loops reinforcing perceptions of ourselves, perceptions which may not intrinsically be true. On Facebook and Twitter (and wherever else), individuals are led to believe in a certain version of themselves, based on their digital persona and how others respond to that persona. In Alone Together, Sherry Turkle finds reason for concern in this phenomenon, maintaining that our connectivity culture destroys the opportunity for proper maturation.

As a youth, I poured so much of my undeveloped self into the computer that the line between reality and the hyperreal began to blur. For example, I recall sitting in school thinking about how I could explicate some "deep" insight about what was going on around me if only I were on LiveJournal. For me, this reliance on social media in order to come to terms with reality was only good for causing an existential rut, one which I think many young people are in owing to technology. Indeed, as John Palfrey and Urs Gasser state in Born Digital: "Digital Natives live much of their lives online, without distinguishing between the online and the offline." When I first read this quote, I had a hard time understanding how digital technology could be so immersive and abstracting that it diminished the concept of identity, forcing it to mean the same thing in both spaces. But after combing my past, I realize that it is an easy psychological deception. Without doubt, I lost myself to the social frivolities of New Media when I was younger. Clawing out of this rut was only possible after high school, when I had more room to breathe and when the social connections that kept me appealing to the Web in the first place suddenly disappeared.

My experience with New Media has definitely shaped my attitudes toward technology today. Having subsumed myself in the virtual world for nearly two years, I was left with almost nothing approximating reality. Lost friendships felt like betrayals, instead of what they really were: self-deprivation and alienation brought on by my own failure to pursue real social relationships. Furthermore, being without goals after high school created an aimless state of recklessness. New intrigues were discovered in the forms of alcohol, smoking, bitterness, and escapism into music (a singular devotion to black metal further galvanized feelings of bitterness and misanthropy). At any rate, after repeated failures and a considerably lengthy stint in cognitive behavioral therapy, the search for identity was redirected into more positive channels. The virtual was relegated in favor of more active interests in the real world, and I learned to tame all of this bedeviled technology by deflecting the hold it can exert on the fundamental needs of mind and body.

I now use social media to reflect interests that are predominantly formed without the influence of technology. Furthermore, I have begun to use New Media to unplug. After Christmas of 2009, I met my fiancée through Zoosk, a romantic social network and online dating service. Admittedly, for a while, I was embarrassed that Zoosk was the only way I could find intimacy and understanding in the "real world," as if my relationship were predicated on some technological ersatz that reflected an inherent shortcoming on my part and therefore devalued the relationship. But I soon realized that this was unfair both to myself and to my then-girlfriend. This is why Howard Rheingold's chapter "The Heart of the WELL" in The Virtual Community resonates with me so deeply. The soulless nature of CMC (computer-mediated communication) technology can lead to someone "becoming-other," or becoming something other than who they really are. That is why I think it is important to maintain a critical distance from technology, analyzing it in terms of what is known about the human condition. As Rheingold says regarding cyberspace, "the most obvious identity swindles will die out only when enough people learn to use the medium critically." Therefore, let us hope that future digital natives (especially those inclined to certain dispositions, like myself) have the proper wisdom imparted to them, so that they may regard technology critically, as something to be used meaningfully and without troublesome consequences.

Mapping the Old onto the New

So, I was reading a chapter from the Marxist cultural historian E.P. Thompson's book The Making of the English Working Class, and I was somewhat startled by a quote from an itinerant Irish writer named W. Cooke Taylor. In his Notes of a Tour in the Manufacturing Districts of Lancashire (1842), Cooke Taylor writes about his observations of the 'novelties' that came about during the Industrial Revolution. But when reading this quote, all I could think about were the novelties of New Media and how they have shaped our own cultural production and practices. I don't know, just a thought:

“As a stranger passes through the masses of human beings which have accumulated round the mills and print works. . . he cannot contemplate these ‘crowded hives’ without feelings of anxiety and apprehension almost amounting to dismay. The population, like the system to which it belongs, is NEW; but it is hourly increasing in breadth and strength. It is an aggregate of masses, our conceptions of which clothe themselves in terms that express something portentous and fearful . . . as of the slow rising and gradual swelling of an ocean which must, at some future and no distant time, bear all the elements of society aloft upon its bosom, and float them Heaven knows whither. There are mighty energies slumbering in these masses. . . . The manufacturing population is not new in its formation alone: it is new in its habits of thought and action, which have been formed by the circumstances of its condition, with little instruction, and less guidance, from external sources. . . .” – W. Cooke Taylor