Monday, November 21, 2011
Abstract
Many of these online communities are even extending their communication beyond the online-only sphere and engaging in real-world rhetoric. Several examples from recent years demonstrate this. One of the most prominent of these groups is the "hacktivist" collective "Anonymous." Since its inception, this group has consistently and successfully planned activities online and then executed them in the real world. The group is significant in particular because it has received attention from the mainstream media as well as governmental agencies worldwide, indicating that these influential groups perceive it as a viable actor (whether its activities are ethical will not be discussed in this paper).
Primarily using rhetorical models provided in the work of Spinuzzi and Zachry, this paper will examine the rhetorical genres used by Anonymous and attempt to identify whether there is a collection of genres that significantly contributes to the success of the organization. The purpose of doing this is to identify patterns of communication that could be emulated by other groups to build communities online and then meaningfully activate community members in the real world.
(This is obviously a rough work in progress which I'll be working on during the day. I wrote this up during an all-night marathon of House while running on caffeine.)
Monday, November 14, 2011
Anonymous and Activity Theory
Tracking the development of the Anonymous Network
I figure that since this week's book could relate directly to what I'd like to write my thesis about, I'd take the time to formally track how the Anonymous network of communication developed over time.
Anonymous started on the website 4chan.org. On that site's message boards, instead of having unique logins and screen names to identify individual users, all users were given the username "Anonymous" and posted anonymously. This counterintuitively developed a strong user community.
One of the first activities the Anonymous collective executed was a protest against the Church of Scientology over perceived human rights violations. Protesters in the real world dressed in Guy Fawkes masks and played Rick Astley's "Never Gonna Give You Up" on repeat outside Scientology offices and churches. The group organized after a video was released explaining the operation (dubbed "Project Chanology" after the 4chan message boards the group originated from).
So far I count several genres: internet message boards, a video uploaded to YouTube, the boom boxes playing Rick Astley, and the signs held in front of Scientology offices. These genres worked together so that people from all over the world protested with a single, unified message. All of this was done through genre ecologies.
Later protests and activities spread to other message boards (Reddit being a significant player here) as well as unofficial "official" Twitter accounts and blogs. One of the defining characteristics of the Anonymous group is the lack of any single leadership, which makes hierarchical models of network theory difficult to adapt. There are de facto "leaders" in the group--those who actually make the videos, flyers, blog posts, and tweets--but they are impossible to identify (unless they mess up and leave metadata in the files they upload to the internet).
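Just to illustrate that metadata point for myself, here's a quick Python sketch--my own toy example, not anything Anonymous actually uses--that dumps the EXIF data an uploaded image can quietly carry (it assumes the Pillow imaging library, and the file name is made up):

# Toy example: list whatever EXIF metadata an image carries
# (camera model, editing software, timestamps, sometimes GPS coordinates).
from PIL import Image, ExifTags

def dump_exif(path):
    exif = Image.open(path).getexif()
    if not exif:
        print("No EXIF metadata found.")
        return
    for tag_id, value in exif.items():
        # Translate the numeric tag id into a readable name where possible.
        tag_name = ExifTags.TAGS.get(tag_id, tag_id)
        print(f"{tag_name}: {value}")

# Hypothetical file name:
# dump_exif("operation_flyer.jpg")

If a poster forgets to strip those tags before uploading, the "anonymous" file suddenly isn't so anonymous.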
I'm trying to figure out how to approach this from a theoretical standpoint. I think this type of organization lends itself more to actor-network theory than activity theory, since it's not centralized and isn't very structured (Spinuzzi [45] claims that "activity networks are much more structured than actor-networks"). However, activity system and activity network approaches seem to work because of the linkage through shared tools, resources, or communities (43). The group also casts nonhumans--the software tools it deploys to take down websites--as actants or objects of labor. For example, the LOIC (Low Orbit Ion Cannon) tool was used by several participants to take down a number of websites. I think. Right?
Monday, November 7, 2011
Annotated Bibliography
Freedman, A., & Smart, G. (1997). Navigating the current of economic policy: Written genres and the distribution of cognitive work at a financial institution. Mind, Culture, and Activity, 4(4), 238-255.
However, to their credit, this article was written before the invention of Wikipedia and the explosion of user-generated content (and user-mined technical support). They use the term "genre ecology" to describe the interplay of genres by a user, and their demonstration of this in the article is particularly useful in understanding the idea. It implies the co-dependent interrelationships the different points of information have with each other. Every piece of the ecology has an effect on the others, and the entire situation would fundamentally change if one point of conversation were removed (rhizomic?). They write, "Genre ecology diagrams can help designers to lay out relationships, analyze the interplay among genres, and identify which genres are central or peripheral to the use of the technology. The diagrams thus can be a resource for replanning the ecology." This is cool because it gives a practical way to apply the idea of genre ecology. As a writer and someone feebly metathinking about communication, I'm a big fan of drawing things out. Mapping genre ecologies puts us in a position to see how the genres overlap and then look at how technology mediates those communications.
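Since they suggest actually diagramming these things, here's a quick Python sketch of my own--a toy example, not data or code from the article--that treats a genre ecology as a simple graph (genres as nodes, mediation relationships as edges) and uses each node's degree as a crude proxy for whether a genre is central or peripheral:

# Invented mini-ecology around a piece of documentation; every genre and
# link here is my own made-up example.
from collections import defaultdict

edges = [
    ("printed manual", "online help"),
    ("online help", "support forum thread"),
    ("online help", "tutorial video"),
    ("support forum thread", "sticky-note workaround"),
    ("printed manual", "sticky-note workaround"),
]

# Build an undirected graph from the edge list.
graph = defaultdict(set)
for a, b in edges:
    graph[a].add(b)
    graph[b].add(a)

# List genres from most to least connected (roughly central to peripheral).
for genre in sorted(graph, key=lambda g: -len(graph[g])):
    print(f"{genre}: degree {len(graph[genre])} -> {sorted(graph[genre])}")

Obviously a real genre ecology diagram captures more nuance than raw connection counts, but even this much forces you to name the genres in play and the relationships between them, which is the point of the exercise.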
Zachry, M. (2002). Ecology of an online education site in professional communication. In Proceedings of the IEEE International Professional Communication Conference and the 18th Annual ACM International Conference on Computer Documentation, IEEE Education Activities Department, 433-442.
I felt this article was a strong case study of seeing how to facilitate communication between different actors in a group. It could also potentially serve as an example for what we're going to be doing as a class in a few weeks when we begin working on professional development.
The modes of communication discussed in the article (discussion boards, file sharing, etc.) were based on technology from ten years ago. Online communication has become much less obtrusive since then. As tech writing (or composition, or whatever) instructors, are you aware of any useful tools that let students communicate through newer technologies while still making it easy for you to observe? Some of the ideas in this article helped me see how I can apply genre theory to my instruction, but the tools on Blackboard are cumbersome (cue the traditional groan of agreement). However, outsourcing my class's communication to an effective tool--email through a competent client, for example--either shuts me out or doesn't maintain the privacy the students require when discussing their papers. A Facebook comment thread doesn't seem like the appropriate place to give feedback on a revision. Anyone got any ideas? Are Google+'s Circles the best way to do this?
Rounding out this Spinuzzi/Zachry fanfest is an article cited by Ryan in his article about the NSF.
- Genre Systems: Spinuzzi writes that unlike genre sets, genre systems involve "the full set of genres that instantiate the participation of all the parties," but they only look at official genres, not unofficial ones. It's an expanded way to see genre sets by acknowledging additional genres in play.
- Genre Repertoires: The most significant thing genre repertoires do is acknowledge the presence of overlapping genres (typing on a computer while talking on a phone). This allows genres to be mapped in a more non-sequential way, although only official genres are acknowledged.
- Genre Systems Redux: "Genre systems" is a term whose usage has become loose, so it is used again here in a different sense: this time it refers to how genres function in assemblages.
- Genre Ecology: Finally, genre ecology focuses on mediation--how genres interact with each other in dynamic ways. It's neat.
This article is a fantastic primer if you, like me, were very confused by the seemingly synonymous terminology (hint: it's not really synonymous). His breakdown of the genre frameworks lets people like us actively determine which framework we want to use in our own work. Essentially, he gives us the rhetorical tools we need to take part in the scholarship. The issue of quantifiability remains, but I don't think it's going away in this field.
Monday, October 31, 2011
Poke-Rhetoric
Good game designers, like good advertising agents, are particularly skillful at knowing intuitively what will capture the audience's attention . . . [Game] developers adorn games with plot details that depend on rhetoric functioning conditionally. (90)
I think skilled game designers are also good at creating subcultures for their games, complete with specialized language and rhetoric. This pulls gamers in and creates a higher investment in the actual game.
I went through that conversation and underlined every word that was either unique to the subject or reappropriated to mean a different thing. Here is a list of the words I found:
- Game Boy Advance
- Web
- evolved
- Spearow
- Fearow
- trainer
- LCD
- discovered
- trained
- fought
- traded
- Poliwrath
- water stone
- Pokerus
- Game Boy
- Pokemon's
- stat experience
- Internet
Monday, October 24, 2011
Hypermediacy and Convergence as Prosthetics
The reason I bring this up is that I see the convergence of technologies, together with hypermediacy, eventually functioning as a virtual/digital prosthetic--a seamless and essential extension of ourselves. I think the corporations, hegemonic overlords, etc. want this to be the eventual goal as well but maybe don't know it yet.
Maybe Apple knows it. When the iPhone was first released in 2007, I read an article (it might have actually been on Wired, but I don't remember exactly) that explained to iPhone users how and when it was appropriate to look up information when you were with friends in a social setting. The point of the article was to give geeks gentle advice on how not to be a know-it-all with their new gadget and constant internet access, but what I took from it was how constant internet access would eventually fundamentally change how we interact with each other. Don't remember the name of that movie with the cowboy guy from Ghost Rider and The Big Lebowski? In a few seconds, you'll know his name is Sam Elliott, and you'll know the name of the movie, too. Apple wants you to see your iPhone not only as a tool or gadget but as part of your identity. We're going to become so accustomed to having video communication, voice communication, libraries of data and information, picture galleries, etc. all available to us at all times that we're going to become dependent on it.
I don't think that's a good or bad thing; I just see it as the way it's going to be. It won't be universal, not for at least 20 or 30 years, but as baby boomers like my parents age, retire, and die, those of us who grew up with technology and are completely comfortable with it will see its place as an extension of our identities as natural. Privacy concerns won't really matter to us the way they do to some groups today.
Internet browsers won't ever actually go away entirely, as the Wired article referenced in Bolter and Grusin posits (see 221-226), but they will first become more important as they replace the need for a desktop computer. I mentioned this in my introductory post, but I have five machines I use consistently right now: my desktop at home, my office computer, my tablet, my phone, and my netbook. I can access all of my necessary files on any of these machines because I primarily use apps that run in an internet browser. I bet Google, Dropbox, and the other services I use see me as their ideal customer because I've become somewhat dependent on what they're offering. As companies start to realize what it means to have everyone constantly using a different digital device, the smart ones will offer a way to make accessing the virtual self across platforms seamless and natural. The actual medium will become inconsequential.
When that happens, the corporations will love it. Advertising will be everywhere, and everything will be monetized. Instead of using your tablet one way and your PC another, your individual digital experience will seamlessly transfer across all technological platforms in an example of ubiquitous computing. Here's kind of an example of what I mean:
Of course, the reality in that video would be hell, at least from our current perspective. Maybe by the time that technology develops, we'll have gradually become accustomed to it--like boiling a frog.
What does this mean for tech writing? As readers/users/our audiences gradually shift towards the "prosthetic" model of technology use from the current "window" model (having a very obvious interface through which technology is accessed instead of a seamless platform), I think we'll have to assist in that transition somehow.
I don't know, though. Maybe it's because I've been awake drinking Mountain Dew all night, but watching that video again kind of makes me want to throw up.
Sunday, October 16, 2011
Occupy Marx Street?
I wonder what Karl Marx would say about the #Occupy movement. On Saturday, I was in San Francisco/Oakland/Berkeley, and there were signs for local Occupy protests in each of those places. I don't actually see Oakland as a financial center for the country, but the mere presence of a protest there shows how widespread the uprising against what Marx might call the bourgeois oppressors has become.
Being interested in this sort of thing, I was very excited when I began reading Cyber-Marx and came across this lovely passage:
The unleashing of computerization, telecommunications, and genetic engineering within a context of general commodification is bringing massive crises of technological unemployment, corporate monopolization of culture, privatization of bodies of knowledge vital for human well-being and survival, and, ultimately, market-driven transformations of humanity's very species-being. In response to these developments are emerging new forms of resistance and counterinitiative. And insofar as the force with which these movements collide is capitalism--perhaps a post-Fordist, postmodern, informational capitalism, but capitalism nonetheless, and not some postindustrial society that has transcended commodification--Marx's work can continue to provide participants in these struggles a vital source of insights.
Technological developments are allowing new forms of resistance and counterinitiative to emerge. I need to do some more research on this, but from what I saw, the whole Occupy Wall Street protest began as an activity by Anonymous, the hacker/mischief-maker collective. Their @AnonOps Twitter account was the unofficial channel through which much of the early information about these protests was disseminated and organized. Anonymous works well as an example of the proletariat using emerging technology to effectively mobilize against the modern aristocracy. There isn't any central leadership, allowing strong ideas to simply command attention and support organically. This does lead to a sort of "mob rule" at times, but for the most part, the organization has been very effective in executing attacks on the ruling classes.
I'm not sure exactly what Marx would say about this, but I think it's incredible that the supposedly discounted Marx and his theories actually have plenty of application in the new user-driven world of technology.
There are a lot more factors that go into this, I know--organizing on Facebook requires use of Facebook, one of these hegemonic institutions--but I think it's still an application of Marx's theories. I wrote this a few days before class just to make sure I got it done, and I'm eager to see what else Dyer-Witheford has to say about Marx in the modern technological world.
Monday, September 26, 2011
I'm sorry, what?
I think the example from "10,000 B.C.: The Geology of Morals" helps to clarify this.
The proof that there is isomorphism is that you can always get from one form on the organic stratum to another, however different they may be, by means of "folding" . . . There are irreducible axes, types, branches. There are resemblances between organs and analogies between forms. (46)

Okay, so any two things can be connected somehow by "folding." We're getting somewhere now. And even though the second speaker is trying to argue with the first, I think this helps make more sense of rhizomes. The relationship between two things can be demonstrated and interpreted, even if that relationship is that there isn't really a relationship. The point is that the connection between the two things can always be described somehow.
So let's talk about how this opens us up to new ideas and thoughts. (First, let me make a disclaimer: this is going to over-simplify the concept of a rhizome and its potential use in my field.) Everything is connected in a complicated web of relationships. The strip of paper on my desk is connected with the tablet PC in my bag in several different ways. By being aware of their rhizomic state, I instinctively begin to look for connections and see the greater One-ness of everything. I see how the actions of one thing affect all others, and now the pen sitting next to me has much more significance.
That's kind of an "I Heart Huckabees" soft-existentialist philosophy, though, and not very useful to me as a tech writer. As a writer and instructor, I now realize I need to pay attention to connections and relationships I had previously taken for granted or overlooked. A thesis statement relates to the level of detail and the sources used; a student's thesis statement also relates to their personal life and values. I need to be aware of these things so I can properly identify and interpret what a student is really trying to say, and why, and then coach them accordingly. If I were writing an instruction manual, I'd need to be aware of how the different parts of the instructions, the pieces involved, etc. all relate to each other in order to communicate them in the most effective, logical way possible.
Really, though, at the end of all this, I think D&G are really playing some sort of prank on intellectuals and philosophers in the same vein as Ern Malley.
Monday, September 12, 2011
Don't be chicken to change
“During long periods of history, the mode of human sense perception changes with humanity's entire mode of existence. The manner in which human sense perception is organized, the medium in which it is accomplished, is determined not only by nature but by historical circumstances as well.” (Benjamin 222)

Since my academic interests are focused on internet culture and the viral spread of information through online methods of communication, I can't help but think about how that quote applies to the modern internet age. Historically, we're in an interesting place. Just about the whole world is networked together, and instantaneous global communication has never been so accessible. This means that something popular from Africa (say, South Africa) can suddenly become a pop video hit all over the world even without the marketing machine that powered global pop stars like Michael Jackson or some other one (there's more than one global pop star, right?).
However, there are some circumstances where viewing the original has been limited by the superstructure. One of the reasons I’m hesitant to follow any sort of inclination to move abroad is that the technological services I’ve become accustomed to using (see: Netflix and Hulu) are restricted by region due to licensing rights. Even if a service is a great idea and sure to be a hit, navigating the laws and procedures of the superstructure means companies will spend months or years negotiating deals with record companies and movie labels before they can expand into a new national market.
The recently-released-in-the-US Spotify is a great example of this. Available in the UK for more than two years, Spotify was only released in the US this past summer. It's basically Netflix for music: almost any song you'd want to listen to can be found on the service, and you can stream it over the internet to your computer or mobile device. On my recent drive to and from LA, I used the Spotify app on my phone to listen to all sorts of music I hadn't been willing to pay for individually but was happy to enjoy as part of a package deal. (Who knew Tom Petty had so many hits?)
Monday, August 29, 2011
Technology Autobiography
Soon after we got the computer, though, my career as an amateur techie began. I guess the whole thing was my fault. I discovered that in the visual skin of Windows 3.1 (some weird hallway scene that helped you visualize your folders as rooms in a house) I could add a password to any "room." After typing in a random string of characters (remember, this was the '90s, before you had to confirm password entries), I found myself locked out of the computer. Panicked, I tried to figure out what I had typed. Dad saw what I was doing and gave a frustrated sigh. "Do you remember what you typed?" he asked.
"I think it was something like, 'N, n, n, n."
Dad tried that combination of letters, and I watched him become angry after a few unsuccessful tries. I went into the other room, and a moment later I heard him say, "Woo-hoo!" like Homer Simpson. "It started with a capital 'N'," he said.
Remember Trying to Use Computers in the 90s?
I spent the next several years accidentally messing up the computer and then figuring out how to fix it. As new versions of America Online and Windows were released, I'd have to learn them and then explain to my parents how to fix their inherent software problems. (As a side note, I'm glad the days of software becoming dramatically buggier from one version to the next seem to be over, at least to some degree. Does anyone remember using AOL 3.1 because version 4.0 simply didn't work?) When our first computer finally died sometime in 2002, we replaced it with a new tower and monitor with new features and new problems. Sure, Iomega Zip drives had a great storage capacity, but they also had a notorious failure rate.
Video Games in High School, or,
How to Play Brood War Instead of Going to Prom
Around this time, high school set in, and my friends and I started playing network games on the school computers. While I was never the best at Quake 3, StarCraft, or Jedi Knight, I was certainly always in the top three. My interest in technology (or at least in how technology could play video games) merged with my social life. A new game meant a new way to pass the time with my friends. Our technological leanings both brought us together as friends and separated us from other students, placing us deeper into geek culture without really being aware of what was going on.
During this time, I also started to participate in online communities. I experienced the thrill of seeing over a hundred replies to my forum thread, of having people on the forum who I'd never met in person refer to me in conversation, of complaining about the minor changes to the site with a united voice that said, "We're in control here, not you, Mr. High-and-mighty Webmaster." It was my first experience with the potential of online communities.
Accidental English
Even though I started my college career with a declared major of Computer Engineering, I changed it to English for some reason when I finished my AA and transferred to BYU-Idaho. There wasn't any fanfare or difficult meditation about what my career path would be. The lady from the BYU-I registrar's office asked me, "What is your major?" and I told her, "Ah, English."
"Okay," she replied. "And what would you like your emphasis to be?"
"What are my choices?"
"Literature, English education, professional writing and creative writing."
"Okay, let's do professional writing." It just sounded like the one that would get me a job someday.
As I took courses, I started to merge my interest in writing with my interest in technology. When I learned that technical writers are responsible for writing instruction manuals, I was genuinely excited. "Wait," I said, "People will pay me to write manuals for vacuum cleaners?" The challenge of explaining a technical concept to someone in a way that couldn't be misunderstood brought me back to my childhood when I'd have to explain to Mom, again, how to make a new folder on the desktop. My interest in humanity and psychology seemed to marry my interest in technology so perfectly. Audience awareness became a game and challenge.
This brings me to my current place in graduate school where I don't have to get a real job for two years because I'm writing a long paper on how internet communities can throw birthday parties for old men in the real world. Even though I'm an American Studies major, that's really an excuse to look at the psychology and humanity of technology users and try to understand how to reach them effectively, how to communicate with them so a message is understood and received perfectly. It's all kind of come full circle in a way.
Technical Prophecy
I feel my experience with technology has given me a kind of technological foresight. I can see new technology and see how it's going to change things down the line. I'm not always right--for example, Google Wave wasn't as revolutionary as I thought it would be, even though other services have successfully implemented some of its features--but I think I have a pretty good track record. For example, I'm calling it now: Google's Chrome OS won't take off until it fully merges with Android for tablets, and tablets themselves won't be as popular in seven years as they are now because people will realize it's a complete pain in the ass to type on an iPad. Google+ is trying too hard, but it doesn't offer anything unique, meaning it will turn into more of a wasteland than it is now unless Google does something to make it truly innovative and necessary. And to make a prediction not related to Google: in about ten years we won't be bothering with full desktop apps except in certain (rare) situations; instead, I think everything will be based on the web, and consumers will be used to logging in on multiple devices to access their programs and files. Despite what a few loud voices say, people under 30 don't really care about privacy, so we'll let anyone store our data for us so we don't have to.
Of course, those things can only be proven or disproven in time, but I think the advancements in technology and communication are intrinsically tied to changing generational attitudes. For example, since my generation grew up with decreased privacy--Patriot Act-style laws and the rise of internet profiles coinciding with each other--we're not going to whine as much about what Facebook does with our data as the current generation of professionals does. We willingly give up our data without a second thought, and we don't even think about how it gets used. Does anyone read the iTunes EULA? Features like location logging and involuntary message archiving are simply going to become the norm, and we're going to let it happen because we're used to our technology not being private.
So Matt, Who Cares?