Saturday, May 2, 2009

Lucid Dreaming, the "Meh" Generation and Black and White TVs

So our lucid dreaming conversation got me thinking the other day. I was on Xbox Live discussing class stuff with my "friends" on there that I don't actually know. Most reported having had lucid dreams, and most described dreams that played out like a video game they had been playing for a while. All agreed that it mostly happens when they get a new game and beat it in a marathon 12-hour session.

So after giving it some thought, I just Googled "video games and lucid dreaming." Well, this researcher here, Dr. Gackenbach, holds a PhD in experimental psychology. She gave a presentation, which can be found here, that links video game usage to lucid dreaming. Really interesting data about consciousness and the aware mind.

Later, I remembered a drunken conversation at a party one night in which someone mentioned that people who lived during the era of black-and-white TVs tend to dream only in black and white. I decided to Google "black and white television and dreaming" and found this.

It's interesting to consider that the environment we choose to put ourselves in might have such a tremendous impact on our subconscious. Perhaps it even leads to differences in experiencing awareness, like being able to consider more than one thing at a time. Perhaps it "unlocks" the brain's ability to process data.

Sunday, March 29, 2009

Code: History And Impact

Code exists all around us: biological, scientific, physical, political, seasonal, social, housing, military, art, and design. The word conjures many different meanings and translations in the mind. The single underlying idea is that in code there exist rules and guidelines that outline the limitations and freedoms of a system. These factors force people in these systems to build, bend, and struggle to form codes into the world that we see around us. Nowhere in the context of human history has code been more important than today. With the advent of the commercially available Internet in the early 1990s, human history began to be universally rewritten.

The formation of code became necessary during the height of the Cold War. Programmers and directors of missile installations were looking for a better way to connect staff to computers that, due to their huge size, had to be isolated in air-conditioned rooms. If a programmer wrote code for a particular procedure, the code had to be physically carried into the mainframe room and entered by hand. If there were any misprints, typos, or bugs, the data entry staff had to go back to the programmers and tell them that it didn't work. Hours and hours were wasted coding and recoding, attempting to set the parameters for the nation's most devastating array of weaponry (Bilgil).

Soon after, the British and French strategic defense computer networks were interconnected with the United States defense networks, and the "Integrated Network" was created: an interrelated system sprawling over two continents, used to monitor and share packets of information, the same underlying system that is being used today (Bilgil). After the fall of the Soviet Union and the rise in personal computer ownership, enterprising phone companies that had the capital to invest in hardware and programmers soon introduced a means of connecting personal computers to one another via phone lines for a moderate fee. The code separating commercial users from the military networks was called X.25; soon after, this code was standardized under the Open Systems Interconnection Reference Model, making it possible for every computer to communicate in the same format and avoid compatibility errors (Bilgil). With the backing of the OSI model and the commercial interests of the telecommunications corporations, the commercial Internet was created, and by 1991, applicable code for the future of the technology began to be written.

In the early 1990s, at the headquarters of the research facility CERN, Tim Berners-Lee began his work on what is now known as Hypertext Markup Language, or HTML. He had developed the idea of a coding language so that researchers at the world-famous CERN laboratories could share and store information from multiple scientific studies across countries. Berners-Lee imagined that many different texts and documents could be coded in HTML and then placed on a host computer, where other client computers could access the data and decode the document for display. Such a system would require a program to sort through the information; Berners-Lee is also credited with coding the world's first web browser, an application that could display, or decode, the HTML format to make a visible "page" ("Birth of the Browser").
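The core idea can be shown in a few lines. A document marked up in HTML declares its structure and its links, and the browser's job is to decode those tags into a readable page. The following is a minimal illustrative sketch, not one of the actual CERN pages, and the linked address is hypothetical:

```html
<!-- Minimal hypertext document: the tags are the "code";
     the browser decodes them into a visible page. -->
<html>
  <head><title>Results of Experiment 42</title></head>
  <body>
    <h1>Results of Experiment 42</h1>
    <p>The full data set is stored on the
       <a href="http://info.cern.ch/data.html">host computer</a>.</p>
  </body>
</html>
```

The anchor (`a`) tag is the "link from one sheet to another" that Berners-Lee describes below: following it asks the host computer for another document, which the browser decodes in turn.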

A top-of-the-line computer became the first web server as Berners-Lee and colleagues began to code their documents and place them on the first accessible Internet. His ideas gave rise to the first World Wide Web of information. Data created in the workspace of one CERN-related office could be published and stored on a server, where other users connected to that server could see who published the data, where they worked, and when the file was last updated. In "Information Management: A Proposal," Berners-Lee outlines his vision of hypertext as follows: "To find information, one progressed via the links from one sheet to another, rather like in the old computer game 'adventure'."

The connectivity of today's Internet closely matches Berners-Lee's description of the Internet of the past, though today's code differs in many ways after several iterations of browsers, sites, and innovations. Programming in HTML was functional and met the needs of the early populace of the Internet. Text and the format of the text were inseparable, much like the way we consider written text: linear, unchanging, immovable. Usenet and other literary Internet portals were accessed one way, through one site, and were not coded for automatic replication or reorganization by subject or topic. These became the first communities formed around the public transfer of data, where most users wrote under pseudonyms. Ideas were transferred in a very academic way, much like a public scientific journal or master's thesis. Usenet became a way to publish any data, from the mundane to the highly intellectual, though very little thought or consideration was given to how the text was presented or displayed (Lessig 103).

HTML was also very hard to code, causing a literacy gap between the people who consumed the coded information and those who wrote it. Web coding became a profession, and those who wrote websites competed to differentiate their material from the rest of the crowd. To make matters more confusing, web browsers began to deviate from one another in the way they displayed code. After the introduction of Dynamic HTML, effects could be used to accentuate certain ideas or thoughts with flashy text or animation, though two versions of the code had to be written: one for Microsoft's Internet Explorer and one for Netscape's Navigator. This made it very difficult for consumers to access or organize data on their own terms, making the Internet seem more of a gimmick than a usable, malleable tool (Metcalfe).
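To illustrate the kind of fork this forced on authors, here is a simplified sketch (not code from any actual period site) of the browser detection that Dynamic HTML pages commonly relied on: Internet Explorer 4 exposed its elements through the proprietary `document.all` collection, while Netscape Navigator 4 used its own `document.layers`:

```html
<!-- Hypothetical late-1990s DHTML: the same effect needed two code paths. -->
<div id="banner" style="position: absolute;">Welcome!</div>
<script type="text/javascript">
  var banner;
  if (document.all) {            // Internet Explorer 4 path
    banner = document.all["banner"];
  } else if (document.layers) {  // Netscape Navigator 4 path
    banner = document.layers["banner"];
  }
  if (banner && banner.style) {
    banner.style.left = "100px"; // reposition the element
  }
</script>
```

A page that only wrote one of the two branches simply broke in the other browser, which is a concrete version of the literacy gap described above.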

This began to change in the mid-1990s: with the browser wars coming to an end and Microsoft owning 90% of the operating system market, Internet Explorer became the dominant web browser ("Browser Statistics Month by Month"). During the early 1990s, both browsers had been ignoring the guidance of the World Wide Web Consortium, founded and chaired by pioneer Tim Berners-Lee, which set the rules and standards for coding the Internet. Through Microsoft's continued cornering of the browser market and both companies' rogue exploration of different browser features and functions, a common protocol for the code of HTML was eventually established ("Position Paper for the W3C Workshop on Web Applications and Compound Documents"). This iteration was called XHTML, and it allowed the coded text of HTML to be displayed regardless of the user's choice of browser application or formatting functions. Numerical data, language coding, step-by-step instructions, and quoted text are just a few examples of the seamless text that could be integrated into any display or application a user ran to browse the information.

Then in the early 2000s, with the advent of XHTML and more uniform standards for the applications that displayed the Internet, it seemed as if the Internet had reclaimed its place as a useful tool for the everyday user. Format became separate from function and was coded for universally, so that any content could now be applied to many different facets of cyberspace. This allowed programmers, web development teams, and coders to write code that was more open to accepting user input as data. The framework of a page could be written for users to fill with their own professional or personal data. The web became a development platform, one where users filled in the canvas built by developers (O'Reilly). A surge of user-generated content was about to emerge.
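The separation of format from function can be sketched concretely. In the older style, presentation was welded onto the text itself; in the newer style, the markup states only what the text is, while a stylesheet, which any page or user could swap out, states how it looks. This is a minimal illustrative sketch, not taken from any particular site:

```html
<!-- Old style: presentation hard-coded into the content itself -->
<font color="red" size="4"><b>Breaking news!</b></font>

<!-- Separated style: the markup names the content's role,
     and a stylesheet applies the presentation -->
<p class="headline">Breaking news!</p>
<style type="text/css">
  .headline { color: red; font-size: 1.2em; font-weight: bold; }
</style>
```

Because the second version carries no presentation of its own, the same content can be restyled for any display or application, which is what made it possible to pour user-generated data into developer-built frameworks.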

With the creation of a more user-based system, the Internet became a veritable breeding ground for new ideas. "Publishing" as it had existed in the past was changing. Not only could people begin to access data created by other people just like them, they no longer needed a programmer to interpret their ideas or thoughts into an Internet reality. The users of the Internet could now publish data as fast as they could type it, and just as quickly organize the information into keyword engines. Programmers rushed to create the most effective frameworks for Internet users to connect to. Every facet of life began to get a digital makeover. Corporations like Yahoo and Google battled over preferred search engine status, guiding users through the sea of new data. Facebook and Myspace clashed over whose public social networking site would reign supreme. Blogs created by people through companies like Blogger began to get more readership than books and magazines. All of this data was no longer being created and owned by corporations that had paid millions of dollars for skilled programmers to code documents. This data was created by millions of people paying a few dollars per month to publish their thoughts and create new ideas (O'Reilly).

The user-generated age has brought on many interesting legal challenges as well. The creation of YouTube and the code within it allowed the first generation of YouTube users to post nearly anything they could get their hands on, including copyrighted materials like movies and TV shows. That code has since been done away with, and most users are allowed to post only ten-minute videos. Facebook and Myspace now have stringent policies regarding ownership of photos and information; that is to say, the code of law has become a prominent aspect of the code of the Internet. As the Internet becomes more personal, so grows the ease with which individuals can create their own realities and identities (Lessig 50). This provides an incredible sense of freedom in the expression of ideas, and it poses many questions about who can arbitrate the decisions of code and the role that code will play in the future of the Internet.

Works Cited
Berners-Lee, Tim. "Information Management: A Proposal." World Wide Web Consortium. 27 Mar. 2009.
Bilgil, Melih. "The History of the Internet." YouTube. 04 Jan. 2009. 27 Mar. 2009.
"Birth of the Browser." 27 Mar. 2009.
"Browser Statistics Month by Month." W3Schools. 28 Mar. 2009.
Lessig, Lawrence. Code Version 2.0. New York: Basic Books, 2006.
Metcalfe, Bob. "Is the Internet Dead?" CNN.com. 29 Mar. 2009.
O'Reilly, Tim. "What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software." 28 Mar. 2009.
"Position Paper for the W3C Workshop on Web Applications and Compound Documents." World Wide Web Consortium. 28 Mar. 2009.

Friday, March 27, 2009

Paper on CODE: PLZ ADVISE AND/OR COMMENT.

The following is a couple of paragraphs introducing my paper. I'm curious whether anyone else is doing any literary imagery or such. I'm afraid I got a little flowery after the break, and I want it to fit in along with everyone else's. Please leave a short note on the issue. Thank you in advance.

"Code exists all around us: biological, scientific, physical, political, seasonal, social, housing, military, art, and design. The word conjures many different meanings and translations in the mind. The single underlying idea is that in code there exist rules and guidelines that outline the limitations and freedoms of a system. These factors force people in these systems to build, bend, and struggle to form codes into the world that we see around us. Nowhere in the context of human history has code been more important than today. With the advent of the commercially available Internet in the early 1990s, human history began to be universally rewritten.

The formation of code became necessary during the height of the Cold War. Programmers and directors of missile installations were looking for a better way to connect staff to computers that, due to their huge size, had to be isolated in air-conditioned rooms. If a programmer wrote code for a particular procedure, the code had to be physically carried into the mainframe room and entered by hand. If there were any misprints, typos, or bugs, the data entry staff had to go back to the programmers and tell them that it didn't work. Hours and hours were wasted coding and recoding, attempting to set the parameters for the nation's most devastating array of weaponry.
_________________________________________________________

Code will build the streets of the cyber city, creating languages, fashions, and customs, connecting users who will meet under flickering street lights and between electronic recycling bins, swapping files and viruses. Encrypted data, classified documents, ideas, and evidence will change hands that have no fingerprints, then fade away, never to resurface. These users might number in the dozens, hundreds, maybe thousands...there's no way to tell, as they share data torrents on servers quietly humming in Sweden, whose very existence is the bane of the RIAA and hundreds of publishing companies.

Other clients will be meeting on the gleaming clean streets of monitored sectors like Facebook: shimmering blue and white formatted roads where users willingly hand over identity, thoughts, feelings, and preferences, a veritable stockholder's dream. Tracking buying patterns, marketing products...creating the very people it is feeding from. Catching users in an endless feedback loop of shopping and re-creating identities. All the while, a staff of bots and people reads the incoming data for "offensive" material, changing the code by force if necessary, verdicts of deletion handed out fast, before the offender has a chance to react."

Thanks again.

Monday, March 2, 2009

Research Methods (and other topics of general awesomeness)

After more consideration of my topic and its presentation, I believe that my video would benefit from a couple of hours of filming in Second Life, WoW, or some other gaming environment to exhibit how code can change cyberspace. HTML coding for the Internet, and perhaps a short piece on the code of Chinese cyberspace, would also be worthwhile.
Discussing code while mixing in code from law and meatspace will be really cool, particularly if I can green-screen characters and motions into my videos.
As far as the IRB and I are concerned, I think that my interviews (if done) will be easy to record with consent, and my talking heads (if used) will be easy enough to get consent for using their content.
The only issue with discourse analysis would arise if I did a piece on the Chinese coded Internet. I would want to interview some students who agree and some who disagree with the government's decision to censor. That would require censoring or hiding the identities of the people I interviewed, though I'm not sure whether that would affect the validity of my research.

Wednesday, February 25, 2009

On "Code" by Lessig

Lessig points out early in his piece that “Code is Law” (Lessig 18). While many definitions exist for both of these words, exponentially more connotations are associated with any combination of those definitions. Code exists in a technological sense, shaping and bending the actions of everything in cyberspace. From RSS feeds to Facebook friend requests, World of Warcraft magic spells to Google Earth buildings, the code defines what we see and how we see it. Code also exists between people who interact in cyberspace, though this code is social in nature. This is the code that defines what “poking” on Facebook really means, the code that makes you mortified when you accidentally “reply to all” on what was a secret email, the code amongst thieves on the piratebay.org asking everyone to “seed plz”. The Law exists in real life: the law that Lessig says makes you a thief if you steal a book, but an idiot if you don’t pick up a twenty-dollar bill blowing past your feet on a sidewalk. The Law is also lurking in cyberspace, FBI worms silently inviting themselves into hard drives, searching for illegal documents, and reporting back to their superior servers.
There is a sort of decaying optimism in the book: an idea that the structure of freedom and liberty lies within the network of cyberspace and that it can be properly regulated to maintain freedom, but that there is no one to trust to do this right. “Regulability” is his first concept, in which he argues that it is possible to regulate the Internet. “Regulation by Code” is his second theme, where the technical code becomes the means to the end of “Regulability”: the nuts and bolts of what can and cannot be done, the factors that limit the users of cyberspace. A third piece is what Lessig calls “Latent Ambiguity,” where the freedom of the Internet allows the government to let the FBI use the aforementioned worms to search private data. The ambiguity coats the issues of code with layers of hazy indistinctness. The last issue is sovereignty: how can one individual, organization, or government rise above the Internet to attempt to label what is good or bad, harmful or helpful? Sovereignty on the Internet allows a group to be legitimized, for its norms to be the final say in its space. These spaces, however, are constantly overlapping in cyberspace, given that all the information is traveling the same way, and there are no enforcers to give one side the leverage of justice in hopes of being vindicated.
I believe that most of Lessig’s arguments are fairly grounded, though I’m not sure there will ever be a time when cyberspace becomes subverted through an ultimate control. I believe it will be more like the battles we have seen in the past few years against The Pirate Bay and the RIAA, or between Anonymous and the Church of Scientology, or between the culture of democracy and the Chinese government: battles where the interests of “real life” become at odds with the interests of cyberspace and its denizens. The two tend to be one and the same, which makes legal battles even more difficult for defendants in meatspace.
A completely controlled Internet faces the problem that there will always be an entrepreneurial spirit among programmers, “pirates,” and “hackers.” A segment of society will always attempt to keep the Internet free of controls or limitations, a place where the code will dictate very little, if anything.
I believe that the code of cyberspace will be very similar to the code of today’s space, in that there will be places where anonymity or pseudonymity will be an option, and places where identity will be enforced. I don’t think Lessig is ever far off from this; the problem is that the future is anybody’s guess, so there are many possible outcomes.

Sunday, February 22, 2009

Lit review (Pt. 1)

Lessig has very few detractors on the Internet, which is making my "conflict, limitations, debates, or holes in the literature" section very short. Here is the first half of it, though.

Lessig points out early in his piece that “Code is Law”. While many definitions exist for both of these words, exponentially more connotations are associated with any combination of those definitions. Code exists in a technological sense, shaping and bending the actions of everything in cyberspace. From RSS feeds to Facebook friend requests, World of Warcraft magic spells to Google Earth buildings, the code defines what we see and how we see it. Code also exists between people who interact in cyberspace, though this code is social in nature. This is the code that defines what “poking” on Facebook really means, the code that makes you mortified when you accidentally “reply to all” on what was a secret email, the code amongst thieves on the piratebay.org asking everyone to “seed plz”. The Law exists in real life: the law that Lessig says makes you a thief if you steal a book, but an idiot if you don’t pick up a twenty-dollar bill blowing past your feet on a sidewalk. The Law is also lurking in cyberspace, FBI worms silently inviting themselves into hard drives, searching for illegal documents, and reporting back to their superior servers.

There is a sort of decaying optimism in the book: an idea that the structure of freedom and liberty lies within the network of cyberspace and that it can be properly regulated to maintain freedom, but that there is no one to trust to do this right. “Regulability” is his first concept, in which he argues that it is possible to regulate the Internet. “Regulation by Code” is his second theme, where the technical code becomes the means to the end of “Regulability”: the nuts and bolts of what can and cannot be done, the factors that limit the users of cyberspace. A third piece is what Lessig calls “Latent Ambiguity,” where the freedom of the Internet allows the government to let the FBI use the aforementioned worms to search private data. The ambiguity coats the issues of code with layers of hazy indistinctness. The last issue is sovereignty: how can one individual, organization, or government rise above the Internet to attempt to label what is good or bad, harmful or helpful? Sovereignty on the Internet allows a group to be legitimized, for its norms to be the final say in its space. These spaces, however, are constantly overlapping in cyberspace, and there are no enforcers to give one side the leverage of justice in hopes of being vindicated.


My full review of his book "Code" will be coming shortly.

Wednesday, February 18, 2009

On "Media and Behavior"

This article has me more convinced that my assignment of "code" is just as much a social one as it is a technological one. The technological code sets rules for online behavior just as much as social codes do. Since online communities are a mix of both technology and individuals, the rules apply to both.

Pages 29 and 30 discuss professions where a persona is taken on. Waiters and doctors take on a persona when they interact with customers: vernacular, posture, and interaction with co-workers all change. This is exhibited everywhere online. On blogs, Facebook, Twitter, or any forum, rules change, personas change, and identities are honed and whittled into whatever form they need to be.

The process of code will mold the frame of online interactions in the future. Dichotomies in code exist in today's fledgling social networks. China, with the strict law that governs every social interaction from surfing to posting, is often at odds with the rest of the world's pirating, streaming, and updating. I think this would make a great case study for "code".