Monday, November 2, 2009

Don't Make Me Think -- Steve Krug

Krug's book focuses on lessons supporting one major tenet of web usability and human-computer interaction: a good website (program, application, etc.) should enable users to accomplish their tasks as easily and directly as possible.

Getting sidetracked for a moment: this reminds me of a discussion of "good technical instructions" in my technical writing class. However we approached what makes instructions "good," we kept arriving at the same point -- when instructions are good, users don't really think about them. They don't have to think at all, which is perfectly in line with what Krug presents.

Major Points

1. Don't make the web user think: little questions on a website add up and every time a user has to think even a little bit about where he should click, there is a greater likelihood that he'll quit. Designers should aim for providing a self-evident experience on a website. However, if that is unachievable, the goal should be self-explanatory.

2. Satisficing: most people don't read an entire web page. They'll scan until they find what they think they need and click. They satisfice, or attempt to meet criteria for adequacy, rather than to identify an optimal solution. They "make the best with what they have" instead of poring over an entire website and determining the best path to the best information.

Accordingly, pages must be designed for scanning, not reading. Krug suggests several design choices:

-creating a clear visual hierarchy
-nesting text to show connections between text, pictures, etc.
-sticking to conventions
-defining portions of the page into clear sections
-making clickable items clear
-avoiding "noise" in the design

3. Writing for the web: because users won't read everything, it serves designers well to:

-keep it brief
-get rid of half the words...then get rid of half of what's left
-instructions are often useless -- users won't read them -- make all actions self-evident or users won't stick around anyway

4. Navigation: design for browsers (clickers) and searchers (those who go for the search box immediately). Make sure navigation is persistent so users will feel comfortable knowing where they are on a website, which will increase their confidence in the website. A few components of persistent navigation:

-site ID (logo plus tagline, usually on top left of web pages)
-sections (different areas of the website, such as product lines, services, etc.)
-home link (so users can always "start over" if they feel lost)
-search option
-utilities (links that explain different components of a website -- how to make a purchase, contact a sales rep, etc. -- but aren't part of the hierarchy)
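The persistent-navigation components above can be pictured as a page skeleton. Below is a minimal, hypothetical sketch (the site name, tagline, and URLs are invented for illustration, not taken from the book):

```html
<!-- Hypothetical skeleton showing Krug's persistent navigation components -->
<header>
  <!-- Site ID: logo plus tagline, top left; conventionally doubles as the home link -->
  <a href="/" id="site-id">Acme Widgets -- Widgets without the fuss</a>

  <!-- Search option -->
  <form action="/search" method="get">
    <input type="search" name="q" aria-label="Search this site">
    <button type="submit">Search</button>
  </form>

  <!-- Utilities: support links that sit outside the content hierarchy -->
  <nav aria-label="Utilities">
    <a href="/how-to-order">How to order</a>
    <a href="/contact">Contact a sales rep</a>
  </nav>
</header>

<!-- Sections: the site's major areas, repeated on every page -->
<nav aria-label="Sections">
  <a href="/products">Products</a>
  <a href="/services">Services</a>
  <a href="/about">About us</a>
</nav>
```

Because this block repeats on every page, users always know where they are and can always "start over" from the site ID.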

The home page will usually be an exception, since its mission is different from that of the second-level, third-level, etc., pages.

5. Home page is usually more complicated since many parties will be vying for the prime real estate. In addition to the tagline and a welcome blurb (description of what the site is about, which is different from a mission statement), it must contain the following items:

-site mission
-hierarchy -- showing what the site contains
-search option
-content "teasers" and promotional offers
-shortcuts to the company's "best stuff"
-registration/log-in area, if necessary

Most importantly, the home page needs to show a user HOW to get started. Do they need to register first? Should they jump to products?

6. Page names: it's important that a user can identify where they are on the website. Clear page and section names and matching link names can help. Tabs work extremely well here because they are hard to miss and are visually appealing. A good way to test this is what Krug calls the "trunk test" -- pretending that you’ve been blindfolded and locked in a car trunk, you should be able to answer these questions about a site immediately when your blindfold is removed:

-What site is it?
-What page am I on?
-What major sections does this site have?
-Where can I go from here?
-Where am I in relation to the rest of the site?
-Where can I go to search?

7. Simple, cheap usability testing: few companies can afford a large-scale usability test, but Krug suggests that even a little bit of testing is better than no testing. He suggests that only 3-4 general web users, a private room and a computer are necessary. Show users the website and ask if they can identify what it's about, why it's valuable, how it works, and have them perform a few tasks. Designers should review the results immediately and focus on the big problems -- those are usually pretty evident right away. Take care to not break something else while "fixing" one component.

8. General usability "common courtesy" points:

-don't make users provide more info than necessary (phone, address, email -- are all of these really needed?)
-don't let fancy features get in the way of usability
-make it easy for a user to back up and try again if he navigated to the wrong place

In just a few short months, my company will launch a new website, so this was a timely read for me. So many of Krug's points made me chuckle because I can imagine running into a lot of these problems, specifically the web design arguments he lists. Everyone is a user, so everyone knows what they like and has developed hard opinions about it (I hate pulldowns vs. I love pulldowns). Rather than debating what "most people like" (because there really is no average user), we must look at what works for our site. Does this hierarchy make sense for the information presented on this page? Is this the best way to present our story? If the presentation is the best it can be, then users will take it and run with it, even if they feel strongly one way or the other about certain components of a website.

There were several points in this book where I stopped and thought, "Well, yeah. Of COURSE you do that." Of course you test early and often. Of course you can't muddle the home page with ten competing messages. Of course you keep navigation consistent throughout. But as I prepare for the website redesign, I find that I will absolutely hit some obstacles on the way. There are several parties in my company that will want their message front and center on the home page. I'll be in the unenviable position of managing that process, but I feel better equipped after reading Krug.

I've been through web redesigns before and I've become so lost in the process that it gets difficult to step away and see it with fresh eyes. Krug's suggestion for just a few random people to test it out ahead of time seems so simple, but there are always reasons for not doing it. We're too late in the game and we don't have time. They aren't the users we're speaking to, so they won't be of any use. Additionally, pride often gets in the way. When you spend months developing a website and open yourself up to average users poking around and finding faults, your ego can get a little bruised.

I'm really looking forward to soliciting help early on this project. I'll feel much more confident with my highlighted and dog-eared copy of "Don't Make Me Think" by my side!

Thursday, October 22, 2009

Design of Everyday Things -- Norman (Preface and Chapter 1)

Article here.

Donald A. Norman is a cognitive scientist whose studies focused on how items are designed. Specifically, he investigated how many "human errors" were actually caused by poor design choices made when items were created, such as doors, water faucets, or nuclear plant control rooms.

His "Design of Everyday Things" covers three main topics:

1. It's not your fault -- when people have trouble with something, it's a design fault, not theirs

2. Design principles -- people need conceptual models of how things work, feedback, constraints, and affordances (appropriate actions perceptible, inappropriate are invisible)

3. Power of observation -- people must observe, learn, critique designs

Norman notes that human-centered design requires all considerations be addressed from the very beginning, which suggests that users must be included in the design process from start to finish. This is where the problems typically begin. Most see it as expensive and cumbersome to include users so early in the development process. Moreover, it probably doesn't make much sense to companies from a cost-benefit perspective.

For example, the Huatong Sun article "The triumph of users: Achieving cultural usability goals with user localization" notes that MMS blew up despite hard-to-use and poorly designed cell phone technology. In this case, what is the incentive for cell phone companies to include users throughout the development process if users are going to use it regardless of poor design?

Moving on...

I'm decent at Photoshop -- good enough that I can "make pretty" most design items on my company's website by myself. However, we recently launched a Christmas card online ordering system and needed a splash page for the online store. Sure, I could probably take the cards and make something pretty. But there are a few different types of cards...with a few different options...and different prices on each...and we need to let users know which items they can click to get where they need to go...all on 800x600.

But this wasn't a matter of space. It was a matter of organization and ensuring visitors can go where they need to go without even thinking about it. This wasn't designing for aesthetics (which a lot of people can do...if I can do it) but designing for effectiveness. This was real design.

Norman might pat me on the back, but not before he yelled at me to hand the splash page off to a real designer.

I got a little tripped up on affordances and constraints. Norman says that affordances aren’t only positive, nor are constraints only negative. Instead Norman says that affordances only benefit if they’re taken advantage of, which means that “the user knows what to do just by looking: no picture, label, or instruction is required.” Constraints, on the other hand, aren’t always negative; just because an object can’t do something doesn’t mean it should.

I also wondered how Norman would view shortcut keys through the lens of his "mapping" concept. If with "no visible relationship between the buttons and the possible actions,” there is “no discernible relationship between the actions and the end result” what does that say for technology a little more advanced than coffee cups and doors?

Monday, October 19, 2009

Professional Investigation

WRD: I enjoyed the Text and Image class I took and I'm currently enrolled in Technical Writing (WRD 521). Document Design in the Winter quarter might be interesting, but I think Text and Image and Technical Writing together probably give me a pretty good foundation in this area.

HCI: I loved HCI 402 (Foundations of Digital Design) and will definitely take another elective in this program. Some interesting courses are Usability Engineering, Foundations of Human-Computer Interaction, and Digital Page Formatting. As I've mentioned before, my company will be launching a new website sometime in the next six months and I think it would be very useful to be in a position where I can be more hands-on than just telling a developer, "No, do it this way or that way."

MBA: I intended to avoid MBA classes since I already have my MBA, but we're in a much different space than we were three years ago. The online marketing class might be worthwhile, but I'd like a chance to review the syllabus first.

CMNS 543 (Communication and Organizational Change): I'm not quite sure how I could get this to fit in my current position, but seeing this class makes me think about the industry change in Metro Detroit (where I'll likely return in a few years) and how useful of a course this could be in an environment like that. Just a thought.

I've discussed some training opportunities with my supervisor and would like to take a couple Adobe CS4 workshops. I would prefer not to spend much time in the DePaul classroom on what I might be able to learn in a few days. Ascend Training has come up as a possibility.

I am also interested in Prof. Akiyoshi's web design classes. I've heard good things about them and I hope I can bypass part I and enroll in part II for the winter. I'll need to show him my portfolio to see what he suggests.

Social Media Revolution

Video here.

I guess eye rolling is the theme of the day for me. I clicked play on this video and waited to be bombarded with messages about how SOCIAL MEDIA IS ALL AROUND US!! and GET READY FOR THE REVOLUTION!! But when I saw the breakdown of how long it took radio, tv, internet, iPods and Facebook to reach millions, I stopped. I know the Facebook total users statistic is one we all know and love to cite. But when stacked against "old media" the number is staggering. Are you kidding me?

A few other interesting points (of about a zillion):

-1 in 6 higher education students is enrolled in an online curriculum -- this makes me laugh because I remember that I used to say online education programs were a joke and a waste of time and money. Oops.

-80% of companies used LinkedIn as their primary tool for finding employees -- wait a second...I found my current job through LinkedIn.

-The record declines in newspaper circulation are no surprise, but I started thinking about how I find the most up-to-the-second news stories. At work one afternoon, I heard about 10-15 emergency vehicles fly past the building and immediately got a little spooked considering my office is across from the Sears Tower. I didn't go to cnn.com, or the Chicago Tribune online, or search Google News. I went to Twitter. And I found out what was going on in about ten seconds -- one quick search for Chicago and emergency. It took another hour or so before the Tribune had any mention of the story.

Pew Internet Report -- Digital Footprints

Article here.

First of all, it absolutely blows my mind that only 47% of people have ever searched for information about themselves online. I google myself all the time (probably an embarrassing amount) but maybe I'm just really nosy. I also pride myself on my stalking abilities, which admittedly sounds pretty creepy, but there's no way I'm the only one out there. Consider that I'm not trying to steal your identity or drive by your house at 2am...but others could be.

A coworker was talking to me today about a google search she did on her name and she was surprised at the amount she found about herself. It makes me think of the steps people advise you to take if there is embarrassing information about you on the web or if someone with your name is involved in questionable activity. It's funny that the answer is to basically take back control of your name by creating your "official" outlets on the web -- a website with your name as the domain name, social networking profiles with your name and appropriate information. So you address your digital footprint problems by stamping your official digital footprint that you can control. Probably good advice but not the easiest for non-web-savvy people.

I'm most concerned about this finding in the study:

(For teens) just 40% said their profile was visible to anyone, while 59% reported access that was restricted to friends only.

The information of four in ten teenagers is available to the public.

It makes me think about my AOL days and how gullible I used to be. I never gave out personal information or agreed to meet up with anyone, but back then, I really believed that I was talking to a 14-year-old boy in metro Detroit who also really loved the Tigers. And maybe I was, but there was a good chance that I wasn't.

And the stakes are so much higher now considering how sites like Facebook display all of your contact information and encourage you to share pretty personal details. I'm not blaming Facebook or any other social networking sites. It's just frightening that if adults -- likely the parents of these teenagers -- don't know enough to keep an eye on their digital footprints, who will keep an eye out for these kids?

What value do users derive from social networking applications? Neale & Russell-Bennett

Article here.

I knew I had to applaud this undertaking when I read the section where they tried to define "cool."

Dutch researcher Carl Rohde describes cool in product terms as “inspiring and attractive … providing empowerment” to the user. Cool products help people “to bring out the best of their capacities and abilities.” (Parvaz, 2003)

I rolled my eyes until I tried and struggled to define what makes a Facebook application "cool." I kept resting on "something popular that a lot of my friends use" (otherwise why would I use it?) but that doesn't really explain how it became cool. So let's stick with the provided definitions even though it's decidedly uncool to define cool...

Applications are tough. I thought about those that I pass on and those I don't and realized that I've never sent an application invitation. I don't like when I receive invitations and I don't think it's cool to bother my friends with it. The only application I've used is Scrabulous and even then I didn't have to invite friends -- I could see who had the application downloaded so I could invite participating friends to play.

Well, now I feel pretty smug. My advice: don't bother with the applications because I don't think they're cool and there's no magic formula for what will be popular and what won't. Except this thinking isn't very useful when I consider purchasing ads or trying to develop applications for Facebook for my company. Given its popularity, everybody wants in on the Facebook game.

The lists of value examples and features were no doubt interesting -- but the managerial implications weren't particularly useful. I could have told you without conducting a study that you should encourage users to participate in application development, ensure source credibility, and develop an easy-to-use application.

The bottom line for me: these are great data points in a field that hasn't been nailed down (and probably won't be for a long time) and at this point, anecdotal data ("it's fun to compare people!") is probably the best that can be done.

Monday, October 12, 2009

Professional Investigation & Hey Unemployed Media Professionals...

Article here.

I'm currently employed and likely won't be looking for a new position anytime soon, but one particular bullet in the article made me want to forward it to my company's leadership team:

3.) If you really want to understand social media, you must participate.

Really. You have to jump in. "In it to win it" as the article suggests.

I've personally participated in social media for a long time. I'd like to think that I'm pretty comfortable with connecting with others and getting my message "out there." And when I first started in my current position, I jumped right in to myriad social media channels. It was important to get our name out there, connect with supporters, connect with other similar organizations, etc.

And then my company put the brakes on. They wanted to know what my strategy was. What was I doing with this? They seemed to be scared by what they didn't quite understand. That's totally normal and I don't fault them for that. But then they asked for a strategy -- a detailed communications plan of what we hoped to do with our social media outlets.

I'll be honest: it's hard to say. I could set a goal of 250 Twitter followers by the end of the quarter...50 Facebook fans by the end of the year...1,000 YouTube views by next April. But there are so many moving parts to social media that throwing out those numbers would be meaningless. Our primary goal needs to be: PARTICIPATE. Jump in. Make some connections. See how our messaging resonates. When are we getting people excited? When are we hearing crickets? We can only learn and improve by participating first, and we're doing ourselves a huge disservice by not participating until we nail down a detailed strategy...a strategy that could be completely outdated and useless by next month...or even next week!

I don't intend to become an Adobe expert. I'm not going to make a sudden career change and decide that I want to be a graphic or web designer. There are a million people out there who are much better at it than I am and always will be, regardless of the classes I take. However, as a communications professional, it's important for me to understand what works and what doesn't.

In the not-too-distant future, my company will completely redesign its website and I will manage the process and the vendors involved. With just my marketing and communications background, I would have been pretty limited in describing what I want to see, what I know can be done, anticipating our long-term needs, etc. With some of the classes I have taken and am currently taking (elements/foundations of digital design, text and image, technical writing, etc.) I will be in a much better position to effectively work with web/graphic designers, web/technical writers, photographers, etc. And even produce some of the work on my own!

I plan to continue mixing up theoretical and practical courses as the balance is giving me a really good handle on the whys and hows of new media in my current work environment.

Multimodal Discourse, Intro -- Kress and Van Leeuwen

Article here.

I looked up a couple terms before diving into the reading:

modality - noun: one of the main avenues of sensation

semiotics - noun: a general philosophical theory of signs and symbols that deals especially with their function in both artificially constructed and natural languages and comprises syntactics, semantics, and pragmatics

This section spoke volumes to me:

“In the past, and in many contexts still today, multimodal texts (such as films or newspapers) were organised as hierarchies of specialist modes integrated by an editing process. Moreover, they were produced in this way, with different hierarchically organised specialists in charge of the different modes, and an editing process bringing their work together. Today, however, in the age of digitisation, the different modes have technically become the same at some level of representation, and they can be operated by one multi-skilled person, using one interface, one mode of physical manipulation, so that he or she can ask, at every point: ‘Shall I express this with sound or music? Shall I say this visually or verbally? And so on.’”

So the digital age enables a lot of people to be a jack of all trades. But it's not just that, say, web designers have the *ability* to become the graphic designer, photographer, user interface designer, technical and copy writer. There is a growing *expectation* of this. When I started my position as manager of web-based communications, I had some coworkers wondering why I couldn't design the website by myself...write the web copy by myself...design graphical components by myself...take and edit the pictures by myself. And I can absolutely do a little bit (and in some cases a lot) of all these things.

But this multi-skilled (more like expertly multi-skilled) expectation is a little frightening. I might as well throw out the possibility of specializing in any one thing. Better learn it all if I want to continue to be competitive in this space.

The four strata of communication (just for reference)

Discourse - socially constructed knowledges of some aspect of reality.

Design - the means to realize discourses in the context of a given communication situation.

Production - the organization of the expression, to the actual material articulation of the semiotic event or actual material production of the semiotic artifact.

Distribution - the mode(s) of delivering the expression.

Remediation, Intro and Chapter 1 -- Bolter & Grusin

Article here.

I struggled a bit with terminology in this article, specifically remediation, immediacy and hypermediacy. I will attempt to work my way through it...

The term remediation in the book's glossary:

“Defined by Paul Levenson as the ‘anthropotropic’ process by which new media technologies improve upon or remedy prior technologies. We define the term differently, using it to mean the formal logic by which new media refashion prior media forms.”

Hmm...how about a specific example on page 44:

“Hypermedia CD-ROMs and windowed applications replace one medium with another all the time, confronting the user with the problem of multiple representation and challenging her to consider why one medium might offer a more appropriate representation than another. In doing so, they are performing what we characterize as acts of remediation.”

So remediation is the representation of one medium in another. What does this mean in the realm of new media? What is "new" about "new media" comes from the particular ways in which they refashion older media and the ways in which older media refashion themselves to answer the challenge of new media (15). Which leads us to...

Old media vs. new media

Popular opinion has been that new digital technologies (internet, computer games, etc.) separate themselves from old media in order to transcend and evolve past the limitations of old media. However, Bolter and Grusin contend that new media achieve such significance and growth because they refashion old media -- this is what they call remediation. New media aren't the first to do this: photography remediated painting, film remediated stage productions, television remediated film and radio.

There is a lot to discuss in this new media vs. old media game, but I'm interested in how immediacy and hypermediacy come into play.

Immediacy vs. Hypermediacy

Immediacy: immediate awareness of an occurrence with the lack of an intervening/mediating agency. Where representation "is the thing itself" (a little help here)

Hypermediacy: "style of visual representation whose goal is to remind the viewer of the medium" (Bolter and Grusin, 272). Where immediacy tries to erase the medium, hypermediacy acknowledges and multiplies it, working against our desire for transparent immediacy. A good example from here: "In Psycho, when we see an extreme close-up of Norman Bates's eye as he watches Marion Crane through the peephole, then find ourselves looking through it ourselves, Hitchcock foregrounds the act of seeing, implicating the viewer in the voyeurism that is at the root of Norman's (and our?) psychosis. Hitchcock's is an act of hypermediacy."

Professor Stephen Dobson offers a very useful chart to explain immediacy vs. hypermediacy:

Chart here.

My immediacy example: my heart is racing and my palms are sweating while watching a scary movie. This is a synthesized experience of reality -- there are actors and scenery but my emotions from the experience are very real. I feel like I'm there.

My hypermediacy example: I'm marveling at a Pixar movie -- I can't believe how real it looks. I identify that a medium is present and I am having this experience because of the qualities of the medium. I am reminded that the sense of "reality" I'm experiencing is mediated.

Monday, October 5, 2009

Pew Internet Report -- A Portrait of Early Adopters

Report here.

This report explores the qualities of longtime internet users and what initially brought them online. Basic priorities haven't changed -- the initial desire was to connect with friends and colleagues and explore the new "cyberworld." Today, this group continues to connect, although much of the activity is now on social networking sites; this group continues to explore, although more aggressively as they upgrade to broadband and wireless.

Many respondents (51%) cited personal reasons for going online, while only 31% said work was the cause and 19% said school was the cause. Some of these personal reasons included searching for assistance with health issues, education and job training, or classes. These people, rather than choosing face-to-face interaction, public libraries, or newspapers for this information, chose the internet -- a medium that could link them to the most people and sources.

Additionally, most respondents reported that they were originally only "consumers" of the internet -- they got their news, conducted research, downloaded software and emailed friends on the internet. However, as they became more familiar with the online environment, easier online tools became available, and faster connections arrived, they began to create content: publish writing, share photos, rate products, tag content. These folks didn't wait to learn the "proper use" for the internet -- they jumped right in, made it their own, and created their own rules. The medium is influenced by its users like no other.

This particular report interested me because in 1993 I, at the tender age of 13, considered myself to be on the cutting edge of the world wide web. My parents, on the recommendation of my techie uncles, allowed me to have an AOL account and I spent an embarrassing amount of time on it. My mind was immediately blown. I could communicate with people across the country in real-time in chat rooms. I could take online classes in genetics (I know, I was a pretty popular kid). I could read the newspaper. I COULD BUY THINGS. At 16, I was setting up my own message boards and had a (feeble attempt at) a website.

I wasn't just consuming. I was CREATING. This sixteen-year-old had a lot of things to say! And people from all over the world could read it! I couldn't believe how important I was. Hey, friends, could you just email that information to me? Oh, you don't have email? Sheesh.

I'm not terribly surprised that these longtime users first hopped on the internet for personal reasons. There wasn't much to do that was work- or school-related -- plenty of workplaces and schools did not even have an online presence. It wasn't until I went to college in 1998 that we were communicating with email and even then, our email technology was still fairly young.

My communication priorities have changed a lot in sixteen years. Back then, I was interested in connecting with anyone and everyone and discussing anything and everything. A 45-year-old woman in Edina, MN who wants to discuss Springsteen's "Nebraska" album? Let's chat for a few hours. An email from a Boston College student who wants to talk about their chances against Notre Dame? Hi there, pen pal. Now, my connections are heavily guarded. I would never add a stranger as a Facebook friend. I don't care if you happen to like the same music as I do, MySpace user, leave me alone.

Maybe I feel like I don't have the time to put into developing these internet connections. Maybe I'm just not much of a networker (although I really should work on it). I blog and I tweet, but I rarely consider the non-friends out there who are consuming my "product." I've come a long way from that sixteen-year-old who had a lot to say...and maybe there's something a little sad about that.

Monday, September 28, 2009

Critique of McLuhan's Technological determinism viewpoint or lack of one thereof -- Mentor Cana

Article here.

Mentor Cana notes two criticisms of McLuhan's ideas:

1. McLuhan excludes the process of technology innovation and social constructionism of media technologies in his argument

2. McLuhan oversimplifies the medium-message argument with the content being excluded in most cases. This leads McLuhan toward a "conclusion that media somehow have a life of their own independent of the context and social structures, thus attributing hegemonic like processes and properties to media technologies." The properties of media are manifestations of attributes that have been embedded within themselves as a result of a more complicated process of the play of context, socio-economic, and political factors.

The medium is the message.
The medium is kind of the message?
The medium is also a message.

Well, that I can hop on board with.

I had a hard time grasping that the technology had so much power and its effects had little to do with people and societies and their attributes and beliefs. Technologies indeed have social and political effects, but it would be wise to examine by whom they were developed. What social and political structures were in place when they were created? So Cana makes more sense to me.

One aspect of McLuhan's body of work that stood out is the idea of people being unaware of the downfalls of technology and defenseless against its growth. On one level, it makes sense -- I consider the sometimes-significant effects of advertising on people. However, I've never been completely in the "poor defenseless humans" and "big bad technology" camp. So Cana's conclusion made a lot of sense to me:

"The conditions under which McLuhan could make lots of sense would be when media and communication technologies become so advanced that they could transparently and in totality, through complex web of advanced sensory probes, absorb into themselves the complex conditions of their environments, and at the same time be able to manifest the same into the environment. In these conditions, medium might be the message, media might posses in and by themselves hegemonic tendencies. But we are not there yet. I don’t believe we’ll ever get there."

And maybe we will get there. But as it is, let's not write off our ability to understand the effects of technology.

Or maybe that's what the machine wants me to think...

Excerpts from Understanding Media, The Extensions of Man -- Marshall McLuhan

Article here.

McLuhan suggests that “the medium is the message because it is the medium that shapes and controls the scale and form of human association and action.” I loosely grasp this, mostly because I view it as how we can't undervalue the importance of how we receive the message. McLuhan would roll his eyes at me, though. I'm not quite there.

He says that content and medium are the same thing. This makes more sense to me through a professional lens where, in marketing/communications, the focus is typically on the what of the message. However, the medium of the message -- web, email, phone call, tv ad, print ad, etc. -- determines the human association and action. Inextricably linked within those media are certain values and notions. If I'm trying to sell you a product, would I send you an email? If data points to the fact that customers are less trustful of email marketing, and are more inclined to look favorably at products that well-known bloggers promote, wouldn't I choose to develop a relationship with a blogger? This "trustworthy medium" is the message.

McLuhan would likely tell me that I'm way off still, much like other analyses of media. I'm not detached enough to determine the social effects of media content since the spell, as he puts it, can occur immediately upon contact. Could he find someone who hasn't been put under the spell? Could anyone really be completely detached from society and able to distinguish the effects of media technologies from their messages? I doubt I'm a good candidate.

Marshall McLuhan: The Medium is the Message -- Todd Kappelman

Article here.

In addition to a brief discussion of McLuhan's thoughts on the impact of technology on popular culture and some of the important terms he coined (media (!), global village), Kappelman dives into the "meat" of McLuhan's concepts -- technology as extensions of the human body and four questions he applied to media.

Kappelman says, "An extension occurs when an individual or society makes or uses something in a way that extends the range of the human body and mind in a fashion that is new." Every extension succeeds in modifying or amputating some other extension. McLuhan suggests that people typically ignore or minimize the amputations.

For example, I used to talk to my parents on the phone almost every day. When my mom figured out how to text, she started to communicate with me almost exclusively through texting. The medium is quick and convenient, but I miss our daily chats and notice that texting doesn't afford the kind of connection a phone call does. This is likely more of a modification -- not a complete amputation -- but I've noticed that there are several friends with whom I communicate solely on Facebook. No more phone calls. No more emails. Amputation? It's sure starting to look that way.

In McLuhan's "The Global Village," he poses a scientific basis for his thoughts through what he called the tetrad. He sought to apply four questions to the endeavors of mankind as a new tool for looking at our culture.

1. What does the medium/technology extend?
2. What does it make obsolete?
3. What is retrieved?
4. What does the technology reverse into if it is over-extended?

If we were to apply this tetrad to the Kindle (digital reading device):

1. Extends our ability to gain access to a larger number of books, magazines, etc.
2. Print publishing could become obsolete.
3. People could retrieve (or re-retrieve!) a love of reading.
4. If over-extended, people could become more isolated.

Monday, September 21, 2009

Towards a Mediological Method: A Framework for Critically Engaging -- Melinda Turnley

Article here.

Turnley describes media convergence as "the ways in which digital technology allows previously distinct media to come together." Content like images, video and text are being merged by anyone and everyone and the combination of these "multimodal elements" is blurring the relationship between producers and consumers (Turnley, 1).

Because of the technological, industrial, cultural and social assumptions behind media, tools must be developed to understand media individually and in their convergence. Turnley's article explores a mediological theory (a method developed by French theorist Regis Debray) through the lenses of semiotics, communication, art history, sociology, linguistics, psychology, philosophy, and more. Seven dimensions are included in the framework of this theory (Turnley, 36):

technological: technical components necessary for medium to function

social: metaphors, images, narratives which circulate in relation to the medium

economic: systems for production which support the development, distribution and maintenance of a medium

archival: material and conceptual components for the reception, accumulation, distribution and retrieval of information

aesthetic: conventions and expectations for form, formatting, design and content associated with a medium

subjective: patterns and expectations related to subject formation, the nature of the self and the positionality of users/audiences

epistemological: assumptions concerning the nature of knowledge, information, truth, intelligence and literacy

I'm looking forward to using this framework this quarter and beyond. Although it's a lot to "unpack" right now, I can see how the categories can help me execute a better multi-dimensional media analysis.

As We May Think -- Vannevar Bush

Article here.

Bush's prophetic 1945 article outlines a postwar environment where information is plentiful but methods for collecting, storing, indexing, and retrieving it are inadequate. He proposes that there must be a shift in scientific efforts from controlling the human physical environment to making human knowledge more accessible. Where's an internet when you need one?

Bush discusses several tools that exist and ones that *should* exist, most of them focused on data recording (image and sound) and storage. He notes that scientists are not the only people who manipulate data and examine the world around them, and that all people can profit from the "inheritance of acquired knowledge." But if only a limited amount of information is available, people cannot use it to its full potential. And because people naturally organize information by association (instead of numerically or alphabetically), new methods must be considered. Bush describes a "memex" -- a computer system implemented with electromechanical controls and microfilm equipment that would permit a researcher to follow and annotate topics of interest, analogous to later hypertext technologies (from Wikipedia).

Some interesting points:

  • The parallels Bush draws between the technology and human anatomy. The technology would be most effective if it could mirror the way brains operate. He poses the technology as an "extension of the human brain" that follows "trails" much like the brain does, but with much more clarity and permanence. What's even more interesting is that this type of indexing has yet to be fully explored -- we still interact with technology, especially computers, largely in unnatural and illogical ways.

  • Speech recognition -- "talk to type": what has kept this from catching on? I play with it on my computer all the time, but I never think to "write" a paper with it. Too inaccurate?

  • How far along are we...really? I used to work for a content management software company whose product enabled users to input, index, store, retrieve, share, route through workflows, and destroy information. Sounds pretty cool. Who wouldn't use that?

    The product was solid. We won many awards and had a partnership with Microsoft. Yay, big money. Except no. Because at the end of the day, content management software is only as good as the users inputting the information, regardless of how easy the input steps are.

    Sure, OCR could help users auto-index items, and fields could be set so information is automatically pulled from certain types of reports and documents. But if Joe in accounting doesn't tag the report as a mortgage note, or Jane in finance doesn't approve the P&L sheet in the workflow and send it to the auditing team, the software is useless.

    In this regard, Bush's proposal for technology to closely mirror the way our brains are wired makes a lot of sense. When we interact naturally with tools -- the big Google search box comes to mind -- we're more inclined to use them more often. But storing and indexing are huge undertakings, regardless of task. Just consider how many times you've created a document that required collaboration. Think about your naming conventions and how you organized the content. Now think about how others try to understand it and make it their own, and you end up with multiple versions of the document. Multiply that by hundreds for some industries -- hundreds of thousands for, say, the research and financial industries.

    Who will organize, index, and generally make sense of this? Not the technology.

Giving up my iPod for a Walkman -- Scott Campbell

Article here.


Thirteen-year-old Scott Campbell swaps his iPod for a Walkman and discusses his experience. In addition to comparing appearance, sound, and convenience, Campbell also contemplates his limited knowledge of technology.

My initial thought was that this kid would not have the patience for the Walkman. The slow rewinding and fast forwarding, short battery life, having to flip the cassette (which he didn't realize until later). Kids (well, to be fair, mostly everyone) these days want immediate gratification -- songs when they want them, no commercials, etc. Most don't have the patience to listen to an entire song -- let alone an entire album -- before they're skipping forward. The absolute highlight for me was when Campbell discovered a "shuffle" feature by rewinding and releasing randomly. Of course!

One interesting thing that I recall about my Walkman is that most of my tapes were recordings from the radio. I didn't have a ton of cash back then, so purchasing tapes was typically out of the question. So I improvised and recorded all of my favorite songs from the radio...which had commercials. What a huge difference between my childhood and the current generation. iPods and DVR -- do they listen to ANY commercials anymore? They may have an attention span of a gnat, but perhaps "big bad advertising" isn't getting to them like they got to me and my peers.

I had to laugh when he said he was "relieved that the majority of technological advancement happened before (he) was born." I'm sure everyone has stopped at one point and thought, "Wow, what more can be done? Teleportation, and that's about it." But the window of time for research and product development is getting shorter and shorter. Gone are the days of products hitting shelves 10-15 years after initial development. Look how far the iPod has come since its initial release in 2001. Capacity from 10GB to 160GB, black-and-white to color screen, three times the battery life, multiple colors, a video camera, wi-fi, and the list goes on and on. There have been dramatic changes in just eight short years. Hold on tight, young Scott Campbell. Technology still has a long way to go.

The Machine is Us/ing Us

This short video highlighting the rise of Web 2.0 and its social and cultural implications was produced in March 2007 -- light years ago considering the rapid pace of technology development. So I'm pretty surprised by its freshness. The presentation of ideas was unique: the author employed blogs, wikis, video and photo sharing sites, RSS feeds, etc., to present information about blogs, wikis, video and photo sharing sites, RSS feeds, etc. These media are enabling anyone to produce and edit information, create videos, share photos, broadcast news, and the list goes on and on.

Wait...anyone?

Yep. Anyone. So that's where it gets crazy.

It reminds me of when I was 12 and AOL was my drug. Hey, this guy wants to chat with me. He's 14 and from Boston. Awesome. Except he's probably 50 and in his mom's basement. That potential for nefariousness still exists, but now the lack of gatekeeping and fact checking poses trouble on a much larger scale. Who's credible? Who are the experts? In a race to break the story first, who even has the time to check?

What are we really teaching the machine? I hope it's not paying close attention to me because I rarely organize my web life. Tagging takes forever. My inbox has thousands of emails and no rules or filters. Ah, a story for another time.