Friday, May 22, 2009

UI's - Full Circles

Note: I'm utterly swamped, so doing a repost of an article I wrote a while back that I've always been fond of. Here's to hoping next week is calmer!

In the beginning, there was the command line, and we were grateful for it, darn it. A single blinking underscore in orange or amber or green (depending on what strange theory about eyestrain was in vogue at the time) was all we needed to get our computers to do what we needed them to do.

My first foray into computers was during this time. I cut my teeth on the Commodore 64, a beige box filled with strange churning sounds that allowed me to program and play games made simply of words and my own mad stabs at the game's internal dictionary. A few of my games had images, but nothing that could be called an interface, per se. They were usually badly pixelated depictions of trolls trying to crack open my skull before I could save the maiden fair.

Then came the GUI. Graphical user interfaces (GUIs) first appeared for the general public with the Apple Lisa and Macintosh. Instead of having to remember commands like CD (change directory), LS (list all files), or PS -A (list all processes), you found what you needed in the interface. Instead of CD MYDOCU~1, you clicked on "My Documents". Instead of LS, you simply looked at the contents of the folder, displayed in a handy box. PS -A? There was a handy tool for that. The command line as a common tool was dead.
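For anyone who never lived at a prompt, here's roughly what that looked like typed out. This is just an illustrative sketch, one DOS line and two Unix lines; the MYDOCU~1 short name is the example from above, not something you'd need today:

    REM DOS: change into the My Documents folder by its 8.3 short name
    CD MYDOCU~1

    # Unix: list the files in the current directory
    ls

    # Unix: list every running process on the machine
    ps -A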

Where GUI's removed obfuscated commands from the user experience, they replaced them with something the average person could understand. People outside of the usual circle of technophiles started to use computers. Businesses brought them in, as it no longer took a separate degree to be trained on them. The machine, created by us, had started to change us, how we worked, how we thought about computers.

GUI's were not without problems, though. Problems usually arise when engineers alone design the interface. The melt-down at Three Mile Island was partly the result of exactly that kind of design. Two dials that had to be monitored simultaneously were placed on opposite ends of the room. If there were two technicians on hand, they had to yell out the readings to each other. If only one was on duty, he or she had to run back and forth from one dial to the other. To an engineer, where the dials were placed made sense. It simply didn't make sense once you added the human element.

While a faulty interface in a computer program has never led to a nuclear melt-down, it can lead to endless frustration for even the most technical of users. In the early days of graphical programs, standards hadn't been settled on by the various competitors, and with good reason: each hoped to get customers so used to its style that they would never make the switch. A famous rivalry was that between Microsoft and Corel. Both produced a set of expensive office tools used for document and spreadsheet creation, and the core features were nearly identical. In other arenas, the battle for dominance might have taken place in advertising, with clever slogans and campaigns to keep users hooked. With Microsoft and Corel, however, the battles took place in the UI. They moved common elements, such as word formatting and options, around just enough to make the switch daunting to the average user.

Once the initial rush to claim users was over, standards began to evolve. We based icons and organization on things from our everyday lives: folders to hold things, magnifying glasses to look for things, images of disks to indicate saving things. As time passed, users and developers came to agree on standard icons and terminology: an image of a disk meant saving what you were working on, not opening a saved item.

Having an interface, though, started to change the physical appearance of the computer. Monitors, in the days of amber screens, didn't have to be high resolution, or even very large. A nine-inch monitor wasn't unheard of. Color was rarely necessary, and when it existed, it was usually only in the most basic of primary and secondary tones. An interface done only in blocky ROY G. BIV is painful to use, so higher resolution monitors started to become the standard. In order to render these higher resolutions, better processors became necessary.

Not only were existing elements upgraded, but new pieces of hardware were added. The mouse became absolutely vital to navigating a visual interface. Where before you found your way through a file system semantically, by file paths, you now did it visually, by remembering what you put where. Instead of keystrokes, you had clicks.

A funny thing happened with adding the mouse. People started complaining that their wrists hurt. The first reactions, in the mid- to late eighties, were derisive; "mouse arm" became a running joke around many agencies. To those who had developed it, it wasn't nearly so funny. We were used to the idea that sitting in a chair for eight hours straight might hurt our backs that evening, but it had never occurred to us that using something might actually damage us. Ergonomics was beginning to enter the scene. Gel pads found their way onto desks, mice were reshaped to fit the hand better, and studies were conducted to find the optimal positions for making sure we weren't hurting our backs, hands, or eyes. As for those who had already developed what came to be known as carpal tunnel syndrome, many resorted to braces around their wrists. When that didn't help, they often needed surgery.

In the past ten years, a curious thing has been happening with interfaces. The humans using them are beginning to push back. First, engineers thought they knew best how to display data. Then it was human factors psychologists. What became clear, however, was that users wanted to define their own experience. Applications with a static tool bar began to lose favor to those that gave the user the most choice in where elements were displayed, and which ones they could toss entirely.

The first time I saw a custom interface was when I was introduced to the Plus! package from Microsoft. It seemed a cute way of customizing the way your desktop looked, linking backgrounds, icon sets, and color schemes. As I looked around for new themes to install, I found the usual gamut of cartoon characters and puppies, but I also found something interesting: themes based on working. One used a background where you could sort files and folders into color-coded categories. File types were color-coded blobs that were easy to find in a visual scan.

As the years passed, I noticed more products coming out that allowed a user to customize their experience. Products like Konfabulator and Google Desktop not only allowed a user to change how their desktop looked, but what was displayed there. Little portlets could display the weather, headlines from a news source, or the latest entries from a blog.

Up to this point, customization seemed limited to serious applications, like word processors and spreadsheet managers. A few less-serious areas had grabbed onto customization technology, like RSS feeds and blogs, but things like games remained locked into whatever a designer had decided back in development. This all changed with a game called World of Warcraft.

World of Warcraft is a massively multiplayer online game (MMO), where people level up avatars by killing rats, then ogres, then demi-gods (with a few steps in between, naturally). It wasn't the first to do this. Earlier games, such as EverQuest, Ultima Online, and Dark Age of Camelot, worked along the same lines, and each had a reasonable player base. Then Warcraft came out, and sales skyrocketed. People not only bought it, but played it, and kept playing it.

My husband had to talk fast to get me to play another MMO. I'd left the last one in disgust and sworn never to play another one again. He assured me that Warcraft would be different. After installing it, he went to work on my interface. Blizzard, Warcraft's creator, had opened up a tool-set that let users create custom interfaces for the game. Users then turned around and posted them. I was able to install add-ons that showed me the information I wanted, how I wanted it. I was a damage-dealer, so I wanted data on how hard I was hitting. I could get that as floating numbers while I hit something, then as a print-out after the fight was over. My husband wanted a map he could mark up however he wanted, noting where he found cool things, where the neat views were, and where a group of us were meeting up.

While advertising and buzz got people to the store to buy the game, they didn't make them continue to play (paying a monthly fee all the while). The other games had content. They had dragons and neat gear to wear. What they didn't have was a way for users to control what they saw and how they experienced the game.

One intriguing result of the add-ons was how they began to influence the game itself. As more dungeons were created, more encounters were not only made easier by the add-ons, but seemed to require them. One popular modification was Decursive. When a person in your group became cursed, certain classes had the ability to remove that curse. Before Decursive, this took constant scanning. With the mod installed, a box would pop up showing the affected character. Click the box, and the right spell went off, curing him or her. After Decursive became popular, the dungeon designers at Blizzard started adding creatures that, instead of sending out curses one at a time, would afflict the entire group or raid. Curing them all would be nearly impossible without Decursive installed. The interface was now not only changing how the user interacted with the game, but changing how that game was further developed. Not only were the humans pushing back, but the machine was responding.

It has taken time for designers and engineers to let go of the idea that they know what users need most. As our ability to design interfaces grew, so did the studies trying to discern how to capture the attention of the most users. Were animations helpful, or harmful? What colors were best for indicating that something was a link? What part of the page is best for getting someone's attention? How can we affect how much a user comes away with? Any time a study tried to answer one of these questions, the researchers usually came away with an option that tested well, but certainly didn't cover the entire subject pool they had studied.

The recent release of Google's personalized homepage works on the premise that one answer will not suit everyone. Previous portals, such as Yahoo's circa 1998, only allowed a set number of items to be shown, and all had to be on the same page. With Google's portal, users have the ultimate flexibility: they can choose content and placement, and even group items in ways that make sense to them. Users can even create their own custom portlets, then share them for others to use. In my office, most of my coworkers have Google's portal as their homepage, but everyone uses it differently. One groups different news sources on different pages. Another keeps track of blogs and local events. I have weather, comics, and a few news feeds to keep me current. When I was a user of Yahoo's portal, I knew of almost no other users. Now, everyone I know seems to use some variation of Google's homepage.

This cycle of pushing back is showing up in another way: it's encouraging people to become more technical in order to get what they want. While most will never pick up a programming language, more people every year seem to know what an RSS feed is. For those who do know how to program, user communities are expanding around popular products, like bulletin board software and content management systems. Ten years ago, most of their members were computer science graduates, or people who had been in the industry for years. Today, online guides and "Dummies" books let nearly anyone learn to code, and the communities are made up of professionals alongside people who only picked up a book because they wanted their board or CMS to do something and couldn't find anyone else who had already done it.

Indeed, in a few small ways, we're almost coming full circle. I was in one of my clients' offices a few weeks ago. He wasn't the most technical of customers. Though brilliant, he had trouble with his laptop on a daily basis. I was there to find out why syncing was taking him so long.

"Can you bring up your task monitor? Go to the Apple icon--"

He cut me off. "Oh! I found another way you can do that!" He opened up a command line terminal, then pecked out ps -A. He hit enter, and a list of his current processes popped up, complete with how much of his processor each was eating. "Isn't that clever?"

"Boy," I said, "Wait until I show you grep."
