16 February 2007

(Uninformed) Thoughts on GUIs

Unlike other members of this blog, I have little to no actual technical knowledge. But I'm still a techie of sorts (early adopter, and so forth), and since this is a blog I figure my credentials matter not at all. So here are recent musings on user interface design.

It strikes me that when I read about UI as something people are thinking about on the meta level, the writing tends to take a rather narrow perspective by trying to judge which particular UI is 'better', 'easier to use', 'more efficient', or 'more intuitive'. Those are all important questions to ask, but I was struck this week by a different dimension of UI and UI design. Putting aside which UI is 'better', there's still the question of what the UI encourages you to do and what it encourages you not to do.

This musing came about when I was sitting on the sofa earlier this week and thought, 'damn OS X rocks!'. This spontaneous head-thought came about because at that moment I was doing the following:
  • ripping a DVD in HandBrake, which requires intense video encoding
  • copying a huge file over the WLAN to the Mac mini
  • copying another file over an SSH tunnel to my office computer
  • exporting a video clip from MPEG Streamclip
  • working on a huge Keynote presentation (filled with video clips)
  • checking my email in the background
  • actively typing something in Pages
(Of course there were a dozen other apps open that I wasn't using, and quite a few *nix processes running of which I almost always remain happily unaware.)

So by 'OS X rocks' I meant to express how impressed I was that both the OS and the processor could handle all this without making me think, even for a moment, that I'd have to wait for the machine.

But, of course, this is complete OS X chauvinism: any modern multithreaded operating system (OS X, XP, Vista) running on the Intel Core Duo processor could do all of this. It was not necessarily something special about OS X.

However, I then got to thinking about how my XP friends and colleagues actually use their machines, and it seems to me that they are very unlikely to really take advantage of their Core Duo chip or their multithreaded OS. Indeed, just the day before I had watched my utterly computer-literate colleague interact with his XP machine. Seriously, this guy understands computers far better than most. But like everyone I watch use XP, he interacts with every app in full-screen mode. And he never really switches between apps; instead, he constantly clicks the little icon that sends the window down to the bottom of the screen. The idea of leaving an app with five windows open just sitting there while sliding over to a different app is a thought he's never had. In this particular session we were looking at a few different Word files that I had emailed him, and working out a structure in a separate Word doc. We'd be working on the main doc, and every time he needed one of those other Word docs he would do the following:
  1. Switch to MS Outlook
  2. Find the email I sent him
  3. Click on the zip attachment
  4. Answer yes to unzipping each of the 6 files
  5. Select the file he wanted and have it open in Word

My very much speculative point is this: isn't it the case that the setup of XP encourages this type of time-wasting behavior and discourages more sophisticated multitasking? Every window wants to be full screen. App switching (as in, through the Dock or Command-Tab in OS X) doesn't seem readily available, as far as I can see. Users are encouraged to think about their computer the same way they thought about it in DOS. That's sad.


Transient Gadfly said...

I think not only are you correct here, but that it is a unique success of OS X. Red Hat gives it a good go, and it's better than MS/Windows, but it still encourages you to run one application in full-window mode. Here at work (where, ironically, I build UIs) we have solved this problem by having a work environment with three monitors. It's awesome, don't get me wrong, but when I pick up the laptop and go home I find that I've learned to work with multiple displays and can barely function with only one (of course, the laptop is a Windows-running pile of crap that leaks memory as if it were going out of style, but that's another story). Then I pick up my PowerBook and take it down to the basement, where I'm simultaneously recording an epic '70s rock-and-roll album and iChatting with my spouse who's upstairs, while lyric sheets and notes are open in Pages. Then I weep for the beauty of the current state of technology. Well, at least the current state of some technology.

Periapse said...

I got a MacBook last year. It was my first Apple computer since my 64K Apple II+ in 1982. OS X actually took some getting used to for me. Naturally, what I missed the most was the right-click context menu. My appreciation for OS X came not from the UI but from how smoothly it became a part of my existing home network. With all versions of Windows to date, networking has been a complete pain, and something I never can get to work exactly the way I want. My MacBook required none of that. It found my wireless network, got the security key from me, and then just worked everything out. It can see the HDs of my Windows boxes, and has no problem connecting to my home stereo through the wireless network.

But I do agree there's something appealing about the OS X UI itself. For me it's a conceptual difference. With MS, I'm always very much aware of what processes and applications are running (and watching out for rogues). With OS X I never have to think about it. Several times I've switched to some app I haven't used in over a month and it's still there, with whatever I was doing then. I had no idea it was "running" all this time, nor did I need to know it.

In all fairness, my dual-core XP box can multitask just as well. I'll commonly be in Second Life while it's recording something off of cable, and both SL and the video are fine. But I do find that with a 1280x1024 display resolution my apps tend to require most, if not all, of the screen.