User Experience, Not Metrics

This first article in Scott's series opens with a question: How many times have you surfed to a web site to accomplish a task, only to give up and go to a different site because the home page took too long to download? "46% of consumers will leave a preferred site if they experience technical or performance problems" (Juniper Communications). In other words, if your web site is slow, your customers will go. This is a simple concept that every Internet user is familiar with. When this happens, is your first thought, "Gee, I wonder what the throughput of the web server is?" Certainly not. Instead, you think, "Man, this is SLOW! I don't have time for this. I'll just find it somewhere else." Now consider this: what if it were YOUR web site that people were leaving because of performance?

Face it: users don't care what your throughput, bandwidth, or hits-per-second metrics prove or don't prove; they want a positive user experience. There are a variety of books on the market that discuss how to engineer for maximum performance, and even more that focus on making a web site intuitive, graphically pleasing, and easy to navigate. The benefits of speed are discussed, but how does one truly predict and tune an application for an optimized user experience? One must test the user experience firsthand! There are two ways to accomplish this. One could release a web site straight into production, collect data there, and tune the live system, with the great hope that the site doesn't crash or isn't painfully slow. The wiser choice is to simulate actual multi-user activity, tune the application, and repeat until the system is tuned, all before placing the site into production. Sounds like a simple choice, but how does one simulate actual multi-user activity accurately? That is the question this series of articles attempts to answer.
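
To make the idea concrete, here is a minimal sketch (not from the article) of what simulating multi-user activity can look like. A pool of threads stands in for concurrent users hitting a site, and the script reports the page-load times a user would actually experience. The URL and user count are hypothetical, chosen for illustration only.

```python
# Minimal load-simulation sketch: spawn N concurrent "users" against a
# hypothetical URL and record each simulated user's page-load time.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://www.example.com/"   # hypothetical target site (assumption)
USERS = 25                        # simulated concurrent users (assumption)

def visit(_user: int) -> float:
    """Fetch the page once and return the elapsed time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=30) as response:
        response.read()           # download the full page body
    return time.perf_counter() - start

# Run all simulated users concurrently and collect their timings.
with ThreadPoolExecutor(max_workers=USERS) as pool:
    timings = sorted(pool.map(visit, range(USERS)))

# Report what a user actually experiences, not server-side throughput.
print(f"fastest: {timings[0]:.2f}s  "
      f"median: {timings[len(timings) // 2]:.2f}s  "
      f"slowest: {timings[-1]:.2f}s")
```

Even a rough simulation like this surfaces the numbers a visitor actually feels (fastest, median, and slowest page load under concurrent activity), which is exactly the user-experience view that raw server-side metrics obscure.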
