Friday, February 5, 2010

#onw2010 Web Traffic & Campus Trends: A Multi-Institutional Analysis

Jon Jablonski, University of Oregon (soon to be UC Santa Barbara)
Robin Paynter, Portland State University
Laura Zeigen, Oregon Health & Science University

The project evolved from the Orbis Cascade Alliance Research Interest Group. All three universities had web traffic data, which raised the question of whether different types of universities would show different traffic patterns. The existing library literature didn't establish comparisons or benchmarks, and Google Analytics' library benchmarks don't indicate which libraries, or what types of libraries, are included.

Many different methods can be used to analyze web logs. Most existing research uses conceptual frameworks, an inductive approach that has so far been applied mostly to OPAC searches.

This project is a transaction log analysis (not a search log analysis). Common measures include number of hits, unique visitors, page views, etc. Remember that hits include each separate component of a page (images, scripts, etc.), which is why that measure has been discredited. Also remember that page views include refreshes, but pages reached via the Back and Forward buttons aren't counted because they're loaded from the cache. Dynamic IP addresses and computers with multiple users complicate the measure of "unique" visitors.
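Not from the talk, but to make those measures concrete: a minimal tally in Python, assuming Apache combined-format access logs and using the client IP as a rough stand-in for a visitor (which, per the caveats above, both under- and over-counts).

import re

# Minimal transaction-log tally: hits vs. page views vs. "unique" visitors.
# Assumes Apache combined-format logs; the IP address stands in for a visitor,
# which undercounts shared machines and overcounts dynamic IPs (see above).
LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+'
)
PAGE_SUFFIXES = (".html", ".htm", "/")  # only these count as "page views" here

def tally(log_path):
    hits = 0
    page_views = 0
    visitors = set()
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for raw in fh:
            m = LINE.match(raw)
            if not m:
                continue          # skip lines that don't parse
            hits += 1             # every request: pages, images, CSS, JS
            path = m.group("path").split("?", 1)[0]
            if path.endswith(PAGE_SUFFIXES):
                page_views += 1   # HTML-ish requests only
            visitors.add(m.group("ip"))
    return {"hits": hits, "page_views": page_views, "unique_ips": len(visitors)}

print(tally("access.log"))   # hypothetical file name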

The good part of web log data is that it is a direct behavioral measure - users can't idealize their own behavior. The bad part is that it doesn't tell you about intention or satisfaction.

Cookies are a good source of data, but users often delete them. Flash cookies are more durable.

RSS feeds aren't counted in web logs unless the user clicks through; campus portals and mobile device versions may or may not be counted. And different analysis packages give different results, because they handle corrupted data in the log files differently.
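A hypothetical illustration of that last point (not any real package's actual behavior): one pass drops every line that doesn't parse cleanly, another salvages anything that still looks like a request, and the two report different totals from the same file.

import re

STRICT = re.compile(r'^\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) \S+ [^"]*" \d{3} \S+')
IP_ONLY = re.compile(r"^\d{1,3}(?:\.\d{1,3}){3}\b")

def count_requests(lines, lenient=False):
    # Strict: the whole combined-format line must parse.
    # Lenient: keep a damaged line if an IP and part of a request survive.
    total = 0
    for line in lines:
        if STRICT.match(line):
            total += 1
        elif lenient and IP_ONLY.match(line) and '"' in line:
            total += 1
    return total

sample = [
    '10.0.0.1 - - [05/Feb/2010:09:15:02 -0800] "GET /index.html HTTP/1.1" 200 5120',
    '10.0.0.2 - - [05/Feb/2010:09:15:07 -0800] "GET /about.html HTTP/1.1" 200 3211',
    '10.0.0.3 - - [05/Feb/2010:09:15:11 -0800] "GET /catalog HTTP',   # truncated line
]
print("strict :", count_requests(sample))                  # 2
print("lenient:", count_requests(sample, lenient=True))    # 3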

Public terminals in the library with the library page set as the browser home page inflate the page view count.

What are the key performance indicators for library websites? Most current benchmarks are geared toward commercial sites and don't map well onto libraries. For example, is a short time on the site good, because it means people found what they needed, or is a long time good, because they're more engaged? More unique visitors might be the better target for September, and more return visitors in April. Different library staff have different interests: designers might want to know which browsers are being used, while librarians are more likely to want to know what paths users take.
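Part of why "time on site" is slippery is that it has to be computed from sessionized requests. A rough sketch of one common approach, using a 30-minute inactivity timeout and IP as the session key (both assumptions of mine, not anything the presenters described):

from datetime import datetime, timedelta

TIMEOUT = timedelta(minutes=30)

def session_durations(requests):
    # requests: iterable of (ip, timestamp) pairs, assumed sorted by time.
    open_sessions = {}   # ip -> (session_start, last_seen)
    durations = []
    for ip, ts in requests:
        start, last = open_sessions.get(ip, (ts, ts))
        if ts - last > TIMEOUT:          # gap too long: close the old session
            durations.append(last - start)
            start = ts
        open_sessions[ip] = (start, ts)
    durations.extend(last - start for start, last in open_sessions.values())
    return durations

reqs = [
    ("10.0.0.1", datetime(2010, 2, 5, 9, 0)),
    ("10.0.0.1", datetime(2010, 2, 5, 9, 12)),
    ("10.0.0.1", datetime(2010, 2, 5, 11, 0)),   # new session after a long gap
    ("10.0.0.2", datetime(2010, 2, 5, 9, 5)),
]
durs = session_durations(reqs)
print(sum(durs, timedelta()) / len(durs))   # average "time on site" per session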

OHSU logs show that the number of new users per month increased steadily from 2005-2008, peaking each year in February and May.

This project's data only includes basic web traffic - not the OPAC, digital collections, databases, or institutional repositories. This highlights that library websites are portals to other types of sites. For example, the UO Library homepage has 42-47 links, 6 of which lead off the site (the catalog, the main UO page, etc.).

Comparisons Across Institutions (2008-2009 academic year)
The three institutions have very different traffic calendars. UO and PSU have almost the same enrollment, but UO has much more traffic (nearly 2 million a month versus around 325,000 a month). UO's traditional student body produces more extreme peaks and valleys around spring and winter breaks. OHSU is steadier throughout the year.
Comparing this data to research on journal use shows similar patterns in time and across institution type.
Comparing visits by day of the week shows more similarity: peaks on Monday and Tuesday, declining steadily to a nadir on Saturday, with an uptick on Sunday. (A rollup of this kind, along with the top-level-page share below, is sketched in code at the end of this section.)
At PSU, 79% of traffic was to top-level pages (home, "About", etc.); at UO, it was 57%; at OHSU it was only 14%. OHSU has a very long tail of content: the pages deep inside their website get much more of the traffic. PSU's pattern reflects an active effort to keep needed information within three clicks. The difference could also be due to the number of public terminals with the library's page set as the home page: there are many more of these at PSU and UO.
Over the quarter, PSU has high use early in the quarter, drops to nearly nothing at mid quarter, and peaks at the end of the quarter. OHSU is very consistent across the quarter. UO shows consistent use across the quarter, except for graduation week and the week after.
Some of the difference may be due to different software analyzing the raw log data.
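Neither of those tallies is complicated. A small sketch (mine, not the presenters' code, with a hypothetical TOP_LEVEL list and record format) of how the day-of-week and top-level-page comparisons could be computed from parsed page-view records:

from collections import Counter
from datetime import datetime

TOP_LEVEL = {"/", "/index.html", "/about/", "/hours/"}   # hypothetical top-level pages

def compare(records):
    # records: iterable of (timestamp, path) pairs, one per page view
    by_weekday = Counter()
    top_level = 0
    total = 0
    for ts, path in records:
        by_weekday[ts.strftime("%A")] += 1
        total += 1
        if path in TOP_LEVEL:
            top_level += 1
    return by_weekday, (top_level / total if total else 0.0)

wk, share = compare([
    (datetime(2009, 10, 5, 10, 0), "/"),              # a Monday
    (datetime(2009, 10, 5, 10, 5), "/find/maps/"),
    (datetime(2009, 10, 10, 14, 0), "/about/"),        # a Saturday
])
print(wk.most_common(), round(share, 2))

Run over real logs, the share is the figure reported above: roughly 79% at PSU, 57% at UO, and 14% at OHSU.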

Conclusions
This can point to needed services. For example, PSU had a lot of hits on its map finding aid, but has no map librarian.
UO showed a lot of use of its electronic Asian books. Are there similar hidden gems?
This could also be a guide to service provision - for example, mid-quarter might be a fine time for PSU librarians to take a vacation.
