By Bonnie Brubaker Imler, Library Director, Robert E. Eiche Library, Penn State Altoona
I was asked by the Libraries Assessment and Metrics Council to give a summary of
a presentation I made with my research partner Michelle Eichelberger at the 2011
PaLA Conference entitled “Undergraduate Research Behavior: Do We Really
Want to Know What They’re Doing Out There?”
Michelle and I became interested in studying undergraduate research behavior because we were curious whether the research confusion we were seeing at the reference desk was widespread among Penn State Altoona undergraduates or limited to isolated cases. Our challenge was to capture student research behavior in a form we could revisit and review repeatedly, without influencing the students’ behavior in any way.
After attending a University Park seminar on the video coding software Studiocode, I realized that we could use a
combination of screen capture technology and coding software to study student research skills in a whole new way.
Video screen capture allows the student to search in a private setting without the researcher present, reducing the chance that the researcher will influence the student’s actions. It records all mouse clicks, mouse movement, typing, and highlighting. Because it is a true recording of the session that can be replayed as often as needed, researchers can mine the data for comprehensive and varied findings. Screen capture videos can also be coded, or marked, for common choices and behaviors. Once every video has been coded, the data for each shared trait can be pulled from the videos and compiled in a spreadsheet or chart.
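To make the coding-and-aggregation step concrete, here is a minimal Python sketch of how coded behaviors might be tallied across participants. It is an illustration only: the file name, column names, and behavior labels are hypothetical stand-ins, not the actual export format of Studiocode or any other coding software.

    # Minimal sketch, not the authors' actual workflow: tally coded
    # behaviors from screen-capture sessions into a summary table.
    # "coded_sessions.csv" and its columns are hypothetical; assume one
    # row per coded event, with participant_id and behavior columns.
    import csv
    from collections import Counter, defaultdict

    per_student = defaultdict(Counter)  # participant -> behavior counts

    with open("coded_sessions.csv", newline="") as f:
        for row in csv.DictReader(f):
            per_student[row["participant_id"]][row["behavior"]] += 1

    n = len(per_student)
    if n:
        # Example shared traits: completing the task vs. printing abstracts.
        full_task = sum(1 for c in per_student.values()
                        if c["printed_full_article"] >= 5)
        any_abstract = sum(1 for c in per_student.values()
                           if c["printed_abstract"] > 0)
        print(f"printed five full articles: {full_task}/{n} ({100*full_task/n:.0f}%)")
        print(f"printed an abstract instead: {any_abstract}/{n} ({100*any_abstract/n:.0f}%)")

A per-participant table like this can then be exported to a spreadsheet or charting tool for the kind of cross-participant comparison described above.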
Over the past three years, Michelle and I have used this technology to study student research behaviors in two separate studies. Our first project, with 40 undergraduate participants, examined their use (or non-use) of the Get It! button in ProQuest. Students were instructed to select and print five entire articles from a canned ProQuest search. Ultimately, 62 percent completed the task by printing five entire articles. However, 38 percent printed at least one abstract in place of the full article, and 10 percent printed only abstracts.
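As a purely illustrative aside, the counts below are inferred from the published percentages rather than taken from the study data, but the reported figures are consistent with whole-student counts out of 40:

    # Back-of-the-envelope check: approximate participant counts implied
    # by the reported percentages, assuming each percentage was rounded
    # from a whole-student count out of 40.
    N = 40
    for label, pct in [("printed five full articles", 62),
                       ("printed at least one abstract", 38),
                       ("printed only abstracts", 10)]:
        print(f"{label}: ~{round(N * pct / 100)} of {N} students ({pct}%)")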
We were concerned about the almost 40 percent of students who either couldn’t find the full text of an article or didn’t know the difference between an abstract and the full text. We also noted that students had a great deal of trouble using the Get It! process to find the full text of the articles they wanted; in many cases, this was because of the poor design of the databases to which they were linking. To study this phenomenon more thoroughly, we selected five of the more complicated databases and recruited 40 more students to help us find out whether the databases were as difficult to use as we suspected. During this second study, we also quizzed the students on their understanding of the terms “abstract,” “fulltext,” and “pdf.” This time, only 25 percent of the students completed the assignment successfully by printing five entire articles, confirming our suspicion that the databases can be confusing and difficult to use.
In short, our research has shown that screen capture technology works well for studying undergraduate research behavior, that students need help deciphering the terminology databases use, and that we need to be better advocates for improving database design and consistency.