Tuesday, May 13, 2014

Notes and my poster content from the 2014 Distance Library Services Conference

If interested, here are links to my conference notes, and to the poster and its associated brief handout from my poster session at the DLS Conference:
Here's the summary of my poster:

With a staff of four librarians serving a population of 20,000 adult distance and blended learners, providing scalable and accessible information literacy learning opportunities is a challenge for this medium-sized state university. In addition to live online workshops and a self-paced, text-based course, we strive to empower our instructional designers and faculty with easy-to-embed learning objects that can be used to teach or reinforce, through self-paced active learning where appropriate, necessary information and research skills at the point of need. These learning objects are designed to be modular and customizable to fit within the context of any curriculum, course, or assignment.
Online scavenger hunts, information skills self-assessments, ask-a-librarian chat boxes, and video tutorials, some with interactive elements, all play a part in this low-cost endeavor. Objects are primarily created with free online tools (the exception is Adobe Captivate for videos). Each is heavily promoted to faculty, student support staff, and instructional designers as a plug-and-play service that can improve the quality of student assignments and preemptively relieve them of having to repeatedly answer basic, non-course-content questions.

Tuesday, May 6, 2014

Can Students Tell the Difference?


(update: the site this link originally went to, from 2010, is going away, so I'm reproducing the article here)
I posted a brief article on my workplace blog that I figured I'd link to here:
Can Students Tell the Difference?

It discusses the problems students face in identifying the types and usefulness of information they find in online library databases.


Screenshot from Academic Search Complete
What you’re looking at above is a screenshot of three search results from one of the library databases, Academic Search Complete. To be upfront, I doctored the search to get this screenshot in order to illustrate a point: many of our students cannot discern, especially in the online environment, the source of the information they are using, or understand the potential differences in perspective and bias that result from that source. As the image illustrates, the problem we all know exists when students ‘Google’ something (i.e., the wide variation in the type and quality of results, and the difficulty of evaluating that quality) also presents complications for students using a library research database, even one called “Academic Search Complete”!

The problem isn’t necessarily one of students’ lack of attention to the details in front of them (journal title, year, etc.) or not knowing what a scholarly resource is supposed to look like. Those can certainly be contributing factors, but a larger problem is caused by the unifying and democratizing nature of the online medium itself. In most of the larger research databases these days, peer-reviewed journal articles sit alongside trade and popular magazine articles, as well as newspaper articles, conference reports, dissertations and theses, encyclopedia entries, and even e-books. On the surface, as the image above shows, there is little that distinguishes these different kinds of information sources to the untrained eye.

As experienced scholars and expert researchers, we know all the little things that might (or might not) help us determine the source of a piece of information, even through the homogenizing ‘skin’ of an online database. We know how to read critically. We know the basic elements of presenting research results and where to look for author affiliations. The problem is that students are not yet comfortable with the issues, knowledge, or even language of their field of study, nor are they experienced in using scholarly materials and search tools in general.

When you don’t know what questions to ask, you tend not to ask any questions at all.
As a result, students often take (and then quote in their papers) whatever results show up in a search, regardless of who wrote them, when or why they were published, or even whether they directly support their argument. Again, in most cases this is not the fault of the students themselves. The sad truth is that most students entering higher education (everywhere and of all ages, not just our own learners) have not yet encountered the need, or been motivated, to acquire the skills required to critically read, evaluate, synthesize, and, most importantly, create new information. This is especially true now that a majority of that information comes in a wide variety of electronic and multimedia formats.

I don’t have any easy solutions to this issue. The library has taken some initial steps in this battle with our Information Skills Tutorial and our @Home Library Workshops. But these limited and voluntary learning opportunities can only go so far. Successfully and sustainably tackling the urgent need for better information, research, and critical thinking skills in our new and graduating students will probably require a college-wide, and even SUNY-wide, collaborative effort, one that draws on input from faculty, administrators, instructional designers, student support staff, librarians, and more.

My hope is that a growing discussion of these issues here at Empire State College is a good place to start. There are certainly both short-term and long-term things we can do to begin.