Yesterday marked the opening of a new session of Infopeople's CORE Reference Fundamentals online course, which I've been teaching twice a year since Carole Leita retired in 2006. One of the six modules in which participants learn, practice, and refine their basic reference skills focuses, of course, on resource evaluation. While many of us longtime practitioners don't question the ethical importance of bypassing assumption and ease to establish authority, purpose, scope, audience, and timeliness, how well do we incorporate all of those evaluation facets when we consider statistics and reports based on statistical analysis? Technology and the marketplace offer us ever more ways to gather, disseminate, and apply statistics in library work. In the US, government statistics (both federal and state) are plentiful, reflecting health, civic, educational, and commercial attainments and aspirations. And we have a host of vendor possibilities through which we can develop, share, and manipulate library-specific statistics toward improving our professional and institutional efforts to provide good service.

And statistics don't lie, from a mathematical perspective. They can be derived to represent aspects of the reality in which we live and work. But do we forget to question the relevance of statistical information when considering matters for which seemingly related data sets are so close at hand? Probably the most important question to ask as we refer to statistical reports and analyses of them is: "Was my real question asked in the collection of this data?" (Ah, recognizing the "real" question, another essential of basic reference work!)

Back when I was in library school 83 years ago, I was lucky enough to have a couple of excellent reference profs who approached this need to examine resources from a couple of viewpoints. There was Michael Ochs, who reminded us, as often as necessary, to focus on the information wanted, not on the availability of resources that were in the same ballpark or used the same vocabulary but didn't recognize the "real" question. A. J. Anderson, who taught, in addition to that crackerjack intellectual freedom course where we wound up in a real live federal courtroom as a real live censorship case unwound, also dealt with government documents. He offered us an example that rings true for me still as I examine any statistical set or reported analysis of it:

We looked at how the Statistical Abstract of the United States reflected, across time, the presence of mothers in the workforce. Up to 1961, what was reported was not female labor force members generally but working mothers specifically; after that, the question about mothers in the workforce disappeared for some years, folded into the broader female labor force figures.

Authority? Yes. Purpose? Yes. Timeliness? Well, yes, as good as we could get back then. Audience? Certainly. But scope? No. Back to Michael Ochs: because this question about mothers in the workforce had not been asked, and so those statistics not collected, we needed to seek an alternative resource--perhaps not one as legendary as the Statistical Abstract--if the question really was to provide information about the relative prevalence of working mothers in the US in 1965. Providing an answer for 1961 was to leave the "real" question behind.

What does this have to do with how we analyze library life using the panoply of statistical help we have now? It's a reminder that, to be accurate in your investigation, you should form the question you want to research around your information need, rather than adjusting what you need to know to fit the resources you can reach most readily.

How does that matter in the "practical" world? A couple of days ago I read an article, replicated in this week's American Libraries Direct online newsletter, that dramatically announced that over 1,000 library jobs are to be lost in the UK (a nation with which we share similar library values, where there are considerably fewer libraries, and where the library economy is in worse shape than it is here, even now). Dire indeed. BUT, as it turns out, the job title and description list portrays "library work" as something that has very little in common with contemporary library work, while the entry under "information specialist" suits the dynamics of contemporary "library work"--and the same resource spelling doom for "library workers" by the numbers projects glowing growth, again statistically, for "information specialists." What was the real question? Was it "Are library jobs disappearing?" or "What's the job outlook for jobs utilizing library skills, including jobs in libraries?"

Think about what the statistician brought to the table in terms of questions asked before applying numbers that maybe don't lie, but don't reflect the real answer either.

Comments

Comment: 
83 years ago, huh? Feels like that for me sometimes, as well. Nice overview of the process. And I remember the Stat Ab very well.