I applied for a HuMetricsHSS Community Fellowship because I had been dissatisfied with the way my university recognized and rewarded certain sorts of academic work that are essential to our institutional identity but either blur the boundaries between traditional categories of academic personnel assessment or defy ready documentation in our academic personnel system. The two areas about which my concern was most acute were academic work that engages broad public audiences outside the academy in humanistic research and the widely variable faculty contributions to a campus and college goal of fostering a diverse, equitable, and inclusive academic community. Greater visibility in both these areas is, I think, essential to meeting my university’s public service and civic engagement missions, to demonstrating the general value of the humanities in and to a society that is more openly enamored of science, engineering, and technological development, and to walking the walk of a research university that proudly touts its undergraduate student population as one of the country’s most diverse and most successful (#1 in upward mobility again this year!). The challenge of recognizing and rewarding this work still faces my campus, and my experience with HuMetricsHSS enabled me to develop perspectives and tools that I think are helping me contribute to the collective work of addressing it in both areas. But for the sake of brevity and clarity, I decided to focus here just on the first – the challenge of public humanities.
As a faculty member and then Chair of the History Department at UC Riverside, I had long been concerned both about the individual career advancement of my public history colleagues who are especially outstanding at this work and about the negative effects on our collective morale of an institutional failure to recognize and reward public history as a direct contribution to the university’s research enterprise. When I became Associate Dean for Arts and Humanities and began reviewing the advancement and promotion files of all humanities and arts faculty, I quickly realized that this challenge was not unique to History but common across the humanities. As I saw it, the main obstacle to be surmounted was translating the nature of public humanities research into terms legible to the traditional metrics of research assessment at a research university whose personnel system was designed to recognize and reward the patterns of research productivity typical of the lab sciences. Our system is flexible enough that it also works pretty well for social sciences and humanities research that fits traditional modes of scholarly dissemination, but it starts to struggle the further we get from lab or field research disseminated through scholarly publication. Since over the years I had often seen public history work discussed in glowing terms as a remarkable service contribution – but (implicitly) not the research activity that drives faculty advancement at our research-intensive university – I hypothesized that a key to translating it for our system was to make the mechanisms of peer review that operate outside traditional academic publishing more visible. The goal of my project would ultimately be to develop metrics to help us critically assess public humanities research.
I am not a public historian myself, and though I have enjoyed the rare opportunities in my career to bring my esoteric research findings to more general audiences, doing so does not animate my own research program. For evaluating my own research, traditional metrics should work just fine. I therefore didn’t have the personal investment and experience that might have provided a solid starting point for the project. But my position as an Associate Dean with significant faculty affairs responsibilities did give me access to a valuable source base. I thought that as I reviewed academic personnel files, I could extract from the self-evaluation statements of colleagues across the college who do public humanities work the ways they best describe its impact. Then I planned to triangulate those descriptions with their departments’ research norms, which were just then being codified (in most cases, for the first time) in new departmental research statements. These research statements were a response to a mandate issued by the Vice Provost for Academic Personnel before I submitted my application to the HuMetricsHSS Fellowship program, but they had not yet been written or deployed. So my project was necessarily speculative from the outset. I also certainly did not have any experience devising the new metrics that might emerge from my planned triangulation, so I proposed the project to HuMetricsHSS, hoping to join a community that was already engaged in thinking about alternative measures of academic personnel assessment.
As soon as the project term began, something quite remarkable happened. A few departments across the college seized on the new department research statements that I had been counting on as part of my proposed triangulation exercise as an opportunity to articulate much more expansive, flexible, and, I would say, equitable conceptions of research in their respective disciplines. Leaning on statements by scholarly associations like the Modern Language Association and the American Historical Association, they described multiple pathways to merit advancement and promotion that included updated research norms. They also re-asserted departmental expertise as the most reliable tool for measuring the impact of research that is not reducible to journal articles and scholarly monographs, and a couple of departments even put forward a few files that demonstrated the applicability of the newly articulated norms. The need to translate public engagement activities – as well as a host of other work that can slip through the cracks of more traditional evaluation practices – into traditional peer-reviewed research impact metrics suddenly evaporated. That part of my proposed HuMetricsHSS project no longer seemed so pressing, especially when the promotions that depended on the newly articulated norms sailed through successfully. It was especially exciting to read the congratulatory extramural assessments of those files, as faculty at other universities praised us for looking beyond traditional publication records to see the true and meritorious impact these faculty were having on their fields. Traditionalist backlash did surface in a few external letters and department votes, but it was rare and ultimately inconsequential.
Meanwhile, as the need for my proposed project on campus was waning, I was attending the meetings of the HuMetricsHSS fellows. I was eagerly learning about other colleagues’ projects, enjoying the break-out discussion groups, and reflecting on what “values-enacted frameworks for understanding and evaluating” scholarly activity might mean in a lot of different contexts, many of which were directly applicable to other problems on my own campus. I had applied to the group for help solving a concrete problem, but the value of our regular meetings was much more subtle, wide-ranging, and, I think, powerful. Our group activities and discussions had the cumulative effect of provoking in me a much more intentional, purposeful, thoughtful reflection on my campus service and leadership activity that reached well beyond my originally proposed project. During our meetings, the fellows were encouraged to explicitly articulate core values — our own and those claimed by our universities — that were more basic than our individual projects. That was interesting. We were encouraged to think about, and then actually articulate, the opportunities open to each of us in our respective positions, at our respective universities, with our particular projects. And we were encouraged to be open and realistic about the obstacles, some probably surmountable and others not, that might stand in our way. Sometimes that was rejuvenating, since it was energizing to realize that so many of us were working toward such similar goals at all different sorts of institutions, and exciting to learn from others’ experiences and strategies. Sometimes it was depressing, when the realities of budget cuts, structural reorganizations, and political interventions derailed or delayed fascinating and innovative initiatives. This participation in a community of fellows was the most valuable component of the HuMetricsHSS experience. It is sometimes easy for me to focus in frustration on what I cannot do in a resource-starved context, but HuMetricsHSS insisted that I focus instead on what I can do, and the experience also gave me the opportunity to appreciate that others are doing exceptional work in even more difficult contexts than mine. It made me appreciate that, whatever the challenges I face, my institution has mostly been a very supportive place to do this work.
One of the great attractions of administrative work is that it provides opportunities to help others accomplish their academic goals. It can be extremely satisfying to facilitate some bit of positive change that works, that lowers a barrier or eases a burden, that opens a door or reveals a path forward. The flip side of that coin, though, is that it can be frustrating to inhabit the institutional practices and administer the policies that reproduce the very systemic inequities we’re committed to alleviating. It is, in other words, sometimes hard to reconcile the institutional imperatives of administrative duties with one’s own core values. As a colleague recently observed, it is exceptionally hard to change institutions from the inside. But short of revolution and all of its unintended and unforeseen consequences (I’m a historian of the Soviet Union, by the way, who is not prone to romanticizing revolution), it just might also be the only way to change them for the better. This is where I see the power of the HuMetricsHSS approach and how the HuMetricsHSS Community Fellowship experience changed the way I think about my own administrative work. What I now understand as a “values-enacted” framework is the combination of continuous intentionality, grounded in core institutional values that I believe in, with an active awareness of what is actually possible right now, at this moment, with this very next step.
That brings me back to the project I originally proposed – to start building a framework that would help my institution more equitably and effectively recognize and reward my colleagues’ public humanities scholarship. I have so far emphasized the big improvements at my university on this front – the new, up-to-date research statements and their successful application. But the full story is, of course, much more complicated. Not every department is as fully invested in these changes as the enthusiastic early adopters. Even the best of the department research statements still do not capture the full array of valuable academic work that our system should reward, and perhaps they never can. The backlog of colleagues who should have advanced years ago – but did not, because their otherwise robust records were missing some discipline-specific accomplishment expected by traditional metrics – is beginning to clear, but there are still many more to go. And another component of my original project, one I initially thought had been subsumed by the mandated research statements and the institution’s implementation of their parameters, seems to be (re-)emerging: disagreement over metrics. Activities that did not use to “count” now do, but there is still some fundamental uncertainty about how much they should count, how to measure their quality, and how to balance distinct areas of scholarly output.
What to do? For me, the way forward is to practice the intentional incrementalism of a values-enacted framework for change that I learned as a HuMetricsHSS Community Fellow. Much of this work has always started with conversations: with Chairs, with faculty due for promotion, with other faculty considering how to adopt updated metrics alongside the traditional ones that have (often) worked so well for them in the past. I like to start those conversations with what I hope is a casual invocation of the shared values that motivate our common goals. That technique has been working, a bit unevenly perhaps, to facilitate communication and conversation about the future. More departments are revisiting their still quite new research statements, thinking about how to use them to better reflect the scholarly work their faculty value or (perhaps with a bit of encouragement) recognize that they should value. And I both hope and genuinely think that faculty whose work may not have been fully appreciated in the past, and the colleagues who have been frustrated by that lack of appreciation, alike see hopeful reasons to put advancement files forward. The technique doesn’t always work as I try to apply it to areas of my administrative work increasingly far from my originally proposed project. There are some things that optimistic reference to core values simply does not fix, at least not immediately. For my institution, the biggest of those is resource scarcity. We are near the end of a multiphase, multiyear strategic planning initiative, and I have tried to bring some of the language and techniques I learned through HuMetricsHSS to the conversations I have led during this last phase. Some of those conversations have been successful, and the groups I have convened have been, in principle, universally enthusiastic about the idea of aligning values, strategic planning, and concrete goals and measures. But the conversations ultimately grind to a halt on issues of resource scarcity. And there, I think, we see the power of the incrementalism, the call to focus on the next available step. Let’s articulate our values and our end goals, I can say, and let’s imagine a future that we may now only be able to hope for. But then let’s focus on what we actually can do next.
I may have come to HuMetricsHSS in search of new, ready-made, innovative, and equitable metrics to measure scholarly production, but as my fellowship comes to an end, I think the experience has helped me develop something more important: the tools and perspective to help build those metrics from within my institution, and the knowledge that a lot of other people are doing analogous work at other universities. That is a hopeful place to be.