Research Management: Combining Platforms, Practices, and Policies

This is the article version of the author’s presentation given at IFLA 2016. View the video recording of the author’s presentation. The presentation begins at 57:34.

Abstract

Research management is about more than open access, and it is about more than creating online publishing platforms. It is about creating online publishing platforms that meet the needs of all of the stakeholders in the higher education enterprise, most notably faculty. Technological infrastructure needs to be combined with policies that reflect the career needs of faculty members. So far, the goal of combining open scholarship policies with online infrastructure has been elusive. The answer may be to rethink how faculty careers are structured and how research managers and librarians can be a part of the solution.

Introduction

Imagine a world in which all scholarship is digital, and consists not only of articles and books, but also databases, linked websites, blogs, and other new forms of online discourse. Imagine a world in which anyone with access to a computer can download, read, and remix academic work for their own purposes. Imagine a world in which institutions such as libraries and publishers manage, disseminate, and organize this information. In some ways this world is already here. Increasingly, the outputs of scholarship are not just articles, but data, software, and other items. Additionally, the continuing pressure toward open access means that modern electronic scholarship is more often openly available to anyone on the internet. Moreover, librarians, publishers, and library publishers are having to manage these diversified outputs, many of which are available not on library shelves or within library servers, but on the websites of other publishers, personal websites of scholars, institutional repositories, and repositories of other entities like government agencies or funding sponsors. Thus, one has to ask the question, how are libraries going to thrive in this “imaginary” world? By answering that question, it may be possible to chart a course for not only the creation of technological infrastructure, but also social incentives to deal with the future of communicating scholarly research.

In the age of linked data and the semantic web, open access repositories might seem to be the first step toward solving the much larger problem of creating a technological solution for research management that would help to assess the impact, productivity, and use of resources online. Yet, it is extremely important that such technical infrastructure also be linked with a social infrastructure through policies and practices (such as open access mandates) that encourage scholars to act in particular ways. Despite multiple efforts, linking both computing solutions and social ones remains elusive.

In some ways, the answer to the first problem, technical infrastructure, seems a bit easier. There have already been various initiatives started to assist with managing research activities; examples include ResearchGate, Symplectic Elements, VIVO, and many others, all of which share a fundamental characteristic: to enable sharing and linking of research between disciplines, equipment, and institutions. In some ways, European countries and the UK are actually a bit closer to solving the problem of research management through such systems because their governments require universities to go through procedures like the Research Excellence Framework (REF), which, among other things, measures the publications, research productivity, and impact of various departments and schools throughout an entire country. In the United States, many institutions are already exploring ways to emulate these examples.

This example shows that to some extent it is impossible for technological solutions to succeed without appropriate social incentives to back them up. Faculty members at universities already have ingrained habits and practices that in many cases mean they will purposefully not share their research outputs in open repositories. Thus, it is important that those incentives be changed in some way through policy actions. In all, open access is merely part of a much larger challenge: how can information experts such as librarians manage articles, books, and databases and make them more useful? By understanding the links between research management tools and the policies that govern them, it may be possible to move a step closer to that answer.

Background

Libraries have increasingly turned toward institutional repositories as a method for publishing scholarship within their institutions, including grey literature as well as original research published in academic journals.[1] Despite these moves toward new uses for institutional repositories, there are still difficulties in using such platforms as a replacement for earlier modes of scholarly communication.[2] To meet such challenges, libraries have begun new operations for creating publishing services.[3] Yet, even with these new services in libraries, scholars are faced with a proliferation of other digital publishing services that in some cases may meet their individual needs more efficiently and more quickly. Therefore, university faculty members desire increasing flexibility in their publishing options.[4]

Helen Connell, writing for the Organisation for Economic Co-Operation and Development (OECD) wrote a report that discusses the problem of “research management” in higher education in which she attempts to “analyse institutional responses to challenges arising from the implications of the changing education environment on research management, and draw together findings and ideas from current experience.”[5] Using a series of case studies, Connell argued that there are three areas that universities need to develop in order to create efficient research management that include:

  1. creating professionals within universities to manage research activities within the institution,
  2. providing institution-wide strategic plans that further the cause of research management across multiple responsible departments within the university,
  3. recognizing the career goals of individuals (primarily faculty and students) working within the higher education sector.

Librarians clearly have the potential to be the professionals that Connell envisions (though in cooperation with other offices within the university). Additionally, librarians can implement technological solutions that help to meet the goal of effectively administering scholarly outputs. The key to successful research management, however, may lie in the other two aspects of Connell’s recommendations. Both are social solutions, and they too need to be implemented in order to create effective research management practices.

Platforms

Librarians have been especially active in creating platforms to manage research outputs. Some of these are commercial in nature, others non-profit. All have the same general goal of creating mechanisms to make research more widely accessible, but they are also increasingly being called upon to provide metrics for research productivity and impact. Assessment is becoming an important aspect of scholarly practice, particularly in tenure and promotion guidelines, and, therefore, one might argue that the more successful platforms for scholarly outputs will also conform in some ways to these needs for research evaluation.[6]

Institutional repositories are perhaps best known to librarians as a primary method for disseminating research articles along with grey literature, original research, data, and other research outputs. Though they are an important way to make academic literature available, institutional repositories, so far, have been inconsistent in their growth and have not posed much of a challenge to the traditional scholarly communication system.[7] One of the potential reasons posited for this inability of institutional repositories to confront the entrenched journal publishing system has been their lack of integration with the other platforms that are more well known to academic users.[8]

In Europe, one of the ways to address this problem has been to create Current Research Information Systems (CRIS). Usually, these systems do not house actual content such as articles or data; instead, they monitor metadata about individual research. Such metadata might include measurements like the impact factor, h-index, and altmetrics, or links to individual articles (including those in institutional repositories).[9] So far, CRIS-like systems have not been as prominent in the United States’ institutional landscape, perhaps because of the more decentralized nature of higher education. Nonetheless, there are several commercial platforms that mimic some aspects of the CRIS and have been utilized by professors at American universities.

The most well-known of these commercial platforms are likely sites like ResearchGate or Academia.edu, which bill themselves as a kind of Facebook for academics. At the same time, these sites imitate many of the functions of an institutional repository. Importantly, however, such sites provide metrics to academics, like the h-index and altmetrics, for all of the items deposited on their site.[10] Google Scholar provides similar services and has become a fairly popular tool for academic users. A recent survey reported that 92% of researchers used Google Scholar for finding literature, 68% used it to get alerts and recommendations about research, and 70% used it as their researcher profile.[11] Like both Academia.edu and ResearchGate, Google Scholar provides similar metrics and has the added advantage of offering a more comprehensive set of literature (though much of that literature would not be available to people without university subscriptions). One of the great advantages of services like Academia.edu, ResearchGate, or Google Scholar, however, is not only their ability to provide metrics, but also their ability to integrate research across multiple universities. There have, of course, been multiple initiatives in the United States that are also trying to accomplish the goal of integrating metadata across institutional repositories. One of the more well-known of these efforts is the SHared Access Research Ecosystem (SHARE). In Europe, projects like OpenAIRE are attempting to accomplish similar goals; OpenAIRE seeks to create common workflows and metadata standards across repositories of data, articles, and other research outputs.
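The h-index that these platforms report is a simple computation: the largest number h such that a researcher has at least h papers with at least h citations each. A minimal sketch (the citation counts below are invented for illustration):

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers cited at least h times each."""
    h = 0
    # Sort citation counts from most- to least-cited and walk down the list.
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Five papers cited 10, 8, 5, 4, and 3 times yield an h-index of 4:
# four papers have at least 4 citations, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Because the measure depends only on which items a platform has indexed, the same scholar can show different h-indices on Google Scholar, ResearchGate, and Academia.edu.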

Projects like SHARE and OpenAIRE both focus their efforts on integrating content within institutional repositories (and similar storehouses of literature and data). There are, however, more ambitious projects that hope to establish greater integration not only across universities, but with commercial publishers as well. VIVO is one example of such a collaborative effort. VIVO, a semantic web framework linking together scholarship, universities, and faculty members, has become a larger open source community working with groups like the Consortium Advancing Standards in Research Administration Information (CASRAI), the European Current Research Information System (EuroCRIS), and Open Researcher and Contributor ID (ORCID). ORCID provides name disambiguation (something extremely important when there are many common names like “John Smith” among academic researchers), and it has been working with several research institutions, professional societies, and other communities to further integration into already existing workflows for scholarship (such as journal and manuscript submission). A major commercial counterpart to these more university-led initiatives is a UK-based company called Digital Science, which has created products like Symplectic Elements, a tool that provides CRIS-like functions; Figshare, for sharing figures; Overleaf, for creating documents; Dimensions, for managing information about research funding; and Altmetric, for readership statistics.

Finally, one project that often goes unmentioned in overviews of publishing platforms, but is extremely important in terms of scholarly practices, is STAR METRICS, an initiative started by Julia Lane (a professor at New York University), funded by the National Institutes of Health (NIH) and the National Science Foundation (NSF), and developed under the sponsorship of the White House Office of Science and Technology Policy (OSTP). STAR METRICS is creating data and tools that assess the impact of federal government research investment. Though this may seem unrelated to publication, there are increasing calls for accountability from universities receiving federal funding, and, in the same way that scholars are called to quantify the impact of their scholarship (in terms of readership, citation, and other metrics), it is likely that they will also have to show the impact of their research on local and national communities. Increasingly, faculty members are being asked to provide return on investment in terms of jobs or other economic benefits.[12] Projects like STAR METRICS may help to provide a way of assessing such impacts, and hopefully can be combined with more traditional metrics that already assess the impact of individual research outputs like articles and books.

Practices

Projects like STAR METRICS are especially important because they show the ways in which scholarly practices (the ways in which faculty will be increasingly required to act) influence the way that they will publish and the ways in which they will position their research.[13] An important aspect of Connell’s report was career outcomes of faculty.[14] Thus, one must ask how faculty members within universities view the progress of their careers and how the individual career goals of scholars will affect their publishing practices. Paula Stephan suggested that there are three aspects to understanding academic career structures (primarily within the sciences) within U.S. universities. First, there is an entrenched reward system which she suggests is built around benchmarks for particular stages in career development. Second, academic institutions tend to value individual entrepreneurship and accountability which creates disincentives for faculty to go against the existing system of career development. Finally, the use of doctoral and postdoctoral researchers as staff within a laboratory creates a network early in the career lifecycle that ingrains these sets of practices in young professionals, again reinforcing the current system.[15] Underlying all of this there is an entrenched funding structure for U.S. laboratories, generally built upon grant funding either by the government, industry, or other foundations that tends to be awarded to already prestigious and well-funded labs.[16]

Stephan highlights several marks of distinction that are used as benchmarks for moving forward in a career, say from an assistant to an associate professor. The first of these is the number of grants obtained to secure a laboratory’s operations. A second is the ranking of collaborating institutions, and a third, perhaps the most important aspect from a library’s perspective, is the publication activity from a lab.[17] Importantly, Levin and Stephan note that research productivity is concentrated early in the career lifecycle, meaning that younger scientists have to publish more often in order to achieve tenure, and then, once they have become tenured professors, are obligated to mentor younger colleagues into the same professional publication patterns.[18] Therefore, younger faculty generally must publish early in their careers and do so in the ways that garner them the most prestige so that they can move forward in their careers.

The importance of prestige in journal publication cannot be overstated. As early as the 1960s, Robert Merton suggested that scientists aimed to achieve prestige and that this goal created a system of academic publishing that put certain prestigious journals above others.[19] One of the measures that later scholars and practitioners, most notably Eugene Garfield, developed to gauge this “prestige” was the impact factor, which attempts to rank journals by number of citations.[20] Though meant as a measure for librarians to use in making purchasing decisions, the impact factor has taken on a much more significant role and, according to some scholars, has created a rather perverse incentive system that rewards getting published only in particular journals rather than thinking of new and novel concepts to advance knowledge.[21]
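The arithmetic behind Garfield’s measure is straightforward: the classic two-year impact factor divides the citations a journal receives in a given year (to items it published in the previous two years) by the number of citable items it published in those two years. A minimal sketch (the journal figures below are invented for illustration):

```python
def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Classic two-year journal impact factor: citations received this year
    to items the journal published in the previous two years, divided by
    the number of citable items published in those two years."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# A journal whose 2014-2015 output (150 citable items) drew 600 citations
# in 2016 would have a 2016 impact factor of 4.0.
print(impact_factor(600, 150))  # 4.0
```

Because the denominator counts only “citable” items (a category the indexer decides), the same raw citation data can yield different rankings, which is one source of the criticisms noted above.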

Thus, one might say that the publication practices of scholars are twofold. First, they must get published early in their careers (and frequently). Second, they must publish only in the highest-ranking and most prestigious journals and scholarly imprints. A report by the University of California’s Center for Studies in Higher Education further reflects these points: “There are a variety of criteria used to judge a successful scholar in a tenure and promotion case: publication, service, and teaching. Excellence in the latter two holds little weight without a stellar publication record and evidence that a scholar’s work: is widely read, is judged to be of high quality.”[22] Additionally, scientists rely not only on the prestige of publishing in high impact journals, but also are required to obtain high impact grants in order to fund their labs and perpetuate their discipline by attracting good doctoral students and postdoctoral fellows.

Policies

The final point that Connell mentions in her report on research management is the creation of policies and strategic plans to harness the professionals who manage research (presumably librarians in this case) and career faculty (who have their own priorities for publication).[23] Therefore given the potential capabilities of available platforms to publish research outputs and to manage their workflows and the realities of academic practice, is it possible to create a policy that might further the needs of both library and scholarly communities? Though open access policies hope to achieve the goal of meeting the needs of the institution, libraries, and scholars, at least so far they have not yet done so.[24]

In some ways, it seems that the UK and Europe are further ahead than the US in implementing open access policies that also fit the needs of research management.[25] There is, however, one key difference. Unlike the United States, the UK and Europe have more centralized higher education systems controlled by the government. The United States has a much more decentralized public/private system run by a variety of government entities. Though there have been some efforts at funder mandates (like those of the National Institutes of Health and the White House Office of Science and Technology Policy), these efforts do not have quite the same pull as government-sponsored open access. Efforts like the congressional bill FASTR or the California statewide open access mandate provide some movement in this direction but are still somewhat limited in scope. Policies in the United States will have to be implemented institution by institution, with a patchwork of different policies that meet the needs of individual universities and colleges.

As Connell suggests, any research management or open access policy will need to recognize both the needs of the university and the needs of individual faculty.[26] The needs of the university are to maximize research productivity and to increase funding.[27] The goals of scholars are to further their careers both through publication and through increasing grant funding. Thus, it would seem that external funding is a key component for both of these groups. Perhaps the open access mandates that NIH and other agencies have created need to be expanded and standardized. That is only part of the problem, however. Universities also need to create their own policies that supplement those of funders, and they need to think about the ways in which they reward publishing. Publishing in high impact journals has created an economy of science that rewards certain actors and may not be in the best interests of furthering knowledge, nor, necessarily, in the interests of universities (particularly if they wish to bring down the costs of purchasing research). In all, it is essential that university policies be thought about more holistically and considered not just a means of furthering the goal of open access, but rather a method for reevaluating the ways in which research is produced, evaluated, and disseminated.

Conclusion

Research management is not just about open access. It is a combination of three things. First, research managers (such as librarians) need to be able to fully and effectively harness the platforms (ORCID, Symplectic Elements, VIVO, STAR METRICS) that are currently available to manage and disseminate scholarship around the world. Second, research management platforms need to be in line with scholars’ needs to further their careers, which, at least currently, are based on their success at publishing in prestigious high impact journals and getting grants to fund their labs. Finally, and perhaps most importantly, research management requires better policies that help to maximize research productivity and to facilitate an easier economy of science. According to Stephan, the current system of scientific research emerged in the 1950s from the successes of science in winning the Second World War and from the pressure to keep Communist societies from gaining the upper hand. Moreover, this model of scientific progress rested on the belief that individual entrepreneurs working in labs were a new kind of human capital that would propel science forward. Yet, this model of scientific economy has broken down, and as Stephan suggests, “the failure of the model is undoubtedly related to the fact that the production of scientific knowledge is far more complex than the human model assumes. . . . This leads us to conclude that we need to rethink the way we study the careers of scientists.”[28]

In the same way that Stephan calls for a new way of studying the careers of scientists, perhaps librarians and research managers need to rethink the ways that they incentivize the research production process. Rather than tying scientists’ career advancement to publication in high prestige journals early in their careers, and reinforcing the “human capital” model of production through requiring grant funding for laboratories (especially when grant funding seems to be increasingly difficult to obtain), it is essential to find other methods of maximizing research productivity. Given the increasingly global and interdisciplinary nature of the scientific enterprise in the twenty-first century, universities, through their policies, can change this model, and research managers, such as academic librarians, must help in this effort. It is time for a new kind of open scholarship policy, one which, in the ways that Connell (2005) suggests, understands not just the need for further dissemination of scholarship, but also recognizes the needs of research managers, career scientists, and the university as a whole to further the goal of advancing knowledge.


Shawn Martin is currently an IDEASc Fellow at Indiana University Bloomington; his research focuses on scholarly communication and the history of academic publishing. Previously, Shawn was the scholarly communication librarian at the University of Pennsylvania and an adjunct professor at the College of Computing and Informatics at Drexel University. He has also worked as the project librarian for the Text Creation Partnership at the University of Michigan and has been involved with multiple international digital humanities and digital library projects.

References

  • Awre, Chris, Beeken, Andrew, Jones, Bev, Stainthorp, Paul, and Stone, Graham. “Communicating the open access policy landscape”. Insights. 2, no. 2 (2016): 126–132.
  • Bosman, Jeroen, and Kramer, Bianca. “Innovations in Scholarly Communication: Changing Research Workflows.” Innovations in Scholarly Communication. July, 2016, https://101innovations.wordpress.com/tag/updates-insights/.
  • Browman, Howard I. and Stergiou, Konstantinos I. “Factors and indices are one thing, deciding who is scholarly, why they are scholarly, and the relative value of their scholarship is something else entirely.” Ethics in Science and Environmental Politics, 8 (2008): 1-3.
  • Clements, Anna and McCutcheon, Valerie. “Research data meets research information management: Two case studies using (a) Pure (CERIF-CRIS) and (b) EPrints repository platform with CERIF extensions.” Procedia Computer Science. 33 (2014): 199-206.
  • Connell, Helen, ed. University Research Management: Meeting the Institutional Challenge. Paris: Organization for Economic Co-Operation and Development, 2005.
  • Cronin, Blaise, and Sugimoto, Cassidy. Scholarly Metrics Under the Microscope: From Citation Analysis to Academic Auditing. Medford, NJ: Information Today, Inc., 2014.
  • Dubinsky, Ellen. “A current snapshot of institutional repositories: Growth rate, disciplinary content and faculty contributions.” Journal of Librarianship and Scholarly Communication. 2 no. 3 (2014): 1-23.
  • Dundar, Halil, and Lewis Darrell R. “Determinants of research productivity in higher education.” Research in Higher Education, 39, no. 6 (1998): 607-631.
  • Bevan, Simon J. and Harrington, John. “Managing research publications: lessons learned from the implementation of a Current Research Information System.” Serials, 24, no. 1 (2011): 26–30.
  • Garfield, Eugene. “The intended consequences of Robert K. Merton.” Scientometrics. 60, no. 1 (2004): 51-61.
  • Harley, Diane, Acord, Sophia Krzys, and King, C. Judson. Assessing the Future Landscape of Scholarly Communication: An Exploration of Faculty Values and Needs in Seven Disciplines. Berkeley, CA: University of California Press, 2010.
  • Kapeller, Jakob. “Metrics: Serious drawbacks, perverse incentives, and strategic options for heterodox economics.” The American Journal of Economics and Sociology, 69, no. 5 (2010): 1376-1408.
  • Kutay, Stephen. “Advancing Digital Repository Services for Faculty Primary Research Assets: An Exploratory Study.” Journal of Academic Librarianship, 40, (2014): 642-649.
  • Lippincott, Sarah. “The Library Publishing Coalition: organizing libraries to enhance scholarly publishing.” Insights, 29, no. 2 (2016): 186-191.
  • Levin, Sharon G., and Stephan, Paula E. “Research productivity over the life cycle: Evidence for academic scientists.” American Economic Review, 81, no. 1 (1991): 114-132.
  • Ma, Athen, Mondragón, Raul J., and Latora, Vito. “Anatomy of funded research in science.” Proceedings of the National Academy of Science, 112 (2015): 14760–14765.
  • Menzies, Kathleen, and Johnson, Frances. “Academic attitudes toward new media: An exploratory multidisciplinary study.” Information Society, 32, no. 1 (2016): 1-13.
  • Merton, Robert K.. “The Matthew effect in science.” Science 159, no. 3810 (1968): 56-63.
  • Merton, Robert K. “Behavior patterns of scientists.” American Scientist, 57, no. 1 (1969): 1-23.
  • Rogers, John. “Open scholarship and research management.” Insights. 27, no. 3 (2014): 239–243. DOI: http://doi.org/10.1629/2048-7754.170
  • Ruscio, John, Seaman, Florence, D’Oriano, Carianne, Stremlo, Elena, and Mahalchik, Krista. “Measuring scholarly impact using modern citation-based indices.” Measurement, 3 (2012): 123-146.
  • Shearer, Kathleen. “The CARL institutional repositories project - A collaborative approach to addressing the challenges of IRs in Canada.” Library Hi-Tech, 24, no. 2 (2006): 165-172.
  • Stephan, Paula E. “Career stage, benchmarking and collective research.” International Journal of Technology Management, 22, no. 8 (2001): 676-687.
  • Stephan, Paula E. “The Economics of Science.” Journal of Economic Literature, 34 (1996): 1199-1235.
  • Swan, Alma, and Carr, Leslie. “Institutions, Their Repositories and the Web.” Serials Review, 34, no. 1 (2008): 31-35.
  • Thelwall, Mike, and Kousha, Kayvan. “ResearchGate: Disseminating, Communicating, and Measuring Scholarship?” Journal of the Association for Information Science and Technology, 66, no. 5 (2015): 876-889.
  • Weinberg, Bruce A., Owen-Smith, Jason, Rosen, Rebecca F., Schwarz, Lou, McFadden Allen, Barbara, Weiss, Roy E., and Lane, Julia. “Science Funding and Short-Term Economic Activity.” Science, 344, no. 6179 (2014): 41-43.
  • Xia, Jingfeng, Gilchrist, Sarah B., Smith, Nathaniel X. P., Kingery, Justin A., Radecki, Jennifer R., Wilhelm, Marcia L., Harrison, Keith C., Ashby, Michael L., and Mahn, Alyson J. “A review of open access self-archiving mandate policies.” portal: Libraries and the Academy, 12, no. 1 (2012): 85-102.

Notes

    1. Alma Swan and Leslie Carr. “Institutions, Their Repositories and the Web.” Serials Review, 34, no. 1 (2008): 31-35. Kathleen Shearer. “The CARL institutional repositories project - A collaborative approach to addressing the challenges of IRs in Canada.” Library Hi-Tech, 24, no. 2 (2006): 165-172.

    2. Stephen Kutay. “Advancing Digital Repository Services for Faculty Primary Research Assets: An Exploratory Study.” Journal of Academic Librarianship, 40, (2014): 642-649.

    3. Sarah Lippincott. “The Library Publishing Coalition: organizing libraries to enhance scholarly publishing.” Insights, 29, no. 2 (2016): 186-191.

    4. Kathleen Menzies and Frances Johnson. “Academic attitudes toward new media: An exploratory multidisciplinary study.” Information Society, 32, no. 1 (2016): 1-13.

    5. Helen Connell, ed. University Research Management: Meeting the Institutional Challenge. Paris: Organization for Economic Co-Operation and Development, 2005: 3.

    6. Howard I. Browman and Konstantinos I. Stergiou. “Factors and indices are one thing, deciding who is scholarly, why they are scholarly, and the relative value of their scholarship is something else entirely.” Ethics in Science and Environmental Politics, 8 (2008): 1-3.

      John Ruscio et al. “Measuring scholarly impact using modern citation-based indices.” Measurement, 3 (2012): 123-146.

    7. Ellen Dubinsky. “A current snapshot of institutional repositories: Growth rate, disciplinary content and faculty contributions.” Journal of Librarianship and Scholarly Communication. 2 no. 3 (2014): 1-23.

    8. Anna Clements and Valerie McCutcheon. “Research data meets research information management: Two case studies using (a) Pure (CERIF-CRIS) and (b) EPrints repository platform with CERIF extensions.” Procedia Computer Science. 33 (2014): 199-206.

    9. Simon J. Bevan and John Harrington. “Managing research publications: lessons learned from the implementation of a Current Research Information System.” Serials, 24, no. 1 (2011): 26–30.

    10. Mike Thelwall and Kayvan Kousha. “ResearchGate: Disseminating, Communicating, and Measuring Scholarship?” Journal of the Association for Information Science and Technology, 66, no. 5 (2015): 876-889.

    11. Jeroen Bosman and Bianca Kramer. “Innovations in Scholarly Communication: Changing Research Workflows.” Innovations in Scholarly Communication. July, 2016, https://101innovations.wordpress.com/tag/updates-insights/.

    12. Bruce A. Weinberg et al. “Science Funding and Short-Term Economic Activity.” Science, 344, no. 6179 (2014): 41-43.

    13. Blaise Cronin and Cassidy Sugimoto. Scholarly Metrics Under the Microscope: From Citation Analysis to Academic Auditing. Medford, NJ: Information Today, Inc., 2014.

    14. Helen Connell, ed. University Research Management: Meeting the Institutional Challenge. Paris: Organization for Economic Co-Operation and Development, 2005: 3.

    15. Paula E. Stephan. “Career stage, benchmarking and collective research.” International Journal of Technology Management, 22, no. 8 (2001): 676-687.

    16. Athen Ma, Raul J. Mondragón, and Vito Latora. “Anatomy of funded research in science.” Proceedings of the National Academy of Science, 112 (2015): 14760–14765.

    17. Paula E. Stephan. “Career stage, benchmarking and collective research.” International Journal of Technology Management, 22, no. 8 (2001): 676-687.

    18. Sharon G. Levin and Paula E. Stephan. “Research productivity over the life cycle: Evidence for academic scientists.” American Economic Review, 81, no. 1 (1991): 114-132.

    19. Robert K. Merton. “The Matthew effect in science.” Science 159, no. 3810 (1968): 56-63.

      Robert K. Merton. “Behavior patterns of scientists.” American Scientist, 57, no. 1 (1969): 1-23.

    20. Eugene Garfield. “The intended consequences of Robert K. Merton.” Scientometrics. 60, no. 1 (2004): 51-61.

    21. Jakob Kapeller. “Metrics: Serious drawbacks, perverse incentives, and strategic options for heterodox economics.” The American Journal of Economics and Sociology, 69, no. 5 (2010): 1376-1408.

    22. Diane Harley, Sophia Krzys Acord, and C. Judson King. Assessing the Future Landscape of Scholarly Communication: An Exploration of Faculty Values and Needs in Seven Disciplines. Berkeley, CA: University of California Press, 2010: 7.

    23. Helen Connell, ed. University Research Management: Meeting the Institutional Challenge. Paris: Organization for Economic Co-Operation and Development, 2005: 3.

    24. Jingfeng Xia et al. “A review of open access self-archiving mandate policies.” portal: Libraries and the Academy, 12, no. 1 (2012): 85-102.

    25. Chris Awre et al. “Communicating the open access policy landscape”. Insights. 2, no. 2 (2016): 126–132.

      John Rogers. “Open scholarship and research management.” Insights. 27, no. 3 (2014): 239–243. DOI: http://doi.org/10.1629/2048-7754.170

    26. Helen Connell, ed. University Research Management: Meeting the Institutional Challenge. Paris: Organization for Economic Co-Operation and Development, 2005: 3.

    27. Halil Dundar and Darrell R. Lewis. “Determinants of research productivity in higher education.” Research in Higher Education, 39, no. 6 (1998): 607-631.

    28. Paula E. Stephan. “The Economics of Science.” Journal of Economic Literature, 34 (1996): 1230.