By Stanley D. Smith, Professor of Finance, University of Central Florida
I was recently reading the article, "Creating Community: The Measurement of Research," by Eileen Peacock, in the December 2012 eNEWSLINE, and the research report by AACSB International, "Impact of Research: A Guide for Business Schools." The article and research report outline many important aspects of evaluating research and integrating that evaluation into a school's mission and faculty evaluation system. Pages 38 and 39 of the research report identify many possible indicators for research, e.g., publications in designated journals and SSCI/ISI citation counts. The following analysis examines one of the most common challenges in evaluating research: comparing publications across disciplines.
One problem that often arises with college of business promotions, tenures, or research award committees is that individuals tend to have discipline-based biases. We all examine things in our most familiar context, our discipline. The method outlined here attempts to provide an objective and rational approach to comparisons of portfolios of journal articles. This approach develops exchange rates between the disciplines. Individuals on these committees would not be required to only use these exchange rates, but utilizing these rates would be an excellent and objective way to start the discussion.
A starting point is the list of journals used by the University of Texas at Dallas (UTD) to rank business schools. This list of "top journals" and the Journal Citation Reports (JCR) impact factors (IFs) for the top three journals in each of the four disciplines of accounting, finance, management, and marketing are examined in this paper. The following results demonstrate that management has the highest average IF at 5.33, followed by finance (4.23), marketing (3.71), and accounting (2.69).
The top three journals for accounting and their impact factors are the Journal of Accounting and Economics (3.281), Accounting Review (2.418), and Journal of Accounting Research (2.378). The top three finance journals and their impact factors are the Review of Financial Studies (4.748), Journal of Finance (4.218), and Journal of Financial Economics (3.725). The top three marketing journals and their impact factors are the Journal of Marketing (5.472), Journal of Consumer Research (3.101), and Journal of Marketing Research (2.571). UTD also lists a fourth marketing journal, Marketing Science (2.36), but only the top three are used for comparison purposes. The top three are used to estimate "exchange rates" between disciplines' journals because there is often more agreement on the top three than on the top four or five. However, nothing prevents a college from using a larger discipline subset than the top three in this approach.
The top three management journals and their impact factors are the Academy of Management Review (6.169), Academy of Management Journal (5.608), and Administrative Science Quarterly (4.212). UTD also lists other journals that might be located within a management department, such as Organization Science (4.338), Strategic Management Journal (3.783), and Journal of International Business Studies (3.406). UTD also includes other top journals in the production management and information systems areas. Economics journals are not included in the UTD information but could be chosen by any college that wanted to use this system.
One way to adjust IFs across disciplines and the journals within them is to convert the IFs to the scale of the top-level discipline, in this case management. In other words, to compare a top three accounting journal with management journals, its IF should be multiplied by (5.33/2.69 = 1.98). The adjustment factors for finance and marketing are (5.33/4.23 = 1.26) and (5.33/3.71 = 1.44), respectively.
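As a rough sketch, the discipline averages and exchange rates above can be computed directly from the impact factors listed in this article (averages are rounded to two decimals, matching the arithmetic in the text):

```python
# Top three JCR impact factors per discipline, as listed above.
top_three = {
    "accounting": [3.281, 2.418, 2.378],
    "finance":    [4.748, 4.218, 3.725],
    "marketing":  [5.472, 3.101, 2.571],
    "management": [6.169, 5.608, 4.212],
}

# Average IF per discipline, rounded to two decimals as in the text.
avg = {d: round(sum(ifs) / len(ifs), 2) for d, ifs in top_three.items()}

# Exchange rate: multiply a discipline's IFs by this factor to express
# them on the management scale (the highest-average discipline).
rate_to_mgmt = {d: avg["management"] / a for d, a in avg.items()}

for d in top_three:
    print(f"{d}: avg IF = {avg[d]:.2f}, rate to management = {rate_to_mgmt[d]:.2f}")
```

Running this reproduces the averages (5.33, 4.23, 3.71, 2.69) and the adjustment factors (1.98, 1.26, 1.44) quoted above.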
The following example shows how one record could be viewed by each discipline. Assume a finance candidate had articles in the Journal of Financial and Quantitative Analysis (JFQA, IF = 1.775), the Journal of Banking and Finance (IF = 2.60), and Real Estate Economics (IF = 1.020). These IFs sum to 5.395. Using the earlier adjustment ratios, those 5.395 finance IFs would be equivalent to [(5.33/4.23) × 5.395 =] 6.80 management IFs. By the same approach, they would be equivalent to [(3.71/4.23) × 5.395 =] 4.73 marketing IFs and [(2.69/4.23) × 5.395 =] 3.43 accounting IFs. These totals can then be compared to specific journals in the other disciplines. In this case, the finance-adjusted 6.80 management IFs would equate to (6.80/4.212) = 1.61 articles in Administrative Science Quarterly, the finance-adjusted 4.73 marketing IFs to (4.73/2.571) = 1.84 articles in the Journal of Marketing Research, and the finance-adjusted 3.43 accounting IFs to (3.43/2.378) = 1.44 articles in the Journal of Accounting Research.
This paper focuses on evaluating portfolios of journals across disciplines, but it could also be used within a discipline to compare different portfolios of research for an evaluation period. Using the previous portfolio of finance journals, the 5.395 finance IFs would be comparable to (5.395/ 3.725) 1.45 Journal of Financial Economics.
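The candidate example can be worked through in the same way; this sketch uses the rounded discipline averages and the journal IFs quoted above:

```python
# Candidate's finance portfolio: JFQA, Journal of Banking and Finance,
# and Real Estate Economics (two-year JCR impact factors quoted above).
portfolio = [1.775, 2.60, 1.020]
total_finance_if = sum(portfolio)  # 5.395

# Rounded discipline averages from the top-three analysis above.
avg = {"accounting": 2.69, "finance": 4.23, "marketing": 3.71, "management": 5.33}

def convert(total, from_disc, to_disc):
    """Express a summed IF total from one discipline on another discipline's scale."""
    return avg[to_disc] / avg[from_disc] * total

mgmt_equiv = convert(total_finance_if, "finance", "management")  # ~6.80
mktg_equiv = convert(total_finance_if, "finance", "marketing")   # ~4.73
acct_equiv = convert(total_finance_if, "finance", "accounting")  # ~3.43

# Express each equivalent in units of a specific journal's IF.
print(round(mgmt_equiv / 4.212, 2))  # Administrative Science Quarterly units
print(round(mktg_equiv / 2.571, 2))  # Journal of Marketing Research units
print(round(acct_equiv / 2.378, 2))  # Journal of Accounting Research units

# Within finance: the same total in Journal of Financial Economics units.
print(round(total_finance_if / 3.725, 2))
```

The four printed figures match the 1.61, 1.84, 1.44, and 1.45 equivalents discussed above.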
Within a discipline, faculty could add journals that are not on the current JCR list using an "estimated" measure. For example, journals associated with regional finance associations (Journal of Financial Research, Financial Review, and Quarterly Review of Economics and Finance) might be considered. The burden of proposing a believable comparable estimate for the non-JCR journals falls on the faculty, particularly if the finance faculty want faculty in other disciplines to assign a comparable value to them. The Social Sciences Citation Index (SSCI), the basis for JCR impact factors, may be used to count the SSCI cites to papers published in a non-SSCI journal over a time period such as two or five years. These cites can then be tied back to the number of articles published in the preceding period. The resulting estimated impact factors would be similar to JCR impact factors without self-cites, because cites within the journal itself would not be included.
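A minimal sketch of such an estimate, assuming the cite and article counts have already been gathered from an SSCI search (the counts below are hypothetical, for illustration only):

```python
def estimated_impact_factor(ssci_cites, articles_published):
    """Estimate a JCR-style impact factor for a non-JCR journal.

    ssci_cites: cites in SSCI journals to the journal's articles from
        the preceding window (e.g., the prior two or five years).
    articles_published: number of articles the journal published in
        that same window.

    Because the cites come only from SSCI-indexed journals, cites within
    the journal itself are excluded, so the estimate resembles a JCR
    impact factor without self-cites.
    """
    return ssci_cites / articles_published

# Hypothetical counts for a regional finance journal (illustration only):
# 48 SSCI cites to the 80 articles it published in the prior two years.
print(estimated_impact_factor(48, 80))  # 0.6
```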
Possible adjustments to this approach include using five-year IFs or article influence scores instead of two-year IFs, or using an average of two or three of these measures. All three measures are readily available at many research university libraries and can be compiled relatively quickly.
The two-year IFs give us the most recent ratings, but they are probably also the most volatile of the three measures. The five-year IFs give us recent ratings over a longer period, and this measure is probably less volatile than the two-year IFs.
The article influence approach adjusts for two other factors: it is similar to a five-year IF, but it eliminates journal self-cites and gives a higher weighting to cites in higher-ranked journals. For those who worry about "coercive citations" requested by journal editors, this measure adjusts for those self-cites. However, article influence may also be biased in favor of the top journals in a discipline because, in addition to the greater number of cites already reflected in the two impact factor measures, they are more likely to be cited in the highest-ranked journals. For example, if someone publishes in real estate journals, or any other subset of a discipline, those journals probably draw on a smaller set of journals for cites, and each such journal is likely to contain a larger percentage of the most likely cites in that area. In addition, cites within the subset are less likely to be weighted as highly as cites in the top journals. If one wants to use the two-year or five-year IFs with journal self-cites removed, the information is available to make those adjustments.
Any of these three measures or a weighted combination of the three measures could be used to develop the exchange rates. I prefer an average of the three measures with equal weights because all three measures provide interesting viewpoints of a journal's impact. An analysis of the top 48 business finance journals with all three pieces of information for JCR 2010 showed a correlation coefficient of 0.96–0.98 between the average and the three pieces of information.
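An equal-weights combination of the three measures might be sketched as follows (the journal values here are hypothetical placeholders, not JCR data):

```python
def composite_score(two_year_if, five_year_if, article_influence,
                    weights=(1/3, 1/3, 1/3)):
    """Weighted combination of the three journal measures.

    Defaults to equal weights, the preference stated in the text; any
    other weighting can be passed in instead.
    """
    measures = (two_year_if, five_year_if, article_influence)
    return sum(w * m for w, m in zip(weights, measures))

# Hypothetical journal, illustrative values only.
score = composite_score(2.1, 2.8, 1.9)
print(round(score, 2))  # 2.27
```

Exchange rates between disciplines could then be built from these composite scores exactly as they were built from the two-year IFs above.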
Although this approach examines a portfolio of journal articles, some schools may specify a minimum number of publications in designated journals such as the top three. The approach can accommodate that qualification by including only the designated journals. I recognize that many schools, particularly those with doctoral programs, may have a cultural bias toward a top three journals approach. However, one should recognize that the top three may be a poor indicator of an article's potential impact. In an earlier study (Smith, 2004), I examined the top 15 finance journals and found that JCR impact factors were strongly correlated with mean cites per journal article. However, when I used top three status as an indicator of which articles would have actual cites above the mean for the top 15 journals, that indicator was wrong a combined 77% of the time: 44% of the articles above the mean were non-top three articles, and 33% of the top three articles were not above the mean. In other words, the top three decision rule had a combined type 1 and type 2 error rate of 77%. That article, which has been cited in scientometric studies related to accounting, engineering, entrepreneurship, finance, information systems, international business, management, management science, medicine, research grants, and patents, demonstrates that a portfolio of journal articles is a better approach than the top three approach when examining an individual's record. The study also indicates that if one is examining a longer period after publication, actual article citations are a better measure of impact than the categorization of the articles.
In summary, this approach offers current and objective data with which to begin comparing portfolios of journal articles across disciplines or within a discipline.
Stanley D. Smith, Professor of Finance, University of Central Florida, Former Dean, Walton College of Business, University of Arkansas
For more information on this study, Dr. Smith can be contacted at: firstname.lastname@example.org.