
Report of the Editors of the American Political Science Review, 2013–2014

Published online by Cambridge University Press:  02 April 2015

John Ishiyama*
Affiliation:
Lead Editor, University of North Texas, APSR Editorial Team

Type: Association News
Copyright © American Political Science Association 2015

We report here on the journal’s operations from July 1, 2013 to June 30, 2014, the second full year that the University of North Texas (UNT) team has been at the helm of the Review. We wish to express our great thanks to the APSA, President John Aldrich, President-Elect Rodney Hero, the APSA staff, the Council, and the Publications Committee, as well as to Cambridge University Press, for their support and guidance over the past three years.

Members of our editorial board have helped us with their advice on more than a few submissions and have served as “guest editors” on UNT-connected submissions that might otherwise raise issues of conflict of interest. We also want to thank all of the authors who submitted their papers for consideration in the past year and the referees who reviewed them. In particular, without the talented work of authors and the referees’ commitment of time and effort in service of the profession, there simply would be no Review. Our entire discipline, as always, owes you a debt of gratitude, so thank you.

This report highlights our accomplishments over the past year. When we took on this job in 2012, we identified three primary goals in our manifesto: 1) to improve the efficiency of the Review’s editorial process; 2) to increase the number and diversity of submissions, which in turn would lead to a greater diversity of articles appearing in the Review; and 3) to maintain the APSR’s position as the leading political science journal in the world. The following report details the progress we have made toward those goals, and we are pleased to report that we have thus far accomplished the goals we laid out in 2012.

SUBMISSIONS AND PROCESSING

Number of Submissions

For 2013–2014, the UNT team reports the highest number of new submissions to the APSR on record, breaking the previous record established during our first year at the helm. From July 1, 2013 to June 30, 2014 we received 961 new submissions (up from 895 the previous year). After revisions are also factored in, 2013–2014 represents the highest total number of papers handled in any 12-month period on record for the APSR (1,056), up from the previous year’s total of 1,007. Despite that record number of submissions, we maintained a turnaround time of 49.2 days from receipt to first decision, slightly higher than the 41.2 days we reported last year but significantly lower than in earlier years.

As of June 30, 2014, we had invited 4,662 reviewers; 2,349 accepted and 1,267 declined. The remaining invitations were either withdrawn or are still awaiting a response for papers currently under review. Thus 65.0% of those who responded to our review requests (2,349 of 3,616) agreed to review, almost identical to the 64.9% rate that we reported for 2012–13.

At the recommendation of our editorial board at the 2013 annual meeting in Chicago, we conducted a study of “reviewer fatigue” to ascertain, based on our current data, why reviewers decline to review papers for the APSR. Under the leadership of one of the Review’s coeditors, Professor Marijke Breuning, a preliminary study has been completed that analyzes the reasons potential reviewers give for declining to review.

An underlying concern expressed at previous APSA Council and APSR editorial board meetings was that reviewer declines were caused by “reviewer fatigue” (i.e., too many reviews being requested of reviewers) and that this jeopardized the efficiency of the editorial process. Two findings from the report are worth noting. First, the “fatigue” issue is much more complex than reviewers simply being asked to do too many reviews. Second, there appears to be no relationship between reviewer fatigue and the efficiency of the editorial process, as demonstrated by our own success in maintaining fairly quick turnaround times irrespective of “decline to review” rates.

Table 1 Submissions per Year

Turnaround Times

We have made great efforts to reduce the number of days it takes to process manuscripts from first receipt of a submission to first decision (table 2). As indicated in the table, despite the substantial uptick in submissions processed by the UNT team in 2013–14, we have maintained a very low turnaround time of 49.2 days. Although somewhat higher than in our previous year, this is substantially lower than in earlier years. One of our primary goals was to shorten the editorial assistant vetting and coeditor reviewer assignment times. Our editorial assistants have been very diligent in processing manuscripts quickly, and we have endeavored to assign reviewers as quickly as possible. We have also made a practice of directly contacting late reviewers to expedite the review process, although our reviewers have generally been very prompt in completing their reviews (34 days on average). Indeed, the lion’s share of the credit for reducing turnaround times belongs to the efficiency of our editorial assistants and of our reviewers.

Table 2 Elapsed Time (Avg. No. of Days) in Review Process, 2010–2014

Mix of Submissions

The mix of submissions during 2013–2014 changed somewhat compared to previous years (see table 3a). Categorized by disciplinary subfield, the papers we received from July 2013 to June 2014 are reported in table 3a. The largest proportion of manuscripts continues to come from comparative politics (36%), with a slight decline in the proportion from international relations (16%, compared to 20% the previous year). The shares for normative theory and American politics were essentially unchanged this year, although there has been a longer-term decline in the proportion of submissions to the APSR from American politics. The increase in comparative politics submissions as a share of the total may be a function of the increasing number of submissions to the APSR from international scholars.

Table 3a Distribution of New Papers Submitted by Field, 2013–2014 Compared with Previous Years (%)

In terms of the mix of submissions by approach (see table 3b), the patterns during 2013–2014 are also consistent with the past. The largest proportion continues to be quantitative (58.0%), and the percentages for the other approaches remained consistent when comparing the past year with earlier years under the UCLA team. There has been an increase in papers using formal models and in those using qualitative methods. Overall, in the past year, formal, quantitative, and combined formal and quantitative submissions constituted 76% of all submissions, compared with 71% in 2012–13. Thus there has been a slight increase in submissions employing quantitative methods.

Table 3b Distribution of New Papers Submitted by Approach, 2013–2014 Compared with Previous Years (%)

In addition to the traditional indicators of the diversity of submissions that have appeared in past reports, we have also collected data on two other indicators of diversity during the period July 2013–June 2014: the gender of a submission’s first author and the national location of the first author’s institution (data that we first reported in last year’s annual report). These data were not collected by previous editorial teams.

During this period, 72.5% of manuscript first authors were men and 27.5% were women. Although we believe this represents progress (the proportion of women first authors is higher than in our first year as editors), it is still lower than the estimated 32% of the APSA membership made up of women. Further, approximately 33% of first authors of submitted manuscripts were based at non-US institutions, an increase over the previous year (31%). This is an encouraging sign as the APSR continues to strive to be the leading political science journal in the world. We hope to improve the diversity of submissions on all dimensions and will continue to monitor trends in gender and international authorship (see table 4).

Table 4 Distribution of First Authors of Submitted Papers by Gender and International Authorship (%)

OUTCOMES

Table 5 reports the outcome of the first round of the review process for 2013–2014 (as well as previous years, for comparison). In the past year under the leadership of the UNT team, the proportions of summary rejections and inappropriate submissions (both without review), rejections after review, conditional acceptances, and acceptances after the first round were all very consistent with the percentages reported in previous years.

Table 5 Outcome of First Round of the Review Process (%)

Continuing the practice of our predecessors, we have made use of summary rejection to relieve “reviewer fatigue” and to remove from consideration submissions that would almost surely not survive the usual review process. Compared with 2012–2013, summary rejections in 2013–2014 increased to nearly 25% of the total. Further, rejection after review remains at about the same percentage as in previous years (68.4%). The percentage invited to revise and resubmit is slightly lower than in the previous year (7%). These differences are largely due, in our view, to our decision as an editorial team to avoid inviting “de novo” resubmissions (or “reject and resubmit”), which was a practice of previous editorial teams. Rather, we either reject or invite a revise and resubmit. This is consistent with the practice of other major journals, and we believe that avoiding de novo resubmissions is generally wise; it is a practice we will continue.

Tables 6a and 6b report accepted manuscripts by field and by approach. The largest proportions of accepted manuscripts were from comparative politics (42%) and normative theory (25%). Acceptances in international relations and formal theory remained steady at 11% and 5%, respectively. There has been, however, a decline in the proportion of accepted papers from American politics (from 21% to 13%). This may reflect the longer-term decline in the proportion of papers appearing in the Review that are from American politics, although during the past year the decline was fairly steep. We are currently working to address this issue.

Table 6a Distribution of Papers Accepted by Field (%)

Table 6b Distribution of Papers Accepted by Approach (%)

As indicated in table 6b, the percentage of formal, quantitative, and combined formal and quantitative acceptances continued to decline slightly but, taken together, these still accounted for 62% of all papers accepted in 2013–2014, somewhat lower than the 66.5% reported last year, substantially lower than the 74% reported by the UCLA team in 2011–12, and much lower than the 2009–10 proportion of 84%. On the other hand, there has been a significant increase in the proportion of papers using qualitative, conceptual, and interpretive methods accepted by the Review. In 2013–2014, 38% of the manuscripts accepted were in these categories, up from 33.5% in 2012–13 and 26% in 2011–12. We take this as evidence that the Review is making significant progress in diversifying its content, particularly in terms of approach.

VISIBILITY

As indicated in table 7, the American Political Science Review remains the top-ranked journal in political science, with a Thomson Reuters Impact Factor (IF) of 3.844 in 2013 (slightly down from 3.933 in 2012, but still #1 by far). Perhaps more importantly, the Review’s five-year impact factor for 2013 rose to 5.298, up significantly from 2012 (4.516) and the highest level recorded to date for the APSR. This places the Review in first place, far ahead of almost all comparable journals.

Table 7 Annual and Five-Year Thomson Reuters JCR Impact Factors for the APSR

In addition to maintaining the top ranking for the Review in terms of IF scores, we have also worked closely with Cambridge University Press (particularly with Janise Lazarte at Cambridge) to more broadly publicize pieces that appear in the Review. This has included the use of press releases, e-mail notifications, and other electronic media (such as Twitter) to “get the word out” about work that appears in the APSR. Further, we continue our relationship with the editors of the popular political blog, The Monkey Cage, to highlight important pieces that are scheduled to appear in the Review (using it as a way to publicize and preview pieces in much the same way as movie “trailers”). We believe that these efforts have greatly increased the public visibility of the Review.

Finally, it should be noted that the UNT editorial office delivers issues on time, and the physical production of the APSR is generally on schedule.

CONCLUSION AND FUTURE PLANS

As the preceding discussion indicates, the Review has made great strides over the past two years: significantly reducing the time from submission to first decision, maintaining the diversity of submissions to the Review, and increasing the diversity of the articles the APSR accepts for publication, all while maintaining the APSR’s position as the world’s leading journal in political science. Not only have submissions to the Review and the diversity of what appears in the APSR increased, but so have citations to articles published in the Review. In short, we have made good progress in realizing the goals that we laid out in our initial editors’ manifesto.

Our future plans include continuing our outreach efforts to connect with various constituencies in our discipline in order to further increase the number and diversity of submissions. Over the past three years, we have attended dozens of conferences and professional meetings that host a great many political scientists, where we addressed the leadership of these organizations, their organized sections, or caucuses of political scientists. We discussed our strategic goals and welcomed their suggestions and input. Because these meetings include many that one or more of the editors normally attend, we will likely continue these outreach efforts over the course of our tenure as APSR editors.

In addition, in the coming year, we intend to make important changes in the Review’s submission guidelines to promote greater data access and research transparency for pieces published in the Review. In conjunction with the APSA DA-RT (data access and research transparency) initiative and the APSA Publications Committee, we have developed a concrete and workable policy on DA-RT for the APSR that we plan to implement in 2015. Furthermore, we intend to have a greater web presence for the APSR, and are developing plans to employ web-based forums to enhance discussion of articles that appear in the Review.

Thank you very much for the opportunity to serve the Association and our discipline, and we remain grateful for the trust and support of our colleagues. We welcome your comments and any suggestions as we proceed.
