
Annual Report of the Editors of the American Political Science Review, 2012–2013

Published online by Cambridge University Press:  14 April 2014

John Ishiyama, Lead Editor, for the University of North Texas APSR Editorial Team

Type: Association News
Copyright © American Political Science Association 2014

We report here on the journal’s operations during the year from July 1, 2012, to June 30, 2013, the first full year that the University of North Texas (UNT) team has been at the helm of the Review. In this report, we also summarize the transition process from UCLA to UNT.

Continuing a tradition established by our predecessors, we wish to express our great thanks to the APSA, President Mansbridge, President-Elect Aldrich, the Staff, the Council, and the Publications Committee, as well as to Cambridge University Press for their support and guidance during this transition process. We are particularly grateful to President Mansbridge and APSA Executive Director Michael Brintnall, APSA Director of Communications and Publishing Polly Karpowicz, and Mark Zadrozny and the Cambridge University Press team for their vital help in making the transition process from UCLA to UNT a smooth one. Finally, our very special thanks again to Ron Rogowski and his team at UCLA (and in particular, the Managing/Senior Editor at UCLA, Joseph Riser) for making this transition an especially easy one. It has been a real pleasure to work with their team during this process.

Members of our editorial board have helped us with their advice on more than a few submissions and have served as “guest editors” on UNT-connected submissions that might otherwise raise conflict-of-interest concerns. We would especially like to thank Larry Dodd of the University of Florida and Barbara Walter of the University of California, San Diego, for their outstanding work as guest editors this past year. We also want to thank all of the authors who submitted their papers for consideration to both the UCLA and UNT teams in the past year, as well as the referees who reviewed them. Without the talented work of authors and the referees’ commitment of time and effort in service of the profession, there simply would be no Review. Our entire discipline owes you a debt of gratitude, so thank you.

TRANSITION PROCESS

The transition process from UCLA to UNT was remarkably smooth, thanks in no small part to Ron Rogowski, the UCLA editorial team, and their Managing/Senior Editor Joseph Riser. Per our agreement, the UCLA team processed all manuscripts received before July 1, 2012, and the UNT team took over processing all new incoming manuscripts on July 1, 2012. The UCLA team continued to process all submissions it had begun (including revise-and-resubmits) until October 15, 2012, after which responsibility for completing that processing passed to the UNT team. The UCLA team was responsible for the November 2012 issue; the first issue for which the UNT team was responsible was February 2013 (Volume 107, No. 1). However, 107.1 included a number of “legacy” manuscripts, so much of the credit for this issue lies with the UCLA team.

During the transition period from 2011–2012 and throughout 2012–2013, the UNT editorial team made a concerted outreach effort to increase the number and diversity of submissions, particularly from fields that have been less well represented in the Review. This involved individual editor visits and participation in panels and workshops at numerous professional conferences in the United States and abroad, in an effort to appeal to political scientists who may have felt left out by the Review. We hoped that such outreach would increase submissions from fields such as International Relations and produce a greater diversity of submissions in terms of field and approach, as well as other indicators.

SUBMISSIONS AND PROCESSING

Number of Submissions

In terms of the number of submissions, the UNT team reports the highest number of submissions to the APSR on record for 2012–2013. As in the first year of the UCLA team’s tenure, the number of submissions spiked: from July 1, 2012 to June 30, 2013, we received 895 submissions (up from 761 in the last year of the UCLA team’s tenure). Once revisions are also factored in, 2012–2013 represents the highest total number of papers handled in any 12-month period on record for the APSR (1,007), up from the previous year’s reported 846 total submissions. Despite that record number of submissions, however, time from receipt to first decision still declined significantly in 2012–13 compared with the previous year (from 68.9 days to 41.3 days on average, a roughly 40% decrease in time to decision).

As of June 30, 2013, we had invited 4,516 reviewers, 2,051 of whom accepted and 1,108 of whom declined. The remaining reviewers either were withdrawn as reviewers or have not yet responded to our request to review papers currently under review. Thus 64.9% of those who decided whether or not to review agreed to do so when requested, down from the 70% reported by the UCLA team in the previous year.
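
For clarity, the 64.9% agreement rate is calculated over reviewers who responded to the invitation (those who accepted plus those who declined), not over all 4,516 invitations; as a quick check of the arithmetic reported above:

\[
\frac{2{,}051}{2{,}051 + 1{,}108} = \frac{2{,}051}{3{,}159} \approx 0.649
\]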

Table 1. Submissions per Year

Table 2. Elapsed Time (Average Number of Days) in Review Process, 2010–13

At the recommendation of our editorial board at our annual meeting, held at the MPSA in Chicago (because of the cancelled New Orleans meeting), we are currently analyzing reviewers’ stated reasons for declining to ascertain whether reviewer “fatigue” is a central factor in the lower agreement-to-review rate. We hope to have initial results by the time our board meets in Chicago in August 2013.

Table 3a. Distribution of New Papers Submitted, 2012–13, Compared with Previous Years (%)

Table 3b. Distribution of Papers Submitted, 2012–13, Compared with Previous Years (%)

Turnaround Times

We have made great efforts to reduce the number of days it takes to process manuscripts from first receipt of a submission to first decision (Table 2). As indicated in the table, despite the substantial uptick in submissions processed by the UNT team in 2012–13, turnaround times decreased from an average of 68.9 days in the last year of the UCLA team’s tenure to 41.3 days currently, an approximately 40% decrease. One of our primary goals was to shorten the time for editorial assistant vetting and co-editor reviewer assignment. Our editorial assistants have been very diligent in processing manuscripts quickly, and we have endeavored to assign reviewers as quickly as possible. We have also made a practice of directly contacting late reviewers to expedite the review process, although our reviewers have generally been very prompt in completing their reviews (34 days on average). Indeed, the lion’s share of the credit for reducing turnaround times lies with the efficiency of our editorial assistants and our reviewers.
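
For reference, the approximately 40% reduction follows directly from the two averages reported above:

\[
\frac{68.9 - 41.3}{68.9} = \frac{27.6}{68.9} \approx 0.40
\]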

Mix of Submissions

In terms of the mix of submissions (see Table 3a), the distribution of submissions during 2012–2013 changed only very slightly compared to previous years. Categorized by disciplinary subfield, the papers we received from July 2012 to June 2013 are reported in Table 3a. The distributions are consistent with those in previous APSR reports. The largest proportion of manuscripts continues to come from the Comparative Politics field (32%), with a decline in the proportion of manuscripts from American Politics (21%, compared to 23% in the previous year). The biggest increase, and in our view a most encouraging development given our stated goal of increasing the number and diversity of submissions, is the jump in submissions from the International Relations field, from 17% in 2011–2012 to 20% currently. Submissions from Normative Theory, Formal Theory, Methods, and Race, Ethnicity, and Politics remain consistent with past submission patterns.

In terms of the mix of submissions by approach, the patterns for 2012–2013 are also consistent with past patterns. The largest proportion continues to be quantitative (54.0%), and the percentages for the other approaches remained consistent with the last year under the UCLA team and with previous years. There has been a slight decline in Formal and in Formal and Quantitative approaches, but increases in Small N, Interpretive/Conceptual, and Qualitative and/or Empirical submissions. Overall, in the past year, Formal, Quantitative, and Formal and Quantitative submissions constitute 71% of all submissions, compared with 74% in 2011–12 and an average of 72% over 2008–12. Thus, there has been very little change in the distribution of submissions by approach.

In addition to traditional indicators of the diversity of submissions that have appeared in past reports, we have also collected data on two other indicators of diversity during the period July 2012–January 2013: gender of first author of the submission, and national location of first author of the submission. These data were not collected by previous editorial teams.

Thus far, 76% of manuscript first authors during this period were men and 24% were women (for two manuscripts, the gender of the first author could not be determined). Although we believe this represents progress, it is still lower than the estimated 32% of the APSA membership made up of women. Further, approximately 31% of first authors of submitted manuscripts were based at non-US institutions, an encouraging sign. We hope to improve the diversity of submissions on all dimensions and will continue to monitor trends in gender and international authorship.

OUTCOMES

Table 4 reports the outcome of the first round of the review process for 2012–2013, along with previous years for comparison. For the past year under the leadership of the UNT team, the proportions of summary rejections and inappropriate submissions (both without review), rejections after review, conditional acceptances, and acceptances after the first round were very consistent with the percentages reported in previous years.

Table 4. Outcome of First Round of the Review Process (%)

Continuing the practice of our predecessors, we have made use of summary rejection to relieve “reviewer fatigue” and to remove from consideration submissions that would almost surely not survive the usual review process. Compared with 2011–2012, summary rejections in 2012–2013 remained about the same, at 20% of the total. Rejection after review also remains at about the same percentage as in previous years (71%). The percentage invited to revise and resubmit is slightly higher (8.9%) than in previous years (although comparable to 2009–10). These differences are largely due, in our view, to our decision as an editorial team to avoid inviting “de novo” resubmissions (or “reject and resubmit”), which was a practice of previous editorial teams. Rather, we either reject or invite a revise and resubmit. This is consistent with the practice of other major journals, and we believe that avoiding de novo resubmissions is generally a wise practice that we will continue.

Tables 5a and 5b report accepted manuscripts by field and approach. Papers accepted by field showed fairly stable proportions of manuscripts accepted in American Politics (21%), Normative Theory (16%), and Comparative Politics (33%). Acceptances rose in International Relations (from 7% to 11%) and Methods (from 5% to 7%). However, there was a decline in acceptances of papers classified as purely Formal Theory (from 10% to 4%), although compared to 2010–11 this decline is not nearly as great as it appears: the current result is in line with results reported prior to 2011–12, and the 2011–12 increase to 10% is in many ways an outlier compared with previously reported data. Further, many of the papers that included formal theoretical approaches were subsumed within papers in other fields, particularly Comparative Politics manuscripts.

As indicated in Table 5b, the percentage of Formal, Quantitative, and Formal and Quantitative acceptances continued to decline slightly; taken together, they accounted for 66.5% of all papers accepted in 2012–2013, somewhat lower than the 74% reported by the UCLA team in 2011–12 and much lower than the UCLA team reported in 2009–10 (84%). On the other hand, there has been a significant increase in the proportion of qualitative, conceptual, and interpretive pieces accepted by the Review: in 2012–2013, 33.5% of accepted manuscripts fell into these categories, up from 26% in 2011–12 and 16% in 2009–10. We take this as evidence that the Review is making significant progress in diversifying its content, particularly in terms of approach. Thus, continuing the trend established by the UCLA team, the Review is becoming more diverse in the types of articles appearing in the journal.

Table 5a. Distribution of New Papers Accepted by Field (%)

Table 5b. Distribution of Papers Accepted by Approach (%)

VISIBILITY

Thanks largely to the efforts of the UCLA team, the American Political Science Review remains the top-ranked journal in political science, with a Thomson Reuters Impact Factor (IF) of 3.933 in 2012. Further, the Review’s five-year impact factor for 2012, 4.516 (up from 3.759 in 2011), places the Review back in first place, ahead of the American Journal of Political Science (3.960) and Political Analysis (3.856). This is a tremendous accomplishment and is largely due to the efforts of our predecessors at UCLA.

In addition to regaining the top IF ranking for the Review, we have also worked closely with Cambridge University Press (particularly with Michael Marvin) to publicize pieces that appear in the Review more broadly. This has included press releases, email notifications, and other electronic media (such as Twitter) to “get the word out” about work that appears in the APSR. Further, we have forged a relationship with the editors of the popular political blog The Monkey Cage to highlight important pieces scheduled to appear in the Review, using it to publicize and preview pieces in much the same way as movie “trailers.” We believe these efforts will greatly increase the public visibility of the Review in the future.

Finally, it should be noted that the physical production of the APSR is currently on schedule, and issues are now produced on time.

CONCLUSION AND FUTURE PLANS

Based upon the above, the Review has made great strides in the past year in terms of significantly reducing the processing times of manuscripts to first decision, maintaining the diversity of types of submissions to the Review (and increasing the proportion of submissions from historically less represented fields, such as International Relations), and increasing the diversity of types of articles accepted by the APSR for publication. In short, we have made good progress in realizing the goals that we laid out in our initial editors’ manifesto.

Our future plans include continuing our outreach efforts to connect with various constituencies in our discipline, in order to further increase the number and diversity of submissions. Over the past two years, we have attended dozens of conferences and professional meetings that host a great many political scientists, where we have addressed the leadership of these organizations, their organized sections, or caucus groups of political scientists, discussed our strategic goals, and welcomed their suggestions and input. Many of these meetings are ones that one or more of the editors normally attend, so we will likely continue these outreach efforts over the course of our tenure as APSR editors. In 2014, we are scheduled to make additional outreach visits to a number of national and international conferences, workshops, and meetings to further broaden these efforts.

In addition, in the next year we intend to take a leadership role in promoting greater data access and research transparency for pieces published in the Review. In conjunction with the APSA DA-RT (Data Access and Research Transparency) initiative and the APSA Publications Committee, we hope to develop a more concrete and workable DA-RT policy for the APSR.

Thank you very much for the opportunity to serve the association and our discipline, and we remain grateful for the trust and support of our colleagues. We welcome your comments and suggestions as we proceed.
