
American Political Science Review

Editors’ Report July 1, 2018–February 29, 2020

Published online by Cambridge University Press: 10 July 2020


© American Political Science Association 2020

In this report, we cover both the editorial term from July 1, 2018 to June 30, 2019 and the current period from July 1, 2019 to February 29, 2020 to discuss the journal’s operations. However, because the latter does not constitute a full editorial term, we treat it separately to maintain comparability with previous reports and present its numbers in a dedicated section at the end of this report. As editors of the American Political Science Review, we continue to provide insight into our editorial process by reporting on the number of submissions, workflow and turnaround times, and invited reviewers. We also present numbers on issues that have raised concerns during our editorship, such as the gender mix, (desk rejection) outcomes, the subfield distribution, and scholarly visibility.

Before we begin, we would like to express our great thanks to Presidents Rogers Smith and Paula McClain, the APSA staff, the APSA Council, and the APSA Publications Committee, as well as to Cambridge University Press, for their continued support and guidance over the past years. We would also like to thank the members of our editorial board, who have provided countless reviews and served as guest editors during our tenure. Finally, we thank all of the authors who submitted their manuscripts and the reviewers who evaluated them. Without their support, it would be impossible to sustain an effective review process. We wish the new editorial team all the best and will help to ensure a smooth transition for their June 1st start.

EDITORIAL PROCESS AND SUBMISSION OVERVIEW

In the following section, we present an overview of the editorial process and submissions during 2018–19. As in previous reports, we discuss the number of submissions, workflow and turnaround times, and invited reviewers. We retrieved the data from our editorial management system. Briefly summarized, we experienced a decrease in new submissions this year. However, the total of 1,370 submissions (including revisions) remains the second highest on record, with an increasing share of manuscripts in the letter format, while our turnaround times increased compared to previous years. We are particularly grateful for the support of our reviewers, who helped us manage this very high number of submissions, admittedly at the expense of a higher number of desk rejections.

Number of Submissions

Between July 1, 2018 and June 30, 2019, we received 1,174 new submissions, translating to an average of about 3.2 new submissions per day. We interpret the high number of new submissions, especially from authors outside the US, both as a sign of the journal’s popularity and international reach and as a sign that authors appreciate our efforts in spite of a high rejection rate. This is about 7% fewer than the 1,267 new submissions we received in the same period of the previous year. The number of revisions, however, increased to 196, about 4% higher than the previous year’s 189. Figure 1 shows both the number of new submissions and the total number of received submissions per year when revisions are included.

Figure 1 Submissions per Year (by First Receipt Date)

We are proud that our shorter publication format, the letter, continues to rise in popularity. Letters address an important research problem or question, offering a novel perspective on existing research and encouraging scholarly debate in the discipline. In total, we received 171 letter submissions in this period, constituting about 15% of all new submissions. Figure 2 shows the steady increase in the submission share of letters since their introduction. In terms of the subfield breakdown, while letter submissions do not perfectly mirror submissions in the longer manuscript format, they reflect a similar pattern: Comparative Politics makes up 29% of letter submissions (32% of manuscripts), International Relations 10% (14%), Formal 9% (5%), and Other 9% (10%). The main differences are in American Politics, which makes up a noticeably larger proportion of letter submissions (27%) than of manuscript submissions (16%), and in Methods (10% of letters vs. 4% of manuscripts), while the proportion of Normative Political Theory letters is 12% lower than the corresponding manuscript share.

Figure 2 Letter Submissions as Share of Manuscripts Received per Semester (by First Receipt Date)

Workflow and Turnaround Times

One of our editorial goals is to sustain an efficient workflow that reduces the time to a first decision. Given the very high number of submissions and a relatively limited reviewer pool, this is not a trivial task. In 2018–19, it took an average of three days from first receipt until a manuscript was first tech-checked. We check all submissions against our technical standards regarding length, figures, tables, etc., and, where necessary, ask authors to correct their submission. The overall duration from first receipt until a manuscript was forwarded to our lead editor was four days, including the roughly 27% of manuscripts returned to authors for technical issues.

Usually within a very short time, the manuscript was then either desk (summary) rejected by our lead editor or passed on to an associate editor. This first round of desk rejections by the lead editor almost exclusively concerned submissions that either speak to another, more specialized audience or do not fit within our two publication formats. From the assignment of an associate editor until the invitation of the first reviewer (which reflects the time required for researching reviewers), it took on average another 13 days. Alternatively, the associate editors can also desk reject manuscripts, which took on average eight days after assignment. These desk rejections mostly concern studies that would likely not be successful in peer review, in particular when the editor is confident that appropriately selected reviewers would reject them.

Although desk rejections have drawn criticism, there are usually several reasons for this decision. Most often, the authors failed to engage with existing knowledge, the methods were weak, or the research did not speak to broader questions or debates in the respective field or move the needle in those debates. Because this judgment rests on editorial experience, it is necessarily subjective. We therefore respond to authors who object to a desk rejection by the lead or an associate editor by reviewing the author’s comments and often requesting a consultation from another (second) editor.

Table 1 provides details on the development of turnaround times. It shows the duration between the main stages of the editorial process: from submission to editor assignment, from editor assignment to first reviewer invitation, from editor assignment to first decision, and from submission to first decision (distinguishing between desk rejections and decisions after review), starting from the initial submission date. In contrast to the “First Receipt Date,” which is the first time we receive a manuscript, the initial submission date refers to the date on which we received the version of a manuscript that was not sent back to the authors due to formatting issues.
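For readers curious how such durations are derived, the following is a minimal sketch of computing stage-to-stage turnaround times from date stamps; the column names and dates are hypothetical illustrations, not our actual editorial data or reporting code.

```python
# Minimal sketch: turnaround times from (hypothetical) editorial date stamps.
import pandas as pd

log = pd.DataFrame({
    "initial_submission": pd.to_datetime(["2018-07-02", "2018-07-05"]),
    "editor_assigned":    pd.to_datetime(["2018-07-06", "2018-07-09"]),
    "first_decision":     pd.to_datetime(["2018-07-20", "2018-11-01"]),
    "desk_rejected":      [True, False],
})

# Days between the main editorial stages.
log["submission_to_editor"]   = (log["editor_assigned"] - log["initial_submission"]).dt.days
log["submission_to_decision"] = (log["first_decision"]  - log["initial_submission"]).dt.days

# Average time to first decision, separated by desk rejection status,
# mirroring the distinction made in Table 1.
print(log.groupby("desk_rejected")["submission_to_decision"].mean())
```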

Table 1 Journal Turnaround Times (in days)

* Terms run from July 1 to June 30 of the following year, except for the term 15–16, which runs from July 1, 2015 to August 31, 2016, and the term 16–17, which runs from September 1, 2016 to August 31, 2017.

Partly due to the increasing workload that our editorial team has to manage, the time until a first decision increased from 66 days in 2017–18 to 71 days in 2018–19. If we exclude desk rejections, which are processed rather quickly, the time to first decision increases to about 110 days.Footnote 1 A major reason for this is that we often have to “chase” reviewers to submit their feedback in a timely manner. For example, the average time from the first reviewer invitation until at least three reviewer reports are submitted has increased from 61 days in 2016–17 to 70 days in 2018–19, an increase of about 15%. The time increases further when we receive diverging recommendations and need to collect additional reviews, lengthening the wait for the last review before a final decision can be made.

Invited Reviewers

In total, we invited 3,717 reviewers in the 2018–19 term. While 768 of the invited reviewers declined, 2,519 accepted their invitation to review. For the remaining invitations, the assignment was either terminated after the reviewer agreed or a response is still pending. Based on the reviews completed between July 1, 2018 and June 30, 2019, reviewers took on average 37 days after the invitation to complete their reviews. We are happy that the share of reviewers who completed their review remained stable (see table 2). We also consulted our editorial board members on 90 distinct manuscripts, sending out a total of 94 invitations, of which 61 were completed and 20 declined; the remainder were either terminated or are still pending. Additionally, we are grateful to the five board members who stood in as guest editors for manuscripts that could not be handled by our team over this term.

Table 2 Number of Invited Reviewers and Completed Reviews (By Invitation and Completion Date, respectively)

* Terms run from July 1 to June 30 of the following year except for the term 15–16 which runs from July 1, 2015 to August 31, 2016 and the term 16–17 which runs from September 1, 2016 to June 30, 2017.

SCHOLARLY DISCUSSIONS AND EDITORIAL OVERVIEW

In the following section, we aim to contribute to hotly debated issues in and outside our discipline. As in our previous efforts, we use our numbers to discuss gender, subfields and methods, outcomes and desk rejections, and the visibility of scholarly publications. We retrieved the data from our editorial management system.

Gender in the APSR

Gender has become a hotly debated issue in and outside political science, including the gender gap in scholarly journals. In our reports, we first aggregate solo and coauthorship and then differentiate between submissions from women only, men only, and mixed-gender teams.Footnote 2 We were able to classify the gender of the authors of 1,267 submissions that the APSR received during the editorial term. During 2018–19, 64% of submissions were authored by men (solo or team), while 22% were submitted by mixed-gender teams and 14% by women (solo or team). Put differently, 86% of submissions had at least one male author and 36% at least one female author. Figure 3 shows the general trend over time: the share of all-male contributions has slightly decreased, the share of women-only submissions (solo or team) remains low, and mixed-gender teams show an increasing trend.
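As a rough illustration of the classification step described in footnote 2, the following Python sketch queries the public genderize.io service (the service behind the genderizeR package we used) and buckets an author team; the names and the bucketing rule are illustrative, not our production pipeline.

```python
# Illustrative sketch of first-name-based gender classification via the
# genderize.io API; not the report's actual classification code.
import requests

def classify_name(first_name):
    """Return the most likely gender for a first name, or None if unknown."""
    resp = requests.get("https://api.genderize.io", params={"name": first_name})
    resp.raise_for_status()
    return resp.json().get("gender")  # "male", "female", or None

def classify_submission(first_names):
    """Bucket an author team as in the report: men only, women only, or mixed."""
    genders = {classify_name(name) for name in first_names}
    if None in genders:
        return "unresolved"  # such cases were hand-coded (see footnote 2)
    if genders == {"male"}:
        return "men only"
    if genders == {"female"}:
        return "women only"
    return "mixed-gender team"

# Hypothetical two-author team.
print(classify_submission(["Ada", "Grace"]))
```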

Figure 3 Submissions by Gender for Manuscripts Submitted Between July 2008 and February 2020

Because one point of the gender gap criticism concerns editor bias, we consider our overall decision making with respect to gender by presenting a breakdown of the submissions that received their final decision in the previous term (figure 4). Although the distribution is highly skewed, the strongest predictor of whether a manuscript gets desk rejected is whether it is solo-authored by a man: 47% of male solo-authored submissions are desk rejected, followed by 40% of female solo-authored submissions. In general, team submissions experience a lower desk rejection rate (38% for female, 33% for mixed, and 31% for male teams). Regarding final acceptance rates, team-authored manuscripts also seem more successful, with much higher success rates for all-male teams (11%) than mixed-gender teams (5%) or all-female teams (3%). Yet solo-authored submissions by women had a slightly higher acceptance rate than solo-authored work by men (6% vs. 5%). Despite these different proportions among solo male and solo female authors, the number of decisions is too small to conclude that authorship type predicts differences in acceptance rates (p = 0.77).
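To illustrate the kind of test behind such a p-value, the sketch below runs Fisher’s exact test on a 2×2 table of accepted versus not-accepted solo-authored submissions; the counts are back-of-the-envelope reconstructions from the reported shares, not our actual decision data, so the resulting p-value will not match the reported one exactly.

```python
# Illustrative Fisher's exact test on acceptance by solo-author gender.
# Counts are hypothetical reconstructions from the reported acceptance
# shares (~5% solo male, ~6% solo female), not the journal's data.
from scipy.stats import fisher_exact

#              accepted, not accepted
solo_male   = [19, 361]
solo_female = [7, 110]

odds_ratio, p_value = fisher_exact([solo_male, solo_female])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2f}")
```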

Figure 4 Percentage Share of Final Decision Outcome by Type of Authorship Between July 2018 and June 2019

In this regard, of the 75 submissions that were accepted in the last term, 19 were solo-authored by men and 36 were coauthored by all-male teams. Twelve were the work of mixed-gender teams. Seven were solo-authored by women, and one was coauthored by an all-female team. For comparison over time, table 3 shows the gender mix of accepted manuscripts for the past 10 years. The shares of accepted manuscripts correspond to the shares of submissions. The numbers confirm the currently low share of publications authored by women only (both solo and team authored). We will continue to follow this development closely to detect whether the trajectory is systematic.

Table 3 Gender Mix of Accepted Papers (in percent of total)

* Terms run from July 1 to June 30 of the following year except for the term 15–16 which runs from July 1, 2015 to August 31, 2016 and the term 16–17 which runs from September 1, 2016 to June 30, 2017.

Mix of Submissions

As already mentioned, the distribution of submissions and publications by subfield is difficult to compare because our authors determine the classification of their manuscripts. However, it remains a central task of the lead editor to assign the associate editor with the most relevant expertise for selecting reviewers and evaluating their reviews. The author-driven classification makes identifying trends difficult, in particular for the subfields of Comparative Politics and International Relations.

Like in previous terms, the share of submissions is highest from Comparative Politics, followed by American Politics, Normative Political Theory, and International Relations. The first section of table 4 shows the pattern of submissions by subfield over time. Overall, the distribution of submissions across subfields remained stable.

Table 4 Submitted Papers by Subfield, Approach, and Location of First Author (in % of total)

* Terms run from July 1 to June 30 of the following year except for the term 15–16 which runs from July 1, 2015 to August 31, 2016 and the term 16–17 which runs from September 1, 2016 to June 30, 2017.

Most notably, Comparative Politics submissions increased to 32%, while submissions in Normative Political Theory decreased to 13%. The share of methodological submissions continues to rise, increasing to 5%. The notable increase in “Other” suggests that the journal’s reputation is expanding into other fields.

In terms of the approach of the submitted manuscripts (as classified by the editorial teamsFootnote 3), the second section of table 4 shows that Quantitative approaches continue to constitute the largest share of submissions, at nearly 65%, while submissions classified as Interpretative/Conceptual form the second largest share, at almost 18%. The share of Formal papers increased to 8%. The share of Qualitative/Empirical submissions increased (to 5%), as did the share of Small-N studies (to 3%). Note that the coding of submissions in previous terms was non-exclusive (multiple classifications were possible), which makes a thorough comparison over time difficult.

As in previous reports, we have also gathered data on the nationality of authors. To indicate the global reach of the journal, we use the share of submissions for which the corresponding author’s institution lies outside the US (see the last row of table 4). During this term, the share of non-US submissions remained almost constant at 39%.Footnote 4 After the US, the countries with the most submissionsFootnote 5 are the United Kingdom (9.2%) and Germany (3.6%). Although the criticism of editor bias also extends to such descriptive characteristics, it would go too far to conclude that the composition of our editorial team is responsible for this British/German dominance.

Outcomes and Desk Rejections

Regarding outcomes, a current point of criticism concerns desk (summary) rejections by either the lead or an associate editor. In general, these rejections serve two purposes: first, they protect the limited reviewer pool from being overused on manuscripts that fail to fulfill scholarly standards or that speak to a more specialized (different) readership; second, they allow us to respond to the authors in a timely manner so that they can polish their manuscripts and/or submit them to another scholarly outlet.Footnote 6

Table 5 displays the outcomes after the first round. The number of desk rejections has risen during our editorship relative to rejections after review, as it is a specific goal of ours to reduce overall turnaround times for authors and to avoid “reviewer fatigue.” Accordingly, the share of desk rejections remained high at about 39% during 2018–19, while the share of rejections after review increased to 55%. In total, however, we end up with comparable rejection rates over time, around 90% since 2007. At the same time, the share of R&Rs fell from 9% in the last term to about 6%.

Table 5 Outcome of First Round (in percent of total)

* Terms run from July 1 to June 30 of the following year except for the term 15–16 which runs from July 1, 2015 to August 31, 2016 and the term 16–17 which runs from September 1, 2016 to June 30, 2017.

Table 6 Accepted Papers by Subfield, Approach, and Location of First Author (in percent of total)

* Terms run from July 1 to June 30 of the following year except for the term 15–16 which runs from July 1, 2015 to August 31, 2016 and the term 16–17 which runs from September 1, 2016 to June 30, 2017.

Between July 1, 2018 and June 30, 2019, our editorial team accepted 75 manuscripts. Of these, the highest share, 30 publications, came from Comparative Politics, followed by 16 from American Politics and seven from Normative Political Theory. We published 10 Formal Theory articles, three methodological contributions, three manuscripts on race and ethnicity, four papers from International Relations, and two classified as Other. Regarding letters, we accepted 15, a 20% share of acceptances, which is comparable to the submission share.

VISIBILITY & TRANSPARENCY

In addition to the number of submissions, incoming reviews, and acceptances, the academic impact and public outreach of research published in the APSR is a major concern. In general, academic impact and public outreach remain difficult to measure and compare objectively, for many reasons. For example, a journal’s impact factor may be disproportionately affected by co-citation patterns of symposia and special issues, publication formats the APSR does not offer in order to comply with the typical standard of a premier disciplinary outlet. Nevertheless, available scores and indices may still flag potential shortcomings in a journal’s editorial process or document how a journal’s impact is affected by editorial changes, such as our decision to introduce FirstView. In the following section, we therefore discuss the development of the APSR’s annually published impact factor as well as the Altmetric attention score. Although impact factors suffer from time lags, the two measures together may provide additional insight into how the journal’s quality is perceived from the outside.

Impact Factor

The impact of a journal (and of scholarly work in general) is typically evaluated based on the number of academic citations that articles published in a given period receive thereafter. This idea lies at the core of the impact factor, which is available for both a two-year and a five-year window. For a long time, the APSR has ranked among the top three generalist outlets in political science according to both measures. For example, the APSR had the highest two-year impact factor of the three major journals between 2007 and 2014 (with 2008 as the only exception).

However, compared to other scholarly outlets, the APSR impact factor has declined over the past years, visible first in the relative decline of the two-year impact factor. The 2018 two-year impact factor, for example, measures the number of citations in 2018 of manuscripts published in 2016 and 2017 (divided by the number of publications), but it is only published in 2019. According to figure 5, the recent decline of the two-year impact factor has now stopped, and the factor even increased slightly in 2018. Nevertheless, the APSR still ranks second among the three major journals, with a two-year impact factor of 3.9, compared to 4.4 for the American Journal of Political Science and 2.5 for the Journal of Politics. Moreover, while we observed a drop in the five-year impact factor in 2017, it increased again in 2018 and is now about 6.6. It remains to be seen in the coming years whether our editorial team was able to influence the impact factor, as publications in 2018 were the first fully handled by us.
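Written out in our own notation (the idea is standard, though the exact counting rules of the index provider involve further details), the two-year impact factor for year t is:

```latex
\mathrm{IF}^{(2)}_{t} =
  \frac{\text{citations received in year } t \text{ by items published in years } t-1 \text{ and } t-2}
       {\text{number of items published in years } t-1 \text{ and } t-2}
```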

Figure 5 Impact Factor since 2007

Altmetric Attention Score

In addition to the long-term scholarly impact, editors and publishers care about the short-term outreach of their publications more generally: their public relevance, media coverage, and whether a publication is being discussed on social media. This is where the Altmetric attention score comes in. Sourced from the internet by an automated algorithm, it is provided for each publication. In essence, the score is a weighted count of the attention a publication receives, among other places, in the news, in research blog entries, in policy documents, and on Twitter.Footnote 7 Moreover, in contrast to the impact factor, the attention score provides immediate feedback on the level of outreach upon online publication (even though we must stress that the measure does not allow one to draw conclusions about a publication’s scholarly quality).
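As a stylized illustration of such a weighted count, the sketch below assigns weights to a few attention sources; both the weights and the mention counts are our own illustrative assumptions, not Altmetric’s actual weighting scheme.

```python
# Stylized weighted mention count in the spirit of the Altmetric attention
# score. Weights and counts are illustrative assumptions only.
ILLUSTRATIVE_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "policy_document": 3.0,
    "twitter": 1.0,
}

def attention_score(mentions):
    """Weighted count of mentions across attention sources."""
    return sum(ILLUSTRATIVE_WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

# Hypothetical article: 2 news stories, 1 blog post, 40 tweets.
print(attention_score({"news": 2, "blog": 1, "twitter": 40}))  # 61.0
```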

For a first tentative impression of how much attention APSR publications have received over time according to this measure, figure 6 shows the sum of Altmetric attention scores for all articles by year of publication.Footnote 8 These numbers should be taken with a grain of salt, since articles published before 2011 are naturally less likely to receive attention in retrospect, although they sometimes do.Footnote 9 Nevertheless, the numbers hint at an increase in the public outreach of articles published in the APSR, in particular in 2018, for which the sum of attention scores is more than twice as large as in the year before. The sum for 2019 is also higher than that for 2017.

Figure 6 Sum of Altmetric Attention Scores by Year of Publication

Looking only at articles published in 2019, we can highlight a few additional insights. The median attention score of manuscripts published (in print) in 2019 is 18 (as of March 6, 2020). By comparison, the article with the highest score, “Local News and National Politics” by Gregory J. Martin and Joshua McCrain, has an Altmetric score of 587. Table 7 shows the 10 articles with the highest attention scores. It is interesting to note that all of the top 10 articles are based on quantitative approaches and, in the majority of cases (but not exclusively), emphasize causal identification. With respect to news coverage, according to Altmetric, seven APSR articles published in 2019 were mentioned in 50 news reports. This is fewer than for articles published in 2018, of which 11 were mentioned in news outlets. Together, these numbers suggest a consistently high level of attention for the APSR in recent years.

Table 7 Top 10 Manuscripts Published in Print in 2019 According to Altmetric Score

SUBMISSIONS BETWEEN JULY 1, 2019 AND FEBRUARY 29, 2020

In this final section, we discuss the journal’s operations from July 1, 2019 to February 29, 2020. We have data available for this period that we want to share with our readers. However, the period does not constitute a full editorial term, which makes comparability with previous terms difficult and thus requires separate treatment.

In terms of submissions, the APSR received 863 manuscripts between July 1, 2019 and February 29, 2020, of which 726 were articles (84%) and 137 were letters (16%). This corresponds to an average of about 3.6 submissions per day during this period, more than during the 2018–19 editorial term. In addition, we received 173 revised manuscripts after review.

Regarding the gender distribution of authorship (among new submissions), solo male authors constituted the largest share (32%), followed by all male teams (29%). We received 198 mixed-gender team submissions (23%), 111 submissions from solo female authors (13%), and 25 submissions from all-female teams (3%). The share of submissions which have at least one woman author is accordingly 39%, three percentage points higher than during the previous editorial term, 2018–19.

Turning to the distribution of subfield classifications, 29% of these submissions were Comparative Politics, 20% American Politics, 14% Normative Political Theory, 15% International Relations, 6% Formal Theory, 4% Race/Ethnicity, and 7% Other. Moreover, 42% of submissions came from corresponding authors whose institutions lie outside the United States.

In the eight months since July 2019, our editors invited 2,589 reviewers, 67% of whom accepted the invitation, a rate comparable to previous years. In addition, we received 1,547 completed reviews. In the first round of decisions, the APSR editors desk rejected 46% of submissions, a higher rate than in previous terms; another 46% were rejected after review, and 8% were invited to “Revise and Resubmit.”

With respect to final decisions, 54 manuscripts were accepted for publication. In contrast to the gender distribution among new submissions, all-male teams (37%) and mixed-gender teams (26%) constituted the largest shares among accepted manuscripts in this period, followed by solo male authors (22%). Female-authored manuscripts constituted the smallest share, with 11% of publications authored by solo female authors and 4% by all-women teams. Thirty-one percent of the accepted manuscripts came from the subfield of Comparative Politics, 28% from American Politics, and 11% from Normative Political Theory. In addition, we accepted five International Relations papers (9%) as well as four papers each from Formal Theory and Methods (7% each). Two accepted papers concern Race and Ethnicity (4%), and one is classified as “Other.” With 40 manuscripts, more than two thirds of the acceptances took a Quantitative methodological approach; six manuscripts were Interpretative/Conceptual, five took a Formal approach, and three used a Qualitative approach.

CONCLUSION AND OUTLOOK

Overall, our editorial numbers indicate an effective editorial process in spite of the increasing number of submissions. The letter format shows increasing popularity and accounts for about 20% of our submissions. In addition to our workflow and turnaround times, we are very happy about the support of our reviewers, who continue to sustain the editorial process with a high acceptance rate of invitations and a high share of completed reviews. As we know from correspondence with other editors, this is not the typical trend at many other scholarly outlets.

Like other scholarly outlets, we are also confronted with hotly debated issues. We still find a low submission rate of manuscripts (co-)authored by women. Existing explorations of our data suggest that this gender gap is not associated with the editorial process. Another issue of discussion concerns the mix of subfields and approaches. Although four subfields dominate the mix of submissions and publications, we still cover a large proportion of other subfields. This is different for approaches, where quantitative studies dominate our submissions and publications; however, interpretative approaches come in second, and formal approaches show an increasing trend. Regarding desk rejections, we recognize the criticism, in particular when we invite recently rejected authors to review. We take a second look when authors challenge our decisions, and we acknowledge when we are incorrect. However, we also need to make quick decisions on manuscripts that are very unlikely to survive the review process. Oftentimes, authors fail to pay close attention to our publication formats or to the scholarly discussions taking place inside and outside the APSR, and therefore do not situate their manuscripts within the broader lines of political inquiry in their respective subfields.

Finally, public visibility is becoming a more important concern. When we started our editorship, we were already confronted with this trend and responded by introducing FirstView and the letter format. The development of the impact factors revealed that the APSR had lost its prominent position compared to other scholarly outlets in political science. In addition to letters and FirstView, the availability of the data and materials used in articles may further increase the visibility and attractiveness of APSR publications. In 2015, the APSR submission guidelines were updated to incorporate DA-RT principles.Footnote 10 Today, we host 181 published datasets and materials, of which approximately 158 have been updated during our tenure, with additional datasets in the pipeline to be released with the publication of the corresponding articles. That said, several contributors maintain their own Dataverses or data-hosting sites; where we have not been able to link a dataset to our Dataverse, we keep a list of the APSR articles with their Digital Object Identifiers (DOIs).

Although at times we may seem aloof, we continue to listen and respond to criticism and to the meaningful discussions taking place in the profession. While peer review changes slowly (remember that it still takes over a year for an article to go from submission to publication, so change cannot occur overnight), we are hearing concerns and adjusting where possible to make the APSR a more inclusive environment for all researchers. We welcome the new team to the APSR and will help them launch a successful start when they officially take over in June 2020. We thank you again for your continued support as readers, authors, reviewers, board members, and future editors.


NOTES

1 Please note that the turnaround times for the current term may get longer as they are determined by comparing date received and decision rendered, and not all submissions have had a decision rendered.

2 We used the genderizeR package in R to identify gender and then hand-coded all non-identified cases.

3 Starting in July 2010, the UNT editorial team began gathering information on the methodological approaches of the submissions they received.

4 Please note that these statistics are dependent on user information saved in Editorial Manager. While our team may from time to time update our contributors’ user data, we do not have the capacity to keep all records up-to-date. We therefore recognize that information on contributors’ whereabouts will not, and cannot, be completely accurate.

5 Whose manuscripts passed the technical check.

6 In our experience, the most difficult cases of desk rejections concern manuscripts of authors who we almost simultaneously invite to review other manuscripts.

8 The Altmetric attention scores for this report were downloaded on August 7, 2019.

9 A great resource to view a journal’s altmetric attention scores is located at https://app.dimensions.ai/discover/publication.

10 The following data presented excludes any archived material that is hosted on private researchers’ websites.
