AGU’s Commentary Collections

A Commentary is an AGU paper type that offers a perspective on a recent result, controversy, or special event in a particular field. JGR Space Physics published 15 Commentaries in 2016, most of them as part of the special section on Unsolved Problems in Magnetospheric Physics (UPMP). These short articles are meant to spur discussion, action, and hopefully eventual resolution regarding the chosen topic. In JGR Space Physics, Commentaries are too new for us to understand and quantify their influence, but other journals have published them for many years, and the anecdotal evidence is good enough that AGU is encouraging all of its journals to publish more of them.

To better highlight and promote the existence of these papers, AGU has assembled several new special collections that gather these Commentaries for easy discovery. The link is on all journal websites, under the Special Collections pull-down menu:

[Link: AGU Commentary special collections page]

On this page are links to the Commentaries in each AGU discipline, including Space Weather and Space Physics. There are Commentaries here from a few different journals. Because papers cannot be in two special sections in the Wiley paper management system, instead of listing all of the UPMP Commentaries, there is simply a link to that special section’s webpage.

Happy reading!

 

Want Some Salt With That Metric?

I’ve become a fan of the Scholarly Kitchen. It’s a multi-author blog produced by the Society for Scholarly Publishing. They have daily posts about academic publishing across a wide range of topics, including some useful categories for JGR Space Physics readers, like peer review, discovery and access, and a category simply called academia.

[Image: The Scholarly Kitchen banner]

While I was at the AGU EiC meeting this week, a link was circulated to a just-posted Scholarly Kitchen article on the trustworthiness of journal metrics. The author rates the various journal metrics according to their completeness, transparency, and veracity. She uses a clever scale…the “grains of salt” with which you should take each of the metrics. It goes well with my recent posts on metrics.

And the winner is…CrossRef, which only requires a pinch of salt. ISI and Scopus should be taken with a cup of salt, Download Statistics with a bathtub of salt, and Google Scholar and ResearchGate with a classroom full of salt. Yeah, she really doesn’t like Google Scholar for scholarly metrics.

The author is Angela Cochran, who is the Journals Director for the American Society of Civil Engineers and a Past-President of the Council of Science Editors. She knows what she’s talking about on this subject.

I like one of the comments on the article about defining a new SI unit for skepticism, the pinch. A cup of salt is then a kilopinch, a bathtub a megapinch, and a classroom is a gigapinch. Clever.

CrossRef is what is used by Wiley for the “Cited By” link on each paper for all AGU journals, including JGR Space Physics. Here’s a recent example article with a healthy number in the “cited by” tab. When a publisher prepares a paper for production, they check the references for compliance with the database of known scholarly literature. Once published and online, that paper’s link is sent to CrossRef, which resolves the reference tags against its vast database, ensuring that the citation from the new paper is counted in the “cited by” list for each cited reference in it. The system is fast and the linkages are automatically made. CrossRef is a non-profit organization to which nearly all publishers contribute and subscribe, meaning that the database is as robust as possible and yet focused only on scholarly content.
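If you are curious about the count for a particular paper, CrossRef’s public REST API exposes a citation tally for any DOI. Here is a minimal sketch, purely as an illustration: the DOI below is a hypothetical placeholder, and the public “is-referenced-by-count” field is a cousin of, not identical to, the member-only Cited-by counts that power the Wiley pages.

```python
# Minimal sketch: look up a paper's citation count via CrossRef's public REST API.
# Illustration only -- the DOI below is a hypothetical placeholder, and this is
# not the member-only Cited-by service that publisher sites actually use.
import requests


def crossref_citation_count(doi: str) -> int:
    """Return CrossRef's 'is-referenced-by-count' for the given DOI."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    resp.raise_for_status()
    return resp.json()["message"]["is-referenced-by-count"]


if __name__ == "__main__":
    example_doi = "10.1002/2015JA000000"  # hypothetical placeholder DOI
    print(crossref_citation_count(example_doi))
```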

CrossRef does not take the next step of generating an Impact Factor or CiteScore, which are proprietary creations of Thomson Reuters and Elsevier, respectively. What you get with CrossRef is a near-instantaneous update of the “cited by” number and paper listing at the Wiley site for your papers in AGU journals, and you can trust that it is the most accurate count of citations to your paper from other scholarly publications. That’s okay with me. We need to be dishing out kilopinches (or more) of salt with those other metrics, anyway.

Impact Factor Just For JGR-Space Physics

This coming year, I am told that Thomson Reuters will release section-specific Journal Impact Factors for JGR. I want to give the community a heads-up on where we stand.

If your institution has a subscription to Thomson Reuters’ Web of Science, then it is not difficult to do a search for “Journal of Geophysical Research Space Physics” in the “Publication Name” field for specific years (one at a time) and get the citation numbers for a Journal Impact Factor specific to JGR-Space Physics.

I just did this and here are the results:

 

Unofficial JIF for JGR-Space Physics:

Number of items published in 2014:              760

Number of items published in 2013:              711

Cites in 2015 to items published in 2014:      1808

Cites in 2015 to items published in 2013:      2077

My estimate of the 2015 JGR-Space Impact Factor = (1808 + 2077) / (760 + 711) = 3885 / 1471 = 2.64
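For anyone who wants to repeat this exercise with their own Web of Science counts, here is a minimal sketch of the arithmetic (the inputs are simply the numbers listed above):

```python
# Back-of-envelope Journal Impact Factor from the Web of Science counts above.
items_published = {2013: 711, 2014: 760}    # items published in the two preceding years
cites_in_2015   = {2013: 2077, 2014: 1808}  # citations received in 2015 by each cohort

total_cites = sum(cites_in_2015.values())    # 3885
total_items = sum(items_published.values())  # 1471
jif_2015 = total_cites / total_items
print(f"Unofficial 2015 JIF = {total_cites} / {total_items} = {jif_2015:.2f}")  # 2.64
```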

 

Hmm. This is well below the JGR-all sections JIF of 3.3 reported by Thomson Reuters. When I do this same search but with “Journal of Geophysical Research *” for the publication name, then I get a value of 3.20. This is close but a bit lower than the Thomson Reuters value. This is expected because I did not remove Editorials, Prefaces, and other items from the list of “papers”, which presumably have few if any citations and which presumably Thomson Reuters removes from the JIF calculation. So, perhaps the JGR-Space Physics value that I just calculated should be up by a tenth or two, but most likely it is still below 3.0.

I gathered the numbers for the 5-year Impact Factor, and for 2015, JGR-Space Physics has 2.76. Better than the 2-year JIF above, as expected because the journal has such a long cited half-life. With the correction to the denominator, this value is approaching 3.0, but again, probably not all the way up to it.

I was a nerd about it and pulled values back to 2002, when JGR went digital. This gives me 2-year JIF scores back to 2004 and 5-year JIF scores back to 2007. With my simple method, there has only been one 2-year JIF for JGR-Space Physics above 3.0, in 2012. The lowest is in 2004, which I calculate to be 1.89, but I am not sure that I trust the citation numbers for 2002; they are noticeably lower than in other years and it is the year of the “switch.” Removing that year, the next lowest is 2007, at 2.25. Most years are between 2.5 and 2.7.

What does this mean? It means that this is the “level” for the JIF of JGR-Space Physics. Adding in the “Immediacy Index” values (cites in the same year as publication), which are usually between 0.6 and 0.8, the average paper accumulates between 6 and 6.5 citations over its publication year plus the following two years that the JIF window covers.
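For what it’s worth, here is my back-of-envelope version of that arithmetic (my own assumption, not a Thomson Reuters formula): take the Immediacy Index for the publication year and assume the average paper is then cited at roughly the JIF rate in each of the next two calendar years.

```latex
% Back-of-envelope estimate, assuming the average paper is cited at roughly
% the JIF rate in each of its first two full calendar years after publication:
\[
  \underbrace{0.7}_{\text{Immediacy Index}}
  + \underbrace{2.65}_{\text{first full year}}
  + \underbrace{2.65}_{\text{second full year}}
  \approx 6.0
\]
```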

Note that the number of papers without a citation at all is very low. After 2 years, it is already down to just a few percent; nearly all papers are cited at least once. In fact, the total citation count for the average paper increases almost linearly for the first ~8 years or so, as seen here:

[Figure: average cites per paper as a function of years since publication]

I’d say that’s pretty good.

In fact, it is a big reason why I am not afraid of the coming official release of a JIF specific to JGR-Space Physics and its expected value of below 3.0. This journal’s papers have longevity, accumulating citations well past anyone’s journal metric calculations. People have published good papers in this journal, and present-day researchers continue to cite those studies. We take a while to absorb a result and build on it. The JIF is useful as a journal metric, but it is not the whole story.

JIF and CiteScore

This week, Physics Today published an article on the Journal Impact Factor and the new CiteScore index. Both are average citation counts, within a given year, to papers published in a few preceding years. The main difference between the two is that the JIF uses citations to papers from the prior two years while CiteScore includes citations to papers from the previous three years. The other big difference is that Elsevier, the creator of the new CiteScore index, is making everything about the calculation of the values open, while Thomson Reuters only makes the formula and the input numbers available to subscribers, and the actual list of citations is kept proprietary.
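Written out schematically, using the citation windows just described (and glossing over exactly which document types each company counts in the numerator and denominator), the two indices look like this:

```latex
% Schematic definitions; the exact document types counted differ between providers.
\[
  \mathrm{JIF}_{y} =
  \frac{\text{citations in year } y \text{ to items published in } y-1 \text{ and } y-2}
       {\text{citable items published in } y-1 \text{ and } y-2}
\]
\[
  \mathrm{CiteScore}_{y} =
  \frac{\text{citations in year } y \text{ to items published in } y-1,\, y-2,\, y-3}
       {\text{items published in } y-1,\, y-2,\, y-3}
\]
```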

[Image: Physics Today Online banner]

As the Physics Today article notes, the values are similar for most journals between the two indices, but some shifting is evident, especially among the top titles. For JGR (all sections combined), the values are almost identical, with the 2015 Impact Factor being 3.32 and the CiteScore being 3.39 (quoted to three significant digits, which is more precision than I like to use).

Also as noted in the Physics Today article, the similarity in how they are calculated suggests that the complaints about JIF are largely applicable to CiteScore. Okay, it includes another year, but Thomson Reuters already produces a 5-year Impact Factor, so CiteScore splits the difference. Both are susceptible to the size of the “highly cited tail” of the paper distribution in a journal, especially if the number of citable items is relatively small. Also, both are susceptible to manipulation, if publishers were to unethically push authors of new manuscripts into citing papers in their journals.

I find it bewildering that ~5% of journals in existence have a CiteScore of zero (as reported in the Physics Today article). This means that, in the scoring year, there were no citations at all to the articles those journals had published over the prior three years. I have not looked up the names of these journals to look for a trend or commonality but, regardless…wow. Thanks again for reading and citing the papers in JGR Space Physics!

Details of JGR’s 2015 JCR

Thomson Reuters has completely reformatted the Journal Citation Reports (JCRs) at their website, but eventually I was able to sift through the new layout and find most of what I wanted. One of the documents, the Journal Profile Grid, is an Excel sheet in 5-point font. While this is easily correctable, it is annoying on first reading.

As I mentioned in a post last month, 2015 had flat-to-slightly-down 2-year and 5-year Impact Factors. However, in the long term, JGR‘s Impact Factor has been trending slightly upward. It was never above 3.0 in 2007 and earlier, yet has never been below 3.0 from 2008 onward. Here is a nice little graphic from the JCR showing that trajectory:

[Figure: JGR Impact Factor by year, from the JCR]

So, it’s done this (a brief, small dip) before. The little hiccup as of late might be just that, a blip in the long-term upward trend. Or it could be the start of something bad. Let’s hope for the former, not the latter.

Following what I did last year with JCR details, here are some other tidbits of information about JGR.

  • Total cites: rose by ~5% to an amazing 198,000. This is a huge number for a discipline-specific journal. Here’s a chart of JGR total cites by year:

[Figure: JGR total cites by year]

  • Self cites: down a little at 18%. I don’t actually know what this means. A high number (above 10%, say) could be a sign of dominance in the field or it could be a sign of isolation and disconnection from the field. We’ll go with the former.
  • Immediacy index: dropped just a bit to 0.61. Remember, this is cites in 2015 to JGR papers published in 2015; most of the papers in the second half of the year have zero citations.
  • Cited half-life: still greater than 10 years. So more than half of the citations to JGR in 2015 were to papers published more than a decade earlier.
  • Citing half-life: ever-so-slightly up to 9.4 years. This is the median “age” of the references cited in JGR papers.
  • References per paper: up by one to 56. They count papers with more than 100 references as “review articles” rather than “regular” research papers.
  • Eigenfactor score: dropped ever-so-slightly to 0.31. This value is based on the same 5-year citation window but removes journal self-cites and weights each citation by the influence of the citing journal. This is a decent value.
  • Article Influence Score: down a bit to 1.39. This is a per-article version of the Eigenfactor, normalized so that the average journal scores 1.0. Values above unity are good.

All of these metrics are explained in more detail in an earlier post. And again, remember, this is for all of JGR, not just JGR Space Physics.

 

JGR’s 2015 Impact Factor

Thomson Reuters released the 2015 Impact Factors, and the value for the Journal of Geophysical Research is 3.3. This is a 0.1 drop from the journal’s Impact Factor of 3.4 in both 2014 and 2013. It is basically within the noise of year-to-year variation, but it went down a tenth instead of up, which is disappointing.

Here’s the Thomson Reuters logo, just so we have a graphic to go with this post, with a link to their page on this topic:

[Image: Thomson Reuters logo]

The 2015 Impact Factor is calculated as the average number of citations in 2015 to papers published in 2013 and 2014. Thomson Reuters also calculates a 5-year Impact Factor, and the 2015 value for JGR is 3.7, again identical to the 2013 and 2014 values to two significant digits.

Thomson Reuters still has all sections of JGR combined in this value. So, I don’t know what it is specifically for JGR Space Physics.

In good news for the space physics community, AGU’s Space Weather Journal rose from 2.1 to 2.4 in its 2-year Impact Factor, and from 1.9 to 2.3 in its 5-year Impact Factor. Woohoo for Space Weather!

Like last year, I’ll download the Journal Citation Report, take a closer look at the numbers behind this index, and write a follow-up post.

Passing 50,000 Hits

An arbitrary but round-number milestone was reached this week: this JGR Space Physics blog passed the 50 thousand hits mark. Here’s the image from the bottom of the screen, as of a few minutes ago:

[Image: the blog’s hit counter, just past 50,000 views]

Thank you very much to all of the readers of this blog out there, whether you are regular or one-time-only visitors. Watching this number steadily rise lets me know that this effort is worth it.

Here are a few other stats about the blog readership. First, here’s a plot of the daily visitors and page views (i.e., hits) for the last month:

[Figure: daily visitors and page views for the last month]

The numbers at the bottom are the values for today, as of a few minutes ago. As you can see, the blog normally gets between 50 and 100 page views a day, with a bit less on the weekends. The huge spike on May 9 is when my monthly highlights announcement came out in the SPA Newsletter; those days typically get several hundred hits.

Here is a chart of the visitors and views per month for the last year:

[Figure: monthly visitors and views for the last year]

The values at the bottom are for June, which is only three and a half days old. You can see that the blog hovers in the ~2000 views/month range, except for January, which I took off from blogging. That month still had ~1400 visits.

Here are the top 10 blog posts viewed so far in 2016:

[Figure: top ten blog posts viewed so far in 2016]

The “home page” is the most commonly visited, and on this page people can read the latest five posts. Interestingly, the next four are all from 2014. I struck on a good topic with those, and for the most part they are instructive posts on how to understand the AGU manuscript process. Only three on this top-10 list are from this year, but the “home page” views are also views of this year’s posts, so those should count, too.

Most of the readers are from the United States. Here’s the map of 2016 views by country:

[Figure: map of 2016 views by country]

It’s a little easier in table format; here are the top ten countries visiting the blog in 2016:

[Table: top ten countries visiting the blog in 2016]

Countries from almost every continent are on the list. Thanks for being such a diverse audience.

One final factoid: I’ve written 162 blog posts (this is number 163). I’m two and a half years into my four-year term as Editor in Chief, so I am pretty safe in saying that the majority of blog posts have already been written, unless I really pick up the pace in the final year. Please keep the blog post suggestions coming; I eventually get around to writing about most of them. If it seems that I’ve lost one of your suggestions, please feel free to send it in again.

Reviewer Statistics for 2015

We, the five Editors of JGR Space Physics, just published an Editorial thanking the 1,506 scientists who served as peer reviewers in 2015. We greatly appreciate all that the research community does to make this journal what it is. Thank you!

[Photo: the JGR Space Physics editorial board]

This is a photo taken at our editorial board meeting at AGU HQ in July 2014.

This is an increase of about 100 people over the previous year. The total number of reviews conducted was 3,592, also about 100 more than the year before. We sent out 7,660 requests to review in 2015, up about 300 from 2014, so our response rate dropped just slightly, down to 47%. That is still a great number, since the requests include those eventually marked “not needed,” which happens when others fill the two reviewer slots before the remaining requests are answered. If you remove this category (1,755 requests that were “not needed”), then the acceptance rate jumps up to 83%. Awesome!

A value I reported last year is the accept-to-decline ratio. There were 1,884 declines, therefore this ratio is 1.9, just slightly lower than last year but still very high. Thanks for saying yes so often to our requests.

Yet another number for comparison with last year’s reviewer statistics: there were 341 times that a request was designated “no response.” This is when the potential reviewer didn’t answer repeated requests and we moved on to others. It is different from “not needed,” which is when other potential reviewers filled all of the desired slots before that person found time to answer. With a better than 10-to-1 ratio of acceptances to no responses, I think the community is doing very well. So, thank you, again!

Other 2015 stats from the Editorial: (a) the average number of manuscripts reviewed by each reviewer who served last year was 2.4; (b) 277 people did 4 or more reviews in 2015; (c) the total number of manuscript final decisions in 2015 was 1,147; (d) the acceptance rate was 67%; (e) there were 1,334 revision decisions; (f) on average, an Editor makes 2.2 “decisions” for every assigned paper; (g) an average of 3.1 reviews were needed per manuscript, including reviews of revisions.
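As a quick sanity check, here is the simple arithmetic behind most of these figures (my own reconstruction, assuming each quoted value is a plain ratio of the totals reported above; the variable names are mine):

```python
# Quick sanity check on the 2015 reviewer statistics quoted above,
# assuming each figure is a simple ratio of the reported totals.
reviewers          = 1506
reviews_completed  = 3592
requests_sent      = 7660
declines           = 1884
no_responses       = 341
final_decisions    = 1147
revision_decisions = 1334

print(f"reviews per reviewer:      {reviews_completed / reviewers:.1f}")        # ~2.4
print(f"response rate:             {reviews_completed / requests_sent:.0%}")    # ~47%
print(f"accept-to-decline ratio:   {reviews_completed / declines:.1f}")         # ~1.9
print(f"accepts per 'no response': {reviews_completed / no_responses:.1f}")     # better than 10-to-1
print(f"reviews per manuscript:    {reviews_completed / final_decisions:.1f}")  # ~3.1
# Assumption: 'decisions per assigned paper' counts both final and revision decisions.
print(f"decisions per paper:       {(final_decisions + revision_decisions) / final_decisions:.1f}")  # ~2.2
```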

Again, thanks for all of your hard work for JGR Space Physics. We really appreciate the community support for this journal.

 

Rejection Without Review

The Editors of Global Biogeochemical Cycles published an Editorial last summer, “Criteria for rejection of papers without review.” In it, they state that the Editors reject 25-30% of submissions because they are out of scope for the journal. That journal is rather specific, and the boundaries of its scope can be unclear to authors. Thus, the GBC Editors felt the need to clarify their scope to the community.

[Image: “rejected without review” cartoon]

Similarly, Geophysical Research Letters rejects a rather high percentage of papers submitted to it, over two-thirds. I don’t know the exact breakdown, but a significant fraction of those rejected by GRL are not sent out for review because they do not fit the scope of the journal. In GRL’s case, the scientific scope is very broad, so probably very few are rejected because the topic is inappropriate. Rather, that journal’s Editors reject without review because one or more of them judged the paper not to present significant new results worthy of consideration for rapid publication. That is, “significance” is one of GRL’s major limiting “scope” criteria. For some GRL papers, the Editors send them out for review and then, based on the reviewers’ comments, decide that the manuscript doesn’t meet the significance criterion for GRL and reject it at that point.

I see examples of “significance” rejections from GRL in the submissions we get that were initially submitted to that journal before being submitted to JGR Space Physics. Some were rejected without review by that journal while others were sent out for review and then rejected on significance based on the reports.

At JGR Space Physics, we reject less than one-third of submissions, and around 11% of new submissions are rejected without review. Less than one percent of submitted manuscripts are rejected for being “out of scope.” We just revised the journal’s full aims and scope last fall, which can be found at the JGR Space Physics website, so JGR Space Physics does not suffer from the scope uncertainty that GBC does. In addition, unlike GRL, we do not, at the moment, apply much of a filter on the significance of the results. As long as the findings are within the subject of space science, we will most likely send the paper out for review. We rely on the community to provide the initial assessment of the significance of the findings, and if a paper is rejected for this reason, the assigned Editor reads through it to be certain about the decision.

That is, the Editors of JGR Space Physics apply the “significance” criterion after review rather than before sending it out. This brings up the concern of burdening the community with refereeing assignments of papers with marginal new results. That is a consequence of our editorial practice, but I think that it would take a very large editorial board to make such decisions before sending them out. It would require more Editors for a few reasons, not only because the average editorial time commitment per paper would increase, but also in order to adequately cover the range of topics within the scope of the journal. Even with more Editors, it would, in my opinion, result in many good-quality papers being turned away from JGR Space Physics, perhaps to be published elsewhere.

If you would like to read other opinions about rejection without review in scientific publishing, then here’s a perspective on it from the editors of a nutrition journal, another from a technical editing service, and yet another from a science writer. That last one is where I grabbed the graphic above.

Levels of Rejection

I was at the “Unsolved Problems in Magnetospheric Physics Workshop” in Scarborough, England this last week. It was an excellent meeting and I highly commend Mick Denton and crew for putting on a brilliant conference with lots of time for discussion. In case you were wondering, yes, I think that even my talk went well, as it prompted a lively debate.

Here is Editor Larry Kepko’s hand holding aloft a bottle of “magnetosbeer,” specially labeled by a local brewery just for the conference:

[Photo: Editor Larry Kepko holding aloft a bottle of “magnetosbeer”]

He carried that bottle a couple of miles and up a hill in order to get that photo!

Something came up in conversations during the week: it was lamented to me that JGR Space Physics has gone the way of GRL in sending authors “reject with encouragement to resubmit” decision letters. Yes, we do send out such letters. It was brought to my attention that these letters are worded rather nicely, indicating that the paper is being “declined publication” (i.e., rejected) but that the Editor would like to see it resubmitted to JGR Space Physics when it is suitably revised. We even ask for responses to the reviews. This sounds a lot like a major revision decision, except that the paper is assigned a new number upon resubmission. The complaint is that this decision is made simply to game our rejection-rate and time-to-publication statistics. Perhaps others of you feel the same way as those who expressed this to me directly. Therefore, I would like to address it directly and openly.

First of all, I’d like to acknowledge that this opinion of the “reject and encourage resubmission” decision is a valid complaint. At first glance, it certainly does look like a major revision decision, just with a new paper number next time. That is not our intent.

I use this option when two conditions are met. The first is that the reviewers noted numerous and substantial concerns about the study, so I am judging the paper to be demonstrably not ready for acceptance and publication. I have written a couple of times about why I reject papers. The paper needs an overhaul, and I am making the judgment call that it will be a significantly different paper upon resubmission, therefore making it a new submission and invalidating the original submission date. The second condition is that I think the core elements of the study are worthy of eventual publication in JGR Space Physics. The decision letter, therefore, not only informs you of the rejection but also indicates my eagerness to see the paper again. Yes, it was rejected, and I think you have a lot of work to do, but yes, I also want you to give it another chance in JGR Space Physics.

Asking for responses to the reviews actually helps speed your previously rejected paper toward acceptance; that’s why we ask you to do it. If you supply them, then we will definitely seek out the original reviewers. You can also ask us not to send it to those reviewers. In the end, we may or may not use the original reviewers.

We have another decision-to-reject letter that does not include these words of encouragement. In that letter, we state that we are providing the reviews in case you want to consider them as you mull your options of sending the paper to another journal. You can still resubmit a rejected paper for which I sent such a letter, but my experience tells me it will be a challenge to get it through the reviewers.

I would also like to acknowledge that no one likes to receive a rejection letter. Here’s a website I found with a graphic that pretty much sums it up:

[Image: cartoon summing up how it feels to receive a rejection letter]

You spent time on the study and writing the paper. Rejection stinks. It’s okay to wallow in misery for a day.

Finally, I would like to dispel the erroneous notion that I am under pressure to achieve a particular rejection rate or time-to-publication interval. AGU places no pressure on us to hit any targets with either of these statistics. We do our best to keep JGR Space Physics a high-quality journal that publishes significant new contributions to the field, while also moving your papers along through the editorial process as smoothly and quickly as possible.