Hindawi shuttering four journals overrun by paper mills

Hindawi will cease publishing four journals that it identified as “heavily compromised by paper mills.” 

The open access publisher announced today in a blog post that it will continue to retract articles from the closed titles, which are Computational and Mathematical Methods in Medicine, Computational Intelligence and Neuroscience, the Journal of Healthcare Engineering, and the Journal of Environmental and Public Health.

The closures follow reporting by Retraction Watch in February that a professor used the identity and email account of a former student to edit special issues of two of the journals, Computational Intelligence and Neuroscience and the Journal of Environmental and Public Health.

The four closed journals were also among ten Hindawi titles that sleuth Dorothy Bishop found had signs of paper mill activity in their special issues. In March, the four titles were among 19 Hindawi journals that Clarivate removed from its Web of Science index for failing to meet its quality criteria.

Wiley, Hindawi’s parent company, announced that month that it had paused publication of special issues in Hindawi journals for three months; it subsequently reported losing $9 million in revenue as a result. 

Last month, Wiley and Hindawi said they planned to retract 1,200 articles with compromised peer review, on top of 500 they said they would retract last September. 

Slightly more than half of the completed or planned retractions come from these four journals, a Wiley spokesperson told us. 

Retraction Watch has tracked 81 retractions from Computational and Mathematical Methods in Medicine, 35 from Computational Intelligence and Neuroscience, 133 from the Journal of Healthcare Engineering, and six from the Journal of Environmental and Public Health. Most of the retractions have come since September 2022.

The closed journals “have published Special Issues that have been impacted to such an extent that we feel it is in the best interest of the scholarly community to discontinue them,” Hindawi’s blog post stated. It continued: 

We know that considerable effort has been put into these journals and appreciate all the editors and peer reviewers who have contributed time and expertise to evaluating legitimate research over the years. We also recognize the impact on authors who have published legitimate research in these journals. All articles hosted in Web of Science will remain fully indexed and we commit to continuing to host all published content on the archived journal websites.


18 thoughts on “Hindawi shuttering four journals overrun by paper mills”

    1. Wiley is the problem. Or are we supposed to believe that this has nothing to do with Wiley’s push to multiply the volume of published papers, which has recently led to resignations in other journals?

      1. Closing journals is a strange way to push for more papers to be published. Or is it possible that the journals were irredeemably tarnished, and the issue of pushing for more papers to be published has nothing to do with it?

      2. Even more general: for profit academic publishers are the problem. In this case it’s Wiley, but all similar publishers have similar problems.

    2. The core body of Hindawi is not bad at all. They are honest and ethical. The problem is with their special issues.

      Hindawi journals have a decent rejection rate (which goes up to as high as 80%). Without special issues, the rejection rates might be even higher. They are very clear about their acceptance rates or fees. They don’t hide anything. There is nothing fishy about them.

      They are not that hungry for money the way people call them. Hindawi journals are 100% free for authors from low-income countries like mine. Yes, I publish in them for free. This is not the case for “Elsevier’s Open Access” or “MDPI” or “Frontiers”.

      Hindawi teams are extremely careful and thorough while checking the quality of reviews, coercive citations, plagiarism, similarity checks, conflicts of interest, the authors’ emails and identities, etc. etc. etc. I haven’t seen some of their quality control checks in any renowned journals.

      They are extremely fast in copy editing and publishing. By far, they are the fastest among all publishers. Interestingly enough, they are also very good at copy editing and typesetting. They even edit the images (something NO other publisher does), so that the font of any writing on the images matches the font of the text, or to make sure that the elements on the images are clear and of high quality. They don’t ask you to format the references in any particular way. If accepted, they will themselves reformat all references. So the work is easy on your part, and fast on their part.

      Anybody who has the experience of working with them has been very impressed by their quality work. You can try it yourself, and you too will be impressed. I have published some papers in Hindawi journals (all for free), and each time my coauthors were very surprised by the high level of professionalism and care the Hindawi staff exerted.

      I have also peer-reviewed for them. Once, I suggested the authors cite some of my relevant articles. But Hindawi emailed me back, asking me to remove those suggestions! I haven’t seen this level of professionalism in any other journal or publisher.

      If the reviewers’ reports are not of high quality but the editor uses them to finalize the decision, or if it later turns out that the reviewer and/or the editor had even the smallest conflict of interest, the entire previous review is discarded and the case is handed to a new editor, who invites new reviewers and restarts the review process from scratch.

      It’s not really fair at all to judge Hindawi without even knowing the smallest thing about it.

      1. Using the same argument made in the comment: one person’s experience that they consider professional and positive does not account for the 1000s of other problematic ones (the numbers in the original post are astounding!).

        My experiences reviewing for Hindawi and reporting problematic papers, granted this was > 10 years ago, were negative. They didn’t take reviews or reports of problems seriously. They just wanted to publish every paper as fast as possible (the model MDPI adopted and further perfected). So, I put Hindawi in the spam filter and stopped interacting with them. From RW’s reporting above, it seems this move was wise.

        1. If you were honest or fair (or not ignorant), you wouldn’t say “They just wanted to publish every paper as fast as possible”.

          Your accusation is very UNTRUE. Their now-delisted journal’s stats show a 23.6% acceptance rate, a 33.6% immediate rejection rate, and a 36.3% ‘rejection after peer review’ rate.

          A 23% acceptance rate is much smaller than what you are accusing them of (“that they publish every paper” = 100% acceptance).

          Do criticize anything including Hindawi’s several drawbacks. But don’t be ignorant, dishonest, biased, or unfair in doing so.

          The very fact that you haven’t even read the original post or my comment carefully (before jumping to accusations) clearly shows that your opinion cannot be based on accurate, unbiased truth.

    3. Just remembered to add 3 other interesting and impressive points about Hindawi:
      1. Unlike all other publishers, Hindawi journals restrict the number of manuscripts an author can simultaneously have under review across Hindawi journals, to ensure that journal resources are distributed among more authors. If they were that money-hungry (the way people call them), would they restrict the number of submissions by an author?
      2. They are, by far, so much more transparent than any other publisher. The number of received manuscripts in each month, the number of accepted manuscripts in each month, the rejection rates before peer review, the rejection rates after peer review, median reviews per accepted article, and any other detailed stats about almost all aspects of their activities are 100% transparent and visible. For example, see this link and change the parameters of interest from the drop-down menus to see how the journal operates in different time spans:
      https://www.hindawi.com/journals/bmri/journal-report/
      No other publisher reveals this much information and detail about each of its journals.
      3. Not only are Hindawi journals totally FREE for authors from about 85 low-income countries, but they also give a 50% discount to authors from more than 40 middle-income countries. https://www.hindawi.com/publish-research/authors/waiver-policy/
      So again, before judging them with uneducated, dismissive statements, first try to know them.

      1. It is very likely that Let’s Be Fair is part of Hindawi or of its public relations team. These posts are very consistent in their tone and content, but they are very different from my own negative experience with Hindawi as a former editor there for a couple of years.

        1. The user “Let’s be honest” is very likely a part of another competitor publisher that wants to defame Hindawi.
          See what I did there?! We can play this “accusation game”, OR we can discuss like smart, mature people.
          You, instead of giving clear explanations and providing evidence, simply accused the user “Let’s be fair” and Hindawi of secretive advertisement campaigns.
          Learn from the user “Let’s be fair” and talk with REASON. If you want to bash Hindawi, that’s fine. But at least give some good explanation and evidence. Hence, “BE FAIR”!

  1. At this point, hasn’t every academic and their dog been contacted by Hindawi to edit a special issue on whatever they want? I was contacted over a decade ago and, being more junior then, emailed back for more info. My memory of it is a bit hazy now, probably because I subsequently did some associate editing gigs and published my own special issue via one of those journals, but I think they actually turned down my proposal! I was relieved. The more back and forth there was, the more concerned I grew about how the outfit was being run. But, hey-ho. That’s what happens when a psychiatry journal contacts a psychologist for a special issue in psychiatry…

    1. As a clinician who publishes, not an academic, I have had my share. I didn’t ask my dog if he’d like the opportunity yet…

    2. You mean you were invited to edit a Hindawi special issue, you happily applied, and then they *rejected* your application because it was not good enough or you were not qualified (or both)?

      Do you know you actually just complimented them (against your intention, which was to tarnish them)?

      Because: if they did reject you, it means they do have standards.

  2. For years I repeatedly warned them that their special issues were being abused and would cause them a lot of trouble, and that they should take the matter seriously. They mostly didn’t listen.
    But why stop the journals?! Now that the journals are delisted, Hindawi didn’t really need to shut them down, did it? It could simply have discontinued any and all special issues permanently, caught up on retractions, and reapplied to be listed in JCR and Web of Science.
    After all, Web of Science and JCR have themselves become seriously compromised and ruined, because Clarivate will be handing out Impact Factors like candy to all ESCI journals this June. Many ESCI journals are so terrible they don’t even qualify to apply for PubMed Central or Scopus, lol. And now all those awful journals (which are not even indexed by PubMed or Scopus) will have Impact Factors, something that once meant “authenticity and quality.”
    So those four journals haven’t lost much; they could get back on track soon.
    Shutting them down was the most idiotic decision ever, badly affecting many honest authors, editors, and reviewers who had collaborated with them.

    1. Shuttering the journals is the right move. The journals are forever tainted after engaging in unethical business practices. Who in their right mind would ever submit there again? How would a reader ever know in the future whether a paper they’re reading is fake or real? The scale of the fraud is astounding (look at the numbers in the original reporting).

      Further, how does shuttering the journal affect “honest authors, editors, and reviewers who had collaborated with them”? Work published in these journals remains available online. Wiley won’t be purging the history of the journal. The historical impact factors are available from the JCR, so one can look up what they were in the year a past paper was published there. So, what exactly has changed for those papers published there before?

      Shuttering of journals after they’ve been compromised is normal. An example from Elsevier comes to mind:
      https://www.journals.elsevier.com/mathematical-and-computer-modelling/announcements
      “the journal’s peak reputation is not recoverable” — from the final editorial here: https://www.sciencedirect.com/science/article/pii/S0895717713002690
      But one can still read everything in it. Its historical impact factors and other metrics can be researched as needed.

      Colleagues, let us be logical and level-headed.

      1. These journals did not welcome or instigate unethical activities. They were being abused by outsiders (some guest editors). I reported some problematic special issues, and Hindawi did close some of them prematurely (and perhaps retracted some of their papers, though I didn’t check later).

        But I repeatedly warned them that they should seriously limit and restrict their special issues, current and future. This is where they didn’t listen. They ignored my advice that special issues should be closed forever, or allowed only under much higher standards than the current ones.

        Regarding the other part of your comment: If a journal that has been compromised can regain the trust of the scientific community again (e.g., by getting back into JCR), why won’t people submit their papers to them? I for one would gladly do so if I know that they are clean again.

        Yes, the archived papers are available together with archived IFs. But for many employers or academics, the current status of a journal matters too. Killing a journal can disrupt the reputation of papers published in it years ago. Many employers never even bother to go search the archives of JCR to check a journal’s IF at the time of publication of a particular paper. Many care mostly about the current reputation and status of the journal. If you kill a journal, you would badly affect the reputation of its past collaborators (authors, editors, reviewers) too.

  3. Where are the US National Library of Medicine (NLM) and NIH in this discussion?
    The journals mentioned, together with a long list of additional journals, should have been deindexed from PubMed/MEDLINE a long time ago.
    We need quality control in scientific publishing, and the PubMed journal-indexing step would be an efficient part of it.
    Instead, we have a situation where public funds are spent on distributing misinformation for free: NLM does not charge a cent for helping high-profit publishers distribute their junk science.

    1. PubMed, MEDLINE, and NLM do their fair share of quality control.

      The problem lies in PubMed Central, which acts as a kind of backdoor, allowing low-quality papers to become visible in PubMed and get mixed in with the high-quality papers indexed by MEDLINE.

      A similar weak link is now ESCI (Emerging Sources Citation Index), which is a backdoor to the Web of Science for low-quality journals. And now, thanks to this recent strategy of Clarivate, ESCI will also act as a backdoor to the high-profile JCR for all those mediocre or even terrible journals.

      Note: I know that many ESCI journals are good and qualify to have a JCR Impact Factor. The problem is many others aren’t. For getting an Impact Factor, a journal must work very hard; at least it used to be like this. But that is no more.
