Fraudulent Business Model for Scientific Publications
Predatory publishing is an exploitative and often fraudulent model of academic publishing in which journals and publishers prioritize their own financial gain and reputation over the integrity of research and the core norms of scholarly advancement. It is characterized by misleading claims about editorial practices, disregard for established peer-review standards, a lack of transparency, and aggressive, often indiscriminate solicitation of authors. Such publishers exploit the pressure researchers face to publish, eroding the credibility and trustworthiness of scholarly communication.
The term "open-access predatory publishers" began to gain traction around 2012, notably through the observations of Jeffrey Beall, who described them as publishers "ready to publish any article for payment." While the label "predatory" has itself been a subject of considerable debate and criticism, the underlying phenomenon remains a stark reality. A comprehensive examination of this controversy, initiated by Beall, can be found in The Journal of Academic Librarianship.
These publishers are called "predatory" because they mislead scholars into publishing with them, although some authors, aware of a journal's dubious quality, submit work anyway, often under the pressures of academic careers. New researchers, particularly those from developing countries, are disproportionately vulnerable to these schemes. A 2022 report found that nearly a quarter of respondents across 112 countries, spanning all disciplines and career stages, had either published in a predatory journal, participated in a predatory conference, or were unsure whether they had. Most of those who unknowingly engaged with such outlets cited a lack of awareness, while those who did so knowingly pointed to career advancement as their primary motivation. The impact is not negligible: one study found that roughly 60% of articles published in predatory journals received no citations within five years of publication.
In an effort to safeguard the scholarly ecosystem, various initiatives have emerged to counter predatory publishing, including blacklists, such as the now-defunct Beall's List, and whitelists, exemplified by the Directory of Open Access Journals. However, precisely defining and quantifying "predatory" journals remains difficult, as predatory behavior often exists on a spectrum rather than as a clear binary. A single journal may contain articles that meet high standards of scientific integrity alongside others that raise significant ethical concerns.
History
The seeds of predatory publishing were sown before the term became widely recognized. In March 2008, Gunther Eysenbach, an early proponent of open-access publishing, began highlighting what he called "black sheep" among open-access publishers and journals, singling out those that used excessive spamming to solicit authors and editors, including Bentham Science Publishers, Dove Medical Press, and Libertas Academica. In July 2008, Richard Poynder's interview series drew further attention to the practices of new publishers adept at exploiting the growing open-access landscape. Throughout 2009, doubts about the honesty of a segment of open-access journals, and about the prevalence of scams among them, continued to surface.
Concerns about aggressive spamming prompted leading open-access publishers to establish the Open Access Scholarly Publishers Association in 2008. In 2009, the Improbable Research blog exposed a publisher, Scientific Research Publishing, for duplicating previously published papers; the exposé was subsequently covered by Nature. In 2010, Phil Davis, a graduate student at Cornell University, submitted a manuscript generated by SCIgen, a program that creates nonsensical computer science papers. The paper was accepted for a fee, though Davis later withdrew it. Predatory publishers have also been known to hold submitted manuscripts hostage, refusing to let authors withdraw them and thereby blocking submission to legitimate journals.
The concept of "predatory publishing" is not monolithic; it encompasses a range of questionable practices. The term itself was popularized by American librarian Jeffrey Beall, who maintained a list of "deceptive and fraudulent" Open Access (OA) publishers until its withdrawal in 2017. This list was later adapted into a for-profit database by Cabell's International. Beall's list, and subsequently Cabell's database, included publishers engaging in outright fraud, such as fabricating editorial boards and ISSNs, employing dubious marketing tactics, and hijacking established journal titles. However, these lists also encompassed journals with merely subpar standards of peer review and linguistic quality.
Studies using Beall's list or his definitional framework documented exponential growth in predatory journals since 2010. A 2020 study found that thousands of scientists had reviewed papers for journals identified as "predatory," often unaware of the journals' nature: an analysis of the Publons platform found records of at least 6,000 reviews submitted to more than 1,000 predatory journals, with reviewers disproportionately young, inexperienced, and affiliated with institutions in low-income countries in Africa and the Middle East. Unethical practices within the OA publishing industry have also attracted significant attention from the mainstream media.
Bohannon's Experiment
In 2013, John Bohannon, a staff writer for the journal Science, submitted a deliberately flawed paper, describing fabricated research on the purported effects of a lichen constituent, to numerous open-access journals. His findings, published as "Who's Afraid of Peer Review?", showed that roughly 60% of these journals, including some associated with established publishers such as Elsevier, SAGE, and Wolters Kluwer (via its subsidiary Medknow), accepted the fraudulent manuscript. In contrast, PLOS ONE and journals published by Hindawi rejected it.
"Dr Fraud" Experiment
In a 2015 experiment exposing these lax standards, four researchers created a fictitious scientist, Anna O. Szust ("O. Szust" spells oszust, Polish for "fraudster"), with no publications or editorial experience, and applied on her behalf for editor positions at 360 scholarly journals, a third of which were drawn from Beall's list of predatory publishers. Forty of the predatory journals appointed Szust as an editor, often within hours or days and without any substantive vetting. Journals meeting accepted quality and ethical publishing standards gave little or no positive response. Among journals listed in the Directory of Open Access Journals (DOAJ), 8 of 120 accepted Szust, though the DOAJ later removed some of the affected journals in 2016. None of the 120 journals sampled from the Journal Citation Reports (JCR) offered her a position. The results were published in Nature in March 2017 and received widespread media attention.
SCIgen Experiments
The SCIgen program, designed to generate random academic computer science papers using context-free grammar, has also been instrumental in uncovering predatory practices. Papers generated by SCIgen have been accepted by numerous predatory journals and presented at predatory conferences.
Federal Trade Commission vs. OMICS Group, Inc.
A significant legal action followed on August 25, 2016, when the Federal Trade Commission (FTC) filed a lawsuit against the OMICS Group, iMedPub, Conference Series, and its president, Srinubabu Gedela, accusing the defendants of "deceiving academics and researchers about the nature of its publications and hiding publication fees ranging from hundreds to thousands of dollars." The suit came amid mounting pressure on the FTC to address predatory publishing. OMICS's legal team called the allegations "baseless" and suggested the lawsuit was driven by "subscription based journals publishers who are earning Billions of dollars from scientists literature," implying a conspiracy within the established publishing industry. In March 2019, the FTC won a summary judgment and was awarded $50,130,811 in damages, along with a broad injunction against OMICS's practices. Collection of the award remains unlikely, however, as US court rulings are not enforceable in India, where OMICS is primarily based, and the company does not possess significant property in the US.
Characteristics
Identifying common traits of predatory publishers is crucial for researchers seeking to avoid them. The complaints frequently associated with predatory open-access publishing include:
- Rapid article acceptance with minimal or no peer review or quality control: This often extends to accepting hoax and nonsensical papers.
- Delayed notification of article fees: Authors are often informed of publication fees only after their papers have been accepted, creating a commitment they cannot easily retract.
- Acceptance of papers outside the journal's stated scope: Journals may accept submissions on topics entirely unrelated to their advertised focus.
- Aggressive solicitation tactics: Researchers are frequently bombarded with unsolicited invitations to submit articles or join editorial boards.
- Listing academics on editorial boards without consent: Many academics find themselves listed as editorial board members without their knowledge or permission, and are subsequently denied requests to resign.
- Appointment of fake academics to editorial boards: Some publishers invent fictitious academics to populate their editorial boards.
- Mimicry of established journals: Predatory publishers often adopt names or website designs that closely resemble those of well-respected journals, aiming to confuse unsuspecting authors.
- Misleading claims about publishing operations: False information about locations, editorial processes, and indexing is common.
- Improper use of ISSNs: International Standard Serial Numbers may be misused or fabricated.
- Citing fake or non-existent impact factors: Predatory journals often boast inflated or entirely fabricated impact factors to appear more credible.
- Misrepresenting indexing services: They may claim to be indexed by prestigious databases when they are merely listed on academic social networking sites like ResearchGate or by standard identifiers like ISSNs and DOIs, presenting these as indicators of scholarly rigor.
- Favoritism and self-promotion in peer review: The peer-review process, if it exists at all, is often compromised by bias and self-serving practices.
Predatory publishers closely resemble vanity presses: the primary objective is financial gain from the author rather than the dissemination of quality scholarship.
Beall's Criteria
In 2015, Jeffrey Beall outlined a comprehensive set of criteria to identify predatory publishers. These criteria were categorized into 26 points related to poor journal standards and practices, 9 concerning journal editors and staff, 7 focusing on ethics and integrity, 6 on the publisher's business practices, and 6 general "other" criteria. He also listed an additional 26 practices that, while not definitively indicative of predatory behavior, suggested subpar journal standards.
Eriksson and Helgesson's 25 Criteria
Stefan Eriksson and Gert Helgesson identified 25 potential indicators of predatory publishing in 2016. They cautioned that meeting a single criterion is not conclusive, but that the cumulative presence of several indicators significantly increases the likelihood that a journal is predatory. Their list comprises the following (an illustrative sketch of how such a cumulative check might be applied follows the list):
- The publisher is not a member of any recognized professional organization committed to best publishing practices (e.g., COPE or EASE).
- The journal is not indexed in well-established electronic databases such as MEDLINE or Web of Science.
- The publisher boldly claims to be a "leading publisher" despite being a recent entrant.
- The journal and publisher are unfamiliar to the researcher and their colleagues.
- Published papers exhibit poor research quality, potentially bordering on non-academic content, such as overt pseudo-science.
- Fundamental errors in titles and abstracts, or frequent typographical and factual errors throughout published papers.
- The journal website lacks professionalism.
- The journal website fails to present an editorial board or provides insufficient details on its members and their affiliations.
- The journal website does not disclose the editorial office location or provides an incorrect address.
- The publishing schedule is not clearly stated.
- The journal title suggests a national affiliation that contradicts its actual location (e.g., "American Journal of..." based on another continent) or includes "International" while having a single-country editorial board.
- The journal mimics the title or website design of another established journal.
- The journal provides an impact factor despite being new, making calculation impossible.
- The journal claims an unrealistically high impact using spurious alternative impact factors (e.g., a score of 7 for a bioethics journal, far exceeding the scores of the field's leading journals).
- The journal website displays non-academic or unrelated advertisements.
- The publisher has launched an overwhelming number of new journals simultaneously or within a very short timeframe.
- The editor in chief also serves as editor for multiple journals with vastly different focuses.
- The journal publishes articles that fall far outside its stated scope.
- Unsolicited invitations to submit articles are sent, often making it clear the editor has no understanding of the recipient's field.
- Emails from the journal editor are poorly written, employ exaggerated flattery, and contain contradictory claims (e.g., demanding a response within 48 hours while also stating submission is flexible).
- The journal charges a submission or handling fee, rather than a publication fee, meaning payment is required even if the paper is not accepted.
- The submission/publication fees and their exact amounts are not clearly stated on the journal's website.
- Unrealistic promises are made regarding the speed of the peer review process, hinting at a minimal or non-existent review.
- Copyright agreements are not clearly described, or the journal demands copyright while claiming to be open-access.
- The journal lacks strategies for handling misconduct, conflicts of interest, or ensuring the archiving of articles if the journal ceases operation.
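As Eriksson and Helgesson stress, no single indicator is decisive; what matters is how many apply. The sketch below is a minimal illustration of that cumulative logic in Python, assuming hypothetical indicator names and encoding only a subset of the criteria; it is not a published or validated screening tool.

```python
# A minimal sketch, assuming hypothetical indicator names: each warning sign
# is recorded as a yes/no observation and the tally of signs that apply is
# reported. Only a representative subset of the 25 criteria is encoded; the
# simple count is illustrative, not part of the published criteria.

WARNING_SIGNS = [
    "no_membership_in_cope_or_ease",
    "not_indexed_in_medline_or_web_of_science",
    "unprofessional_website",
    "editorial_board_missing_or_undocumented",
    "impact_factor_claimed_despite_journal_being_new",
    "fees_not_clearly_stated",
    "unsolicited_invitations_outside_recipient_field",
    "unrealistic_peer_review_speed_promised",
    # ... remaining criteria omitted for brevity
]


def tally_warning_signs(observations: dict[str, bool]) -> int:
    """Count how many of the listed warning signs were observed for a journal."""
    return sum(1 for sign in WARNING_SIGNS if observations.get(sign, False))


if __name__ == "__main__":
    # Hypothetical assessment of a single journal.
    observed = {
        "unprofessional_website": True,
        "impact_factor_claimed_despite_journal_being_new": True,
        "fees_not_clearly_stated": True,
    }
    count = tally_warning_signs(observed)
    print(f"{count} of {len(WARNING_SIGNS)} encoded warning signs observed")
    # No single sign is conclusive; a high cumulative count warrants closer scrutiny.
```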
Memon's Criteria
Scholar Aamir Raoof Memon proposed a set of criteria for identifying predatory publishing:
- Scope too broad or inconsistent: Journals may cover disparate topics (e.g., biomedical and non-biomedical) or publish special issues on topics clearly outside their stated scope.
- Acceptance of all submitted papers: Predatory journals often claim a peer-review process but accept all submissions, regardless of quality.
- Lack of affiliation with reputable organizations: They are typically not affiliated with recognized universities or research institutions.
- Poor quality of published papers: Papers are often of low quality due to the absence of genuine peer review or editing, with a high volume of articles published per issue.
- Inappropriate invitations: Researchers receive invitations to submit manuscripts in fields outside the journal's scope, indicating a lack of editorial understanding.
- False or misleading indexing information: Journals may falsely claim indexing by reputable agencies or be indexed by irrelevant ones, while neglecting to be indexed in relevant databases.
- Falsification of impact factors: They often fabricate impact factors or similar metrics, especially for new journals unable to have their impact calculated.
- Misleading editorial board information: False or misleading details about the editorial board members are presented.
- Hidden or unclear costs: Authors may be surprised by unexpected fees, or the costs associated with publishing are not clearly stated.
- Lack of membership in professional organizations: They are not monitored by or members of regional or international organizations focused on ethical publishing.
- Absence of misconduct policies: There are no clear strategies for handling issues like plagiarism, salami slicing, or article retraction.
- Outdated or incomplete website: The website may not be updated regularly or may lack essential information regarding submission requirements, manuscript processing, and reviewing procedures.
- Submission via email or basic website forms: Manuscripts are often submitted directly via email or through rudimentary website interfaces.
- Insufficient contact information: Contact details are often missing or misleading, with false information about the journal's location.
Policies of Leading Scholar Databases
Many prominent scientific abstract and citation databases have implemented policies to identify and combat predatory journals. For instance, Scopus automatically flags journals that exhibit outlier behavior in two consecutive years on three comparative criteria (a simplified sketch of this screening logic follows the list):
- Substantially higher self-citation rate: Indicating an unhealthy reliance on its own published work.
- Substantially lower number of citations: Suggesting minimal impact and readership.
- Substantially lower CiteScore: A measure of citation impact.
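The sketch below shows one way such a screen could work in principle. The two-consecutive-year rule follows the description above, while the thresholds and the simple peer-group averages are illustrative assumptions; Scopus does not publish its exact quantitative cut-offs.

```python
# A minimal sketch of the screening logic described above, with illustrative
# thresholds. A journal is flagged when it is an outlier against its peer
# group on any of the three criteria in two consecutive years.

from dataclasses import dataclass
from statistics import mean


@dataclass
class YearMetrics:
    self_citation_rate: float  # fraction of citations that are self-citations
    citation_count: int        # citations received in the year
    cite_score: float          # CiteScore-style citation impact


def is_outlier(journal: YearMetrics, peers: list[YearMetrics],
               ratio: float = 2.0) -> bool:
    """Illustrative outlier test against peer-group averages."""
    peer_self = mean(p.self_citation_rate for p in peers)
    peer_cites = mean(p.citation_count for p in peers)
    peer_score = mean(p.cite_score for p in peers)
    return (journal.self_citation_rate > ratio * peer_self   # much higher self-citation
            or journal.citation_count < peer_cites / ratio   # much lower citation count
            or journal.cite_score < peer_score / ratio)      # much lower CiteScore


def flag_journal(history: list[YearMetrics],
                 peer_history: list[list[YearMetrics]]) -> bool:
    """Flag when the journal is an outlier in two consecutive years."""
    outlier_years = [is_outlier(year, peers)
                     for year, peers in zip(history, peer_history)]
    return any(a and b for a, b in zip(outlier_years, outlier_years[1:]))


if __name__ == "__main__":
    journal = [YearMetrics(0.55, 40, 0.3), YearMetrics(0.60, 35, 0.2)]
    peers_per_year = [[YearMetrics(0.10, 400, 2.5), YearMetrics(0.15, 350, 2.1)]] * 2
    print(flag_journal(journal, peers_per_year))  # True: outlier two years running
```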
Web of Science employs similar criteria, though without specifying exact quantitative metrics. Notably, Web of Science also scrutinizes excessive citations of works authored by journal board members, a practice that Scopus does not explicitly mention. As of summer 2024, neither SciFinder nor its parent, Chemical Abstracts Service, has publicly disclosed a policy on predatory journals.
Growth and Structure
A 2015 study documented a rapid rise in predatory journal publications, from approximately 53,000 articles in 2010 to an estimated 420,000 in 2014, spread across roughly 8,000 active journals. Initially, publishers managing more than 100 journals dominated the market, but since 2012 publishers with portfolios of 10–99 journals have held the largest market share. By 2022, nearly a third of the top 100 publishers, ranked by journal count, were considered predatory. The geographical distribution of publishers and authors is heavily skewed, with a large majority of authors based in Asia and Africa. Authors typically paid an average fee of US$178 for rapid publication, often within two to three months of submission and without rigorous review. By 2019, about 5% of Italian researchers were reported to have published in predatory journals, a third of which exhibited fraudulent editorial practices.
Causes and Impact
The fundamental driver behind exploitative publishing practices is the shift in the business model towards article-processing charges (APCs), where authors pay to publish rather than readers paying to access. This model inherently incentivizes publishers to prioritize the sheer volume of published articles over their quality. APCs have gained considerable traction over the past two decades as a revenue stream for OA publishing, primarily due to the guaranteed income they provide and a perceived lack of competitive pricing that grants vendors significant control over their charges.
Ultimately, maintaining scholarly quality hinges on robust editorial policies and their diligent enforcement. The inherent conflict between rigorous scholarship and profit can only be effectively managed by selecting articles for publication based solely on their peer-reviewed methodological quality. Most legitimate OA publishers ensure their quality by registering their titles with the Directory of Open Access Journals and adhering to a standardized set of ethical and operational criteria.
While the majority of predatory OA publishers appear to be based in Asia and Africa, one study found that over half of the authors publishing in these journals came from "higher-income or upper-middle-income countries." Authors may end up publishing in predatory journals because of concerns about prejudice against scholars from non-Western countries in North American and European journals, because of intense pressure to publish, or because of limited research training. Predatory publishing therefore also raises questions about the geopolitical and commercial dynamics shaping scholarly knowledge production. Early-career researchers are particularly susceptible: they face strong pressure to publish quickly to advance their careers, and this, combined with limited awareness of predatory practices, makes them prime targets. Nigerian researchers, for instance, may turn to predatory journals because of the imperative to publish internationally while having limited access to Western journals, or because the APCs charged by mainstream OA journals are prohibitively high. More broadly, the criteria applied by high-impact-factor journals, including the quality of English, editorial board composition, and the rigor of peer review, tend to favor familiar content from established academic centers over work from the "periphery." It is therefore important to distinguish genuinely exploitative publishers from legitimate OA initiatives that, despite varying standards, aim to improve and disseminate scholarly content.
Response
Blacklists
The proliferation of predatory journals has led to the creation of lists designed to identify and flag unacceptable publishing venues. Beall's List, a freely accessible blacklist, and Cabells' Predatory Reports, a subscription-based database, are prominent examples. However, the Committee on Publication Ethics (COPE) advises against blindly relying on any such list, particularly those that do not clearly articulate their evaluation criteria. Some lists have faced criticism for being based on subjective judgment rather than objective evidence. Conversely, lists of acceptable sources have been deemed irrelevant to how academics actually evaluate journals. The Directory of Open Access Journals serves as a whitelist, and various research funders also provide lists of pre-approved journals.
Beall's List
Jeffrey Beall, a librarian and researcher at the University of Colorado Denver, is credited with coining the term "predatory publishing." He first published his list of predatory publishers in 2010, aiming to identify scholarly open-access publishers exhibiting questionable practices. By 2013, Nature reported that Beall's list was "widely read by librarians, researchers, and open-access advocates, many of whom applaud his efforts to reveal shady publishing practices." However, criticisms emerged, questioning the fairness of labeling all listed journals as "predatory" and suggesting that "several shades of gray may be distinguishable." Beall's analyses have been characterized as sweeping generalizations lacking supporting evidence, and he faced accusations of bias against open-access journals from less economically developed countries. A 2018 study applying Beall's criteria to both OA and non-OA journals in library and information science found that even top-tier non-OA journals could be deemed predatory, highlighting the difficulty in drawing clear distinctions. Similarly, research in biomedicine identified challenges in demarcating predatory and non-predatory journals. One librarian noted that Beall's list attempted a binary division, but many of the criteria were either unquantifiable or applied equally to established OA journals and newer entrants, reflecting potentially "First World assumptions that aren't valid worldwide." Beall, however, maintained his position, publishing a rebuttal in mid-2015.
Following the "Who's Afraid of Peer Review?" investigation, the DOAJ tightened its inclusion criteria, evolving into a whitelist akin to Beall's role as a blacklist. The investigation confirmed Beall's efficacy in identifying publishers with poor quality control. However, Lars Bjørnshauge, managing director of the DOAJ, estimated that questionable publishing constituted less than 1% of all author-pays OA papers, a significantly lower figure than Beall's 5–10% estimate. Bjørnshauge advocated for open-access associations like the DOAJ and OASPA to take greater responsibility in policing publishers by establishing clear criteria for trustworthy journals.
Beall faced threats of a lawsuit from a Canadian publisher on his list and reported experiencing online harassment. His list drew criticism for its reliance on website analysis, lack of direct engagement with publishers, and inclusion of newly established but legitimate journals. Beall addressed these concerns by publishing his criteria and implementing a review process for publishers seeking removal. For instance, a re-evaluation in 2010 led to the removal of some journals from his list.
In 2013, the OMICS Publishing Group threatened Beall with a $1 billion lawsuit, calling his listing of the company "ridiculous, baseless, [and] impertinent" and "smack[ing] of literal unprofessionalism and arrogance." The publisher's legal correspondence included veiled threats of criminal charges in India and the US under Section 66A of India's Information Technology Act, 2000, a provision criticized for its potential to stifle dissent. Beall characterized the letter as "poorly written and personally threatening," and as an attempt to divert attention from OMICS's editorial practices. In 2015, the Supreme Court of India struck down Section 66A as unconstitutional, deeming it an arbitrary and excessive infringement on free speech; this rendered OMICS's threat under that section moot, although a defamation case remained possible. In August 2016, the FTC filed its lawsuit against OMICS for "deceptive business practices," securing an initial court ruling in its favor in November 2017.
Beall's list was adopted by South Africa's Department of Higher Education and Training for its accredited journal list, impacting research funding. ProQuest also reviewed journals on Beall's list, removing some from the International Bibliography of the Social Sciences. In January 2017, Beall shut down his blog and removed its content, citing pressure from his employer. His supervisor, however, denied pressuring him, affirming support for Beall's academic freedom. In 2017, Ramzi Hakami reported on his successful attempt to get a flawed paper accepted by a publisher on the list and referenced a resurrected version of Beall's list, maintained by an anonymous researcher.
Cabell's Predatory Reports
At the May 2017 meeting of the Society for Scholarly Publishing, Cabell's International announced its intention to launch a subscription-based blacklist of predatory journals (not publishers) in June. The company had begun developing its criteria in early 2016, and by July 2017, both a blacklist and a whitelist were available via subscription.
Other Blacklists
Since the discontinuation of Beall's list, other groups have emerged to maintain similar directories. Kscien's list, for example, used Beall's original list as a foundation, updating it with new entries and removals. In 2020, China's Ministry of Science and Technology commissioned the Chinese Center of Scientometrics to create the Chinese Early Warning Journal List (EWJL). This list classifies journals into three risk levels (low, medium, or high), offering a more nuanced approach than simple binary categorization. Despite these efforts, criticism persists regarding the methodology and scope of such lists. A 2020 systematic review of 93 lists concluded that only three were demonstrably evidence-based.
Science Funders
Numerous science funding agencies have implemented measures to combat predatory publishing, particularly concerning national journal rankings.
Poland: In September 2018, Zbigniew Błocki, director of the National Science Centre, announced that articles funded by the NCN and published in journals failing to meet peer-review standards would have their grant numbers removed, and funds would be reclaimed.
Russia: Both the Russian Science Foundation and the Russian Foundation for Basic Research mandate that grant recipients publish exclusively in journals indexed by Web of Science or Scopus. This policy aims to prevent researchers from falling prey to predatory publishers without the foundations needing to maintain their own lists and ensures wider discoverability of funded research. However, with Clarivate's withdrawal from Russia in 2022 and the pause in Elsevier services, these indexing requirements have become less central for Russian agencies.
Other Efforts
The campaign "Think. Check. Submit." promotes transparency and best practices in scholarly publishing. More transparent peer-review models, such as open peer review and post-publication peer review, have been advocated as countermeasures. However, some argue that the focus should remain on the fraudulent nature of predatory publishing rather than conflating it with the broader issues of peer review effectiveness.
Principles of transparency and best practice have been collectively issued by the Committee on Publication Ethics, DOAJ, OASPA, and WAME to help distinguish legitimate journals from illegitimate ones. Various journal review websites, both crowd-sourced and expert-run, have emerged, some extending their scope beyond OA publications to assess the quality of the peer-review process. A collaborative awareness campaign has also been launched by libraries and publishers.
Several strategies have been proposed to further combat predatory journals, including enhancing publication literacy, particularly among junior researchers in developing countries. Organizations have also developed practical tips for identifying predatory publishers. Some researchers, like Beall, have linked predatory publishing to the rise of gold open access, particularly its author-pays variant. This has led to advocacy for "platinum open access," where the absence of APCs theoretically removes the publisher's conflict of interest in article acceptance. More objective metrics, such as a "predatory score" and journal quality indicators, have also been proposed. The International Academy of Nursing Editors (INANE) encourages authors to consult expert-reviewed journal listings and Beall's list of predatory journals.
Bioethicist Arthur Caplan has warned that predatory publishing, alongside fabricated data and academic plagiarism, erodes public trust in the medical profession, devalues legitimate science, and undermines support for evidence-based policy.
In 2015, Rick Anderson, associate dean at the J. Willard Marriott Library, University of Utah, questioned the utility of the term "predatory," suggesting it might be too narrowly focused on author-pays OA and could obscure other forms of scholarly misconduct. He proposed retiring the term, arguing it generated more heat than light. A 2017 article in The New York Times suggested that many academics are "eager" to publish in these journals, framing the relationship as a "new and ugly symbiosis" rather than outright exploitation.
Similarly, a January 2018 study found that scholars in developing nations sometimes felt more comfortable publishing in journals from their region due to perceived prejudice from Western journals, or simply due to a lack of awareness of journal reputations. The "publish or perish" pressure also drove many to opt for fast-turnaround journals. In some instances, researchers lacked adequate guidance and felt unqualified to submit to more reputable journals.
In May 2018, India's University Grants Commission removed 4,305 dubious journals from its list used for evaluating academic performance.
To further refine the definition and classification of predatory journals, Leonhard Dobusch and Maximilian Heimstädt proposed a tripartite classification in 2019, distinguishing "aspirant" journals (science-oriented despite below-average peer review) from "junk" and "fake" journals (both predominantly profit-oriented, with superficial or no peer review).
In April 2019, a consensus definition was formulated by 43 participants from 10 countries: "Predatory journals and publishers are entities that prioritize self-interest at the expense of scholarship and are characterized by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices." Adequacy of peer review was omitted from this definition due to perceived subjectivity, a decision criticized by some who argued it could inadvertently strengthen predatory journals.
A March 2022 report by the InterAcademy Partnership, titled "Combatting Predatory Academic Journals and Conferences," emphasized that predatory publishing exists on a spectrum, not as a binary. They proposed the following classifications:
- Hijacked journals: Mimicking reputable existing journals.
- Journals re-publishing papers: Re-publishing content from legitimate sources.
- Deceptive journals: Providing false or misleading information about charges, services, publisher location, or editorial board members.
- Low-quality journals: Characterized by poor cumulative criteria (e.g., disregarding negative reviews, publishing outside scope) without apparent deceitful intent.
It is important to note that some journals may fit into multiple categories.
See also
- List of scholarly publishing stings
- Author mill
- Conflicts of interest in academic publishing
- Content farm
- Diploma mill
- Elsevier § Fake journals
- Essay mill
- Hijacked journal
- Journalology
- Mega journal
- Open-access journal
- Peer review failures
- Predatory conference
- Pseudo-scholarship
- Research Integrity Risk Index
- Center for Promoting Ideas