Publishing Original Research in AMJ: Advice to Prospective Authors
Academy of Management Journal, 2021, Vol. 64, No. 3
DOI: 10.5465/amj.2021.4003
ISSN: 1948-0989
Authors: László Tihanyi, Katherine A. DeCelles
Language: English
Topic(s): Innovation and Knowledge Management
Laszlo Tihanyi (Rice University) and Katherine A. DeCelles (University of Toronto)
Published online: 15 June 2021, https://doi.org/10.5465/amj.2021.4003

It is fair to assume that Academy of Management Journal (AMJ) readers expect to find original research in every journal issue. All of us who serve AMJ as editors and reviewers would like to publish papers that meet our readers' expectations for originality, as highlighted in the journal's mission statement: "Authors should strive to produce original, insightful, interesting, important, and theoretically bold research that demonstrates a significant 'value-added' contribution to the field's understanding of an issue or topic." Being able to identify original research is an essential task for a journal that publishes 72 articles from the more than 1,600 submissions it receives in a year.

The purpose of this editorial is to explain when a submission is considered "original" by AMJ, and to provide an update to the Ireland (2009) "From the Editors" (FTE) essay on this topic. Although previous FTE essays have covered several topics about the types of research AMJ seeks to publish, describing how new submissions are evaluated—and how AMJ responds to changes in the expectations and practices of the field—will still be useful for prospective authors. Discussing what constitutes original empirical research by AMJ might, for instance, help those who plan to contribute to the scholarly conversations about their research topics by sharing their work in our journal for the first time.
Furthermore, while the editorial policies and practices we cover in this essay are, in general, likely familiar to those who have recently published their research in AMJ, a periodic review of these policies and practices can help experienced authors as well. By learning how new concerns about reported scientific results are addressed, how the latest research approaches are implemented, and how recent societal challenges are met, authors can improve their work and thus increase the odds of the acceptance of their papers in our journal. We hope the advice in this editorial might be even more valuable during times when opportunities for in-person meetings, and thus learning from one another, are limited. We answer the question "When is a paper considered to be original by AMJ?" by summarizing the perspectives of editors and reviewers when a manuscript is evaluated in terms of the data used in the research, the originality of the research, and the text in the paper.

ARE ORIGINAL DATA REQUIRED FOR ORIGINAL RESEARCH?

Understandably, authors spend a considerable amount of time and resources collecting unique data and, in turn, wish to publish multiple papers from a data set. For example, the growing adoption of the three-essay dissertation format in many doctoral programs seems to have led to an increase in submissions that utilize data from the same empirical research effort. Furthermore, many authors maintain active research programs by submitting multiple papers, which might be related in some ways, to different journals. Since AMJ is one of the few "big tent" management journals, we need to acknowledge the different practices regarding the use and reuse of data between macro and micro and quantitative and qualitative research. At the same time, we ought to maintain fairness across these areas. A previous editorial provided a comprehensive overview of the data overlap policies that have been followed by AMJ editorial teams and review boards (Colquitt, 2013).
When authors submit their manuscripts, they are asked to disclose any data overlap with their other papers by answering the following question:

    Have you published (in AMJ, in another journal—including "in press" and "conditional accept" stages—or in a book) another publication or have another paper under review at AMJ or at a different journal that uses either some of the same variables from this data collection effort, or that uses some of the same cases/observations from this data collection effort? If so, please describe in detail the precise nature of the data overlap in your cover letter and please attach that previous publication in the Cover Letter field (note that multiple files can be uploaded to that field). Please see this editorial (Colquitt, 2013) for more information.

Therefore, even if papers are not using the exact same data set (such as a data set supplemented with some new additional data, a subset of the data, or using somewhat different operationalizations or analyses), we ask authors to disclose their other related work. We emphasize that transparency is key to ensure that editors make informed and fair decisions about the scope of the potential overlap of data, ideas, and text across papers.

A recent update to this question asks authors to disclose whether they have another paper under review at AMJ or a different journal with related data. Two recent trends have motivated this update. First, author team sizes have increased in recent years and submissions are often based on joint data collection efforts. Coauthors on an AMJ submission might work on multiple papers with other authors at the same time and contribute their own data to other research projects as well as the focal submission. Second, the number of management journals and other outlets has increased in recent years.
The submission policies of those journals vary, and the length of time that manuscripts are under review can be long, which could mean that other papers are written and submitted during that time.

When a manuscript is submitted to AMJ, related papers and descriptions of data overlap are reviewed by the editor, the area's deputy editor, and the associate editor who handles the manuscript. Authors may be asked to provide further information during a minor or a major desk edit. We generally send these requests when there are unexplained differences between data sets and response rates, usually occurring in the case of data segmentation or trimming, or sometimes in different operationalizations of the same construct. It is reasonable to expect that all papers written from the same data collection effort should have consistent sample sizes, descriptive statistics, and response rates, although samples may differ across analyses due to missing data. This is particularly important because, when presenting trimmed data or a subset of a data set, the descriptive statistics may appear different from paper to paper (e.g., if outliers or respondents with missing data are eliminated without explanation in the paper). Failing to acknowledge data overlap could lead readers to believe that these were independent investigations, and possibly skew future meta-analyses.

To protect the double-blind review process, documents and communications included to help editors assess originality are not forwarded to the reviewers, but they do represent an important initial step in the review process to ensure research is reported accurately and consistently, which will help to avoid corrections in the future.
Such transparency allows editors to make informed and fair decisions, and ensures that readers are not unintentionally misled.

While having original data is an important consideration in evaluations of what constitutes original research, it is not the only—nor is it a necessary—criterion. In fact, distinct contributions have been made by studies utilizing the same data sets. Furthermore, papers can make significant theoretical contributions at AMJ through meta-analyses, or by testing new theories that debunk previous assumptions with others' data or by using alternative approaches (Eden, 2002; Shaw & Ertug, 2017). Instead, as we discuss next, to be considered an original submission at AMJ, papers should have a new "conceptual core."

ORIGINAL PAPERS: NEW CONCEPTUAL CORE

As with most of our work as scholars, what constitutes something "original" often involves a subjective evaluation. For example, incorporating a theory from a different discipline into the management field can often help authors develop a novel theoretical contribution in AMJ that moves the management literature forward (DeCelles, Leslie, & Shaw, 2019). In addition, we suggest that papers considered to be original in AMJ should also go beyond replication of previously tested theory, unless the authors can make a case that their examination develops substantive theory (e.g., authors might successfully argue that a new context challenges the underlying assumptions of the theory). Therefore, any submission to AMJ is expected to be substantively new at its conceptual core in order to be sent for review and considered for publication.

What do we mean by "conceptual core"? We mean that, to be considered original, a manuscript should have novel ideas of scope—a substantively unique research question, theoretical model, and literature review of sufficiently different focus from previously reviewed or published work.
Adding a new mediator variable, an additional study, a different theoretical framework, a re-analysis of the same data, or a different outcome variable to a previously rejected submission will likely not mean that a submission is deemed new at its conceptual core. Several reasons support this position. In deductive, quantitative work—which is most of the research AMJ publishes, given its mission—newly submitted papers should not be based on post hoc theorizing (Kerr, 1998).[1] Authors could address these concerns by validating their a priori plans for separate papers and models that, on the surface, may seem similar or draw on the same data. Time-stamped data and pre-registration have been tools used for such validations. While qualitative work is different—in that inductive theory building is often the intent, rather than a positivist, deductive approach—here, too, scholars should be mindful that their AMJ submissions are new at their conceptual core. In qualitative work, this means a paper must have a completely different theoretical focus and analysis, and unique supporting evidence and research question(s), relative to a rejected submission. In summary, papers that are new at their conceptual core are not an evolution or unsolicited revision of a manuscript previously rejected from AMJ.

AMJ strives to quickly provide authors with thorough, constructive feedback to move their research forward, whether at AMJ as a revision or to improve their work for submission at another journal. We do not issue reject-and-resubmit decisions at AMJ. Resubmitting scholarship to AMJ that was previously rejected at our journal (Ireland, 2009), even if seemingly harmless, could reinforce inequalities in our field, due to the demographic and social characteristics or professional status of those whose papers might (and might not) ask for or receive another chance.
Reducing these inequalities is critical for the progress of our field and is consistent with our vision for AMJ's inclusivity, fairness, and transparency (Tihanyi, 2020).

When authors enter their submission on ScholarOne Manuscript Central (http://mc.manuscriptcentral.com/amj), the second question on the submission page asks authors about the previous history of the manuscript at AMJ:

    Have you previously submitted another manuscript to AMJ that uses either some of the same variables from this data collection effort, or that uses some of the same cases/observations from this data collection effort? If so, please describe in detail the precise nature of the data overlap in your cover letter and please attach that previous manuscript in the Cover Letter field (note that multiple files can be uploaded to that field). Please see this editorial (Colquitt, 2013) for more information.

This box, when checked, helps editors to understand the nature of the data overlap more easily, which also makes turnaround time on manuscripts quicker. Current and previous submissions from the same or similar studies and/or by the same authors are often compared, to assess the extent of possible overlap. If authors are unsure of whether or not to check this box during the submission process, they can disclose their other work in the cover letter and upload it for editors to examine.

ORIGINAL TEXT

In addition to deciding whether the data and paper are original for consideration for publication in AMJ, submissions are screened for "original text." By this, we mean original writing that has limited or no overlap between the submitted manuscript and published work without quotations/citations. Submissions that plagiarize or self-plagiarize published works are returned to authors, regardless of where the overlap occurs in the paper. This is because the copyrights for accepted articles are transferred from authors to publishers, and AMJ therefore requires permissions to republish text.
Authors should run their papers through one of the many free plagiarism detectors on the Internet to ensure that any accidentally copied text has been removed, paraphrased, or properly attributed to the source with quotations. We also discourage the reuse of text across author-distributed work, such as from book chapters or popular press articles, without attribution to the other work.

Submissions are also considered to have original text when authors present their review of the relevant literature fairly and independently. At AMJ, the literature review typically sets the stage for how a paper makes a novel theoretical contribution to the field. The review should reflect the authors' own fair take on the current literature, rather than being based on other authors' reviews, and it should help interested readers learn something new. For example, an original literature review crafts and substantiates an author's unique claims that there is an underlying theoretical assumption that should be challenged, or a tension in the current literature that the current paper resolves. Doing an independent literature review also helps ensure that prior papers are fairly credited and accurately interpreted, based on their original source rather than on other authors' standpoints.

PUTTING YOUR BEST FOOT FORWARD

As our field continues to grow, and submissions to journals rise, competition necessarily increases. We thus encourage authors to put their best foot forward when they submit their work to AMJ, especially knowing that they only have one chance. First, although we acknowledge pressure for tenure and the fact that there are schools that count rather than read and evaluate scholarship for rewards and promotion, we believe that slicing data sets into the smallest publishable pieces (aka "salami slicing") is ill advised. AMJ seeks papers that have a robust scope—that is, papers published in our journal typically have several interrelated hypotheses and studies.
Although authors may want to publish different papers based on the same data, addressing the reviewer comments could narrow the differences between those papers.

While choosing to write multiple papers using the same data set will always be the authors' decision, we encourage them to make the strongest possible case for their paper when they submit it to or revise it for AMJ. Revise-and-resubmit invitations are reserved for the most promising papers, but much work is still required to get them to the final publication stage. It is easy to underestimate the amount of work needed for the next favorable decision. Revise-and-resubmit decisions also represent a significant—often yearlong or more—commitment by the editor and the reviewer team, who regularly spend many hours helping authors improve their work. Holding back good theoretical ideas and results in the hope of publishing them in a future article might lead to incomplete revisions and a perception of unresponsiveness by the reviewers that could lead to rejection.

While doing so might not involve text or data overlap per se, dividing projects or studies into several smaller papers can weaken the insights and impact that a paper makes. For example, Kassirer and Angell (1995), former editors of The New England Journal of Medicine, recounted receiving a submission reporting an intervention study of new mothers while the authors had submitted a paper based on the corresponding newborn data to a different journal. Both papers were limited because of the insight that could have been gained by simultaneous consideration of these outcomes. Similar phenomena may occur in submissions in our field, such as research using the same data set but with one paper that demonstrates a main effect and another that examines a moderation, or with key dependent variables separated—the entire product is likely to be of stronger scope and impact when considered together as one paper.
And, as noted above, the second submitted paper may not be sufficiently original in its conceptual core to warrant review, relative to the first paper.

Our intention with this FTE was to assist prospective AMJ authors by providing insights on the journal's review process and to explain the reasoning behind our practices involving original empirical research. While recent adjustments aim to address the latest developments and emerging practices, improvements are expected to continue as a result of new ideas by the members of the Academy of Management community. We welcome their input as we continue to serve the readers, authors, and reviewers of AMJ.

[1] Deductive theory building and testing may, at times, be initially motivated by surprising findings (Shaw, 2017). For example, scholars might use a sequential study approach that theorizes and further investigates some unexpected initial study findings (see, e.g., Bermiss & McDonald, 2018). However, in the absence of theory building and deductive testing in quantitative research, we encourage authors exploring and primarily theorizing from their quantitative data to submit this work to Academy of Management Discoveries (AMD) (please see Rockmann et al., 2021, for the differences between the types of papers published in AMJ and AMD).

ACKNOWLEDGMENTS

We thank Andrew Carton, Luis Diestre, Gerry George, Lindred Greer, Denis Grégoire, Jennifer Howard-Grenville, Ivona Hideg, Bart de Jong, Cindy Muir (Zapata), Floor Rink, Matthew Semadeni, Jason Shaw, Elizabeth Umphress, Gurneeta Vasudeva, Heather Vough, Ingo Weller, and Daphne Yiu for their thoughtful comments and suggestions.

REFERENCES

Bermiss, Y. S., & McDonald, R. 2018. Ideological misfit? Political affiliation and employee departure in the private-equity industry. Academy of Management Journal, 61: 2182–2209.
Colquitt, J. A. 2013. From the editors: Data overlap policies at AMJ. Academy of Management Journal, 56: 331–333.
DeCelles, K. A., Leslie, L. M., & Shaw, J. D. 2019. From the editors: Disciplinary code switching at AMJ: "The Tale of Goldilocks and the Three Journals." Academy of Management Journal, 62: 635–640.
Eden, D. 2002. From the editors: Replication, meta-analysis, scientific progress, and AMJ's publication policy. Academy of Management Journal, 45: 841–846.
Ireland, R. D. 2009. From the editors: When is a "new" paper really new? Academy of Management Journal, 52: 9–10.
Kassirer, J. P., & Angell, M. 1995. Redundant publication: A reminder. New England Journal of Medicine, 333: 449–450.
Kerr, N. L. 1998. HARKing: Hypothesizing after the results are known. Personality and Social Psychology Review, 2: 196–217.
Rockmann, K. J., Bunderson, J. S., Leana, C. R., Hibbert, P., Tihanyi, L., Phan, P. H., & Thatcher, S. M. B. 2021. Publishing in the Academy of Management journals. Academy of Management Discoveries, 7: 1–9.
Shaw, J. D. 2017. From the editors: Advantages of starting with theory. Academy of Management Journal, 60: 819–822.
Shaw, J. D., & Ertug, G. 2017. From the editors: The suitability of simulations and meta-analyses for submissions to Academy of Management Journal. Academy of Management Journal, 60: 2045–2049.
Tihanyi, L. 2020. From the editors: Academy of Management Journal in 2020 and beyond. Academy of Management Journal, 63: 1–6.

Published in print: 1 June 2021