
P-Hacking Via Academic Finance Research Conferences

The post “P-Hacking Via Academic Finance Research Conferences” first appeared on Alpha Architect Blog.

Documentation of the File Drawer Problem at Finance Conferences: A Follow-Up Study

  • Manoela N. Morais and Matthew R. Morey
  • Journal of Investing
  • A version of this paper can be found here
  • Want to read our summaries of academic finance papers? Check out our Academic Research Insight category

What are the research questions?

This research updates “Documentation of the File Drawer Problem in Academic Finance Journals,” published by the same authors in the Journal of Investment Management in 2018; a summary of that article can be found here. The “file drawer problem” refers to the idea that journal editors are predisposed to accepting articles for publication only if they contain statistically significant results. Since editors are motivated to improve journal impact numbers and citation counts, this bias is not surprising: articles with significant results are more likely to be cited and thus improve journal impact. Articles with nonsignificant results end up hidden away in the researchers’ file drawer, never submitted anywhere at all. Putting numbers to the problem in academic journals, the authors reported that only 2.1% of articles published across 29 finance journals contained nonsignificant results, and five of those 29 journals published no such studies at all. This update examines the degree to which finance conferences exhibit a similar pattern.

  1. Is there a significant file drawer problem with respect to academic financial conferences?

What are the Academic Insights?

  1. YES. The file drawer problem was observed to be at least as serious at finance conferences as it is in finance journals. The authors constructed a database of 3,425 empirical articles presented at the annual Financial Management Association (FMA) conference over five years. The FMA is the largest academic finance conference by number of papers. Each paper examined was a stand-alone research article; roundtables, panel sessions, pedagogy series, and debates were not included. Of the 3,425 articles, only 14 (or 0.41%) had nonsignificant results over the five-year period, compared with the 2.1% of articles published in academic journals. It also appears that the problem within the FMA intensified between 2014 and 2018. Stunning.

Why does it matter?

As with journal publications, this article provides evidence that the file drawer problem is alive and well at academic financial research conferences. It appears that potential presenters should avoid submitting analyses with nonsignificant results or risk rejection by the conference. As a result, conference attendees see a biased set of research presentations, comprising only those papers that exhibit statistical significance. The important question here is how much this bias encourages p-hacking or data-mining practices in the pursuit of significant results. We have seen increasing attention paid to p-hacking, data mining, and other “bad habits,” and to the negative impact they have on the credibility of the discipline.

In 2017, Campbell Harvey (in his Presidential Address to the American Finance Association) took the issue one step further, into the intentional misuse of statistics in empirical research. He defines intentional p-hacking as the practice of reporting only significant results when the researcher has tried a myriad of statistical methods, empirical approaches, or data manipulations. The underlying motivation for such practices is the desire to be published in a world where finance journals are biased toward publishing significant results almost exclusively. The underlying risk of p-hacking and data mining, especially in the investments area, is identifying significant results that are likely just random events. Since random events, by definition, do not repeat themselves in a predictable manner, the investment results are likely to fail on a going-forward basis. Data mining and p-hacking go a long way toward explaining why investment strategies fail out-of-sample, or, even worse, when they are implemented in the real world.
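To see why running many tests and reporting only the winners is so dangerous, here is a minimal simulation sketch (illustrative only; the strategy count, sample length, and return volatility are our assumptions, not figures from the paper). Every simulated "strategy" is pure noise with a true mean return of zero, yet roughly 5% of them clear a conventional two-sided significance threshold by chance alone:

```python
import numpy as np

rng = np.random.default_rng(42)

n_strategies = 200      # hypothetical number of backtests a researcher tries
n_obs = 252             # roughly one year of daily returns per strategy
threshold = 1.96        # ~5% two-sided significance cutoff

# Every strategy is pure noise: true mean return is exactly zero.
returns = rng.normal(loc=0.0, scale=0.01, size=(n_strategies, n_obs))

# t-statistic of the mean return for each strategy
means = returns.mean(axis=1)
std_errs = returns.std(axis=1, ddof=1) / np.sqrt(n_obs)
t_stats = means / std_errs

n_significant = int(np.sum(np.abs(t_stats) > threshold))
print(f"{n_significant} of {n_strategies} pure-noise strategies look 'significant'")
print(f"best |t|-stat found by mining: {np.abs(t_stats).max():.2f}")
```

Reporting only the best of these 200 backtests would look like a publishable discovery, but the "edge" is a random event and should be expected to vanish out-of-sample, which is exactly Harvey's point.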

This criticism can now be extended to finance conferences.

The most important chart from the paper

[Chart from the paper: nonsignificant results at the FMA conference vs. academic finance journals]


The file drawer problem is a publication bias in which journal editors are much more likely to accept empirical papers with statistically significant results than those with statistically nonsignificant results. As a result, papers with nonsignificant results go unpublished and are relegated to the file drawer, never to be seen by others. In a previous paper, Morey and Yadav (2018) examined the file drawer problem in finance journals and found evidence strongly suggesting that such a publication bias exists. In this follow-up study, we examine the prevalence of the file drawer problem at finance conferences; to our knowledge, this is the first article in finance to attempt such an analysis. To do this, we examine every empirical paper presented at the annual Financial Management Association (FMA) conference from 2014–2018. In an examination of 3,425 empirical papers, we found that fewer than 0.5% had statistically nonsignificant results. These results suggest that there is also a significant file drawer problem at finance conferences.

Disclosure: Alpha Architect

The views and opinions expressed herein are those of the author and do not necessarily reflect the views of Alpha Architect, its affiliates or its employees. Our full disclosures are available here. Definitions of common statistics used in our analysis are available here (towards the bottom).

This site provides NO information on our value ETFs or our momentum ETFs. Please refer to this site.

Disclosure: Interactive Brokers

Information posted on IBKR Traders’ Insight that is provided by third-parties and not by Interactive Brokers does NOT constitute a recommendation by Interactive Brokers that you should contract for the services of that third party. Third-party participants who contribute to IBKR Traders’ Insight are independent of Interactive Brokers and Interactive Brokers does not make any representations or warranties concerning the services offered, their past or future performance, or the accuracy of the information provided by the third party. Past performance is no guarantee of future results.

This material is from Alpha Architect and is being posted with permission from Alpha Architect. The views expressed in this material are solely those of the author and/or Alpha Architect and IBKR is not endorsing or recommending any investment or trading discussed in the material. This material is not and should not be construed as an offer to sell or the solicitation of an offer to buy any security. To the extent that this material discusses general market activity, industry or sector trends or other broad based economic or political conditions, it should not be construed as research or investment advice. To the extent that it includes references to specific securities, commodities, currencies, or other instruments, those references do not constitute a recommendation to buy, sell or hold such security. This material does not and is not intended to take into account the particular financial conditions, investment objectives or requirements of individual customers. Before acting on this material, you should consider whether it is suitable for your particular circumstances and, as necessary, seek professional advice.
