Cross-Fitting-Free Debiased Machine Learning with Multiway Dependence
arXiv stat.ML / 4/7/2026
Key Points
- The paper presents an asymptotic theory for two-step debiased machine learning (DML) estimators in GMM settings with general multiway clustered dependence, specifically avoiding the need for cross-fitting.
- It argues that cross-fitting can be statistically inefficient and computationally costly when first-stage learners are complex and when the effective sample size is limited by the number of independent clusters.
- The authors achieve valid inference without sample splitting by combining Neyman-orthogonal moment conditions with a localisation-based empirical process argument that supports an arbitrary number of clustering dimensions; a toy full-sample estimator is sketched after this list.
- The resulting debiased GMM estimators are proven to be asymptotically linear and asymptotically normal under multiway clustered dependence; a two-way cluster-robust variance sketch also follows below.
- A key contribution is the development of new global and local maximal inequalities for function classes defined on sums of separately exchangeable arrays (the standard representation of such arrays is recalled below), which may be useful beyond the immediate DML application.
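To make the cross-fitting-free idea concrete, here is a minimal toy sketch, not the paper's estimator or data: full-sample estimation of the target coefficient in a partially linear model with the standard Neyman-orthogonal (partialling-out) score, fitting both nuisance regressions on the entire sample rather than on held-out folds. The data-generating process, the random-forest learners, and all dimensions below are illustrative assumptions.

```python
# Toy sketch: cross-fitting-free DML for the partially linear model
#   Y = theta * D + g(X) + eps,
# using the Neyman-orthogonal score
#   psi_i(theta) = (Y_i - m_y(X_i) - theta * (D_i - m_d(X_i))) * (D_i - m_d(X_i)).
# Everything here (DGP, learners, sizes) is an illustrative assumption.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n, theta_true = 2000, 1.5
X = rng.normal(size=(n, 5))
g = np.sin(X[:, 0]) + X[:, 1] ** 2          # nonlinear confounding
D = 0.5 * X[:, 0] + rng.normal(size=n)      # treatment depends on X
Y = theta_true * D + g + rng.normal(size=n)

# Fit both nuisances on the FULL sample -- no sample splitting. The paper's
# theory concerns when this remains valid; Neyman orthogonality makes the
# estimate first-order insensitive to nuisance estimation error.
m_d = RandomForestRegressor(n_estimators=200, min_samples_leaf=20,
                            random_state=0).fit(X, D)
m_y = RandomForestRegressor(n_estimators=200, min_samples_leaf=20,
                            random_state=0).fit(X, Y)
V = D - m_d.predict(X)    # residualized treatment
U = Y - m_y.predict(X)    # residualized outcome

# Solving (1/n) * sum_i psi_i(theta) = 0 gives the debiased estimate:
theta_hat = float(np.sum(V * U) / np.sum(V * V))
print(f"theta_hat = {theta_hat:.3f} (true value {theta_true})")
```

Note that in-sample residuals from flexible learners are exactly where cross-fitting usually earns its keep; the sketch only shows the mechanics, while the paper supplies the complexity and localisation conditions under which the full-sample version is justified.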
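For the multiway-dependence side, here is a hedged sketch of a two-way cluster-robust standard error built by inclusion-exclusion (add the two one-way variances, subtract the intersection), in the spirit of the familiar Cameron-Gelbach-Miller construction. The paper's own multiway variance estimator and regularity conditions may differ; the cluster identifiers in the usage comments are hypothetical.

```python
# Hedged sketch: two-way cluster-robust SE via inclusion-exclusion
# (one-way over g + one-way over h - intersection). Not the paper's formula.
import numpy as np

def twoway_cluster_se(psi, jacobian, g_id, h_id):
    """SE for a just-identified GMM estimate theta_hat solving
    (1/n) sum_i psi_i(theta_hat) = 0, with influence (1/(n*J)) sum_i psi_i."""
    n = psi.shape[0]

    def cluster_sum_sq(ids):
        # Sum the scores within each cluster, then sum squares across clusters.
        _, inv = np.unique(ids, return_inverse=True)
        sums = np.zeros(inv.max() + 1)
        np.add.at(sums, inv, psi)
        return np.sum(sums ** 2)

    pair = np.stack([g_id, h_id], axis=1)  # intersection clusters (g, h) pairs
    _, pair_id = np.unique(pair, axis=0, return_inverse=True)
    meat = (cluster_sum_sq(g_id) + cluster_sum_sq(h_id)
            - cluster_sum_sq(pair_id))
    return np.sqrt(max(meat, 0.0)) / (n * jacobian)

# Usage, continuing the partially linear sketch above (reuses V, U, theta_hat,
# n, rng); the cluster ids below are hypothetical:
#   psi = (U - theta_hat * V) * V        # estimated orthogonal score
#   J = np.mean(V * V)                   # Jacobian of the moment in theta
#   g_id = rng.integers(0, 25, size=n)   # cluster ids along dimension 1
#   h_id = rng.integers(0, 25, size=n)   # cluster ids along dimension 2
#   se = twoway_cluster_se(psi, J, g_id, h_id)
```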
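As background for the last key point, "separately exchangeable array" refers to a standard object whose structure is given by the Aldous-Hoover-Kallenberg representation; this is textbook material, not the paper's result, recalled here only to fix ideas.

```latex
% Standard background (Aldous--Hoover--Kallenberg), not the paper's result:
% a two-way separately exchangeable array admits the representation
\[
  X_{ij} \;=\; f\bigl(\alpha,\, \xi_i,\, \eta_j,\, \varepsilon_{ij}\bigr),
  \qquad
  \alpha,\ \xi_i,\ \eta_j,\ \varepsilon_{ij}
  \ \overset{\mathrm{iid}}{\sim}\ \mathrm{Unif}(0,1),
\]
% i.e. a common shock, row and column effects, and an idiosyncratic term.
% This is the structure under which the paper's global and local maximal
% inequalities control empirical process suprema without cross-fitting.
```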