The central challenge in 21~cm cosmology is isolating the cosmological signal from bright foregrounds. Many separation techniques rely on accurate knowledge of the sky and of the instrumental response, including the antenna primary beam. For static drift-scan telescopes such as the Hydrogen Epoch of Reionization Array \citep[HERA,][]{DeBoer2017}, primary beam characterization is particularly challenging because standard beam-calibration routines do not apply \citep{Cornwell2005} and current techniques require accurate source catalogs at the telescope resolution. We present an extension of the method of \citet{Pober2012}, which uses beam symmetries to create a network of overlapping source tracks that breaks the degeneracy between source flux density and beam response and allows their simultaneous estimation. We fit the beam response of our instrument using early HERA observations and find that our results agree well with electromagnetic simulations down to a -20~dB level in power relative to peak gain for sources with high signal-to-noise ratio. In addition, we construct a source catalog of 90 sources down to a flux density of 1.4~Jy at 151~MHz.
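As a rough illustration of the joint fit described above (an alternating least-squares sketch under assumptions, not the paper's actual algorithm), the snippet below treats each track sample as a measurement d = S[source] * B[pixel] and solves for the flux densities S and the beam values B in turn, pinning the beam peak to unity to fix the overall flux/beam scale. All names and the data layout are hypothetical.

import numpy as np

def fit_flux_and_beam(d, src_idx, pix_idx, n_src, n_pix, n_iter=50):
    """d: measured peak fluxes along source tracks; src_idx / pix_idx: integer
    arrays giving the source and the beam pixel sampled by each measurement."""
    S, B = np.ones(n_src), np.ones(n_pix)
    for _ in range(n_iter):
        for s in range(n_src):            # flux of each source, beam held fixed
            m = src_idx == s
            if m.any():
                S[s] = np.sum(d[m] * B[pix_idx[m]]) / np.sum(B[pix_idx[m]] ** 2)
        for p in range(n_pix):            # beam at each sampled pixel, fluxes held fixed
            m = pix_idx == p
            if m.any():
                B[p] = np.sum(d[m] * S[src_idx[m]]) / np.sum(S[src_idx[m]] ** 2)
        S, B = S * B.max(), B / B.max()   # pin the beam peak to 1 to fix the scale
    return S, B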
The detection of the Epoch of Reionization (EoR) delay power spectrum using a foreground-avoidance approach depends strongly on the instrument chromaticity. The systematic effects induced by the radio telescope spread the foreground signal in the delay domain, contaminating the theoretically observable EoR window. It is therefore essential to understand and limit these chromatic effects. This paper describes a method to simulate the frequency and time responses of an antenna by simultaneously taking into account the analogue RF receiver, the transmission cable, and the mutual coupling caused by adjacent antennas. Applied to the Hydrogen Epoch of Reionization Array (HERA), this study reveals the presence of significant reflections at high delays caused by the 150-m cable which links the antenna to the back-end. It also shows that waves can propagate from one dish to another across large sections of the array because of mutual coupling. In this more realistic approach, the simulated system time response is attenuated by a factor of $10^{4}$ after a characteristic delay which depends on the size of the array and on the antenna position. Ultimately, the system response is attenuated by a factor of $10^{5}$ after 1400 ns because of the reflections in the cable, which corresponds to characterizable ${k_\parallel}$-modes above 0.7 $h\,\mathrm{Mpc}^{-1}$ at 150 MHz. This new study thus shows that the detection of the EoR signal with HERA Phase I will be more challenging than expected. On the other hand, it improves our understanding of the telescope, which is essential to mitigate the instrument chromaticity.
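For reference, the quoted correspondence between delay and line-of-sight scale follows from the standard delay-to-$k_\parallel$ conversion. The sketch below evaluates it for 1400 ns at 150 MHz, assuming a flat LCDM cosmology with Omega_m = 0.31 (an assumption for illustration, not a value taken from the paper).

import numpy as np

f21 = 1420.405751e6           # rest frequency of the 21 cm line [Hz]
c_kms = 299792.458            # speed of light [km/s]
Om, Ol = 0.31, 0.69           # assumed flat LCDM density parameters

def k_parallel(tau_ns, freq_mhz):
    """k_parallel in h Mpc^-1 for a delay tau [ns] observed at freq [MHz],
    using k_par = 2*pi * tau * f21 * H(z) / (c * (1+z)^2)."""
    z = f21 / (freq_mhz * 1e6) - 1.0
    Hz_h = 100.0 * np.sqrt(Om * (1 + z) ** 3 + Ol)   # H(z) in h km/s/Mpc
    return 2 * np.pi * (tau_ns * 1e-9) * f21 * Hz_h / (c_kms * (1 + z) ** 2)

print(k_parallel(1400.0, 150.0))   # ~0.75 h/Mpc, consistent with the ~0.7 quoted above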
We describe the validation of the HERA Phase I software pipeline by a series of modular tests, building up to an end-to-end simulation. The philosophy of this approach is to validate the software and algorithms used in the Phase I upper limit analysis on wholly synthetic data satisfying the assumptions of that analysis, not addressing whether the actual data meet these assumptions. We discuss the organization of this validation approach, the specific modular tests performed, and the construction of the end-to-end simulations. We explicitly discuss the limitations in scope of the current simulation effort. With mock visibility data generated from a known analytic power spectrum and a wide range of realistic instrumental effects and foregrounds, we demonstrate that the current pipeline produces power spectrum estimates that are consistent with known analytic inputs to within thermal noise levels (at the 2 sigma level) for k > 0.2 h/Mpc for both bands and fields considered. Our input spectrum is intentionally amplified to enable a strong `detection' at k ~0.2 h/Mpc -- at the level of ~25 sigma -- with foregrounds dominating on larger scales, and thermal noise dominating at smaller scales. Our pipeline is able to detect this amplified input signal after suppressing foregrounds with a dynamic range (foreground to noise ratio) of > 10^7. Our validation test suite uncovered several sources of scale-independent signal loss throughout the pipeline, whose amplitude is well-characterized and accounted for in the final estimates. We conclude with a discussion of the steps required for the next round of data analysis.
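A minimal sketch of the kind of consistency check described above (an assumed form, not the pipeline's code): compare recovered band powers against the known analytic input in units of the thermal-noise error bar and flag any 2-sigma outliers above the quoted k cutoff. The function and argument names are hypothetical.

import numpy as np

def consistency_check(k, p_recovered, p_input, sigma_noise, k_min=0.2, n_sigma=2.0):
    """Per-mode z-scores of recovered minus input power, and a pass/fail mask."""
    sel = k > k_min
    z = (p_recovered[sel] - p_input[sel]) / sigma_noise[sel]
    return z, np.abs(z) < n_sigma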
We analyze data from the Hydrogen Epoch of Reionization Array (HERA). This is the third in a series of papers on the closure phase delay-spectrum technique designed to detect the HI 21 cm emission from cosmic reionization. We present the details of the data and models employed in the power spectral analysis and discuss limitations to the process. We compare images and visibility spectra made with HERA data to parallel quantities generated from sky models based on the GLEAM survey, incorporating the HERA telescope model. We find reasonable agreement between images made from HERA data and those generated from the models, down to the confusion level. For the visibility spectra, there is broad agreement between model and data across the full band of $\sim 80$ MHz. However, models with only GLEAM sources do not reproduce a roughly sinusoidal spectral structure at the tens of percent level seen in the observed visibility spectra on scales of $\sim 10$ MHz on 29 m baselines. We find that this structure is likely due to diffuse Galactic emission, predominantly from the Galactic plane, filling the far sidelobes of the antenna primary beam. We show that our current knowledge of the frequency dependence of the diffuse radio sky emission, and of the primary beam at large zenith angles, is inadequate to provide an accurate reproduction of the diffuse structure in the models. We discuss the implications of this missing structure in the models, both for calibration and for the search for the HI 21 cm signal, as well as possible mitigation techniques.
Combining the visibilities measured by an interferometer to form a cosmological power spectrum is a complicated process. In a delay-based analysis, the mapping between instrumental and cosmological space is not a one-to-one relation. Instead, neighbouring modes contribute to the power measured at one point, with their respective contributions encoded in the window functions. To better understand the power measured by an interferometer, we assess the impact of instrument characteristics and analysis choices on these window functions. Focusing on the Hydrogen Epoch of Reionization Array (HERA) as a case study, we find that long-baseline observations correspond to enhanced low-$k$ tails of the window functions, which facilitate foreground leakage, whilst an informed choice of bandwidth and frequency taper can reduce those tails. With simple test cases and realistic simulations, we show that, apart from tracing mode mixing, the window functions help to accurately reconstruct the power spectrum estimate from simulated visibilities. The window functions depend strongly on the beam chromaticity and less on its spatial structure; a Gaussian approximation, ignoring sidelobes, is sufficient. Finally, we investigate the potential of asymmetric window functions, down-weighting the contribution of low-$k$ power to avoid foreground leakage. The window functions presented here correspond to the latest HERA upper limits for the full Phase I data. They allow an accurate reconstruction of the power spectrum measured by the instrument and will be used in future analyses to confront theoretical models and data directly in cylindrical space.
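For concreteness, the window functions enter through the standard band-power relation (notation assumed here): the expected value of each estimated band power is a normalised, weighted average of the true power spectrum over neighbouring modes,
\begin{equation}
  \langle \hat{p}_\alpha \rangle \;=\; \sum_\beta W_{\alpha\beta}\, p_\beta ,
  \qquad \sum_\beta W_{\alpha\beta} = 1 ,
\end{equation}
where $\hat{p}_\alpha$ is the estimated band power, $p_\beta$ the true power spectrum, and each row of $W$ sums to unity.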
Radio interferometers targeting the 21 cm brightness-temperature fluctuations at high redshift are subject to systematic effects that operate over a range of different timescales. These can be isolated by designing appropriate Fourier filters that operate in fringe-rate (FR) space, the Fourier dual of local sidereal time (LST). Applications of FR filtering include separating effects that are correlated with the rotating sky from those fixed relative to the ground, down-weighting emission in the primary beam sidelobes, and suppressing noise. However, FR filtering causes the noise contributions to the visibility data to become correlated in time, making the interpretation of subsequent averaging and error-estimation steps more subtle. In this paper, we describe fringe-rate filters that are implemented using discrete prolate spheroidal sequences and designed for two different purposes: beam sidelobe/horizon suppression (the `mainlobe' filter) and ground-locked systematics removal (the `notch' filter). We apply these to simulated data and study how their properties affect the visibilities and the power spectra generated from the simulations. We also include an introduction to fringe-rate filtering and a demonstration of the filters applied to simple situations to aid understanding.
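As an illustration of the idea (a sketch under stated assumptions, not the HERA pipeline implementation), the snippet below builds a notch-style filter from discrete prolate spheroidal sequences concentrated within a chosen fringe-rate half-width around zero, then subtracts the fitted, ground-locked component from a visibility time series. The function name and toy parameters are hypothetical.

import numpy as np
from scipy.signal.windows import dpss

def notch_filter(vis, dt, fr_half, n_modes=None):
    """vis: complex visibility vs. time; dt: sample spacing [s];
    fr_half: half-width of the notch in fringe rate [Hz]."""
    n = len(vis)
    nw = fr_half * n * dt                  # time-halfbandwidth product
    k = n_modes or max(1, int(2 * nw))     # roughly 2NW well-concentrated sequences
    basis = dpss(n, nw, Kmax=k).T          # shape (n_times, k)
    coeffs, *_ = np.linalg.lstsq(basis, vis, rcond=None)
    return vis - basis @ coeffs            # data with the near-zero fringe-rate band removed

# toy usage: 600 samples at 10 s cadence, notch out |fringe rate| < 0.2 mHz
t = np.arange(600) * 10.0
vis = np.exp(2j * np.pi * 1.5e-3 * t) + 0.5          # rotating-sky fringe plus a static term
print(np.abs(notch_filter(vis, 10.0, 2e-4)).mean())  # the static (ground-locked) term is suppressed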
Radio interferometers aiming to measure the power spectrum of the redshifted 21 cm line during the Epoch of Reionisation (EoR) need to achieve an unprecedented dynamic range to separate the weak signal from the overwhelming foreground emission. Calibration inaccuracies can compromise the sensitivity of these measurements to the extent that a detection of the EoR is precluded. An alternative to standard analysis techniques makes use of the closure phase, which allows one to bypass antenna-based, direction-independent calibration. As in standard approaches, we use a delay-spectrum technique to search for the EoR signal. Using 94 nights of data observed with Phase I of the Hydrogen Epoch of Reionization Array (HERA), we place approximate constraints on the 21 cm power spectrum at $z=7.7$. We find at 95% confidence that the 21 cm EoR brightness temperature is $\le$(372)$^2$ "pseudo" mK$^2$ at 1.14 "pseudo" $h$ Mpc$^{-1}$, where "pseudo" emphasises that these limits are to be interpreted as approximations to the actual distance scales and brightness temperatures. Using a fiducial EoR model, we demonstrate the feasibility of detecting the EoR with the full array. Compared to standard methods, the closure-phase processing is relatively simple, thereby providing an important independent check on results derived using visibility intensities or related quantities.
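For reference, the closure phase on an antenna triad $(a,b,c)$ is the sum of the visibility phases around the triangle (notation assumed here),
\begin{equation}
  \phi_{abc}(\nu, t) \;=\; \arg V_{ab} + \arg V_{bc} + \arg V_{ca} ,
\end{equation}
and since each measured visibility carries antenna gain factors, $\hat{V}_{ab} = g_a g_b^{*} V_{ab}$, the gain phases contribute $(\phi_a-\phi_b)+(\phi_b-\phi_c)+(\phi_c-\phi_a)=0$ around the triad, which is why antenna-based, direction-independent calibration can be bypassed.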
We present a Bayesian jackknife test for assessing the probability that a data set contains biased subsets and, if so, which of the subsets are likely to be biased. The test can be used to assess the presence and likely source of statistical tension between different measurements of the same quantities in an automated manner. Under certain broadly applicable assumptions, the test is analytically tractable. We also provide an open-source code, chiborg, that performs both analytic and numerical computations of the test on general Gaussian-distributed data. After exploring the information-theoretic aspects of the test and its performance with an array of simulations, we apply it to data from the Hydrogen Epoch of Reionization Array (HERA) to assess whether different sub-seasons of observing can justifiably be combined to produce a deeper 21 cm power spectrum upper limit. We find that, with a handful of exceptions, the HERA data in question are statistically consistent and that combining them is justified. We conclude by pointing out the wide applicability of this test, including to CMB experiments and the $H_0$ tension.
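The flavour of such a test can be sketched as follows (an illustration under stated assumptions, not the chiborg API): for Gaussian data with a Gaussian prior on a common mean, the evidence for "all subsets consistent" versus "a chosen subset carries an extra additive bias" reduces to a ratio of multivariate normal densities. All prior widths, names, and the toy data below are assumptions.

import numpy as np
from scipy.stats import multivariate_normal

def log_evidence(x, sigma, mu0, tau, bias_mask=None, sigma_b=1.0):
    """Marginal likelihood of data x with per-point noise sigma, a common mean
    mu ~ N(mu0, tau^2), and optionally a bias b ~ N(0, sigma_b^2) added to the
    points flagged in bias_mask."""
    n = len(x)
    cov = np.diag(np.asarray(sigma) ** 2) + tau ** 2 * np.ones((n, n))
    if bias_mask is not None:
        s = np.asarray(bias_mask, dtype=float)
        cov = cov + sigma_b ** 2 * np.outer(s, s)
    return multivariate_normal(mean=np.full(n, mu0), cov=cov).logpdf(x)

x = np.array([1.0, 1.2, 0.9, 3.1])        # toy "sub-season" band powers
sig = np.array([0.3, 0.3, 0.3, 0.3])
ln_bf = (log_evidence(x, sig, 1.0, 1.0, bias_mask=[0, 0, 0, 1], sigma_b=2.0)
         - log_evidence(x, sig, 1.0, 1.0))
print(ln_bf)   # > 0 favours the hypothesis that the last point is biased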
We report the most sensitive upper limits to date on the 21 cm Epoch of Reionization power spectrum, using 94 nights of observing with Phase I of the Hydrogen Epoch of Reionization Array (HERA). Using analysis techniques similar to those in previously reported limits (HERA Collaboration 2022a), we find at 95% confidence that $\Delta^2(k = 0.34$ $h$ Mpc$^{-1}$) $\leq 457$ mK$^2$ at $z = 7.9$ and that $\Delta^2(k = 0.36$ $h$ Mpc$^{-1}$) $\leq 3,496$ mK$^2$ at $z = 10.4$, improvements by factors of 2.1 and 2.6, respectively. These limits are mostly consistent with thermal noise over a wide range of $k$ after our data-quality cuts, despite a relatively conservative analysis designed to minimize signal loss. Our results are validated with both statistical tests on the data and end-to-end pipeline simulations. We also report updated constraints on the astrophysics of reionization and the cosmic dawn. Using multiple independent modeling and inference techniques previously employed by HERA Collaboration (2022b), we find that the intergalactic medium must have been heated above the adiabatic-cooling limit at least as early as $z = 10.4$, ruling out a broad set of so-called "cold reionization" scenarios. If this heating is due to high-mass X-ray binaries during the cosmic dawn, as is generally believed, our result's 99% credible interval excludes the local relationship between soft X-ray luminosity and star formation rate, and thus requires heating driven by evolved low-metallicity stars.
Detection of the faint 21 cm line emission from the Cosmic Dawn and Epoch of Reionization will require not only exquisite control over instrumental calibration and systematics to achieve the necessary dynamic range of observations but also validation of analysis techniques to demonstrate their statistical properties and signal loss characteristics. A key ingredient in achieving this is the ability to perform high-fidelity simulations of the kinds of data that are produced by the large, many-element, radio interferometric arrays that have been purpose-built for these studies. The large scale of these arrays presents a computational challenge, as one must simulate a detailed sky and instrumental model across many hundreds of frequency channels, thousands of time samples, and tens of thousands of baselines for arrays with hundreds of antennas. In this paper, we present a fast matrix-based method for simulating radio interferometric measurements (visibilities) at the necessary scale. We achieve this through judicious use of primary beam interpolation, fast approximations for coordinate transforms, and a vectorized outer product to expand per-antenna quantities to per-baseline visibilities, coupled with standard parallelization techniques. We validate the results of this method, implemented in the publicly available matvis code, against a high-precision reference simulator, and explore its computational scaling on a variety of problems.
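A minimal sketch of the outer-product step described above (assumptions only; see the matvis paper and code for the actual implementation): per-antenna, per-source factors are assembled once, and every baseline's visibility then follows from a single matrix product. All names and the toy geometry are hypothetical.

import numpy as np

def visibilities(ant_pos, src_dirs, flux, beam, freq_hz):
    """ant_pos: (n_ant, 3) positions [m]; src_dirs: (n_src, 3) unit vectors;
    flux: (n_src,) [Jy]; beam: (n_ant, n_src) voltage beam; freq_hz: scalar.
    Builds z[a, s] = A_a(s) * sqrt(I_s) * exp(-2*pi*i*nu * x_a . s_hat / c),
    so that V_ab = sum_s z[a, s] * conj(z[b, s])."""
    c = 299792458.0
    phase = np.exp(-2j * np.pi * freq_hz / c * (ant_pos @ src_dirs.T))  # (n_ant, n_src)
    z = beam * np.sqrt(flux)[None, :] * phase
    return z @ z.conj().T          # (n_ant, n_ant) matrix of visibilities V_ab

# toy usage: 3 antennas, 2 sources near zenith, unit voltage beams
ants = np.array([[0.0, 0.0, 0.0], [14.6, 0.0, 0.0], [0.0, 14.6, 0.0]])
dirs = np.array([[0.0, 0.0, 1.0], [0.1, 0.0, np.sqrt(1 - 0.01)]])
V = visibilities(ants, dirs, np.array([1.0, 2.0]), np.ones((3, 2)), 150e6)
print(V.shape, V[0, 1])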