The backbone of standard cosmology is the Friedmann-Robertson-Walker solution to Einstein's equations of general relativity (GR). In recent years, observations have largely confirmed many of the properties of this model, which is based on a partitioning of the universe's energy density into three primary constituents: matter, radiation, and a hypothesized dark energy which, in LambdaCDM, is assumed to be a cosmological constant Lambda. Yet alongside these successes, several unpalatable coincidences (perhaps even inconsistencies) have emerged. One of these is the observed equality of our gravitational horizon R_h(t_0) with the distance ct_0 that light has traveled since the big bang, where t_0 is the current age of the universe. This equality is very peculiar because it need not have occurred at all and, if it did, should only have happened once (right now) in the context of LambdaCDM. In this paper, we propose an explanation for why this equality may actually be required by GR, through the application of Birkhoff's theorem and the Weyl postulate, at least in the case of a flat spacetime. If this proposal is correct, R_h(t) should equal ct for all cosmic time t, not just at its present value t_0. Models such as LambdaCDM would therefore be incomplete, because they ascribe the cosmic expansion to variable conditions not consistent with this relativistic constraint. We show that this may be the reason why the observed galaxy correlation function is not consistent with the predictions of the standard model. We suggest that an R_h = ct universe is easily distinguishable from all other models at large redshift (i.e., in the early universe), where the latter all predict a rapid deceleration.
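The coincidence R_h(t_0) = ct_0 can be illustrated with a short numerical sketch. It assumes (beyond what the abstract states) that the gravitational horizon of a flat universe is R_h = c/H, and it uses the standard analytic age of a flat matter-plus-Lambda universe; the density parameters below are representative values, not fits:

```python
import numpy as np

# Illustrative check of the "coincidence" ct_0 ~ R_h(t_0) in flat LambdaCDM.
# Assumptions (not taken from the abstract itself):
#   - gravitational horizon of a flat universe: R_h = c / H
#   - analytic age of a flat matter + Lambda universe:
#       t_0 = (2 / (3 H_0 sqrt(Om_L))) * asinh(sqrt(Om_L / Om_m))
# Representative (hypothetical) density parameters:
Om_m, Om_L = 0.3, 0.7

# Dimensionless product H_0 * t_0 for flat LambdaCDM
H0_t0 = (2.0 / (3.0 * np.sqrt(Om_L))) * np.arcsinh(np.sqrt(Om_L / Om_m))

# Since R_h(t_0) = c / H_0, the ratio ct_0 / R_h(t_0) is just H_0 * t_0:
print(f"ct_0 / R_h(t_0) = H_0 t_0 = {H0_t0:.3f}")
```

With these parameters the ratio comes out within a few percent of unity, which is precisely the coincidence at issue: in LambdaCDM there is no a priori reason for H_0 t_0 to be so close to 1 today.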