The genomics community has frequently compared advances in sequencing to advances in microelectronics. Lately there have been many claims, including by the National Human Genome Research Institute (NHGRI), that genomics is outpacing developments in computing as measured by Moore's law – the observation that computing capability per dollar doubles roughly every 18–24 months. Celebrations of the “$1000 genome” and other speed-related sequencing milestones might be dismissed as a distraction from genomics' slowness in delivering clinical breakthroughs, but the fact that such celebrations have been persistently encouraged by the NHGRI reveals a great deal about the priorities and expectations of the American general public, the intended audience of the genomics–computing comparison. By delving into the history of speculative thinking about sequencing and computing, this article demonstrates just how much more receptive to high-risk/high-payoff ventures the NIH and the general public have become. The article also traces some of the roots and consequences of the association of “innovation talk” with genomics, and offers a way to look past that association to the less glamorous (but arguably much more important) contributions of the NHGRI to building the field of genomics.