MR Fingerprinting (MRF) is a relatively new imaging framework capable of accurate and simultaneous quantification of multiple tissue properties for improved tissue characterization and disease diagnosis. While 2D MRF is widely available, extending the method to 3D has been an active area of research, as a 3D approach provides higher spatial resolution and better tissue characterization with an inherently higher signal-to-noise ratio. However, high-resolution 3D MRF requires lengthy acquisition times, especially for large volumes, making it impractical for most clinical applications. In this study, a high-resolution 3D MRF technique combining parallel imaging and deep learning was developed for rapid and simultaneous quantification of T1 and T2 relaxation times. Parallel imaging was first applied along the partition-encoding direction to reduce the amount of acquired data. A convolutional neural network was then integrated into the MRF framework to extract features from the MRF signal evolutions for improved tissue characterization and accelerated mapping. A modified 3D MRF sequence was also developed to acquire training data for the deep learning model, which can then be applied directly to prospectively accelerate 3D MRF scans. The resulting quantitative T1 and T2 maps demonstrate that the proposed method achieves improved tissue characterization compared to prior methods. With the integration of parallel imaging and deep learning, whole-brain (26 × 26 × 18 cm³) quantitative T1 and T2 mapping with 1-mm isotropic resolution was achieved in ~7 min. In addition, the deep learning approach reduced the processing time for extracting tissue properties by a factor of ~7 compared to the standard template-matching method. These improvements make high-resolution whole-brain quantitative MR imaging feasible for clinical applications.
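For context on the baseline the deep learning approach is compared against: standard MRF tissue-property extraction matches each voxel's measured signal evolution against a precomputed dictionary of simulated evolutions, picking the (T1, T2) pair whose entry has the largest normalized inner product with the measurement. The sketch below is a minimal, generic illustration of that template-matching step, not the authors' implementation; the toy dictionary model and grid values are assumptions for demonstration only.

```python
import numpy as np

def template_match(signal, dictionary, t1_t2_pairs):
    """Match one measured MRF signal evolution against a dictionary.

    signal:       (T,) signal evolution for one voxel (real or complex)
    dictionary:   (N, T) simulated evolutions, one row per (T1, T2) pair
    t1_t2_pairs:  (N, 2) T1/T2 values (ms) corresponding to dictionary rows
    Returns the (T1, T2) pair maximizing the normalized inner product.
    """
    d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    s = signal / np.linalg.norm(signal)
    scores = np.abs(d @ np.conj(s))          # one correlation score per entry
    return tuple(t1_t2_pairs[np.argmax(scores)])

# Toy demonstration (hypothetical signal model and parameter grid):
times = np.arange(1, 201, dtype=float)       # 200 time points (ms)
pairs = [(t1, t2) for t1 in (500, 1000, 1500) for t2 in (50, 100)]
dic = np.array([(1 - np.exp(-times / t1)) * np.exp(-times / t2)
                for t1, t2 in pairs])

rng = np.random.default_rng(0)
noisy = dic[3] + 1e-3 * rng.standard_normal(times.size)  # true pair (1000, 100)
matched = template_match(noisy, dic, np.array(pairs))
```

Because this exhaustive search scales with dictionary size, it is the per-voxel processing cost that a trained network sidesteps, which is consistent with the ~7-fold speedup reported above.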
Copyright © 2019 The Authors. Published by Elsevier Inc. All rights reserved.