Background
The quantitative evaluation of liver iron concentration (LIC) is important in guiding the treatment of blood transfusion-dependent patients. Conventionally, LIC is assessed from R2* or R2 values obtained by magnetic resonance imaging (MRI). However, most MRI studies of iron overload have been restricted by the minimum achievable echo time, so that severe iron overload could hardly be quantified. In this study, we demonstrate a new approach that overcomes this limitation by using ultra-short echo time (UTE) MRI to quantify varying degrees of liver iron overload in a rat model.

Methods
Sixty female Sprague-Dawley rats were randomly assigned to 10 equal groups. Group 1 was not injected with iron dextran. Groups 2 to 10 were injected intraperitoneally with iron dextran at a dose of 15 mg/kg every 3 days. On every 6th day, one group was randomly selected from groups 2 to 10 for MRI scanning and LIC measurement. For groups 1 to 10, images were acquired with a UTE sequence on a 3.0T MR scanner, and the T2* and R2* values were obtained (R2* = 1/T2*). In addition, LIC was measured using an atomic absorption spectrophotometer. The correlation between the R2* value and LIC was analyzed, and a regression equation relating R2* to LIC was established and its reliability verified.

Results
For groups 1 to 10, R2* values ranged from 60.16±4.76 to 1,306.90±42.26 Hz, and LIC ranged from 0.84±0.11 to 5.89±2.64 mg/g dry weight. The R2* value was linearly correlated with LIC (r=0.897, P<0.001), and the linear regression equation was LIC = 0.005 × R2* + 1.783. In the validation analysis, the intraclass correlation coefficient (ICC) between the predicted and measured LIC was 89.5%.

Conclusions
The UTE sequence can be used to quantify varying degrees of hepatic iron overload in the rat model, and LIC can be predicted from the R2* value on a 3.0T MR scanner.
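As a minimal sketch, the reported relaxometry pipeline reduces to two formulas: the relaxation-rate conversion R2* = 1/T2* (with T2* in seconds, R2* in Hz) and the study's regression LIC = 0.005 × R2* + 1.783 (LIC in mg/g dry weight). The coefficients come from the abstract; the example T2* value and function names below are hypothetical illustrations, not part of the study.

```python
def r2star_from_t2star(t2star_ms: float) -> float:
    """Convert a measured T2* in milliseconds to R2* in Hz (R2* = 1/T2*)."""
    return 1000.0 / t2star_ms

def lic_from_r2star(r2star_hz: float) -> float:
    """Predict LIC (mg/g dry weight) from R2* (Hz) using the study's
    regression: LIC = 0.005 * R2* + 1.783."""
    return 0.005 * r2star_hz + 1.783

# Hypothetical example: a liver with T2* = 2 ms gives R2* = 500 Hz
r2star = r2star_from_t2star(2.0)
lic = lic_from_r2star(r2star)
print(r2star, lic)  # 500.0 4.283
```

Applied to the abstract's reported R2* range (about 60 to 1,307 Hz), the same regression spans roughly 2.1 to 8.3 mg/g dry weight, illustrating how a single fitted line maps the measured relaxation rates onto predicted iron concentrations.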