The authors investigate upper and lower bounds on the quantization distortion of independent and identically distributed (i.i.d.) sources in the finite block-length regime. Using the convex-optimization framework of rate-distortion theory, they derive a lower bound on the quantization distortion at finite block length, which is shown to exceed the asymptotic distortion predicted by the rate-distortion function. They also derive two upper bounds on the quantization distortion based on random quantization codebooks, which can achieve any distortion above the asymptotic value. They then apply the new upper and lower bounds to two source models: the discrete binary symmetric source and the continuous Gaussian source. For the binary symmetric source, the upper and lower bounds admit closed-form expressions.
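For context, the asymptotic baselines against which the finite block-length bounds are compared are the classical rate-distortion functions (standard results, not restated in the paper summary above; the notation here is assumed): for the equiprobable binary symmetric source under Hamming distortion and for the Gaussian source under squared-error distortion,

```latex
% Classical asymptotic rate-distortion functions (notation assumed):
% Binary symmetric source (equiprobable), Hamming distortion, 0 <= D <= 1/2:
R(D) = 1 - H_b(D), \qquad
H_b(D) = -D \log_2 D - (1-D)\log_2(1-D)
% Gaussian source with variance \sigma^2, squared-error distortion, 0 < D <= \sigma^2:
R(D) = \frac{1}{2}\log_2\frac{\sigma^2}{D}
```

Inverting these at a fixed rate gives the asymptotic distortion that the paper's finite block-length lower bound is shown to lie strictly above.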