Assuming that black holes are in thermal equilibrium with the Universe, treated as a heat bath, we investigated the finite-temperature effect of the Universe on black-hole evaporation. We found that a black hole absorbing cosmic background radiation steadily gains mass provided its initial mass exceeds the critical mass $M_c = (8\pi T_U/T_P)^{-1} M_P,$ where $T_U$ is the temperature of the Universe, and $T_P$ and $M_P$ are the Planck temperature and mass, respectively. If the initial mass of the black hole is less than $M_c,$ it evaporates in a finite time $t_e.$
For $T_U \approx 2.725 \ \textrm{K},$ the average temperature of the cosmic microwave background radiation, the critical mass is approximately $ 4.5 \times 10^{22} \,\textrm{kg},$ or $ 2.26\times 10^{-8} M_\odot.$
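As a quick numerical check of the critical-mass formula, the following sketch evaluates $M_c = (8\pi T_U/T_P)^{-1} M_P$ at the CMB temperature; the constant values are standard CODATA/astronomical figures supplied here, not taken from the text:

```python
import math

# Standard constant values (assumed here, not from the abstract)
T_P = 1.416784e32   # Planck temperature [K]
M_P = 2.176434e-8   # Planck mass [kg]
M_sun = 1.989e30    # solar mass [kg]

T_U = 2.725         # CMB temperature [K]

# Critical mass: M_c = (8*pi*T_U/T_P)^(-1) * M_P,
# i.e. the mass at which the Hawking temperature equals T_U
M_c = M_P * T_P / (8 * math.pi * T_U)

print(f"M_c = {M_c:.2e} kg = {M_c / M_sun:.2e} solar masses")
# -> M_c = 4.50e+22 kg = 2.26e-08 solar masses
```

A black hole heavier than this is colder than the CMB and absorbs more radiation than it emits, which is why it gains mass.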