The growth of Listeria monocytogenes inoculated onto frankfurters at four inoculum levels (0.1, 0.04, 0.01, and 0.007 CFU/g) was examined at 4, 8, and 12°C until L. monocytogenes populations reached a detection limit of at least 2 CFU/g. The experiment was scaled down to simulate a 25-g sample drawn from a 100-lb batch in a factory setting by using a 0.55-g sample drawn from a 1,000-g batch in the laboratory. Samples of 0.55 g were enriched in PDX-LIB selective medium, and presumptive results were confirmed on modified Oxford agar. Based on the time to detect (TTD) at each inoculum level and each temperature, a shelf life model was constructed to predict the detection or risk levels reached by L. monocytogenes on frankfurters. The TTD increased as inoculum size and storage temperature decreased. At 4°C, the observed TTDs (±standard error) were 42.0 ± 1.0, 43.5 ± 0.5, 50.7 ± 1.5, and 55.0 ± 3.0 days for inoculum sizes of 0.1, 0.04, 0.01, and 0.007 CFU/g, respectively. For the same inoculum sizes, the TTDs at 8°C were 4.5 ± 0.5, 6.5 ± 0.5, 7.0 ± 1.0, and 8.5 ± 0.5 days. Significant differences (P < 0.05) between TTDs were observed only when inoculum sizes differed by at least 2 log units. On a shelf life plot of ln(TTD) versus temperature, the Q10 values (the factor by which TTD changes with a 10°C change in temperature) ranged from 24.5 to 44.7, with no significant influence of inoculum density. When the observed TTDs were compared with detection times expected from a study that used an inoculum size of 10 to 20 CFU/g, significant deviations were noted at the lower inoculum levels. These results can be valuable in designing a safety-based shelf life model for frankfurters and in performing quantitative risk assessment of listeriosis at low, practical contamination levels.
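A shelf life plot of this kind treats ln(TTD) as approximately linear in storage temperature, so Q10 = exp(10·b), where b is the magnitude of the slope. The abstract's Q10 values were presumably fitted across all three temperatures (4, 8, and 12°C), and the 12°C TTDs are not reported here, so the sketch below only illustrates the pairwise calculation from two temperatures; the `q10` helper is a hypothetical illustration, not the authors' analysis code.

```python
import math

# Mean TTDs (days) from the abstract, by storage temperature (°C).
# Values correspond to inoculum sizes 0.1, 0.04, 0.01, and 0.007 CFU/g.
ttd = {
    4: [42.0, 43.5, 50.7, 55.0],
    8: [4.5, 6.5, 7.0, 8.5],
}

def q10(ttd_low_temp, ttd_high_temp, t_low, t_high):
    """Q10 from two points on a ln(TTD)-vs-temperature plot.

    Slope magnitude: b = [ln(TTD_low) - ln(TTD_high)] / (t_high - t_low);
    Q10 = exp(10 * b), i.e. the fold change in TTD per 10°C.
    """
    b = (math.log(ttd_low_temp) - math.log(ttd_high_temp)) / (t_high - t_low)
    return math.exp(10 * b)

# Pairwise Q10 estimates from the 4 and 8°C data only (illustrative).
for low, high in zip(ttd[4], ttd[8]):
    print(f"Q10 estimate: {q10(low, high, 4, 8):.1f}")
```

Because only the 4 and 8°C means are available here, these pairwise estimates will not reproduce the published 24.5 to 44.7 range, which rests on the full data set.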