Recently, Ant Colony Optimization (ACO) algorithms have been shown to be efficient in uncertain environments, such as noisy or dynamically changing fitness functions. Most of these analyses focus on combinatorial problems, such as path finding. We analyze an ACO algorithm in a setting where we try to optimize the simple OneMax test function, but with additive posterior noise sampled from a Gaussian distribution. Without noise, the classical (μ + 1)-EA outperforms any ACO algorithm, with smaller μ being better; with large noise, however, the (μ + 1)-EA fails, even for high values of μ (which are known to help against small noise). In this paper we show that ACO is able to deal with arbitrarily large noise in a graceful manner: as long as the evaporation factor ρ is small enough, depending on the parameter σ² of the noise and the dimension n of the search space (ρ = o(1/(n(n + σ log n)² log n))), optimization will be successful.
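To make the setting concrete, the following is a minimal sketch of an MMAS-style ACO on OneMax with additive Gaussian posterior noise. The function name `mmas_noisy_onemax`, the parameter choices, and the best-so-far update rule are illustrative assumptions of ours; the precise algorithm variant analyzed in the paper (in particular, how it compares noisy fitness values) may differ.

```python
import random

def noisy_onemax(x, sigma, rng):
    # OneMax fitness (number of one-bits) plus additive Gaussian posterior noise.
    return sum(x) + rng.gauss(0.0, sigma)

def mmas_noisy_onemax(n, rho, sigma, iterations, seed=0):
    """Illustrative MMAS-style ACO on noisy OneMax.

    Pheromone tau[i] is the probability of setting bit i to 1, kept within
    the standard borders [1/n, 1 - 1/n]. rho is the evaporation factor.
    """
    rng = random.Random(seed)
    lo, hi = 1.0 / n, 1.0 - 1.0 / n
    tau = [0.5] * n
    best_x, best_f = None, float("-inf")
    for _ in range(iterations):
        # Construct a solution by sampling each bit from its pheromone value.
        x = [1 if rng.random() < t else 0 for t in tau]
        f = noisy_onemax(x, sigma, rng)
        # Naive best-so-far comparison of noisy values; the algorithm
        # analyzed in the paper may handle noisy comparisons differently.
        if best_x is None or f >= best_f:
            best_x, best_f = x, f
        # Evaporate and reinforce toward the current best-so-far solution.
        tau = [min(hi, max(lo, (1 - rho) * t + rho * b))
               for t, b in zip(tau, best_x)]
    return best_x
```

A small evaporation factor ρ makes the pheromone values change slowly, so a single misleading noisy evaluation cannot pull the distribution far off course; this slow accumulation of information is the intuition behind the ρ = o(1/(n(n + σ log n)² log n)) condition.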