Surrogate data testing is a popular method for detecting nonlinearity and chaos in time series and has been widely applied to erratic time series. The null hypothesis typically stated is that the time series is generated by a linear, stochastic, Gaussian, stationary process, possibly observed through an invertible, static, nonlinear measurement function. We point out that rejection of this hypothesis need not indicate an underlying nonlinear or chaotic system; it may also be caused by, e.g., a linear, stochastic, non-Gaussian, non-minimum phase sequence. We investigate the power of the test against such non-minimum phase sequences.
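As background, the standard surrogates consistent with the linear Gaussian null hypothesis are commonly generated by Fourier phase randomization: the amplitude spectrum (and hence the autocorrelation) of the data is preserved while the phases are drawn at random. The following is a minimal sketch of that construction (the function name and test signal are illustrative, not from the original):

```python
import numpy as np

def phase_randomized_surrogate(x, rng=None):
    """One surrogate consistent with the linear Gaussian null:
    keep the amplitude spectrum of x, randomize the Fourier phases."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    # The amplitude spectrum carries the linear (autocorrelation) structure.
    spectrum = np.fft.rfft(x)
    # Random phases for the nontrivial bins; the DC bin (and the Nyquist
    # bin when n is even) must stay real for a real-valued surrogate.
    phases = rng.uniform(0.0, 2.0 * np.pi, size=len(spectrum))
    phases[0] = 0.0
    if n % 2 == 0:
        phases[-1] = 0.0
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=n)

# Usage: the surrogate shares the power spectrum of the original series.
x = np.cumsum(np.random.default_rng(0).standard_normal(512))
s = phase_randomized_surrogate(x, rng=np.random.default_rng(1))
assert np.allclose(np.abs(np.fft.rfft(s)), np.abs(np.fft.rfft(x)))
```

Because such surrogates also destroy any phase structure of the original process, a linear non-Gaussian, non-minimum phase sequence can already lead to rejection, which is the scenario examined here.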