The human visual system is able to quickly and robustly infer a wealth of scene information - the scene "gist" - after as little as 100 milliseconds of image presentation. Here, we investigated the ability to estimate the position of the horizon in briefly shown images. Judging the horizon position quickly and accurately helps in inferring viewer orientation and scene structure in general, and thus might be an important component of scene gist. In the first, perceptual study, we investigated participants' horizon estimates after a 150-millisecond, masked presentation of typical outdoor scenes from different scene categories. All images were shown in upright, blurred, inverted, and cropped conditions to investigate the influence of different information types on the perceptual decision. We found that despite individual variation, horizon estimates were fairly consistent across participants and conformed well to annotated data. In addition, inversion resulted in significant differences in performance, whereas blurring did not affect performance, highlighting the importance of global, low-frequency information for judging horizon position. In the second, computational experiment, we correlated the performance of several horizon-estimation algorithms with the human data - the algorithms ranged from simple detection of bright-dark transitions to more sophisticated frequency spectrum analyses motivated by previous computational modeling of scene classification. Surprisingly, the best fits to the human data were obtained with one very simple gradient method and with the most complex, trained method. Overall, global frequency spectrum analysis provided the best fit to human estimates, which, together with the perceptual data, suggests that the human visual system might use similar mechanisms to quickly judge horizon position as part of the scene gist.
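The abstract does not specify how the simple gradient method works; as an illustration only, a minimal sketch of one plausible bright-dark-transition estimator is shown below. It assumes a grayscale image and picks the row boundary with the largest average bright-to-dark luminance drop; the function name and all details are hypothetical, not taken from the study.

```python
import numpy as np

def estimate_horizon(image):
    """Hypothetical sketch: estimate the horizon row as the strongest
    bright-dark transition in a 2D grayscale image (rows x cols)."""
    row_means = image.mean(axis=1)        # mean brightness per row
    gradient = np.diff(row_means)         # brightness change between adjacent rows
    return int(np.argmax(-gradient))      # index of largest bright-to-dark drop

# Toy example: bright "sky" above row 40, dark "ground" below
img = np.full((100, 120), 0.9)
img[40:, :] = 0.2
print(estimate_horizon(img))  # → 39 (boundary between rows 39 and 40)
```

A frequency-spectrum-based estimator, by contrast, would operate on global image statistics rather than on a single luminance edge, which is consistent with the abstract's finding that low-frequency information suffices for the human judgment.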