Robustness is a desired property of many computer vision algorithms. Most existing efforts toward robustness have focused on naturally occurring changes, e.g., day/night transitions and various weather conditions. However, real-world challenges also arise from accidental situations that are not anticipated during training. In this paper, we address a practical multispectral fusion problem: unexpected image contamination in day and night conditions. Based on our observation that changing only a few parameters in the fusion module is sufficient to achieve good performance under such conditions, we propose a fault-tolerant training strategy for multispectral pedestrian detection under both normal and abnormal conditions. In extensive experiments on the KAIST multispectral benchmark, the proposed method reduces the performance degradation under unseen contamination by a large margin. Furthermore, our model shows performance comparable to state-of-the-art methods under normal conditions.