This study presents a novel data-driven approach to detecting desynchronization between biosignals from two modalities. We propose to train a deep neural network to learn synchronized patterns between the two modalities by transcribing signals from one modality into their expected, simultaneous counterparts in the other modality. Thus, instead of measuring the degree of synchrony between signals from different modalities with traditional linear and non-linear measures, we reduce the task to measuring the degree of synchrony between the real and the synthesized signals within the same modality using those same measures. Desynchronization is then detected by applying a threshold function to the estimated degree of synchrony. We demonstrate the approach on the detection of eye-movement artifacts in a public sleep dataset and compare its detection performance with that of traditional approaches.
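To make the pipeline concrete, the following is a minimal sketch in Python (PyTorch and NumPy) of the transcribe-then-compare idea. The `Transcriber` architecture, the use of absolute Pearson correlation as the synchrony measure, and the threshold value are illustrative assumptions, not the configuration used in this study; a real application would first train the network on recordings known to be synchronized.

```python
import numpy as np
import torch
import torch.nn as nn

# Hypothetical 1-D convolutional transcriber: maps a window of modality-A
# samples (e.g., EEG) to the expected simultaneous modality-B signal (e.g., EOG).
class Transcriber(nn.Module):
    def __init__(self, hidden=32, kernel=9):
        super().__init__()
        pad = kernel // 2  # "same" padding keeps the output length unchanged
        self.net = nn.Sequential(
            nn.Conv1d(1, hidden, kernel, padding=pad),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel, padding=pad),
            nn.ReLU(),
            nn.Conv1d(hidden, 1, kernel, padding=pad),
        )

    def forward(self, x):  # x: (batch, 1, samples)
        return self.net(x)

def pearson_sync(a: np.ndarray, b: np.ndarray) -> float:
    """Degree of synchrony between two same-modality signals, measured
    here (as an assumed stand-in) by absolute Pearson correlation."""
    return float(abs(np.corrcoef(a, b)[0, 1]))

def detect_desync(model: Transcriber,
                  sig_a: np.ndarray,
                  sig_b: np.ndarray,
                  threshold: float = 0.5) -> bool:
    """Transcribe modality A into a synthesized modality-B signal, then
    flag desynchronization when the real/synthesized synchrony falls
    below the (assumed) threshold."""
    with torch.no_grad():
        x = torch.from_numpy(sig_a).float().view(1, 1, -1)
        synth_b = model(x).squeeze().numpy()
    return pearson_sync(sig_b, synth_b) < threshold

if __name__ == "__main__":
    # Toy demonstration on synthetic, phase-locked sinusoids; an untrained
    # model is used here only to show the data flow end to end.
    model = Transcriber()
    t = np.linspace(0, 2 * np.pi, 512)
    sig_a = np.sin(5 * t)            # modality-A window
    sig_b = np.sin(5 * t + 0.3)      # its synchronized modality-B counterpart
    print("desynchronized:", detect_desync(model, sig_a, sig_b))
```

The key design point the sketch illustrates is that once the network has learned the cross-modal mapping, the downstream synchrony measure and threshold operate entirely within a single modality, so any conventional measure can be dropped into `pearson_sync` unchanged.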