Low social rhythm regularity predicts first onset of bipolar spectrum disorders among at-risk individuals with reward hypersensitivity.
Abstract
The social zeitgeber model (Ehlers, Frank, & Kupfer, 1988) proposes that irregular daily schedules, or social rhythms, confer vulnerability to bipolar spectrum disorders. This study tested whether social rhythm regularity prospectively predicted first lifetime onset of bipolar spectrum disorders in adolescents already at risk for bipolar disorder based on exhibiting reward hypersensitivity. Adolescents (ages 14-19 years) previously screened to have high (n = 138) or moderate (n = 95) reward sensitivity, but no lifetime history of bipolar spectrum disorder, completed measures of depressive and manic symptoms, family history of bipolar disorder, and the Social Rhythm Metric. They were followed prospectively with semistructured diagnostic interviews every 6 months for an average of 31.7 (SD = 20.1) months. Hierarchical logistic regression indicated that low social rhythm regularity at baseline predicted greater likelihood of first onset of bipolar spectrum disorder over follow-up among high-reward-sensitivity adolescents, but not moderate-reward-sensitivity adolescents, controlling for follow-up time, gender, age, family history of bipolar disorder, and initial manic and depressive symptoms (β = -.150, Wald = 4.365, p = .037, odds ratio = .861, 95% confidence interval [.748, .991]). Consistent with social zeitgeber theory, low social rhythm regularity confers vulnerability to first onset of bipolar spectrum disorder among at-risk adolescents. It may be possible to identify adolescents at risk for developing a bipolar spectrum disorder, before onset occurs, based on their exhibiting both reward hypersensitivity and social rhythm irregularity.