Results
Permanent snow cover date, i.e., 100% snow cover without melting until the spring, was variable across our autumn seasons, occurring almost 3 weeks later in 2015 (November 3rd) than in 2016 (October 16th) and 2017 (October 17th). Completion of snowmelt date, i.e., no more snow on the ground, was similar across study years (May 6th 2015, May 1st 2016, May 2nd 2017 and May 1st 2018). When considering both seasons and all years together, the prevalence of coat colour-mismatched hares, i.e., hares that contrasted with their snowless environment, was low in our population (14% of trapping records). Mismatch occurred more frequently in the autumn (19% of trapping records) than in the spring (8% of trapping records). The autumn with the latest permanent snow cover arrival date, i.e., 2015, had the highest prevalence of mismatch (33% of records). Prevalence of mismatch in the autumns of 2016 and 2017 was 10% and 13% of trapping records, respectively. Spring mismatch was consistent across years at around 10% (9% of trapping records in 2015, 10% in 2016 and 12% in 2018), with the exception of 2017, when only 1% of trapped hares were mismatched.
Effect of coat colour mismatch on mortality
The CPH model with the strongest support in both seasons included snow depth, snow cover and mismatch (Tables S11, S12 & S13). However, the second highest-ranking CPH model for spring, i.e., the model including only snow variables, was within 2 ΔAICc (ΔAICc = 0.09) of our top spring CPH model (Table S11). Mortality risk for mismatched hares in autumn was significantly reduced (z = -2.43; P = 0.02) relative to matched hares (hazard ratio (HR) = 0.135; 95% confidence interval (CI): 0.027, 0.679; Fig. 1a). In contrast, coat colour mismatch was positively correlated with mortality risk for hares in the spring (Fig. 1b), but this effect was non-significant (z = 1.60; P = 0.11). Models were qualitatively similar regardless of our classification of mismatch, except when considering mismatch as a minimum 40% contrast between coat colour and snow cover; in this case, mismatch significantly increased mortality risk in the spring (HR = 6.780; 95% CI: 2.390, 19.240; z = 3.60; P < 0.001). Snow depth (z = -2.29; P = 0.02) and snow cover (z = 2.98; P = 0.003) significantly affected mortality risk in the top spring model, but not in the top autumn model. In spring, mortality risk decreased as snow depth increased (HR = 0.95; 95% CI: 0.92, 0.993; Fig. S1a) and increased as snow cover increased (HR = 1.046; 95% CI: 1.01, 1.08; Fig. S1b).
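To make the survival analysis concrete, the sketch below shows how a Cox proportional hazards model with these three covariates could be fit in Python using the lifelines package. The data, column names, and effect sizes are hypothetical illustrations, not the authors' data or code; the point is only how hazard ratios such as those reported above arise from the fitted coefficients (HR = exp(coef)).

```python
# Illustrative sketch only: the dataset and effect sizes are simulated,
# not the authors' analysis (which used CPH models ranked by AICc).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200
snow_depth = rng.uniform(0, 30, n)       # cm
snow_cover = rng.uniform(0, 100, n)      # %
mismatch = rng.integers(0, 2, n)         # 1 = coat contrasts with background

# Simulated survival times under an assumed effect structure (illustration only).
hazard_rate = 0.01 * np.exp(-0.05 * snow_depth + 0.02 * snow_cover - 1.0 * mismatch)
event_time = rng.exponential(1.0 / hazard_rate)
censor_time = rng.uniform(30, 180, n)    # end of the monitoring season

df = pd.DataFrame({
    "days_at_risk": np.minimum(event_time, censor_time),
    "died": (event_time <= censor_time).astype(int),
    "snow_depth": snow_depth,
    "snow_cover": snow_cover,
    "mismatch": mismatch,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_at_risk", event_col="died")
cph.print_summary()  # coef, exp(coef) = hazard ratio, z, p, and 95% CI per covariate

# exp(coef) is the hazard ratio: an autumn mismatch HR of 0.135, as reported above,
# corresponds to roughly an 86% lower instantaneous mortality risk relative to
# matched hares, holding the snow covariates constant. Candidate models can then
# be compared by AIC/AICc, as in the model ranking in Tables S11-S13.
```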
Effect of coat colour mismatch on foraging time
Across our study years, hares foraged on average 706 ± 2.29 minutes per day in the spring and 751 ± 1.65 minutes per day in the autumn. Coat colour mismatch was an important predictor of daily foraging time in the autumn, but not the spring (Tables S14 & S15). The top model for autumn foraging time included coat colour mismatch, temperature, year, and the interaction between temperature and mismatch (Table 1). As autumn temperature decreased, mismatched hares decreased daily foraging time, whereas matched hares increased foraging time (Fig. 2a; Table 1). For instance, when the temperature was -8 °C, brown-matched hares foraged 65 minutes more per day than white-mismatched hares (Fig. 2a). The top model for spring included temperature, year, and sex (Table 1). When coat colour mismatch was included in our spring foraging models, its effect on daily foraging time was non-significant (t = -0.759, P > 0.05).
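A minimal sketch of how an interaction model of this kind could be fit in Python (statsmodels) is shown below, assuming a linear mixed model with a random intercept per individual to handle repeated daily measurements; the variable names, random-effect structure, and simulated data are assumptions for illustration, not the authors' model or dataset.

```python
# Illustrative sketch only: simulated data, not the authors' foraging analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_hares, n_days = 30, 40
hare_id = np.repeat(np.arange(n_hares), n_days)
temp = rng.uniform(-15, 10, n_hares * n_days)              # daily autumn temperature (°C)
mismatch = np.repeat(rng.integers(0, 2, n_hares), n_days)  # 1 = white coat on snowless ground
year = np.repeat(rng.choice([2015, 2016, 2017], n_hares), n_days)

# Assumed effect structure mirroring Fig. 2a: matched hares forage more as it gets
# colder (negative temperature slope), mismatched hares forage less (positive slope).
indiv = np.repeat(rng.normal(0, 20, n_hares), n_days)      # individual variation
foraging = (740 - 2.5 * temp + 5.0 * temp * mismatch
            + indiv + rng.normal(0, 30, n_hares * n_days))

df = pd.DataFrame({"foraging_min": foraging, "temp": temp, "mismatch": mismatch,
                   "year": year, "hare_id": hare_id})

# Random intercept per hare; the temp:mismatch term captures the opposing slopes.
model = smf.mixedlm("foraging_min ~ temp * mismatch + C(year)", df, groups=df["hare_id"])
result = model.fit()
print(result.summary())
```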