
Image from Garance Doré
So fall is here. I think we all noticed that this morning when we walked out the door.
But for some reason I'm still scared of taking my boots out of my winter closet, because then it becomes a fact: it's cold again.
Personal fears aside, though, when do you guys think is the right time to start wearing boots again? Is it the beginning of fall? Do you just wear them all year long? When do you put them back in the closet for good?
Answers in the comment section!