Today is Labor Day, an occasion first observed 125 years ago to honor America's workers. In the intervening years, Labor Day came to symbolize the end of summer. Somewhere along the way, Southern women adopted a tradition that white shoes should be worn only after Easter and before Labor Day.
So where does this leave women in Florida? Is this a Southern state, or not? A common generalization is that the farther south you go in Florida, the farther north you are. One professor I know sometimes muses that the line between the two is marked by whether you are served grits or a croissant for breakfast. I'm wondering -- could the line be drawn by whether or not women wear white shoes after Labor Day?
I don't want to get into North vs. South debates -- we're all part of Florida's culture and heritage. But I am curious -- will women in Tallahassee be wearing white shoes this week? How about women in Miami?