To my understanding, through most of history, both genders worked, and to the extent any work paid at all, both genders did paying work. In hunter-gatherer times, both men and women obtained food. In most agricultural societies, both men and women were involved in growing crops and/or tending livestock; where the work was not shared, generally either women did all of it, or women did other work, such as spinning and weaving, that was at least sometimes sold for money. Even for most of the industrial era, if you were wealthy enough not to work, neither gender worked; if you were poor enough to need to work, both genders generally had paying jobs. There was, perhaps, a brief snapshot of time (possibly as wide as the 1920s to the 1970s, possibly much narrower) in which any substantial fraction of households in the Western world followed a model where men did paying work outside the home while women did unpaid work inside it.
Yet, to hear social conservatives tell it, you'd think that brief snapshot somehow represents all of history, or possibly some sort of natural law: that the natural way of things is for men to be "providers" while women are housewives.
What gives?