My mother’s way of judging “eat-by” dates of most food was to ask two questions: how much mold is on it? (A sub-question: can you scrape it off?) And, how does it smell? She’d majored in biology at UNC and worked professionally in a number of bio-chemical labs, so she didn’t lack scientific knowledge of what was going on in the food decomposition process. What she had, and what I’m eternally grateful for, was a reasonable approach to risk when it comes to food, and an understanding that our senses are a pretty reliable guide to what’s safe, provided they are trained and you understand the situations in which they are not reliable.
Training is crucial, and by training I don’t mean just eating everything and seeing what makes you sick. You could do this – we basically have done this collectively as a species – but I don’t recommend it. I mean, rather, that you look at and smell something, make your best guess as to whether it’s ok, then before you eat it you ask someone who knows better than you – because they have the relevant experience – whether you’ve guessed right. That way you get confirmation or not of your initial judgment. Do this often enough, and your ability to judge for yourself emerges in due course. (Lots of education, both formal and informal, works this way, of course – it’s more-or-less how we perpetuate all sorts of culturally entrenched knowledge – which is one of the reasons why replacing teachers with computers and remote graders is, for many kinds of learning, a bad idea.)
Thanks to my mom, then, I’m comfortable sniffing and eyeballing my way through the leftovers in the fridge. It helps that I have a good sense of smell, even after years of allergies. I’m frequently told by my wife: “here, smell this,” to see if something is good. (She appreciates this when it comes to food; not so much when it runs afoul of her relentless refusal to stick to approved shampoos, lotions, etc. And as an aside: why oh why must people smear or mist themselves with lavender-scented concoctions? It’s cruel and inhumane.)
As an opposite extreme from mine, consider the case of one of my wife’s students from a few years ago. In a conversation about food just as she was heading off to medical school, she said that in her house growing up, leftover food that had been in the fridge more than three days was automatically thrown away. If that’s the kind of house you grow up in, there’s no way your judgment can be formed in such a way that you can trust yourself to make good assessments about what’s safe. You won’t have any way of knowing when something is too moldy or smells off (except in the really obvious cases).
I mention this student here because this issue of trusting your senses opens up into a whole host of issues about food safety and risk and the many factors in our society that shape how we approach them. The understanding of what’s “safe” that this young woman got growing up may well come to inform her professional judgments as a doctor, where they will have the stamp of scientific/medical authority because of her position and training. Yet what they reflect is not science but instead the particular risk culture she grew up in: one in which she didn’t learn to think and judge for herself about what poses a risk, and in which everything was instead subsumed under a single, overly cautious rule.
That’s more-or-less the standard risk culture Americans grow up in when it comes to food (at least if they have enough money to be able to afford to throw food away). And it’s part of a general risk culture in which the microbial world is seen as an ever-present threat that needs to be annihilated: hence the prevalence of anti-bacterial soaps, Lysol, and a host of other products designed to cleanse and purify our homes. But there’s a kind of positive feedback loop here: people grow up in a culture in which risk is understood a certain way; they then take that perception of risk into their medical, scientific and bureaucratic work; they conduct research framed by their background assumptions and issue “professional” rules and recommendations that then shape how others make judgments about risk. One effect of this is that a certain non-scientific set of practices gets the imprint of Real Science, and anyone who wants to question what’s really a risk or how we should respond to it then finds her or himself automatically perceived as a reactionary, unscientific outsider.
The wrong lesson to draw from this would be that all the professional rules and recommendations are wrong, that medical science can’t be trusted, or that everyone is equally capable of judging what’s risky for themselves. As I’ve said, you’ve got to be educated in the right way to be capable of judging for yourself. (So as not to be misunderstood: I think everyone should be free to take whatever risks they want, as long as they don’t impose undue risks on others in the process. But that doesn’t mean everyone is equally capable of assessing risks.)
No, the lesson instead is this: we (all of us) need to ask whether pronouncements coming from scientists, doctors, and bureaucrats are merely reflective of a risk culture that they themselves are blind to (we all tend to be blind to the feedback loops that shape what we think is reasonable), or whether they are instead ones that reflect expert understanding unavailable to us laypeople — for there really is such a thing as expert understanding, even if it is often corrupted by cultural bias, monetary influence, or other factors, and even if it is always less than fully certain.
That gets to the second thing I mentioned at the outset, the need to recognize when our senses can’t possibly be trained to detect a danger. Botulism is one clear example. If my home-canned tomatoes smell bad when I open them, that’s because the seal was imperfect and air could get in to allow nasty stuff to grow. But I can’t see or smell botulism. To understand when there is a risk of it and what to be alert for, I need to read what the microbiologists say, or the canning experts who have done that work for me. Likewise for many issues with meat. A huge percentage of grocery-store chicken is contaminated with salmonella (new rules aim to reduce this from 50% or so down to about 15%). I can’t see or smell that, so I need to rely on experts to know when it’s likely to be present and how to handle it. (That there’s so much of it is, of course, no small indictment of our industrial meat system. I’m curious about, but I don’t know, what bacterial loads are on the meat processed by our smaller local facilities, those through which the meat at most of the farmers’ markets passes.) With the ham I cured, and the salamis I’ve since done, I did some homework to see what the risks are and how to protect against them, and for that I had to rely on people who have expert knowledge I lack.
And, despite what may sound like a pretty cavalier approach to food safety, my mom was good about this, making sure we knew when not to trust our senses and how to deal with risks they couldn’t inform us about. The first canning book she gave me as an adult is the very cautious Putting Food By. But because she allowed me to learn to trust my senses in many cases, I’m in a better position to judge when an author is likely being overly cautious or not – and I can do the research to figure out whether there is real science behind a claim about risk or just the prejudices of a risk culture.
Bringing up the idea of a risk culture, and of how our professional scientists and bureaucrats play into that, opens a huge can of worms. (Never having had canned worms, I don’t know how to judge if they’re safe to eat, and I’m not sure what expert opinion is on this.) I’ll examine some of these worms in the future.
Two last thoughts here though, which will steer much of what I explore on this topic: scientists are by nature and training suspicious of culturally transmitted understanding of the sort my mom gave me of how to judge food safety. And not without some reason, for culture is capable of transmitting a whole lot of stupidity just as easily as it can transmit wisdom. Scientists, however, are generally pretty blind to the fact that science itself relies on many of the same mechanisms of belief transmission as those areas of life they are suspicious of (religion, folklore, etc.), and to the fact that their own views are often in unconscious ways shaped by non-scientific forces. (I’m in the middle of reading an excellent book by Drs. Jerome Groopman and Hartzband on how this plays out in medicine, Your Medical Mind, which also looks at how our individual dispositions towards risk as patients figure in appropriate decision making.) Science, when done well, subjects everything to scrutiny, tests beliefs as rigorously as it can, and gives up on those for which there is no evidence. But there are always cultural background forces at work in determining what needs to be scrutinized or is legitimate to believe. Scientists are also jealous of their cultural authority – and in these days when good science has an increasingly small impact on public policy, most notably, perhaps, in climate change and vaccination debates, this is understandable – and so they frequently fail to be open to the possibility that with many things, culture can be a reliable guide to belief and action. This was essentially the point of Michael Pollan’s book In Defense of Food, which argued that we’re better off picking any food tradition developed over a long period of time by a particular culture than we are relying on nutrition science to tell us what to eat.
For nutrition science is framed by a crucial assumption about what food is: a collection of individually isolable nutrients. While useful up to a point, that’s an assumption that frames and guides empirical research but represents a cultural choice about how to think about food. A similar assumption governs a great deal of agricultural science, as Pollan and others have also explored. There’s nothing unscientific about thinking about the farm as a holistic ecosystem embedded in a larger ecosystem, but most ag science treats individual elements rather than wholes, and so thinks in terms analogous to the nutritionist’s: what elements can be added or subtracted in what quantities in order to generate optimal output?
Lastly: in the end, the most science can do is provide us with good information about levels of risk: what are the chances that in doing X bad outcome Y will result? But that’s a much different question than whether I should be allowed to do X. So, for instance, science can, in theory at least, tell us what the chances are of getting sick from drinking raw milk or eating young raw milk cheeses by, on the one hand, assembling population-level data that correlates illness and consumption patterns, and, on the other, studying the microbiology of milk as it’s produced and stored in different environments. Anecdotal evidence we acquire from those we happen to know isn’t a good substitute for this. But even assuming that the chance of getting food poisoning from raw milk is higher than from other foods (an assumption most raw milk advocates would, of course, deny), that by itself doesn’t mean that we shouldn’t be allowed to drink it. Europe allows raw milk products because it’s pretty well known what the risks are, and Europeans are comfortable with those risks. Americans see things differently. We have different risk cultures when it comes to raw dairy.
There’s also, of course, the felt need of food safety bureaucracies to establish their power (as well as the influence that industrial ag interests exert over them), which leads them to target raw milk; but the perceived legitimacy of that targeting piggybacks on the “scientific” assumption that it’s dangerous.
My nose has, it seems, led me into the thorny questions about what the proper scope of government is, what role regulation should play in our risk culture, and how moneyed interests shape the regulatory culture. But I’ll let those (enormous) questions ripen a little before I take another sniff at them.