If you want food system 'insight' but don't want to use a professional research agency, you probably need to read this...
- Grounded Research

- Dec 19, 2025
- 5 min read
Updated: Jan 4
We are going into 2026 with fragile foundations for what is true in a fast-changing UK food system. It is tempting to reach for the Typeform or the Mailchimp...even the MS Forms...and see what's cooking. You might need a good headline. You might need a temperature check before jumping into a big decision.
Give me a call, I can help.
No budget for insight? No problem - do it solo, but read this before you start writing...
Surveys have a strange place in the marketing mix. They look simple, almost democratic: anyone can ask a question, anyone can answer one. And that’s true, technically. But the gap between asking questions and producing insight is where many well-meaning surveys quietly fall apart.
And to be fair, sometimes a survey is there to do a job. You might need a headline for a press release, a stat to anchor a comms campaign, or a line that helps an issue cut through. That’s a legitimate use of surveys - I was a marketer before I was a researcher, no judgement - this is how a lot of people encounter research in the wild. But it’s important to be honest about what you’re doing in an industry that is already a mass of uncertainty - adding to it helps no one.
If your survey is primarily a marketing or communications tool, the rules are different. If, however, you’re calling it research - something that will inform policy, strategy or big decisions - then the bar is much higher. And that’s where it’s worth slowing down and reading on.

Here are some common mistakes, with examples, that we have seen pop up in the environment/agriculture/food space recently - particularly when things are getting a bit heated!
Emotion creeps into your questions
This matters most when the topic is emotive. Policy change. Personal finances. Identity. Livelihoods. When people care deeply, it’s easy for a survey to drift from research into something closer to campaigning, often without the author even realising it.
One of the most common issues I see is language that feels emotionally accurate but analytically risky. Words that reflect how a situation feels can also nudge respondents toward a particular answer. The moment a question starts to sound like a statement, you stop measuring opinion and start shaping it. The data may feel affirming, but it becomes harder to defend, especially to people who don’t already share your perspective.
“How unfair do you think the recent changes are for people working in this sector?”
“To what extent do you feel this policy is damaging livelihoods?”
“Do you agree that this decision shows a lack of understanding of our industry?”
To assume is to make an ass of yourself and your analytics
There’s a similar problem with cause and effect. Surveys often assume that because something happened, it must be the reason people are thinking or behaving differently. But human decision-making is rarely that neatly packaged. A good survey doesn’t jump straight to attribution; it carefully separates awareness, interpretation and action. Without that separation, it’s very easy to overstate impact and draw conclusions the data can’t really support.
“Since the recent policy announcement, have you changed your business plans?”
“As a result of these changes, are you now reconsidering future investment?”
“How has this decision affected your confidence in the future of the sector?”
Quick and dirty questions can give quick and dirty data
Another quiet trap is the overuse of yes/no questions. They’re efficient, but they collapse nuance. When someone says they’ve “considered” doing something, that could mean anything from a fleeting thought to a serious plan. Those distinctions matter, especially when results might inform policy, investment or public debate. Binary answers often look clean in a spreadsheet, but they can hide more than they reveal.
“Have you considered reducing investment in the next 12 months? Yes / No”
“Have you thought about making changes to your business as a result? Yes / No”
“Are you now less likely to expand? Yes / No”
Trying too hard, doing too much
Clarity matters too. Questions that try to do too much at once - asking about two things, or three, or bundling action with judgement - create data that’s impossible to interpret cleanly. Respondents do their best, but the ambiguity remains baked into the results. How a question will be interpreted should be carefully tested before it goes anywhere near a mass audience.
“Have you reviewed your succession plans and made changes as a result?”
“Do you feel the policy is unfair and poorly communicated?”
“Have you delayed or cancelled investments because of uncertainty and lack of trust?”
The road to hell (or commercial confusion!) isn’t paved with good intentions – just misinterpreted buying intentions
Questions about what farmers are “likely to buy” feel commercially powerful. But on farms, intention is not behaviour – it is conditional, provisional and easily derailed by weather, cashflow, policy or supply.
Take a typical example:
“Which of the following are you likely to buy in the next 12 months?” (Clothing, feed, machinery, software, chemicals…)
Ticking all of these doesn’t mean five real purchasing decisions. It signals identity – this is the kind of farmer I am – not budget, timing or authority.
“How much influence do you have over buying decisions on farm? (0–10)”
On most farms, influence varies wildly by category. Machinery is not seed; feed is not software. A single score flattens governance into ego.
“Thinking about the last two years, which product or service would you recommend?”
Recency bias takes over. Big, emotional purchases crowd out the quiet decisions that actually shape farm performance.
These questions aren’t malicious – they’re written with good intent. But when aspiration is mistaken for readiness, organisations start planning for demand that never materialises. The data looks confident. The market never quite behaves the way the spreadsheet promised.
Dodge political questions like a politician
Political questions add another layer of complexity. Asking people to declare voting intentions, protest behaviour or allegiance can trigger defensiveness or socially desirable answers. Sometimes those questions are necessary, but often there are better ways to understand alignment, trust or openness without pushing respondents into corners they’d rather avoid.
“Would this policy change affect how you vote at the next election?”
“Have you taken part in protests or demonstrations about this issue?”
“Do you support the government’s approach to this policy? Yes / No”
Seeking validation has no place in research
Perhaps the biggest risk, though, is when a survey slowly drifts away from curiosity and towards validation. When the questions themselves signal what the “right” answer looks like, respondents pick up on it. At that point, the survey may still collect responses, but it stops collecting insight. And while that might be useful for mobilisation, it rarely stands up to scrutiny or leads to better decisions.
“Do you agree that this policy will have serious long-term consequences?”
“How strongly do you oppose the proposed changes?”
“Should more be done to stop this policy from going ahead?”
A final sense check...
One simple safeguard: before launching a survey, ask whether someone who disagrees with you would still feel able to answer it truthfully. If the answer is no, the data will almost certainly be weaker than it looks.
The irony is that most of these mistakes come from good intentions. People care. They want to be heard. They want to capture urgency. But good research isn't about diluting strong feeling; it creates space for it to be expressed honestly, without steering or assumption.
If your survey needs to influence policy, inform strategy or hold up under public examination, neutrality isn’t a nice-to-have. It’s the foundation. How you ask the question matters just as much as the answer you’re hoping to hear. It takes years of practice to get right, and even those who have been at it for years and hold a clutch of certificates to prove it don't always get it spot on.
If you made it this far, you are serious about research. So even without budget, you can still give me a call - I don't mind helping out with some pointers in the name of good research for this sector! clare@groundedresearch.co.uk