Publicly funded advice and support can have a positive impact on businesses, but performance is mixed across schemes, with some delivering little, if any, benefit. Better design and evaluation will improve the cost-effectiveness of business advice schemes.
- Programme design and evaluation need to improve if we want to identify what works in business advice. Evaluation should be embedded in programme design from the outset.
- When designing a programme, local policymakers should identify one or two clear programme objectives, and then identify outcome measures that are clearly related to the programme objectives and feasible to measure.
- Programmes that use a hands-on, ‘managed brokerage’ approach may perform better than those using a light-touch approach. It is not clear, however, which of the two approaches is more cost-effective.
- There is a need for improved evaluation and greater experimentation – identifying how different elements of the programme design contribute to outcomes, and the value for money of different approaches.
- More analysis of cost-effectiveness is needed when evaluating the impact of business advice – as well as consistency across analyses. Only five of the 23 shortlisted studies included cost-benefit analysis, and not all of these used measures that are comparable across studies.
About the research
The provision of publicly funded advice, mentoring and support to businesses – particularly to entrepreneurs and small businesses – is widespread in OECD countries. These interventions typically aim to increase rates of firm creation, improve business survival and promote business productivity and employment growth, by providing impartial, free or subsidised advice and mentoring.
The What Works Centre for Local Economic Growth has carried out a systematic review of evaluations of business advice, to explore the impact of policy interventions and establish how cost-effective they are in supporting business.
The review includes evaluations that meet defined quality standards, comparing ‘before and after’ outcomes for firms receiving treatment against a ‘counterfactual’ comparison group of firms without business advice – to estimate what would have happened to these outcomes in the absence of treatment. The evidence reviewed includes four randomised controlled trials – set up to ensure that control and treatment groups are truly comparable.
- Business advice had a positive impact on at least one business outcome in 14 out of 23 evaluations.
- In most cases, programmes had vague or multiple objectives, which makes measuring success difficult.
- No strong differences in results were found between programmes with multiple objectives and programmes with more focused objectives.
- Business advice programmes show somewhat better results for sales than they do for employment and productivity, but results are generally mixed.
- Programmes that used a hands-on ‘managed brokerage’ approach may perform better than those using a light-touch approach (such as providing advice through a website) – although this conclusion is based on only one comparative study, and hands-on schemes are also more expensive.
- There was no evidence to suggest that one level of delivery – national or local – is more effective than the other.
- It is difficult to reach any conclusions about the effectiveness of public-led versus private-led delivery.
- Detailed local knowledge and context remain crucial. The review findings do not address the specifics of ‘what works where’ or ‘what will work for a particular firm’.
The report Evidence review: business advice, produced by the What Works Centre for Local Economic Growth, presents a systematic review of business information, advice and mentoring programmes (‘business advice’), assessing their effectiveness in improving firm performance in terms of productivity, employment and other performance measures.
- See also the WWG toolkit for business advice