This detailed costing of the development of an implementation intervention to increase attendance at diabetic retinopathy screening has yielded a total cost estimate of €40,485, 78% of which is attributable to the time contributed by researchers, patients, PPI participants and healthcare practitioners to the various development activities. These results highlight that intervention development requires a significant amount of human capital input, combining research experience, patient experience and expert knowledge in relevant fields.
Comparison with existing studies
As evident from the few studies which have costed both the direct and indirect costs of intervention development, the cost varies depending on the nature of the intervention and the approach to development. Our findings align with existing studies, which also reported most costs were attributable to personnel [
40], although total costs varied substantially with the approach to development (i.e. size of the team and type of personnel involved) and the nature of the intervention (i.e. extent of media production and external expertise required). For example, using a similar development process to IDEAs, Mortimer et al. [
13] developed an intervention (GP workshops in person or via DVD recording, supplemented with information packs) to support the implementation of clinical guidelines for lower back pain in general practice, reporting total costs of $83,456. Their approach involved substantial work with stakeholders, including the formation of GP advisory groups, focus groups, and subsequent analysis and interpretation. Most costs (66%) were attributable to person-hours contributed by the administrative/research personnel responsible for coordinating and conducting the development work. In comparison, Lairson and colleagues [
40] estimated the cost of developing a tailored interactive computer programme to support uptake of colorectal cancer screening to be $328,866 [
8], with personnel costs contributing about 69% of total costs (researchers (46%), video and software developers (24%)). Schuster et al. [
41], who developed a tailored health literacy intervention (including an educational DVD and guidebook) delivered by community health workers to improve uptake of cervical and breast cancer screening, estimated the cost of this process to be $121,817, with 84% attributable to staff salaries, including researchers, programmers, actors, graphic designers and a photographer/videographer. Similar to our approach, these existing studies costed development using financial records, receipts and invoices [
13,
40,
41], administrative records (e.g. details of the venues used and number of participants) [
13,
41] and staff wages [
13,
40,
41]. Mortimer et al. [
13] retrospectively consulted research project staff to obtain information on the process, using this to estimate the proportion of whole-time posts spent on development. In contrast, Lairson et al. used weekly staff time logs as the work progressed, with reminders to prompt accurate logging; where logs were not maintained (approx. 20%), time was estimated retrospectively [
40].
Implications
Given the substantial time committed to intervention development, this step should not be overlooked and instead should be considered and costed as part of the economic analysis of new interventions. Ideally, this would inform appropriate funding for such activities, provide estimates of opportunity costs, and ensure the sustainable development of high-quality interventions. Emphasis on conducting economic evaluations earlier in the intervention lifecycle is increasing [
4,
42]. In particular, rather than only considering costs at the full trial stage, it is considered good practice to collect relevant data and understand costs at the development and feasibility stages [
42], facilitating informed decisions at this point. We found costing from the funder perspective to be most practical and appropriate at the development stage. However, a broader perspective would be more relevant for a full economic evaluation to determine cost-effectiveness as part of a definitive trial, that is, one taking account of all costs and potential outcomes with a value to society (e.g. avoided productivity losses, blindness and premature death). In theory, a societal perspective should be adopted, though we acknowledge that in practice national guidelines must be followed, and these often require only a health care provider perspective.
In terms of approaches to incorporating development costs, studies which have costed the development stage typically conducted analyses both with and without these costs included, in the former case amortising (i.e. gradually writing off or spreading) these costs [
13,
40] over the number of individuals expected to utilise the intervention [
13,
40], and over the expected life of the intervention [
40], or evaluating costs and outcomes over different time horizons which allowed for development time and costs [
41]. Incorporating development costs has the potential to affect findings on cost-effectiveness, as illustrated by Mortimer and colleagues, who found their intervention to support guideline uptake was more effective and more costly than the standard approach to guideline dissemination, but only when development costs were included. Unlike research and development within the pharmaceutical industry [
43], the cost of developing interventions to support implementation cannot be recouped through patent-protected profits. However, as outlined by Mortimer et al., these costs might be set aside as ‘sunk costs’ when there is potential for repeated use in other settings and among other populations [
13]. The aim of health services interventions, typically developed through public funding (i.e. research grants and supported by public sector academics), is not to profit but to improve patient outcomes. However, interventions should still be cost-effective to ensure such funding is allocated efficiently and effectively, and to aid decisions about how best to utilise it. The costs of development should still be considered by those developing and trialling such interventions so as to inform research priority decisions, particularly early in the intervention’s lifecycle, even if these costs are not included in the final CEA for the health care funder deciding whether or not to adopt the intervention. For example, information about the substantial investment required to develop interventions should be factored into decisions about whether to develop de novo interventions or use existing ones, potentially saving resources. However, interventions introduced into different contexts may have different effects [
44]. Even if an existing intervention can be utilised, time and effort will still need to be invested to adapt it for a different context. In their systematic review to inform guidance on adaptation, Movsisyan et al. identified 11 key adaptation steps, which included conducting a population needs assessment and obtaining stakeholder input [
44]. Although we did not adapt an existing intervention, our findings illustrate that even within an area with numerous existing interventions (i.e. enhancing DRS attendance [
45]), and with existing evidence to draw on, following a recommended systematic process to enhance the ‘fit’ with the health care context still requires substantial resources.
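The amortisation approaches described above (spreading a one-off development cost over expected users, or annuitising it over the intervention's expected life) can be illustrated with a minimal sketch. Only the €40,485 total comes from this study; the number of users, lifespan and discount rate below are hypothetical placeholders, not figures from the cited studies.

```python
# Hypothetical illustration of amortising intervention development costs.
# Only the total development cost (EUR 40,485) is from this study; all other
# figures are invented for the example.

def cost_per_user(dev_cost: float, expected_users: int) -> float:
    """Spread a one-off development cost over the users expected to receive it."""
    return dev_cost / expected_users

def annualised_cost(dev_cost: float, life_years: int, discount_rate: float) -> float:
    """Annuitise a one-off development cost over the intervention's expected
    life using the standard equivalent-annual-cost (annuity) formula."""
    annuity_factor = (1 - (1 + discount_rate) ** -life_years) / discount_rate
    return dev_cost / annuity_factor

dev_cost = 40_485.0  # total development cost reported in this study
print(cost_per_user(dev_cost, 10_000))     # cost per expected user (hypothetical 10,000 users)
print(annualised_cost(dev_cost, 5, 0.04))  # equivalent annual cost over a 5-year life at 4%
```

Either denominator (users or years) materially changes the per-unit cost added to a CEA, which is why the cited studies report results with and without development costs.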
The variation in development costs across existing studies reflects the fact that there are several different ways to develop interventions, with no single best approach. O’Cathain and colleagues, through a systematic review of the development literature, identified 8 different types of approaches and 18 actions to be considered within those, including deciding who to involve in the development process and whether to prioritise working with stakeholders or focus on theory [
15]. One of the high-cost activities was the patient and professional interviews; time spent on qualitative interviews with HCPs contributed 18% (€4251) to the total cost of intervention development. Though conducted prior to the development work, we included these interviews in our core development costs because a key part of development is understanding the barriers and facilitators to the behaviour of interest. Often this takes the form of interviews [
46,
47]. It is important to note these interviews were conducted as part of a broader evaluation to understand the implementation of the national clinical programme for diabetes in Ireland, including the establishment of a national DRS programme [
48]. Arguably, the time and cost required to conduct these interviews may have been shorter if the topic was restricted solely to the DRS programme. Though a previous systematic review of barriers and facilitators to DRS attendance was available [
49], the lack of interviews in the Irish setting meant the review risked missing context-specific barriers in our setting. Furthermore, the results of this previous work enabled the research team to consider known barriers and facilitators and make informed decisions during intervention development. This raises an important point, namely that costing development, and raising awareness of such costs, should not prompt developers to engage only in processes which can be streamlined or which are deemed most efficient. It is important to consider which development approach is most appropriate and feasible given the healthcare context, stakeholder expectations and resource constraints. We used what O’Cathain and colleagues have classified as a stepped, theory- and evidence-based approach to development. Other approaches, such as partnership approaches, comprise greater involvement from end-users, often with an emphasis on co-creating knowledge, and can be less linear in nature [
15] and may be less time efficient. However, there could be downstream gains from using such approaches, including the development of an intervention which is a better ‘fit’ for the context and requires fewer adaptations and changes as the process moves from feasibility to efficacy studies. If more and different development approaches were costed and considered in light of the broader project (presented alongside a CEA, for example), this would yield further insights about the best development approach for a given context.
Our scenario analyses highlight alternative approaches for the development of future interventions. We considered non-essential and ‘value-added’ costs in different scenarios, highlighting these activities in terms of their contribution to the total cost of developing this intervention and demonstrating how costs can vary depending on the status of prior work and the approach adopted. Firstly, had the existing audit utilised for step 1 been necessary for the development process, for example at the conception and planning stage [
15] to identify the problem and whose behaviour needs to change, it would have contributed a considerable proportion (36%) to the overall cost of intervention development. The process of manually sorting through patient health records to identify non-attenders at diabetic retinopathy screening is a labour-intensive activity. In other instances, a rapid or systematic review has been conducted to identify who needs to do what differently [
22]. A smaller-scale audit or an expert consultation may also be an alternative to a full audit. Secondly, the scoping review conducted to identify effective recruitment strategies for PPI in clinical trials (part of step 3a), while considered a non-essential activity in the development of the intervention, was a valuable contribution to the conception and planning stage [18], given one key action is to establish groups to guide the development process and consider how to engage relevant stakeholders. The review contributed to the development of an informed recruitment strategy for PPI. At a cost of €1169, this activity would have contributed just 3% to the total cost of intervention development. Thirdly, had the consensus process been limited to just one mixed meeting, the total cost of development (excluding the audit and scoping review) would have been reduced by 9%. However, in a separate study we found that group composition influenced group dynamics and consequently contributions regarding the intervention’s ‘fit’ with the local context [
50].
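The scenario calculations above are simple shares of the total; a minimal sketch shows the form they take. The total development cost (€40,485) and the scoping-review cost (€1169) appear in the text; treating the review as an essential activity and computing its share reproduces the roughly 3% figure reported.

```python
# Minimal sketch of the scenario-analysis arithmetic described above.
# EUR 40,485 (total) and EUR 1,169 (scoping review) are from the text;
# everything else is just the share calculation.

total_cost = 40_485.0          # total development cost from this study
scoping_review_cost = 1_169.0  # non-essential scoping review (step 3a)

def contribution_pct(activity_cost: float, total: float) -> float:
    """Percentage an activity contributes to the total development cost."""
    return 100.0 * activity_cost / total

print(round(contribution_pct(scoping_review_cost, total_cost), 1))  # ~2.9, i.e. ~3%
```

The same calculation, applied to the audit or to an extra consensus meeting, yields the 36% and 9% scenario figures reported above.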
Strengths and limitations
Our study adds to the very limited body of literature in a research area that has received little attention to date but that should be regarded as a fundamental aspect of developing any intervention. However, the cost estimate has a number of limitations. First, the results may not be generalisable to other jurisdictions, given that Irish wage rates, overhead rates, pension contributions and social insurance rates may not apply elsewhere. However, the exercise clearly demonstrates that a significant portion of intervention development costs is accounted for by personnel time, 77% in our study. The development of an intervention requires a large amount of human capital, with expert knowledge in fields such as behavioural science and public health, that could otherwise be invested in a multitude of projects and endeavours. In that regard, our study can be compared to other studies costing the development of interventions to increase attendance at cancer screening, where the cost attributable to personnel has varied from 67% [
44] to 74% [
51]. Second, much of the data on research time spent developing the IDEAs intervention was gathered retrospectively from the researchers involved in the development process. Given that the costing exercise included activities that commenced with the audit of DRS uptake in June 2015 and concluded with the development of the logic model almost 4 years later (May 2019), recollection of time spent contributing to the various steps may be subject to recall bias, an issue other studies have also cited [
40,
41]. We tried to account for this by conducting sensitivity analyses varying the time spent on different activities. While we accept that accurately estimating time is difficult, we are confident we identified and used accurate records to compile other, non-personnel costs. Co-authors involved in all stages of development (2015–2019) (FR and SMH) were able to identify and draw on email records, calendars and meeting minutes to inform estimates, and provided oversight of the costing process. For the later development steps, the lead author (SA) consulted more than one researcher for certain steps (e.g. preparation of a questionnaire for the consensus process involved FR and two RAs) to verify the time and resources involved, including cross-checking emails and notes. Most of those involved in the core development work (FR, SMH, JB, SS, PK, AM) are co-authors and had the opportunity to review costs. Regular meetings were held between the lead author (SA) and AM, FR and SMH to review the cost breakdown and discuss time estimates and any gaps. Having undertaken this process, we would advise different approaches to documentation which would make costing easier and more accurate than retrospective accounts, for example using weekly staff logs as the work progresses, in line with the approach used by Lairson et al. [
40]. Third, our decisions about which parts of the development process were essential or non-essential were dictated by our approach to development, a theory- and evidence-based process which has been widely used. However, elements of these decisions were subjective. In general, if we felt the intervention could have been developed without a specific method or step, we considered that part non-essential. So, while we felt the target behaviours could be identified without an audit (step 1), we knew step 2 would always involve a process to develop an understanding of barriers and facilitators, typically using interviews, and so this part was considered essential. While step 3 should identify the feasibility, acceptability and local relevance of the intervention, we chose to conduct a consensus process but could have modified elements of it (i.e. fewer meetings, presenting an intervention plan to critique rather than options for intervention content). As we were aware of the subjective nature of these decisions, and given there is no single best approach to developing interventions, we included the scenario analyses to make the costs as transparent as possible. Lastly, for reasons of confidentiality and data protection, the occupations of PPI participants were not collected during intervention development. An assumption was made that average annual earnings for the relevant year, as published by the Central Statistics Office in Ireland [
38], should be used as a salary proxy for patients who participated in interviews, PPI participants and two contributors whose stated profession did not enable us to establish a salary. While accepting this as a limitation, it should be noted that the hours contributed by these participants represented 9% of the total time spent on intervention development. We also conducted a sensitivity analysis to assess the impact of potentially higher salaries: varying average annual earnings by +20% increased the total intervention development cost by just 2%.
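The salary-proxy sensitivity analysis just described can be sketched as follows. Only the €40,485 total is from the study; the share of costs valued via the salary proxy is not reported here, so the 10% share below is an inferred illustration (if a +20% uplift on proxied earnings raises the total by about 2%, those costs must be roughly a tenth of the total).

```python
# Minimal sketch of the salary-proxy sensitivity analysis, assuming (for
# illustration only) that proxy-valued time accounts for ~10% of the total;
# this share is inferred, not reported in the study.

total_cost = 40_485.0                   # total development cost from this study
proxy_valued_cost = 0.10 * total_cost   # ASSUMED share valued via CSO average earnings

def total_with_uplift(total: float, proxied: float, uplift: float) -> float:
    """Recompute the total after scaling proxy-valued costs by `uplift` (e.g. 0.20 = +20%)."""
    return total + proxied * uplift

new_total = total_with_uplift(total_cost, proxy_valued_cost, 0.20)
print(round(100 * (new_total - total_cost) / total_cost, 1))  # percentage increase in total
```

Under this assumed share, the +20% uplift raises the total by about 2%, consistent with the result reported above.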