What’s complexity got to do with health? It’s complex…

Still defrosting from my visit to Washington DC, I’ve been reflecting on the conference I’ve just attended on complexity, inequalities and health. Sound complex? Here’s a simple summary that’s not as snow-covered as I have been over the past few days. Why spend your coffee time reading this article? It might give you some insight into the perspectives and methods emerging from leading researchers working on complex systems, health and inequalities, as well as the investments being made in the area by the USA’s main health research agency.

  • “Complex Systems, Health Disparities & Population Health: Building bridges”

http://conferences.thehillgroup.com/UMich/complexity-disparities-populationhealth/agenda.html

This conference was organised by the USA’s Network on Inequalities, Complexity and Health (NICH) and hosted on the National Institutes of Health (NIH) campus in Bethesda, Maryland, USA. There wasn’t much tweeting throughout the two days, but I did start a hashtag that was picked up: #NICHconference


  • The socio-ecological model of health lives on

As with most quality public health conferences, the socio-ecological model appeared in the opening comments – and one of the authors of papers on the model was present! It remains a crucial framework for how we think, talk, measure and report, communicating how individual, interpersonal, organisational, community and social-policy factors affect the health of populations globally. It shows the complexity of health determinants, simply.

  • Complex systems theory challenges our thinking about how health is constructed

To begin, we heard the nuts and bolts of complex systems science as it applies to health, and a clear message that the “find it, fix it” approach to public health isn’t working. If traditional approaches were effective, we wouldn’t have epidemics of non-communicable disease and unfair health inequalities. Investment is unbalanced in most contexts – in the USA, for example, an estimated 40% of health problems are socially determined, 50% behavioural and only 10% due to health care, yet only 3% of spending goes to societal and individual-level prevention strategies (complex solutions), whilst 97% goes to health care (simple solutions).

  • Complex systems science reorients our thinking about how to act to improve health

We can always interrogate the ‘why’ of health issues and inequalities. A person smokes because it’s socially acceptable, affordable and possible to do where they live, work/learn and play, and because cigarettes are available – actively marketed by for-profit companies. Food supply was given as another example: the production, marketing, acquisition, distribution, retail, purchasing and consumption of food is dynamic and depends on many factors such as market forces, housing, economics and the built environment. Consider that the majority of countries in the world have McDonald’s outlets in their urban areas, and that in most countries at least half of the population lives in urban areas. What influence do these factors have on healthy food supply and access? And how does that, in turn, affect health and lifespan? As you can see, it’s complex. Check out this paper by Sandro Galea for more: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3134519/


  • Everything should be made as simple as possible, but not simpler

One speaker articulated that using simple interventions to address complex health issues is likely to fail. It’s a bit like King Canute ‘ordering back the tide’ – with health interventions and measurements, we can’t simply push against how things naturally go in a system; we need to identify multiple points and levers for intervention at different socio-ecological levels. Similarly, intervention research in this area can’t continue to be ‘linear’ and rely on averages for estimating effects – we need to capture heterogeneity. It’s tempting and logical to believe that if the parts get better (e.g. risk factors) then the whole will get better (e.g. populations), but change is contextually dependent. The response to multiple interventions will be very different from the sum of the responses to each intervention separately. In other words, the whole is greater than the sum of its parts!

Image from: http://canute2.sealevelrise.info/slr/Story%20of%20Canute
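
To make the ‘sum of its parts’ point concrete, here is a minimal toy sketch of my own – not from the conference, and with numbers and intervention names invented purely for illustration – showing how two interventions acting on the same system can produce a combined effect that differs from adding up their separate effects:

```python
# Toy illustration of non-additive intervention effects. All numbers are
# invented for illustration only; they are not estimates from any study.
def risk(price_policy: bool, outlet_policy: bool) -> float:
    r = 0.30                      # baseline risk in a hypothetical population
    if price_policy:
        r -= 0.05                 # effect of a pricing intervention on its own
    if outlet_policy:
        r -= 0.04                 # effect of an availability intervention on its own
    if price_policy and outlet_policy:
        r -= 0.06                 # extra synergy when both act on the same system
    return r

baseline = risk(False, False)
effect_a = baseline - risk(True, False)      # risk reduction from A alone
effect_b = baseline - risk(False, True)      # risk reduction from B alone
effect_both = baseline - risk(True, True)    # risk reduction from A and B together

print(f"A alone: {effect_a:.2f}  B alone: {effect_b:.2f}  "
      f"sum of parts: {effect_a + effect_b:.2f}  together: {effect_both:.2f}")
```

Adding the separate effects would predict a smaller gain than the two interventions deliver together – the whole really can be greater than the sum of its parts.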

  • When it’s all ‘too complex’, or when there’s no ‘real’ data?

Try simulating or modelling the data! There are often times when observed health issues are ‘too hard’ to disentangle from modifiers and contextual factors. Modelling epidemiological associations between factor X (e.g. fast food) and factor Y (e.g. heart disease) may not reveal the nuances of what produced the issue in the first place – the causes of the causes. The same goes for evaluating multifaceted interventions across the many socio-ecological levels – it’s hard to measure each and every factor that might have had an impact upon the observed outcomes, and then to attribute causation. Thus, we are often without empirical data that integrates the diversity of elements in a system, so it’s hard to prove which determinants to target. Limited quality evidence also exists on the processes and effectiveness of complex interventions, so we’re often ‘working in the gaps’. Synthetic estimates can be produced by building simulation models, guided by existing data, evidence and theory. Models can control experimental conditions in a complex system, which is obviously impossible to do in ‘real world’ observational studies. Also, and rather compellingly, we heard that standard statistical approaches can’t examine feedback and adaptive mechanisms between environments and individuals/agents – whereas computational modelling can. This paper by Amy Auchincloss et al. provides a recent example, studying the links between neighbourhood resources and obesity: http://onlinelibrary.wiley.com/doi/10.1002/oby.20255/full
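
To give a flavour of what ‘feedback between environments and agents’ looks like in code, here is a deliberately minimal sketch – mine, not the Auchincloss model, with every variable name and parameter value an assumption made purely for illustration – in which individuals’ food choices respond to the local fast-food supply, and the supply in turn responds to demand:

```python
# Minimal sketch of agent-environment feedback. All parameters are illustrative
# assumptions, not values from any study cited in this post.
import random

random.seed(1)

N_AGENTS = 500
STEPS = 50

fast_food_share = 0.5   # environment: share of local outlets that are fast food
habits = [random.uniform(0.2, 0.8) for _ in range(N_AGENTS)]  # each agent's baseline preference

for step in range(STEPS):
    # Agents respond to the current environment (availability shapes choice).
    choices = [random.random() < 0.5 * habit + 0.5 * fast_food_share for habit in habits]
    demand = sum(choices) / N_AGENTS

    # Individual-level feedback: habits drift towards recent behaviour.
    habits = [0.9 * h + 0.1 * (1.0 if chose else 0.0) for h, chose in zip(habits, choices)]

    # Environment-level feedback: the outlet mix adapts to demand, closing the loop
    # that a one-shot regression of exposure on outcome would miss.
    fast_food_share = 0.9 * fast_food_share + 0.1 * demand

    if step % 10 == 0:
        print(f"step {step:2d}: fast-food share = {fast_food_share:.2f}, demand = {demand:.2f}")
```

Running the loop shows environment and behaviour co-evolving over time – exactly the adaptive dynamic that a single snapshot association between exposure and outcome cannot capture.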

  • Methods for research of complex systems, health determinants and impacts

The main methods featured in the presentations and posters included system dynamics, social network analysis (SNA), agent-based modelling (ABM), and discrete event modelling. These methods, having emerged from complex systems science, are being applied to public health research. They were described as tools to help us make sense of the interactions within complex systems, and of the impacts that interventions might have on health and inequalities. For a primer, see the take-home messages from Nathan Osgood below, and refer to a recent paper by Doug Luke and Katherine Stamatakis – these sources will be far better than my interpretation would be! http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3644212/pdf/nihms414057.pdf

Image: take-home messages from Nathan Osgood
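
To give a feel for one of these methods in miniature, here is a tiny stock-and-flow (system dynamics) sketch of my own – the rates are invented solely to show the mechanics, not estimates from any presentation. Two population ‘stocks’ (smokers and non-smokers) are connected by uptake and quitting ‘flows’ and stepped forward through time:

```python
# Minimal system dynamics (stock-and-flow) sketch. The rates below are made-up
# assumptions used only to demonstrate the method.
smokers, non_smokers = 2_000.0, 8_000.0   # stocks: people in each state
uptake_rate, quit_rate = 0.02, 0.05       # flows, per person per year (illustrative)
dt, years = 0.1, 20                       # time step and horizon

for _ in range(int(years / dt)):
    uptake = uptake_rate * non_smokers    # flow: non-smokers taking up smoking
    quitting = quit_rate * smokers        # flow: smokers quitting
    smokers += (uptake - quitting) * dt
    non_smokers += (quitting - uptake) * dt

prevalence = smokers / (smokers + non_smokers)
print(f"Modelled smoking prevalence after {years} years: {prevalence:.1%}")
```

Real system dynamics models add many more stocks, flows and feedback loops, but the underlying mechanics are exactly this: accumulate flows into stocks over small time steps.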

  • What are some of the applications for simulation and modelling research?

The presentations and posters highlighted a series of modelling studies exploring a range of factors related to health inequalities at the individual, institutional and neighbourhood levels. Most provided case studies of how inequalities are produced; some focused on estimating the potential effects of interventions.

  • Examples of data simulation/modelling studies

At the individual and community level, an ABM explored the differential effects of alcohol outlet density restrictions and policing upon alcohol-related violence and homicide among white Americans and African-Americans. A simulation study explored the potential effects of upstream policy on healthy eating and physical activity, finding proof of concept that, in some contexts, it may be more effective to target neighbourhood factors rather than race in reducing disparities. At the population level, a case study from New Zealand was described: when the 2011 earthquakes interrupted the national census, modelled data was used to predict ongoing trends in primary health care access among Māori and Pacific Islander populations.

  • Progress and pitfalls for complex systems methods in public health

Collectively, this conference suggested that certain systems science methods may tell us more about the nuanced factors causing health inequalities. They may also help reveal leverage points and suggest how to tailor interventions. But as with all research, challenges and limitations remain with these methods. These studies require interdisciplinary teams to ensure sufficient expertise in epidemiology, mathematics, computer programming, geography, public health and urban planning. Working together is essential – from observational research to computational modelling, the first step is a doozy!

Another challenge highlighted was that, ultimately, we need to be able to link the models to ‘real’ data to ensure their validity. Involvement of community stakeholders and decision-makers in the process was discussed only briefly, but this would appear to be a key step in verifying models. Community physician and systems scientist Kurt Stange described a great example of a participatory process of community stakeholder involvement in model planning and development. This may be a good place for us to start, to ensure that we ‘keep it real’.


  • Closing thoughts from a complexity novice

From a knowledge translation and exchange (KTE) perspective, I would think that external validity would be a key challenge for the application of this research, and one that may be difficult to reconcile. The conference left me pondering: how do we use the evidence generated for decision-making? How can we be sure that modelled data reflects what’s in the ‘real world’? A discussion on using these models to guide policy, led by complex dynamics researcher Ross Hammond and NIH program director Stephen Marcus, began to raise these questions. I would imagine that evidence generated through modelling will require a similar approach to knowledge translation and exchange as evidence generated through ‘traditional’ methods.

So after that, a penny for your thoughts? Leave a comment if you’re using/exploring these methods!

 

Written by Dr Tahna Pettman

Research fellow: Public Health Evidence and Knowledge Translation
Evaluation fellow: CO-OPS collaboration

The Jack Brockhoff Child Health & Wellbeing Program.
The University of Melbourne
e: tpettman@unimelb.edu.au

 

Interests in conflict? Managing the head and heart of research

Research findings can influence decisions about practice, policy and funding directions. It’s what makes the work of researchers satisfying – the thought that it may actually make a difference! But with this warm fuzzy feeling comes responsibility, and the need to check our good intentions at the door – not necessarily to leave them there, but to submit ourselves to an open and honest conflict scan.

My work involves managing the editorial steps leading to the publication of public health research, including assessing the appropriateness of the composition of research teams and allocating editorial advisors and peer referees to provide feedback on the research. In doing so I am very conscious of the conundrum that can arise in identifying individuals with a sound understanding of a topic to undertake (or review) the research, yet free of any vested interest in its outcomes.

There are rules and policies to identify, declare and manage potential conflicts of interest (COI) – to “provide guidance to ensure that there is clarity and transparency in the declaration of any interests, a balance of perspectives, and guidance on disclosing and managing interests” (NHMRC 2012) – for research committees and working groups developing guidelines, and for researchers and peer referees of researchers’ work.

The tricky part is that declaration statements often rely on the objectivity of the individual closest to the work – the researcher, the research committee member, the guidelines developer, the content expert chosen to peer referee the research. I would hazard a guess that a failure to declare a potential conflict of interest is usually due not to underhanded intent on the part of the researcher or research advisor, but to a lack of understanding of what might be perceived as a conflict. Most are clear about declaring any financial interest in the subject at hand, or funds received from parties with an interest in the findings of a research work. But what of other influences that might openly or inadvertently sway the judgements and decisions of the researcher or research advisor? And can these influences coexist in a team without compromising the integrity and outcomes of a research task?

In noting my area of work, as Managing Editor of the Cochrane Public Health Group, I also declare an interest (conflicting?) in this topic for authors, editors, advisory group members and peer referees of systematic reviews – of public health topics specifically. A recent report, prepared for the Agency for Healthcare Research and Quality (AHRQ), U.S. Department of Health and Human Services, contended that whilst the importance of attention to financial conflicts of interest has been addressed, there has been little guidance on how to manage the risk of bias in systematic reviews arising from non-financial conflicts of interest. The paper outlines definitions and examples of non-financial COI, and how these can be assessed for their potential to bias a review and then managed. It also confirms that authors may not be identifying themselves as having potential conflicts.

Non-financial COI in the AHRQ report was defined as “a set of circumstances that creates a risk that the primary interest—the quality and integrity of the systematic review—will be unduly influenced by a secondary or competing interest that is not mainly financial.” They include interests relating to the individual (intellectual, professional, career advancement), persons with whom the individual has a close personal relationship (e.g., family members, friends, colleagues), and interests held by the employer or organization with which the person is affiliated (e.g., employer, academic institution, specialty organizations, other professional organizations, and community interests).

Getting the authorship team right on a systematic review is important – there is a need to include content expertise, methods knowledge and experience, as well as statistical and searching expertise. Bringing together a systematic review team that adequately balances essential content expertise with independence of judgement can be tough, and requires open and deliberate choices by the lead author.

What is important to understand is that the identification and declaration of a potential COI, and the management of that COI, are two very different things. It is the latter that can ease the struggle between the need to be close to, knowledgeable about and, dare we admit, passionate about a subject or content area, and the need to make objective decisions based purely on the information presented or available to the team. Once the risk of potential conflicts of interest is identified, based on the context of the topic, there is a range of options for managing them within a research team. These range from disclosure followed by no change to the research team or activities, through inclusion on the team alongside members with differing viewpoints to ensure a range of perspectives, or exclusion from certain research activities (such as assessment of risk of bias in individual studies, in the case of a systematic review), to exclusion from the authorship team entirely.

Not all conflicts of interest, once identified and acknowledged, lead to a compromised research project.  Being upfront and declaring all potential conflicts, to the editorial team and in any associated publications, allows the reader to make an informed judgement about the trustworthiness of the research process and findings. 

Written by Jodie Doyle
Managing Editor, Cochrane Public Health Group
e: jodied@unimelb.edu.au