Complexity, Quantification and the Management of Policy

Philip Haynes

 

This paper explores the implications of complexity science for quantitative research in social and public policy. It is argued that quantitative methods need revision rather than abandonment. Data and models can be used to explore social issues, rather than to explain them conclusively. Social statistics must still play a part in forming an overview and synthesis of social life. The use of reductionist methods to achieve very precise measurement of need, resource allocation and performance is questioned.

 

In the applied social sciences, statistics and mathematics have sometimes been thought of as a kind of art form, a disciplined way of creating arguments and pictures about the world. In his recent work on complexity theory and the social sciences, David Byrne (1997, 1998) has talked of the 'quantitative being qualitative'. Ian Sanderson (2000, p. 450), writing on complexity and policy evaluation, talks of 'crafting an approach to evaluation'. In this paper it is argued, in a similar vein, that the synthesis and development of policy strategies from quantitative methods are as important as the details of the methods chosen. Complexity offers a new sense of realism about the use of quantitative data in social policy.

If multivariate models cannot really explain the multiple causes of social problems and the outcomes of social policies, what future is there for quantitative methods? What is the contribution of quantitative work to social policy? Why do multivariate models always leave researchers with more questions than answers? There is a new approach to this problem that demands a return to some fairly fundamental questions about the meta-theoretical picture that quantitative analysis fits into.

A growing number of social scientists interested in quantitative tools have looked to chaos and complexity theory to provide them with this new meta-theoretical framework. This is because the scientific language that complexity offers is seen as being highly relevant to social and policy realities. It provides a new vocabulary and arguably some new priorities. The attractiveness of complexity's meta-theoretical approach can be summarised as:

  • A commitment to be multi-disciplinary, in the broadest sense;
  • Challenging traditional approaches to causality and association;
  • Encouraging some new holistic thinking that is more realistic than previous attempts at holism;
  • Permitting policy analysts to honestly face the limits of their discipline, without giving up the search for knowledge and progress.

This paper reassesses the relationship between quantitative methods and the management of complex policy environments, at a time when government still seems to rely on simplistic quantitative research management messages that do not hold true to reality for many practitioners and researchers.

 

Aims and objectives of the paper:

  • To summarise the theoretical approach of complexity theory so as to inform the design of a quantitative methodology;
  • To explore what a complex approach implies for the design of quantitative research;
  • To revisit the relationship between social policy and quantitative methods;
  • To suggest quantitative methods that may be particularly useful in understanding social policy;
  • To describe how quantitative research is currently used in the management of policy;
  • To conclude as to how quantitative research might be used to aid an understanding of the complex realities of policy management.

To summarise the theoretical approach of complexity theory, so as to inform the design of a quantitative methodology

Complexity theory has its roots in the mathematics of chaos. A basic non-linear equation demonstrates that, over time, small changes in the rate of change lead to exponential change (Cartwright, 1991; Kiel, 1994; Parker and Stacey, 1994; Elliot and Kiel, 1997). Weather systems are often cited as the most obvious example of this, based on Edward Lorenz's work at the Massachusetts Institute of Technology, begun after the Second World War and finally published in the 1960s (cited in Sardar and Abrams, 1999). Kiel (1994) and others have offered a simple spreadsheet formula that allows non-mathematical social science teachers and students to experience the mathematical basis of chaos and complexity. The approach is useful in giving everyone an accessible insight into the mathematical basis of the theory, but there are immediate issues about the substantive application of such mathematical ideas to public policy, where, in contrast with the pure sciences, many variables are interconnected and adequate measurement of change over time is not available (Sokal and Bricmont, 1998).
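The flavour of such a spreadsheet exercise can be reproduced in a few lines of code. The sketch below is illustrative only, assuming the logistic map as the non-linear equation in question; the parameter values are my own assumptions, not taken from Kiel (1994).

```python
# A minimal sketch of the kind of exercise described above, using the
# logistic map x' = r * x * (1 - x). Values are illustrative assumptions.

def logistic_series(x0, r=4.0, steps=30):
    """Iterate the logistic map from a starting value x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two starting points differing by one part in a million soon diverge:
a = logistic_series(0.400000)
b = logistic_series(0.400001)
for t in (0, 10, 20, 30):
    print(t, round(a[t], 4), round(b[t], 4))
```

The same exercise works in any spreadsheet by copying the formula down a column; the point is the divergence, not the tool.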

There is a strong quantitative tradition in British social policy: the work of Peter Townsend, Bleddyn Davies, Sara Arber and John Hills, for example. But in recent years social policy quantitative work has perhaps lacked the impact needed to inform real political debate about quantitative findings and trends and their direct social impact and policy consequences. Much of the baseline work is still going on, but there are fewer high-impact seminal quantitative studies at the centre of the discipline. If one thinks, for example, of the large impact of Peter Townsend's work on poverty in the 1970s, there is nothing similar today in terms of a large-scale quantitative piece of work that enters the policy and academic arena and remains at the centre of a debate for a considerable number of years. The importance of quantitative work and its potential impact on policy is less clear to current social policy students, although many students still struggle with quantitative research methods and their application. It can be argued that quantitative exploration of society should be more centre stage in the discipline of social policy.

There was a sea change in ideology and values in the 1980s, and this must partly account for the undervaluing of any grand meta-quantitative account. The weaknesses of social statistics were increasingly exposed, but in reality many already knew of the methodological dangers of over-relying on quantitative generalisations. However, the politics of individualism seemed to encourage throwing the 'baby out with the bathwater'. Given Thatcher's notion that there was 'no such thing as society', there were no grand social models either. A decade or so later there was an increasing realisation that there is no such thing as an individual consumer. In some areas of welfare, people do not behave like consumers at all, but like the passive recipients of welfare that social policy documented 25-30 years ago.

The hypocrisy of it all: the resurgence of classical economics brought the dominance of a certain type of statistical approach where, rather than social statistical models generalising about groups of individuals, individuals were all assumed to behave identically, as stereotypical consumers. The methodological weaknesses were as great as those of earlier social science models, and the economic grand theories were far too simplistic (Ormerod, 1998).

Underneath this ideological hijacking of statistical and applied mathematical methods, some academics were struggling to redefine the relationship between theory and quantification. Sayer (1992) seemed to create some opportunities for social statistics to do better than just survive. The acknowledgement of levels of reality and understanding permitted social scientists to still make major contributions to the understanding of social problems and issues. Realism acknowledged that causality was difficult to model, because it was often contingent on many factors. Yet understanding and progress were still possibilities, given adequate reflexive and methodological accounts.

Complexity and new realism (Sayer, 1992) have much in common, but complexity goes further in its approach to causality and association. Complexity in social science implies the following points:

  • Reductionism and individualism are inadequate for a narrative of the social and the individual.
  • The whole is greater than the sum of its parts. Holism must be described as an evolving dynamic that is affected by constant feedback. Holism that sees the social as deterministic and based on a highly predictable structure is inadequate.
  • There is a need to study the interaction of variables in quantitative social policy and not to make assumptions about what is a dependent and what is an independent variable. The future relationships in a causal pattern will be largely unpredictable and characterised by feedback among the constituent variables. This focus on feedback, or interaction, is a key element in complexity.

In short, despite the desire that political ideology has for universal truths that can dictate the mechanical methods of economics and statistics, there are no 'simple fixes'. Numerical models will be just that, models of reality – not the full and true version.

 

The design of complex quantitative social policy research

The design of appropriate methods becomes less difficult when the fundamental difficulties are accepted. If there are no mathematical laws that can determine how we should study complexity, we must use a variety of approaches and use our judgement to attempt to match model type with reality type (Harvey and Reed, 1997). It is this judgement that becomes so important.

This paper proceeds at this point by making the assumption that the task of finding mathematical laws that are relevant to a complex policy environment is misconceived. Some literature in the US has focused extensively on finding statistical measures of chaos in society (see for example Brown, 1997), and perhaps by implication complexity, but it is not surprising to find that the work is inconclusive. Such work can involve applying the Lyapunov exponent to measure deterministic chaos in a time series. But rather than searching for a mathematical law of chaos that is near perfect in its robustness, the more general task is to describe social complexity using a range of mathematical tools, so that social complexity is better understood and recognised. Byrne (1997) has characterised these two different approaches as symptomatic of different traditions in the US and Britain. It should be noted that one difficulty with the US approach is that it stretches social scientists to comprehend the frontiers of mathematics and statistics, when arguably our energies might be better spent applying the scientific mathematical language that is already available.
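For readers who want to see what such a measure involves, the following is a minimal sketch of a Lyapunov exponent estimate. It uses the logistic map, where the answer is known analytically; estimating the exponent from a short, noisy social time series is a far harder problem, which is the nub of the objection above. The function and parameters are illustrative assumptions, not Brown's (1997) procedure.

```python
import math

def lyapunov_logistic(x0=0.3, r=4.0, n=100_000):
    """Estimate the largest Lyapunov exponent of the logistic map by
    averaging log|f'(x)| = log|r(1 - 2x)| along a trajectory."""
    x, total = x0, 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n

# A positive exponent indicates deterministic chaos; for r = 4 the
# estimate converges on ln 2 (about 0.693).
print(round(lyapunov_logistic(), 3))
```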

Quantitative methods of analysis can be used to provide qualitative exploration of reality. It is necessary to start with recognition of the 'impossibility of a full quantitative understanding of complex phenomena and the consequent requirement to turn to qualitative approaches' (Byrne, 1998, p. 7). If the task is not to look for a single perfect method, or a new universal statistical law, what kind of statistical methods might we be interested in?

 

The relationship between quantitative methods and social policy

The issue may be less about what measures are used and more about how the measures already available are applied. It is good to have as a starting point some research dimensions that are substantially secure. Complexity theory implies that the two dimensions of quantitative security are time and space. Time is the deterministic component of chaos and complexity theory, in that time cannot be run backwards and in this sense the past does determine the future. If one can understand how order emerges over time, some partial answers are possible. Time, then, is one secure variable. Social and economic history is an important source of study. Kiel and Elliot (1997, p. 19) comment: 'The initial starting point of a social system has much to do with its eventual structure and behaviour.' Eve et al. (1997, p. xix) note that:

Perhaps the most fundamental change that has been brought by the new science is the recognition that time will not go away…the new science has shown conclusively that time is irreducible, irreversible, and asymmetrical.

 

The temporal dimension

Dale and Davies (1994) have made some important observations about the need for social science researchers to understand change over time. In the order of magnitude of things, change over time may be relatively easy to pick up. Certainly, in terms of the change of one key variable, this is a simpler research challenge than trying to understand the interconnectedness of things. The description of variable change is a good starting point for understanding society. Variables that are used to capture key social concepts such as demography, unemployment, divorce, and wealth can be important evidence for key transformations in society. Accounting for the reasons for this change is more of a challenge, but adequate description and monitoring of the underlying variables is a prerequisite. This type of description over time is in itself, at first, a simple quantitative exercise.

Our understanding of change over time is imperative. Cross-sectional approaches need to be critically revisited and re-evaluated. There is a need to repeat quantitative research over time and to try to understand trends. It is likely to be very difficult to understand the multivariate nature of trends. But models can be proposed to speculate on the nature of these relationships.

The measurement and analysis of change over time has not been central to the study of social policy, although it has played some part. Certainly some longitudinal data is collected in major surveys like the General Household Survey, but its analysis is not a strong enough feature of social policy teaching and research in the UK. The analysis of trends was central to the evolution of economics during the last century, most significantly built around quantitative attempts to model temporal cycles, so that the occurrence of recession and growth could be predicted. Most textbooks on applied statistics for economics contain at least one chapter on time series analysis. But trend analysis has not been a central feature of social policy research methods.
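As an indication of how modest the entry cost is, here is a minimal trend-analysis sketch in Python. The series is invented for illustration; nothing here reproduces a real data set.

```python
import numpy as np

# Hypothetical annual series, e.g. an unemployment rate over 20 years.
rng = np.random.default_rng(1)
years = np.arange(1981, 2001)
rate = 8.0 - 0.15 * (years - 1981) + rng.normal(0, 0.4, size=len(years))

# A simple linear trend: the slope is the average change per year.
slope, intercept = np.polyfit(years, rate, 1)
print(f"average change: {slope:.2f} percentage points per year")

# A five-year moving average smooths short-term fluctuation.
smoothed = np.convolve(rate, np.ones(5) / 5, mode="valid")
```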

Certainly some quantitative studies of longitudinal change have contributed to the social policy field: for example, analyses of the British Social Attitudes Survey and the General Household Survey. But analysis of these important data sources often seems to have become rather secondary to the core activities of social policy. An analysis of trends over time to evidence social and political change is fairly rare. One exception was the work of Hogwood (1992). Another is Hills' (2000) work on trends in the distribution of income and wealth in the UK.

 

The spatial dimension

The second secure dimension is space. Geographical space is a determinant of human life, in that space represents a dimensional challenge to all societies and individuals. Physical space exists and in part determines social life, even though there is very large-scale divergence in terms of the how, what, when and where of this relationship. Nevertheless, physical space itself is largely stable. Our social response to it is highly unstable, but as a physical resource it is relatively stable, as Sheppard (1996, p. 1319) comments: 'Despite the socially constructed nature of space, it is valid to treat space as constitutive of social process.'

The industrial revolution and the disciplines of modernism influenced an approach towards space in which it was occupied by key social groups, dominated, and made subservient to capital markets and human society. Postmodernism and the post-industrial age call for a revision of this view. Instead, societies are re-evaluating their interconnectedness with physical geography and finding a new respect for limited spatial resources that may be out of human control and subject to strong ecological forces.

Spatial analysis, such as the use of geographical information systems (GIS), can provide new social insights that are not necessarily picked up by conventional data analysis. The visualisation of a geographical distribution can provide an important element in the synthesis of social problems. There have been a number of recent examples of this, for example Dorling's (1995) GIS-based social atlas of Britain, with its patterns of social and economic data, and the author's social care market research in London (Haynes, 1999).
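In practical terms, producing such a map is now straightforward. A minimal sketch, assuming a boundary file and an attached indicator column (both placeholders here, not real published data), might look like this:

```python
import geopandas as gpd

# "districts.shp" and the "deprivation" column are hypothetical
# placeholders for a real boundary file and an attached indicator.
districts = gpd.read_file("districts.shp")
ax = districts.plot(column="deprivation", cmap="OrRd", legend=True)
ax.set_axis_off()
ax.set_title("Deprivation by district (illustrative)")
ax.figure.savefig("deprivation_map.png")
```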

 

What methods are available? From explanation to exploration

It is very difficult to understand the interconnectedness of things, and this challenge has to be recognised by social policy analysts and governments and respected for its magnitude (Sanderson, 2000). However, longitudinal analysis of multiple effects may provide some interesting insights that are not available to complex cross-sectional studies. Paradoxically, it may be easier, or appear easier, to make a judgement about the interconnectedness of things over time, when one sees and gets a feel for their combined effect, rather than struggling over the detail of a one-off multivariate cross-sectional model. There is a danger that focusing on the detail of linear cross-sectional models, such as regression analysis and factor analysis, leads to a focus on details that have no substantial significance or policy payback. An example is the ability of factor analysis to generate additional factors that are statistically significant but have no substantive meaning. Better to observe the major lessons (principal components) of such models and to try to re-run them in different circumstances, to test their robustness with different techniques, over time and with different data. Eve et al. (1997, p. xxv) comment:

Such models have until now been fixed and inflexible, and based as they are on a linear conception of cause and consequence, they are confirmed or deconfirmed in an all-or-nothing way.
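The point about attending to the principal components, rather than to every marginally significant factor, can be made concrete with a small sketch. The data below are synthetic, generated from two underlying dimensions; the exercise is illustrative, not a recommended procedure.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic data: 200 cases, six indicators driven by two latent dimensions.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 6))
X = latent @ loadings + rng.normal(0, 0.3, size=(200, 6))

pca = PCA().fit(X)
print(pca.explained_variance_ratio_.round(2))
# The first two components carry nearly all the variance; the remaining
# components are noise that can still be over-interpreted as 'findings'.
```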

If it is accepted that the cross-sectional multivariate analysis model is never going to be perfect, running the same data set many times with different criteria begins to feel increasingly like looking for a statistical artefact that matches what the researcher wants to find and argue. Of course, to some extent this is inevitable with a deductive statistical approach, but a better way might be to use so-called explanatory models as an extension of data exploration, to explore the possible interconnections between variables:

Instead of creating a hypothesis, testing it on the experimental and observational facts until a counterexample shows its flaw, and then trying another, we can now create an accurate facsimile of reality by successive tweaking of variables and the connections between them…This process reverses the top-down, theory to phenomena. (Eve et al., 1997, pp. xxv-xxvi)

All this implies using an inductive approach to develop and revise theory, rather than a deductive one. The researcher's judgement shifts towards an inductive, open, action-based approach, rather than a hunt for the holy grail of statistical argument. This is not to downplay the importance of statistical argument and correctness, but given what complexity tells us about our inability to find the right type of model, the overall emphasis is less on the detail of how one particular statistical method is carried out. Substantive questions and qualitative theoretical thinking become of at least equal importance. The use of social statistics needs to be connected with questions of meta-theory.

 

Systems of quantitative research: Exploration for Explanation

The reality of complexity may mean that policy analysts using mathematical tools are looking for a different kind of answer from those sought previously. There will never be a complete answer for a localised, micro issue, nor the suggestion of complete answers (laws) within methodology, but instead what can be called partial answers. This is because micro and localised study needs to be reconnected with the wider macro picture. Skills of synthesis are as important as skills of analysis.

Take the influence of chaos and complexity on economics. Here is a social science that has a tradition of sitting much closer to the quantitative than does sociology. But the implications of chaos and complexity for economics are far-reaching. Economists are having to face a hard truth: that for many years they have been relying on quantitative models that really are not up to the job of understanding highly complex and unpredictable systems (Parker and Stacey, 1994; Ormerod, 1998). Economists are certainly reluctant to give up all approaches to trend analysis, even if not necessarily linear. Economists are still trying to predict future behaviour on the assumption that it will be the same as behaviour in the past. Trend analysis that features chaos breaks this assumption. It could be said that traditional economic planning is like looking in the rear-view mirror of your car as a method for understanding the road ahead (Sanderson, 2000). Chaos and complexity imply a different kind of logic for social and economic planning. Judgement is as important as available knowledge and information.

Systems in states of unstable change will be confined to some temporary boundaries – the outer points of the chaos created. This is the 'order within chaos' – leading to the possibility of mapping attractors, or central points, in any given period of time. But the key point about attractors is not that they can be located at fixed points, but the patterns of order and similarity that can be shown over time – again suggesting a type of 'dynamic' order coming from chaos. What does this mean in practice? You are much more likely to be right if you predict that inflation will be between 2% and 4% than if you stake your career on it being 2.6%. It also means that the prediction must be reviewed every day, as and when new information becomes available. An inflation forecast will need to change substantially if there is a major market shock. Any forecast that is not constantly updated is effectively useless.

What this is really about is a re-emergence of holism in social science. This is not a holism of deterministic structures, but a holism based on the sum being greater than all of the parts (Byrne, 1998, p. 3). There is a need to keep a constant perspective on contingent events and not to focus too much on the details of one part of the picture. Holism and reductionism find a new interdependence on each other. This is what Kontopoulos (1993) calls 'heterarchy'. Similarly, Cilliers (1998, p. 5) comments: 'When we look at the behaviour of a complex system as a whole, our focus shifts from the individual element in the system to a complex structure of the system.'
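The inflation example can be simulated. In the sketch below (all parameters are illustrative assumptions), the value of a chaotic series at some future step is effectively unpredictable, yet the bounds within which it moves, the extent of its attractor, are stable and knowable:

```python
# Illustrative only: a chaotic map defeats point prediction, but the
# range of values it visits (the attractor's extent) is stable.
def iterate(x0, r=3.9, steps=1000):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

series = iterate(0.31)
print("value at step 1000:", round(series[-1], 3))   # hopeless as a forecast
print("range visited:", round(min(series), 3), "to", round(max(series), 3))
```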

 

Holism

It is as valuable for social scientists to look at the bigger picture as it is to look at individual behaviour – there is a need to look in both directions, both top down and bottom up. Complexity implies that the whole picture is more than the sum of its individual parts. The whole picture will also be feeding back to the individual and to sub-wholes, creating a dynamic feedback mechanism. This means that one of the most important areas of a complex social system to study is the points of interaction: the feedback that occurs between structures and individuals. Economists talk about economic confidence in this way, with the idea that the perceptions of many individual business people and consumers are as important as recent historical data about the performance of the economy. People's reaction to the data (feedback) is as important as the data itself. Cilliers (1998, p. 3) comments: 'In order to constitute a complex system, the elements have to interact, and this interaction must be dynamic. A complex system changes with time. The interactions do not have to be physical; they can also be thought of as the transference of information.'

Perhaps more should be made in social policy of the large-scale British Social Attitudes survey in order to understand people's feedback to social change. Do adequate numbers of social policy undergraduate students make use of such data? A further example is at the political level, where there is a preoccupation with feedback politics and the extent to which political policy can ride 'on a feedback wave'. Some commercial approaches similarly focus on a continuing relationship with customers in order to maximise feedback into new product specification. This raises interesting issues about the indeterminacy of feedback itself and the points at which feedback may suddenly and unexpectedly change its focus.

 

All this implies some different quantitative approaches

The seminal quantitative work of the Plowden survey (Central Advisory Council for Education, 1967) on educational attainment in the 1960s argued that parental attitudes were one of the major determinants of educational attainment. But the question of what determines educational attainment is perhaps the wrong sort of question in a complex society. Rather, the questions might be: what is educational attainment, and how might educational attainment be achieved? Such a study is more inductive and does not seek simple outright answers. Rather, some critical relationships, or interactions, might become the focus of the research. Parental attitudes will be seen as highly unstable and changing over time. The key interest will be the interconnectedness of parental attitudes with other variables such as child behaviour, teaching method, the ability to choose schools, the local economy in which the family and school are situated, and confidence in government policy. Research will need to understand the interconnection of these aspects and the feedback between them. If traditional linear approaches to causality were used, the researcher who is sympathetic to complexity would want to interchange the dependent and independent variables. Cluster analysis or factor analysis might be used for an initial exploration of the links between variables (see the sketch below). In an exploratory phase it would be interesting to contrast a theoretical framework of interconnections based on qualitative interviews with a statistical multivariate pattern of connections. Researchers would not wish to start with a clear hypothesis. A subsequent, final model could be tested for levels in the data sets using some form of multilevel modelling. It is also hard to imagine a purely quantitative approach to a complex issue that would not ideally be triangulated with qualitative case studies.
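A minimal sketch of the exploratory clustering step mentioned above might look as follows. The variables are invented stand-ins for those discussed (parental attitude, child behaviour, and so on); no dependent variable is posited.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic stand-ins for, e.g., parental attitude, child behaviour,
# local economy score and school-choice index (all invented).
rng = np.random.default_rng(42)
X = rng.normal(size=(300, 4))

# Standardise, then look for groupings rather than fitting a causal model.
Xs = StandardScaler().fit_transform(X)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(Xs)
print(np.bincount(km.labels_))  # cluster sizes: a prompt for theory, not an answer
```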

 

Holism as Feedback

To summarise a quantitative approach to feedback, the key issues are:

  • Rates of change in variables and in their relationships to each other;
  • The interaction of variables with each other;
  • An inductive, rather than deductive approach;
  • The possibility of presenting contrasting statistical models in exploratory research write-ups;
  • Trend analysis;
  • Spatial analysis (using GIS or cluster analysis) to explore spatial structures.
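As a small worked illustration of the first two items, the sketch below computes year-on-year rates of change and a crude interaction term. The figures are invented for illustration only.

```python
import pandas as pd

# Invented figures for two indicators over ten years.
df = pd.DataFrame({
    "year": range(1991, 2001),
    "unemployment": [9.1, 9.8, 10.3, 9.6, 8.7, 8.1, 7.0, 6.3, 6.0, 5.5],
    "lone_parent_hh": [1.20, 1.30, 1.40, 1.50, 1.55, 1.60, 1.62, 1.65, 1.70, 1.75],
}).set_index("year")

rates = df.pct_change() * 100                            # rates of change
interaction = df["unemployment"] * df["lone_parent_hh"]  # a crude interaction term
print(rates.round(1).tail())
```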

 

How quantitative research is currently used in the management of policy

Given the theme of this paper, that the use of the quantitative requires some fundamental rethinking about theoretical purpose and the key questions being asked, it is of concern to find that government use of policy indicators is still preoccupied with a linear and largely unreconstructed approach to applied statistics.

If we take two examples, we can see a number of problems.

 

The Standard Spending Assessment (SSA)

The SSA is a mathematical formula for allocating Treasury monies to local government. Similar formulae are used for the allocation of NHS monies to local areas. These types of formula combine a wide range of indicators and typically rest on linear regression analysis of costs against social and demographic data.
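A stylised sketch of a formula of this type is given below. All data and coefficients are invented, and nothing here reproduces the real SSA; the point is the mechanism: past costs are regressed on indicators and the fitted formula is projected forward, which is precisely where the problems listed below arise.

```python
import numpy as np

# Invented data: 50 areas, three indicators (e.g. age profile, density,
# deprivation) and past unit costs.
rng = np.random.default_rng(7)
indicators = rng.normal(size=(50, 3))
past_cost = indicators @ np.array([2.0, 0.5, 1.5]) + rng.normal(0, 1, size=50)

# Least-squares fit of past cost on the indicators (with an intercept).
A = np.column_stack([np.ones(50), indicators])
coef, *_ = np.linalg.lstsq(A, past_cost, rcond=None)

# An area's allocation simply applies the fitted formula, projecting
# the incremental past into the future.
new_area = np.array([1.0, 0.2, -0.1, 1.3])
print("assessed spending need:", round(float(coef @ new_area), 2))
```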

There is a history of methodological problems with the SSAs (Owen, 1990; Boyne and Powell, 1993; Goldstein, 1994; Senior, 1994), but with reference to complexity there are some particular problems. These concern the aggregation of incremental, linear cross-sectional data and can be summarised thus (Haynes, 1999, ch. 7):

  • Incremental assumptions about the past are projected into the future, and the idea that the future might be radically different is rejected;
  • Population sub-group size trends can change and vary quite significantly within localised areas, and do not necessarily change in incremental, linear ways;
  • National aggregated cost data can fail to reflect local fluctuations (such as unstable data attributes like property and labour costs, labour availability and short-term demand for crisis services);
  • Future needs can become confused with past costs, where past costs might be inefficient and inequitable, and not an adequate measurement of where money needs to be spent in the future;
  • Outliers, or areas going against the apparent trend, can get taken out of the aggregate model, and 'special cases' cannot be made, despite evidence of exceptional and unusual local circumstances.

 

Performance Management (PM)

On the face of it, PM is a much simpler system than the SSA. It is essentially about the measurement of outputs and outcomes rather than activities (Rouse, 1999). It is evolving to become more action-based and directly linked to budget expenditure, on the basis that areas and services providing positive outputs and outcomes should be rewarded with a greater allocation of money. This happens in higher education through the Research Assessment Exercise (Talib, 2001).

Talib highlights the paradox of reductionist quantitative approaches to public expenditure. These approaches rest on detailed measurement (raising, in the process, all kinds of anxieties about whether the measures really are reliable and valid), yet in the final policy outcome government still has to give the recipient organisation considerable discretion in how it allocates the funding. The focus on specific measurement becomes circumvented in the policy reality. The analysis has to be generalised back to an overview for each institution, and the overview is often qualitative and political. This is also the case with the SSA, where one civil servant is rumoured to have talked of 'presenting numerous mathematical outcomes to ministers, allowing them to pick the one that was most politically acceptable'.

 

The key methodological objections to performance measurement might be summarised as:

  • Invalid measures of outcome are used, due to the difficulty of agreeing what an outcome is;
  • The approach to causality and association is inadequate and invalid;
  • Cost benefits (the costing of alternatives) are not fully and adequately considered: cost cutting is maximised rather than value raising;
  • Outputs are only weakly associated with outcomes (and other contingencies are ignored);
  • Unintended effects on the policy processes and stakeholders are not measured or considered.

A recent piece in the New Statesman (29.1.01) criticised the managerial discourse in government policy for its focus on micro-analysis and its neglect of meta-level issues. This contradicts the government's current joined-up philosophy. Performance management reinforces a reductionist approach.

Sanderson (2000, p. 450) summarises this error in a recent review of complexity and policy evaluation:

'This is not the application of techniques to well defined policy contexts which will provide a definitive answer to the question are our objectives being achieved.' (my italics)

What then is the value of doing the quantitative, reductionist work in the first place?

One key argument is that it does marginally increase transparency and accountability, even if it is not valid and efficient in terms of changing actual public expenditure. This seems a bit like putting the cart before the horse, as there are surely other more efficient, less bureaucratic methods of making the public services more accountable. These are likely to be quasi-political: referendums, focus groups, lay representation on professional bodies; indeed many of them are present in the current fragmented and diverse policy process.

A more persuasive meta-theoretical argument for keeping all this reductionist quantification is that quantification provides a common language that is relatively divorced from politics, values and ideology. Proponents of this idea say that there has to be some attempt to provide a transparent standard measure of things, albeit that the measurement is often partly unreliable and invalid.

The converse argument was made by neo-Marxists in the 1970s about local corporate planning: the technical language of government was just another smokescreen to divert the public from their real needs and to focus on resource rationing. This implies that quantitative arguments are never rational-objective, but always ideological and subjective.

This leads on to a key point about allowing those at the bottom of the policy process, front-line workers and service users in particular, to be involved in the selection of measurements, as this is likely to increase the policy relevance of the measures used. The recent work by Dewson et al. (2000) at the IES on soft outcomes and distance travelled seems to be helpful in this respect. It proposes quantitative performance targets that are relevant to local agencies and that can contribute to a growing collection of policy-relevant data at all levels of the policy process. Recent work at the HSPRC at the University of Brighton (Cooper, Haynes, Williamson and Ndebele, 2000) with the parents of children with complex problems also found that service users had different ideas about the measurement of positive outcomes than the Department of Health's national performance measurement programme.

Complexity theorists should be excited by the potential of living in a rich information age, where so much data is available. Research should not be about closing down alternatives, but about opening them up, given a realistic acknowledgement of the difficulty of providing lasting answers. Methodological pluralism is an important part of the complexity strategy.

 

To conclude - how quantitative research might be used to aid an understanding of the complex realities of policy management

Large-scale quantitative descriptive accounts are extremely useful in the study of social and public policy. This is especially so where such descriptive data can be collected over time.

Complex data does not necessarily need complex methods. The presentation of complex interconnected data can sometimes best be done with simple methods. The initial exploration of large amounts of data, and the description of data, can be a vital part of an inductive approach (see Hogwood, 1992). Methodological pluralism is needed, but social policy would do well to evolve with more resources dedicated to the analysis of temporal and spatial data.

Needs analysis remains a key part of social policy. It is a principle of the discipline to understand what social needs are, and to present large-scale quantitative descriptions of what these needs mean. This seems particularly important in relation to social exclusion and poverty, where social policy is rediscovering its historical link with the quantitative study of such issues. These are also continental and global issues. The measurement of social needs, and arguments about these needs, should be central to social policy. This requires quantitative exploration and explanation.

The importance of feedback and holism to understanding society implies that the quantity of things does matter. It is difficult to quantify things and especially the interconnection of things, but this does not mean that social policy academics and students should give up.

In terms of social policy teaching, courses need to make sure that students know how to analyse and synthesise through quantitative exercises, where they can locate the growing stocks of data, and how to present arguments with this data.

 

A new complex understanding of quantitative policy management

The meta questions about the nature of social feedback offer an opportunity to rediscover the relationship between research and policy, and between research and action. Complexity offers an opportunity to move from reductionist questions to holistic questions. What kind of society do we have, and what kind do we want? These are relevant questions for social policy to ask.

Education is now assumed to be about reaching certain measured abilities in specific subjects, but is it not also about creating happy and confident people who are able to realise their own unique contribution to society? The two aims may be related, but they may also be in conflict. The latter global outputs would require measures of children's well-being, attitudes and hopes for the future; they are not part of the current performance management equation. The fact that increases in suicide and depression are a feature of young adult life, and of serious concern to health experts, needs feeding back into the DfEE's performance assessment of education. There are other areas of social policy where the focus is currently too narrow.

The policy arena needs a renewed interest in synthesis rather than just analysis. There have been numerous books and articles in the last decade about policy analysis, but few carry the title 'policy synthesis'. This tends to reinforce a top-down and reductionist approach. What we must avoid is a narrow approach to social outcomes that fails to make adequate assessment of the interconnectedness of things. Thus the complexity economist Paul Ormerod (1998, p. 186) argues that good policy is 'not a matter of detailed short-term interventions and targets but of creating the right overall environment'.

 

Acknowledgements

I would like to thank those who attended the SPA complexity study day at Salford University in March 2001 for their valuable feedback on my paper. I am also grateful for the comments received from the referee for this paper.

 

BIBLIOGRAPHY

Boyne, G.A. and Powell, M. (1993) 'Territorial Justice and Thatcherism', Environment and Planning C: Government and Policy, vol. 11, pp. 35-53.

Brown, T.A. (1997) 'Measuring Chaos Using the Lyapunov Exponent', in Kiel, L.D. and Elliot, E. (eds.) Chaos Theory in the Social Sciences: Foundations and Applications. Ann Arbor: University of Michigan Press.

Byrne, D. (1998) Complexity and Social Science. London: Routledge.

Byrne, D. (1997) 'Complexity Theory and Social Research', Social Research Update 18. Guildford: Department of Sociology, University of Surrey.

Cartwright, T.J. (1991) 'Planning and Chaos Theory', Journal of the American Planning Association, vol. 57, no. 1, pp. 44-56.

Central Advisory Council for Education (1967) Children and their primary schools, vols 1 and 2. London: HMSO.

Cilliers, P.  (1998) Complexity and Postmodernism. London: Routledge

Cooper, M., Haynes, P., Williamson, V. and Ndebele, D. (2000) An Evaluation of the Brighton and Hove Attachment Project. Brighton: HSPRC.

Dale, A. and Davies, R.B. (1994) Analysing Social and Political Change: A Casebook of Methods. London: Sage.

Dewson, S., Eccles, J., Djan Tackey, N. and Jackson, A. (2000) Guide to Measuring Soft Outcomes and Distance Travelled. Brighton: Institute for Employment Studies. http://www.employment-studies.co.uk/

Dorling, D. (1995) A New Social Atlas of Britain. Chichester: Wiley.

Eve, R.A., Horsfall, S. and Lee, M.E. (eds.) (1997) Chaos, Complexity and Sociology: Myths, Models, and Theories. London: Sage.

Goldstein, H. (1994) 'The use of regression analysis for resource allocation by central government', Environment and Planning C, vol. 12, no. 1, pp. 15-22.

Harvey, D.L. and Reed, M. (1997) 'Social Science as the Study of Complex Systems', in Kiel, L.D. and Elliot, E. (eds.) Chaos Theory in the Social Sciences: Foundations and Applications. Ann Arbor: University of Michigan Press.

Haynes, P. (1999) Complex Policy Planning. Aldershot: Ashgate.

Hills, J. (2000) Income and Wealth – the latest Evidence. York: Joseph Rowntree Foundation http://www.jrf.org.uk/knowledge/findings/socialpolicy/spr368.htm

Hogwood, B. W. (1992) Trends in British Public Policy. Buckingham: Open University Press

Kiel, L.D. (1994) Managing Chaos and Complexity in Government: A New Paradigm for Managing Change, Innovation, and Organisational Renewal. San Francisco: Jossey-Bass.

Kiel, L.D. and Elliot, E. (eds.) (1997) Chaos Theory in the Social Sciences: Foundations and Applications. Ann Arbor: University of Michigan Press.

Kontopoulos, K. (1993) The Logics of Social Structure. Cambridge: Cambridge University Press.

Ormerod, P. (1998) Butterfly Economics. London: Faber.

Owen, C. (1990) 'The Fairness of the Standard Spending Assessments Underlying the Community Charge', Local Government Studies, Nov/Dec, pp. 63-76.

Parker, D. and Stacey, R. (1994) Chaos, Management and Economics: The Implications of Non-Linear Thinking. London: IEA.

Rouse, J. (1999) 'Performance Management, Quality Management and Contracts', in Farnham, D. and Horton, S. (eds.) Public Management in Britain. London: Macmillan.

Sanderson, I. (2000) 'Evaluation in Complex Policy Systems', Evaluation, vol. 6, no. 4, pp. 433-454.

Sardar, Z. and Abrams, I. (1999) Introducing Chaos. Cambridge: Icon Books.

Sayer, A. (1992) Method in Social Science: A Realist Approach. London: Routledge.

Senior, M. (1994) 'The English Standard Spending Assessment: an assessment of methodology', Environment and Planning C, vol. 12, no. 1, pp. 23-52.

Sheppard, E. (1996) 'Site, situation and social theory', Environment and Planning A, vol. 28, pp. 1339-1344.

Sokal, A. and Bricmont, J. (1998) Intellectual Impostures. London: Profile Books.

Talib, A.A. (2001) 'Formula Based Allocation of Public Funds: The Case of Higher Education Research Funding', Public Money and Management, vol. 21, no. 1, pp. 57-64.

Townsend, P. (1979) Poverty in the United Kingdom: A Survey of Household Resources and Standards of Living. London: Allen Lane.

 

About the author

Philip Haynes is a Senior Lecturer in Social Policy at the University of Brighton, Health and Social Policy Research Centre. He is author of Complex Policy Planning: The Government strategic management of the social care market. His key research interest is applying complexity theory to public policy and public management issues. He is production editor of Social Issues and has a strong interest in information technology and its application to higher education teaching and research. Philip is a member of the University of Brighton’s Learning Technology Support Project.
