
Of RAGs and riches: indicators of public health in the Public Health Outcomes Framework

“Are we there yet?” You don’t have to travel very far with small children before you are asked this question. In fact, a survey published in March 2013 reported that children ask their mothers around 300 questions every day, about why the sky is blue, why water is wet, and why there are no more dinosaurs.

Good public health intelligence works along much the same principle of asking lots of questions - not just about the state of a population’s health but also about the factors that help shape the health of both individuals and communities. However, once you have asked questions, you need to find some way of finding answers, which was the subject of an earlier post. One example of a system that helps answer such questions is the Public Health Outcomes Framework (PHOF).

PHOF was born out of the need to understand how well we are improving and protecting public health. Its vision is to increase life expectancy and to reduce differences in life expectancy and healthy life expectancy between communities. Sixty-eight indicators were developed with expert input from key users to provide appropriate measures of how long and how well people are living. These indicators are grouped into four domains.

There are also two ‘overarching indicators’: one on healthy life expectancy, and one on differences in life expectancy and healthy life expectancy between communities.

The PHOF data tool was launched in November 2012; updates are published as official statistics quarterly in February, May, August and November each year. The tool allows you to compare a local authority against the England average and against other local authorities in the region. To date, the tool has had 35,000 unique visitors. Data are being added that allow monitoring over time and new indicators are included as these are developed. For instance, the tool now gives you answers to questions such as how many excess winter deaths there are and the number of children living in poverty.

However, any parent or teacher will tell you that answering questions is sometimes a challenge. The PHOF data tool contains a wealth of data that can be used to help understand local health patterns. But it can only help if data are presented in a way that is meaningful, unambiguous and easily interpreted, not just by public health professionals with expertise in data analysis, but also by those who now have responsibilities for public health, such as local councillors. One particular challenge is the best way of displaying the value of a measure when it is compared with, for example, the overall value for England.

One way of achieving this is by colour coding the indicators. To date, PHOF has used a blue and orange coding to give an indication of a measure’s value in comparison with the England average. However, this system has proved difficult for people to interpret. For example, a value higher than England for infant mortality indicates that the area’s value is ‘worse’ than England, but a value higher than England for healthy life expectancy indicates that the area’s value is ‘better’ than England.

A number of resources published by Public Health England use the RAG rating system – RAG simply standing for Red, Amber, Green. Look at Longer Lives or the annual Health Profiles and you will see that the indicators are marked in red where they have a worse than average value, amber where they are much the same, and green where they have a better than average value. RAG is in common usage, is widely understood, and can usually be interpreted at a glance without detailed explanations – all important requirements of a publicly available data tool. Since 73% of responses in a feedback survey were strongly in favour of the PHOF data tool changing to RAG, the PHOF team has been actively investigating its implementation. This means resolving issues such as that presented by the diabetes indicator: does a higher than England average value for recording of diabetes mean that GPs in the area are very good at identifying people who have diabetes, or that the area has more people with diabetes than is average?

In the meantime, the feedback survey has led to other new features in the tool. For example, you can now compare local authorities in different geographic groupings, such as by PHE Centre or the Office for National Statistics classification for local authorities.

But the development of a data tool is a two-way process. Understanding how you use the tool and what you need from it will help us improve its value to you. As well as listening to your questions, the PHOF team therefore has some questions of its own:

How do you make use of the tool? Do you only view the data online or do you use the download options?

When do you use the tool? Constantly? Only when the tool is updated?

The team would love to hear from you at or via Twitter @phoutcomes.


Are we there yet? No.

Does the PHOF data tool have all the answers? Of course not.

Is there more the data tool could do? Undoubtedly.

But, by bringing together such a wide range of factors affecting health, the tool helps draw a picture of the current state of health and whether or not it is improving. The more data that are included and the better the presentation, the richer this picture will become.

And how is health in your area? Check for yourself at:

By the way, why is water wet?

Sharing and comments



  1. Comment by Meic Goodyear posted on

    About 10% of males and 1% of females are red-green colour blind. RAG is the worst possible colour combination to use, something that the leading data presentation gurus such as Stephen Few are quite vocal about.
    The orange and two shades of blue scheme is a bit confusing, as there's no intuitive meaning to the colours. The Health Survey for England makes do with just shades of blue, and that's very successful in its presentation.

  2. Comment by Bren posted on

    Hello Justine, Clare and Heather,

    A great insight into what is a good and developing tool. I guess another part of the measurement tool, and of any measurement tool, is what precisely we are measuring, and of course what we are going to do with this data once collected. I think the RAG rating is helpful as a guide, but my thought is: is it a strong green, or just a green that has scraped over the amber? The other aspect is how many indicators are needed, and can some indicators be added or lost along the way that may be more or less relevant to the direction of travel for PHE/delivering to the people and communities.

    If we are going to deliver evidence based practice and decision making, it is always helpful to benchmark to comparative areas, which is not an easy thing to do.

    A really thoughtful blog and some questions for us all to think about.

    Kind regards and thank you,