Content provided by Office for National Statistics and Statistically Speaking. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Office for National Statistics and Statistically Speaking or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

Communicating Uncertainty: How to better understand an estimate.

The ONS podcast returns, this time looking at the importance of communicating uncertainty in statistics. Joining host Miles Fletcher to discuss is Sir Robert Chote, Chair of the UKSA; Dr Craig McLaren, of the ONS; and Professor Mairi Spowage, director of the Fraser of Allander Institute.

Transcript

MILES FLETCHER

Welcome back to Statistically Speaking, the official podcast of the UK’s Office for National Statistics. I'm Miles Fletcher and to kick off this brand new season we're going to venture boldly into the world of uncertainty. Now, it is of course the case that nearly all important statistics are in fact estimates. They may be based on huge datasets calculated with the most robust methodologies, but at the end of the day they are statistical judgments subject to some degree of uncertainty. So, how should statisticians best communicate that uncertainty while still maintaining trust in the statistics themselves? It's a hot topic right now and to help us understand it, we have another cast of key players. I'm joined by the chair of the UK Statistics Authority Sir Robert Chote, Dr. Craig McLaren, head of national accounts and GDP here at the ONS, and from Scotland by Professor Mairi Spowage, director of the renowned Fraser of Allander Institute at the University of Strathclyde. Welcome to you all.

Well, Sir Robert, somebody once famously said that decimal points in GDP figures are an economist's way of showing they've got a sense of humour. And while that's quite amusing - particularly if you're not an economist - there's an important truth in there, isn't there? When we say GDP has gone up by 0.6%, we really mean that's our best estimate.

SIR ROBERT CHOTE

It is. I mean, I've come at this having been a consumer of economic statistics for 30 years in different ways. I started out as a journalist on the Independent and the Financial Times writing about the new numbers as they were published each day, and then I had 10 years using them as an economic and fiscal forecaster. So I come at this very much in the spirit of a consumer, and am now obviously delighted to be working with producers as well. And you're always, I think, conscious in those roles of the uncertainty that lies around particular economic estimates. Now, there are some numbers that are published once, and you are conscious that that's the number that stays there, but there is uncertainty about how accurately it reflects the real-world position, and that's naturally the case. Then you have, in particular, the world of the national accounts, which are numbers where you have initial estimates that the producer returns to and updates as the information sets available to draw your conclusions develop over time. And it's very important to remember, on the national accounts, that that's not a bug, that's a feature of the system. What you're trying to do is measure a very complicated set of transactions, and you're trying to do it in three ways: measuring what the economy produces, measuring incomes, and measuring expenditure. You do that in different ways with information that flows in at different times. So it's a complex task, and necessarily the picture evolves.
So I think from the perspective of a user, it's important to be aware of the uncertainty, and it's important when you're presenting and publishing statistics to help people engage with that, because whether you are making decisions based on statistics or simply trying to gain an understanding of what's going on in the economy or society, generally speaking you shouldn't be betting the farm on the assumption that any particular number is, as you say, going to be right to decimal places. And the more that producers can do to help people engage with that in an informed and intelligent way, the better informed the decisions people take on the basis of these numbers will be.

MF

So it needs to be near enough to be reliable, but at the same time we need to know about the uncertainty. So how near is the system at the moment, as far as these important indicators are concerned, to getting that right?

SRC

Well, I think an awful lot of effort goes into ensuring that you are presenting, on the basis of the information set that you have, the best available estimates that you can. A lot of effort goes into thinking about quality and quality assurance when these are put together, and into the communication - how they mesh in with, for example, the rest of the economic picture that you have - so you can reasonably assure yourself that you're providing people with the best possible estimate at any given moment. But at the same time, you want to guide people by saying, well, this is an estimate; there's no guarantee that it's going to exactly reflect the real world. The more you can do to put some sort of numerical context around that, the more reliable a basis you give the people who are using those numbers - particularly, as I say, in the case of those statistics that may be revised in future as you get more information. You can learn things from the direction and the size of revisions to numbers that have happened in the past, in order to give people a sense of how much confidence they should place in any given number produced at any given point in that cycle of evolution, as the numbers get firmer over time.

MF

If you're looking to use the statistics to make some decision in your business or personal life, where do you look for the small print? Where do you look for the guidance on how reliable this number is going to be?

SRC

Well, there's plenty of guidance published in different ways.
It depends, obviously, on the specific statistics in question, but I think it's very important for producers to ensure that when people come, for example, to websites or to releases that carry the headline numbers that are going to be reported, it's reasonably straightforward to get to a discussion of where these numbers come from, how they are calculated, and what degree of uncertainty lies around them. Not everybody is going to have an appetite for the technical discussion there, but providing it in a reasonably accessible, reasonably findable way is important. And I think a key principle is that if you're upfront about explaining how numbers are generated, and about the uncertainty that lies around them in as quantified a way as you can, that actually increases and enhances trust in the underlying production and communication process, and in the numbers, rather than undermining it. I think you have to give the consumers of these numbers, by and large, the credit for understanding that these things are only estimates, and that if you're upfront about that, and you talk as intelligently and clearly as you can about the uncertainties - the potential for revision, for example - then that enhances people's confidence. It doesn't undermine it.

MF

You mentioned there about enhancing trust, and that's the crux of all this. At a time, we're told, of growing public mistrust in national institutions and so forth, isn't there a risk that the downside of talking more about uncertainty in statistics is that the more aware people become of it, the less those statistics are going to be trusted?
SRC

I think in general, if you are clear with people about how a number is calculated, the uncertainty that lies around it, the potential for revision, and how things have evolved in the past, that - not for everybody, but for most people - is likely to enhance their trust and, crucially, their understanding of the numbers that you're presenting and the context that you're putting around them. In making that available, as I say, you have to recognise that different people will have different appetites for the technical detail. There are different ways of presenting the uncertainty, not only around outturn statistics but, in my old gig, around forecasts of where things are going in the future, and testing those out with your users, as to what they find helpful and what they don't, is a valuable thing to be doing.

MF

You've been the stats regulator for a little while now. Do you think policymakers, perhaps under pressure to achieve certain outcomes, put too much reliance on statistics when it suits them, in order to show progress against some policy objective? I mean, do the limitations of statistics sometimes go out of the window when it's convenient? What's your view of how well uncertainty is being treated by those in government and elsewhere?

SRC

Well, I think certainly in my time as a forecaster, you were constantly reminding users and consumers of forecasts that, again, they're based on the best available information set that you have at the time. You explain where the judgements have come from. But in particular, if you're trying to set policy in order to achieve a target for a particular statistic at some point in the future - for example, a measure of the budget deficit - then having an understanding of the uncertainty, its nature, and its potential size in that context helps you avoid making promises that it's not really in your power to keep with the best will in the world, given those uncertainties.
And sometimes that message is taken closer to heart than at other times.

MF

Time, I think, to bring in Craig at this point, as head of national accounts and the team that produces GDP at the ONS, to talk about uncertainty in the real world of statistical production. With this specific example, Craig, you're trying to produce a single number, one single number that sums up progress, or lack of it, in the economy as a whole. What do you do to make the users of the statistics and the wider public aware of the fact that in GDP you're producing one very broad estimate with a lot of uncertainty built in?

CRAIG MCLAREN

Thanks, Miles. I mean, firstly, the UK economy - incredibly complex, isn't it? In the last set of numbers, we've got 2.7 trillion pounds' worth of value. So if you think about how we bring all of those numbers together, then absolutely what we're doing is providing the best estimate at the time, and then we start to think about this trade-off between timeliness and accuracy. Even when we bring all of those data sources together, we often balance between what we can understand at that point in time and, equally, as we get more information from businesses and our data suppliers, we evolve our estimates to understand more about the complex nature of the UK economy. Where and how we do that is by looking quite closely at our data sources. For example, we run a lot of surveys of businesses, and data provided by businesses can come with a little bit of what we call a time lag. Clearly, when we run our monthly business surveys, that's quite timely - we get that information quite quickly. But when we want to understand more detail about the UK economy, we have what we call structural surveys, which are like our annual surveys. So over time, it can take us a couple of years actually to get a more complete picture of the UK economy. In that time, absolutely, we may revise the estimate.
Some businesses might say, well, we forgot about this, we're going to send you a revised number. We look quite closely at the interplay between all the dynamics of the different parts of the economy, and then we confront the data set. So by bringing all this information together, both on timeliness and as we get a more complete picture, we start to refine our estimates. In practice, as we evolve our estimates, we can monitor that. We look quite closely at the revisions to GDP, and we can produce analysis that helps our users understand those revisions. We focus quite heavily on the need for rapid information that helps policymakers - how can policymakers take this in a short period of time - but then we provide this information on what we would call the revision properties: how our estimates can change and evolve over time as we get additional information.

MF

So let's just look at the specifics, just to help people understand the process and how you put what you've just explained so well into action. Craig, the last quarterly estimate of GDP showed the economy contracted slightly.

CM

That's exactly right, Miles, and where we do produce our estimates on a timely basis, absolutely they will be subject to revision as we get more information. This is why it's important, perhaps, not to focus on just a single estimate. And I know that in our most recent year, when the economy is all pretty flat, for example, or there's a small fall, we do have a challenge in our communication. That comes back a little bit to the user's understanding of how these numbers are compiled, and also perhaps how additional information can be used as part of that. As I mentioned, the UK economy is very complex; GDP is a part of that, but we also have other, broader indicators as well.
So when we do talk about small movements in the economy, we do need to think about the wider picture alongside that.

MF

Okay, so the last quarterly estimate - what was the potential for revision there? Just how big could that have been?

CM

We don't formally produce what we call range estimates at the moment. We are working quite closely with colleagues on how we might do that. If you think about all the information that comes together to produce GDP, some of it is survey-based, which will have a degree of error around it, but we also use administrative data sources as well. We have access to VAT records - anonymised, of course - which we bring into our estimates. So the complex nature of the 300 different data sources that we bring in to make GDP means that producing a range can be quite a statistical challenge. What we can do is look at our historical record of GDP revisions, and by doing that we find that, in perhaps normal times, they are quite unbiased. By that, I mean we don't expect the revisions to be significant either way. We may revise up by perhaps 0.1 or down by 0.1, but overall it's quite a considered picture, and we don't see radical revisions to our first estimates over time.

MF

You're saying that when revisions happen, they are as likely to be up as they are to be down, and there's no historical bias in there either way - because presumably, if there was a detectable bias, you would have acted some time ago to make sure it was removed from the methodology.

CM

Exactly. Exactly.
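The kind of revision analysis Craig describes can be sketched in a few lines. The figures below are invented for illustration - they are not real ONS data - but they show how comparing first estimates with later, more mature ones reveals whether the early numbers are biased:

```python
# Hypothetical revision analysis: compare first quarterly GDP estimates
# with the "mature" estimates published once later data has arrived.
# All figures below are illustrative, not real ONS series.

first = [0.3, -0.1, 0.2, 0.5, 0.0, 0.4, -0.2, 0.1]   # first estimates (%)
mature = [0.4, -0.2, 0.2, 0.4, 0.1, 0.5, -0.1, 0.1]  # later estimates (%)

# Revision = mature estimate minus first estimate, quarter by quarter.
revisions = [m - f for f, m in zip(first, mature)]
mean_revision = sum(revisions) / len(revisions)

# An unbiased process revises up about as often as down, so individual
# revisions of +/-0.1 average out to a mean revision near zero.
print(round(mean_revision, 3))
```

If the mean revision drifted persistently away from zero, that would be the detectable bias Miles mentions, and a signal to adjust the methodology.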

MF

Just staying with this whole business of trying to make a very fast estimate because it is by international standards, a fast estimate of a very, very big subject. How much data in percentage terms would you say you’ve got at the point of that first estimate as a proportion of all the data you're eventually going to get when you produce your final number?

CM

It does depend on the indicator, Miles. The UK is one of the few countries in the world that produces monthly GDP, so we are quite rapid in producing it. Robert mentioned in the introduction that with monthly GDP we use an output measure - information we have quite quickly from businesses. So our monthly GDP estimate is based on one of the measures of the economy, the output measure. We get that from very rapid surveys, and it has quite good coverage - around 60 or 70% that we can get quite quickly. But then, as we confront that with our different measures of GDP, the other sources come in. We have our expenditure measure, which takes a bit longer, and then we have our income measure as well. So we have this process in the UK, starting from a monthly GDP which is quite rapid; we then bring in additional data sources, and each of these measures has its own strengths and weaknesses, until we can finally confront them fully in what we call an annual framework. Often it takes a couple of years to fully bring together all those different data sources, so we can see the evolution of our GDP estimates as additional data comes in.

MF

Now looking back to what happened during the pandemic, of course, we saw this incredible downturn in the economy as the effects of lockdown took hold - international travel shuddered to a halt for a while, and everyone was staying at home for long periods. The ONS said at that point it was the most significant downturn it had ever recorded. And that was closely followed, of course, when those restrictions were eased, by the most dramatic recovery ever recorded. Just how difficult was it to precisely measure the sheer scale of that change, delivered over quite a short period, relatively speaking? Just how good a job did the system do under those very testing circumstances?
CM

It was incredibly challenging, and not just for official statistics, of course, but for a range of outputs as well. Viewing it in context now: when the economy is growing relatively stably, at perhaps a 0.1 or 0.2 change, we might start to be a bit nervous if we saw some revisions to that. But at the time there was, I believe, around a 20% drop in activity, and there was the challenge of ensuring that our surveys were capturing what was happening in the UK economy. In the ONS we stood up some additional surveys to provide us with extra information so we could understand what was happening - we still have that survey; it's a fortnightly survey. The challenge we had was to try to get the information in near real time to give us confidence, and also to obtain information from businesses that were not at their place of work, so they weren't responding to our surveys. We had to pivot to using the telephone, collecting information in a different way, really to understand the impact on the economy. Looking back now, in retrospect - a 20% drop: should that have been 21 or 22%? It's all relative to the size of the drop, is the main point I would make. In the context of providing the information at the time, we were quite fortunate on the data collection front to have a world-leading survey of businesses that provided information in near real time, which we could then use to understand the impacts on different parts of the UK economy. And now, when we get new information on an annual basis, we can go back, confront that data set, and understand how reliable those estimates were.
MF

Of course, the UK was not alone in subsequently making some quite significant revisions to its initial estimates. What was done at the time, though, to let the users of the statistics know that, because those circumstances were so unusual and the pace of change you were seeing was so dramatic, there was perhaps a need for special caution around what the data seemed to be saying about the state of the economy?

CM

Exactly, and it was unprecedented, of course, as well. So in our communication - and coming back to how we communicate statistics, and the understanding of them as well - we added some additional phrasing, if you like, Miles, to ensure people did understand, and perhaps acknowledge, the fact that in times like this there is an additional degree of uncertainty. The phrasing becomes very important, of course, to reflect that these are estimates - they're our first estimate at the time, and they will perhaps be revised more than we would typically expect. So the narrative, communication and phrasing, and the use of the term 'estimate', for example, became incredibly important during the pandemic. And it's also incredibly important in the context of smaller movements as well. While we had this large impact from COVID, it was our best estimate at the time, and I think it's important to reflect that; as we get more and more understanding of our data sources, those numbers will be revised. So what we did do was make sure that was front and centre of our communications, just to reflect the fact that there can be additional information after the fact, but this is the best estimate at the time and there's a degree of uncertainty. And we've continued that work, working closely with colleagues in the regulator, to understand how best we can continue to improve the way we communicate uncertainty in what is a complex compilation process as well.

MF

Professor Mairi Spowage, you've heard Sir Robert talking earlier about the importance of understanding uncertainty in statistics and the need to make sure our statistical system can deal with it and explain it to people properly. You've heard Craig explain, from a production point of view, the lengths to which the ONS goes to deal with the uncertainty in its initial estimates of GDP, and the experience of dealing with those dramatic swings around the pandemic. What is your personal take on this, from your understanding of what the wider public and the users of economic statistics have a right to expect? What do you make of all that?

MAIRI SPOWAGE

I think I'd just like to start by agreeing with Robert that explaining uncertainty to users is really important. In my view - and certainly in some research that colleagues of mine at the Economic Statistics Centre of Excellence have done - it actually increases confidence in statistics, because we all know that GDP statistics will be updated as more information comes in, and these updates are presented as revisions to the initial estimates. And I think the more you can do to set users' expectations that this is normal, and a core part of estimating what's going on in the economy, the better, when these revisions inevitably happen. We very much see ourselves as not just a user of statistics but also, I guess, a filter through which others consume them. We discuss the statistics that the ONS produces a lot, and we like to highlight, for example, when it's a first estimate and more information will be coming in, or where revisions have happened. And particularly when you're quite close to zero, as we've been over the last year or so, folks can get quite excited about being slightly above or below zero, but generally the statistics are in the same area, even though they may be slightly negative or slightly positive.

MF

Yes, and I'd urge people to have a listen to our other podcast on the whole subject of 'what is a recession' to perhaps get some more understanding of just how easily these so-called technical recessions can in fact be revised away. So overall then, Mairi, do you think the system is doing enough - particularly on the subject of GDP, of course, because we've had this really powerful example recently - to communicate the inherent uncertainty in those early estimates? Do people appreciate it, or could we be doing more?

MS

Yeah, absolutely. Obviously, there are different types of uncertainty, and the way you can communicate and talk about uncertainty when you're producing GDP statistics is slightly different to the way you might talk about things like labour market statistics. I know there are a lot of issues with labour market statistics at the moment, but the issue with labour market statistics in normal times is really that they're based on a survey, which therefore has an inherent uncertainty due to the sampling that has to be done. And it might mean that a seeming increase in, say, unemployment from one quarter to the next isn't actually a significant difference. Whereas with GDP, it's much more about the fact that the first estimate draws on only a small proportion of the data that will eventually be used to estimate what's happened in the economy in that period, and over time we'll be building it up. I think the ONS is doing a good job in trying to communicate uncertainty in statistics, but I think we could always do more. Having statisticians come on and talk about the statistics, pointing these things out proactively, is a good idea - so much more media engagement is definitely a good idea. As I said, we try, through informal means like blogs and podcasts like this, to talk about the data that have been produced and, when there are interesting features driving some of the changes, to what extent those might change. One of the features over the last year, for 2023, has been the influence of things like public sector strikes on the data, because when there's less activity in the public sector, that also changes the profile of growth over the year quite a lot, and that's been very influential over 2023. So I think it's important that there's more discussion about this and, to be honest, more knowledge in economic circles about how these statistics are put together.
You know, I'm an economic statistician rather than an economist per se, and I think the more knowledge and awareness there can be amongst economic commentators on these issues, the better - because if we're upfront about the uncertainty, I think it increases confidence when these revisions inevitably happen.
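The sampling point Mairi makes about labour market statistics can be illustrated with a quick significance check. The rates and standard errors below are invented for illustration - they are not real Labour Force Survey figures - but the arithmetic is the standard one for comparing two independent survey estimates:

```python
import math

# Illustrative unemployment rate estimates (%) for two quarters, with
# sampling standard errors from the survey design. Figures are made up.
rate_q1, se_q1 = 4.2, 0.15
rate_q2, se_q2 = 4.4, 0.15

diff = rate_q2 - rate_q1
# Assuming the two samples are independent, the standard error of the
# difference combines the two standard errors in quadrature.
se_diff = math.sqrt(se_q1**2 + se_q2**2)

# 95% confidence interval for the change; if it spans zero, the apparent
# rise in unemployment is not statistically significant.
lo, hi = diff - 1.96 * se_diff, diff + 1.96 * se_diff
print(round(lo, 2), round(hi, 2))
```

Here a headline rise of 0.2 percentage points sits inside an interval that spans zero, which is exactly the situation Mairi describes: the quarter-on-quarter change may not be a real difference at all.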

MF

Perhaps, then, it's the way the statistics are reported in the media and elsewhere - they're invested by those observers with more authority, perhaps, than they deserve. And of course, it must be very tempting, if you're a politician and the numbers are going your way, to want people to believe they are absolutely 100% accurate.

MS

Absolutely. We're in a funny situation at the moment. I mean, our research institute focuses a lot on the Scottish economy, and the data for Scotland for 2023 shows... Yes, it shows two quarters of contraction and two quarters of growth, but they're not joined together, so there wasn't a technical recession in Scotland. But over the year, basically, the Scottish and UK economies have both had a really poor year with hardly any growth. Now, I haven't seen it yet, but I'm expecting there will be some people crowing about that, as if it really shows the Scottish economy doing better or something, when it's not really. There will always be politicians who try to over-interpret changes in the data. Another example: the first estimates of quarterly growth in the first part of 2023 showed 0.4% growth in Scotland compared to 0.1% in the UK, and there were politicians saying that Scotland was growing four times as fast as the UK. These things will happen, but one of our roles, to be honest, in our regular blogs and communications with the policy community - particularly in Scotland, but also beyond - is to point these things out and say that they're a bit silly; that no doubt these estimates will be revised and come closer together, and nobody should get too excited about them.
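The fragility of that "four times as fast" claim can be sketched numerically. The ±0.2 percentage-point revision band below is an assumption chosen for illustration, not an official figure for either series:

```python
# Why "Scotland grew four times as fast" (0.4% vs 0.1%) is a fragile claim.
# The +/-0.2 percentage-point band is an illustrative assumption about
# plausible revisions, not an official figure.
scot, uk, band = 0.4, 0.1, 0.2

headline = scot / uk                      # the "four times as fast" ratio
scot_range = (scot - band, scot + band)   # plausible range for Scotland
uk_range = (uk - band, uk + band)         # plausible range for the UK

# The two ranges overlap: Scotland could plausibly sit at 0.2 while the UK
# sits at 0.3 - i.e. Scotland growing more slowly - so the headline ratio
# carries almost no information.
overlap = scot_range[0] <= uk_range[1]
print(headline, overlap)
```

The point is the one Mairi makes: dividing two small, noisy quarterly growth rates produces a ratio that revisions can easily overturn, or even flip in sign.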

MF

Thinking particularly about levels of geography different from the UK as a whole - for yourselves in Scotland, and from where I'm sitting here in Wales as well, for that matter - do the data tend to become more or less accurate? Should we have more or less confidence in the sort of datasets we're seeing at those different levels of geography?

MS

Well, generally it becomes less reliable and subject to more uncertainty. A lot of the data used to estimate what's going on in the economy is based on business surveys, and there are two areas of uncertainty there. The samples at smaller geographies are smaller, so there's greater uncertainty because of sampling variability. But there's also a key problem with the data infrastructure in the UK: business data - this is across GB, because Northern Ireland's is collected slightly differently - is collected on units which are GB-wide. That makes estimating what's going on in the parts of GB quite challenging, and some additional estimation procedures need to be applied to actually say what's going on in Scotland or in Wales. So it does add an additional layer of uncertainty to any sort of economic estimation at sub-UK geographies.

MF

I should add at that point that improving the quality of regional, sub-national data has always been an important part of the ONS's work and continues to be part of its strategic drive. But Sir Robert, from what you've seen recently, particularly over the last year - the way that GDP estimates have been used in the media and in politics, and particularly the whole business of comparing quite small differences in GDP change internationally, and the significance that's invested in the relative growth rates between one country and another - has there been too much discussion around that? Has too much weight been put on it recently, from where you've been sitting?

SRC

Well, just to pick up on the point that Mairi was making, you can end up investing much too much significance in comparisons of what's going on in one place and in another over a relatively short period of time, in which there's likely to be noise in those numbers. So, as she said, taking one area where the growth is 0.4 in a quarter and another where it's 0.1 and saying that one economy is growing four times more quickly, while strictly true on the basis of those numbers, is really not an informative comparison; you have to look over a longer period for both. When you get to international comparisons, there's the additional issue that, although there are international standards and best practices as to how, for example, national accounts are put together, the way this is actually carried out can differ from place to place in ways that make those sorts of comparisons - again, particularly over short periods, but also when the economy is doing strange things, as it was during the pandemic - particularly tricky. In the GDP context, obviously, there was the question mark over big changes in the composition of what the education and health sectors were doing as we went into the period of lockdown, and therefore judging how the output of those sectors had changed was a really very tricky conceptual judgement to make. And one of the issues that arose in trying to make international comparisons is that different statistical offices will be doing that in different ways, depending in part on how they measure the outputs of health and education under normal circumstances. So if you are going to do international comparisons, it's certainly better to look over a longer period.
So you're avoiding being misled by short-term noise, but also having a care to the way in which methodologies may differ - and that may matter more at some times than at others - in seeing whether this is actually a meaningful comparison of like for like.

MF

It's also worth pointing out, as I think we have in previous podcasts, that the UK is one of the few economies that does actually seek to measure the actual output of public services, whereas some countries just make broad assumptions about what those sectors have been doing. But it's also worth mentioning, I think, that a number of countries simply don't revise as much as we do, because their system makes an initial estimate and then doesn't return to it for some years.

SRC

Yes, that's true. And so the question sometimes arises - and I think this happened relatively recently in the UK context - that a set of revisions changes the international comparison, but you know that some countries have not yet essentially done the same set of revisions: for example, the way in which you try to pull together the estimates of output, income and expenditure some time afterwards, as you have more information from annual surveys and more information on incomes, for example, from the tax system. So, again, at any given moment, even though in both cases you're trying to say what's our best sense of what was going on a year ago, different countries will be at different stages of the statistical production process, and of the proportion of the eventual total information set on which you base your estimates, some countries will have incorporated more of that than others. A revision that you're doing this month, somebody else may not do for six months, and that again complicates the picture, and really again suggests that if you're looking at international comparisons at too high a frequency, or too much in the recent past, there are bigger uncertainties and caveats that you ought to be placing around big calls and big interpretations based on that.

MF

Yes, and while it's hard enough to know where you are at any given point in the economy, it's even harder of course - infinitely harder, you might say - to work out where on earth you're going to go next. You've spent a lot of time in the forecasting business, and I know the Bank of England in particular is taking a good look at the moment at the data it relies upon in order to make its forecasts. What can the statistical system be doing to support organisations with that unenviable task of having to look into the future and guide us on what is going to happen next?

SRC

Well, I think from the perspective of the forecasters themselves, many of the same principles that we've been talking to in terms of how the statistical system should communicate uncertainty apply in spades. In the case of forecasts, explaining how you've reached the judgements that you have - the uncertainty, past forecast errors, the particular sensitivity of a forecast to a judgement that you may be making in some part of it - the more you can do to explain that, the more it increases people's trust rather than reducing it. From the perspective of the statistical producer helping the forecaster, again, explanation is very important if you have got particular difficulties, particular reasons why you think there might be greater uncertainty than in the past around particular numbers. The current evolution of the labour market statistics is a good example of that - you need to be talking to the big users and the big forecasters about the particular uncertainties there may be at a given time so they can take account of that as best they can. On the other hand, having been a forecaster for 10 years, I certainly took the view that for forecasters to complain about revisions in economic data is like sailors complaining about waves in the sea. I'm afraid that is what you're dealing with. That's what you have to sail on, and everybody makes their best effort to come up with the best possible numbers, but it's a fact of life. And your knowledge and understanding of what's going on, in the past and now, and how that informs your judgements about the future, evolves over time. It doesn't remain static, and you're gazing through a murky cloud at times, but that doesn't reduce the importance of doing the best job you can.

MF

Final word then for the forecasters and for everybody else. The statistics are reliable but understand their limitations.

SRC

Yeah.

MF

Well, that's it for another episode of Statistically Speaking. Thanks to Sir Robert Chote, Professor Mairi Spowage, and Dr. Craig McLaren, and of course, thanks to you, as always, for listening. You can subscribe to future episodes of this podcast on Spotify, Apple Podcasts, and all the other major podcast platforms. And you can follow us on X, previously known as Twitter, via the @ONSfocus feed. I'm Miles Fletcher and, from myself and producer Steve Milne, goodbye.

ENDS.

MF

Well, Sir Robert, somebody once famously said that the decimal point in GDP is an economist’s way of showing they've got a sense of humour. And while that's quite amusing - particularly if you're not an economist - there's an important truth in there, isn't there? When we say GDP has gone up by 0.6%, we really mean that's our best estimate.

SIR ROBERT CHOTE

It is. I mean, I've come at this having been a consumer of economic statistics for 30 years in different ways. I started out as a journalist on the Independent and the Financial Times writing about the new numbers as they were published each day, and then I had 10 years using them as an economic and fiscal forecaster. So I come at this very much in the spirit of a consumer and am now obviously delighted to be working with producers as well. And you're always, I think, conscious in those roles of the uncertainty that lies around particular economic estimates. Now, there are some numbers that are published once, and you are conscious that that's the number that stays there, but there is uncertainty about how accurately it is reflecting the real-world position, and that's naturally the case. You then have, in particular, the world of the national accounts, which are numbers where you have initial estimates that the producer returns to and updates as the information set you have available to draw your conclusions develops over time. And it's very important to remember on the national accounts that that's not a bug, that's a feature of the system. What you're trying to do is to measure a very complicated set of transactions, and you're trying to do it in three ways: measuring what the economy produces, measuring incomes, measuring expenditure. You do that in different ways with information that flows in at different times. So it's a complex task, and necessarily the picture evolves. 
So I think from the perspective of a user, it's important to be aware of the uncertainty, and it's important when you're presenting and publishing statistics to help people engage with that. Because whether you are making decisions based on statistics, or simply trying to gain an understanding of what's going on in the economy or society, generally speaking you shouldn't be betting the farm on the assumption that any particular number is, as you say, going to be right to decimal places. And the more that producers can do to help people engage with that in an informed and intelligent way - and therefore mean that the decisions people take on the basis of these numbers are better informed - the better.

MF

So it needs to be near enough to be reliable, but at the same time we need to know about the uncertainty. So how near is the system at the moment, as far as these important indicators are concerned, to getting that right?

SRC

Well, I think there's an awful lot of effort that goes into ensuring that you are presenting, on the basis of the information set that you have, the best available estimates that you can. And there's an awful lot of effort that goes into thinking about quality - about quality assurance when these are put together, about the communication, about how they mesh in with, for example, the rest of the economic picture that you have - so you can reasonably assure yourself that you're providing people with the best possible estimate at any given moment. But at the same time, you want to try to guide people by saying, well, this is an estimate; there's no guarantee that this is going to exactly reflect the real world. The more that you can do to put some sort of numerical context around that, the more reliable a basis you give people who are using those numbers - particularly, as I say, in the case of those statistics that may be revised in future as you get more information. You can learn things, obviously, from the direction and the size of revisions to numbers in the past, in order to give people a sense of how much confidence they should place in any given number produced at any given point in that cycle of evolution, as the numbers get firmer over time.

MF

If you're looking to use the statistics to make some decision in your business or personal life, where do you look for the small print? Where do you look for the guidance on how reliable this number is going to be?

SRC

Well, there's plenty of guidance published in different ways. 
It depends, obviously, on the specific statistics in question, but I think it's very important for producers to ensure that when people come, for example, to websites or to releases that have the headline numbers that are going to be reported, it's reasonably straightforward to get to a discussion of where these numbers come from, how they are calculated, and the degree of uncertainty that lies around them. Not everybody is going to have an appetite for the technical discussion there, but providing that in a reasonably accessible, reasonably findable way is important. And I think a key principle is that if you're upfront about explaining how numbers are generated, and about the uncertainty that lies around them in as quantified a way as you can, that actually increases and enhances trust in the underlying production and communication process, and in the numbers, rather than undermining it. I think you have to give the consumers of these numbers, by and large, the credit for understanding that these things are only estimates, and that if you're upfront about that, and you talk as intelligently and clearly as you can about the uncertainties - the potential for revision, for example - then that enhances people's confidence. It doesn't undermine it.

MF

You mentioned there about enhancing trust, and that's the crux of all this. At a time, we're told, of growing public mistrust in national institutions and so forth, isn't there a risk that the downside of talking more about uncertainty in statistics is that the more aware people become of it, the less those statistics are going to be trusted? 
SRC

I think in general, if you are clear with people about how a number is calculated, the uncertainty that lies around it, the potential for revision, how things have evolved in the past, that is likely - not for everybody, but for most people - to enhance their trust and, crucially, their understanding of the numbers that you're presenting and the context that you're putting around them. So making that available matters, recognising, as I say, that different people will have different appetites for the technical detail. And there are different ways of presenting the uncertainty, not only about outturn statistics but, in my old gig, around forecasts of where things are going in the future; doing that, and testing out with your users what they find helpful and what they don't, is a valuable thing to be doing.

MF

You've been the stats regulator for a little while now. Do you think policymakers, perhaps under pressure to achieve certain outcomes, put too much reliance on statistics when it suits them, in order to show progress against some policy objective? I mean, do the limitations of statistics sometimes go out of the window when it's convenient? What's your view of how well uncertainty is being treated by those in government and elsewhere?

SRC

Well, I think certainly in my time as a forecaster, you were constantly reminding users and consumers of forecasts that, again, they're based on the best available information set that you have at the time. You explain where the judgements have come from. But in particular, if you're trying to set policy in order to achieve a target for a particular statistic at some point in the future - for example, a measure of the budget deficit - then having an understanding of the uncertainty, its nature and its potential size in that context, helps you avoid making promises that it's not really in your power to keep, with the best will in the world, given those uncertainties. 
And sometimes that message is taken closer to heart than at other times.

MF

Time, I think, to bring in Craig at this point, as head of national accounts and the team that produces GDP at the ONS, to talk about uncertainty in the real world of statistical production. With this specific example, Craig, you're trying to produce a single number - one single number - that sums up progress, or the lack of it, in the economy as a whole. What do you do to make the users of the statistics and the wider public aware of the fact that in GDP you're producing one very broad estimate with a lot of uncertainty built in?

CRAIG MCLAREN

Thanks, Miles. Firstly, the UK economy is incredibly complex, isn't it? In the last set of numbers we've got 2.7 trillion pounds' worth of value. So if you think about how we bring all of those numbers together, then absolutely, what we're doing is providing the best estimate at the time, and then we start to think about this trade-off between timeliness and accuracy. So even when we bring all of those data sources together, we often balance what we can understand at a point in time against what we learn as we get more information from our businesses and our data suppliers, and we evolve our estimates to understand more about the complex nature of the UK economy. Where and how we do that is by looking quite closely at our data sources. For example, we run a lot of surveys of businesses, and the data provided by businesses can come with a little bit of what we call a time lag. Clearly, when we run our monthly business surveys, that's quite timely - we get that information quite quickly. But when we want to understand more detail about the UK economy, we have what we call structural surveys, like our annual surveys. So over time it can take us a couple of years to get a more complete picture of the UK economy, and in that time, absolutely, we may revise the estimate. 
Some businesses might say, well, we forgot about this, we're going to send you a revised number. We look quite closely at the interplay between all the dynamics of the different parts of the economy, and then we confront the data set. So I think by bringing all this information together, both on timeliness and as we get a more complete picture, we start to refine our estimates. In practice, as we evolve our estimates, we can monitor that: we look quite closely at the revisions to GDP, and we can produce analysis that helps our users understand those revisions. We also focus quite heavily on the need for rapid information that helps policymakers - how can policymakers take this in a short period of time - but then we provide information on what we would call the revision properties: how our estimates can change and evolve over time as additional information comes in.

MF

So let's just look at the specifics, just to help people understand the process and how you put what you've just explained so well into action. Craig, the last quarterly estimate of GDP showed the economy contracted slightly.

CM

That's exactly right, Miles, and where we do produce our estimates on a timely basis, absolutely they will be subject to revision as we get more information. This is why it's important, perhaps, not to just focus on a single estimate. And I know in our most recent year, when the economy is all pretty flat, for example, or there's a small fall, we do have a challenge in our communication. That comes back a little bit to the user understanding how these numbers are compiled, and also, perhaps, how you can use additional information as part of that. As I mentioned, the UK economy is very complex. GDP is a part of that, but we also have other broader indicators as well. 
So when we do talk about small movements in the economy, we do need to think about the wider picture alongside that.

MF

Okay, so the last quarterly estimate - what was the potential for revision there? Just how big could that have been?

CM

We don't formally produce what we call range estimates at the moment. We are working quite closely with colleagues on how we might do that. If you think about all the information that comes together to produce GDP, some of that is survey-based, which will have a degree of error around it, but we also use administrative data sources as well - we have access to VAT records, anonymised of course, which we bring into our estimates. So the complex nature of the 300 different data sources that we bring in to make GDP means that producing a range can be quite a statistical challenge. What we can do is look at our historical record of GDP revisions, and by doing that we find that, in perhaps normal times, our first estimates are quite unbiased. By that I mean we don't expect revisions to be significant either way: we may revise up by perhaps 0.1 or down by 0.1, but overall it's quite a considered picture, and we don't see radical revisions to our first estimates over time.

MF

You're saying that when revisions happen they are as likely to be up as they are to be down, and there's no historical bias in there either way - because presumably, if there was a detectable bias, you would have acted some time ago to make sure it was removed from the methodology.

CM

Exactly. Exactly.
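The revisions analysis Craig describes - checking that first estimates are not systematically biased up or down - can be sketched as follows. The figures below are invented for illustration; a real analysis would use the published ONS first estimates and their later revised values.

```python
# Illustrative sketch of a revisions analysis on quarterly GDP growth.
# All numbers are made up for demonstration purposes.

from statistics import mean

# Quarterly GDP growth in percentage points: first estimate vs. the
# estimate after later data (annual surveys, tax records) is incorporated.
first_estimates = [0.3, -0.1, 0.5, 0.2, 0.0, 0.4, -0.2, 0.1]
later_estimates = [0.4, -0.1, 0.4, 0.3, 0.1, 0.3, -0.1, 0.1]

revisions = [later - first
             for first, later in zip(first_estimates, later_estimates)]

# A mean revision near zero suggests no systematic bias either way;
# the mean absolute revision indicates the typical size of a revision.
mean_revision = mean(revisions)
mean_abs_revision = mean(abs(r) for r in revisions)

print(f"Mean revision:          {mean_revision:+.2f} pp")
print(f"Mean absolute revision: {mean_abs_revision:.2f} pp")
```

With these toy numbers the mean revision is close to zero while individual revisions are around 0.1 percentage points either way, which is the "unbiased, but not exact" pattern described above.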

MF

Just staying with this whole business of trying to make a very fast estimate because it is by international standards, a fast estimate of a very, very big subject. How much data in percentage terms would you say you’ve got at the point of that first estimate as a proportion of all the data you're eventually going to get when you produce your final number?

CM

It does depend on the indicator, Miles. The UK is one of the few countries in the world that produces monthly GDP, so we are quite rapid in producing it. Robert mentioned in the introduction to this session that with monthly GDP we use an output measure - information we have quite quickly from businesses. So our monthly GDP estimate is based on one of the measures of the economy, the output measure. We get that from very rapid surveys, and that gives quite good coverage - around 60 or 70% quite quickly. But then, as we confront our different measures of GDP, the other sources come in: our expenditure measure, which takes a bit longer, and then our income measure as well. So we have this process in the UK, working from monthly GDP, which is quite rapid; we then bring in additional data sources - each of these measures has its own strengths and weaknesses - until we can finally confront them fully in what we call an annual framework. That often takes a couple of years, to fully bring together all those different data sources, so we can see the evolution of our GDP estimates as additional data comes in.

MF

Now looking back to what happened during the pandemic: of course, we saw this incredible downturn in the economy as the effects of lockdown took hold - international travel shuddered to a halt for a while, and everyone was staying at home for long periods. The ONS said at that point it was the most significant downturn it had ever recorded. But then that was closely followed, of course, when those restrictions were eased, by the most dramatic recovery ever recorded. Just how difficult was it to precisely measure the sheer scale of that change, delivered over quite a short period, relatively speaking? Just how good a job did the system do under those very testing circumstances? 
CM

It was incredibly challenging, and not just for official statistics, of course, but for a range of outputs as well. Viewing it in context now: when the economy is growing relatively stably, with perhaps a 0.1 or 0.2 change, we might start to be a bit nervous if we saw some revisions to that; but at the time there was, I believe, around a 20% drop in activity, and the challenge was ensuring that our surveys were capturing what was happening in the UK economy. In the ONS we stood up some additional surveys to provide us with extra information so we could understand what was happening - we still have that survey; it's fortnightly. The challenge we had was to get information in near real time to give us confidence, and also to obtain information from businesses that were not at their place of work, so they weren't responding to our surveys. We had to pivot to using the telephone, collecting information in a different way, really to understand the impact on the economy. Looking back now, in retrospect, should a 20% drop perhaps have been 21 or 22%? It's all relative to the size of the drop, is the main point I would make. In the context of providing the information at the time, we were quite fortunate on the data collection front to have a world-leading survey of businesses that provided information in near real time, which we could then use to understand the impacts on different parts of the UK economy. And now, when we get new information on an annual basis, we can go back, confront that data set and understand how reliable those estimates were, of course. 
MF

Of course, the UK was not alone in subsequently making some quite significant revisions to its initial estimates. What was done at the time, though, to let the users of the statistics know that - because those circumstances were so unusual, because the pace of change you were seeing was so dramatic - there was perhaps a need for special caution around what the data seemed to be saying about the state of the economy?

CM

Exactly - and it was unprecedented, of course, as well. So in our communication - and coming back to how we communicate statistics, and the understanding of them as well - we added some additional phrasing, if you like, Miles, to ensure people did understand, and perhaps acknowledged the fact, that in times like this there is an additional degree of uncertainty. The phrasing becomes very important, of course, to reflect that these are estimates - they're our first estimate at the time, and they will perhaps be revised more than we would typically expect. So the narrative, communication and phrasing, and the use of the term ‘estimate’, for example, became incredibly important during the pandemic. And it's also incredibly important in the context of smaller movements. While we had this large impact from COVID, it was our best estimate at the time, and I think it's important to reflect that; as we get more and more understanding of our data sources, those numbers will be revised. So what we did was make sure that was front and centre in our communications, to reflect the fact that there can be additional information after the fact, but this is the best estimate at the time and there's a degree of uncertainty. And we've continued that work, working closely with colleagues in the regulator, to understand how best we can continue to improve the way we communicate uncertainty in what is a complex compilation process as well.

MF

Professor Mairi Spowage, you've heard Sir Robert talking earlier about the importance of understanding uncertainty in statistics and the need to make sure our statistical system can deal with that and explain it to people properly. You've also heard Craig explain, from a production point of view, the lengths to which the ONS goes to deal with the uncertainty in its initial estimates of GDP, and the experience of dealing with those dramatic swings around the pandemic. What is your personal take on this, from your understanding of what the wider public and the users of economic statistics have a right to expect? What do you make of all that?

MAIRI SPOWAGE

So I think I’d just like to start by agreeing with Robert that explaining uncertainty to users is really important. In my view - and certainly in some research that colleagues at the Economic Statistics Centre of Excellence have done - it actually increases confidence in statistics, because we all know that GDP statistics will be updated as more information comes in, when these are presented as revisions to the initial estimates. And I think the more you can do to set users' expectations that this is normal, and a core part of estimating what's going on in the economy, the better when these revisions inevitably happen. We very much see ourselves not just as a user of statistics, but also, I guess, as a filter through which others consume them. We discuss the statistics that the ONS produces a lot, and we like to highlight, for example, when it's a first estimate, that more information will be coming in and that revisions may happen. And particularly when you're quite close to zero, as we've been over the last year or so, folks can get quite excited about being slightly above or below zero, but generally the statistics are in the same area, even though they may be slightly negative or slightly positive.

MF

Yes, and I'd urge people to have a listen to our other podcast on the whole subject of ‘what is a recession’ to get some more understanding of just how easily these so-called technical recessions can in fact be revised away. So overall then, Mairi, do you think the system is doing enough - particularly on the subject of GDP, of course, because we've had this really powerful example recently - to communicate the inherent uncertainty in those early estimates, so that people do appreciate it? Or could we perhaps be doing more?

MS

Yeah, absolutely. Obviously, there are different types of uncertainty, and the way you can communicate and talk about uncertainty when you're producing GDP statistics is slightly different from the way you might talk about things like labour market statistics. I know there are a lot of issues with labour market statistics at the moment, but the issue with labour market statistics in normal times is really that they're based on a survey, which therefore carries an inherent uncertainty due to the sampling that has to be done. And it might mean that a seeming increase in, say, unemployment from one quarter to the next isn't actually a significant difference. Whereas with GDP, it's much more about the fact that the first estimate draws on only a small proportion of the data that will eventually be used to estimate what's happened in that period in the economy, and over time we build it up. I think the ONS is doing a good job in trying to communicate uncertainty in statistics, but I think we could always do more. Having statisticians come on and talk about the statistics, pointing these things out proactively, is a good idea - so much more media engagement is definitely a good idea. As I said, we try, through informal means like blogs and podcasts like this, to talk about the data that have been produced and, when there are interesting features driving some of the changes, the extent to which those might change. One of the features over the last year, for 2023, has been the influence of things like public sector strikes on the data, because when there's less activity in the public sector, that changes the profile of growth over the year quite a lot - and that's been very influential over 2023. So I think it's important that there's more discussion about this and, to be honest, more knowledge in economic circles about how these statistics are put together. 
You know, I'm an economic statistician rather than an economist per se, and I think the more knowledge and awareness there can be amongst economic commentators on these issues, the better, because if we’re upfront about the uncertainty, I think it increases confidence when these revisions inevitably happen.
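Mairi's point about survey-based statistics can be illustrated with a rough sketch: an apparent quarter-on-quarter movement in a survey-based rate can sit comfortably within the margin of error. All figures below are invented, and the textbook simple-random-sample formula is used; real labour market surveys have complex designs, so true intervals are wider.

```python
# A rough illustration of sampling uncertainty in a survey-based rate.
# Hypothetical sample size and rates; simple-random-sample approximation.

import math

def proportion_ci(p, n, z=1.96):
    """Approximate 95% confidence interval for a proportion from a
    simple random sample of size n."""
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

n = 40_000                        # hypothetical achieved sample size
q1_rate, q2_rate = 0.042, 0.044   # unemployment rate: 4.2%, then 4.4%

lo1, hi1 = proportion_ci(q1_rate, n)
lo2, hi2 = proportion_ci(q2_rate, n)

print(f"Q1: {q1_rate:.1%} (95% CI {lo1:.2%} to {hi1:.2%})")
print(f"Q2: {q2_rate:.1%} (95% CI {lo2:.2%} to {hi2:.2%})")

# Standard error of the change between two independent samples.
se_diff = math.sqrt(q1_rate * (1 - q1_rate) / n
                    + q2_rate * (1 - q2_rate) / n)
significant = abs(q2_rate - q1_rate) > 1.96 * se_diff
print(f"Change of {q2_rate - q1_rate:+.2%} significant at 5%? {significant}")
```

With these invented numbers, the 0.2 percentage point rise is smaller than the sampling margin of error, so the apparent increase is not statistically significant - exactly the caution being urged above.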

MF

Perhaps, then, it's the way the statistics are told in the media and elsewhere - they're invested by those observers with more authority, perhaps, than they deserve. Particularly, of course, it must be very tempting if you're a politician and the numbers are going your way; then obviously you want people to believe they are absolutely 100% accurate.

MS

Absolutely. We're in a funny situation at the moment. I mean, our research institute focuses a lot on the Scottish economy, and the data for Scotland for 2023 shows... yes, it shows two quarters of contraction and two quarters of growth, but they're not joined together, so there wasn't a technical recession in Scotland. But over the year, basically, the Scottish and UK economies have both had a really poor year with hardly any growth. I haven't seen it yet, but I'm expecting that there will be some people crowing about that, as if it really shows the Scottish economy doing better or something, when it's not really. There will always be politicians who try to over-interpret changes in the data. Another example would be the first estimates of quarterly growth in the first part of 2023, which showed 0.4% growth in Scotland compared to 0.1% in the UK, and there were politicians saying that Scotland was growing four times as fast as the UK. These things will happen, but one of our roles, to be honest, through our regular blogs and communications with the policy community - particularly in Scotland, but also beyond - is to point these things out and say that they're a bit silly: no doubt the figures will be revised and come closer together, and nobody should get too excited about them.
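The "four times as fast" trap can be shown with a toy simulation: when quarterly estimates carry noise of a similar size to the growth rates themselves, the headline ratio between two regions is mostly noise. The noise level and growth rate below are invented purely for illustration.

```python
# Toy simulation: two economies with IDENTICAL true quarterly growth,
# each observed with measurement noise. Which one "grew faster" in any
# given quarter is then close to a coin flip, so a ratio like
# "0.4 vs 0.1, four times as fast" carries little information.

import random

random.seed(42)

TRUE_GROWTH = 0.25   # suppose both economies truly grew 0.25% in the quarter
NOISE_SD = 0.15      # illustrative noise in each published estimate, pp
TRIALS = 100_000

a_faster = 0
for _ in range(TRIALS):
    est_a = TRUE_GROWTH + random.gauss(0, NOISE_SD)
    est_b = TRUE_GROWTH + random.gauss(0, NOISE_SD)
    if est_a > est_b:
        a_faster += 1

print(f"Economy A 'grew faster' in {a_faster / TRIALS:.1%} of simulated quarters")
```

Because the two economies are constructed to be identical, the share of quarters in which either one appears to grow faster hovers around 50%: the gap between any single pair of estimates says little about underlying performance, which is why looking over a longer period matters.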

MF

Thinking particularly about levels of geography below the UK, for yourselves in Scotland and from where I'm sitting here in Wales as well, for that matter. Do the data tend to become more or less accurate? Should we have more or less confidence in the sorts of datasets we're seeing at those different levels of geography?

MS

Well, generally the data become less reliable and are subject to more uncertainty. A lot of the data used to estimate what's going on in the economy is based on business surveys, and there are two areas of uncertainty there. The samples at smaller geographies are smaller, so there's greater uncertainty because of sampling variability. But there's also a key problem with the data infrastructure in the UK: business data - this is across GB, because Northern Ireland's is collected slightly differently - is collected on units which are GB-wide. So it does make estimating what's going on in the parts of GB quite challenging, and there are some additional estimation procedures that need to be done to actually say what's going on in Scotland or in Wales. So it does add an additional layer of uncertainty to any sort of economic estimation at sub-UK geographies.

MF

I should add at that point that improving the quality of regional and sub-national data has always been an important part of the ONS's work and continues to be part of its strategic drive. But Sir Robert, from what you've seen recently, particularly over the last year, in the way that GDP estimates have been used in the media and in politics - and particularly the whole business of comparing quite small differences in GDP change internationally, and the significance that's invested in relative growth rates between one country and another - has there been too much discussion around that? Has too much weight been put on that recently, from where you've been sitting?

SRC

Well, I think just to pick up on the point that Mairi was making, you can end up investing, you know, much too much significance in comparisons of what's going on in one place and in another place over a relatively short period of time, in which there's likely to be noise in those numbers. So, as she said, taking one area where the growth is 0.4% in a quarter and another where it's 0.1% and saying that one economy is growing four times more quickly, while strictly true on the basis of those figures, is really not an informative comparison; you have to look over a longer period for both. When you get to international comparisons, there's the additional issue that, although there are international standards and best practice for how, for example, national accounts are put together, the way this is actually carried out can differ from place to place in ways that make those sorts of comparisons particularly tricky - again, especially over short periods, but also when the economy is doing strange things, as it was during the course of the pandemic. So in the GDP context, obviously, there was the question mark about having big changes in the composition of what the education and health sectors were doing as we went into the period of lockdown, and therefore judging how the output of those sectors had changed was a really very tricky conceptual judgement to make. And one of the issues that arose in trying to make international comparisons is that different people will be doing that in different ways, depending in part on how they measure the outputs of health and education under normal circumstances. So if you are going to do international comparisons, it's certainly better to look over a longer period.
So you're avoiding being misled by short-term noise, but also having a care to the way in which methodologies may differ - and that may matter more than it appears - so you can judge whether this is actually a meaningful like-for-like comparison.

MF

It's also worth pointing out, as I think we have in previous podcasts, that the UK is one of the few economies that does actually seek to measure the actual output of public services, whereas some countries just make broad assumptions about what those sectors have been doing. But it's also worth mentioning, I think, that a number of countries simply don't revise as much as we do, because their system makes an initial estimate and then they don't return to it for some years.

SRC

Yes, that's true. And so then the question sometimes arises - and I think this arose relatively recently in the UK context - that a set of revisions changes the international comparison, but you know that some countries have not yet done the same set of revisions. For example, the way in which you try to pull together the estimates of output, income and expenditure improves over time as you have more information from annual surveys, and more information on incomes, for example, from the tax system. So, again, at any given moment, even though in both cases you're trying to say what's our best sense of what was going on a year ago, different countries will be at different stages of the statistical production process: of the eventual total information set on which you base your estimates, some countries will have incorporated more and some less, and a revision that you're doing this month, somebody else may not do for six months. That again complicates the picture, and really again suggests that when looking at international comparisons at too high a frequency, or too much in the recent past, there are bigger uncertainties and caveats that you ought to be placing around big calls and big interpretations based on that.

MF

Yes, and while it's hard enough to know where you are at any given point in the economy, it's even harder of course - infinitely harder, you might say - to work out where on earth you're going to go next. You've spent a lot of time in the forecasting business, and I know the Bank of England in particular is taking a good look at the moment at the data it relies upon in order to make its forecasts. What can the statistical system be doing to support organisations with that unenviable task of having to look into the future and guide us on what is going to happen next?

SRC

Well, I think from the perspective of the forecasters themselves, many of the same principles that we've been talking about, in terms of how the statistical system should communicate uncertainty, apply in spades. In the case of forecasts, explaining how you've reached the judgements that you have - the uncertainty, you know, past forecast errors, the particular sensitivity of a forecast to a judgement that you may be making in some part of it - the more you can do to explain that, the more it increases people's trust rather than reduces it. From the perspective of the statistical producer helping the forecaster, I think, again, explanation is very important: if you have particular difficulties, particular reasons why you think there might be greater uncertainty than in the past around particular numbers. The current evolution of the labour market statistics is a good example of that - you need to be talking to the big users and the big forecasters about the particular uncertainties there may be at a given time, so they can take account of that as best they can. On the other hand, having been a forecaster for 10 years, I certainly took the view that for forecasters to complain about revisions in economic data is like sailors complaining about waves in the sea. I'm afraid that is what you're dealing with; that's what you have to sail on, and everybody makes their best effort to come up with the best possible numbers, but it's a fact of life. And your knowledge and understanding of what went on in the past, and how that informs your judgements about the future, evolves over time. It doesn't remain static, and at times you're gazing through a murky cloud, but that doesn't reduce the importance of doing the best job you can.

MF

Final word then for the forecasters and for everybody else. The statistics are reliable but understand their limitations.

SRC

Yeah.

MF Well, that's it for another episode of Statistically Speaking. Thanks to Sir Robert Chote, Professor Mairi Spowage, and Dr. Craig McLaren, and of course, thanks to you, as always, for listening. You can subscribe to future episodes of this podcast on Spotify, Apple Podcasts, and all the other major podcast platforms. And you can follow us on X, previously known as Twitter, via the @ONSfocus feed. I'm Miles Fletcher and, from myself and producer Steve Milne, goodbye.

ENDS.
