COVID-19 Data: Takeaways, Tips And A 'Doom Scrolling' Warning

Erika Mahoney
The pandemic has immersed us in data and numbers. Processing all of the information can be overwhelming.

Over the past few months, we’ve been inundated with data about COVID-19, from maps to graphs to data dashboards. Add to that a glitch in the state’s reporting system, which led to a backlog in test results. It’s all a lot to process.

KAZU’s Erika Mahoney interviewed Dr. Judith Canner, a statistics professor at Cal State Monterey Bay, to help put everything in perspective. Canner was a guest speaker on the university’s “Virtual Wine With A Scientist” series last month.    

Erika Mahoney (EM): What are the impacts of the state’s reporting error? 

Judith Canner (JC): From a data perspective, it really reinforces how careful we have to be about looking at any single day and thinking that it's an accurate representation of that day's cases. Our data really is from the past, showing us what's been happening. There are delays in reporting data, and there are delays in collecting the data because there are delays in testing. So there's just a lot going on. Most visualizations, and most decisions, should be based on seven-day or even 14-day averages. We just have to be careful. And as we're making decisions about schools or businesses reopening, waiting just to meet some specific threshold is probably not the best approach, because we really need to ask: we've met a threshold, but is that pattern going to continue into the next few weeks? Rushing to reopen might be a detriment.
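As a sketch of the smoothing Canner describes, a seven-day moving average damps single-day spikes and reporting dips. The daily counts below are made up for illustration, not real data:

```python
# Hypothetical daily case counts for two weeks (illustrative only).
daily_cases = [40, 55, 12, 80, 60, 5, 90, 70, 65, 10, 85, 75, 8, 95]

def moving_average(values, window=7):
    """Average each day with up to window-1 preceding days."""
    return [
        sum(values[max(0, i - window + 1) : i + 1]) / min(i + 1, window)
        for i in range(len(values))
    ]

# The smoothed series varies far less day to day than the raw counts.
smoothed = moving_average(daily_cases)
```

A real dashboard would typically also account for reporting delays (e.g., by back-dating results to specimen collection), which a simple rolling mean does not capture.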

EM: What's your biggest concern with this reporting error?

JC: What I hope doesn't happen is that we develop a long-term distrust of the data and of the state reporting mechanisms. The data is needed. The data is needed to make decisions, to think about allocation of resources. But we also have to keep in mind that even when all systems are working perfectly, the data we're getting is never going to be complete, because there are always going to be delays in people going to get tested, delays in getting results back, or people who just never get a test at all. And so we have to keep that in mind.

EM: How long do you think it will take to get a better picture of the full impact of the pandemic?

JC: The reality is, we're not going to have a full view of the pandemic, I think, for a long time. There's been a lot of comparison to historical pandemics, but they're historical. The 1918 Spanish flu is one of the most cited, and we have a hundred years of research on that data, that information, those times. We don't have that hindsight yet.

EM: What is your general advice for making sense of the data that's coming our way?

JC: Don't inundate yourself more than you have to. I learned a term recently, “doom scrolling,” where you scroll through social media feeds or through headlines, just looking at everything coming at you. If you're just constantly engaging, you'll get too much information and it'll be too hard to process. Nobody can process that much information.

EM: So we're hearing about the test positivity rate and it's a fairly new phrase. What is that and why is it useful? 

JC: The test positivity rate, or case positivity rate, basically tells us, when we do tests, how many of those tests come back positive for COVID-19. The recommendation right now from the World Health Organization is five percent or, preferably, below. The reason these numbers become so important is that we need a good idea of what's going on in our community so we can allocate resources, plan for PPE, and figure out how we should open up schools. If our case positivity rate is really high, 20 or 25 percent, as it is right now in some places throughout the country, that means we don't have a very good idea of what's going on, of how many cases there actually are in that community. That's because it's probable that only the sickest people, or the people who actually have access to testing, are getting tested; we're missing the people who have milder symptoms, who are asymptomatic, or who don't have access to testing.

EM: States in the U.S. and countries around the world report their numbers differently. What problems does that cause?

JC: Different countries might report their numbers differently or even how they're collecting their information might be different. It means we have to be careful about any kind of comparison we might make. We like to make comparisons. We like to say, ‘oh, look, we're doing better than this place or this place is doing worse than this place.’ Comparisons have their place because they can help us see… oh, here… this country managed to get their number of cases down to a very manageable number, but this country didn't. What were their differences? What worked for one, what worked for another? But we also have to be careful about comparisons because every country is extremely different. 

EM: What are some red flags that a source isn't trustworthy? 

JC: If a news source is not willing to issue a retraction or make it known that they made a mistake, that is a red flag, because everybody makes mistakes. The second thing to be careful of is a source that's not honest about where the information is coming from, when people are vague and start to say, ‘researchers at Harvard.’ OK, well, Harvard is a great research institution, but are the researchers in epidemiology? Are they in infectious disease? Keep an eye out for that sort of appeal to expertise. I have a PhD in statistics, but I'm going to keep telling you I'm not an epidemiologist, which means I can talk about data, but I cannot talk to you about all the aspects and intricacies of infectious disease, especially around COVID. I'm trying to make sure that my answers reflect that. So that's another red flag, the expert appeal. 

Then, anybody who just claims way too much certainty in their results. Uncertainty is just a part of the process. The field of statistics is built around uncertainty. And anyone who tells you differently is trying to sell something. 

The last piece I'll throw out is something that's been popping up a lot, because we need information faster, and science is bad at providing information fast because it relies on a peer review process. Typically, when we write an article, we send it out, wait for experts to review it, and get it sent back to us, maybe rejected, so we have to redo things. A lot of people post PDFs of their articles in preprint archives related to their field, both to get the information out there, especially at a time like now when you have a pandemic, and to avoid being scooped, because academia is driven by publication and novelty. So it's possible that an article you find is a preprint, which means it hasn't gone through peer review. People may also post preprints because a lot of journals are behind paywalls, which is a whole other issue when we think about federally funded research. But this is something to check: has this article been peer reviewed or not? 

EM: What's something you would like to see news organizations do better when sharing data? 

JC: I would like news organizations to be more careful about their headlines, because a lot of times people only read the headline or the first paragraph. So make sure the headline is not overly dramatic and is representative of what's going on. In academia, we like to give an abstract, which is sort of the “too long, didn't read” summary of everything. And I do see that in some news organizations; they lead with a few bulleted points. 

EM: With so much information and so many numbers coming our way, there is that concern that we could become desensitized. I would just love your thoughts on that.

JC: It's something I have to remember every day as a statistician, when I'm talking about numbers, percentages, totals or ratios: ultimately, there's a person there, right? There is a person behind that number. And it's something I think we need to keep in mind when we start saying things like, well, maybe the death rate isn't as high as we think. That's something people put out: if there are more cases than we're catching, then maybe the death rate isn't actually that bad. But the reality is, people are still dying. We have to keep that in mind and be compassionate and caring to our community and to others, because the person dying is somebody's brother or sister or mom or aunt. We can't get so caught up in a certain percentage or a certain rate that we forget that there are people who are suffering.

KAZU is licensed to Cal State Monterey Bay.