Eh Sayers Episode 15 - A Little Less Misinformation, A Little More True Facts, Please

Release date: December 13, 2023

Catalogue number: 45200003
ISSN: 2816-2250

Eh Sayers Season 4 Episode 3 - A little less misinformation, a little more true facts, please


A little less misinformation, a little more true facts, please - Eh Sayers with Timothy Caulfield

In the age of social media, AI, and deepfakes, discerning fact from fiction is a crucial skill. Nowadays, we’re not just getting our information from the six o’clock news. Friends, family, researchers, influencers, entertainers, news anchors, advertisers… Who can you trust?

Timothy Caulfield, misinformation/disinformation expert, author, University of Alberta professor, and member of the Order of Canada, and Eric Rancourt, Assistant Chief Statistician at StatCan, join us to explore the challenges posed by our information environment and what can be done to counter misinformation.


Host: Tegan Bridge


Guests: Timothy Caulfield, Eric Rancourt


Eh Sayers Episode 15 - A Little Less Misinformation, A Little More True Facts, Please - Transcript

Tegan: Welcome to Eh Sayers, a podcast from Statistics Canada, where we meet the people behind the data and explore the stories behind the numbers. I'm your host, Tegan Bridge.

Like everybody else, I get news from a bunch of different places. For me, if I heard a cool tidbit, I probably heard it on a podcast because, shocker, I listen to a lot of podcasts. I also read books, listen to the radio, and, of course, use social media, though I openly admit that I should probably cut back on my social media use.

Just yesterday, I saw two things that weren't quite right. One of these was a video that had been edited to distort the size of a snake to make it look larger and get more views, and the other was a historical documentary with a misleading factoid. The snake video was debunked by Snopes, the fact-checking website, but the misleading documentary... I only caught that because I've read a lot on that topic. That's just one day for one person. And those are just the two pieces I noticed. How many did I see that just passed me by, shaping how I see the world without my even noticing?

What are we supposed to do? For a lot of us, the internet is our main source of information, but our feeds are a bizarre hodgepodge: news organizations, meme creators, professional groups, influencers, and so on. On my own feed, I see a meme about the Lord of the Rings, data from the census, an ad for earplugs, a video from a seamstress who recreates historical clothing, a picture of a cute cat, a post about some international news, a friend's vacation pictures, a post about the Canadian economy and inflation, an ad for men's soap, that's a weird one for me, a comedian riffing on that same piece of international news, and a list of, I'm using airquotes here, "good" and "bad" foods...

It's a mishmash labyrinth of people trying to inform us, entertain us, advertise to us... and mislead us. I know that I can trust the data from the census, but what about the rest? The soap is probably not dangerous, expensive and weirdly gendered, but not harmful, but the list of "good" and "bad" foods? The post about inflation? Were those reliable? And what about the news clip?

Today, we're talking about misinformation. And, heads up, I'm going to be using that as a bit of a catch-all term, but it's actually not so simple, as we'll soon see.

Timothy: My name is Timothy Caulfield. I'm a professor in the faculty of law and the school of public health at the University of Alberta.

Tegan: You dedicated your recent book, quote, "To science. Hang in there." Why?

Timothy: Oh, it's been a tough decade or two, hasn't it? There has been this growing, almost universal disdain for science, for scientific institutions. Now, I want to be clear, if you went out and surveyed 1,000 people, most people, in Canada in particular, would say they trust science and scientists, right? But that trust is starting to erode, especially if you start talking about particular topics.

So, a person might say, I trust science, but then you ask them, what about vaccines? What about climate change? What about supplements? What about unproven therapies of other kinds? So it's a very interesting time. So that's one of the reasons I dedicated the book to science. But the other reason is I feel like scientific institutions are under assault. And that's really heartbreaking because getting good evidence, having knowledge that is trustworthy is fundamentally important to liberal democracies and I think there's currently a crisis in that context.

Tegan: As I said before, I'll be using misinformation as a catch-all term, but it technically isn't. There are a few different kinds of bad information, as Timothy explains.

Timothy: I use misinformation as the catch-all phrase too, and not everyone agrees that that's a good strategy. It actually is sort of a complex information environment out there. I call it the misinformation continuum.

So on one end, you have information that the purveyors know is a lie, it's clearly a lie, there's no evidence to support it, and it's being put forward to satisfy a particular agenda, to sell products. That's disinformation, right? The incentive, the intent, is to spread misinformation for a whole bunch of reasons: maybe political reasons, maybe to build a brand, etc. But the intent is to spread misinformation.

You move along that continuum, and we'll just maybe go in the middle of it. And you have individuals that—I put a lot of the wellness gurus in this category—you know, do they believe it? I don't know. It seems scientifically… Does a wellness guru really believe, you know, a colon cleanse helps you? Do they really believe that, you know, these supplements work? I don't know. I'm skeptical. Maybe they fooled themselves that they think it works, but it's still wrong. It's still misinformation. It still does harm.

And then if you move along that continuum a little bit further. You have individuals that, you know, genuinely believe they're doing what's best, you know, they just want what's best for themselves, for the family and their community, and they're spreading that, uh, incorrect information with no intent to do harm, right? Uh, but it's still misinformation, and it can still do harm.

So yeah, there's all these other levels throughout that continuum, and I think it is a complex environment, and that does matter because the nature of the misinformation may inform how we should tackle it.

Tegan: Hoaxes predate the internet. You know, the Cardiff Giant, the Fiji Mermaid, the Cottingley Fairies, which are my personal favourite. What makes modern misinformation special?

Timothy: Yeah, I love looking at those old hoaxes. You know, I do a lot of work on health misinformation, as you know, and I love looking at those old posters for bogus treatments.

Look, misinformation has been around forever. As soon as human beings started communicating, I'm sure there was misinformation, right? But it is different.

I think we can go back to, say, the 2016 election in the United States, where a lot of people think misinformation really started to take off and take on a different character. More than that, it really is social media. Yeah, it's an obvious response, but social media really has changed it. People in the past didn't really have access to ways to broadcast ideas, and the Internet allowed that to take off, right? It allowed the creation of echo chambers, the creation of these communities that believe this stuff. So I think it really started to accelerate that.

The other thing I think is happening now is that because the information environment has gotten so confusing, and also I think our knowledge environment has gotten so confusing, it's easier to find tokens of legitimacy to legitimize a position, to make a position seem more credible. For example, there's bad science out there that people can point to to make their position seem more legitimate.

In addition to that, and one of the things I find really scary, is the degree to which misinformation has become ideological. Yes, it's always been there. You know, there's always been an ideological component to a lot of the misinformation. But that aspect has really accelerated too. And if you just look at it in the context of the health space, which is where I do my research, it's incredible how virtually everything now has an ideological component to it: whether you're talking about vaccines, whether you're talking about supplements, whether you're talking about unproven cures for things like autism. There's this ideological lens through which everything is now projected, and that's really scary, because once something becomes about ideology it becomes much more difficult to change people's minds. It becomes part of their worldview, how they identify. And I think that that is something that is also different. Yeah, it's always been there. But now it's just at the fore.

Tegan: According to a recent StatCan general social survey, the most common method for following news and current affairs was the internet followed by television. Could you talk about how the difference in access point affects everything?

Timothy: There's been a lot of recent research that has found a strong correlation between where you get your information and whether you believe misinformation and whether you share misinformation and yes, it really does matter. So, no surprise here, if you get your information from social media, you're more likely to believe misinformation, more likely to spread misinformation. If you get your news from the legacy media, you know, the kinds of sources, well-known newspapers, broadcasters that have been around for a long time, you're less likely to believe misinformation, less likely to share misinformation. Yes, this is very correlational, right? This is very correlational, and it's correlated to education and all those other… and socioeconomics and all those things. But it's still relevant because that speaks to the echo chambers. So where people get their information matters.

And there have even been studies that have shown there's a strong correlation between what cable news you watch and your beliefs around things like COVID therapies. In fact, one study came out this year, just a couple months ago, that found the strongest predictor of your belief around the efficacy of COVID treatments is what cable news show you watch. So that's the public, but in fact, that finding holds also for physicians. They studied physicians too. How horrifying is that? They basically found that in the United States, your physician's position on COVID therapies is more influenced by the cable news show they watch than the science that they read. Just absolutely horrifying, right? And that really shows the degree to which this has become polarized, it's become about ideology, and also where you get your information.

And the last thing I'll say about this—and I could go on and on—is I also think it speaks to the chaotic nature of our information environment, right? So if you're getting your information from social media platforms, from the Internet, there's a lot more information coming at you, right? Some of the… a platform like TikTok actually pushes information at you, right? Instagram's the same. I guess all the platforms have that to a degree. So we know that when you are bombarded with information, you're less likely to sort of look at it critically. And so we really need to think about how we can invite people to pause in that chaotic, frantic information environment to apply their critical thinking skills. When information is coming at you, it's just washing over you, it's playing to your cognitive biases, it's playing to your fears, and preconceived notions misinformation can really take hold, and so we've got to figure out strategies to push back against that.

Tegan: Could you talk about the nuts and bolts of social media? I'm talking about algorithms, echo chambers, and the role they play in information access for the average person.

Timothy: I think it's really important to recognize that the algorithms that drive search engines, the algorithms that drive virtually every social media platform, are designed to exploit our cognitive biases. So there's been a lot of interesting work that talks about how they play to fearmongering, right? They play to your ideological leanings. They play to your desire for in-group signaling, right? So those algorithms, because of that, facilitate not only the spreading of misinformation, but the creation of echo chambers that legitimize and emphasize misinformation. All of the platforms, and I won't use any names, have said they're trying to tweak their algorithms to avoid that. But I think we definitely need more transparency about what's really going on with those algorithms.

You know, as individuals, as the public, we should recognize that those algorithms are designed to do that. You know, they're designed to get, in our attention economy, they're designed to get clicks, they're designed to get eyeballs, they're designed to engage you. And they do that by playing to our cognitive biases.

Look, as I said before, it impacts all of us. I have made mistakes, or gone down, you know, a hole that I wouldn't have if I'd just paused for a moment and remembered that these algorithms are designed to kind of trick us.

Tegan: So many people get information in snippets: soundbites on social media, a headline in a notification on their smartphone. Why is it a problem when people rely on these tiny infobytes for their news?

Timothy: It has, I think, become the way that we get our news now, right? And sometimes I'm guilty of this, by the way. And I mean, there's been a lot of really interesting research about how people only read the headlines and how rare it is for people to click through to the actual content, right? And we also know, as I was talking about before, that not pausing, not sort of trying to apply your critical thinking skills, is correlated with believing misinformation and spreading misinformation.

Um, and the other thing we need to remember is that the situation that you described, right? This frantic information environment. It also plays to our emotions, right? So it's a sort of a worst case scenario, right? Because we have these algorithms that now kind of know us, right? They kind of know us, and they're presenting you with headlines and content and images and memes that play to your preconceived notions, probably play to some degree to your ideology, right? And that means you're more likely to internalize it.

And so we need to invite people to pause. People like Gordon Pennycook and David Rand at MIT (Gordon's at Cornell now) have done really interesting research that has highlighted the value of just pausing. Just pausing for a moment, you're less likely to believe misinformation, less likely to share misinformation. And the other thing I think is really important is the degree to which that frantic, clickbait kind of world that we live in also plays to our emotions. There's really interesting work by people like Kate Starbird at the University of Washington. She suggests that if content makes you emotional, in particular, she once said to me, if it feels like your team got a touchdown, right? So you see a headline, "I knew it!"… If it feels like your team got a touchdown, your impulse shouldn't be to share it or to internalize it. Your impulse should be, "I should be skeptical. I should double check this. When it makes me feel emotional, makes me angry, fearful, or like my team just got a touchdown, that should be a signal to pause and to double check it." Because it's the algorithm playing to you, right? And don't fall for it. You know, take a pause and apply those critical thinking skills.

Tegan: That actually leads right into my next question. In researching for this episode, I found time and time again that misinformation was more likely to be reshared on social media if it was terrible news: the more dramatic, the more negative, the more shocking the headline, the more likely it was to be reshared. Could you talk about negativity bias and the role it plays in our attention?

Timothy: Yeah, I mean, there is a lot of really interesting research on that. And, and some of it very, very recent. There was a study that came out, gosh, I'm gonna say just a couple weeks ago, that backs up exactly what you just said. They looked at the role of the negativity bias in the sharing of misinformation in the context of COVID, and they found it to be a dominant factor, right? And we've known this for a very long time. Negative headlines outperform positive ones. There was a study that came out earlier this year that found that because of that, over the last couple decades, negative, scary, ominous headlines are increasing in frequency. And headlines that are related to joy—and that's actually what the study looked at, which is so depressing—headlines related to joy are decreasing in frequency, right? And that's all because of the negativity bias.

And the negativity bias, as you know well, is the idea that if something's scary, we're more likely to remember it, more likely to act on it, which makes total sense as a cognitive bias. For most of human history you want to remember the scary stuff, right? You know, tigers hang out over there, don't eat that berry. But it's backfiring now, right? And we have to remember that. There have been other studies that have looked at what kind of content goes viral, right? What kind of misinformation goes viral, right? And it's something that is scary that plays to our morals, and I would say that includes ideology, and it's easily processed, and so much of the misinformation that you can think of ticks all those boxes. And negativity, you know, being negative is a big one.

Now, what's interesting: for those of us who are trying to counter misinformation, that poses a challenge, right? Our own initiative, #ScienceUpFirst, is trying to counter misinformation in a positive way. We want to be constructive. We don't want to fall into that fearmongering trap. But when you're fighting the negativity bias, that can be challenging. I mean, there have been studies that have found that public health messages that have a little bit of a scary element to them do outperform those that are totally positive. So, you know, do you fight fire with fire, or do you try to think of other creative strategies that can still make your content get traction but aren't necessarily just adding to a negative worldview? I like the latter more than the former. Let's try to be positive. The world needs some more positivity.

Tegan: What are some of the consequences of the idea that we should give both sides of an issue equal time or consideration?

Timothy: I don't think this is said enough, that our current information environment is a false balance machine. So what do I mean by that? Fringe views, contrarian views, tiny minority views, often views that aren't backed by good science are elevated in our information ecosystem to make them appear equivalent to the body of evidence, the scientific consensus. We see that happen with vaccines, with climate. Virtually any contentious topic that you can think of, this is happening.

There's been really interesting research, a study that came out of Europe, that highlights the degree to which this is a problem. In this study, they asked thousands of people: how much consensus is there in the medical community about the safety and efficacy of the COVID vaccines? And 90 percent of people said that the medical community was divided. In fact, the most common response was that it was about 50/50, right? That 50 percent thought maybe they weren't safe and they weren't effective… When the reality is over 90 percent thought that [they were safe] and a very small percentage had concerns. Think about that. 90 percent of people thought that there was a lack of consensus in the medical community.

And the problem is research has consistently shown that false balance can impact public perception, it can impact health behaviours, it can lead to vaccination hesitancy. So we have to do a better job presenting the scientific consensus to the community.

In fact, I think it's also really important to highlight that the scientific consensus is not about groupthink. It's not about not respecting, you know, controversial scientific positions, and often that's how doubt mongers try to portray the scientific consensus: "Oh, that's just sheeple. You know, that's just people that have bought into a groupthink mentality." On the contrary, scientists are always challenging the scientific consensus, and controversial views are incredibly important, but those scientific views have to be presented in scientific forums. It's about using science to support your position. And if you can do that, eventually your position will rise. The scientific consensus matters.

Okay, the last thing I'll say on this, now that you've wound me up, is to back up what I've just said. This whole idea of the scientific consensus versus these fringe views is really only contentious on topics that have become political. No one worries about the scientific consensus not being true when they board an airplane. And, by the way, these fringe views, our own research has found, are vastly overrepresented. They're not being silenced. They're not being censored. They're vastly overrepresented in the public sphere.

Tegan: StatCan is a producer and communicator of data. As such, it engages regularly in this public sphere, which, as we've just heard, is significantly more complex than it used to be: misinformation, disinformation, bad science, cluttered newsfeeds...

Maybe the chief statistician, Anil Arora, said it best at a keynote address in 2019:

Anil: When we are talking about Big Data, we need to recognize that volume does not equal quality.

Tegan: Volume without quality equals information overload. StatCan's job is to get Canadians the information they need, so how does the organization navigate this information environment? We asked an expert.

Eric: Eric Rancourt, Assistant Chief Statistician and Chief Data Officer at Statistics Canada.

Tegan: How is StatCan thinking about the information tsunami that Canadians are facing, especially knowing that there are bad actors intentionally putting disinformation out there to mislead us?

Eric: It's a big change in the context. Traditionally, organizations like Statistics Canada have operated in a mode where we control the information: we survey, we gather information. That is why we have a modernization plan, a modernization mindset, where we constantly try to be closer to the needs of people and to be at the leading edge of methods, so that we understand this new context and can perform scientifically within it. So there might be some mis- or disinformation, but organizations like Statistics Canada are designed to counter that by the way in which we scientifically produce information.

Tegan: Disinformation and misinformation are eroding Canadians' trust. What are some of the consequences if Canadians don't know who to trust?

Eric: We all want people to be autonomous and free. So, if the information that one has to make a decision is not depicting the world as it really is, are you really free, really making a clear decision? Probably not. So it is very important to make sure that we use very well-established methods, and that this be transparent to the users, so they can know and trust that the information they use is depicting society.

Tegan: Why is trust crucial for StatCan specifically?

Eric: This is the basis of a statistical system. We cannot just produce information on our own. It starts with Canadians providing us the data pieces that make up the information that we then produce. So I always like to talk about the data life cycle as gather, guard, grow, give. We gather information: Canadians provide us the data. We guard it well; we safeguard the information. Then we grow it by integrating the information and producing estimates, and then at the end, we give back information. So the trust starts with Canadians trusting us with the data they provide, and then we use leading-edge methods. We use a scientific approach. We have approaches based on the necessity of the information and how proportional what we do is to that necessity. And then we provide access to the data. And that whole system, the whole national statistical system, is founded on trust.

Tegan: If misinformation sows mistrust, it threatens much more than our national statistical system. So... why? Why would someone intentionally spread disinformation? I put that question to Timothy.

Tegan: Who benefits from misinformation?

Timothy: Yeah. I mean, that's a, that's a great question. And I get it a lot because, you know, people say, "well, you know what, why do anti-vaxxers push this stuff? Why are people trying to create doubt around the nature of climate change?" Well, you know, very often there's money involved. There really is money involved. So, there have been interesting studies that have looked at the degree to which those spreading misinformation profit. So, just to give an example, many anti-vaxxers are also selling supplements or they're selling some other health product. They're selling products often on the back of creating fear around things like vaccines.

Tegan: Who's most vulnerable in this conversation?

Timothy: You know, I think we're all vulnerable. I think that our institutions are vulnerable. I also think that the spread of misinformation and all the things that we've talked about can also polarize our communities in ways that make equity issues more profound. I think there are certain communities that are potentially more vulnerable. I think we're all vulnerable to the spread and harm of misinformation, and I think that that is a really important point that we shouldn't forget. I mean, this impacts all of us. It impacts our healthcare institutions, it impacts healthcare providers, it impacts patients, it impacts populations, it impacts communities. We need to remember that when we're developing strategies to counter misinformation.

Tegan: I put the same question to Eric to answer from a statistical and a data perspective, and he stressed one of Timothy's points.

Who's most vulnerable in this conversation?

Eric: It's, it's everyone, but in particular, it's the small groups. The general population is, is one thing. It's relatively easy to produce information on the total, the average, but there are subgroups of the population, those that are sometimes missed by the general information found on the internet, who are particularly at risk. So major efforts are made by organizations like Statistics Canada to produce disaggregated information in scientific ways that can ensure that there's no bias against or for any group, so that there's no harm done.

Tegan: Knowing that there is a lot of bad information out there, what is StatCan doing about it?

Eric: We are nurturing and developing the skill set of our experts so that it meets the needs of the digital world and an ever-changing society.

We make sure to occupy the space. So, wherever there's data discussions, data information, we participate at the discussion table, at the decision-making tables, to enable those decisions by providing very relevant information.

And lastly, we also team up. This is not just something that Statistics Canada does on its own. We team up with the other departments of the federal family. For example, there's a data strategy for the federal public service that was launched earlier this year, and Statistics Canada is a major player on that. And we also team up at the international level. We participate in the United Nations statistical activities, in the Conference of European Statisticians, the OECD. These are groups that look at the situations faced in different countries. So, it is really in the best interest of all to team up and tackle that as a group rather than as just one institution.

Tegan: So, you could say that we're fighting bad information by making sure there is good information out there that Canadians can access.

Eric: Yes, we are trying to create a reliable data space that is greater in importance than the dark cloud of misinformation.

Tegan: What's the role of good, high-quality research and data in the fight against misinformation?

Timothy: I actually think there is a little bit of a knowledge creation crisis right now. I really feel that liberal democracies around the world should make the creation of trustworthy science a priority. What do I mean by that? There's a replication crisis going on. There are predatory journals that are polluting our knowledge ecosystem. There are zombie papers. (These are retracted papers that just won't go away, that are still getting cited.) You know, we need to create trustworthy, independent science that is distributed in a way that the public can feel confident about. And this is absolutely essential in the fight against misinformation, to inform our policies, and to create trust within the community more broadly.

Tegan: What tips do you have for someone who wants to stay informed but who doesn't know how to navigate this landscape?

Timothy: There are strategies that can be used. You know, we've talked about a couple of them.

First of all, pause. That simple strategy really can make a difference. I believe in trying to slow down your information environment a little bit, you know, slow down that bombardment.

The other thing we need to do is get people to understand evidence, the nature of evidence, better. An anecdote is not the same thing as a well-done study. And just teaching people that very straightforward thing, I think, and research backs us up, can really make a difference. And that's something that you can deploy day to day, right? Just asking, okay, where does this evidence come from? Right? I also think that as things like AI become more common, we are going to have to really invite people to use fact-checking skills even more. And that's sort of where, you know, being pessimistic, my lead-in comes in: it's hard enough right now to get people to fact check, to pause, but with AI, I think it heightens that need even further, right? You really do need to investigate a little bit more. How legitimate is this image? How legitimate are these claims? People need to use those fact-checking strategies even more now, and unfortunately, it's just going to get worse in the future. More broadly, I'm a big advocate of teaching critical-thinking skills throughout our education system, from, you know, kindergarten right to the end of university, and for adults too. We should have resources available for adults throughout the lifespan.

Tegan: What makes a source of information reliable? If you take the census, for example, what differentiates something like the census from other sources of information that Canadians might find, especially online?

Eric: First of all, it's the fact that it is produced by a trusted organization. So the source of the data, in this case, Statistics Canada, and the transparency about the whole process: how we design it, which methods we use, how we consult. Everything we do about the census is made available to the users. So, transparency is key. Another aspect is the scientific aspect. The methods we use in our programs, such as the census, are published methods. And sometimes they are published in peer-reviewed scientific journals.

Tegan: Number one: Ask yourself, who is producing this information and why?

Eric: What is the source? Is it a recognized source, like a national statistical organization such as Statistics Canada? Is it a university? Or is it just some private website of some sort?

Tegan: Are they transparent?

Eric: One has to seek evidence of transparency, as I said. So are the methods readily available, or are they hidden and nowhere to be found?

Tegan: What can you find about their methodology? Can you ring them up and ask questions? Can they back up their claims?

Eric: And if they actually know, then they can correct it or deny it, but one should not hesitate to double-check elsewhere or contact the organization that produced the information.

Tegan: If you don't know for sure, be mindful of what you share. It could be misinformation.

Eric: What people should also do is not proliferate or pass along information when it's unclear where it comes from.

Tegan: Other than not spreading misinformation, how else can Canadians help StatCan fight misinformation?

Eric: One way to participate and contribute is to provide information to Statistics Canada and respond to surveys. That enables Statistics Canada to produce very solid information for citizens in return, in forms they can use for decision making.

Tegan: It can't be said enough. StatCan produces high-quality data worthy of your trust. That's invaluable in this information environment, but the agency relies on you, puts its trust in you, to create it.

Tegan: If someone would like to learn more about your work, where should they go?

Timothy: Well, I'm easy to find online. I'm on a variety of social media platforms: @CaulfieldTim. That's where you can find my noise. And we have our own misinformation project, called #ScienceUpFirst, where we try to counter misinformation and talk about science literacy and media literacy in a very constructive, positive way, using diverse voices from across Canada. So please come be part of the ScienceUpFirst team.

Tegan: If someone would like to learn more, where can they go?

Eric: I would start with Statistics Canada's website. The Trust Centre is really a place where you will learn what we do, what we're planning, and what sort of data collection is coming, and if more is needed, there is information on who to contact.

Tegan: You've been listening to Eh Sayers. Thank you to Timothy Caulfield and Eric Rancourt for taking the time to speak with us.

You can subscribe to this show wherever you get your podcasts. There, you can also find the French version of our show, called Hé-coutez bien! If you liked this show, please rate, review, and subscribe.

One more thing! If you've enjoyed hearing the stories behind the numbers on our podcast, you can get even more by downloading our newest mobile app, StatsCAN. Access the latest publications and get notified when there's new information relevant to your interests, like agriculture and food, health, or science and technology. The StatsCAN app is available for free in the Apple and Google app stores. Check it out!

And thanks for listening!


Statistics Canada. "Media Consumption in Canada: Are Canadians in the Know?" Statistics Canada. Government of Canada, March 28, 2023.

"Statistics Canada's Trust Centre." Statistics Canada. Government of Canada, February 7, 2023.