May 19, 2024
Shielding science from politics: how Joe Biden’s research integrity drive is faring

Adam Levy: 00:03

Hello, I’m Adam Levy and this is Working Scientist, a Nature Careers podcast. This episode: scientific integrity in the United States of America.

This series discusses attacks on the freedom and safety of researchers, from how careers and lives are cut short by conflict, to the discrimination and restrictions that LGBTQ+ researchers face.

Each episode in this series also concludes with a follow up sponsored slot from the International Science Council (the ISC), about how it is exploring freedom, responsibility and safety in science.

So what does it mean to be free to conduct research? How does it affect a scientist if they don’t feel safe to speak out about their findings? And how do we protect academia from political interference?

These are questions that scientists, scientific institutions and governments grapple with, and there are threats to scientific integrity across the world.

Scientific integrity includes such core values as conducting research ethically, and honestly.

But it also covers, say, how researchers are able to blow the whistle on practices that they believe to be unethical.

Evi Emmenegger is a scientist at the US Geological Survey (USGS), part of the US Department of the Interior. She received two BS degrees, in fisheries and in microbiology, from Oregon State University, and her Master’s degree is from the University of Washington in Seattle.

The views expressed by Evi in this interview belong to her and do not represent the views of the US Department of the Interior, the US Geological Survey, or the United States government.

Evi Emmenegger: 01:56

I study aquatic animal pathogens at a federal research facility. I typically work with fish viruses, doing both in vitro molecular analyses and cell culture, and conducting in vivo live animal experiments with aquatic viruses. Next year will be 30 years in my current research position.

Adam Levy: 02:21

In her work, Evi became increasingly concerned by the welfare of test animals in her facility, as well as contamination of wastewater released into nearby wetlands.

But how these concerns were dealt with (or not), and what followed in her career has ended up affecting far more than her research.

Evi Emmenegger: 02:41

Their response, for the most part: my requests and warnings, which I had been making for over 11 years, were ignored. And then, in 2017, I started officially reporting the infractions to the research center’s animal care and biosafety committees, and the wastewater spills and breaches to regulatory authorities.

I also submitted a scientific integrity complaint at that time, hoping that someone up the chain of command would investigate and correct the problems, which would provide me some level of protection.

Instead, I was eventually placed on administrative leave in January of 2020, for 15 months, which was during the COVID crisis.

I was not allowed to do any work or research and not allowed to contact collaborators. And then eventually, I was fired in March of 2021.

And that was under the pretext that the scientific manuscript draft that I had produced earlier was of insufficient quality.

And note that that paper, after I was reinstated, was allowed to be submitted to a scientific journal for peer review. It was accepted with moderate to minimal revisions and published.

Adam Levy: 03:56

So how did you follow up after being put on leave and then fired? Did you take any further action?

Evi Emmenegger: 04:02

Yeah, so I was fortunate enough to have some nonprofit attorneys take my case on. The Biden office came in, and I received notice that I was supposed to be reinstated a month-and-a-half after the firing. And then it proceeded to a court case, which involved being returned to my true job, versus the job tasks that they had assigned me.

The court case also involved whether it was an unjustified firing, and whether it was whistleblower retaliation.

And the ruling in that court case came out recently: the administrative judge ruled that my firing was unjustified, and that it was due to whistleblower retaliation.

And he deferred to the agency on what was referred to as status quo ante, so that they didn’t have to return me to my original job performance plan. So that’s sort of the status I’m in now.

Adam Levy: 04:58

How did this whole process of reporting, and then dealing with the consequences of that, affect you, not just in terms of your career, but emotionally?

Evi Emmenegger: 05:08

I think of it as being sort of the most important and worst thing I’ve done in my career. But in terms of the effect personally, it’s devastating. Sort of mentally devastating, but also it’s had some physical impacts. And my family said I have to say that I’ve been diagnosed with post-traumatic stress, even though I’ve sort of resisted that, because, you know, I haven’t served in the military.

But you know, I have kids, I’ve got a sister, I have a mom, and during this time, and I’m still working on it, I just haven’t been the best, you know, wife, mother, daughter, sister.

I probably shouldn’t talk about that, or I’ll probably get too upset. It’s had some emotional impacts, for sure.

Adam Levy: 05:59

And has that shifted at all with the findings of this court case? Do you feel more secure and vindicated in some way? Or do those feelings still linger?

Evi Emmenegger: 06:09

I’m happy that the administrative judge ruled that it was an unjustified firing, and that it was whistleblower retaliation. But the government has now appealed that ruling. I guess I feel I’m in a constant state of fear about whether I’ll be demoted, or they’ll find another reason to fire me, things like that.

Adam Levy: 06:28

And that also feeds into your ability to speak openly about these things. I know speaking today was a bit of a challenge, right?

Evi Emmenegger: 06:36

Yeah, there was a long process of asking permission to speak with you, and I was told that I needed to seek ethics guidance. So I mean, they said yes, but it was very much the case that I have to be cautious in everything I say.

Adam Levy: 06:53

What would you say to someone who is now in the position that you were in, concerned about a particular issue, but perhaps also scared of the consequences that could follow for their career if they voiced those concerns?

Evi Emmenegger: 07:07

You’d better have a backup plan, and be prepared to be fired. So you’d better have an attorney lined up who is well familiar with federal workplace policies.

Find some allies who believe similarly to what you believe and are willing to stand up with you.

Documentation: I think I was fortunate that I had so much documentation and evidence of what was occurring at the facility. That helped my case. So you’d better have a ton of documentation, and potential witnesses who are willing and brave enough to stand up for you.

Adam Levy: 07:43

What then are your hopes for the future of protections for whistleblowers like yourself?

Evi Emmenegger: 07:49

The bottom line for me, in order to do the research we’re so passionate about, is that we need to ensure that we’re not causing harm to the environment, that we do not cause undue stress or death to the animals under our care, and that the scientists who carry out this research don’t have their health compromised.

And making improvements to our facilities and procedures is needed and great, but the only way to maintain consistent laboratory standards is to have independent accreditation of research facilities.

And then secondly, and this is the most important caveat for a whistleblower, if scientific personnel believe that there’s an issue that has not been resolved or handled properly, those people can then request an outside review from the same entity that issued the laboratory accreditation.

And then finally, and I think this would help with protections, that independent entity that issued the accreditation would also be tasked with doing follow-up wellness checks with the whistleblower, to ensure that other reprisals don’t occur later on.

Adam Levy: 08:54

Do you feel happy that you carried out these actions? Or is there now a part of you that regrets standing up?

Evi Emmenegger: 09:04

I don’t regret standing up. Was I happy about the outcomes? No. It just got to the point that I couldn’t stand by and watch, and it was my legal responsibility to report this stuff. I just don’t think I could live with myself if I had not done it.

Adam Levy: 09:21

That was Evi Emmenegger. Note that Nature contacted Evi’s employers for comment, but had heard nothing back by the time this episode was signed off. Evi is a scientist in the United States, and this episode focuses on that country, because over the past decade there have been huge swings in how the government has interacted with research under the presidencies of Obama, Trump and now Joe Biden.

I wanted to see the challenges that whistleblowers in the United States face more generally, and how that’s changed, if at all.

Lauren Kurtz is executive director of the Climate Science Legal Defense Fund, a nonprofit that helps environmental scientists in the United States who find themselves under fire.

I called her up to find out how we can define scientific whistleblowing, and how it fits into scientific integrity.

Lauren Kurtz: 10:12

In the colloquial sense, I think it is generally seen as someone who wants to call attention to an issue that should not be happening.

And in the scientific integrity realm, there are actually some legal protections. But more generally, I think it should apply to anyone who sees something happening in science that shouldn’t be happening and wants to speak up against it.

Adam Levy: 10:36

What kind of things actually hold back whistleblowing, and make it more challenging or maybe even dangerous for people to speak up when they see something that’s not right in the realm of research or academia or government science?

Lauren Kurtz: 10:52

In the US there are sometimes, unfortunately, very narrow applications of whistleblowing law. So that can hold people back. There are situations in which someone may not be legally protected against retribution, should they choose to speak up.

Almost more concerning is people who don’t understand whether they’re protected or not. They may well be protected, but they don’t know that. And that also gives them pause or may prevent them entirely.

You know, I think the fears people have, about things like getting fired, experiencing negative repercussions at work, negative media attention, are very understandable. And they do, in fact, happen.

There are ways to navigate that and mitigate that. And if you’re thinking about being a whistleblower, I certainly urge people to contact us at CSLDF or some other lawyer to really minimize the risks to you and your career. But it’s absolutely a tricky area.

Adam Levy: 11:43

Can you give any examples of when those fears have become reality and a scientist or researcher has spoken out and faced negative consequences as a result?

Lauren Kurtz: 11:54

Not everyone we work with is public; in fact, most of the folks we work with are not, even when bad things happen to them. They just don’t want to speak about it for fear that more bad things may happen. But one scientist we have helped publicly is Maria Caffrey.

At the time, she was working at the National Park Service in the US. This was in the Trump administration. And there were some pretty aggressive attempts to censor and undercut her climate research and some climate reporting that she was doing. And she very successfully pushed back.

She spoke to some journalists who were covering this, and there was some media attention on the attempted censorship. And at the end of the day her report was published. I think she felt like the outcome was much better because she had gotten this public attention on the issue. But her position was terminated.

And she was no longer working at the National Park Service a year later. And her belief and my belief, and I think the rational conclusion, is that this was obviously a direct result of her speaking out.

So she thankfully landed on her feet. But it was a really ugly episode. And it is the sort of thing that I think people are rightfully concerned might happen to them too if they were to try to do whistleblowing like that.

Adam Levy: 13:10

Now, of course, whistleblowing and threats to whistleblowing are not new. But did any of this really shift under President Trump’s presidency?

Lauren Kurtz: 13:22

Yes. In short, yes. And I think that is largely because there were just so many more threats in the Trump era.

Unfortunately, there were some very egregious, very widespread attempts to politically influence and otherwise inappropriately silence science.

And it’s not surprising now that the federal workforce has been decimated. There’s a morale issue. And you know, even the people who didn’t put their careers on the line were definitely negatively affected.

It can’t be overstated: the Trump administration was pretty hostile to climate science, and many other disciplines as well.

Adam Levy: 13:58

In contrast, what have we seen under President Biden? Has it just been a shift back to what we saw before President Trump? Or has there been something more than that?

Lauren Kurtz: 14:09

The main thing I would note with the Biden administration is that it has been much better than the Trump administration, but it has not been as good as I might have hoped.

There have been some real opportunities for reform that I think have been missed. And there have been some continuing problems that haven’t been rectified.

The whistleblowing protections, as I’ve noted, are thin in some situations. They should certainly be more robust. And just having a culture in which scientific integrity is promoted and valued and a core part of federal science work, that could use some bolstering too.

Adam Levy: 14:40

How do these threats that we’ve spoken about to whistleblowers in the United States reflect threats to whistleblowers around the world in other countries?

Lauren Kurtz: 14:50

Attempts to politically influence and censor and manipulate science are a global and very well documented historical phenomenon. I mean, we have assisted, to the extent we’re able, scientists in other countries. So it certainly is a global phenomenon. And if you wanted to speak to whistleblower lawyers who have more international experience, I would recommend the Government Accountability Project.

Adam Levy: 15:13

What advice would you give to academics, in the United States or otherwise, on blowing the whistle on scientific wrongdoing?

Lauren Kurtz: 15:22

You know, despite my somewhat dour analysis of the state of play, I still think it’s a really important thing to do. And certainly, if it’s something you’re tempted to do, I want to commend folks who are interested in making the world a better place, basically.

I think the most important thing to do is really to think through your options and to truly understand the implications of what you’re doing. And I absolutely believe that there are always ways to mitigate risk.

So you know, you can reach out to us at CSLDF or other lawyers, and I am sure that they can help you figure out ways to do it, perhaps anonymously.

Or maybe there are some non-legal manoeuvres that may be more fruitful than, you know, coming out guns blazing. So there are definitely options that are worth pursuing. I’m not going to pretend it’s easy. But I do think being strategic about undertaking this is absolutely worth the extra time and effort.

Adam Levy: 16:15

Is there anything that we can do as individuals in academia or as academic institutions, to try to create an environment that encourages and supports people to come forward in these kinds of ways?

Lauren Kurtz: 16:29

Yes, absolutely. I mean, one, don’t punish the people who do come forward. Even if their claims are found not to be a true violation, or there was a misunderstanding or something, it’s imperative not to punish people who came forward with good-faith claims.

Secondly, I think actually there needs to be some level of punishment for the wrongdoers that can be commensurate with what actually happened.

So I think showing that these claims are taken seriously, and not making it too hard for the people who are trying to raise them, I think that’s really important.

And we haven’t always seen that play out, which is part of the problem.

Adam Levy: 17:01

Lauren Kurtz there. But threats to whistleblowers are far from the only threats to scientific integrity and researchers. What happens when research findings themselves become politicized?

Jacob Carter: 17:13

The most well known example of the Trump administration in particular, violating scientific integrity was something that came to be known as Sharpiegate.

Adam Levy: 17:25

This is Jacob Carter, research director for the Center for Science and Democracy at the Union of Concerned Scientists in the United States.

Jacob Carter: 17:33

President Trump took a permanent marker, and on a map actually drew a path of a hurricane and then really sort of doubled down that this path was going to mean that folks in Alabama were going to be impacted by the hurricane, which was not true at all.

Yet, the president, because he had tweeted about it, really wanted to double down, so drew the path himself, a wrong path, using a permanent marker.

This had a huge fallout, because the National Weather Service actually had to come out to protect public health and safety and say that that path was incorrect.

There was a full investigation that was launched into scientific integrity. And that’s, you know, an example that is really well known and could have really had harmful consequences. I mean it did have some harmful consequences on people’s mental health, who were really concerned about getting their family members out of the path of this hurricane. So it can have huge impacts.

Adam Levy: 18:51

Now, for people who aren’t familiar with the organization, can you explain a little bit about what the Union of Concerned Scientists is? Although I guess the name is a bit of a clue.

Jacob Carter: 19:00

Sure, the Union of Concerned Scientists is a nonprofit organization that was started by MIT professors and graduate students over 50 years ago, when they were concerned that the United States was investing too many resources in wartime efforts and not investing enough resources regarding environmental concerns in the United States.

And of course, this was at a time when we literally had rivers on fire because there was so much pollution in our water. And so the organization has evolved since but we still at the heart of our mission really believe that science should inform decisions that affect the public health and safety and the health and safety of our environment.

Adam Levy: 19:54

When we talk about integrity within science, within research, what are we actually talking about?

Jacob Carter: 20:00

Well, I think there are two separate ideas there. So one is research integrity, which really involves not manipulating your data, not plagiarizing.

Those kinds of things that you should be mindful of as a researcher. Scientific integrity has really come to be more about politicization of science.

So it has really become a term that is synonymous with decision making by policy or government entities.

So this means that you could have a violation of scientific integrity, for example, if you have a political leader who goes into a scientific policymaking document and changes wording such that it downplays, let’s say, the impacts of climate change, or deletes some language showing that a species should be listed as endangered.

So those sorts of things are what we’re talking about when we talk about scientific integrity.

Adam Levy: 21:14

Now, of course, these are concerns across countries, across governments. But a lot of these issues really came to a head in the United States, at least under President Trump.

Can you explain what actually happened over the course of this presidency when it comes to scientific integrity?

Jacob Carter: 21:31

Sure. And there’s one thing that I want to mention before I get into the Trump administration: I actually published a paper with a former colleague, Emily Berman, who is a historian.

And under every administration we can find an example where the administration has politicized a science-based decision-making process.

So it doesn’t matter if it’s a Republican administration, a Democratic administration. Every administration that we looked at, at least dating back to Eisenhower and probably before, does try to politicize science.

The thing that set the Trump administration apart was the sheer frequency at which they violated science-based decision making processes or politicized them, and how they responded when they were called out about politicizing those science based decision making processes.

So to give you an example of the frequency: we at the Union of Concerned Scientists have documented scientific integrity violations since the George W Bush administration.

Under that administration, the Union of Concerned Scientists documented 98 attacks on science over eight years.

Under the Trump administration, we have documented, and are still documenting as investigations continue to unfurl, over 200 attacks on science during four years.

That equates to about one attack on science every week. So that is really kind of what sets the Trump administration apart from any other administration that we have done research on when it comes to scientific integrity.

Adam Levy: 23:19

Could you give any examples of the kinds of attacks on scientific integrity that actually took place under President Trump’s presidency?

Jacob Carter: 23:30

One that you see quite often is censorship. So asking the scientific staff to not use certain language because of the political contentiousness of the topic.

And so what we saw was that climate change was censored from research documents, government documents. And staff also stopped using the term themselves because they knew how the administration felt about what came to be known as the double C word.

And so you not only have this effect of censorship, but you also have this effect of self-censorship.

Adam Levy: 24:16

Now that describes what happened over the four-year Trump presidency. How has this shifted under President Biden?

Is it just a return to the status quo? Or have there been any efforts to actively undo what happened under Trump?

Jacob Carter: 24:32

There have been efforts, and really at a historic level that we have not seen before. So right out of the beginning of the administration, President Biden released a presidential memorandum on restoring public trust in science through strengthening scientific integrity.

So one of the first things the administration did, which the Union of Concerned Scientists and our Center for Science and Democracy had been asking administrations to do for a long time, was to elevate the science advisor to the President to a cabinet-level position, to allow them into the meetings with other cabinet-level members, and to really elevate the importance of science in our government and decision-making processes.

And this presidential memorandum did that. The other thing that it did was it established scientific integrity officials at all federal agencies.

And what the Biden administration said was, every federal agency deals with scientific research or evidence in some capacity.

And that means there could be scientific integrity violations at every agency. So every agency needs a scientific integrity official.

The memorandum did a lot. It also established a task force on scientific integrity, which has produced a number of reports. This guidance will provide scientists the right to freely speak to the public and media about their scientific work, which is extremely important, especially in emergency situations like chemical spills, where the public is wondering whether their drinking water is actually safe to drink.

Before, scientists wondered whether or not they could actually get their work and message out to the public, and now this guidance will provide them the right to do so.

Adam Levy: 26:35

From your description of the measures under President Biden, it seems like the problem is being tackled head on. Are there limitations to this? Or does this genuinely serve as a model for what should happen within the United States, perhaps what should happen within other countries as well?

Jacob Carter: 26:53

There is a limitation here, because this is guidance. So agencies do not necessarily have to implement it. And it could be overwritten by a new administration, or another administration that comes in that doesn’t see the importance of scientific integrity in this way.

And so what really needs to happen is that Congress needs to pass legislation that codifies a lot of these provisions that the White House has put forward in this guidance, because otherwise the guidance doesn’t really have any weight to it when it comes to a legal framework. What it does do is set up a good culture of scientific integrity. So it’s still really, really important to have these policies in place. But they do not have any sort of legal heft to them if someone were to violate them.

Adam Levy: 27:56

Is there anything that academics or academic institutions as a whole could or potentially should do to try and uphold academic integrity?

Jacob Carter: 28:06

I think a lot of universities obviously focus more on the research integrity end of the spectrum. I think something more that they could do, and one of the scientific integrity reports that came out from this White House task force on scientific integrity mentioned this, is doing more training for graduate students on scientific integrity.

I think that is something they could be doing, because I’m sure politicization of science-based decision-making processes is something that could potentially happen at universities.

It’s also something I think that all scientists should be aware of, because this is something that should be very important to scientists, and important for scientists to speak out about when they see their governments interfering with these science-based decision making processes.

Adam Levy: 29:00

That was Jacob Carter. And that’s it for this episode of our special series on freedom and safety in science.

But threats to science come in many forms, and in the next episode, we’ll be looking at the challenges researchers face when they simply don’t have the resources to conduct their research.

Now, though, it’s time for a sponsored slot from the International Science Council about how it’s exploring freedom, responsibility and safety in science. Thanks for listening. I’m Adam Levy.

Lidia Borrell-Damián 29:40

The current world needs science to develop well-informed decisions, and that can only come from scientific autonomy.

Willem Halffman 29:49

Scientific autonomy does not mean that individual scientists can or should be able to do whatever they want.

Marnie Chesterton 29:56

Hello, and welcome to this podcast series from the International Science Council on Freedom and Responsibility in Science. I’m Marnie Chesterton, and in this episode we are looking at scientific autonomy. How can things like political interference or output metrics encroach on the freedoms of scientists? When might those freedoms compromise the responsibilities of scientists? And who gets to decide the limits of autonomy?

First of all, what is scientific autonomy? Lidia Borrell-Damián is the Secretary General of Science Europe, representing major public organizations that fund research in Europe.

Lidia Borrell-Damián 30:31

Scientists have the right to conduct research in the field of their choice. There should be clear and consistent regulatory frameworks, refraining from interference in decisions on the subjects of research. I would add also that no discipline can be excluded for political reasons.

Marnie Chesterton 31:07

In today’s global research landscape, both these aspects of autonomy, when it comes to scientists themselves and the institutions where they work, can be infringed in many ways. This can of course happen directly when governments pass laws that limit the freedoms of scientists and institutions, but it can also happen in more subtle or indirect ways.

Lidia Borrell-Damián 31:29

Governments set their priorities. They say, well, here’s what the money is for, and that also affects the choice of topic of a researcher, because maybe a researcher would have an idea, but there is no money to develop that idea, so that person goes in a different direction because there is money to develop something else. So there is a lot of nuance in what I’m saying here.

Marnie Chesterton 31:56

And it’s not just funding priorities that can distort research outputs. Indeed, the very systems that we use to evaluate research are themselves limiting the autonomy of scientists.

Lidia Borrell-Damián 32:07

Many researchers find themselves constrained by rigid research assessment systems that rely on countable indicators attributed to the impact of a journal or of a certain platform. We think that the importance of a scientific paper is not where the paper is published, it is the contribution of the paper to the advancement of research. Therefore, we would like to reposition the use of quantitative indicators and make them much less important when assessing individual researchers. And second, develop ways to assess other types of output beyond articles. Let’s talk about software, let’s talk about prototypes, etc., which today may not receive the attention or the recognition that they deserve. There is now a whole movement in the academic sector as to how the scientific community thinks we need to be assessed. So it’s really a worldwide discussion on this issue.

Marnie Chesterton 33:21

Making research assessments broader and less focused on metrics should lead to more autonomy for researchers. But of course, not all science happens within academia, and that brings its own challenges.

Lidia Borrell-Damián 33:35

There is very little knowledge of what happens in the private sector regarding research. That is a big black box. I think companies should make an effort to make their research processes and policies transparent. Very little has been developed in terms of accountability to society. So my proposal here would be to strengthen the dialogue on public–private research investment, to agree on a set of common policies that would be a reflection of the values that underpin research.

Marnie Chesterton 34:15

This last point about accountability applies to science everywhere, not just the private sector, because any discussion of scientific autonomy has to recognize that it’s a double-edged sword.

Willem Halffman 34:28

So it’s not so much a balance between autonomy and scientific responsibility; rather, the two make each other possible. They’re actually connected to each other.

Marnie Chesterton 34:48

This is Willem Halffman, a sociologist of science working at Radboud University in the Netherlands. Willem points out that, on the one hand, there are lots of reasons to protect and value scientific autonomy.

Willem Halffman 34:51

So this relative independence of scientists is really important. First of all, we need impartial assessments of the safety of our products, of the safety of our medicines. We also need independent scientists because we need people to warn us of dangers that might be ahead. Even if we don’t like to hear it, sometimes we also need scientists to tell us that we are wrong, that we’re doing things that are not working. And yes, if you let scientists tinker, sometimes they come up with radical new ideas and breakthroughs that in the long run can lead to products. And lastly, you could also say, well, we need this knowledge community because knowledge is a cultural good and a value in and of itself, just like we don’t interfere too much with art or with journalism.

Marnie Chesterton 35:41

But on the other hand, autonomy that goes completely unchecked or unchallenged can be dangerous as history has taught us.

Willem Halffman 35:49

As societies, we’ve learned, sometimes the hard way, that if you award scientists this relative independence, they don’t automatically do the right thing. Things have gone wrong in the past. Sometimes when you let scientists decide for themselves, they will strike ethical balances that we don’t agree with. For example, they might think that it’s okay to experiment on their patients. Sometimes, if you leave them to their own devices, they might invent new mechanisms of mass destruction. They might come up with dangerous new technologies. So we want scientists to be accountable for these kinds of things. We want them to explain to society what is at stake and how we can find ways to deal with that.

Marnie Chesterton 36:34

So how do we ensure scientists live up to their responsibilities while giving them the relative autonomy that we’ve heard is so important? Well, according to Willem, it’s not just about regulation.

Willem Halffman 36:49

Part of how we keep scientists responsible is, on the one hand, by making them responsible; that is, we put them under research evaluation control systems, we make them apply to ethical committees if they’re going to do research with humans. There are all kinds of regulatory systems that apply to scientists to kind of force them to be responsible. But I think it’s also important that we make clear to future scientists that we are actually giving them quite a lot of power when we hand them the keys to the laboratory. There are a lot of powerful things you can do with science. Therefore, you also need from scientists the right kind of mindset. And that right kind of mindset is a matter of socialization, a matter of teaching scientists how to behave, how to talk, and stressing how important it is for them to maintain this responsibility as part of the social contract for science.

Marnie Chesterton 37:46

Importantly, the limits of scientific autonomy are not fixed. Instead, they must be continually renegotiated in the light of the issues we face in science and society today.

Willem Halffman 37:59

Most of our ideas about scientific autonomy were very much shaped by things that happened in the 20th century, by the experience of the Second World War. But in our timeframe, there are all these new threats to scientific autonomy. By now, we’ve discovered that science can have really deep biases, can be racist, can be sexist. Science can be manipulated by organized industrial interests on an enormous scale, for example by disproportionately highlighting the uncertainties of climate change or smoking. So the answers that we come up with now might help us now, but might in another couple of decades lead to other unintended consequences and may need to be readdressed and reassessed.

Marnie Chesterton 38:52

That’s it for this episode on freedom and responsibility in science from the International Science Council. The ISC has released a discussion paper on these issues. You can find the paper and learn more about the ISC’s mission online at council.science/podcast. Next time, we’ll be looking at science communication. How can we promote the spread of scientific knowledge while guarding against misinformation and protecting scientists and researchers from online harassment?
