Episodes
Wednesday Jul 08, 2020
Emergent Research Ethics with Warren Speed
In this episode I talk to Warren Speed, postgraduate researcher at the University of Exeter, about the development of an emergent approach to research ethics during his PhD, and the Research Ethics Conference that recently secured funding for March 2021. During the podcast we discuss:
- British Education Research Association (BERA) Ethics and Guidance
- Research Ethics Conference website
- University of Exeter Research Ethics and Governance
You can find Warren on Twitter @WarrenSpeed1.
Music credit: Happy Boy Theme Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
http://creativecommons.org/licenses/by/3.0/
Podcast transcript
1
00:00:09,000 --> 00:00:11,000
Hello and welcome to R, D
2
00:00:11,000 --> 00:00:32,000
and the In Betweens. I'm your host, Kelly Preece, and every fortnight I talk to a different guest about researcher development and everything in between.
3
00:00:32,000 --> 00:00:38,000
Hello and welcome to Episode four. In this episode, I'm going to be talking to Warren Speed.
4
00:00:38,000 --> 00:00:44,000
He's a postgraduate researcher in the University of Exeter's Graduate School of Education.
5
00:00:44,000 --> 00:00:50,000
Warren takes a really interesting and I think innovative approach to research ethics in his project.
6
00:00:50,000 --> 00:00:56,000
He talks about research ethics as going beyond the processes and procedures that we have to engage with
7
00:00:56,000 --> 00:01:04,000
to meet university ethical requirements to what Warren terms an emergent approach to research ethics.
8
00:01:04,000 --> 00:01:08,000
Warren, are you happy to introduce yourself? So, I'm Warren Speed.
9
00:01:08,000 --> 00:01:16,000
I'm studying for a PhD in education and I look at fundamental British values within the Prevent Duty.
10
00:01:16,000 --> 00:01:23,000
And I look at how schools apply this agenda across the regions of England.
11
00:01:23,000 --> 00:01:29,000
My conference has come about from my PhD, although it wasn't expected.
12
00:01:29,000 --> 00:01:38,000
The conference funding is to put on a research ethics conference that brings together people who
13
00:01:38,000 --> 00:01:44,000
are from various different disciplines, cross-discipline, and also from various
14
00:01:44,000 --> 00:01:49,000
universities and also outside organisations to get together and present papers and to attend
15
00:01:49,000 --> 00:01:54,000
and listen and do workshops about everything and anything to do with research ethics,
16
00:01:54,000 --> 00:02:01,000
to kind of open up conversations around things like, for example, I'm a social scientist.
17
00:02:01,000 --> 00:02:07,000
So, for example, things I wouldn't have thought of, which animal ethics people do.
18
00:02:07,000 --> 00:02:15,000
And it's about trying to bring those types of ethical conversations together.
19
00:02:15,000 --> 00:02:18,000
So how did this come about in terms of your research?
20
00:02:18,000 --> 00:02:26,000
So how is it that you got so interested in some of the challenges in the discourse around research ethics?
21
00:02:26,000 --> 00:02:33,000
Yeah, I think it's a few things really. In my background, I used to be in a teaching union for about 12 years,
22
00:02:33,000 --> 00:02:43,000
and my role mainly within this union was Equalities Officer. So within Devon mainly, I was the Equalities Officer for schools around Devon.
23
00:02:43,000 --> 00:02:46,000
So I always had a thing about equity,
24
00:02:46,000 --> 00:02:55,000
all kinds of rights, equality and ethics, and making sure that everybody was treated equally, fairly and respectfully at all times.
25
00:02:55,000 --> 00:03:00,000
And the other thing was during my PhD, which looks at fundamental British values,
26
00:03:00,000 --> 00:03:04,000
it's got really deep roots into counter-terrorism within schools.
27
00:03:04,000 --> 00:03:10,000
So I had to be very mindful of that and mindful that participants might not want to get
28
00:03:10,000 --> 00:03:17,000
involved because they might have their own concerns and worries about it.
29
00:03:17,000 --> 00:03:24,000
And since thinking along those lines, my ethics awareness
30
00:03:24,000 --> 00:03:27,000
has got greater, and as it's got greater, I got a lot more interested in it.
31
00:03:27,000 --> 00:03:31,000
So now my PhD doesn't just have an ethics section.
32
00:03:31,000 --> 00:03:40,000
The entire thesis is all about fundamental British values and research ethics and how I've applied ethics and an ethically minded,
33
00:03:40,000 --> 00:03:44,000
very emergent, ethically emergent approach throughout the PhD.
34
00:03:44,000 --> 00:03:52,000
Yeah, so I think the thing that has really interested me when we've spoken in the past is that idea of emergent ethics.
35
00:03:52,000 --> 00:04:02,000
So the way in which ethics is embedded into the research process rather than being an approval process, I guess, that you go through at the beginning.
36
00:04:02,000 --> 00:04:07,000
Can you talk a bit more about that and what that means in terms of your research?
37
00:04:07,000 --> 00:04:12,000
Yeah. Okay. I actually wrote a paper on this. So I looked.
38
00:04:12,000 --> 00:04:18,000
I looked at what was happening in regards to procedural ethics or institutionalised ethics.
39
00:04:18,000 --> 00:04:23,000
So things that were going on within the university and that the university wanted
40
00:04:23,000 --> 00:04:30,000
you to do in order to get ethical approval, the hurdles you had to jump through. I had to look at some guidance forms and things like that.
41
00:04:30,000 --> 00:04:34,000
And although I do agree with it, it does definitely have its place.
42
00:04:34,000 --> 00:04:46,000
You know, I've always thought it was never enough. There was never enough done in order to really, truly be ethically minded whilst conducting research.
43
00:04:46,000 --> 00:04:51,000
So procedurally, people would fill out an ethical application form.
44
00:04:51,000 --> 00:04:55,000
The application form gets sent to an ethical panel.
45
00:04:55,000 --> 00:05:01,000
An ethics panel will decide whether or not the ethics application form is sufficient for you to do the
46
00:05:01,000 --> 00:05:08,000
research. Once you've got that approval, I was curious to know how many people actually went back to revisit
47
00:05:08,000 --> 00:05:12,000
the ethical approval through the data collection or research process.
48
00:05:12,000 --> 00:05:17,000
And I found out through contacts at the University of Exeter that it's not very many at all.
49
00:05:17,000 --> 00:05:23,000
And they actually can't think of many people at all who've had to go back and keep on redoing it.
50
00:05:23,000 --> 00:05:30,000
So I had massive issues and concerns around that, because I was thinking, when you're conducting data collection,
51
00:05:30,000 --> 00:05:36,000
surely your ethics or your ethical standpoint should change because you're meeting people,
52
00:05:36,000 --> 00:05:42,000
you're building rapport and friendships and relationships, whatever, with participants. Therefore,
53
00:05:42,000 --> 00:05:45,000
this should change the dynamic of your ethical approach.
54
00:05:45,000 --> 00:05:55,000
And this is why I thought taking a very emergent approach whilst complying with procedural ethics was very important.
55
00:05:55,000 --> 00:06:07,000
The emergent approach allows me to really think, as I'm going along, about the kind of ethical dilemmas or implications,
56
00:06:07,000 --> 00:06:18,000
or even ethical successes, that are coming ahead of me or happening at the time, and the things that I need to do to change them.
57
00:06:18,000 --> 00:06:22,000
Yeah. The other thing I used to do, well, I still do actually, because I can
58
00:06:22,000 --> 00:06:24,000
still collect a little bit of data through the process.
59
00:06:24,000 --> 00:06:30,000
I've spoken to my participants as well in regards to the ethics and if there is anything specific
60
00:06:30,000 --> 00:06:35,000
to that school that ethically I should know about and how I could ethically support them,
61
00:06:35,000 --> 00:06:40,000
because I have a variety of schools with a variety of backgrounds.
62
00:06:40,000 --> 00:06:48,000
And asking that question is quite good, actually; I managed to get quite a lot more information I never would have thought of,
63
00:06:48,000 --> 00:06:53,000
that I could put into my ethical writing, my processes and my application forms.
64
00:06:53,000 --> 00:07:03,000
So an emergent approach, I highly recommend it. It is very beneficial and it really puts yourself and the participants,
65
00:07:03,000 --> 00:07:07,000
the human participants in my case, at the centre of the research.
66
00:07:07,000 --> 00:07:14,000
Ethically, I think ethics should be the centre of the research. Without the ethical clearance, without doing things properly and respectfully,
67
00:07:14,000 --> 00:07:21,000
you can't actually have, I don't think, a very ethically sound piece of work.
68
00:07:21,000 --> 00:07:26,000
Yeah, and I think there's something in that that really resonates with me, as
69
00:07:26,000 --> 00:07:35,000
somebody who, as an academic and researcher, was always working with people, and certainly as an arts researcher,
70
00:07:35,000 --> 00:07:44,000
there's kind of a strong awareness and presence of the 'I', and reflection on your subjectivity within the research.
71
00:07:44,000 --> 00:07:52,000
And I've always considered that to be about the methodology, but also about the ethics of the research. It's about making clear your
72
00:07:52,000 --> 00:07:57,000
place and perspective as the researcher and how that frames everything you're doing.
73
00:07:57,000 --> 00:08:00,000
And I think there's something really interesting about this difference between
74
00:08:00,000 --> 00:08:05,000
what you're calling procedural ethics and that emergent ethics,
75
00:08:05,000 --> 00:08:12,000
which sort of really speaks to me as, you know, a form of ethical reflective practise.
76
00:08:12,000 --> 00:08:18,000
Yes. And it's something that I find quite odd doesn't actually happen.
77
00:08:18,000 --> 00:08:20,000
This is the reason why I want to put this conference on.
78
00:08:20,000 --> 00:08:26,000
It's to start these conversations, to say, look, you know, procedural or institutionalised ethics does have its space.
79
00:08:26,000 --> 00:08:31,000
It really does. Of course it does. And also, the university has to mitigate against any issues.
80
00:08:31,000 --> 00:08:36,000
And, you know, anything that might happen, you know, there's legal requirements there as well. It has to happen.
81
00:08:36,000 --> 00:08:42,000
But I'm hoping that this conference will start opening up this conversation to say, look, everyone,
82
00:08:42,000 --> 00:08:47,000
we need to start looking at ethics not just in an emergent way, but also in a very innovative way.
83
00:08:47,000 --> 00:08:51,000
The world is changing quite a lot at the moment. You know, we've got a lot going on.
84
00:08:51,000 --> 00:08:56,000
We've had, you know, very controversial things happen in the UK, like we've had Brexit.
85
00:08:56,000 --> 00:09:01,000
We've had a change of prime ministers before October 2019. We've had a lot of things happening.
86
00:09:01,000 --> 00:09:08,000
We've got the Black Lives Matter movement, which is happening at the moment. We really need to not just be very static in what we do.
87
00:09:08,000 --> 00:09:16,000
And one of the negative things I think about procedural ethics, or what the universities call institutionalised
88
00:09:16,000 --> 00:09:23,000
ethics, is that we are expected to follow a specific or recommended ethical guidance.
89
00:09:23,000 --> 00:09:26,000
So, for example, in the School of Education where I am,
90
00:09:26,000 --> 00:09:34,000
we're told that we should be looking at, as you know, BERA, the British Educational Research Association ethics.
91
00:09:34,000 --> 00:09:38,000
But that is very static. It's stuck at a certain point.
92
00:09:38,000 --> 00:09:44,000
And I think it's up to the researcher to really go as far as they can consistently through
93
00:09:44,000 --> 00:09:48,000
their research, to start looking at innovative ways of how they can ethically do
94
00:09:48,000 --> 00:09:53,000
the research and be ethically minded the entire time. And it doesn't happen.
95
00:09:53,000 --> 00:10:01,000
And I know it doesn't happen because I've spoken to colleagues, both academic colleagues and student colleagues.
96
00:10:01,000 --> 00:10:05,000
And I've had this conversation with them and they just don't do it.
97
00:10:05,000 --> 00:10:16,000
One of the things I've been told, which just doesn't normally happen, I'd say, is that when PhD students,
98
00:10:16,000 --> 00:10:20,000
for example, need to update their ethics, they should keep on putting it through the ethical panel.
99
00:10:20,000 --> 00:10:24,000
The ethics panel should be approving it. That's what they're there for. That's what they should be doing.
100
00:10:24,000 --> 00:10:28,000
And it also keeps a paper trail, keeps everyone safe as well.
101
00:10:28,000 --> 00:10:35,000
I've been told by senior academics that, oh, you know, the university hasn't the time, that people don't have time to do that.
102
00:10:35,000 --> 00:10:40,000
And my response is, so what? It is your responsibility to do it.
103
00:10:40,000 --> 00:10:45,000
It shouldn't be happening. We should be giving our participants and ourselves and the university,
104
00:10:45,000 --> 00:10:52,000
and everyone involved in the research, the utmost respect, and do whatever is necessary.
105
00:10:52,000 --> 00:11:03,000
It doesn't happen all the time, no. I think there's a number of things that are really crucial for me here, because this is, you know,
106
00:11:03,000 --> 00:11:06,000
this is something that is quite new to me as a topic area and that sort of thing
107
00:11:06,000 --> 00:11:11,000
about ethics not being static, because the world isn't static and research isn't static.
108
00:11:11,000 --> 00:11:16,000
We know that research is constantly evolving through our research processes.
109
00:11:16,000 --> 00:11:22,000
So why would our ethical standpoint or ethical approval or ethical methods,
110
00:11:22,000 --> 00:11:28,000
Why would that be static if we know that the research is in and of itself ever changing?
111
00:11:28,000 --> 00:11:36,000
Yeah, I mean, it has been like that, it is static, but I don't think it's static through anyone's fault.
112
00:11:36,000 --> 00:11:42,000
Not particularly. I just don't think there's enough training that goes on at all levels,
113
00:11:42,000 --> 00:11:49,000
from undergrad, masters, PhD, right up to professorships. There's not enough training that goes on
114
00:11:49,000 --> 00:11:57,000
on ethics. We need training right from the beginning of all of our studying, whether we're looking at human
115
00:11:57,000 --> 00:12:01,000
data or tissue, animal ethics, whatever. We need ethical training
116
00:12:01,000 --> 00:12:06,000
right from the beginning, to just say to us that you do have to use the procedural stuff. It is static,
117
00:12:06,000 --> 00:12:10,000
but it is a start. Don't frown on it. You do need it.
118
00:12:10,000 --> 00:12:14,000
But you do need to take a very emergent approach. And that comes from you as a person,
119
00:12:14,000 --> 00:12:18,000
not the university. And you can only do that individually.
120
00:12:18,000 --> 00:12:26,000
And then you've got to ask the question, I guess, who is going to make sure that that particular PhD student of the day does take an emergent approach?
121
00:12:26,000 --> 00:12:30,000
Do they do it themselves? Does the supervisor check on it? Does the doctoral college do it?
122
00:12:30,000 --> 00:12:35,000
I mean, it's not built into the systems within the university. We don't have any.
123
00:12:35,000 --> 00:12:42,000
We don't have anything. Nothing exists, which I find strange. I find it really odd.
124
00:12:42,000 --> 00:12:51,000
Yeah, and I really like what you're saying about the difference between, you know,
125
00:12:51,000 --> 00:13:00,000
because I guess the message that came back to you from that senior academic is about, I guess, the administrative or workload burden associated with
126
00:13:00,000 --> 00:13:14,000
the resubmission of, you know, the kind of emergent approach to ethical approval, and confusing that sense of workload and box-ticking with the actual,
127
00:13:14,000 --> 00:13:22,000
the fundamental kind of principles of the way that we operate in this environment. And I can see, you know,
128
00:13:22,000 --> 00:13:28,000
I can really see how this approach of emergent ethics would be crucial, not just in terms of conducting research,
129
00:13:28,000 --> 00:13:33,000
but thinking about how we operate and treat each other as a community.
130
00:13:33,000 --> 00:13:39,000
Yeah, I mean, I do find it really odd that we don't have emergent ethics properly.
131
00:13:39,000 --> 00:13:43,000
No one really talks about it, and I know a lot of people.
132
00:13:43,000 --> 00:13:48,000
I'm always up for conversations like this, all the time. But I still can't believe
133
00:13:48,000 --> 00:13:51,000
I'm having conversations where I'm still trying to present this sort of idea.
134
00:13:51,000 --> 00:13:56,000
So we're trying to talk about it. It's really strange that this just doesn't happen.
135
00:13:56,000 --> 00:14:00,000
I think that universities, all universities, not just us,
136
00:14:00,000 --> 00:14:05,000
everyone, all of them, need to do more. I mean, we've got great resources.
137
00:14:05,000 --> 00:14:11,000
We've got great people in the university where we are at the moment; we've got the governance and ethics manager.
138
00:14:11,000 --> 00:14:16,000
She's brilliant. She's so good. And she's always got work to do.
139
00:14:16,000 --> 00:14:20,000
Surely that says something. If she's always got work to do,
140
00:14:20,000 --> 00:14:28,000
there's more scope, there's more things that we can do within ethics. There need to be more people to support this person within her role.
141
00:14:28,000 --> 00:14:34,000
I don't think many places take it as seriously as they would like us to think.
142
00:14:34,000 --> 00:14:42,000
I think they just do it because they have to do it.
143
00:14:42,000 --> 00:14:49,000
And it's not enough, and it's disrespectful to participants, or it's disrespectful to any living thing
144
00:14:49,000 --> 00:15:01,000
you're doing research on. How is the conference going to challenge some of these kind of fundamental issues and flaws in our system,
145
00:15:01,000 --> 00:15:08,000
in the sector, of approaching research ethics? Well, I mean, I don't think it's as much about challenging.
146
00:15:08,000 --> 00:15:17,000
I think it's about starting to open up conversations. Yeah, I've done a little bit of a look around to see what conferences go on
147
00:15:17,000 --> 00:15:21,000
in regards to research ethics within the UK or England, and there isn't any. I haven't
148
00:15:21,000 --> 00:15:28,000
found anything. So I'm trying to open up that debate and conversation, to get people to start sharing their stories, to open up networks.
149
00:15:28,000 --> 00:15:33,000
But what's even more important is keeping the conference free. So it's accessible,
150
00:15:33,000 --> 00:15:39,000
but also making sure that we actively invite organisations within the south west to come along
151
00:15:39,000 --> 00:15:45,000
for free, and to apply for a bursary if they need one, or if they're self-employed or something,
152
00:15:45,000 --> 00:15:50,000
where we can have academia, academics,
153
00:15:50,000 --> 00:15:55,000
and we can have these professionals in organisations or skilled trades or whatever
154
00:15:55,000 --> 00:16:00,000
coming into the conference to have these conversations.
155
00:16:00,000 --> 00:16:04,000
I mean, we don't know what's going to happen within the conference, because
156
00:16:04,000 --> 00:16:10,000
that hasn't happened yet. But what's really important is starting to open these conversations and
157
00:16:10,000 --> 00:16:18,000
start to ask questions about how we conduct our research ethically that way.
158
00:16:18,000 --> 00:16:23,000
Yes. Opening up the conversation and getting people to think more about
159
00:16:23,000 --> 00:16:28,000
the way they approach ethics within their own research, but also outside as well.
160
00:16:28,000 --> 00:16:37,000
Yeah, across research, cross-discipline, looking at everything. Because when do we ever get the chance to?
161
00:16:37,000 --> 00:16:43,000
I don't get the chance, for example, to speak to someone who looks at human tissue or animal ethics.
162
00:16:43,000 --> 00:16:50,000
I'd love to. I would love to be able to take something away I would never have thought of, that would actually work as a form
163
00:16:50,000 --> 00:16:54,000
of a model which I could adapt to fit within my social science research.
164
00:16:54,000 --> 00:16:58,000
I think that's what we need to do. I think we need to start opening up these really,
165
00:16:58,000 --> 00:17:08,000
really important conversations, because we owe it to the participants, or the animals, or other living things, organisms, whatever.
166
00:17:08,000 --> 00:17:11,000
We owe it to them. You can't just take it.
167
00:17:11,000 --> 00:17:18,000
You can't just do ethics as an obligation, tick it off and get on with it and never look at it again.
168
00:17:18,000 --> 00:17:22,000
I think that's unacceptable. It's a responsibility.
169
00:17:22,000 --> 00:17:25,000
Of course it is. Yeah, if you're doing research,
170
00:17:25,000 --> 00:17:31,000
usually you should want to feel responsible for the living things and the participants, et cetera.
171
00:17:31,000 --> 00:17:36,000
You would want to make sure that these people are safe, or these things are being respected.
172
00:17:36,000 --> 00:17:40,000
It's complicated, but you can do something about it even if you don't understand it.
173
00:17:40,000 --> 00:17:48,000
So by having these conversations, challenging the status quo on ethics, challenging the ethics panels, challenging
174
00:17:48,000 --> 00:17:53,000
why the organisation or academic institution isn't doing enough for ethics.
175
00:17:53,000 --> 00:17:58,000
I think you need to look at everything and constantly think and challenge.
176
00:17:58,000 --> 00:18:06,000
You don't ever have to understand it all. You just need to be making an effort to protect yourself, your participants and everything else.
177
00:18:06,000 --> 00:18:20,000
I think so. Aside from our responsibility to those involved in our research, whether those be human participants or animal participants or tissues,
178
00:18:20,000 --> 00:18:27,000
what has this kind of sense of emergent ethics brought to your research?
179
00:18:27,000 --> 00:18:31,000
Well, you seem to sort of be saying that by having those emergent,
180
00:18:31,000 --> 00:18:37,000
ethical conversations with individual schools, you were getting more information.
181
00:18:37,000 --> 00:18:42,000
So what are the kind of benefits that you're reaping from having that emergent ethical approach?
182
00:18:42,000 --> 00:18:46,000
I think it's the outcome of my work, I suppose; my research has followed this process.
183
00:18:46,000 --> 00:18:50,000
My research has followed a completely different path than what I thought it would.
184
00:18:50,000 --> 00:18:55,000
So instead of looking at all the negative things, which I was going to do,
185
00:18:55,000 --> 00:18:58,000
I was going to look at negative and positive things around fundamental British values in schools.
186
00:18:58,000 --> 00:19:02,000
And putting together all this negativity, it didn't really add anything.
187
00:19:02,000 --> 00:19:05,000
It's not as nice as something
188
00:19:05,000 --> 00:19:12,000
that lets the researchers, the participants of this research, celebrate what they do, and actually look at all the positive things that they do.
189
00:19:12,000 --> 00:19:16,000
Let's share best practise. Let's look at all the great things that are going on.
190
00:19:16,000 --> 00:19:19,000
So it takes an appreciative enquiry approach.
191
00:19:19,000 --> 00:19:25,000
So I'm trying to avoid all the negativity that comes with it and focus on the things that they consider work well within
192
00:19:25,000 --> 00:19:31,000
what it is I'm looking for or looking at, in the hope to share, to share stuff.
193
00:19:31,000 --> 00:19:38,000
So I think because I opened up my ethics, and opened up ethical discussions with my participants,
194
00:19:38,000 --> 00:19:43,000
this is what I got out of it. There was too much negativity that surrounded it, but it's something I wanted to look at.
195
00:19:43,000 --> 00:19:47,000
So I had to question myself, do I need to, do my questions
196
00:19:47,000 --> 00:19:50,000
need to know all the negative things? And the answer is no, I don't.
197
00:19:50,000 --> 00:19:55,000
I can look at things positively, with an appreciative approach, which is what I've done.
198
00:19:55,000 --> 00:19:59,000
The other thing is, I've got Ofsted as well,
199
00:19:59,000 --> 00:20:07,000
who I've met a few times. They've come to contact me to discuss my research because they like the approach I take with it.
200
00:20:07,000 --> 00:20:11,000
Well, yeah, which is nice. And I've also got the funding for the conference,
201
00:20:11,000 --> 00:20:20,000
as well. And many of the schools that I've got, they want to, or some of them are looking at, waiving anonymity.
202
00:20:20,000 --> 00:20:25,000
They want to be known within the research because of the approach that I'm taking.
203
00:20:25,000 --> 00:20:33,000
And I think that's all because of the ethical process I took, or I'm still taking, which, let's face it,
204
00:20:33,000 --> 00:20:38,000
informed me that I shouldn't be taking a negative approach or looking at this discourse kind of negatively.
205
00:20:38,000 --> 00:20:42,000
I should be celebrating the great work that they do. And in the end,
206
00:20:42,000 --> 00:20:45,000
It's come about that way.
207
00:20:45,000 --> 00:20:53,000
And it's just, I mean, I've still got a lot of work to finish off all this, but it's done me really well.
208
00:20:53,000 --> 00:20:59,000
And I've just built up really great relationships with different organisations and groups of people.
209
00:20:59,000 --> 00:21:05,000
And also my relationships with the participants are excellent. It's really good.
210
00:21:05,000 --> 00:21:11,000
And they're always positive about adding more and getting involved. And that's because I've kept them within the process.
211
00:21:11,000 --> 00:21:15,000
So they've been part of the methods, the way I collect the data;
212
00:21:15,000 --> 00:21:20,000
they've been part of the ethics; they're really part of it at the end as well.
213
00:21:20,000 --> 00:21:26,000
They've all got a chance to write in the thesis, to write about their story,
214
00:21:26,000 --> 00:21:35,000
about the research and what it is they've learnt and how they might do things differently, and that type of stuff.
215
00:21:35,000 --> 00:21:43,000
So the ethics stuff, I think it benefits you in more ways than you actually might think.
216
00:21:43,000 --> 00:21:48,000
It's not just ethics in looking after people, your respected participants,
217
00:21:48,000 --> 00:21:53,000
but it's also about the other bits I talked about, the approach I take, the
218
00:21:53,000 --> 00:21:59,000
appreciative enquiry approach, and I think that goes hand-in-hand with ethics, taking an
219
00:21:59,000 --> 00:22:09,000
appreciative enquiry approach. Yeah, and it sounds like the impact
220
00:22:09,000 --> 00:22:14,000
the research has is going to be so much more wide-ranging, with the way that some
221
00:22:14,000 --> 00:22:19,000
schools are potentially waiving anonymity and sharing that best practise, isn't it?
222
00:22:19,000 --> 00:22:20,000
I was just, on this one,
223
00:22:20,000 --> 00:22:26,000
I just think, you know, the other thing is, the ethics is still going on right now because of the Covid-19 thing.
224
00:22:26,000 --> 00:22:31,000
So as we're saying, I've still got some information to collect from some schools and teachers,
225
00:22:31,000 --> 00:22:37,000
etc. But I have to ask myself, is it ethical for me to ask teachers now?
226
00:22:37,000 --> 00:22:42,000
if they're still interested to carry on with the research, or whether they have enough time to do it.
227
00:22:42,000 --> 00:22:46,000
Well, I could wait and I am waiting. I'm not going to put any more pressure on them.
228
00:22:46,000 --> 00:22:53,000
They're already under a lot of pressure. With being in the teaching union, and I also used to be a teacher, I can imagine the pressures they're under.
229
00:22:53,000 --> 00:22:58,000
And I don't think it's ethical for me to continue right now to collect data.
230
00:22:58,000 --> 00:23:10,000
That's why everything's on hold at the moment. Yeah. And I think particularly with Covid and the way that the world is shifting.
231
00:23:10,000 --> 00:23:15,000
Again, these things are going to become more and more crucial because of the ways in which we
232
00:23:15,000 --> 00:23:21,000
conduct research and the environments in which we work and the pressures that we're under.
233
00:23:21,000 --> 00:23:25,000
All of that is shifting so massively at the moment.
234
00:23:25,000 --> 00:23:33,000
And it's shown as well during the PhD, especially with teachers in schools across the regions of England.
235
00:23:33,000 --> 00:23:40,000
What I do is I keep touching base with them, asking how they are and if there's anything they would like me to do to support them.
236
00:23:40,000 --> 00:23:43,000
So I'm doing a few things to support some schools at the moment.
237
00:23:43,000 --> 00:23:50,000
Mostly it's around my research, but also creating some sessions with the teachers there to try and support them.
238
00:23:50,000 --> 00:23:54,000
And sometimes it's just touching base with them and asking if they're okay and that type of thing.
239
00:23:54,000 --> 00:24:00,000
Just so they know I haven't forgotten about them. Still, there is no pressure; they don't need to get involved in the research yet.
240
00:24:00,000 --> 00:24:04,000
I know that they know the things that they have to do.
241
00:24:04,000 --> 00:24:09,000
And I know that they're waiting. Yeah.
242
00:24:09,000 --> 00:24:15,000
It's just being respectful and listening. I would always say it's not ethical otherwise.
243
00:24:15,000 --> 00:24:23,000
Yeah. There's something, you know, perhaps somewhat ironically, given the kind of fundamental British values that you're researching.
244
00:24:23,000 --> 00:24:27,000
There is something about a fundamental value system within this,
245
00:24:27,000 --> 00:24:34,000
in research, about respect for those that are involved in our research, and care as well.
246
00:24:34,000 --> 00:24:39,000
Yes. There seems to be a huge amount of care in the way that you are approaching this.
247
00:24:39,000 --> 00:24:44,000
Yeah, I mean, I think it is important. I do know that there are people that do care.
248
00:24:44,000 --> 00:24:51,000
You do care for them. But equally, you know...
249
00:24:51,000 --> 00:24:56,000
There are quite a few people for whom it's just a hurdle.
250
00:24:56,000 --> 00:25:05,000
The quality, the ethics or the respect: it's just what they do because they want to get the research done, not because they actually care about the participants.
251
00:25:05,000 --> 00:25:10,000
And it sounds horrible, but I know this because I've had conversations with people about this.
252
00:25:10,000 --> 00:25:14,000
I do challenge them on it a little bit, but I don't go into it too much.
253
00:25:14,000 --> 00:25:18,000
But sometimes I do feel like I need to hold back. I try very hard to hold myself back.
254
00:25:18,000 --> 00:25:25,000
Perhaps before I say something I probably shouldn't say.
255
00:25:25,000 --> 00:25:35,000
I don't know, I have issues with it. And I guess it's finding the right environment in which to challenge that.
256
00:25:35,000 --> 00:25:41,000
Yeah, yeah. And the conference would be the perfect environment.
257
00:25:41,000 --> 00:25:46,000
Yeah, absolutely. So when is the conference planned to take place?
258
00:25:46,000 --> 00:25:51,000
Okay, yes. So the date is now the 26th of March. It'll be around that day.
259
00:25:51,000 --> 00:25:57,000
We're getting everything ready. We've got a team of people who are already on it.
260
00:25:57,000 --> 00:26:04,000
Which is brilliant. I am going to put out another call, after my first call out, for anyone who wants to join the team.
261
00:26:04,000 --> 00:26:11,000
And that is for everybody, that is staff and students, because we've got a mixture of staff and students on the team as well.
262
00:26:11,000 --> 00:26:21,000
Yeah. The 26th will be all day. But what I want to try and do is get some organisations in with some decent academics
263
00:26:21,000 --> 00:26:25,000
from various disciplines, or disciplines related to the organisation, before the
264
00:26:25,000 --> 00:26:30,000
conference to do some workshops, so we can try to promote and highlight the
265
00:26:30,000 --> 00:26:36,000
conference, and also to support organisations with whatever they might need support with.
266
00:26:36,000 --> 00:26:42,000
It could be writing something, or a poster, or it could be just a networking session.
267
00:26:42,000 --> 00:26:51,000
And then on the day we're going to have a mixture of papers. It can be organisations and local institutions from the southwest.
268
00:26:51,000 --> 00:26:58,000
But we'll still accept them from outside it, and academics and students as well.
269
00:26:58,000 --> 00:27:07,000
This is for everybody, just to open up conversation. And at the end, we'll have a drinks session; that'll be fun.
270
00:27:07,000 --> 00:27:13,000
And is there somewhere online at the moment where people can find out information or keep up to date with the conference?
271
00:27:13,000 --> 00:27:23,000
So that's forthcoming. We've got someone on the committee at the moment who is responsible for social media and the website.
272
00:27:23,000 --> 00:27:27,000
The website is a landing page, but we haven't published it yet.
273
00:27:27,000 --> 00:27:28,000
It should be published.
274
00:27:28,000 --> 00:27:38,000
I think in the next couple of weeks, and that'll be www dot research ethics conference dot UK, and then they'll be up there.
275
00:27:38,000 --> 00:27:42,000
And then we've got Twitter and Facebook pages as well.
276
00:27:42,000 --> 00:27:53,000
So, imagine someone listening to this who hasn't really thought about ethics much beyond the kind of procedural, institutional processes.
277
00:27:53,000 --> 00:27:59,000
What would you say to them? What questions do you want them to ask themselves, or to think about?
278
00:27:59,000 --> 00:28:05,000
I think we just need to think about taking it slow when doing ethics. It isn't a hurdle.
279
00:28:05,000 --> 00:28:12,000
And I think, in that, do remember that research is something which is going to affect your participants.
280
00:28:12,000 --> 00:28:18,000
It is going to affect them. They're involved in research that will touch them in some way; it might be good or bad.
281
00:28:18,000 --> 00:28:26,000
It could be anything. And I think it's just about making sure that you're mindful of the type of people
282
00:28:26,000 --> 00:28:33,000
or the type of person that's involved in your research and also speaking to them.
283
00:28:33,000 --> 00:28:40,000
You can't complete an ethics application without finding out if there are any ethical concerns for your participants.
284
00:28:40,000 --> 00:28:47,000
But you can't access your participants until you've got ethical clearance. So once you do get ethical clearance,
285
00:28:47,000 --> 00:28:52,000
then you should speak to your participants. You could do a pre-visit or pre-talk or something like that,
286
00:28:52,000 --> 00:28:58,000
anything just to build rapport. You should be doing that anyway, but take some time to really ask them:
287
00:28:58,000 --> 00:29:05,000
Is there anything that really worries you about this research? Is there anything I can do that might make you feel better?
288
00:29:05,000 --> 00:29:13,000
Is there any support I can give you? You know, that type of thing. And I think it's just: take your time.
289
00:29:13,000 --> 00:29:17,000
But every time you have an interaction with a participant, just reconsider.
290
00:29:17,000 --> 00:29:23,000
Is anything changing? If it is, change with it. And don't forget to write it down for your thesis.
291
00:29:23,000 --> 00:29:30,000
Thanks, Warren, for what must be one of the most illuminating discussions I've ever had about research ethics.
292
00:29:30,000 --> 00:29:38,000
I think the thing that's stuck with me the most is the fundamental idea that research isn't static.
293
00:29:38,000 --> 00:29:43,000
So why is our approach to research ethics static?
294
00:29:43,000 --> 00:30:15,636
And that's it for this episode. Don't forget to like, rate and subscribe, and join me next time, where I'll be talking to somebody else about researcher development and everything in between.