1
0:00:00 --> 0:00:02
Why is he a librarian?
2
0:00:02 --> 0:00:06
No, no, but he does a lot of work so he might as well be our librarian.
3
0:00:06 --> 0:00:08
We can cheer his genius.
4
0:00:08 --> 0:00:10
All right.
5
0:00:10 --> 0:00:[privacy contact redaction]ors for COVID Ethics International and today's discussion.
6
0:00:18 --> 0:00:[privacy contact redaction]ephen Frost over three years ago with a desire to pursue truth, ethics, justice, freedom and health.
7
0:00:28 --> 0:00:32
That rolls off my tongue now. So it's the truth, justice, ethics, freedom and health.
8
0:00:32 --> 0:00:[privacy contact redaction] government and power over the years and has been a whistleblower and activist.
9
0:00:37 --> 0:00:39
His medical specialty is radiology.
10
0:00:39 --> 0:00:47
And at this time, when the Scottish Parliament has passed a law on hate speech and questioning anything to do with transgenders,
11
0:00:47 --> 0:00:53
and then there's the issue of, say, well, why would it bother me in Australia?
12
0:00:54 --> 0:01:04
Because if anyone reads what I publish about my views about transgender nonsense, then under the Scottish law that's a crime, and I can be arrested in Scotland.
13
0:01:04 --> 0:01:[privacy contact redaction] in Australia, this is serious stuff, Karen, as you well know.
14
0:01:12 --> 0:01:[privacy contact redaction] government and power over the years and has been a whistleblower and activist.
15
0:01:16 --> 0:01:19
His medical specialty is radiology.
16
0:01:19 --> 0:01:21
I'm Charles Kovess, the moderator of this group.
17
0:01:21 --> 0:01:24
I'm Australasia's passion provocateur.
18
0:01:24 --> 0:01:[privacy contact redaction]e.
19
0:01:27 --> 0:01:[privacy contact redaction]iced law for 20 years before changing career 31 years ago.
20
0:01:31 --> 0:01:[privacy contact redaction] 13 years, I've helped parents and lawyers to strategize remedies for vaccine damage and damage from bad medical advice.
21
0:01:39 --> 0:01:42
I'm also the CEO of an industrial hemp company.
22
0:01:42 --> 0:01:46
We comprise lots of professions here and we're from all around the world.
23
0:01:47 --> 0:01:49
Many of us thought that vaccines were OK.
24
0:01:49 --> 0:01:54
Now, many of us proudly say, yes, we are passionate anti-vaxxers.
25
0:01:54 --> 0:02:[privacy contact redaction], just for the record, I am a passionate anti-vaxxer and I'm proud of that label.
26
0:02:01 --> 0:02:12
And as I say, and I say this to anyone watching this recording as well, I have yet to meet a parent of a child who refused all vaccines, who regrets that decision.
27
0:02:12 --> 0:02:13
I've yet to meet one.
28
0:02:13 --> 0:02:14
I'd love to meet one if there are any.
29
0:02:14 --> 0:02:17
Who says, I wish I had vaccinated my children.
30
0:02:19 --> 0:02:[privacy contact redaction] time here, welcome and feel free to introduce yourself in the chat and where you're from.
31
0:02:26 --> 0:02:[privacy contact redaction] or you have a radio or TV show or you've written a book, put the links into the chat so we can follow you, promote you and find you.
32
0:02:35 --> 0:02:[privacy contact redaction] a weekly radio program on TNT radio.
33
0:02:38 --> 0:02:45
I urge you all to watch TNT radio dot live because there is no censorship, zero censorship.
34
0:02:46 --> 0:02:[privacy contact redaction]and that in the middle of World War Three and that the medical science battle is only one of [privacy contact redaction] world war.
35
0:02:55 --> 0:02:56
And there's no time to be tired.
36
0:02:56 --> 0:03:00
We're four years into what I thought was going to be a six year war.
37
0:03:00 --> 0:03:01
I think it's going to be a seven year war.
38
0:03:01 --> 0:03:04
We've got at least three years of this to go.
39
0:03:04 --> 0:03:[privacy contact redaction]and the development of science and that the science is never settled.
40
0:03:11 --> 0:03:[privacy contact redaction]
41
0:03:13 --> 0:03:18
Some of us believe that viruses are a hoax and some of us sit on the fence.
42
0:03:18 --> 0:03:[privacy contact redaction]ings keep abusing anybody who dares believe that viruses exist.
43
0:03:24 --> 0:03:25
It's quite interesting.
44
0:03:26 --> 0:03:32
The meeting runs for two and a half hours, after which, for those with the time, Tom Rodman runs a video Telegram meeting.
45
0:03:32 --> 0:03:35
Tom puts the links into the chat if you're able to join.
46
0:03:36 --> 0:03:41
We'll listen to our guest presenter today, Dr. Scott McLaughlin, for as long as Scott wishes to speak.
47
0:03:41 --> 0:03:[privacy contact redaction] Q&A.
48
0:03:42 --> 0:03:[privacy contact redaction], by long established tradition, asks the first questions for 15 minutes.
49
0:03:47 --> 0:03:51
This is a free speech environment with appropriate moderating.
50
0:03:51 --> 0:03:58
Free speech is crucially important in our fight to preserve our human freedoms, as we're seeing in Scotland and the attack on free speech.
51
0:03:58 --> 0:04:[privacy contact redaction]s, including Australia.
52
0:04:01 --> 0:04:08
Now, the important thing is, free speech doesn't mean anything goes; that's what you need to understand about what appropriate moderating is.
53
0:04:08 --> 0:04:18
Free speech does not give you the right to make ad hominem attacks, any attacks on people, and our guests are not obliged to answer any of your questions.
54
0:04:18 --> 0:04:22
And if I, as moderator, consider your questions irrelevant, then I won't allow them.
55
0:04:23 --> 0:04:30
OK, this is, you know, if you want to come here and talk about stuff that's nothing to do with what this group is about, I'm not going to allow the questions.
56
0:04:30 --> 0:04:34
That's what the moderators do to keep this group on topic.
57
0:04:34 --> 0:04:38
If you want another group talking about other stuff, go and talk about that other stuff.
58
0:04:38 --> 0:04:40
Appropriate moderating.
59
0:04:40 --> 0:04:[privacy contact redaction], I've been moderating meetings for over 40 years and 31 years since I left my legal career.
60
0:04:46 --> 0:04:49
So if you want to debate on moderating, send me an email.
61
0:04:49 --> 0:04:51
I'll have a debate with you with pleasure.
62
0:04:52 --> 0:04:58
If you are offended by anything, be offended.
63
0:04:58 --> 0:05:[privacy contact redaction]ed.
64
0:05:01 --> 0:05:[privacy contact redaction]ry that requires nobody to say anything that may offend another.
65
0:05:08 --> 0:05:[privacy contact redaction]ive of love, not fear.
66
0:05:11 --> 0:05:13
Fear is the opposite of love.
67
0:05:13 --> 0:05:16
Fear squashes you; love, on the other hand, expands you.
68
0:05:16 --> 0:05:[privacy contact redaction] talk fest; an extraordinary range of actions and initiatives has been generated from linkages made by attendees in these meetings.
69
0:05:26 --> 0:05:28
I'll give you another example.
70
0:05:28 --> 0:05:36
One Karen from Scotland might make one comment on a particular supplement that might just be the thing that saves somebody's life.
71
0:05:36 --> 0:05:45
And to give a case in point, I had a basketball player visit me during the week who had a tachy heart problem, a racing heart problem.
72
0:05:45 --> 0:05:52
It's called tachy something, not tachycardia, but it was something, you know, and he went to the doctor and the doctor said, take this drug.
73
0:05:52 --> 0:06:00
And I said, why don't you go and see someone like Karen and find out what your body is missing rather than taking another bloody drug and putting it in there.
74
0:06:00 --> 0:06:10
So the insights from this meeting are for people with problems, so if you've got a problem, put that in the chat as well, because someone here might have the perfect solution for you.
75
0:06:10 --> 0:06:15
So as I say, this is not just a talk fest.
76
0:06:15 --> 0:06:[privacy contact redaction] or talk or links or resources that will help people, put the details into the chat. The meeting is recorded and is uploaded to the Rumble channel.
77
0:06:26 --> 0:06:31
Scott, please, when it goes up there, there are comments that are made.
78
0:06:31 --> 0:06:35
You can see those comments; we don't have the time to send them all to our presenters.
79
0:06:35 --> 0:06:[privacy contact redaction], Dr. Scott McLaughlin.
80
0:06:38 --> 0:06:41
And we thank you so much, Scott, for giving us your time and wisdom.
81
0:06:41 --> 0:06:49
And I want to tell you a little bit about Dr. Scott McLaughlin, whose greatest claim to fame is that he was born in Australia, where I was born.
82
0:06:49 --> 0:06:59
So there you are. He's got 11 degrees, I think Stephen Frost said, in information science, health and cyber law, data science, and a PhD in computer science.
83
0:06:59 --> 0:07:[privacy contact redaction]udied as a nurse.
84
0:07:02 --> 0:07:12
And he's been granted a prestigious Royal Academy of Engineering Fellowship, looking at the regulation and safety standards for using autonomous systems, focusing on aerial drones and motor vehicles.
85
0:07:12 --> 0:07:[privacy contact redaction]ion of automatic cars, he's got a fellowship in that. His current work focuses on developing clinical decision support tools using large publicly available aggregate and anonymized data sets as the initial formative data source.
86
0:07:29 --> 0:07:[privacy contact redaction]ed to review the myriad of generally statistical approaches and models developed for singular maternity related clinical health issues.
87
0:07:38 --> 0:07:43
Okay, this whole question, we've had conversations here about evidence-based medicine.
88
0:07:43 --> 0:07:47
Okay, and the whole issue of protocols.
89
0:07:47 --> 0:07:[privacy contact redaction] of us here, Scott, for your information, are against protocols and EBM. So I'd be interested to have this conversation.
90
0:07:54 --> 0:08:03
And then during Covid, Scott also worked with a team that included people such as Professor Norman Fenton, Martin Neil, and Dr. Jonathan Engler, who's spoken to us.
91
0:08:03 --> 0:08:09
Claire Craig, who's spoken to us, Dr. Jessica Rose, who's spoken to us, and others.
92
0:08:09 --> 0:08:[privacy contact redaction] person to highlight the safety signals as they started to develop in the US. These data sets were heavily cited by people like Dr. Peter McCullough, and it's work that Dr. Jessica Rose went on to spearhead with great effect.
93
0:08:24 --> 0:08:40
Scott pointed out the lies, deceptions, poor quality statistics and assumptions that were used as the basis of publicly lauded, fear-mongering Covid predictive models like those of Neil Ferguson and David Fisman.
94
0:08:40 --> 0:08:58
And with the wider team, Scott exposed ongoing issues with the ONS Covid statistics that were probably instrumental in misleading the wider public, and which led to the Office for Statistics Regulation agreeing with professors Fenton and Neil, Claire Craig and Scott that the
95
0:08:58 --> 0:09:12
ONS data was unfit for use in making claims about vaccine safety and efficacy. He's got a mile-long set of qualifications and lots of publications, as you'll see in the show notes.
96
0:09:12 --> 0:09:22
There'll be more detail there. And so thank you, Scott, for being here, and thank you, Stephen Frost, again for creating this group and for organizing Scott to present to us today.
97
0:09:22 --> 0:09:34
Scott, you have the ability to share your screen if you wish. We are in your hands. You are in charge. We'll listen to you for as long as you wish to speak. And then we'll have Q&A.
98
0:09:34 --> 0:09:42
Lovely. Well, thank you for that introduction. I feel a bit, you know, overwhelmed.
99
0:09:42 --> 0:09:52
Yeah, so what I thought I'd like to do, after talking with Stephen, is maybe let's have this as a bit of a discussion. I'll give you a little bit of my background and history.
100
0:09:52 --> 0:10:10
You know, a few anecdotal stories and things along the way, and talk about some of the things that I do. So some of you may have read a Substack called Law, Health and Technology; that happens to be me in a sort of, you know, pseudo-anonymous guise.
101
0:10:10 --> 0:10:[privacy contact redaction]arted with looking at some of the Covid statistics and looking at things like the various Scottish and UK maternity statistics for Covid, and it's morphed into — I've done a whole lot of work on things like, you'll be familiar with, the issues around the Lucy Letby trial.
102
0:10:30 --> 0:10:46
And so, you know, I've had long discussions with various academics around the world and done a lot of analysis on some of the things like the clinical notes that came out of the Lucy Letby trial.
103
0:10:46 --> 0:11:00
On top of that, I've also done a lot of work looking at other topics that I understand this group is interested in. So the medical ethics of consent and, you know, where we're slipping up with that.
104
0:11:00 --> 0:11:[privacy contact redaction] at the moment and I'm embedded in a whole heap of reading right now to try and understand.
105
0:11:08 --> 0:11:25
Some of you will be familiar with a concept called Gillick competence, which, if you're doing anything with the Covid vaccine, especially in places like Canada and Australia over the last couple of years, or if you're looking at any of the trans treatments.
106
0:11:26 --> 0:11:49
One of the things that I look at is, I've looked at a whole heap of the court cases and watched as judges have slowly just started to do that whole Pontius Pilate hand-washing, you know, thing to avoid the trans activists or avoid the vaccine activists and step aside from making the decision by creating various situations.
107
0:11:50 --> 0:12:03
So we'll come to talk about all of these things. But getting back to the most important thing of all that Charles mentioned, the fact that I hail from where Charles is from: I hail from Australia.
108
0:12:04 --> 0:12:25
Originally I was born in possibly one of the tiniest hospitals in a backwater of New South Wales, a little tiny wee town called Inverell, where my mother trained, strangely enough, as a nurse about 12 or 15 years before I was born.
109
0:12:25 --> 0:12:37
And, you know, I did most of my growing up in that sort of natural way that kids in the country do, you know, you go play in the river, you go climb a tree, you run around all day.
110
0:12:39 --> 0:12:48
And, you know, spent all my time outdoors, which is why I look at some of this stuff, you know, the headlines the last couple of days about how one day in the sun's going to kill you.
111
0:12:48 --> 0:12:51
I got sunburned a million times as a kid.
112
0:12:51 --> 0:12:53
I'm still here.
113
0:12:53 --> 0:12:58
Funny that. But, you know, I grew up in the country.
114
0:12:58 --> 0:13:[privacy contact redaction] of the kids that I went to high school with, you know, every so often I go back, and a couple of years ago I went back to Tamworth and spent a week in Tamworth and then went up to Inverell.
115
0:13:11 --> 0:13:20
And it's interesting that most of the kids I went to school with are still in those two small towns.
116
0:13:20 --> 0:13:27
You know, I realize now they consider Tamworth to be a city, but it was a small town when we moved there when I was sort of four or five.
117
0:13:27 --> 0:13:[privacy contact redaction]ill do the things, you know, they do whatever their parents did.
118
0:13:33 --> 0:13:40
They work in a bakery or they work on a farm or, you know, it's very rare to find.
119
0:13:40 --> 0:13:46
And I think I've only found two of my classmates from high school who've left that small town.
120
0:13:46 --> 0:13:51
And, you know, one of them I ran into quite accidentally over on the coast.
121
0:13:51 --> 0:13:55
I went over to visit Coffs Harbour and ran into this guy who I went to school with.
122
0:13:55 --> 0:13:59
And, you know, he's like, oh, you know, I recognize you.
123
0:13:59 --> 0:14:03
I've seen your name on some stuff. You know, I didn't realize it was you.
124
0:14:03 --> 0:14:05
I know you. We went to school together.
125
0:14:05 --> 0:14:09
And it's like, you know, he's like, I've spent [privacy contact redaction] selling used cars.
126
0:14:09 --> 0:14:12
And that's all I do. And I've never done anything.
127
0:14:12 --> 0:14:14
I've never left the country.
128
0:14:14 --> 0:14:26
I think it's a sad thing about a lot of people from, you know, my generation and my parents' generation, that they didn't travel around so much.
129
0:14:26 --> 0:14:34
I ended up doing a, of all things, a chef's apprenticeship because I got bored at school.
130
0:14:34 --> 0:14:[privacy contact redaction] awful. I did not like it.
131
0:14:38 --> 0:14:47
I am probably the only PhD you will sit with who didn't actually graduate high school.
132
0:14:47 --> 0:14:52
And, you know, I see some of you actually look up with your eyes open. And that's an absolute fact.
133
0:14:52 --> 0:15:00
I didn't even complete here. They call it the O levels. I think the GCSE, the year [privacy contact redaction]ralia, it was just your school certificate.
134
0:15:00 --> 0:15:07
Didn't even complete that. I left and went and did a chef's apprenticeship and worked in a whole heap of restaurants around Sydney.
135
0:15:07 --> 0:15:11
And, you know, French, Italian did a bit of everything.
136
0:15:11 --> 0:15:[privacy contact redaction], any of you that are in that neck of the woods, there used to be a restaurant chain [privacy contact redaction]ump,
137
0:15:21 --> 0:15:[privacy contact redaction]eak, seafood, sizzlery type things that you went into.
138
0:15:27 --> 0:15:[privacy contact redaction] year of my apprenticeship as the head apprentice in one of those, you know, totally, totally gauntless 15 year old kid.
139
0:15:36 --> 0:15:43
And ended up at the end of that, realizing that I was absolutely 100% totally bored.
140
0:15:43 --> 0:15:52
You know, once I knew how to make a bavarois or I knew how to make a Sachertorte or I knew how to make croissants or whatever.
141
0:15:52 --> 0:15:58
Once I knew how to make it, it was like, well, that's boring now. What's the next thing I can learn? And so off I went.
142
0:15:58 --> 0:16:04
I ended up, I went and sat the — it used to be, in Australia,
143
0:16:04 --> 0:16:11
I don't know if it's still the same, but you used to be able to sit an alternate version of what were the HSC exams.
144
0:16:11 --> 0:16:15
That was to get a tertiary entrance rank.
145
0:16:15 --> 0:16:24
I went and sat this tertiary entrance rank thing, not knowing what it was going to be about, not knowing, you know, not having done year 11 or 12.
146
0:16:24 --> 0:16:31
And got the letter in the mail that, you know, here you've scored 93.7 or something.
147
0:16:31 --> 0:16:37
And, you know, I go looking at what grades you need to get into particular courses.
148
0:16:37 --> 0:16:42
And, you know, the things I wanted to do, aeronautical engineering required just a little bit more.
149
0:16:42 --> 0:16:[privacy contact redaction] a little bit more.
150
0:16:46 --> 0:16:55
And, you know, so I looked at all of those things and it was like, no, I'm going to get rejected for all of these things and ended up at nursing school.
151
0:16:55 --> 0:17:01
And so, you know, I studied nursing and, you know, I did enjoy nursing.
152
0:17:01 --> 0:17:14
But again, it was another situation where I could always tell that, you know, quite often the nurse tutor that we'd have on the ward would end up coming along.
153
0:17:14 --> 0:17:18
There'd be usually three nursing students assigned to a ward.
154
0:17:18 --> 0:17:28
And so there'd be three of us, and the nursing tutor would come along and, you know, she kept an eye on us, but she was fairly disinterested.
155
0:17:28 --> 0:17:32
And so what she'd often do is she'd go, well, I know Scott will have read it.
156
0:17:32 --> 0:17:34
So, Scott, you can demonstrate this.
157
0:17:34 --> 0:17:38
And then she'd go and sit and have a coffee somewhere else.
158
0:17:38 --> 0:17:44
And so, you know, it ended up being that I was teaching the two other students who were in my cohort.
159
0:17:44 --> 0:17:47
And it's like, well, I'm just as green as you are.
160
0:17:47 --> 0:18:03
But, you know, I got through most of that, and the brick wall that really sort of hit me was that, you know, it was still very much, even in nursing, even if you did psychiatric or forensic nursing,
161
0:18:03 --> 0:18:[privacy contact redaction]ill very much hit that brick wall where you'd find you'd end up in front of this old matron, this older woman who'd been the matron of a hospital somewhere, whose attitude was completely: you're an ambulance officer or a ward orderly, get out of my sight.
162
0:18:19 --> 0:18:[privacy contact redaction] I ran up against that woman as I started third year.
163
0:18:27 --> 0:18:[privacy contact redaction]ing thing is that the professor I work for at King's College at the moment was doing her PhD with that woman at the same time that I was there.
164
0:18:41 --> 0:18:43
And it's like we never met each other.
165
0:18:43 --> 0:18:45
We never crossed paths.
166
0:18:45 --> 0:18:49
We were both aware of each other's name, but we never actually met.
167
0:18:49 --> 0:18:56
And it turned out, you know, we would have passed each other in the hall because she was doing her PhD with this particular professor.
168
0:18:56 --> 0:19:04
So, you know, odd, odd quirks of fate that 30 something years later, I end up working for that person.
169
0:19:04 --> 0:19:09
But in the end, I sort of got shoved out of nursing.
170
0:19:09 --> 0:19:19
And one of the final things that I watched as I left, and it's probably the reason I got so interested in the Lucy Letby trial, is I watched a trial.
171
0:19:19 --> 0:19:[privacy contact redaction]hand watched the trial of a nurse who got accused of one or two heinous crimes, but started out with, you know, the typical police prosecution thing.
172
0:19:31 --> 0:19:[privacy contact redaction] ever followed a criminal trial, the police love to go in, especially in Australia, you know, and I noticed that it's becoming more prevalent everywhere where they will go in and they'll charge you with 10 or 15 or 20 different things.
173
0:19:48 --> 0:19:55
And there's actually only two or three offenses. What they do is they reword the offense or they pick a slightly different section of the same act.
174
0:19:55 --> 0:20:11
And so, you know, it was a case of sitting in this trial and it was very curious to me to watch as, you know, the prosecutor would get up and lead evidence from their witness about what this nurse was supposed to have done.
175
0:20:11 --> 0:20:20
And then watching the judge as the judge goes, well, actually, that doesn't fit the tone of what's written in section 13A.
176
0:20:20 --> 0:20:30
It also doesn't fit 13B. And then the little prosecutor would stand up and she'd go, oh, well, your Honour, the prosecution withdraws charges 17 and 21 from the docket.
177
0:20:30 --> 0:20:32
And then she'd sit down again.
178
0:20:32 --> 0:20:45
And, you know, so I watched this trial, with these sort of repetitive charges that, to me at the time — you know, as this person who hadn't studied law at that point — all looked to be the same thing, and I couldn't work out why.
179
0:20:45 --> 0:21:01
And to watch this trial and see this nurse gradually losing their, you know, their case, even though, of the 23 or 24 charges, most of them had been thrown out by the start of the second day, they'd been dismissed.
180
0:21:01 --> 0:21:24
And, you know, that poor nurse was found guilty, was, you know, censured, given a huge financial penalty and put on a quite extensive, I think it was like five or six year good behavior bond, which was, you know, patently ridiculous for the day.
181
0:21:24 --> 0:21:[privacy contact redaction] that then later on, when I started studying health law, much, much later, I did my GDL, which is sort of the equivalent of doing your LLB.
182
0:21:38 --> 0:21:43
It's a practical qualification for registration.
183
0:21:43 --> 0:21:47
I did that back in about 2014 or 2015.
184
0:21:47 --> 0:22:09
And one of the cases that I studied during my health law course was this same case, and finding out that two years after this nurse had been, you know, quite badly remonstrated with and, you know, very punitively punished,
185
0:22:09 --> 0:22:18
there'd been a civil trial. And it looked to me like the civil trial had been the whole reason for the charges in the first place.
186
0:22:18 --> 0:22:32
In the civil trial, the daughter of the patient, it was an elderly patient, the daughter of the patient was, you know, seeking a phenomenal amount of money from this nurse's employer.
187
0:22:32 --> 0:22:[privacy contact redaction]oyer, the employer is there. The employer was a very well known nursing agency at the time in Australia.
188
0:22:39 --> 0:22:41
They used to advertise on television.
189
0:22:41 --> 0:22:[privacy contact redaction]oyer is there and their insurer is there and they're all having these discussions.
190
0:22:46 --> 0:23:01
And it gets to a point where the judge is asking questions from the bench of the person who, I suppose you'd call it, the human resources person, who interviewed the nurse and then, you know,
191
0:23:01 --> 0:23:15
assigned them jobs. And one of the things that this nurse had been accused of was of claiming to be registered or claiming to be on the register when they weren't registered.
192
0:23:15 --> 0:23:[privacy contact redaction]udent in the cohort that I was in.
193
0:23:20 --> 0:23:34
And so, you know, the judge is talking to this human resources woman who'd interviewed the person and said something to the effect of, well, you know, what did they tell you when you interviewed them?
194
0:23:34 --> 0:23:43
What did they tell you? And she said, well, I quite clearly remember, they told me that they were a student, and they told me where they were a student and which lecturers they knew.
195
0:23:43 --> 0:23:[privacy contact redaction]ures that I knew because I trained there at the same time.
196
0:23:48 --> 0:23:52
And, you know, we talked about all sorts of things like that.
197
0:23:52 --> 0:23:57
And, you know, I remember, you know, the fact that it was written what classes they were in.
198
0:23:57 --> 0:24:12
It was written on the back of the, like, form that they had to fill out to apply to be on our register — they were on our register as, like, a ward orderly or what in Australia was called a PCA, a patient care attendant.
199
0:24:12 --> 0:24:20
And the judge is looking from them to the rich woman, the daughter of the patient, who was running the lawsuit.
200
0:24:20 --> 0:24:29
It's like, but you told me he was prosecuted for being a, you know, for claiming to have been a registered nurse.
201
0:24:29 --> 0:24:40
And, you know, you can read it in the court report how the whole court just obviously pauses — the judge notes it in his open court report.
202
0:24:40 --> 0:24:54
And the judge in his obiter talks about the fact that he then got the police detective, he subpoenaed the police detective himself, put the police detective on the stand and said, well, you prosecuted for this.
203
0:24:54 --> 0:25:[privacy contact redaction]oyment form that you got from the employment agency from that woman over there?
204
0:25:01 --> 0:25:09
Where is the form to say what this person had claimed was their qualifications, etc.
205
0:25:09 --> 0:25:[privacy contact redaction]arted with the agency and the judge talks about the fact that the police officer was, you know, very deliberately obfuscative and tried to deviate the conversation.
206
0:25:23 --> 0:25:25
That is the term that the judge uses.
207
0:25:25 --> 0:25:41
And it turns out that because that piece of evidence didn't fit with their story, and this is where I got interested a lot in these sort of miscarriages of justice because the evidence didn't fit the story that this detective wanted to make.
208
0:25:41 --> 0:25:[privacy contact redaction] it in a box that went off to a police storage locker.
209
0:25:47 --> 0:25:52
So the judge then says to this police detective, well, you're going to have to go and get that.
210
0:25:52 --> 0:25:56
I want that on my desk by 5 p.m. today.
211
0:25:56 --> 0:26:[privacy contact redaction]ive made all sorts of excuses from the stand.
212
0:26:00 --> 0:26:[privacy contact redaction] that he made excuses and pretended and blah, blah, blah.
213
0:26:04 --> 0:26:[privacy contact redaction]ive down to the cells.
214
0:26:08 --> 0:26:[privacy contact redaction]s in Melbourne that still had cells underneath the courthouse.
215
0:26:13 --> 0:26:27
He put him in the cells and said, right, you can make phone calls only so long as those phone calls are overheard by my clerk and only so long as those phone calls lead to me getting that piece of paper by 5 p.m.
216
0:26:27 --> 0:26:[privacy contact redaction]ory was that the piece of paper was apparently delivered to the judge and the judge and he had a law student articling with him at the time.
217
0:26:37 --> 0:26:[privacy contact redaction]udying law, you know, going further with my law studies.
218
0:26:42 --> 0:26:54
He'd had this law student articling with him at the time who then made it her mission for the next couple of years to backtrack through all of the documentation.
219
0:26:54 --> 0:27:07
And it took six years for them to reverse the original prosecution that had happened to this nursing student back before the civil trial.
220
0:27:07 --> 0:27:10
And so, you know, I followed up.
221
0:27:10 --> 0:27:[privacy contact redaction]ralia and followed up with, you know, what actually happened and saw the I can I can attest to the impact that that sort of police malfeasance, if you like, the effect that it had on that person's life and the fact that it put that person's life back probably 20 years.
222
0:27:36 --> 0:27:39
Yeah. You know, so you can really imagine.
223
0:27:39 --> 0:27:48
And so, you know, when you look at stories — you know, I've talked to Richard Gill about the work that he did on the Lucia de Berk case.
224
0:27:48 --> 0:27:57
When you look at, you know, even if just one of these nurses gets accused of something.
225
0:27:58 --> 0:28:19
So I don't know if many of you know, but not long after Lucy Letby was found guilty, there was a bit of a knee-jerk within the NHS, and they started going around all the different neonatal units and looking at the staff and looking at, you know, the incidents that were happening.
226
0:28:19 --> 0:28:21
And they pinged a nurse.
227
0:28:21 --> 0:28:26
There was a headline that sort of very briefly blipped in a couple of the UK papers.
228
0:28:26 --> 0:28:33
They pinged a nurse, a neonatal nurse at one of the hospitals in Birmingham.
229
0:28:33 --> 0:28:45
Now, it turns out, again, same as with Lucy Letby, a doctor who was present at both of the deaths claimed that it must have been her fault because it couldn't have been his fault.
230
0:28:45 --> 0:28:48
And that was literally the type of wording that was used.
231
0:28:48 --> 0:29:06
And so that nurse, who happened to be present in the neonatal unit at the time of two deaths, was stood down by the hospital, had her registration taken off her by the NMC, and was arrested by the police, all within 48 hours.
232
0:29:07 --> 0:29:11
Now, it doesn't matter if she's never charged.
233
0:29:11 --> 0:29:20
It does not matter even if she is charged; if she goes to court and gets acquitted, she will never work as a nurse again.
234
0:29:20 --> 0:29:23
Yeah, that literally is the end of her career.
235
0:29:23 --> 0:29:[privacy contact redaction] in case.
236
0:29:27 --> 0:29:[privacy contact redaction]ication that she might have.
237
0:29:32 --> 0:29:38
And so that's the sort of thing that got me very much interested in that side of the law.
238
0:29:38 --> 0:29:48
And so, you know, I've ended up with three law degrees because I kept looking at stuff like that and going, OK, I want to research that a bit more.
239
0:29:48 --> 0:29:58
So, yeah, I did my nursing training, but I left at the end of my nursing training in a state of what you'd call "all but".
240
0:29:58 --> 0:30:05
But, you know, I think for the degree at the time I needed 240 credit points, and I had 210.
241
0:30:05 --> 0:30:07
I needed to do one course.
242
0:30:07 --> 0:30:25
But this professor, who was my year advisor, was so down on the idea of males being in her profession that, I mean, I think my cohort started with nine males.
243
0:30:25 --> 0:30:34
There were only two males left who graduated, and those two males that graduated had been state enrolled nurses for years before; they were psych nurses already.
244
0:30:34 --> 0:30:37
And all they were doing was upskilling to keep working in psych.
245
0:30:37 --> 0:30:39
And she didn't have a problem with that.
246
0:30:39 --> 0:30:[privacy contact redaction]s and spent about 16 years working in I.T.
247
0:30:44 --> 0:30:48
I got qualifications in information science and I.T.
248
0:30:49 --> 0:30:56
And, you know, most of what I ended up sort of getting dragged into in the last five or six years of my I.T.
249
0:30:56 --> 0:31:[privacy contact redaction]s.
250
0:31:00 --> 0:31:[privacy contact redaction] two years working for the New Zealand Ministry of Health on a big health project there.
251
0:31:07 --> 0:31:[privacy contact redaction] that did it did a combination of three or four things sort of all in the same project.
252
0:31:14 --> 0:31:18
So we were doing a provider regulation and monitoring system.
253
0:31:18 --> 0:31:[privacy contact redaction]er GP clinics, register hospitals, operating theaters, any sort of medical.
254
0:31:27 --> 0:31:40
If you had a radiology clinic, you know, that you were running in a strip mall down the street, then your premises would get inspected and all of your information would end up in this provider regulation and monitoring system.
255
0:31:40 --> 0:31:[privacy contact redaction] to go through which which doctors worked there and which nurses worked there.
256
0:31:46 --> 0:31:54
And then, you know, we'd link through a whole heap of things via what was called an enterprise solution bus.
257
0:31:54 --> 0:32:06
So we'd link through all of the databases so that, you know, if somebody wanted to look at, OK, this particular medical clinic: who's there, what's their history, what's the history and qualifications of all their staff, et cetera.
258
0:32:06 --> 0:32:12
And then we'd go through all of the databases and pull together a report just on that facility at the same time.
259
0:32:12 --> 0:32:[privacy contact redaction] who were working a solution architect, we were working together at the Ministry of Health.
260
0:32:18 --> 0:32:27
We developed the process that got rolled out in New Zealand whereby everybody has a standardized national health identity number.
261
0:32:27 --> 0:32:[privacy contact redaction] an NHI number, and we developed algorithms for cross-checking; the idea was to try and make sure that, you know,
262
0:32:35 --> 0:32:46
when a health identity number was created, say, at birth in a in a maternity unit, it was automatically validated as being a valid ID.
263
0:32:46 --> 0:32:[privacy contact redaction]em. It could be enlivened and linked to health data.
264
0:32:52 --> 0:32:58
And so, you know, as your health record was built up, it was all linked to this national health identity.
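The at-creation validation described above is typically done with a check digit. As a minimal sketch, here is a generic modulus-11 check-digit validator in Python; the three-letters-plus-four-digits format, the letter values and the weights are illustrative assumptions, not the actual NHI algorithm.

```python
# Illustrative check-digit validation for a health-identity-style ID.
# This is a generic modulus-11 scheme; the format (3 letters + 3 digits
# + 1 check digit) and the weights are assumptions for the sketch only.
import string

# Map A-Z to 1-26 for weighting the letter positions.
LETTER_VALUES = {ch: i + 1 for i, ch in enumerate(string.ascii_uppercase)}
WEIGHTS = [7, 6, 5, 4, 3, 2]  # one weight per character before the check digit


def make_check_digit(body: str) -> int:
    """Compute a mod-11 check digit for a 6-char body like 'ABC123'."""
    total = 0
    for weight, ch in zip(WEIGHTS, body):
        value = LETTER_VALUES[ch] if ch.isalpha() else int(ch)
        total += weight * value
    return (11 - total % 11) % 11  # 0..10; a result of 10 would be rejected


def is_valid(identifier: str) -> bool:
    """Validate a 7-char identifier whose last char is the check digit."""
    if len(identifier) != 7 or not identifier[-1].isdigit():
        return False
    return int(identifier[-1]) == make_check_digit(identifier[:6])
```

A maternity-unit system creating a new ID would generate the body, append the computed digit, and any downstream system could reject a mistyped or fabricated number before linking health data to it.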
265
0:32:59 --> 0:33:12
Now, at the time, with all of this sort of, you know, linking together of electronic health records and stuff, I was sitting on a completely different side of the fence to where I am right now.
266
0:33:12 --> 0:33:25
At that point, I could see that there might be some benefit to this, you know, the idea of the whole break-glass access, being able to see a person's test results or medical history if they turn up unconscious at an accident and emergency.
267
0:33:25 --> 0:33:34
That sort of thing, you know, at the time as a solution architect and as somebody who'd done nursing training, it looked like a good idea.
268
0:33:34 --> 0:33:41
What I can tell you now is the research and work that I've done over the last 10 years, most of it here in the UK.
269
0:33:42 --> 0:33:[privacy contact redaction] of what's happening with health records around the world and most of the digitization of health care is anything but good.
270
0:33:53 --> 0:33:57
It's anything but useful for patients.
271
0:33:57 --> 0:34:02
And really, it's now at the point where it's being used against us.
272
0:34:02 --> 0:34:07
I noticed somebody before had Orwell as their photo.
273
0:34:07 --> 0:34:11
I've written a couple of papers talking about Orwell's predictions.
274
0:34:11 --> 0:34:[privacy contact redaction]s is one of those things that I think is very much, you know, Orwell predicted it.
275
0:34:17 --> 0:34:20
And it's being used against us.
276
0:34:20 --> 0:34:32
You know, at worst, you've got things like My Health Record in Australia, which the government sold and sold and sold again, and spent tens and hundreds of millions of dollars on.
277
0:34:32 --> 0:34:38
And from a clinical standpoint, it's barely functional, barely useful.
278
0:34:38 --> 0:34:[privacy contact redaction]ralia that I've seen so far that even remotely got anything right was the approach that New South Wales Health have taken, whereby all of the hospitals now use the same health record interface, so that at least the skills that a doctor or a nurse learns, say, at Westmead carry all the way out to Tamworth-based hospitals.
279
0:35:02 --> 0:35:07
And I think that's a very good example of how you can log into a computer and everything looks the same.
280
0:35:07 --> 0:35:15
You know, I think there's safety benefits in that, especially if you're doing things in theater or intensive care or accident emergency.
281
0:35:15 --> 0:35:28
But what you've got here, for example, in the UK, which is very much what you see in the US and Canada and so on, is this system whereby each health provider in the UK, you call them trusts.
282
0:35:29 --> 0:35:33
Basically is a rogue operation doing its own thing.
283
0:35:33 --> 0:35:[privacy contact redaction]urer at King's, one of the parts of my remit at King's is supposedly to teach nurses and midwives how to use this, you know, patient records type technology.
284
0:35:46 --> 0:35:51
And so I said, OK, well, you know, let's go out and have a look at a couple of the hospitals.
285
0:35:51 --> 0:35:55
So I've had a look at, you know, the Cerner systems, for example, at Barts.
286
0:35:55 --> 0:36:01
I've had a look at the various EMIS and Epic projects that are running around London.
287
0:36:01 --> 0:36:10
And I went back and I said to the professor that employed me, I said, well, the problem you've got is the way that the UK are doing health records.
288
0:36:10 --> 0:36:12
Every hospital is totally different.
289
0:36:12 --> 0:36:[privacy contact redaction] within the hospital looks totally different.
290
0:36:16 --> 0:36:18
You've got no consistency.
291
0:36:18 --> 0:36:34
Anything I teach one of these nurses or midwives, you know, for this hospital, even if this hospital and that hospital both have BadgerNet, for example, for maternity, what I teach them in using BadgerNet at this hospital is going to be barely useful for them when they go to the next hospital.
292
0:36:34 --> 0:36:37
Because it's just it's not the same.
293
0:36:37 --> 0:36:[privacy contact redaction]s.
294
0:36:39 --> 0:36:[privacy contact redaction]s.
295
0:36:41 --> 0:37:[privacy contact redaction]akes that we see across the NHS is the fact that even within the same NHS trust, you can have three or four different versions of that same program, you know, whether it be Cerner or Epic or EMIS or whatever; you can have three or four versions of it running across the different campuses in that hospital.
296
0:37:00 --> 0:37:08
And then when they bring the data to someone like me and say, OK, we've got, you know, 130 patients from this campus and we've got 200 patients from that campus.
297
0:37:08 --> 0:37:10
Can you merge them together?
298
0:37:10 --> 0:37:13
And it ends up taking months.
299
0:37:13 --> 0:37:20
And the reason it takes months is because they've been deployed by different people from the vendor over time.
300
0:37:20 --> 0:37:[privacy contact redaction]ent.
301
0:37:23 --> 0:37:27
The information the nurses and midwives and doctors are recording, and where they record it, is different.
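To make concrete why merging extracts from two deployments of the "same" product takes months, here is a minimal sketch of the schema reconciliation involved. All field names and code mappings are hypothetical; real reconciliation also has to cover units, terminology code systems and free-text fields.

```python
# Minimal sketch of reconciling two deployments of the "same" EHR product
# before merging patient extracts. Field names and codes are invented.

# Per-campus mapping from local column names to a shared schema.
CAMPUS_A_MAP = {"pat_id": "patient_id", "dob": "birth_date", "sex": "sex"}
CAMPUS_B_MAP = {"PatientID": "patient_id", "DateOfBirth": "birth_date",
                "Gender": "sex"}

# Value-level harmonisation: the two deployments code sex differently.
SEX_CODES = {"M": "male", "F": "female", "1": "male", "2": "female"}


def normalise(record: dict, column_map: dict) -> dict:
    """Rename columns to the shared schema and harmonise coded values."""
    out = {column_map[k]: v for k, v in record.items() if k in column_map}
    if "sex" in out:
        out["sex"] = SEX_CODES.get(str(out["sex"]), "unknown")
    return out


def merge_extracts(extract_a, extract_b):
    """Merge two per-campus extracts into one normalised list of rows."""
    rows = [normalise(r, CAMPUS_A_MAP) for r in extract_a]
    rows += [normalise(r, CAMPUS_B_MAP) for r in extract_b]
    return rows
```

The slow part in practice is discovering these mappings in the first place, because each campus's deployment was configured by different vendor staff at different times.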
302
0:37:27 --> 0:37:35
And then, you know, that's leaving aside all of the problems which you have with dealing with the natural language text that's in a health record.
303
0:37:35 --> 0:37:47
And so the supposed benefits and I did a whole study on this, the supposed benefits that we keep being told that health records will bring, you know, and the fact that my PhD is in learning health systems.
304
0:37:47 --> 0:37:[privacy contact redaction]ems is meant to be that we get all of these data sets of health records together.
305
0:37:53 --> 0:37:[privacy contact redaction]ug them into an AI.
306
0:37:55 --> 0:38:12
We then sit there and we enter the symptoms off of the next patient who walks in the door and the AI goes, well, you know, filters through and finds the 50 patients that match the symptomology and the gender and the age and everything else about this patient.
307
0:38:12 --> 0:38:22
The idea is supposed to be that then you can get a more accurate prediction of, OK, what disease do they have and which treatment should we prescribe for them?
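The retrieval step described here, finding the past patients most like the one in front of you and then looking at their outcomes, can be sketched very simply. The scoring rule, the fields and the data below are all invented for illustration; real systems use far richer similarity measures.

```python
# Toy version of the "learning health system" lookup described above:
# given a new patient, find the most similar past records and summarise
# their outcomes. All patient data and fields are invented.

def similarity(patient: dict, record: dict) -> float:
    """Crude match score: shared symptoms, same sex, close age."""
    shared = len(set(patient["symptoms"]) & set(record["symptoms"]))
    same_sex = 1 if patient["sex"] == record["sex"] else 0
    age_close = 1 if abs(patient["age"] - record["age"]) <= 10 else 0
    return shared + same_sex + age_close


def most_similar(patient, records, k=50):
    """Return the k past records that best match the new patient."""
    ranked = sorted(records, key=lambda r: similarity(patient, r), reverse=True)
    return ranked[:k]


def outcome_counts(matches):
    """Tally the diagnoses among the matched records."""
    counts = {}
    for r in matches:
        counts[r["diagnosis"]] = counts.get(r["diagnosis"], 0) + 1
    return counts
```

The clinical idea is that the tally over the matched cohort, not a single opaque score, is what informs the diagnosis and treatment decision.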
308
0:38:22 --> 0:38:27
The problem is that that's not how any of this data is being used.
309
0:38:27 --> 0:38:44
Every year in the UK, 74 million pounds or more is spent licensing and deploying new health records projects at more hospitals, or updating the previous health records project.
310
0:38:44 --> 0:38:52
That money is often dead money because, like I say, you know, the minute you've trained the nurses at that hospital, the minute they leave to go to the next hospital, it's all useless.
311
0:38:52 --> 0:38:54
It's all lost. They've got to relearn it.
312
0:38:54 --> 0:39:01
But even worse, do you know how they pay for that [privacy contact redaction]us pound?
313
0:39:01 --> 0:39:[privacy contact redaction]s are being sucked up like a vacuum cleaner.
314
0:39:05 --> 0:39:15
NHS Digital sits in the middle like, you'd almost call it, a data gatherer, hoovering up these health records.
315
0:39:15 --> 0:39:[privacy contact redaction] in Australia.
316
0:39:18 --> 0:39:27
Then what they're doing is they're repackaging it and giving it to people like Ben Goldacre at Oxford or to John Knowles.
317
0:39:27 --> 0:39:[privacy contact redaction] UK.
318
0:39:32 --> 0:39:34
They're selling that data.
319
0:39:34 --> 0:39:41
They're making half a billion pounds a year selling your health record data to all and sundry.
320
0:39:41 --> 0:39:55
In the end, I was even able to demonstrate a court case where health record data for a person had been anonymized, supposedly anonymized, and sold to a debt collection agency.
321
0:39:55 --> 0:40:[privacy contact redaction]ion agency had sifted through that data, because one thing they knew about the particular person they wanted to track down and serve was that they had a health condition that meant that every six or eight weeks they were going.
322
0:40:09 --> 0:40:17
They had a chronic condition. They were going to the doctor every six or eight weeks because that was the reason the person had had their payments put on hold.
323
0:40:18 --> 0:40:29
So they went through this health data looking for people about the right age in about the right location who had this medical condition.
324
0:40:29 --> 0:40:[privacy contact redaction]or's clinic.
325
0:40:32 --> 0:40:[privacy contact redaction]or's clinic for three days, waiting to see when this patient would come in because they'd identified the doctor's clinic and they identified the cycle of appointments.
326
0:40:44 --> 0:40:50
He sat there until that person walked in, walked up to reception and said, I'm so-and-so, I've got an appointment at 11 a.m.
327
0:40:50 --> 0:40:56
And he walked up to the person while they were still at the reception counter and served them with papers.
328
0:40:56 --> 0:40:[privacy contact redaction]
329
0:40:59 --> 0:41:12
And I went and sat in the two-and-a-half to three-hour trial to watch and see how that sort of came about, because I'd been told by one of the health law professors at work that it would be an interesting one.
330
0:41:12 --> 0:41:14
They said, oh, it'll be an interesting thing to go and watch.
331
0:41:14 --> 0:41:17
Go and watch it. And that was that was what came about.
332
0:41:17 --> 0:41:[privacy contact redaction]s are being anonymized.
333
0:41:20 --> 0:41:26
I did a whole heap of work. There's a chap you'll see in the list of people here, Dr. Kuda Dube.
334
0:41:26 --> 0:41:34
He and I did a whole heap of work and did some research looking at various projects like Latanya Sweeney's project in the U.S.
335
0:41:34 --> 0:41:[privacy contact redaction]rated how easy it is to re-identify that anonymized data, even with all these weird and wonderful rules that the NHS and other health organizations apply, and your HIPAA laws and so on in America.
336
0:41:49 --> 0:41:59
Forty to [privacy contact redaction]ion of anonymized health records can eventually be re-identified back to the source patient.
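The re-identification Sweeney demonstrated rests on joining "anonymized" records to a public dataset on quasi-identifiers such as postcode, birth date and sex. A toy sketch, with entirely invented data:

```python
# Toy illustration of quasi-identifier re-identification: records with the
# name removed can still be linked back to a person via attributes they
# share with a public roster (electoral roll, etc.). All data is invented.

def reidentify(anonymized_rows, public_roster):
    """Return anonymized rows that match exactly one roster entry."""
    hits = []
    for row in anonymized_rows:
        key = (row["postcode"], row["birth_date"], row["sex"])
        matches = [p for p in public_roster
                   if (p["postcode"], p["birth_date"], p["sex"]) == key]
        if len(matches) == 1:  # a unique match pins down the identity
            hits.append({**row, "name": matches[0]["name"]})
    return hits
```

The point is that no single field is identifying, but the combination often is: the smaller the population sharing a (postcode, birth date, sex) triple, the more records resolve to a unique person.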
337
0:42:00 --> 0:42:09
So that was a whole area of research I did that showed me that health records, the electronic health record systems aren't helping the patient.
338
0:42:09 --> 0:42:14
They're helping the government to learn more about us, to find us when they need to find us.
339
0:42:14 --> 0:42:19
And they're helping them to make money because they're selling it to Google and Microsoft and so on.
340
0:42:20 --> 0:42:34
Now, one of the side projects that I did while I was working for Professor Fenton was a side project with the Birmingham Law School at University of Birmingham on a project that was half funded by Microsoft called Engine B.
341
0:42:34 --> 0:42:43
Now, Engine B, by all accounts, was intended to look like an AI finance program.
342
0:42:44 --> 0:43:08
What they wanted to do was build these knowledge graphs, dynamic knowledge graphs of things like audit data so that if you were a lawyer or an accountant who was sent in to audit a company, the system Microsoft were developing would hoover up all of the data for that company, all of the invoices, all of the email, all of whatever was in the account.
343
0:43:08 --> 0:43:16
It would hoover it all up and then it would highlight particular transactions or interactions that potentially needed to be reviewed.
344
0:43:18 --> 0:43:34
They pulled me in on the process because they were having trouble getting from, they wanted to get from things like documents that had complex content or documents that were contracts or documents that had legal meaning.
345
0:43:34 --> 0:43:[privacy contact redaction] to get from that to some sort of structure that the Microsoft's AI developers could then emulate in their AI.
346
0:43:46 --> 0:44:02
And so one of the things I did, and I've got a couple of papers I published showing this process: sitting down with lawyers, I could develop a process map of, you know, here's how the lawyer, for example, does a property law transaction in the UK, following all of the processes.
347
0:44:04 --> 0:44:11
Here's where you can automate the ingestion of, say, the title data from a land title database.
348
0:44:12 --> 0:44:26
Here's how you can automate the ingestion of, say, the council report, what New Zealand would call a LIM report, with, you know, every time the council have approved you to do renovations, like this room added on the side of your house.
349
0:44:26 --> 0:44:41
So I went through that whole process and showed them, you know, they were also trying to ingest the contracts behind commercial property and showed them, you know, here's how you can tell when, you know, they've changed the contract.
350
0:44:41 --> 0:44:[privacy contact redaction], here's where you can tell how to flag, you know.
351
0:44:47 --> 0:44:54
And then I realized I was teaching them how to do something again that was actually potentially damaging to the rest of us.
352
0:44:54 --> 0:45:10
Because a lot of that technology now that appeared to be for, you know, ostensibly good purpose, Microsoft admitted when we sat down with Microsoft at the end of my time on that project,
353
0:45:10 --> 0:45:[privacy contact redaction]ually what they're doing with this knowledge graphing system they've built in Azure is they are absorbing as much data as they can so that they can build a digital map of the whole world.
354
0:45:25 --> 0:45:37
We were shown a video of one of Microsoft's top architects talking about the fact that that's the goal for each of these little projects.
355
0:45:37 --> 0:45:42
So Microsoft at the same time were funding projects with the Turing Foundation for digital ID.
356
0:45:42 --> 0:45:46
You know, everybody knows that Bill Gates is funding all of these vaccines.
357
0:45:46 --> 0:45:49
Bill Gates also funds a whole heap of health records projects.
358
0:45:49 --> 0:46:04
But this chap from Microsoft acknowledged the fact that Microsoft's goal was to build a knowledge graph of the entire world, so that they could point at one of your pictures or pull your name from a call like this.
359
0:46:04 --> 0:46:22
And then see all of the things you've interacted with, all of the things you touch. Part of the process of how they're doing that is they're getting all of your employers and your educational institutions, so all of the universities, to move from having on premises things like on premises,
360
0:46:22 --> 0:46:[privacy contact redaction]ems and on premises SharePoint document systems. No, no, no, no, no, you don't want that. That's too hard for you.
361
0:46:29 --> 0:46:[privacy contact redaction]uff. They're silly and expensive. You don't want geeks with pocket protectors. Put it all in our cloud.
362
0:46:36 --> 0:46:47
And the entire purpose of Microsoft's cloud is to own all your data. And the whole time what you're doing is you are paying them. And I saw evidence of this firsthand.
363
0:46:47 --> 0:46:52
You are paying them literally to data mine you as a subject.
364
0:46:53 --> 0:47:00
Now, you know, coming forward to my PhD and the learning health systems work.
365
0:47:00 --> 0:47:07
You know, we see all of this stuff about how AI and machine learning are helping medicine.
366
0:47:07 --> 0:47:[privacy contact redaction]ly how many AI and machine learning tools are actually in day to day clinical use in the NHS.
367
0:47:17 --> 0:47:21
We've literally just hit two.
368
0:47:21 --> 0:47:[privacy contact redaction]em for breast cancers, which is still only about 60% sensitive, but it's about 85% specific for identifying and labeling a tumor for somebody to have a look at.
369
0:47:39 --> 0:47:44
The other one is the deep mind solution that's running at Moorfields Hospital.
370
0:47:44 --> 0:47:58
Now, as somebody who has eye problems myself, I spend quite a bit of time at Moorfields, and watching as the doctors use it, it's a system where all it does is look at the retinal image; they stick your head in a box.
371
0:47:58 --> 0:48:01
They take a retinal image of the back of your eye.
372
0:48:01 --> 0:48:08
And all it does is identify three or four particular aberrations that might affect your eyesight.
373
0:48:08 --> 0:48:[privacy contact redaction]ion on, you know, you've got the you've possibly got these two things.
374
0:48:15 --> 0:48:20
83% of people with those two things are potentially blind in 18 months.
375
0:48:20 --> 0:48:22
That's sort of a prediction.
376
0:48:22 --> 0:48:[privacy contact redaction]ain to the eye doctors at Moorfields exactly how to read and understand what those predictions mean.
377
0:48:30 --> 0:48:48
So I was sitting in a room one day at Moorfields, having just had a doctor poking around at my eyes, and I listened to the doctor in the next bay telling this lady that, you know, this thing has had a look at your images.
378
0:48:48 --> 0:48:53
It says there's an 83% chance you're going to go blind.
379
0:48:53 --> 0:48:56
And of course, this poor woman was absolutely distraught.
380
0:48:56 --> 0:48:59
She was beside herself.
381
0:48:59 --> 0:49:[privacy contact redaction]ed in going to find out exactly how this deep mind solution works.
382
0:49:04 --> 0:49:13
And of course, if you've read the newspapers in the UK, DeepMind actually stole about 140,000 health records and gave them to Google.
383
0:49:13 --> 0:49:[privacy contact redaction]ole them out of Hammersmith hospitals data set and gave them to Google.
384
0:49:17 --> 0:49:26
And, you know, it supposedly got reprimanded for it, and it continued to get paid eight million pounds a year for holding these records that they stole.
385
0:49:26 --> 0:49:33
But leaving that aside, looking at what the solution was, it turns out that the DeepMind solution at Moorfields is machine learning.
386
0:49:33 --> 0:49:35
It's not AI.
387
0:49:35 --> 0:49:42
And it's not very sensitive or specific for one of the conditions that it identifies.
388
0:49:42 --> 0:49:50
So Professor Fenton and I created a Bayesian model to demonstrate just how bad that Moorfields solution is.
389
0:49:50 --> 0:49:55
It turns out that this poor woman didn't have an 83% chance of going blind.
390
0:49:55 --> 0:49:57
She had a 9% chance of going blind.
391
0:49:57 --> 0:50:03
And that was only if it was verified she had both of the symptoms that the system identified.
392
0:50:03 --> 0:50:[privacy contact redaction]raught and probably potentially suicidal at the thought, because 83% to most people means something's definitely going to happen.
393
0:50:17 --> 0:50:21
And yet in truth, it was 9%.
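As a hedged reconstruction of that Bayesian correction: if the headline 83% is the risk given the findings are truly present, then the risk given only that the tool flagged them has to be discounted by the tool's own error rates and the findings' base rate. The numbers below are hypothetical, chosen only to show the shape of the argument, how an 83% figure can shrink to roughly 9%; they are not Fenton's actual model.

```python
# Hypothetical Bayesian correction of a headline risk figure.
# All numbers are illustrative, not the Moorfields tool's real parameters.
base_rate = 0.017          # P(the two retinal findings are truly present)
sensitivity = 0.70         # P(tool flags the findings | truly present)
specificity = 0.90         # P(tool stays silent | truly absent)
p_blind_if_present = 0.83  # the figure as quoted to the patient

# P(tool flags), by the law of total probability.
p_flag = sensitivity * base_rate + (1 - specificity) * (1 - base_rate)

# Bayes' theorem: P(findings truly present | tool flagged).
p_present_given_flag = sensitivity * base_rate / p_flag

# The risk that should have been communicated to the patient.
p_blind_given_flag = p_present_given_flag * p_blind_if_present
print(round(p_blind_given_flag, 2))  # prints 0.09, not 0.83
```

The collapse comes almost entirely from the false-positive term: when the findings are rare, even a 10% false-positive rate means most flagged patients never truly had them.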
394
0:50:21 --> 0:50:28
And the whole time, of course, none of that processing is done at Moorfields.
395
0:50:28 --> 0:50:32
It's all sent to Google in San Francisco, processed and sent back.
396
0:50:32 --> 0:50:[privacy contact redaction]antly, you know, yeah, they stole some medical records, but constantly we're actually giving them our medical records just without even realizing.
397
0:50:42 --> 0:50:[privacy contact redaction] of you who would go into a hospital like Moorfields would not even realize that, you know, you go and have the photo of your eyes done or you have your scan and all of that information automatically flitters off on the Internet.
398
0:50:57 --> 0:51:06
So my take on a whole lot of things is that we're very much going downhill.
399
0:51:06 --> 0:51:14
It's very much a situation where we've got nurses in jail for things that make no sense whatsoever.
400
0:51:14 --> 0:51:[privacy contact redaction]ions about that, I'm happy to.
401
0:51:18 --> 0:51:37
We've got a censorship state that hits people like myself when I continue to speak out about some of these things, and I'm anal retentive about having the evidence to back these things up.
402
0:51:37 --> 0:51:42
So all of the things I've talked to you about, I've written academic papers with all of the references.
403
0:51:42 --> 0:51:48
All of those things, you know, I got censored for two and a half years, couldn't publish a paper.
404
0:51:48 --> 0:52:[privacy contact redaction] four jobs, four academic jobs in a row because literally it was like the I'd write a paper with Professor Fenton or with, you know, Jonathan Engler, whose face I can see here in this meeting.
405
0:52:02 --> 0:52:04
I'd write a paper with this group.
406
0:52:04 --> 0:52:10
And then the university would interview me: oh, you're racist and misogynist.
407
0:52:10 --> 0:52:18
And it's like, well, how did you get from I wrote a comment or something in a paper about COVID to misogyny and racism?
408
0:52:18 --> 0:52:[privacy contact redaction]ain how they got there.
409
0:52:21 --> 0:52:23
But by golly, they got there.
410
0:52:23 --> 0:52:30
And so, you know, four times in a row, I got constructively dismissed from academic posts.
411
0:52:30 --> 0:52:34
So I noticed there's various people commenting on the on the side there.
412
0:52:34 --> 0:52:36
So I think I'll open it up at this point.
413
0:52:36 --> 0:52:41
And, you know, if people want to ask questions, then feel free.
414
0:52:41 --> 0:52:45
OK, Scott, great job. Great job.
415
0:52:45 --> 0:52:[privacy contact redaction]ory.
416
0:52:46 --> 0:52:[privacy contact redaction]ephen's gathering his wits for the first series of questions, Scott, my question to you is what on earth gave you the self-belief to go on your journey in the future?
417
0:52:59 --> 0:53:07
Go on your journey and get out of Inverell, where I've been on numerous occasions.
418
0:53:07 --> 0:53:[privacy contact redaction]e.
419
0:53:08 --> 0:53:12
You're only one of two who left Inverell.
420
0:53:12 --> 0:53:20
What's your sense about your thinking process?
421
0:53:20 --> 0:53:22
It's, you know, small towns.
422
0:53:22 --> 0:53:28
So, you know, I did part of my growing up in Inverell and part of my growing up a couple of hours down the road in Tamworth.
423
0:53:28 --> 0:53:41
And towns become very insular and very close-minded; claustrophobic is probably a good description.
424
0:53:41 --> 0:53:46
And there's there's very little opportunity in those towns.
425
0:53:46 --> 0:53:[privacy contact redaction]ies are getting worse.
426
0:53:51 --> 0:53:53
They're getting smaller now.
427
0:53:53 --> 0:53:56
Even as the towns get bigger, they're getting smaller.
428
0:53:56 --> 0:54:15
You know, I mean, I've still got a brother who lives in a small town over there, you know, with a couple of daughters, and looking at the fact that, you know, he's having to move to somewhere like Gosford, which, you know, if you know anything about Sydney now, Sydney's got the most ridiculous house prices anywhere in Australia.
429
0:54:15 --> 0:54:23
And he's having to move up to Gosford so that he can be near his eldest daughter who's going to go to university.
430
0:54:23 --> 0:54:27
And he wants her to go to university because there are just no opportunities.
431
0:54:27 --> 0:54:35
And the problem we've got, of course, is even when you've got the university degree now, you often don't have an opportunity in the country.
432
0:54:35 --> 0:54:56
So you've only got to look at the fact that all the farmers at the moment are at Westminster to know that, you know, even even the people who are doing the job in the country who are, you know, working hard to feed the rest of us, even they are struggling to be able to do what they do in the country.
433
0:54:56 --> 0:55:02
So, but you didn't worry about not finishing school, and yet you got a PhD.
434
0:55:02 --> 0:55:08
It's this self-belief that came from somewhere that others don't appear to have.
435
0:55:08 --> 0:55:23
So it's a great credit to you. And it always intrigues me, you know, as to where... I think I'd actually credit it to when my parents divorced and my father remarried.
436
0:55:23 --> 0:55:32
And I think, for my brother and me, I'd credit it to my stepmother, who was a very cruel and sort of bitter woman.
437
0:55:32 --> 0:55:37
You know, she'd not had a good life herself, you know, and be that as it may.
438
0:55:37 --> 0:55:[privacy contact redaction]s told us that neither of us would amount to anything because we weren't her children.
439
0:55:42 --> 0:55:46
We wouldn't amount to anything.
440
0:55:46 --> 0:55:48
So you wanted to prove her wrong, you mean?
441
0:55:48 --> 0:55:49
Absolutely.
442
0:55:49 --> 0:55:50
That's a bloody lovely.
443
0:55:50 --> 0:55:52
That's a lovely example. That's a beautiful.
444
0:55:52 --> 0:55:54
That's right. For some people, that message would destroy them.
445
0:55:54 --> 0:55:56
And for you, it builds you. So thank you.
446
0:55:56 --> 0:55:59
That's a lovely insight.
447
0:55:59 --> 0:56:07
And, you know, that's part of each one of us, our own self-awareness journey of, hey, what have we learned and why do we behave as we do?
448
0:56:07 --> 0:56:10
All right, Stephen, next 15 minutes is all yours. Thank you, Scott.
449
0:56:10 --> 0:56:12
Brilliant.
450
0:56:12 --> 0:56:16
So, Scott, well, thanks very much for coming on at short notice.
451
0:56:16 --> 0:56:18
That's really kind of you.
452
0:56:18 --> 0:56:[privacy contact redaction] four jobs.
453
0:56:22 --> 0:56:24
But the good news is you've got four new ones.
454
0:56:24 --> 0:56:[privacy contact redaction]?
455
0:56:26 --> 0:56:32
Well, it's that, yeah, I kept bouncing from job to job as they'd take a job off me.
456
0:56:32 --> 0:56:35
I'd go get another one and they take that one off me for the same reason.
457
0:56:35 --> 0:56:40
But, yes, I sort of ended up settling now with King's.
458
0:56:40 --> 0:56:44
And it's been quite good in the nursing school there.
459
0:56:44 --> 0:56:[privacy contact redaction]e who don't know who watching the video afterwards as well, but on the call now, Kings College Hospital.
460
0:56:53 --> 0:56:57
You're at the hospital? Or no, King's College London, at the university down the road.
461
0:56:57 --> 0:57:03
Yeah. So Kings College London is one of the top colleges of London University.
462
0:57:03 --> 0:57:06
And London University obviously is up there amongst the best in the world.
463
0:57:07 --> 0:57:[privacy contact redaction], so Kings College has a really good reputation.
464
0:57:10 --> 0:57:[privacy contact redaction], my eldest son was born at Kings College Hospital to.
465
0:57:17 --> 0:57:19
Yeah. So that's a long story.
466
0:57:19 --> 0:57:21
I didn't trust the doctors in Sweden.
467
0:57:21 --> 0:57:[privacy contact redaction] son in the UK and I chose the doctor who delivered my first son, our first son, so that of three sons.
468
0:57:34 --> 0:57:[privacy contact redaction], but what I was trying to say, Scott, was that you'd lost four jobs and presumably the intention was to not only cancel you, but probably to destroy you because or at least to make life difficult for you.
469
0:57:48 --> 0:57:[privacy contact redaction]ually you got to four new ones very easily.
470
0:57:51 --> 0:57:53
So that says something about your abilities.
471
0:57:55 --> 0:57:57
I think so. Thank you. Yes.
472
0:57:57 --> 0:58:00
Yeah, I think it's a case of.
473
0:58:00 --> 0:58:19
You know, what I've seen is often you've got people, whether it be within an organization, your company, or whether it be an academic institute, you've got people who agree with the stance that, you know, the covid situation was wrong.
474
0:58:19 --> 0:58:[privacy contact redaction]ics were wrong.
475
0:58:21 --> 0:58:33
You know, the PCR tests were totally flawed, you know, and so when they see someone like myself or the people that I work with speaking out, you know, they generally will agree with us.
476
0:58:33 --> 0:58:42
But at the same time, because they're protecting their own job and they're too afraid or too nervous, I suppose, to speak out themselves.
477
0:58:42 --> 0:58:53
So they give me the job, and then it would be somebody else within the department who, you know, wanted you to toe the narrative, who would then be the person who would come down on me.
478
0:58:53 --> 0:59:02
So the reason I'm interested in that is because I recognize in you a very similar way of thinking and way of looking at the world.
479
0:59:02 --> 0:59:07
And I don't think you could tell a lie if you tried; you just call things out as you see them.
480
0:59:08 --> 0:59:[privacy contact redaction]ually you're saved by the fact that people like you.
481
0:59:12 --> 0:59:15
And I think so.
482
0:59:15 --> 0:59:17
And it's the same with me.
483
0:59:17 --> 0:59:21
Amazingly, some people like me, and those people have always saved me.
484
0:59:21 --> 0:59:28
I didn't realize I was a serial whistleblower, but I think there are quite a few of us and we don't even know what whistleblowing is.
485
0:59:28 --> 0:59:30
And I know now, of course.
486
0:59:30 --> 0:59:38
So, Moorfields Eye Hospital is one of the finest eye hospitals in the world.
487
0:59:38 --> 0:59:[privacy contact redaction]opped at Moorfields.
488
0:59:41 --> 0:59:56
Well, I think it's a case of they bought in this technology and the problem that tends to happen with organizations like Google or DeepMind or, you know, you remember there was Babylon Health a while back.
489
0:59:56 --> 1:00:02
You get organizations like that and they've got this evangelist who sits at the top, for whom the technology is wonderful.
490
1:00:02 --> 1:00:05
The AI it's going to diagnose. It's going to do this. It's going to do that.
491
1:00:05 --> 1:00:12
And, you know, I myself develop; I sit over here.
492
1:00:12 --> 1:00:23
I've got a whole setup over here where I do a whole heap of Bayesian statistics and modeling that, you know, people like Professor Fenton and Professor Neil have been trying desperately to teach me for seven years.
493
1:00:24 --> 1:00:38
You know, I do a lot of that sort of work and I struggle to see how most of the AI and machine learning articles that I read in journals could ever actually help anyone or could ever find their way into clinical practice.
494
1:00:38 --> 1:00:[privacy contact redaction] of it's complete nutter nonsense.
495
1:00:40 --> 1:00:50
Yeah. So I don't know anything about artificial intelligence, but I instinctively know what medicine is about, the practice of medicine.
496
1:00:51 --> 1:00:57
I know that artificial intelligence, no computer can do what a good doctor does.
497
1:00:57 --> 1:01:05
They weigh up all the evidence, take a very nuanced view, whereas computers are completely un-nuanced as far as I can see.
498
1:01:05 --> 1:01:09
And also totally black and white. Absolutely.
499
1:01:09 --> 1:01:21
It's all black and white. And I wonder, is that the reason that they're trying to push the AI? But it sounds like they're not actually succeeding, or they're not interested in the AI.
500
1:01:21 --> 1:01:34
I wonder whether the apparent preoccupation with AI is just a cover for them wanting to get everyone's data on behalf of governments around the world to control us.
501
1:01:34 --> 1:01:50
I've honestly thought that. That is a thought in my mind, and it is sort of, you know, evidenced by the fact that they make more in the NHS from selling the data than they spend on trying to develop something themselves with it.
502
1:01:50 --> 1:02:00
So, so to me, you know, the practice of medicine is so nuanced and it's all about treating or helping the patient in front of you.
503
1:02:00 --> 1:02:18
There's no way that an artificial intelligence without empathy can ever satisfy a patient, you know, and so it seems to me that people who don't understand the practice of medicine... that is, if they are actually doing it for the reasons they say they're doing it.
504
1:02:18 --> 1:02:[privacy contact redaction] thought about it while you were talking that there's no chance that anything, any computer, can satisfy a patient's needs, because, especially when it comes to their health, they're so nuanced and so individual.
505
1:02:37 --> 1:02:47
And if the patient doesn't think that the doctor, or in this case the machine, the AI, is listening to him or her, they just reject everything that is said.
506
1:02:47 --> 1:02:59
And they go and find someone else who they think does understand them. And there's no way a machine can understand a human mind, certainly not mine and certainly not yours and definitely not Charles's.
507
1:02:59 --> 1:03:07
I think a lot of it is, what's that saying that the lawyers and sailors have? You know, it's money for old rope.
508
1:03:07 --> 1:03:12
That, I think, is the term that I've heard. It's grafting.
509
1:03:12 --> 1:03:[privacy contact redaction]and up. I normally work at a standing desk. I don't often sit.
510
1:03:19 --> 1:03:29
You look at something like that Babylon AI solution that was around for a while. Now, they managed to absorb tens of millions.
511
1:03:29 --> 1:03:36
I'm certain that the last figure that I saw was something in the avenue of about [privacy contact redaction]ralian friends.
512
1:03:36 --> 1:03:39
That's well over [privacy contact redaction]ralian.
513
1:03:39 --> 1:03:45
They managed to absorb that money and convince the NHS to do something about it.
514
1:03:45 --> 1:03:52
They managed to absorb that money and convince the NHS to let them set up a medical clinic in London.
515
1:03:52 --> 1:03:57
That medical clinic managed, as far as I know, to get about...
516
1:03:57 --> 1:04:09
It got sort of a six figure number of patients who left their GP practice and signed up to have Babylon Health be their GP practice.
517
1:04:09 --> 1:04:30
The whole Babylon Health thing was a conglomeration of machine learning tools, a little bit of Bayesian, a little bit of something else, and a little bit of, you know, Markov algorithms and things, all sort of running in a computer system, with the idea being that it would ask you what were supposed to be natural language questions.
518
1:04:31 --> 1:04:48
And if any of you watched, I think Hannah Fry did a TV documentary in about [privacy contact redaction]ood in a cafe or something with the Babylon app, and she's tapping some, you know, she's reading what it says and then she's tapping in her answers.
519
1:04:48 --> 1:05:[privacy contact redaction]ration of the fact that the Babylon AI didn't know its ass from its elbow, because she was putting in symptoms of one thing and it was returning symptoms for a completely different condition, or it was returning diagnoses for a completely different condition.
520
1:05:06 --> 1:05:19
When, you know, it started to heat up and Babylon were not looking so crash hot in the mainstream, you know, what did they do?
521
1:05:19 --> 1:05:22
They all of a sudden closed up shop and disappeared.
522
1:05:22 --> 1:05:27
And now, you know, their claim is that, oh, the UK market's too small.
523
1:05:27 --> 1:05:29
So we've gone overseas.
524
1:05:29 --> 1:05:37
Well, you know, now they're doing the same graft, the exact same graft, with the exact same AI, in places like Canada and America.
525
1:05:38 --> 1:05:40
You know, have they changed anything?
526
1:05:40 --> 1:05:41
No, they're doing the exact same thing.
527
1:05:41 --> 1:05:48
They're selling it to the government, getting the government to pay them to be a health provider and running the same algorithms.
528
1:05:50 --> 1:05:51
Yeah.
529
1:05:51 --> 1:05:57
And the ridiculous thing about it is that AI can come up with an answer, which is unbelievably stupid.
530
1:05:57 --> 1:06:04
And any human being would know that it's stupid, but the artificial intelligence doesn't know.
531
1:06:04 --> 1:06:06
Is that right?
532
1:06:06 --> 1:06:10
Yeah. Well, another example would be Hammersmith Hospital.
533
1:06:10 --> 1:06:17
There's a Google around, there's a technology or a sort of a clinical decision tool that I think was called Isabelle.
534
1:06:17 --> 1:06:18
Isabelle.
535
1:06:18 --> 1:06:19
Isabelle.
536
1:06:19 --> 1:06:22
Hammersmith Hospital for a little while ran a trial of it.
537
1:06:22 --> 1:06:25
And there's a couple of academic papers they wrote.
538
1:06:25 --> 1:06:34
In the academic papers, they interview a range of junior doctors, you know, house officers and so on, all the way up to, you know, top consultants.
539
1:06:35 --> 1:06:42
One of the things that came out is the fact that the doctors were presented with, here's this tool that sits in...
540
1:06:42 --> 1:06:47
You've got your patient window open that you normally would create your health record in.
541
1:06:47 --> 1:06:53
Isabelle would put like a column, much like the chat panel on the side of Zoom here.
542
1:06:53 --> 1:06:55
It would put a thing down the side.
543
1:06:55 --> 1:07:03
And so as you typed in information, so the patient's got, you know, this sore here, and they've had a blood test for this and something else, and they've got this other symptom.
544
1:07:03 --> 1:07:10
As you're putting it in, what Isabelle was doing was data mining the words that you typed and then popping up a thing on the side.
545
1:07:10 --> 1:07:13
And it would pop up a number of different diagnoses.
546
1:07:13 --> 1:07:23
When it reached a threshold of, you know, it might be 80% or 85% likelihood for a particular diagnosis, that diagnosis would change color.
547
1:07:23 --> 1:07:26
What was happening, the junior doctors admitted it.
548
1:07:26 --> 1:07:[privacy contact redaction] with all of that medical training you're doing, how to make clinical decisions and how to, you know, how to lay your hands on the patient and listen to the patient and work out what the signs and symptoms are.
549
1:07:38 --> 1:07:40
They threw all of that out the door.
550
1:07:40 --> 1:07:49
What they were doing was gaming it until Isabelle would highlight a particular thing, and they'd treat you for that thing.
551
1:07:49 --> 1:07:50
It didn't matter.
552
1:07:50 --> 1:07:[privacy contact redaction]or who goes along with that nonsense doesn't deserve to be a doctor or shouldn't be a doctor.
553
1:07:56 --> 1:07:[privacy contact redaction]ly.
554
1:07:57 --> 1:08:[privacy contact redaction]or who believes in systems like that.
555
1:08:01 --> 1:08:05
But unfortunately, I was unaware.
556
1:08:05 --> 1:08:07
I did know that there was a big problem.
557
1:08:07 --> 1:08:09
I kept speaking out about it as well.
558
1:08:09 --> 1:08:27
So the whole thing with revalidation and appraisals, I saw that as a means of controlling particularly inexperienced doctors and forcing them into a narrative, into a way of looking at medicine, which I didn't even recognize.
559
1:08:27 --> 1:08:30
But I didn't realize how right I was.
560
1:08:30 --> 1:08:36
So one of my appraisers said to me, well, one thing he said was, you speak like a consultant.
561
1:08:36 --> 1:08:39
Well, I said, I am a consultant.
562
1:08:39 --> 1:08:43
I was in Sweden in radiology or the equivalent.
563
1:08:43 --> 1:08:45
You know, they don't have consultants in Sweden.
564
1:08:45 --> 1:08:47
So överläkare in Sweden.
565
1:08:47 --> 1:09:02
But the other thing he said, when he kind of gave up on me, because he'd had me for several years before anyway, was saying, how do you play the game?
566
1:09:02 --> 1:09:03
And I said, which game?
567
1:09:03 --> 1:09:05
And he said, just play the game.
568
1:09:05 --> 1:09:07
Make things easy for yourself.
569
1:09:07 --> 1:09:11
I said, sorry, I didn't realize that the practice of medicine was a game.
570
1:09:11 --> 1:09:13
No, I can't play it.
571
1:09:13 --> 1:09:16
No, I've never been able to play the game, as you put it.
572
1:09:16 --> 1:09:18
And then he wouldn't do it.
573
1:09:18 --> 1:09:20
He wouldn't appraise me the next year.
574
1:09:20 --> 1:09:22
It doesn't surprise me.
575
1:09:22 --> 1:09:23
It doesn't surprise me.
576
1:09:23 --> 1:09:40
You know, when I was working for Professor Fenton, I was motivated by the fact that at the time, when I started my PhD, Professor Fenton was my supervisor.
577
1:09:40 --> 1:09:[privacy contact redaction] as I could get the PhD done because my intention had been that once I couldn't get into medical school because I didn't have high school, you know, what the Americans call your high school diploma.
578
1:09:57 --> 1:09:58
I didn't have that.
579
1:09:58 --> 1:10:08
The other thing was that, technically, again, I'm one of the rare PhDs you'll ever find who, you know, never did a bachelor's degree.
580
1:10:08 --> 1:10:09
Right.
581
1:10:09 --> 1:10:10
So, you know, here I am.
582
1:10:10 --> 1:10:11
I don't have a bachelor's degree.
583
1:10:11 --> 1:10:17
I don't have the high school thing, but I've got all of the other degrees that I did since then.
584
1:10:17 --> 1:10:24
And so one of the... sorry. Including a PhD, including a PhD. Credit to you.
585
1:10:24 --> 1:10:26
That's amazing.
586
1:10:26 --> 1:10:[privacy contact redaction]e don't know I completed that PhD in two years and one month from start to finish.
587
1:10:32 --> 1:10:35
And that was while working a full time job.
588
1:10:35 --> 1:10:37
Excellent.
589
1:10:37 --> 1:10:43
Yeah, you know, it normally takes four years to do a PhD, as far as I understand.
590
1:10:43 --> 1:10:49
I suppose at minimum three years, but for most people it takes four years, would you say?
591
1:10:49 --> 1:10:56
Yeah. So if you're doing your PhD full time, then yes, it's usually classed as a four year program.
592
1:10:56 --> 1:10:58
If you're doing it part time, it's usually six years.
593
1:10:58 --> 1:11:00
So Scott, you did yours in what? Two?
594
1:11:00 --> 1:11:[privacy contact redaction]art to finish.
595
1:11:03 --> 1:11:12
Professor Fenton had to go through a whole heap of hoops and fill out a whole heap of forms to get the research degrees office to even accept my thesis.
596
1:11:12 --> 1:11:15
Yeah. So why do you think so?
597
1:11:15 --> 1:11:21
Were you told to be in a rush or did you just choose to be in a rush?
598
1:11:21 --> 1:11:25
Well, it was a case of I wanted to get that out of the road because I was working.
599
1:11:25 --> 1:11:30
I was working, I was working as part of the project with Professor Fenton.
600
1:11:30 --> 1:11:39
I was working with one of the emeritus professors from the medical school who had been the head of the medical school for years and years.
601
1:11:39 --> 1:11:50
And so the thing that I wanted to do, you know, still thinking that, oh my God... That was Norman Fenton, was it?
602
1:11:50 --> 1:11:53
No, one of the people Norman had me working with.
603
1:11:53 --> 1:11:57
Oh, yes. You mentioned his name earlier. Yes. OK. Maybe you don't want to name him.
604
1:11:57 --> 1:12:06
Yeah. So Norman had me working with some midwives and with this endocrinologist who'd been the head of the medical school.
605
1:12:06 --> 1:12:16
And so one of the things that I was hoping to do, and that person had sort of offered me a whole heap of help and suggestions to do it, was I was going to apply then to go into medical school.
606
1:12:16 --> 1:12:35
So, Scott, the fact that you were working with the previous head of a medical school, and that was on the recommendation of Fenton, that tells me that you're an extraordinary student and that somebody saw something in you to push you to work with that professor who was the head of the medical school.
607
1:12:35 --> 1:12:[privacy contact redaction], well done to you. Thank you.
608
1:12:38 --> 1:12:42
Thank you. I wanted to ask you about Lucy Letby.
609
1:12:42 --> 1:12:51
So it's a really important story for all of us, because I think there's this possibility and even a probability from looking at the story.
610
1:12:51 --> 1:13:[privacy contact redaction] proved it beyond reasonable doubt or on the balance of probabilities, or I haven't got the science to support it, but my instincts tell me that there's a possibility and a probability even that that was a miscarriage of justice.
611
1:13:05 --> 1:13:[privacy contact redaction]ice because she was said to have killed eight babies and the
612
1:13:10 --> 1:13:15
trial only ended quite recently and she was sentenced about a year ago, was it from memory?
613
1:13:16 --> 1:13:17
Yeah, so...
614
1:13:17 --> 1:13:23
She was demonised by the judge and more or less the whole of the UK, and the world was told that she
615
1:13:23 --> 1:13:[privacy contact redaction] evil woman who had ever lived and what I saw of her, yeah, you can't tell,
616
1:13:29 --> 1:13:34
that she was unassuming, she looked like the perfect target for someone who wanted to cover
617
1:13:34 --> 1:13:[privacy contact redaction]ivity in the NHS and... sorry, yes, and to blame someone criminally,
618
1:13:45 --> 1:13:[privacy contact redaction]rt blame, civil blame from themselves or even criminal behaviour, blame for criminal
619
1:13:54 --> 1:14:01
behaviour by the NHS in this case and the NHS in my opinion is a cult, it's a deadly cult and so
620
1:14:01 --> 1:14:07
everything screamed to me that the person I was telling you about this morning, who I think was a
621
1:14:07 --> 1:14:13
legal scholar or he wasn't a lawyer, he wasn't qualified, but he was very interested in the law,
622
1:14:13 --> 1:14:18
he'd been researching it for a long time and he turned his attention to the Lucy Letby case,
623
1:14:18 --> 1:14:25
I don't know who suggested it, and he gave a monologue of about half an hour to an hour
624
1:14:26 --> 1:14:32
that grabbed my attention; it was posted on the Doctors for Patients group in the UK,
625
1:14:32 --> 1:14:[privacy contact redaction]ors patients UK group and so someone within the group who I need to find out
626
1:14:39 --> 1:14:[privacy contact redaction]ed in the Lucy Letby case and knew a lot more about it than I did, but I listened to this
627
1:14:44 --> 1:14:[privacy contact redaction]ened to you today, I couldn't believe that you also knew about the Lucy Letby case and you
628
1:14:51 --> 1:14:59
also think that there's at least a possibility that she is wrongly in prison, but worse than that
629
1:14:59 --> 1:15:05
the world needs to know that she's been really demonised in the UK, she had no defence virtually,
630
1:15:05 --> 1:15:[privacy contact redaction]ymied in some way, goodness knows what, but he only produced one
631
1:15:13 --> 1:15:20
witness for the defence and that was the plumber and to get people's interest in this story and to
632
1:15:20 --> 1:15:26
hopefully someone will research it, it looks like the NHS had problems with sewage
633
1:15:27 --> 1:15:[privacy contact redaction]s in the ceilings of the wards but leaking and yeah.
634
1:15:34 --> 1:15:41
It's, I mean, there were a whole lot of sepsis issues in different hospitals at around that time, that
635
1:15:41 --> 1:15:47
sort of period from about 2014 to 2017. There were issues with sepsis, and usually it was
636
1:15:47 --> 1:15:[privacy contact redaction]erial or viral sepsis, across a range of hospitals across the sort of the Midlands all
637
1:15:55 --> 1:16:[privacy contact redaction] up to Liverpool. Now the issue at the time that's specific to the unit where Lucy Letby was
638
1:16:06 --> 1:16:12
and it's you know I myself have only just gotten a little bit of the history of the building,
639
1:16:12 --> 1:16:16
I didn't know much about the history of the building except I knew about the plumbing in
640
1:16:16 --> 1:16:[privacy contact redaction]uff from the evidence that was given and from looking at some of the data.
641
1:16:22 --> 1:16:29
But the book that I've been reviewing at the moment by Paul Bamford, he talks about the
642
1:16:29 --> 1:16:[privacy contact redaction] that he worked in that building literally when it was originally opened and it started out
643
1:16:35 --> 1:16:44
as a, I would call it a kludge. It was a little single-floor bungalow originally where a mother,
644
1:16:44 --> 1:16:[privacy contact redaction], a mother would go in, she'd have a baby, there was a couple little
645
1:16:48 --> 1:16:55
sort of neonatal rooms with cots so you know back in the old days of course when our mothers had
646
1:16:55 --> 1:17:00
their babies often you know the nurses would take the baby and put them in the little room across
647
1:17:00 --> 1:17:[privacy contact redaction] Well that's what this building was, it was a little red brick bungalow and then after
648
1:17:06 --> 1:17:12
about [privacy contact redaction]etely you know unfit for purpose, it was too small at
649
1:17:12 --> 1:17:18
this point, it wasn't meeting what they needed so they even while the building was in use they
650
1:17:18 --> 1:17:[privacy contact redaction]ory and so they operated with that second story for a while
651
1:17:25 --> 1:17:31
and that, you know, we're talking 56 years ago thereabouts, maybe a little bit more, maybe 60,
652
1:17:31 --> 1:17:38
maybe pushing 60 years. Then they decided they needed more rooms, so they built these two clip-on
653
1:17:38 --> 1:17:43
wings either side of the building, and then they decided they needed to connect that building to
654
1:17:43 --> 1:17:[privacy contact redaction] of the hospital campus. So they then built, you know, as most hospitals do, sort of
655
1:17:47 --> 1:17:[privacy contact redaction]s that goes to other places. So, you know, it makes sense if you had to
656
1:17:54 --> 1:17:[privacy contact redaction] delivered who needed to be treated somewhere else in the hospital, it makes
657
1:17:59 --> 1:18:[privacy contact redaction]s be transporting in an enclosed space. That's, you know, normal. But as
658
1:18:06 --> 1:18:12
the building was sort of kludged bit onto bit and then bit onto bit, what you had was these
659
1:18:12 --> 1:18:[privacy contact redaction] iron pipework, and in the case of the room where Lucy Letby mainly worked,
660
1:18:22 --> 1:18:[privacy contact redaction] poorly of these babies were being housed, that pipe ran from about two foot
661
1:18:31 --> 1:18:35
in from that corner, diagonally across the middle of the room.
662
1:18:37 --> 1:18:43
Right, and that's taking the sewerage, that's taking toilet water and water from sluice machines and
663
1:18:43 --> 1:18:[privacy contact redaction]airs, and it's moving it from that corner to that corner, and you've got
664
1:18:50 --> 1:18:56
incubators underneath it. Now, part of the evidence that was given and part of the
665
1:18:56 --> 1:19:[privacy contact redaction]s that were delivered under discovery, what you learn is that these old
666
1:19:03 --> 1:19:[privacy contact redaction] iron pipes were prone to, and you see it sometimes. I mean, if anyone in the UK, if you've
667
1:19:09 --> 1:19:16
got that underfloor heating from about 30 years ago, every so often you'll see somebody who's got
668
1:19:16 --> 1:19:20
to rip up their slab and replace the metal pipes in their underfloor heating, because
669
1:19:20 --> 1:19:25
the little pipes get pitting, and eventually the steam gets out and it comes up through your
670
1:19:25 --> 1:19:30
concrete slab and wets your carpet. Same sort of thing with these cast iron sewerage pipes that
671
1:19:30 --> 1:19:38
were in the roof. Over time, of course, the pipe rusts, it breaks down, it would pit, and so you had
672
1:19:38 --> 1:19:43
all these little micro-pits. Where, you can only imagine, they're taking off, you always have a false
673
1:19:43 --> 1:19:[privacy contact redaction]rial office false ceiling, you take off those tiles that are
674
1:19:49 --> 1:19:[privacy contact redaction] and foam and whatnot, and they'd be covered in the fine
675
1:19:57 --> 1:20:[privacy contact redaction]uff that was growing from whatever had come out of these pipes. So that's
676
1:20:03 --> 1:20:[privacy contact redaction] clue that something was wrong. The second clue, of course, is the evidence that was given
677
1:20:08 --> 1:20:[privacy contact redaction]umber talks about the fact that, on a weekly basis, somebody was having
678
1:20:13 --> 1:20:[privacy contact redaction] and, you know, bear in mind it consists of four rooms and a nurse's area,
679
1:20:20 --> 1:20:25
a nurse's station and a sluice room and a few other bits and pieces, you know, a medication room and so
680
1:20:25 --> 1:20:34
on. You've got these rooms where, next door to them, you've got the maternity unit,
681
1:20:34 --> 1:20:39
and so you sometimes had flooding in the showers in the patient rooms in the maternity unit,
682
1:20:39 --> 1:20:45
and then flooding would come through and up the sink pipes and up the floor drains in the neonatal
683
1:20:45 --> 1:20:52
unit. So you've got that as well, you've got all these sources of infection. But all that would happen
684
1:20:52 --> 1:20:58
is, you know, the time that the water came up through the floor that the chap gave evidence
685
1:20:58 --> 1:21:04
about, all that happened literally was, once it was all drained and cleaned away, oh well, we get in with
686
1:21:04 --> 1:21:[privacy contact redaction]ant, we mop the floor, isn't that great? Well, the answer is no, the answer is no.
687
1:21:08 --> 1:21:[privacy contact redaction] of that type of bacteria, especially when it gets in those roof
688
1:21:13 --> 1:21:20
tiles, or when it gets under the linoleum on the floor, or when it gets in and around the tap
689
1:21:20 --> 1:21:26
fixtures. You know, you can be putting bleach, pure bleach, down the sink fixture every
690
1:21:26 --> 1:21:33
day and it's just regrowing. You know, all you're doing is pausing it. And one of the examples...
691
1:21:34 --> 1:21:40
But that's half an hour, that's half an hour already. Yeah, let's just say so. But I'm just trying to get
692
1:21:40 --> 1:21:46
to the point, Scott, and I'm helping you, trying to help you. So it seems to me that the NHS had
693
1:21:46 --> 1:21:55
problems in other hospitals, in other parts of the country, on exactly this leakage from
694
1:21:55 --> 1:22:01
the pipes. And then it looks to me that they were afraid that it was
695
1:22:01 --> 1:22:08
going to be pinned on them at the neonatal unit where all these babies had died. And so instead of
696
1:22:08 --> 1:22:[privacy contact redaction]ead of accepting the blame or allowing an investigation into it, somehow or other the
697
1:22:15 --> 1:22:22
police got involved and there was a criminal trial. And so I expressed the view to a lawyer, or
698
1:22:22 --> 1:22:29
was it a lawyer? I can't remember who. I expressed the view that, you know, why did they turn what
699
1:22:29 --> 1:22:[privacy contact redaction] been a civil matter into a criminal matter, but targeting, truly targeting, one person,
700
1:22:35 --> 1:22:42
if that's what occurred. And the answer I got was, it's a lot less money to get someone
701
1:22:42 --> 1:22:49
charged criminally, to avoid blame for you, for the NHS in this case, but also the
702
1:22:49 --> 1:22:55
damage to reputation. But I couldn't believe that that could happen, you know. If it
703
1:22:56 --> 1:23:02
has happened, I can't believe that that could happen in the UK, and people who knew about it,
704
1:23:03 --> 1:23:10
including presumably the police, unless someone was driving the crime with people knowing the
705
1:23:10 --> 1:23:[privacy contact redaction]e only had little bits of information. I don't know, it's just incredible.
706
1:23:15 --> 1:23:23
But it's okay. Okay. Oh, can I just answer that with something? There's a Substack
707
1:23:23 --> 1:23:28
article I wrote back very early in the Lucy Letby series. I've written a series of about 20 articles
708
1:23:29 --> 1:23:34
paralleling what happened at the Countess of Chester with an American hospital that had the same
709
1:23:34 --> 1:23:40
problem, literally had babies dying in a neonatal unit because of what was suspected
710
1:23:40 --> 1:23:[privacy contact redaction] the way that they dealt with it, the American hospital, what they
711
1:23:46 --> 1:23:51
did was they went, okay, well, rather than calling the police, let's call in... they called in
712
1:23:51 --> 1:23:[privacy contact redaction] they called in a pathologist. You know, pathologists usually deal with either of
713
1:23:58 --> 1:24:03
two things: they either deal with fluids or they deal with physical tissue and bodies. They called
714
1:24:03 --> 1:24:[privacy contact redaction] to go around and do a whole heap of tests in the hospital, track down whereabouts
715
1:24:08 --> 1:24:13
in the hospital the pathogens were living, right? And then they worked out, I mean, they did actually
716
1:24:13 --> 1:24:[privacy contact redaction]s on all of the neonatal nurses in this American unit. In the end they
717
1:24:19 --> 1:24:[privacy contact redaction]ually unintentionally picking up the bacteria
718
1:24:28 --> 1:24:[privacy contact redaction]er, if I can pronounce that correctly. They were picking it up;
719
1:24:33 --> 1:24:40
one of them had it under her nails, and so that was how it got around. Now, if that had been Countess
720
1:24:40 --> 1:24:[privacy contact redaction] done was taken her to court and, you know, criminalized her for
721
1:24:46 --> 1:24:[privacy contact redaction] that somehow she ended up with bacteria under her fingernails. That was where they found it.
722
1:24:51 --> 1:24:[privacy contact redaction]e they looked at the fact that, okay, it's a unit-wide
723
1:24:56 --> 1:25:01
problem, we found it in some of the sinks, we found it, you know, under a drain. In another hospital,
724
1:25:01 --> 1:25:07
another part of the hospital, they found it in a brand new faucet. The faucet had been, so the
725
1:25:07 --> 1:25:[privacy contact redaction]alled in the hospital, some water had been run through to prove that it was
726
1:25:12 --> 1:25:18
ready, and then [privacy contact redaction] what they didn't know was, in the freshwater,
727
1:25:18 --> 1:25:[privacy contact redaction]erias had gotten in, and so they had bacteria in that faucet
728
1:25:25 --> 1:25:[privacy contact redaction]e of babies. But, you know, they looked at the facility
729
1:25:29 --> 1:25:35
and looked at, how can we clean it, how can we stop it from happening? Here it seems to be, grab the
730
1:25:35 --> 1:25:[privacy contact redaction] person on the totem pole. There's so many cases that I've read.
731
1:25:40 --> 1:25:46
I often read, like, the NMC reports, and I read things like the Health and Disability Commission
732
1:25:46 --> 1:25:[privacy contact redaction]e and look at the nurses and midwives. Nurses and midwives,
733
1:25:51 --> 1:25:57
because they're the lowest rank on the totem pole, so to speak, end up taking the blame, often for
734
1:25:57 --> 1:26:[privacy contact redaction]or who cut a patient wrong, who bled out in surgery; somehow it's the
735
1:26:03 --> 1:26:08
nurse's fault, that sort of thing. That seems to be a lot of what's happening. One of the things that
736
1:26:08 --> 1:26:16
the jury were never told with Lucy Letby: Lucy Letby had originally been charged with eight deaths.
737
1:26:17 --> 1:26:21
The judge, before trial, had removed one of those deaths. The jury weren't told that.
738
1:26:21 --> 1:26:27
Secondly, the jury weren't told that during the period for the seven deaths that they were told
739
1:26:27 --> 1:26:33
about, there'd been that eighth death plus nine others that had occurred. The only reason that
740
1:26:33 --> 1:26:38
she didn't stand trial for the nine others was because they couldn't put it on a spreadsheet
741
1:26:38 --> 1:26:43
and go, yes, Lucy was definitely anywhere near any of those nine others. Outrageous.
742
1:26:43 --> 1:26:49
yep yeah that's all right it really needs to be investigated charles yep i think it's a it's an
743
1:26:49 --> 1:26:57
issue on an agenda that this deserves a group being formed to support scott in pursuing this
744
1:26:57 --> 1:27:04
but let's move on because tempus fugit as the romans would say or tempus fugit as
745
1:27:04 --> 1:27:[privacy contact redaction]ion all right now before we get to jerry waters our favorite irish
746
1:27:12 --> 1:27:18
medic here i want to bring everybody's attention before we started the uh the status meeting i
747
1:27:18 --> 1:27:24
talked about this nonsensical article in the australian newspaper of yesterday that autism
748
1:27:24 --> 1:27:[privacy contact redaction]an is being created as you know in 1986 in america one in four one in ten
749
1:27:32 --> 1:27:[privacy contact redaction]ic now it's down to one in 32 andy wakefield who's presented to us said
750
1:27:38 --> 1:27:[privacy contact redaction]ralia it's one in 40 have autism and now what they want to do is
751
1:27:44 --> 1:27:50
to make an autism friendly society but of course as we said at the start of this we don't know what
752
1:27:50 --> 1:27:57
causes autism but it's not vaccines so let me show you something that arrived scott as you started
753
1:27:57 --> 1:28:03
speaking this morning into my inbox everybody this is very relevant jerry please bear with me
754
1:28:03 --> 1:28:[privacy contact redaction]en's health defense
755
1:28:12 --> 1:28:16
and mary holland who's the president of children's health defense has presented to us
756
1:28:16 --> 1:28:23
we're looking for a million dollar match look at this demanding justice for vaccine injured children
757
1:28:23 --> 1:28:29
means ending the fraud around autism and we're closer than ever read on for decades children who
758
1:28:29 --> 1:28:34
developed autism after receiving routine vaccines have been denied justice their parents have been
759
1:28:34 --> 1:28:40
ridiculed gaslighted left to cope on their own kids suffered
760
1:28:41 --> 1:28:46
all because the national vaccine injury compensation program determined vaccines didn't
761
1:28:46 --> 1:28:[privacy contact redaction] claims those three claims determined the fate of over [privacy contact redaction]en
762
1:28:53 --> 1:28:58
in the omnibus autism proceeding and those [privacy contact redaction]eds of thousands of
763
1:28:58 --> 1:29:[privacy contact redaction]en including not just america of course this twisted miscarriage of
764
1:29:03 --> 1:29:[privacy contact redaction]ed the u.s supreme court decision in bruesewitz v wyeth largely shielding
765
1:29:10 --> 1:29:16
pharma from liability for vaccine injury what if the rulings were based on fraud
766
1:29:17 --> 1:29:[privacy contact redaction]en's health defense thinks so today we filed a motion to expose the fraud upon the courts we
767
1:29:23 --> 1:29:[privacy contact redaction]ice lawyers committed fraud that led to a supreme court decision that
768
1:29:29 --> 1:29:37
let big pharma off the hook for injuring millions of children loyal children's health defense donors
769
1:29:37 --> 1:29:[privacy contact redaction]epped in to help us launch a million dollar match campaign i'm asking you for your
770
1:29:43 --> 1:29:47
support to help fund the critical work of demanding answers and justice for all children injured by a
771
1:29:47 --> 1:29:[privacy contact redaction]y to make a donation now april is autism awareness month you
772
1:29:54 --> 1:29:59
see now i'm realizing why this was in the front page of the australian or as some now defeatedly
773
1:29:59 --> 1:30:06
call it autism acceptance month as in past aprils health agencies and the media will tell us
774
1:30:06 --> 1:30:[privacy contact redaction]en now one in [privacy contact redaction]ralia
775
1:30:12 --> 1:30:19
are diagnosed we might hear and on it goes okay so i'll bring that to your attention very relevant
776
1:30:19 --> 1:30:23
and all of us need to speak up about this fraud thank you jerry
777
1:30:28 --> 1:30:29
you're on mute jerry
778
1:30:36 --> 1:30:45
am i unmuted good to go now yeah yeah um hiya scott uh just by way of introduction i'm a gp
779
1:30:45 --> 1:30:51
from ireland 40 years as a gp a very very very busy practice
780
1:30:52 --> 1:30:59
saw somewhere between 250 and 300 000 consultations over a 40 year period
781
1:30:59 --> 1:31:06
in 2020 i refused to go along with the covid hoax the covid pathogenicity hoax
782
1:31:06 --> 1:31:15
um wouldn't mask wouldn't social distance examined all my patients got a 100 percent success rate with my
783
1:31:15 --> 1:31:22
patients nobody died nobody even got sick from covid refused to do the pcr tests and was eventually
784
1:31:22 --> 1:31:[privacy contact redaction]er because i asked too many questions in three years ago
785
1:31:30 --> 1:31:40
and i'm still suspended that's a quick synopsis of who i am um what are two comments this idea
786
1:31:40 --> 1:31:47
of you feeling that your stepmother was instrumental in helping you do
787
1:31:47 --> 1:31:53
do what you did i've always claimed i've got a chip on my shoulder i've gone through life
788
1:31:53 --> 1:31:59
needing to prove not only to myself but everybody else that i was as good as them good as them if
789
1:31:59 --> 1:32:03
not better and i'm sure that's what you're saying so it's not a bad idea to be told that you're a
790
1:32:03 --> 1:32:08
failure or you're never going to amount to anything so congratulations on having come through that
791
1:32:09 --> 1:32:15
um the the other thing like that i did a phd and i don't know why i did a phd in sociology back in
792
1:32:15 --> 1:32:[privacy contact redaction]ually got it because they wouldn't give it to me unless i gave
793
1:32:21 --> 1:32:[privacy contact redaction]ually just making too much money in general practice in a
794
1:32:25 --> 1:32:[privacy contact redaction] couldn't afford to abandon it for a year but they wouldn't actually
795
1:32:31 --> 1:32:39
give me my phd unless it was a research phd and unless i gave up my general practice for a year which
796
1:32:39 --> 1:32:44
which wasn't a possibility because the practice would dissipate but getting on to computers and
797
1:32:44 --> 1:32:50
i'm going to say something that that's going to shock everybody here there's absolutely no need
798
1:32:50 --> 1:33:[privacy contact redaction]ice i refused in in the in the mid 80s early 90s the hse decided they'd
799
1:33:01 --> 1:33:06
buy us computers they'd buy the gps computers and pay for our software and i said no
800
1:33:07 --> 1:33:13
because i realized because they wanted access to my patients records and i said no i'm not going
801
1:33:13 --> 1:33:[privacy contact redaction] i refused to put my patients records on computer and i continued right up to
802
1:33:21 --> 1:33:31
[privacy contact redaction]s the point about you know the records is i worked
803
1:33:31 --> 1:33:41
for a thing called kdoc an out-of-hours service and we were covering something like 200 and
804
1:33:41 --> 1:33:[privacy contact redaction]ors we're doing 460 000 consultations a year and we had no access to
805
1:33:50 --> 1:33:[privacy contact redaction]s because of the the sort of diversity and because we weren't related we weren't actually
806
1:33:56 --> 1:34:02
part of the government we couldn't access the gps records with the result that we every patient
807
1:34:02 --> 1:34:11
we saw 460 000 a year and we saw them de novo and when they rang in with a complaint
808
1:34:11 --> 1:34:[privacy contact redaction] would say bring your medicine in and as soon as a patient a every
809
1:34:17 --> 1:34:[privacy contact redaction]ory they just know the medical history and are somebody with them
810
1:34:25 --> 1:34:[privacy contact redaction]ory secondly as soon as all the medication they were on you knew exactly what
811
1:34:30 --> 1:34:35
was wrong with them and i discussed this with my medical protection society once i said there's
812
1:34:35 --> 1:34:[privacy contact redaction]s in general practice and because i was keeping paper records
813
1:34:40 --> 1:34:46
they were very sparse unless there was some idea that there was a medical there'd be a medical
814
1:34:46 --> 1:34:[privacy contact redaction] traffic accident or you had an assault or something that that was
815
1:34:52 --> 1:34:[privacy contact redaction] you took copious notes but with snotty noses and earaches you weren't so
816
1:34:58 --> 1:35:[privacy contact redaction]udious or so what i remember saying is there's actually no need for records in
817
1:35:07 --> 1:35:[privacy contact redaction]ice because the patient knows what what's wrong with them and within within
818
1:35:14 --> 1:35:21
30 seconds of talking to a patient you know what's wrong with them and not only that but a lot of
819
1:35:21 --> 1:35:[privacy contact redaction]ice more so than what you were doing a lot of the the good doctors get to know
820
1:35:27 --> 1:35:[privacy contact redaction]e that they're seeing they know their patients so you know i mean i i had a penetrating
821
1:35:33 --> 1:35:39
eye injury to my left eye when i was nine and so for a good 15 years or so was seen by the same
822
1:35:40 --> 1:35:[privacy contact redaction] he recently retired as a chap in sydney called bill barnett william barnett
823
1:35:48 --> 1:35:[privacy contact redaction] finished his specialization his clinic he used to run and he
824
1:35:55 --> 1:36:[privacy contact redaction]e who had eye problems in tamworth and armidale and
825
1:36:01 --> 1:36:09
but barraba and bingara and all sorts of towns around he had the giant um round sort of
826
1:36:09 --> 1:36:15
turntable thing in the back of his reception room and all it had in it was those tiny wee little
827
1:36:15 --> 1:36:[privacy contact redaction]s yeah that's what i had well yeah all he would do and and i mean i've i've seen he
828
1:36:23 --> 1:36:[privacy contact redaction]s i've seen my cards the whole thing is it's it's little
829
1:36:27 --> 1:36:[privacy contact redaction]ures and then pitman shorthand he wrote down the three or four things that he knew he'd want
830
1:36:33 --> 1:36:38
to remember for next time everything else though i could pass him in the street and he'd stop and
831
1:36:38 --> 1:36:42
he you know hello how are you you know how's it going is it better you know you're feeling better
832
1:36:42 --> 1:36:49
today is the you know it was it was a you had a personal relationship with this person who you
833
1:36:49 --> 1:36:57
went to see on occasion yeah see this is the point because you didn't keep records on a computer and
834
1:36:57 --> 1:37:05
that by necessity you remember things yeah and i even still years later i still remember
835
1:37:06 --> 1:37:12
somebody's house number i know where they live and i remember i remember lots of things i remember
836
1:37:12 --> 1:37:[privacy contact redaction]e say to me surely i haven't seen you in 10 years and you're still asking me
837
1:37:16 --> 1:37:23
about that by necessity i committed that to to my brain and i remember when they were talking
838
1:37:23 --> 1:37:27
about buying computers for general practice and i said no i said i've actually got
839
1:37:27 --> 1:37:[privacy contact redaction]even was saying saying saying earlier on i've actually got the best computer
840
1:37:33 --> 1:37:40
in my surgery it happens to be my brain but the other thing is patient patients love coming into
841
1:37:40 --> 1:37:45
me because i communicated with them you know person to person and they always said oh it's
842
1:37:45 --> 1:37:[privacy contact redaction]or who's where there isn't a three-way conversation between the doctor
843
1:37:51 --> 1:37:57
and the patient and the computer because the doctor is under pressure to record everything sort of
844
1:37:57 --> 1:38:04
contemporaneously and they're trying to get everything down and that creates a problem
845
1:38:04 --> 1:38:11
and lengthens the consultation whereas i could jot down in it as you say in my shorthand now my
846
1:38:11 --> 1:38:18
shorthand was illegible even to me at times but it didn't matter and i never ended up in trouble
847
1:38:18 --> 1:38:29
after 40 years as a gp other than when i refused to go along with the hoax but overall i never
848
1:38:29 --> 1:38:38
had any medico-legal problems despite my sort of four words on a consultation
849
1:38:40 --> 1:38:45
yeah no i totally totally agree that this this thing with the imposition of the computer yes
850
1:38:45 --> 1:38:51
it's about collecting data and yes it's about you know somebody else getting the data and making
851
1:38:51 --> 1:38:57
money but at the same time i think it's also about this it's it's part of this push that's ended up
852
1:38:57 --> 1:39:[privacy contact redaction]ice going so defensive that you know you see that that's about the medical
853
1:39:04 --> 1:39:[privacy contact redaction]ion society and they're pushing the records and then you're actually caught on the quality of
854
1:39:10 --> 1:39:[privacy contact redaction]s you know your defense your defense association is actually pushing you into this
855
1:39:15 --> 1:39:21
oh they'd come out you know the company asking where's my paper records i say there
856
1:39:21 --> 1:39:27
they are there's four words on that consultation i mean they can't say i say oh bugger off the
857
1:39:27 --> 1:39:33
reality of it is as far as i'm concerned i don't need records in my general practice i you know i
858
1:39:34 --> 1:39:39
generally was quite you know i wouldn't say offensive with them but i said you know
859
1:39:39 --> 1:39:45
it's you guys in the mps who want us to keep records not us the only thing is i
860
1:39:45 --> 1:39:51
didn't like the idea that when it came down to a revenue auditor and not that in any way would i
861
1:39:51 --> 1:40:00
try to cheat the revenue out of their due um but it all made it too handy
862
1:40:00 --> 1:40:[privacy contact redaction]e to come in and be able to push a button and know how many
863
1:40:05 --> 1:40:12
patients you saw on what day and you know how long you spent with them and so i i never computerized
864
1:40:12 --> 1:40:18
and you know after 40 years i was delighted brilliant jerry it's a great story and we're
865
1:40:18 --> 1:40:26
going to keep moving and everybody i refer this book to you by tim ferriss tools of titans a
866
1:40:26 --> 1:40:34
magnificent book of insights i recommend from tim ferriss's interviews of some 300 amazing human
867
1:40:34 --> 1:40:41
beings one of them is cal fussman if you google him who apparently i've never heard of is a
868
1:40:41 --> 1:40:[privacy contact redaction]-selling author and writer at large for esquire magazine anyway tim just got
869
1:40:48 --> 1:40:54
wonderful nuggets of gold i put the link into the chat i recommended probably some years ago
870
1:40:54 --> 1:41:02
but cal fussman said this jerry this reminds me of what you just said and when cal once asked harry
871
1:41:02 --> 1:41:[privacy contact redaction] of snakes how he could remember anything given how much booze
872
1:41:08 --> 1:41:[privacy contact redaction]ugs he consumed harry kept no diary his response was boy the good shit sticks
873
1:41:18 --> 1:41:25
yeah well that's that's the point you hang on your computer the good shit sticks so there you are
874
1:41:25 --> 1:41:31
everybody so sorry to divert but this book the tools of titans tim ferriss excellent put in the
875
1:41:32 --> 1:41:41
chat thank you jerry charles jerry doesn't drink like a good irishman no no no well i i i don't drink
876
1:41:41 --> 1:41:46
prior to coming on a zoom meeting because i tend to get a little aggressive and i might become
877
1:41:46 --> 1:41:[privacy contact redaction]reperous so yeah i can assure you i've never ever drunk on this in this forum now scott you
878
1:41:53 --> 1:41:[privacy contact redaction]ease because this is a well-kept secret he never shows his his
879
1:41:59 --> 1:42:05
long hair scott his long hair with his ponytail and he's a motorbike riding
880
1:42:07 --> 1:42:16
legend around ireland thank you jerry glenn hi scott very interesting uh your dialogue around
881
1:42:16 --> 1:42:[privacy contact redaction]arting point of the enterprise solution bus is quite interesting and highly parallels my
882
1:42:22 --> 1:42:[privacy contact redaction]ry only i had somewhere between two and three decades jump on
883
1:42:27 --> 1:42:35
you i i created and and uh and was architect of the most successful message bus say asynchronous
884
1:42:35 --> 1:42:[privacy contact redaction] message bus during the 1990s uh worldwide um and uh and and uh to some extent
885
1:42:46 --> 1:42:52
it very much allows enterprise solutions to be layered on it and many many people described
886
1:42:52 --> 1:42:57
how they did that with with our our tool and the fact that being both durable and asynchronous
887
1:42:57 --> 1:43:[privacy contact redaction]ually operate without all the parts running at once um now
888
1:43:04 --> 1:43:10
you're i i find extremely interesting your history and your insight in that you uh you you found that
889
1:43:10 --> 1:43:14
a variety of things you hoped would turn into solutions in fact turned against people
890
1:43:14 --> 1:43:[privacy contact redaction]ead turned into control mechanisms uh for corporations and
891
1:43:21 --> 1:43:29
what many of us around here will talk about as the evil elite and uh so and and but it's not new
892
1:43:29 --> 1:43:34
i mean this isn't something in the last two decades this clearly goes all the way back to world war two
893
1:43:34 --> 1:43:41
and everything that occurred with the holocaust and the cooperation of ibm at&t ford motor
894
1:43:42 --> 1:43:50
uh lockheed uh aircraft and and so on that so so linked together did things in a uniform way
895
1:43:50 --> 1:43:56
on behalf of the rockefeller family and therefore extended a wide set of monopolies
896
1:43:56 --> 1:44:05
across everything and and that's that's going on even now um so uh and your description at
897
1:44:05 --> 1:44:10
at each time you try to do something you know i'm sure you're familiar with the terms kiss and gigo
898
1:44:11 --> 1:44:[privacy contact redaction]upid and gigo is garbage in garbage out and and that at each time
899
1:44:18 --> 1:44:[privacy contact redaction]e and try to make things that could be readily reusable and produce results
900
1:44:24 --> 1:44:[privacy contact redaction]ead you are forced to do it in a more complicated way to seed confusion and i
901
1:44:30 --> 1:44:38
would claim that's a standard mechanism of of marxism uh and and the elites attempt to prevent
902
1:44:38 --> 1:44:46
anyone from figuring out what they're doing in order to uh to have a result back so with now
903
1:44:46 --> 1:44:51
that you've gone down these trails and and you've discovered what i would call the atoms in the
904
1:44:51 --> 1:45:[privacy contact redaction] you come to conclusions of how to fight back uh in any any
905
1:45:00 --> 1:45:10
particular pattern uh say other than from a spiritual one um it's to to some degree yes
906
1:45:10 --> 1:45:15
you know i mean obviously there are there are there are things that you know that i try and espouse
907
1:45:16 --> 1:45:[privacy contact redaction] an iphone or smartphone like most people um very very little
908
1:45:23 --> 1:45:29
actually goes on that device you know i'm i'm very careful about the fact that you know i i don't as
909
1:45:29 --> 1:45:34
a general like i wouldn't put a facebook app on that device i don't put the microsoft or google
910
1:45:34 --> 1:45:40
authenticators on that device because i know i know how they um even even without apple alerting
911
1:45:40 --> 1:45:[privacy contact redaction]e doesn't alert you they're collecting things like your location whenever
912
1:45:44 --> 1:45:50
you're using your authenticator app so i don't put those it literally is barely smarter than a
913
1:45:50 --> 1:45:[privacy contact redaction] that i use it um you know there's a lot of things like as i wrote a substack
914
1:45:56 --> 1:46:[privacy contact redaction]e of days ago you know as microsoft each i don't know whether you've noticed
915
1:46:01 --> 1:46:08
but each time that there's that um there's one of these things where oh you know 100 000 or a million
916
1:46:08 --> 1:46:[privacy contact redaction]e's records got stolen from azure cloud or from amazon cloud or from wherever
917
1:46:15 --> 1:46:21
else um what's the first thing that happens the first thing that happens is they turn around and
918
1:46:21 --> 1:46:26
they go oh well what we want you to do now is we want you to do two-factor authentication so we want
919
1:46:26 --> 1:46:31
your mobile phone number so that we can send you a text message right and then they turned around
920
1:46:31 --> 1:46:36
and said oh but you know what's happening sometimes is these text messages you know they're not safe
921
1:46:36 --> 1:46:39
on your phone they're getting intercepted you know people are getting them off your phone or
922
1:46:39 --> 1:46:44
you're sharing them with other people so the next thing was we want you to put this authenticator
923
1:46:44 --> 1:46:49
app on your device if my device isn't secure for the text message it's really not secure for the
924
1:46:49 --> 1:46:[privacy contact redaction]e didn't question it they just do it they go and install
925
1:46:55 --> 1:46:59
the damn thing don't realize that that authenticator app then is sending back
926
1:46:59 --> 1:47:04
microsoft dynamics data it's sending back your location when you use it every time your
927
1:47:04 --> 1:47:10
phone syncs email it's sending your location data and other data back to microsoft so
928
1:47:11 --> 1:47:[privacy contact redaction] do it now we're at the point where in the last two or three weeks there's there's been
929
1:47:17 --> 1:47:[privacy contact redaction] year there's been three more really big data breaches in the microsoft cloud
930
1:47:23 --> 1:47:31
space microsoft's reaction to that is not to secure their system properly you know and i mean
931
1:47:31 --> 1:47:35
there's it guys like me who've been around for for 20 years and yourself you know you've probably
932
1:47:35 --> 1:47:41
been twice that who are telling them you know you don't you you don't secure your entire global
933
1:47:41 --> 1:47:[privacy contact redaction]ure with a single certificate you know that was really dumb that was that was
934
1:47:46 --> 1:47:[privacy contact redaction]acularly dumb you know you don't have one one admin account that controls everything from
935
1:47:53 --> 1:48:00
singapore to to texas that's dumb and yet that's the sort of stuff they supposedly were doing
936
1:48:01 --> 1:48:[privacy contact redaction]e missed is that every time that happened microsoft have escalated these things
937
1:48:07 --> 1:48:13
that they want you to do and all these things do is track you more so now what they want you to do
938
1:48:13 --> 1:48:17
is we got told at the universities all of the university of london groups well you've got to
939
1:48:17 --> 1:48:[privacy contact redaction]all microsoft in tune then you've got to install the microsoft authenticator app
940
1:48:22 --> 1:48:28
and then you've got to you've got to allow the university and therefore microsoft to control
941
1:48:28 --> 1:48:[privacy contact redaction]s and pins are on your device and they can they can use that that
942
1:48:35 --> 1:48:41
in tune process to monitor then what's on your device what apps you're allowed to install it's
943
1:48:41 --> 1:48:48
my thing that i bought the university doesn't give us phones i mean i'm in a situation where
944
1:48:48 --> 1:48:53
the university don't even give me a laptop i own the laptop i own the computer over there that i
945
1:48:53 --> 1:48:58
do my work on they've they've got no ownership they've got no skin in that game yet they want
946
1:48:58 --> 1:49:[privacy contact redaction]all that software on these devices and so i've turned around and refused and what i've
947
1:49:04 --> 1:49:[privacy contact redaction]e how to build virtual machines empty little virtual machines
948
1:49:09 --> 1:49:[privacy contact redaction]uff that they want you to run put it in an empty little virtual
949
1:49:15 --> 1:49:21
machine so that it's not touching the physical computer it's just it's it's a memory process
950
1:49:21 --> 1:49:[privacy contact redaction]uff that i do in that virtual machine quite literally is when i've got
951
1:49:27 --> 1:49:32
to log on to something for the university to you know fill out a payroll form or or fill out a
952
1:49:34 --> 1:49:[privacy contact redaction]udent or something like that every other piece i work piece of work
953
1:49:39 --> 1:49:46
i do i do out on the desktop on both my windows computers and my macbook that i'm sitting on
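[editor's note: the approach described here — running software an institution insists on inside a disposable virtual machine so it never touches the physical computer — can be sketched roughly as below. this is a hypothetical illustration, not the speaker's actual setup: the image name and the choice of qemu are assumptions; the key idea is qemu's -snapshot flag, which discards all guest disk writes on exit.]

```python
import subprocess

def throwaway_vm_command(image: str, ram_mb: int = 4096) -> list[str]:
    """Build a QEMU command line that boots a guest whose disk writes are
    thrown away when it exits (-snapshot), so nothing the guest runs —
    mandated device-management agents included — touches the host machine."""
    return [
        "qemu-system-x86_64",
        "-m", str(ram_mb),   # RAM given to the guest only
        "-hda", image,       # pre-built guest image (hypothetical filename)
        "-snapshot",         # route all disk writes to a temporary overlay
    ]

cmd = throwaway_vm_command("managed-apps.img")
# subprocess.run(cmd)  # uncomment to actually boot the disposable guest
```

the guest can then hold the payroll portal, the mandated authenticator, and so on, while the host stays clean.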
954
1:49:46 --> 1:49:52
i've got firewall programs that are completely separate from the nonsense firewalls that the
955
1:49:52 --> 1:49:[privacy contact redaction]em those firewalls are going through even right now i'm
956
1:49:57 --> 1:50:[privacy contact redaction]e don't realize apple you know supposedly oh apple is so wonderful
957
1:50:05 --> 1:50:11
they're so secure they're so all about your privacy why is it then that more than 1400 times
958
1:50:11 --> 1:50:17
an hour this computer gets blocked by that firewall trying to talk and send data to apple servers
959
1:50:17 --> 1:50:24
about what apps i'm using on the computer and what websites i open so you know the the biggest thing
960
1:50:24 --> 1:50:[privacy contact redaction]e to do is to you know firewall yourself in and stop all of the as much
961
1:50:29 --> 1:50:36
as you can of the data that's going out you know yes more people need to fight back against the
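[editor's note: the firewalling described here boils down to a deny-by-default outbound policy — a destination is reachable only if explicitly approved, and everything else is blocked and tallied. a toy sketch under that assumption; the domain names are invented placeholders, not a real blocklist:]

```python
from collections import Counter

class OutboundFilter:
    """Deny-by-default outbound policy: a connection is permitted only if
    the user explicitly approved its destination; anything else is blocked
    and counted, which is how one notices thousands of telemetry attempts."""

    def __init__(self, allowlist: set[str]):
        self.allowlist = allowlist
        self.blocked: Counter = Counter()

    def allow(self, host: str) -> bool:
        if host in self.allowlist:
            return True
        self.blocked[host] += 1  # tally whatever keeps phoning home
        return False

fw = OutboundFilter({"payroll.example.ac.uk"})
fw.allow("payroll.example.ac.uk")         # permitted: on the allowlist
fw.allow("telemetry.vendor.example.com")  # blocked and counted
fw.allow("telemetry.vendor.example.com")  # blocked again
print(fw.blocked.most_common(1))          # [('telemetry.vendor.example.com', 2)]
```

real application firewalls apply this same policy at the socket layer per process; the sketch only shows the decision logic.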
962
1:50:36 --> 1:50:[privacy contact redaction]uff that's you know i when bill gates funded he sent four
963
1:50:44 --> 1:50:[privacy contact redaction]itute in london and said okay we want you to do reversible
964
1:50:51 --> 1:50:[privacy contact redaction]en now let that sink in for a minute what was bill gates's
965
1:50:57 --> 1:51:06
target it was i'm funding millions of pounds for you to create a digital id for african children
966
1:51:06 --> 1:51:[privacy contact redaction] a look you know i do research with with several african people you
967
1:51:11 --> 1:51:[privacy contact redaction]s like zimbabwe you go and have a look out in the out in the boonies you know
968
1:51:17 --> 1:51:23
yeah they've all got smartphones and stuff but do they actually need a digital id what services is
969
1:51:24 --> 1:51:28
you know microsoft are selling this idea that by having this encrypted reversible so
970
1:51:29 --> 1:51:35
by reversible i mean microsoft can get in and look by having this digital id you know what's
971
1:51:35 --> 1:51:41
really your target am i is that little kid going to as it grows up be able to access services and
972
1:51:41 --> 1:51:[privacy contact redaction]ually have the the money most of these countries for example
973
1:51:46 --> 1:51:51
don't actually have the money or the need or the want to put in all of this electronic digital
974
1:51:51 --> 1:51:57
tomfoolery that that microsoft and co want and you go and have a look you know they're doing the
975
1:51:57 --> 1:52:03
same sort of thing in in the the backwaters of places like india and bangladesh and you go and
976
1:52:03 --> 1:52:07
have a look and the thing is there's no services there for the people so what are what are microsoft
977
1:52:07 --> 1:52:14
and amazon and so on do oh we'll we'll loan you 180 million u.s dollars to to set up this
978
1:52:14 --> 1:52:[privacy contact redaction]ure and therefore we own the infrastructure so you have to use our software and you have to
979
1:52:18 --> 1:52:24
pay us every year that you use it you know it's most of this stuff i tell people go and fight back
980
1:52:24 --> 1:52:[privacy contact redaction]rs licenses are a bad idea they're a really bad idea we've already seen
981
1:52:30 --> 1:52:35
in new south wales where they were introduced where police officers in new south wales have
982
1:52:35 --> 1:52:[privacy contact redaction]op and search on somebody's device by claiming oh gee the the
983
1:52:43 --> 1:52:49
little app that i use to scan your your digital drivers license oh it's not working i just need
984
1:52:49 --> 1:52:54
to take your phone back to my police car and type it into the computer and they go back to the police
985
1:52:54 --> 1:52:59
car and they scan through um there's a couple of police officers recently who got prosecuted for
986
1:52:59 --> 1:53:[privacy contact redaction] that what they did was they scanned through this this rather attractive young girl's
987
1:53:03 --> 1:53:09
phone and took naked photos out of it that she'd taken for her boyfriend these cops took it and
988
1:53:09 --> 1:53:[privacy contact redaction]ributed around all the police officers in the station you know so all of this all of this move
989
1:53:15 --> 1:53:[privacy contact redaction] digital and and to have digital electric cars and stuff it's got nothing
990
1:53:22 --> 1:53:[privacy contact redaction]t net zero's complete nonsense nothing to do with saving the planet it's
991
1:53:27 --> 1:53:34
about being at the point where they can control and they can turn you off are you familiar with
992
1:53:34 --> 1:53:[privacy contact redaction]ralia i've heard the name um i don't all right he's actually on the
993
1:53:42 --> 1:53:49
call hopefully he'll raise his hand come up he's he's he's organized a a localism approach to things
994
1:53:50 --> 1:53:56
that that can you know by face-to-face things cannot be automated and controlled by the evil
995
1:53:56 --> 1:54:[privacy contact redaction]ill need some degree of of communication mechanisms and and if you if
996
1:54:03 --> 1:54:09
there are a variety of things you've uncovered and could be added to his localism uh game plan
997
1:54:09 --> 1:54:[privacy contact redaction] for all of us and for humanity and against the evil the trying
998
1:54:15 --> 1:54:[privacy contact redaction] state thank you thanks glenn thanks glenn good thinking
999
1:54:23 --> 1:54:[privacy contact redaction]ing um what i'd like to ask i've got three questions
1000
1:54:34 --> 1:54:44
one is uh the uk biobank um is the information uh because i'm a member of this right i i had
1001
1:54:44 --> 1:54:[privacy contact redaction]arted in [privacy contact redaction]e that don't know that are uh listening to
1002
1:54:52 --> 1:55:[privacy contact redaction]s data personal data on health and it actually then distributes
1003
1:55:00 --> 1:55:[privacy contact redaction] been asked to come along for some scanning to do body
1004
1:55:09 --> 1:55:15
scans uh and then that data is going to be available it's not available to me you know
1005
1:55:15 --> 1:55:25
it's available to researchers right um is that going to be sold yes all of that biobank data
1006
1:55:25 --> 1:55:[privacy contact redaction]ate that i know for example that um there are organizations who can
1007
1:55:35 --> 1:55:41
go and they can pay a fee of somewhere between six and twenty thousand pound and gain access to so
1008
1:55:41 --> 1:55:47
they can go in they can specify you know we want these types of people um and while it's on it's
1009
1:55:47 --> 1:55:53
anonymized data yes it's for sale and yes it's being used so you've got pharma companies who
1010
1:55:53 --> 1:55:57
are using it um you've got health medical device companies you've got app companies
1011
1:55:58 --> 1:56:[privacy contact redaction] that one of the app that we were doing that one of the app companies was in on
1012
1:56:03 --> 1:56:11
they had literally gone and spent um close to 20 000 pound to get biobank data to use for training
1013
1:56:11 --> 1:56:16
this it was an app for arthritis and they were using it to train their app which is a commercial
1014
1:56:16 --> 1:56:26
app that's offered for sale right so i suppose uh giving my data is being altruistic right but
1015
1:56:27 --> 1:56:33
um i'm i'm now i'm now concerned because i wasn't maybe i'm very naive i wasn't aware that
1016
1:56:33 --> 1:56:39
i thought the whole point of this was that it was going to be available for researchers
1017
1:56:39 --> 1:56:45
without being charged you know i mean a minimal charge maybe for access but not actually selling
1018
1:56:45 --> 1:56:57
the data all right um number two is a a technical uh issue um i recently received from the nhs a
1019
1:56:57 --> 1:57:07
text asking me to go along to my local surgery um to um have a course so i could so my smoking
1020
1:57:07 --> 1:57:[privacy contact redaction] never smoked right so what i was interested in is i logged onto
1021
1:57:16 --> 1:57:[privacy contact redaction]s i couldn't actually find a profile because what i was
1022
1:57:23 --> 1:57:[privacy contact redaction] said smoker nonsmoker you know like have you ever
1023
1:57:29 --> 1:57:36
smoked etc and what i was thinking is that maybe there was a field which was having a null value
1024
1:57:36 --> 1:57:44
and then the programmer has misprogrammed and instead of looking for smoker yes has actually
1025
1:57:44 --> 1:57:[privacy contact redaction] looked for the reverse and maybe testing you know 'is not no' right and of course it's
1026
1:57:52 --> 1:57:58
right and then right so what i wanted to find out is is that something that you think might be
1027
1:57:58 --> 1:58:04
possible because i i've tried to get to the bottom of it because i want to stop it because there are
1028
1:58:04 --> 1:58:[privacy contact redaction]ing let's say the gp's time by phoning in and complaining
1029
1:58:12 --> 1:58:19
right so right it's another it's another kind of like um delay all right there so that was that
1030
1:58:20 --> 1:58:28
so do you think do you think that that's just a simple matter of a program misprogramming
1031
1:58:28 --> 1:58:[privacy contact redaction]s to that the first aspect is the fact that there are these
1032
1:58:33 --> 1:58:43
commercial companies who pay for access to nhs data i did some work for a very little while with
1033
1:58:44 --> 1:58:50
a data group the sort of an offshoot of nhs digital that's running out at leeds hospital and
1034
1:58:50 --> 1:58:55
they spend millions and millions and millions of dollars every year and they sit they a group of
1035
1:58:55 --> 1:58:59
them they all sit there they give each other manager manager you know band seven band eight
1036
1:58:59 --> 1:59:[privacy contact redaction] sit in a little room all day and create ways to extract data from the
1037
1:59:05 --> 1:59:[privacy contact redaction]em to package it up to you know some of it yes goes to legitimate research but others of
1038
1:59:11 --> 1:59:16
it's going to all sorts of other organizations who are paying money in this case what you get
1039
1:59:16 --> 1:59:21
is there are organizations who are getting sponsorship now they're getting paid by the
1040
1:59:21 --> 1:59:27
nhs to provide these courses but at the same time they're getting paid by other companies and pharma
1041
1:59:27 --> 1:59:32
companies because often on these courses they will sell you nicorette or they'll sell you some
1042
1:59:32 --> 1:59:37
books about you know the books and pamphlets or they'll sell you something else you know this
1043
1:59:37 --> 1:59:43
retreat you can go to and whatnot so there's there's money involved at that end as far as the
1044
1:59:43 --> 1:59:50
programming goes usually when you go to when you start with a new gp clinic here in the nhs usually
1045
1:59:50 --> 1:59:54
you'll get like the the nurse will pull you off into a side room and she'll ask you all of these
1046
1:59:54 --> 2:00:[privacy contact redaction]ions for like 45 minutes what they're doing is a whole heap of little surveys that end
1047
2:00:00 --> 2:00:07
up you know if you when you've got access to the patient access back end you know the website
1048
2:00:07 --> 2:00:12
patient access i've seen the back end of that when you get access to the back end of that you can
1049
2:00:12 --> 2:00:[privacy contact redaction]ions she's asking and she's ticking the little boxes in the computer
1050
2:00:18 --> 2:00:[privacy contact redaction]s and of course you know in some cases now there's funding
1051
2:00:23 --> 2:00:28
for the gps based on you know what what all of these little tests were whether or not they
1052
2:00:28 --> 2:00:[privacy contact redaction]ed the data and whether the data is available now in your case what's happened is
1053
2:00:35 --> 2:00:[privacy contact redaction]ly what you say a programmer has probably been told if the person's a smoke if they've got
1054
2:00:41 --> 2:00:47
an absolute yes in their box there's three values for the box it'll be yes no and null or empty
1055
2:00:48 --> 2:00:55
they'll have been told that if a person says yes yes you can assume they smoke therefore send them a
1056
2:00:55 --> 2:01:01
text if a person says no then they probably don't smoke so don't waste the money sending them the
1057
2:01:01 --> 2:01:08
text but if a person has no value in the box if it's a null value then assume that they might smoke
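A minimal sketch of the three-valued selection rule just described; the field values and function name here are hypothetical, for illustration only:

```python
# Sketch of the smoking-cessation text selection rule described above.
# Field values are hypothetical: "yes", "no", or None/empty (never recorded).

def should_send_cessation_text(smoker_status):
    """Send the text unless the record says an explicit 'no'.

    A null or empty field is treated as 'might smoke', so the patient
    still gets the message -- which is why someone whose status was never
    copied into the field the mailer reads can receive these texts.
    """
    if smoker_status == "yes":
        return True   # known smoker: send
    if smoker_status == "no":
        return False  # known non-smoker: don't waste the send
    return True       # null/unknown: assume they might smoke

patients = {"a": "yes", "b": "no", "c": None, "d": ""}
targets = [p for p, s in patients.items() if should_send_cessation_text(s)]
print(targets)  # ['a', 'c', 'd']
```

Only the explicit "no" record is excluded; every null or empty record is targeted, exactly the cheap-to-send, high-return logic described.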
1058
2:01:09 --> 2:01:15
because they know that if they send out you know if they send out tens of thousands of these text
1059
2:01:15 --> 2:01:[privacy contact redaction]e all they've got to do is get you know maybe six or eight or
1060
2:01:21 --> 2:01:26
ten out of every thousand text messages result in a response and they're making tens of millions
1061
2:01:26 --> 2:01:33
of dollars a year got you got you what was interesting is that i was actually able to
1062
2:01:33 --> 2:01:[privacy contact redaction] smoke right okay and it came back since i joined in 2014
1063
2:01:41 --> 2:01:47
right with non-smoker non-smoker non-smoker non-smoker non-smoker right and i took a
1064
2:01:47 --> 2:01:[privacy contact redaction]ice to say come on guys what's going on all right the other question
1065
2:01:53 --> 2:02:[privacy contact redaction] looked at all of the hospital data from 2012 to it would be 2023 you know so
1066
2:02:04 --> 2:02:11
it's [privacy contact redaction]n't got to the end of 20 you know it's a financial year right 23 to
1067
2:02:11 --> 2:02:19
24 from april to march right so i had a look at all the hospital records and i was looking at
1068
2:02:20 --> 2:02:32
the icd codes all right the four and when i had i was looking for astigmatism and i noticed that
1069
2:02:33 --> 2:02:[privacy contact redaction]igmatism is very very low it's it's it it runs through all of the whole
1070
2:02:42 --> 2:02:49
period less than 500 in the whole year for the whole of england but if you look at all diagnosis
1071
2:02:51 --> 2:03:[privacy contact redaction]arts to go up in about 2017 then when we get the
1072
2:03:01 --> 2:03:12
covid jabs it's going up in leaps and bounds and then in the last two years so 21 22 and 22 23
1073
2:03:12 --> 2:03:25
it it went from 110 000 to 210 000 and the thing is it's all diagnosis and i was trying to understand
1074
2:03:26 --> 2:03:34
how does what's the process because if you come in for one diagnosis how how how do you then process
1075
2:03:34 --> 2:03:[privacy contact redaction]igmatism did they has something changed in the way in which
1076
2:03:41 --> 2:03:47
the nhs is operating there's okay there's there's two parts that answer and one of them actually
1077
2:03:47 --> 2:03:52
allows me to sort of cross back into the previous question you can you can have seven or eight
1078
2:03:52 --> 2:03:[privacy contact redaction] like yours that says non-smoker but you can't guarantee that that's the
1079
2:03:58 --> 2:04:05
field that the person then goes and looks at right now what there's been a couple of things that you
1080
2:04:05 --> 2:04:10
need to when you look at this from sort of a big picture causal perspective there's been a couple
1081
2:04:10 --> 2:04:[privacy contact redaction] changed over the maybe the last [privacy contact redaction]e of things
1082
2:04:17 --> 2:04:[privacy contact redaction]s some of the diagnosis that you talk about so if we look at
1083
2:04:24 --> 2:04:[privacy contact redaction] that you know it's very likely if we went back and we look at what fields were recorded in
1084
2:04:29 --> 2:04:[privacy contact redaction]s [privacy contact redaction] been recorded at all
1085
2:04:35 --> 2:04:[privacy contact redaction] been something that maybe your ophthalmologist and your optometrist
1086
2:04:39 --> 2:04:45
knew about and you know so they knew about it locally but it wasn't necessarily we don't know
1087
2:04:45 --> 2:04:[privacy contact redaction]ed in the database i'd have to go and have a look at those databases
1088
2:04:49 --> 2:04:57
the other thing that's changed since about the middle of the last decade so since about 2015 2016
1089
2:04:58 --> 2:05:04
is when you go sort of to go and get your glasses or your contact lenses they've started doing all
1090
2:05:04 --> 2:05:[privacy contact redaction]ick your head in a machine and get you to look at like a
1091
2:05:08 --> 2:05:13
balloon that's that's moving around the screen or a bunny rabbit hopping or something like that
1092
2:05:14 --> 2:05:[privacy contact redaction]s right a lot of these different tests you know you don't
1093
2:05:19 --> 2:05:24
realize necessarily what that data can be used for and often some of these tests can be used to
1094
2:05:24 --> 2:05:30
look at two or three different ophthalmic conditions right but that again one of the
1095
2:05:30 --> 2:05:[privacy contact redaction]s when you go and you look at the do the balloon or the bunny test one of the things
1096
2:05:35 --> 2:05:39
they're looking for is they're looking for that center point of where your where your eyes are
1097
2:05:39 --> 2:05:47
right so where's your focal point and so that's something that again prior to having those tests
1098
2:05:47 --> 2:05:[privacy contact redaction]ing of some of these conditions wasn't automatic and so as they
1099
2:05:55 --> 2:06:[privacy contact redaction]s and making it automatic all of a sudden you see more you see
1100
2:06:00 --> 2:06:[privacy contact redaction] because we see it more times in a data set doesn't
1101
2:06:06 --> 2:06:11
necessarily mean you know in the data you know temporarily doesn't necessarily mean it didn't
1102
2:06:11 --> 2:06:[privacy contact redaction] means that we've become better at finding it or better at recording it or
1103
2:06:17 --> 2:06:23
we've got a field for it now right right scott i think i think you've hit on something i must have
1104
2:06:23 --> 2:06:33
missed it says all english hospitals so is it possible that when you go to the optician
1105
2:06:33 --> 2:06:39
that is feeding into the hospital data i know from i know for mine it definitely did because
1106
2:06:40 --> 2:06:[privacy contact redaction] vision problems because of the fact that i had a penetrating injury in my left eye
1107
2:06:47 --> 2:06:53
and so you know i go and see the gp the gp makes a note i go to the hospital the hospital ophthalmologist
1108
2:06:53 --> 2:07:00
can see the gp's note i then go to the optometrist um because i my eyes over the last sort of
1109
2:07:01 --> 2:07:07
three to six months um i've lost about 40 percent of my my vision in my right eye which is my only
1110
2:07:07 --> 2:07:[privacy contact redaction] and the optometrist already knows what i talked to
1111
2:07:13 --> 2:07:[privacy contact redaction] at the hospital about right okay because the thing is the the this hospital
1112
2:07:20 --> 2:07:27
data right that it keeps going up right from 20 you know it keeps going up right so what what
1113
2:07:27 --> 2:07:34
it probably is is that it's not that we're looking at a signal it's that they've changed the process
1114
2:07:34 --> 2:07:38
and they're collecting more data all right that that's that that's fine thank you thank you very
1115
2:07:38 --> 2:07:49
much scott better move now um scott we've got sam du bay on the call and he's driving his car
1116
2:07:49 --> 2:07:56
and i note on some of your publications that you have a colleague called hey du bay so there you
1117
2:07:56 --> 2:08:04
are so there's not many dubays on the planet so i think kuda is actually um he was for
1118
2:08:04 --> 2:08:[privacy contact redaction] logged out now kuda was actually on the call here now it looks like he's
1119
2:08:09 --> 2:08:14
logged out yes he was he was on the call anyway there you are not many dubays so now we've got
1120
2:08:14 --> 2:08:18
two one in your publications the other question is can you type before we go to gary and then
1121
2:08:18 --> 2:08:[privacy contact redaction]even we're finishing in [privacy contact redaction]ack details
1122
2:08:25 --> 2:08:[privacy contact redaction]e can find your substack i presume it's scott
1123
2:08:31 --> 2:08:37
mcgulkin put it in so make it easy for people so sam good to see you driving very safely and well
1124
2:08:37 --> 2:08:[privacy contact redaction] all right gary the hawkins i sorry sam sam so scott does have a substack no why didn't i
1125
2:08:48 --> 2:08:55
find it when i searched for it well you must be saying some things that uh they don't want out
1126
2:08:55 --> 2:09:[privacy contact redaction] a look through because yeah that um some of some of that some of what i've got on
1127
2:09:02 --> 2:09:[privacy contact redaction]ack is some of the work that i've done with the various people that you've spoken to so
1128
2:09:07 --> 2:09:14
jessica rose and jonathan engler and so on um some of that work is definitely work that has seen me
1129
2:09:14 --> 2:09:22
get censored and um you know i've gone from publishing 12 or 15 academic papers in journal
1130
2:09:22 --> 2:09:[privacy contact redaction]ruggling to publish one or two in the last two years
1131
2:09:30 --> 2:09:37
yeah uh i didn't intend to say this but i'm like that too i write on substack uh mainly about
1132
2:09:37 --> 2:09:43
bears and the search engines don't seem to like to tell people about me it's at deep dots if
1133
2:09:43 --> 2:09:51
anybody's interested but uh theirs was my second other question uh did you want would you like to
1134
2:09:51 --> 2:09:59
talk about bears specifically so thanks well um yeah so right back sort of as they started the
1135
2:09:59 --> 2:10:06
vaccine rollout in december 2020 um i'd i'd been because there'd been lots of talk for example
1136
2:10:07 --> 2:10:[privacy contact redaction] system um and there'd been lots of talk about you know
1137
2:10:13 --> 2:10:[privacy contact redaction]ralia and new zealand's reporting system and i knew a little bit about new zealand's reporting
1138
2:10:17 --> 2:10:[privacy contact redaction]em and um how potentially bad it was um that got me interested and i was having discussions with
1139
2:10:26 --> 2:10:[privacy contact redaction]e in the group and in the end i decided okay you know we got through to a think
1140
2:10:32 --> 2:10:39
about the end of march 2021 and i'd been on a weekly basis i'd been going and looking at
1141
2:10:39 --> 2:10:[privacy contact redaction] you know doing a few quick searches to see what was in it and noticed um i think the
1142
2:10:45 --> 2:10:[privacy contact redaction] thing i noticed was the that as they rolled it out that initial vaccine rollout was
1143
2:10:51 --> 2:10:[privacy contact redaction]e and one of the things that i noticed visually sort of anecdotally
1144
2:10:59 --> 2:11:03
was i kept saying well you know i'm looking at it and a lot of these people are dying within sort of
1145
2:11:03 --> 2:11:12
you know two or three days of being jabbed and so um various members of the the research group
1146
2:11:12 --> 2:11:17
that i'm in um with the professors and so on sort of said well why don't you go and have a why don't
1147
2:11:17 --> 2:11:[privacy contact redaction] a look at it if it's if it's piqued your interest go and go and pull the data and play with
1148
2:11:21 --> 2:11:28
it so i pulled down um at the end of march as it rolled into april [privacy contact redaction]ete
1149
2:11:28 --> 2:11:37
dump of what was in vaers at that point um and then a group of us literally went through and because
1150
2:11:37 --> 2:11:43
the one the beauty of vaers as compared to yellow card in the uk and yellow card europe and the
1151
2:11:43 --> 2:11:[privacy contact redaction]ems you've got in australia and new zealand um by comparison are patently useless
1152
2:11:51 --> 2:11:57
and i say that because they don't you don't get to see any of the medical record information for
1153
2:11:57 --> 2:12:04
the patient you don't get to see um a lot of the there's no way for example in the yellow card
1154
2:12:04 --> 2:12:12
data that if i wanted to look up and see how many patients who had say a vaccine reaction had
1155
2:12:12 --> 2:12:20
particular types of comorbidities right all i can see is that you know 17 000 had diabetes and
1156
2:12:20 --> 2:12:26
21 000 had something else i can't actually go and look at individual patient level data and go
1157
2:12:26 --> 2:12:31
okay you know i can see this guy had diabetes and he had another auto-inflammatory disorder
1158
2:12:31 --> 2:12:37
this patient had you know something else this patient had a kidney transplant you can't see
1159
2:12:37 --> 2:12:[privacy contact redaction] of the data of the rest of the world in vaers it's great because
1160
2:12:41 --> 2:12:48
what we could do is we divided up between us so there was myself another another person who was
1161
2:12:48 --> 2:12:[privacy contact redaction]ered nurse who'd finished her training we had a midwife we had a gp
1162
2:12:56 --> 2:13:[privacy contact redaction] dubay who we what we did was we went through and we
1163
2:13:05 --> 2:13:[privacy contact redaction]ion of patients i think it was 250 patients who had been reported
1164
2:13:15 --> 2:13:22
as having died to vaers we pulled that out and we looked at all of the
1165
2:13:22 --> 2:13:25
clinical notes that were included in vaers and from some of these patients you know the clinical
1166
2:13:25 --> 2:13:31
notes would go you know two or three inches down the page in excel when you were looking at them
1167
2:13:31 --> 2:13:35
and so we went through and we graded them all we set up a system like a you know almost like doing
1168
2:13:35 --> 2:13:[privacy contact redaction]udy where if a if it said that a patient had a cardiac condition
1169
2:13:41 --> 2:13:[privacy contact redaction]iac for that patient it said the patient had diabetes we ticked the box
1170
2:13:46 --> 2:13:52
for diabetes but we went through and we built a data set of all of these patients and we knew
1171
2:13:52 --> 2:13:58
we knew therefore you know how many had different comorbidities how many how many had things like
1172
2:13:58 --> 2:14:[privacy contact redaction] you'd be surprised how many of them had negative pcr tests
1173
2:14:04 --> 2:14:11
right up to their death and yet their death for the whole [privacy contact redaction]
1174
2:14:11 --> 2:14:20
group and i noticed that the the cdc now have censored the column in that data set but for all
1175
2:14:20 --> 2:14:[privacy contact redaction]e covid was their primary cause of death and yet what we could see when you look
1176
2:14:31 --> 2:14:35
at the graphs in the paper and i've put there's a link in the email to the two vaers papers
1177
2:14:36 --> 2:14:42
when you look at the graphs what we could see from that data was half of these patients were
1178
2:14:42 --> 2:14:[privacy contact redaction] 72 hours of having been jabbed you had patients who were having clearly
1179
2:14:47 --> 2:14:[privacy contact redaction]ion and dying within sort of half an hour to four hours of being jabbed
1180
2:14:55 --> 2:15:00
who you know like there was a chap in it who literally got his jab wasn't feeling well went
1181
2:15:00 --> 2:15:07
home died at the dinner table and yet he was classed as a covid death nothing to do with the
1182
2:15:07 --> 2:15:12
vaccine you know nothing to see him move along go somewhere else that was what got me interested
1183
2:15:12 --> 2:15:19
and then what we did was a small group of us who worked on that first paper wrote a second paper
1184
2:15:19 --> 2:15:[privacy contact redaction]even included in the email a small group of us stepped down and we
1185
2:15:28 --> 2:15:35
wrote a machine learning text classifier to go through a second set so we did a thousand
1186
2:15:35 --> 2:15:41
12 months later we went back and we said okay let's pull down april 2022 we pull down a data
1187
2:15:41 --> 2:15:49
set and we wrote this text classifier literally to do what machine learning is you know probably
1188
2:15:49 --> 2:15:52
you know one of the two things machine learning is good for which is just go through and count
1189
2:15:52 --> 2:15:59
something right write this thing let it go through and count some of the things that we found that
1190
2:15:59 --> 2:16:[privacy contact redaction]ream media were telling you that vaers was all of the data that was going into
1191
2:16:05 --> 2:16:12
vaers was all you know hippie anti-vaxxers you know trying to make the vaccines look bad because you
1192
2:16:12 --> 2:16:19
know they were evil yet what we were finding was that up to like 72 percent of vaers reports
1193
2:16:20 --> 2:16:[privacy contact redaction]ually reports done by clinicians and they're you know the the practice managers of
1194
2:16:27 --> 2:16:[privacy contact redaction]aff at hospitals you could tell that because what you had in these vaers reports
1195
2:16:33 --> 2:16:39
was clinical language often you had them saying you know this person was my patient
1196
2:16:40 --> 2:16:[privacy contact redaction]e of the patient record right so the sort of hospital level data that
1197
2:16:49 --> 2:16:[privacy contact redaction]ruggle to get their hands up and so we were able thank you that's great
1198
2:16:55 --> 2:17:00
stuff now it's very detailed did you write about it on substack if not would you and then we need
1199
2:17:00 --> 2:17:[privacy contact redaction]even here yes i mean we're actually at the moment considering
1200
2:17:08 --> 2:17:[privacy contact redaction] review where we pull down data because it's it's now that time again
1201
2:17:16 --> 2:17:20
it's now just coming to april we're thinking of going back and doing a third review and because
1202
2:17:20 --> 2:17:26
we've got the text classifier this time we'll maybe do 5 000 reports so we did 250 then we
1203
2:17:26 --> 2:17:32
did a thousand maybe this time we do 5 000 and we let the computer at it and then you know find
1204
2:17:32 --> 2:17:38
work out what it finds out very good excellent question gary thanks scott okay steven last
1205
2:17:39 --> 2:17:[privacy contact redaction]ions um just one second steven you've got the stub back
1206
2:17:46 --> 2:17:[privacy contact redaction]even you so um so my job is to try and find um uh how should i say patterns
1207
2:17:57 --> 2:18:03
in what you've been saying so you know you're so into all this um that it's probably difficult
1208
2:18:03 --> 2:18:[privacy contact redaction]and back and and look at all the information and actually identify what it is
1209
2:18:09 --> 2:18:19
that's important but um i just wonder it's really important if if it is the case that they know ai
1210
2:18:20 --> 2:18:28
is rubbish but they're pushing it because they want to control us through data illegal data
1211
2:18:28 --> 2:18:[privacy contact redaction]ion they've made it legal um do you think that's a possibility or not because
1212
2:18:35 --> 2:18:41
it's very i i think that's more than a probability it's i think it's it's it's almost when you start
1213
2:18:41 --> 2:18:46
looking at the digital id laws and the online safety laws and so on and you you look at the
1214
2:18:46 --> 2:18:53
the content and the context um there is an agenda and it's working very quickly and in fact one of
1215
2:18:53 --> 2:19:[privacy contact redaction]ually i start all the way back with tiananmen square and the
1216
2:19:03 --> 2:19:10
trigger of the academics at tiananmen square who saw the internet as being this democratizing thing
1217
2:19:10 --> 2:19:16
where they could go and learn about how different cultures lived and you know talk they could talk
1218
2:19:16 --> 2:19:[privacy contact redaction] themselves about how bad it was to live under the the ccp communist regime and i bring it
1219
2:19:22 --> 2:19:[privacy contact redaction]rate this would be interesting to people like charles i demonstrate
1220
2:19:27 --> 2:19:36
the points along the touch points along the way where legislation sought to close down so legislation
1221
2:19:36 --> 2:19:[privacy contact redaction]s legislation then sought to compel decryption
1222
2:19:43 --> 2:19:[privacy contact redaction] through now to the point where we've got at least two countries in the world who've
1223
2:19:47 --> 2:19:54
passed legislation that requires backdoor access even to you know if you've got telegram in the
1224
2:19:54 --> 2:20:[privacy contact redaction]e of weeks so so scott i've only got a few minutes so i'm just trying to get the so
1225
2:20:01 --> 2:20:07
so the problem so it looks to me like it's like like a kind of limited hangout a possible limited
1226
2:20:07 --> 2:20:14
hangout that they're pushing ai which looks ridiculous you know to people like me uh but
1227
2:20:14 --> 2:20:18
actually they're not they don't they know it doesn't work it's a bit like the gain of function
1228
2:20:18 --> 2:20:23
narrative you know so i don't think that the the gain of function narratives they've forgotten the
1229
2:20:23 --> 2:20:[privacy contact redaction] they kill the host but it's important to say that they're doing
1230
2:20:29 --> 2:20:[privacy contact redaction]ion research to maintain the fear yeah do you understand and the similar thing with
1231
2:20:37 --> 2:20:43
lockdowns you know they knew that lockdowns were harmful to human human beings to isolate human
1232
2:20:43 --> 2:20:50
beings social highly social beings is extremely dangerous and also it was economically dangerous
1233
2:20:51 --> 2:20:57
so but they wanted to to use the lockdowns to convince the population to go along and to
1234
2:20:57 --> 2:21:03
increase the fear so they could psychologically torture people and they've done the same thing
1235
2:21:03 --> 2:21:09
with climate you the whole climate narrative is is the same it's it's all targeted to the same
1236
2:21:09 --> 2:21:16
thing the teenagers that we see coming into university have been so thoroughly
1237
2:21:16 --> 2:21:22
brainwashed into believing that the world's just about to melt so they use these limited hangouts
1238
2:21:22 --> 2:21:28
you know they they kind of uh use the gain of function you know it's highly illegal in
1239
2:21:28 --> 2:21:[privacy contact redaction]ates to use taxpayers money to do gain of function research but they they admit that
1240
2:21:34 --> 2:21:42
they're doing it in order to propagate the myth that pandemics can occur in the future because
1241
2:21:42 --> 2:21:[privacy contact redaction] trojan horse for totalitarianism for one world government and so
1242
2:21:49 --> 2:21:54
i think we've got to think about these things you know so you're you're really into computers
1243
2:21:55 --> 2:22:01
and so you're saying you understand this and i don't understand it but i just kind of thought
1244
2:22:01 --> 2:22:09
while you were talking something made me think that actually the the ai doesn't make sense and
1245
2:22:09 --> 2:22:14
never has made sense but that doesn't matter it's a means to an end it's a means to an end
1246
2:22:14 --> 2:22:22
it's a means to get the data convince people they need the data and then they're going to
1247
2:22:22 --> 2:22:29
control us with all that data think for a minute what is the most famous example of something that
1248
2:22:29 --> 2:22:[privacy contact redaction]ly garry kasparov and deep blue most people don't know and i
1249
2:22:39 --> 2:22:45
worked for ibm for a little while right most people don't know that is the best example of
1250
2:22:45 --> 2:22:[privacy contact redaction] of osu! ever seen there literally was a man in the machine there was a physical man a
1251
2:22:52 --> 2:22:58
chess champion in the server room feeding the information in and helping the computer system
1252
2:22:58 --> 2:23:04
to make the recommendations to win the chess game wow so that they could push the push the
1253
2:23:04 --> 2:23:12
the massive power of computers you know and get everybody in this cult admiring computers but you
1254
2:23:12 --> 2:23:17
know what we've had what 50 years of computers now and we're in a very very dangerous position so
1255
2:23:17 --> 2:23:22
maybe you ought to give up computers all together reject them you know very good steven very good
1256
2:23:22 --> 2:23:28
all right wait that's excellent comment maybe we do because steven you as you know you can buy
1257
2:23:28 --> 2:23:35
ferrari in wales it'll go [privacy contact redaction] at 350 kilometers an hour
1258
2:23:35 --> 2:23:[privacy contact redaction]even no but that wouldn't stop me doing well yeah it would when they crush your car
1259
2:23:44 --> 2:23:[privacy contact redaction]ion i want to ask one more question okay yeah this is also important
1260
2:23:50 --> 2:23:56
um so it seems to me that there's an awful lot of computer fraud going on which isn't helping
1261
2:23:57 --> 2:24:[privacy contact redaction]e particularly politicians are totally incapable of regulating
1262
2:24:02 --> 2:24:10
computers so they don't understand it sorry they don't understand it you you show me a politician
1263
2:24:10 --> 2:24:[privacy contact redaction]ands absolutely this all of this it and the digital
1264
2:24:15 --> 2:24:[privacy contact redaction]uff show me one i'll show you a liar so of course isn't that a very dangerous
1265
2:24:21 --> 2:24:29
situation for human beings you know to be so reliant on computers and actually we have no idea
1266
2:24:29 --> 2:24:34
how it's how much it's taking us away from our humanity but we can see it now with our own eyes
1267
2:24:34 --> 2:24:39
you've only got to look at a group of teenagers we're all looking at their mobile phones together
1268
2:24:39 --> 2:24:44
not talking to each other that's dangerous for human beings correctly it's dangerous that's a
1269
2:24:44 --> 2:24:[privacy contact redaction] to end there scotty my buddy here says hello one more question charles this is also
1270
2:24:53 --> 2:24:59
important do you think there is a possibility scott that no data collections are fit for purpose
1271
2:24:59 --> 2:25:06
given that the ons the office for national statistics that is in the uk have uh admitted
1272
2:25:06 --> 2:25:12
that their data are not fit for purpose that's their whole reason for being the office
1273
2:25:14 --> 2:25:19
yeah and then you've got you've got the fact don't forget now um norman fenton and martin neal
1274
2:25:19 --> 2:25:[privacy contact redaction] published a we released the paper as a pre-print it's it's in peer review at the
1275
2:25:24 --> 2:25:[privacy contact redaction]ematic the first ever systematic review from pubmed pulling down all of
1276
2:25:31 --> 2:25:[privacy contact redaction]rated that the data that was put together for the to say
1277
2:25:38 --> 2:25:46
that the covid vaccines were safe and effective was all stacked and rigged with miscategorizations
1278
2:25:46 --> 2:25:51
so that they could make it look even to the point professor fenton's done a video based on
1279
2:25:51 --> 2:26:[privacy contact redaction]bo the way that they manipulated the ons and and
1280
2:26:00 --> 2:26:[privacy contact redaction] manipulated the data you can make a placebo look like it saves
1281
2:26:06 --> 2:26:14
lives sure so you think there is a possibility that no data collections are fit for purpose is
1282
2:26:14 --> 2:26:[privacy contact redaction] government i think it's wholly possible yes and it's a construct i think it's
1283
2:26:19 --> 2:26:25
whether or not it's the data collection or whether it's the data publication you know
1284
2:26:26 --> 2:26:32
there's a huge difference between if i get the raw data that's being fed into the ons
1285
2:26:32 --> 2:26:37
i might be able to do something clever with it and you know work out what's going on
1286
2:26:37 --> 2:26:41
but it's the fact that when it hits the ons or when it hits the australian bureau of statistics
1287
2:26:41 --> 2:26:48
or those types of organizations the cdc the fda and so on when it hits them they do all of this
1288
2:26:48 --> 2:26:[privacy contact redaction]ical mumbo jumbo trickery and you know age standardized mortality rates but they use the age
1289
2:26:54 --> 2:27:[privacy contact redaction]ized mortality rate of you know a belgian farmer and don't laugh that's what the ons
1290
2:27:00 --> 2:27:07
were doing they were using the asmr from a small country in europe as their example to compare
1291
2:27:07 --> 2:27:[privacy contact redaction] densely populated when they do that sort of stuff it's not so
1292
2:27:13 --> 2:27:18
much the data that's being collected that's the problem it's the data that's then being made
1293
2:27:18 --> 2:27:24
available to convince the public of certain things that's being politically driven absolutely and
1294
2:27:24 --> 2:27:28
there's no way of checking up on the quality of the data because they don't show you
1295
2:27:28 --> 2:27:35
they don't show you what steak went into the mince maker yeah all right and then and
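[editor's note: the age-standardization complaint above can be made concrete. This is a minimal sketch of direct age standardization; every number in it is invented for illustration, not ONS or ABS data. It shows how the same age-specific death rates yield very different headline ASMRs depending on which reference population is chosen as the standard.]

```python
# Direct age standardization: weight each age band's death rate by the
# share of that band in a chosen reference ("standard") population.
# All figures below are illustrative placeholders.

def asmr(age_specific_rates, standard_population):
    """Age-standardized mortality rate per 100,000.

    age_specific_rates: deaths per 100,000 in each age band of the
        population under study.
    standard_population: person counts per age band in the chosen
        reference population (the weights).
    """
    total = sum(standard_population)
    return sum(r * w for r, w in zip(age_specific_rates, standard_population)) / total

# Deaths per 100,000 in three age bands (young, middle, old) -- invented.
rates = [50.0, 300.0, 2000.0]

# Two invented reference populations: one young-skewed, one old-skewed.
young_standard = [60_000, 30_000, 10_000]
old_standard = [20_000, 30_000, 50_000]

print(asmr(rates, young_standard))  # 320.0  -- low headline rate
print(asmr(rates, old_standard))    # 1100.0 -- same data, 3.4x higher
```

The same underlying mortality data produces a headline figure anywhere between those two extremes purely through the choice of standard population, which is the manipulation risk being described.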
1296
2:27:35 --> 2:27:[privacy contact redaction]n't even talked about the people actually researching in inverted
1297
2:27:40 --> 2:27:48
commas the data all right so we can all i hope everybody can see that um our worship of computers
1298
2:27:48 --> 2:27:[privacy contact redaction] as possible it is stephen thank you um everybody it's two
1299
2:27:55 --> 2:28:01
and a half hours scotty thank you good to have an aussie presenting to us in proper language
1300
2:28:01 --> 2:28:11
rather than this welsh lingo and um the one question i had scott for you
1301
2:28:12 --> 2:28:19
as an it expert which you clearly are as stephen points out what is your favorite browser we've
1302
2:28:19 --> 2:28:26
had a conversation in the chat about linux and different linux variants what do you
1303
2:28:26 --> 2:28:[privacy contact redaction]e who want to be a little bit not monitored well linux is an operating system
1304
2:28:35 --> 2:28:[privacy contact redaction]ill if you're using linux it's like using microsoft windows or
1305
2:28:40 --> 2:28:[privacy contact redaction]all some sort of browser over the top um i've ended up staying
1306
2:28:48 --> 2:28:53
away from any of the chrome variants because even the chrome variants that supposedly have been
1307
2:28:53 --> 2:28:[privacy contact redaction]n't don't believe that for a minute um the chrome variants of course even the
1308
2:28:59 --> 2:29:05
microsoft version of chrome you know the microsoft edge browser all you're doing is you're giving
1309
2:29:05 --> 2:29:11
google and microsoft information about what you're doing um admittedly at the moment i'm
1310
2:29:11 --> 2:29:17
sticking with a version of firefox that i use there's a privacy version of firefox
1311
2:29:17 --> 2:29:[privacy contact redaction]all and then within that privacy version of firefox i'm running um various tools
1312
2:29:24 --> 2:29:[privacy contact redaction]us and a couple of other tools i mean my favorite tool on
1313
2:29:31 --> 2:29:37
my mac at the moment um is a tool called little snitch and little snitch not only helps you to
1314
2:29:37 --> 2:29:44
firewall a lot of the traffic that's being sucked out of your machine um but it also shows you all
1315
2:29:44 --> 2:29:[privacy contact redaction]e inbuilt stuff that's trying to you know keylog you um the
1316
2:29:52 --> 2:29:58
only shame is that the two chaps who developed little snitch haven't made a windows
1317
2:29:58 --> 2:30:05
version so for the windows version i use a pi-hole that has a list of all of the microsoft
1318
2:30:06 --> 2:30:12
domains where microsoft like to ship data from your computer most people don't realize you
1319
2:30:12 --> 2:30:[privacy contact redaction]all windows 10 or windows 11 the microsoft keylogger is enabled by default so that means
1320
2:30:19 --> 2:30:27
every letter you type on that computer is by default packaged up and sent to a microsoft data center
1321
2:30:27 --> 2:30:33
and you're giving it away for free thank you there's very very good projects on
1322
2:30:33 --> 2:30:41
gitlab for securing windows 10 and windows 11 thank you charles charles there's someone on the
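[editor's note: the pi-hole approach described above amounts to DNS sinkholing known telemetry hostnames. This is a minimal sketch of the equivalent hosts-file technique; the domain names are placeholders, not a verified list of microsoft telemetry endpoints.]

```python
# Sinkhole a list of hostnames by mapping them to 0.0.0.0 in hosts-file
# format, so the operating system never resolves their real addresses.
# A Pi-hole does the same job at the DNS-server level for a whole network.
# The domains below are placeholders for illustration only.

PLACEHOLDER_TELEMETRY_DOMAINS = [
    "telemetry.example.com",
    "diagnostics.example.net",
]

def hosts_block_entries(domains):
    """Return hosts-file lines that sinkhole the given domains."""
    return ["0.0.0.0 " + d for d in domains]

for line in hosts_block_entries(PLACEHOLDER_TELEMETRY_DOMAINS):
    print(line)
# On Linux/macOS these lines would be appended to /etc/hosts;
# on Windows to C:\Windows\System32\drivers\etc\hosts.
```

Note this only blocks resolution on the machine (or network) where the entries live; traffic that bypasses system DNS, such as applications with hard-coded IP addresses, is not affected, which is why tools like little snitch that filter at the connection level are mentioned as a complement.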
1323
2:30:41 --> 2:30:46
call who doesn't understand the importance of limited hangout so the best way i can describe
1324
2:30:46 --> 2:30:54
it is that um so they concede one lie to hide a much bigger and more important lie
1325
2:30:54 --> 2:31:[privacy contact redaction]even problem reaction solution so they create the problem
1326
2:31:01 --> 2:31:[privacy contact redaction] us well that's not quite the same but anyway similar yeah they concede a lie
1327
2:31:07 --> 2:31:[privacy contact redaction] you from the main lie yeah they want to yeah so the whole time you were paying
1328
2:31:14 --> 2:31:19
attention to covid and you were in lockdown what you didn't realize was at least six countries of
1329
2:31:19 --> 2:31:25
the world were developing online safety law digital id law and they were developing in the
1330
2:31:25 --> 2:31:[privacy contact redaction]uff that all of this you know net zero climate stuff to the point where
1331
2:31:31 --> 2:31:36
you know if you talked in [privacy contact redaction] that they were going
1332
2:31:36 --> 2:31:41
to the covid lockdown was going to become a climate lockdown everybody thought he was completely
1333
2:31:41 --> 2:31:[privacy contact redaction] council now and the fact that they're starting their process whereby you're
1334
2:31:46 --> 2:31:[privacy contact redaction] your car out of oxford if you live there a hundred times a year and tell me
1335
2:31:51 --> 2:31:57
that's not the climate lockdown we're across [privacy contact redaction]enty of people here
1336
2:31:57 --> 2:32:[privacy contact redaction] that but charles i didn't know that you couldn't drive your car more than a
1337
2:32:01 --> 2:32:[privacy contact redaction]ed times uh that's the game stephen that's the game plan of the 15 minute city
1338
2:32:06 --> 2:32:11
[privacy contact redaction]ed times in a year i did absolutely yeah yeah yeah
1339
2:32:12 --> 2:32:18
okay well evil evil and then watch as they drop that to 50 and then 30 and then 20
1340
2:32:19 --> 2:32:23
charles i didn't know the detail so if i didn't know it there are lots of other people who don't
1341
2:32:23 --> 2:32:27
know it that's correct but you knew it maybe yeah well that's the value of this group we
1342
2:32:27 --> 2:32:[privacy contact redaction]uff from each other all right but we're running out of time thank you scott thank you
1343
2:32:31 --> 2:32:37
stephen thanks everybody tom rodman video telegram meeting scott you're most welcome the link is in
1344
2:32:37 --> 2:32:42
the chat if you've got time there are people there's [privacy contact redaction]e from this group that
1345
2:32:42 --> 2:32:[privacy contact redaction] got that time but we'll stop this recording here and you can go to that if you
1346
2:32:47 --> 2:32:53
wish thank you very much for this and we look forward to having you back to explore further
1347
2:32:54 --> 2:33:00
and further matters because we've only gone a little way deep thanks everybody thanks charles
1348
2:33:01 --> 2:33:05
scott if you'd like to work with me on some of the things we've talked about today i'd be very
1349
2:33:05 --> 2:33:[privacy contact redaction]op me an email or three yeah i'll try to remember and everyone
1350
2:33:10 --> 2:33:17
reads scott's substacks scott just say it aloud because it was law and tech wasn't it
1351
2:33:17 --> 2:33:[privacy contact redaction] law health a-n-d tech at substack dot com beautiful
1352
2:33:24 --> 2:33:27
thank you thanks everybody cheers bye