Episode: 1029
Title: HPR1029: Karen Sandler on Medical Devices: OGG Camp Part Two
Source: https://hub.hackerpublicradio.org/ccdn.php?filename=/eps/hpr1029/hpr1029.mp3
Transcribed: 2025-10-17 17:39:59
---
The Full Circle Podcast on Hacker Public Radio. In this episode: OggCamp 11, Karen Sandler
on medical devices.
Hello World, and welcome to the Full Circle Podcast on Hacker Public Radio.
This is the second of our highlights of OggCamp 11, last summer's unconference, held
at Farnham Maltings in the south of England.
The Full Circle Podcast is the companion to Full Circle Magazine, the independent magazine
for the Ubuntu community.
Find us at fullcirclemagazine.org forward slash podcast.
Next, the presentation from Karen Sandler, legal eagle, formerly of the Software Freedom
Law Center and newly appointed executive director of the GNOME Foundation.
Karen wasn't due to speak on the scheduled track, but stepped into an unexpected gap to
talk about something, dare I say, very close to her heart.
You know, talking about how all the things that we really like about open-source software
in business come from software freedom, that we have to tend to software freedom if we
expect to continue to have those freedoms.
So the first thing you need to know about me is that I'm your new cyborg overlord.
So the story about how I've gotten involved in these issues is quite a personal one.
I have a heart condition.
It still feels really weird to say that in front of a whole group of people.
I sort of feel like I'm at some support group, sort of like, hey everybody,
I have a heart condition. But I do.
I have a big heart.
I have a huge heart.
So my heart is actually three times the size of a normal person's heart.
It's called hypertrophic cardiomyopathy, and it means that my heart is very big.
But it's also very stiff.
And it's pretty clear it's not just that my heart is scaled up; the walls are much thicker.
So the problem with it is that, in certain circumstances,
my heart might start to beat really hard and then become ineffective.
And the technical term for that is sudden death.
I have a very high risk of suddenly dying.
And that's really the biggest part of my heart condition.
Other than that, I don't have any symptoms.
I'm not allowed to do things that I never really wanted to do anyway, like run a marathon
or go racing, and that's about it.
There is a chance that I could run across the street to try to catch the bus
and not make it across the street.
And there's a solution for it, which is to get a pacemaker-defibrillator implanted and become a cyborg.
When I was told that I needed a heart device, you know,
I think there were all sorts of normal kinds of reactions to that.
First, there was dealing with my own mortality, the weirdness of it,
the fact that, you know, this is who I am now.
But the longer I went on without getting it, the more I realized that I had to get it,
and that I could be a cyborg for good.
But there was another problem, one that I don't think anybody I know
who got a heart device before had thought about,
which is that when the cardiologist slid the device over to me,
an example device, this thing that was supposed to be so terrifying,
he could take out this little thing and say, look, it's really not scary at all.
It's just this little thing.
Doesn't it seem friendly?
You can have it in your body, and it will save your life.
And he kept sliding it over to me, and I said, well, what does it run?
There was a pause; nobody there had ever asked that question before.
He said, well, what do you mean, what does it run?
And I said, well, it has software on it. That's how it operates.
I actually already was a lawyer at the Software Freedom Law Center.
So I was thinking about these issues.
But I also was an engineer and a programmer.
So I, you know, I said, surely you can tell me what kind of software it's running,
and how it runs, and can I look at it?
And he said, okay, well, there's actually a representative from the medical device
manufacturer here; we're going to pull him into the office and see if he can answer your questions.
And the rep said, what are you talking about?
And I said, there's software on this device,
and you're going to put the device in my body, connected to my heart.
Can I see the source code?
Can I see how it runs?
And the answer was, whoa, that's a big problem.
Yeah.
I'm going to go talk to the company about that.
And then nothing.
So I called all three of the major medical device manufacturers and asked them,
you know, can I see the source code for the device?
If I can, I'll go with your device.
You know, I might not be able to review it all myself.
I'm not sure.
It might take me a while.
I don't know what I can find.
But I'll ask my friends to look at it.
And nothing, nothing.
I offered to sign a non-disclosure agreement, saying, you know,
I just feel so uncomfortable having this thing put in my body
without being able to see it.
Why don't I sign a non-disclosure agreement?
Then I can look at it,
and I won't share it with anybody else, which, you know,
was quite a concession for me, being at the Software Freedom Law Center.
But I just felt so weird about having this thing that I was willing to do that.
And they said no.
So I kept putting it off, putting it off.
And my parents, you know, every time I didn't call my mother back for an hour,
she would call all my friends and say, have you heard from Karen? Is she still alive?
It's a really important issue, but, you know, I couldn't risk my life on it.
So in the end, I got the device.
And I decided that I was going to do all of this research.
I already worked at the Software Freedom Law Center
and was able to launch an initiative there on this very issue.
So I did a whole bunch of research.
And I found out some things that probably won't surprise you, like: software has bugs.
The Software Engineering Institute published a study estimating
that for every 100 lines of code, a bug is introduced, which is not very many lines of code.
So even if you catch most of the bugs, there are still going to be some bugs.
And then I looked at all of these studies about the software of medical devices in particular.
There's one study that systematically went through all the recalls and other kinds of publications
by the Food and Drug Administration in the United States, which approves these devices
and also puts out, you know, warnings and notices when something is wrong.
And these people studied all of the recalls.
Of all the recalls that they could determine were because of software,
which wasn't all of them, just the ones where they could see what the problem was,
a large percentage would have been detected if the company that had made the device
had done all-pairs testing.
And for the people who may not be technical, that's basically testing where you test
for multiple conditions happening at once.
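To make the all-pairs idea concrete, here is a minimal sketch in Python, not from the talk: it greedily builds a set of test cases that covers every two-way combination of values for a few hypothetical device parameters. The parameter names and values are made up purely for illustration.

from itertools import combinations, product

# Hypothetical device parameters, invented for this example.
parameters = {
    "mode":      ["monitor", "pace", "defibrillate"],
    "battery":   ["full", "low", "critical"],
    "telemetry": ["off", "on"],
}

def all_value_pairs(params):
    """Every two-way (parameter, value) combination that must be covered."""
    pairs = set()
    for p1, p2 in combinations(params, 2):
        for v1 in params[p1]:
            for v2 in params[p2]:
                pairs.add(((p1, v1), (p2, v2)))
    return pairs

def pairs_in(case, params):
    """The two-way combinations exercised by one concrete test case."""
    return {((p1, case[p1]), (p2, case[p2])) for p1, p2 in combinations(params, 2)}

def greedy_allpairs(params):
    """Pick full test cases until every two-way combination is covered.
    Not a minimal set, but usually far smaller than exhaustive testing."""
    remaining = all_value_pairs(params)
    names = list(params)
    tests = []
    for values in product(*params.values()):
        if not remaining:
            break
        case = dict(zip(names, values))
        newly_covered = pairs_in(case, params) & remaining
        if newly_covered:
            tests.append(case)
            remaining -= newly_covered
    return tests

for case in greedy_allpairs(parameters):
    print(case)

The point of the technique is that every pair of parameter values gets exercised together at least once, without having to run every possible full combination.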
Reading about the bugs in the software of these medical devices was really sickening,
because there are all these examples of medical devices failing.
We're all aware of the radiation therapy machines that had bugs in them.
There are insulin pumps where it was very unclear what field people were entering,
and some clinicians thought they were entering hourly dosages for insulin delivery
when they were actually entering minutes.
A bunch of people died.
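As a purely hypothetical illustration of how that units mix-up can be guarded against in code, the Python sketch below tags every dose-rate entry with an explicit time unit and rejects implausible rates. None of the names, values, or limits come from any real pump.

from dataclasses import dataclass

# Hypothetical illustration: tag every basal-rate entry with an explicit time unit
# so "units per hour" can never be silently interpreted as "units per minute".

@dataclass(frozen=True)
class BasalRate:
    value: float   # insulin units
    per: str       # "hour" or "minute"

    def units_per_hour(self) -> float:
        if self.per == "hour":
            return self.value
        if self.per == "minute":
            return self.value * 60.0
        raise ValueError(f"unknown time unit: {self.per!r}")

MAX_SAFE_UNITS_PER_HOUR = 5.0   # made-up safety bound for the example

def program_basal(rate: BasalRate) -> float:
    """Convert to the internal unit and reject clearly implausible rates."""
    uph = rate.units_per_hour()
    if uph > MAX_SAFE_UNITS_PER_HOUR:
        raise ValueError(f"{uph} U/h exceeds the configured safety limit")
    return uph

print(program_basal(BasalRate(1.0, "hour")))    # accepted: 1.0 U/h
try:
    program_basal(BasalRate(1.0, "minute"))     # 60 U/h, rejected by the sanity check
except ValueError as err:
    print("rejected:", err)

The design choice is simply that a bare number is never accepted as a rate; the unit travels with the value, and a sanity bound catches entries that are off by an order of magnitude.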
It really was getting to me.
And every time I would find one of these major medical device failures,
it really made me physically ill.
And I kept thinking, you know, what are the problems in my device?
So I would put it away and not work on it for a while.
It took me over a year to work through all of this stuff.
So I looked into what the Food and Drug Administration does.
But what I really found out is what it doesn't do.
The Food and Drug Administration typically does not review the source code of these devices.
I couldn't believe it.
They don't generally do so unless they already think there's some problem.
Because of that, they actually don't even request the software in most instances.
What they do is ask the medical device manufacturers to come up with their own reports
about what the software does and how it operates, and to certify that they have tested it.
But there's no clear set of guidelines as to what those companies have to do.
And the idea behind that is, you know, pretty clear.
The FDA basically doesn't want to be in a situation where they're saying,
you know, your requirements are A, B, and C,
when this new device really should have been tested for D,
but because it's a new device, the FDA didn't know that.
So they say, the people who know these devices best are the companies that are designing them,
and therefore they know what kind of tests they should do.
But basically, what this means is that because the FDA is not looking at the source code,
and not asking for it, and there aren't any clear guidelines or clear requirements,
just some general guidance on what should be tested,
there's a lot of power being left with these companies.
And because the FDA is not asking for the source code, they're never getting the source code,
which means that there's no copy and there's no repository for the public.
So if there's a catastrophic failure at Medtronic, for example,
you know, there may not even be a copy of the software in my own heart device anywhere.
And my heart device is only as good as how it works right now.
So, you know, you can talk to these heart devices remotely, and you can update them.
But if I don't have the source code, I can't patch it.
And if there's no one at Medtronic to help me, then I'm just out of luck.
I talked to the head of cybersecurity at the FDA, and he said,
I just want to say that people get really confused about what the FDA does.
We are not here to review every part of every device.
We just couldn't do it.
We don't have the resources.
We can't really review all of the software.
And I thought, you know, this was the time for me to tell him and say,
well, actually, that's not what I'm asking for.
What I'm asking for is for you to require the source code to be published so that everyone can review it.
You know, this way, the FDA can serve the public.
And he said, huh, you know, I hadn't actually thought of that; that makes a lot of sense.
It seemed like nobody had ever asked him for that before.
Right.
One of the things that was also astounding to me as a lawyer looking into this is that in the United States,
because of the way that the FDA approval process works, it's a federal approval process,
patients are actually preempted from suing under state law.
So I can't just go into my state court and sue over my medical device, because it was approved at the federal level.
So there's this whole set of remedies that's off the table; I couldn't even go to court and take these medical device companies to task,
because the device was properly approved by the FDA.
Which, you know, takes it a step further from what Simon was saying
about how software freedom is critical for business.
But this is my body.
You're going to put something in my body.
You're going to put in a device that is literally screwed into my heart.
And you won't let me see it.
It's just madness.
So we've got the worst of both worlds.
We have no encryption.
These devices can be accessed wirelessly, and researchers have shown that implanted defibrillators can be attacked that way.
Recently, someone was able to do the same thing with the implanted insulin pumps.
Those are, like, the two biggest classes of implanted medical devices:
the insulin pumps, and the pacemakers and defibrillators.
And there are also, like, a bunch of other implanted devices, including implanted neurostimulation therapy.
And then also, somebody wrote to me saying that they're considering getting implanted hearing aids,
hearing implants that get implanted into the skull,
and they wanted to talk to me about, you know, the software freedom implications of this.
And what's interesting to me is that there could be privacy implications with hearing aids too,
because as they talk to each other,
there's an opportunity to intercept that and actually be able to spy on people and capture their whole conversation.
So, like, so many things.
So we've got no security on these devices.
And the reason they give for that is basically to preserve battery power.
But there's been a whole slew of proposals that would not run down the battery,
and they still don't implement them.
And on the other side, we've got closed code.
So nobody can review the code to make sure that it's safe,
and there's no requirement for privacy or to prevent people from tampering with them.
All this led me to the conclusion, and admittedly I was biased before as an employee of the Software Freedom Law Center,
that these devices need to be free and open.
It's so easy to see.
You know, if it's free and open,
anyone can look at the system and find out if it has problems.
They can assess the risks.
If there is a problem, anybody can propose a patch.
In the case of medical device software, you can see that being extremely important.
I guess in some ways it sounds counterintuitive,
but it goes the opposite way of what you might expect:
make our software open, but encrypted,
so that I can choose what patches go onto my software,
the software that's connected to my heart.
And the most important thing about this is that I don't want all of this to hinge on one company.
Like, we've seen so many companies come and go.
We've seen their failures.
We've seen the way that basic things that we expect companies to do don't get done.
So I don't want to depend on a single party,
a single company, for the rest of my life.
I don't want to have to rely on a single company.
And so it's easy to go from there to say,
our software must be safe.
You know, and it's not just medical devices.
It's anything we rely on for our life and for our society.
So our cars: we had a problem with one of the car makers, and I think there were
different examples in some countries, where the software in the
cars was suspected of malfunctioning before crashes, which is public.
Our voting machines: we really need to be able to trust our voting machines,
because that's how we choose to govern our society. Our financial markets.
And again, our medical devices.
But what's amazing to me right now is that the line around what is life and society critical is just completely
blurring.
The way that we do our computing has changed.
Like, five years ago, my mother did not use computers for almost anything.
She was terrified of them.
She didn't want to do anything with them.
It was completely not in her world.
But in the last couple of years, she's really taken to it.
And she now uses a computer for everything.
She books her travel online.
And, you know, that's true across the board.
Now, everybody is using software for everything.
And that means, you know, all of our grandparents are using software.
Everybody is using software for everything.
We're using it to see when our buses are coming.
We're using it to keep track of, you know, what restaurants we should go to,
and where our friends are.
We're using it to log what we have eaten, you know,
to keep track of our diet, and to keep track of our exercise.
Now, there is already software for phones that can interact with those insulin pumps,
so that you can correlate what you keep track of, your diet
and exercise, with your insulin delivery or something.
So, this idea of what is life and society critical has completely blurred.
And when people use computers, they expect them to be easy to use,
which goes back to something I was saying before.
Even typing in a command to run a projector was too much effort,
and that was for Simon, who is technically minded.
You know, my mother would never be able to do that.
But my mother expects to be able to use computers,
and she expects to be able to use them for everything.
So our UI expectations have completely changed,
which brings me to the desktop.
So, about a month ago, I switched over from the Software Freedom Law Center
to become executive director of the GNOME Foundation.
And, you know, GNOME 3 has just been released.
It's so pretty.
And I don't know if you guys have used it yet,
but you should give it a try.
It's still early days, and there's been a lot of criticism about it,
but there's been a lot of really positive feedback too,
especially from people like my mother,
who are really reticent with computers.
This is basically a complete redesign around, you know,
user experience.
Basically, they went about trying to make a better experience
for ordinary people, not necessarily hackers,
which I think is, you know,
it's sleek and it's clean, and it's easy to use.
And it dispenses with a lot of the things that we used to think
people wanted in order to use their computers,
but we've seen from actual usage and from other kinds of computing devices
that this isn't really what people want or need.
I would just say, and here I agree with Simon,
that we need to trend towards freedom.
I keep coming back to this because, you know,
it really is the crux of the thing, right?
And we have no idea what our life and society critical
functionality is going to be.
We don't know what we're going to rely on.
I mean, who thought that we would rely on our phones to interact
with, you know, our insulin pumps?
We would never have thought that.
And it's because it grows out of the way we use everyday computing.
We need to build on free open platforms as much as we can.
It's so important.
I sometimes joke and say, think of the children,
but really, think of the children.
I mean, think about what kind of a world we're creating.
And if we can all work together and focus on free open platforms,
we're going to be much safer ultimately.
And I think I am totally out of time.
The GNOME Foundation, and the Software Freedom Law Center where I still do pro bono work,
are both funded by the donations we raise,
so become a Friend of GNOME.
We really need your help.
Creating a great user interface takes a lot of effort and a lot of volunteers
and a lot of people.
And having financial support is really important.
Try out GNOME.
And this is all CC BY-SA,
so we should have freedom in content as well as in software.
Thank you very much.
Thank you very much.
So that was Karen Sandler of the GNOME Foundation on the subject of
closed-source medical devices.
OggCamp is a joint venture between those lovely podcasters from Linux Outlaws
and the Ubuntu UK Podcast.
With more highlights of OggCamp coming up on the Full Circle Podcast very soon,
including Andy Piper on MQTT and the OggCamp panel discussion.
I'm Robin Catling.
Thank you for listening.
And goodbye.
Thank you for listening to Hacker Public Radio.
HPR is sponsored by caro.net,
so head on over to C-A-R-O dot N-E-T for all of your hosting needs.
Thank you very much.