Initial commit: HPR Knowledge Base MCP Server
- MCP server with stdio transport for local use
- Search episodes, transcripts, hosts, and series
- 4,511 episodes with metadata and transcripts
- Data loader with in-memory JSON storage

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
hpr_transcripts/hpr0578.txt
Episode: 578
Title: HPR0578: Open Source Security Concepts
Source: https://hub.hackerpublicradio.org/ccdn.php?filename=/eps/hpr0578/hpr0578.mp3
Transcribed: 2025-10-07 23:27:15

---
So I'm here today to talk to you about open source and security, and the ultimate question of this is: do too many cooks spoil the broth? Does showing the source code to too many people damage the end product? My name is Nick Walker, aka Telos. I'm a second-year Ethical Hacking and Countermeasures student at Abertay. I'm vice president of the Linux Society; as of today I'm acting president, as our current president is stepping down to do other things, at least for the moment. I'm a full-time Arch Linux user. I've gone through various distributions: I started out with Gentoo, SUSE and Fedora, then moved on to aptitude-based distros like Debian and Ubuntu, which I used for a little while. I like a lightweight distribution, so I moved towards Arch, but it's still got a good package manager, like Debian has, as well. There are my contact details: you can email me there, and the same goes for my MSN. I'm always, always, always on IRC. No matter what, I'm always on my phone, and I'm on IRC. You think I'm joking? Yeah, always. You can also contact me under the same username, Telos, on the Remote Exploit forums and on Freenode.
All right, the aims for today. I want you to have an understanding of the concepts behind open source and closed source security, and of the two different ways of looking at security and software as a whole. I want you to be able to draw a comparison between the two, with a complete understanding of both. And I want you to understand that there's no one-size-fits-all solution to security; there's no single approach that solves all the issues. If there was, I'd be out of a job. Just because you paid for closed source, just because you paid good money, or even for open source, just because you paid money for it, that doesn't necessarily make it a better or more secure product than a free alternative.

I'm going to cover what security is, and the many eyes argument. That's what I mentioned earlier: does opening the source, showing it to more people, cause problems, or does it bring more benefits than damage? I'll also cover why keeping vulnerabilities secret doesn't make them disappear: hiding a vulnerability in the source doesn't necessarily mean it won't get found really easily, and I'll go into more detail on that later. And, aside from the inherent security of open source software, what are the open source communities out there actually doing for the security field as a whole? Then I'm going to summarize it all.
What is security? Security in the computer field can be summarized in five points: confidentiality, integrity, availability, accountability and assurance.

Confidentiality. All the data on the system has to be secured so that only those with the authority to view it can view it. Generally you handle that using accounts and privileges, setting restrictions on certain user accounts so they can only access certain files, et cetera. There's only one problem with this: there always has to be one user that can access everything. A file might be out of bounds to everybody but one guy, but it only takes one bug in the system, in the code, to be able to compromise that account and access the entire system.
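To make that accounts-and-privileges idea concrete, here is a minimal sketch, not from the talk, of confidentiality enforced with Unix file permissions; it assumes a POSIX system and the file name is made up for the example.

    #include <stdio.h>
    #include <sys/stat.h>

    int main(void)
    {
        const char *secret = "secret.txt";        /* hypothetical file name */
        FILE *f = fopen(secret, "w");
        if (!f) { perror("fopen"); return 1; }
        fputs("only the owner should read this\n", f);
        fclose(f);

        /* rw-------: the owning account may read and write; group and others get nothing */
        if (chmod(secret, S_IRUSR | S_IWUSR) != 0) { perror("chmod"); return 1; }

        struct stat st;
        if (stat(secret, &st) == 0)
            printf("mode: %o\n", st.st_mode & 0777);   /* prints 600 */
        return 0;
    }

Even locked down like that, a process running as root can still read the file, which is exactly the one all-powerful account the talk warns about.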
Integrity. Data in the system always has to be 100% complete. It's pointless having a piece of software that's been writing bits to a file reliably all the time, and then all of a sudden one day it writes a bad bit and you end up with errors in your data. Bad data integrity can happen in three different ways: malicious altering, where an attacker injects code or abuses the way functions are used in a program to make data get written to disk wrongly; accidental altering, such as a transmission error over the network or a hard drive crash; and programming errors, as I mentioned a minute ago, that result in inconsistencies in the data.

Availability. Data always needs to be available on request at any one time; otherwise, what's the point in having the system there? Availability is the degree to which a system is operable and in a committable state at any one point. So it needs to be up, and obviously the uptime needs to be near enough 100% if it's a mission-critical system.

Accountability. Accountability ties in with the user accounts and privileges. If something goes wrong in a system, you need to know why, or through what user error or program code it went wrong. The users of a network, through the rules of that network, need to be held accountable for what they do on it. They need to know that what they do on that network is either monitored or that they can be held accountable for causing damage to the network.

Assurance. The user of the system needs to be sure that reasonable measures have been taken to ensure both the security and the uptime of the system.
Right, the many eyes argument. One of the key issues with opening the source is the argument that exposing the source code to hackers makes it easier for them to find vulnerabilities in the software. People have different views on it. Some experts say it's better to open the source, some say it's better to close the source, and these are all top-end people who know what they're talking about, but there are two sides to the argument. One side says that the more people that look at and commit to the code, the more people there are who can hide malicious code or find pre-existing bugs, which may lead to them running their own code through the system as a hacker. The other side of the argument is that the more people can see the code, the easier it is to find the bugs in the system, and therefore you can stop an attack before it's even happened.

Open source practices: the way open source works is purely on a community basis. You have to write cleaner code so that it's legible for other users to update; by its nature, it has to be cleaner code in order to be usable by the community. Which makes it easier to spot bugs, and makes it easier to patch issues once a bug has been found, if all your code is clean and concise.
The majority of the arguments tend to lean towards open source having the potential, through the inherent design of cleaner code and more people being able to see it, to be more secure than its closed source counterparts. But simply being open source does not make it a secure product. It needs to be audited and reviewed; the code needs to be gone over, reviewed, patched, and constantly improved. You can't just write a piece of code for a function when you need it and then come back to it years later. You need to keep constantly reviewing the code, and the whole community needs to be involved in this.

Many people argue that closing the source makes it more difficult for hackers to compromise the system. That's wrong. Many people argue that a system with unpublished source code is more secure because there's less information for the attacker to work with. But attackers generally don't tend to work with source code, because most targets don't have the source code available. There are tools called fuzzers, automated processes that spam a program with data, or certain files, or different commands, to try and knock the program over. The second the program has been knocked over, you know there's a bug in the code, and you've got a possible entry point for an exploit. The argument is flawed because, although source code is very important when adding functionality to a program, like I say, attackers generally don't require it to find a vulnerability. It's much easier to destroy a car than to create one, for example: you can take a good car and a sledgehammer, beat the crap out of it, and you've got a broken car. You only need one way to break it, and it takes a lot more thought to put it together. Attackers have many advantages over the defenders, as programmers need to ensure that all their code is secure while the attacker only needs to find one entry hole, and then it's game over.

Attackers tend to lean towards generic security issues that most programs have. They don't go and look for a specific bug in the code; they lean towards generic bugs, which are problems in the way that programs have been put together, in the logic, rather than in the actual functions themselves.
Sorry, that's hard to explain. For example, buffer overflows and format string exploits: those are both problems in the C language. It's not an actual issue in the language itself; it's the way the language is used that tends to cause issues unless it's really kept on top of. All data input needs to be validated manually, otherwise you end up with these bugs.
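As a rough illustration of those two C pitfalls, here is a minimal sketch, not taken from the talk, with the unsafe calls left in comments next to the bounded, validated versions; the function name and buffer size are invented for the example.

    #include <stdio.h>
    #include <string.h>

    void greet(const char *name)
    {
        char buf[16];

        /* Buffer overflow: strcpy writes past buf if name is longer than 15 chars. */
        /* strcpy(buf, name); */

        /* Safer: bound the copy and let snprintf terminate the string. */
        snprintf(buf, sizeof buf, "%s", name);

        /* Format string bug: printf(buf) lets "%x" or "%n" in the input drive printf. */
        /* printf(buf); */

        /* Safer: the input is only ever data, never the format string. */
        printf("%s\n", buf);
    }

    int main(int argc, char **argv)
    {
        greet(argc > 1 ? argv[1] : "world");
        return 0;
    }

In both cases the language call is fine; it's the unvalidated, unbounded use of attacker-supplied input that creates the hole.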
Attackers use automated techniques to find these vulnerabilities, examining the errors that programs spit out after trying to exploit them. So, say you're using a fuzzer to try and knock over a program: you throw random stuff at it and see what it spits out. If it spits out something unexpected, then you know you've caused an issue in the program, and that's how attackers tend to go about finding an exploit, rather than going through the source code. In order to carry out the exploits, the attacker doesn't need access to the source code. The debug output from the program itself tells the attacker everything they need to know. The program falling over will, in essence, tell the hacker where he needs to push the right buttons in order to make the program do what he wants.
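A minimal sketch of that fuzzing loop, assuming a POSIX system; the target path ./victim, the iteration count and the input size are made up for illustration, and this is nowhere near a real fuzzer like the tools the talk alludes to.

    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <sys/types.h>
    #include <sys/wait.h>

    int main(int argc, char **argv)
    {
        const char *target = argc > 1 ? argv[1] : "./victim";   /* hypothetical target program */
        srand(1234);                                            /* fixed seed so runs repeat */

        for (int run = 0; run < 100; run++) {
            int fd[2];
            if (pipe(fd) != 0) { perror("pipe"); return 1; }

            pid_t pid = fork();
            if (pid == 0) {                       /* child: read the fuzz data on stdin */
                dup2(fd[0], STDIN_FILENO);
                close(fd[0]); close(fd[1]);
                execl(target, target, (char *)NULL);
                _exit(127);                       /* exec failed */
            }

            close(fd[0]);
            char junk[512];
            for (size_t i = 0; i < sizeof junk; i++)
                junk[i] = (char)(rand() & 0xff);  /* the "random stuff" */
            if (write(fd[1], junk, sizeof junk) < 0) perror("write");
            close(fd[1]);

            int status = 0;
            waitpid(pid, &status, 0);
            if (WIFSIGNALED(status))              /* a crash is the "something unexpected" */
                printf("run %d: target died with signal %d\n", run, WTERMSIG(status));
        }
        return 0;
    }

The point is in the last check: a crash, not the source code, is what tells the attacker there's a possible entry point worth digging into.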
Decompilers are brilliant as far as a hacker is concerned. Decompilers take a compiled piece of computer code. I don't know if you were in Rob's talk earlier, but Rob had a diagram where a programmer leads to some source code, which goes through a compiler and ends up as the end product on his slide, which is just a binary file. Decompilers can take that compiled file, without the source code, run it through, and give you something along the lines of the original source code. You'll never get the original source code out of it, but you can get a generic idea of what the source code would have looked like, if you understand my meaning. Essentially, even though the software was closed source, that gives the hacker the source code in essence. They can use that to find various vulnerabilities if they need to, but generally that doesn't tend to happen; it's more a case of knocking it over with fuzzers and various automated procedures.

So like I say, if you've closed the source, and someone trying to find a vulnerability in the piece of software just decompiles it, you're back to square one again; you might as well have not closed the source. And you've missed out on the opportunity of having people able to look at the source and find the bugs before they've already been exploited.
So why do developers need the source code in the first place, if anybody can decompile it? Well, it's quite simple for an attacker to locate a vulnerability within a program, but it's really difficult for a developer to update and improve the program from the decompiled source, as most of it ends up coming out jumbled. For example, say you had a function called, I don't know, goToURL in a web browser. Someone types in a URL, and the function that's run when you hit the enter key would be goToURL. If you were to run that through a decompiler, you wouldn't get the function name out; you'd end up getting something like F24B7. It would still be a function, and you'd see the contents of the function, but you don't really know what the function does unless you can follow the program logic. So it's obviously better to open the source and make it easy to adapt.
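A rough illustration of that point, not real decompiler output: the goToURL idea rendered as C, next to the kind of renamed, comment-free shape a decompiler tends to hand back. All of the names here are invented.

    #include <stdio.h>

    /* hypothetical helpers with meaningful names */
    static void resolve_host(const char *url)    { printf("resolving %s\n", url); }
    static void open_connection(const char *url) { printf("connecting to %s\n", url); }

    /* What the developer wrote: */
    void go_to_url(const char *url)
    {
        resolve_host(url);
        open_connection(url);
    }

    /* Roughly what a decompiler gives back: the logic survives,
     * but the meaningful names, comments and types are gone. */
    static void sub_401a60(const char *a1) { printf("resolving %s\n", a1); }
    static void sub_401b20(const char *a1) { printf("connecting to %s\n", a1); }

    void f_24b7(const char *a1)
    {
        sub_401a60(a1);
        sub_401b20(a1);
    }

    int main(void)
    {
        go_to_url("http://example.org");
        f_24b7("http://example.org");
        return 0;
    }

Both halves behave the same; what the decompiler loses is exactly the part a maintainer needs in order to update the program safely.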
Developers will say the source code is vital for updating and improving software. Hackers will say the source code is desirable, but not required: they can find a vulnerability by examining and debugging the compiled object. They don't need the source code to knock it over.
Why keeping vulnerabilities secret doesn't make them go away. An endeavour to close the source on a program is, in essence, an endeavour to hide the vulnerabilities from the hackers, and in certain cases from the general public. It's obviously bad publicity if there's a big bug going round because you've opened your source; if nobody knows the bug is there, then you've not caused yourself any damage. Some companies employ this idea across the board, and try to find the bugs themselves and patch them themselves. Hiding the vulnerabilities might be a temporary fix to the problem, but at the end of the day, even if nobody knows about the vulnerabilities, eventually someone's going to fuzz your program, and you're going to end up getting bitten in the backside, basically. Would you rather your vulnerability be found by the developers maintaining the program, because it's open source and loads of developers can look at it? Or would you rather take your chances and risk a possible attack? As I said a minute ago, it's the sad truth that companies don't tend to fix vulnerabilities unless they start to cause them issues as far as profits are concerned.

Another opinion is that discussion of vulnerabilities leads to exploitation of them on a larger scale. For example, recently there's been a buffer overflow attack that can blue-screen a Windows machine remotely with a ten-line Python script. If that had been kept under the hood, it wouldn't be used that much, but it's been announced and now it's being used all over. It sounds like a plausible argument, but vulnerabilities are already being found and discussed through a whole load of different channels: talk on IRC, forums, bulletin boards. Keeping quiet isn't going to stop anything; at the end of the day, you're still going to get hit by the bugs. That approach just leaves the defenders at a disadvantage, because nobody can do anything to prevent the attacks.
Aside from cleaner code, what is open source actually doing for security? You've heard me talk about the inherent security of open source programs; I'm going to talk to you now about what the open source communities are doing to help the security field as a whole. There are loads of different open source projects on the go. The Metasploit Framework is one of them; it's used for automated vulnerability testing. You can point it at any system, run a scan, and see whether the system is vulnerable to certain attacks. All the attacks are scripted and built in using modules. I think it's in Ruby... yes, it's Ruby. So that's used by hackers and ethical hackers alike to test the vulnerability of a network.
Nessus is a similar project, but it doesn't actually do the exploiting; it just does the scans. You can give it custom rules, and you can search for existing vulnerabilities that have been certified. You can write your own vulnerability rules for it, and you can use it, in essence, to test your network for security. It's not open source, though. Is it not? When did they close the source on it? It's been closed for a little while, but there was a division over the source. Well, if you close the source, people just fork it. Is there an open source fork of it yet? Yeah, there's a vulnerability scanner; it's called OpenVAS, I think. Right, and that's basically the fork from when Nessus closed the source.
They basically took that and made up their own version of it. I don't think they've got a big enough market yet; it's a niche market. There's also another project on the go, a brand new one; it's called water analysis. Yeah, that makes it even easier to run those scans, and automates the entire process with barely any work. It also means that if you use it all the time, you don't keep getting the same reports generated: it's okay, the first time you scan a network, to be told that you have a web server there, but you don't want to be told that on every weekly scan when you already know the server is there. So you can use it to tick that off. It's not that important. Yeah. Okay.
Snort is an intrusion detection system. Basically, it sits between the internet and your machine, sometimes on a completely different machine, sometimes on your machine as an application. It captures packets: it sniffs the network for all the traffic passing between the internet and your machine, and it scans those packets and can detect, through signature detection, that a certain packet is an evil packet, for example one that would cause damage to a certain piece of software, an intended attack. It scans for the intrusion.
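A minimal sketch of the signature-detection idea, not Snort's actual engine or rule language: scan a captured payload for a known byte pattern and raise an alert on a match. The signature and packet bytes are made up for the example.

    #include <stdio.h>
    #include <string.h>

    /* Return 1 if needle occurs anywhere in the first hay_len bytes of hay. */
    static int matches_signature(const unsigned char *hay, size_t hay_len,
                                 const unsigned char *needle, size_t needle_len)
    {
        if (needle_len == 0 || needle_len > hay_len)
            return 0;
        for (size_t i = 0; i + needle_len <= hay_len; i++)
            if (memcmp(hay + i, needle, needle_len) == 0)
                return 1;
        return 0;
    }

    int main(void)
    {
        const unsigned char signature[] = "\x90\x90\x90\x90";           /* e.g. a NOP sled fragment */
        const unsigned char packet[]    = "GET /\x90\x90\x90\x90\xcc HTTP/1.0";

        if (matches_signature(packet, sizeof packet - 1,
                              signature, sizeof signature - 1))
            printf("alert: payload matched a known attack signature\n");
        return 0;
    }

Real rules also match on protocol, ports and offsets, but the core of signature detection is this kind of byte-pattern comparison against captured traffic.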
Most hacking tools are open source, such as Nmap; all of the really heavily used ones are. You don't really get many commercial hacking tools. What? There are a few of them. There's currently a commercial version of Metasploit; it's not from the same organization, but it kind of bundles in the Metasploit Framework, so it's a combination of the two, with the open source tool included. It's very, very expensive. How expensive, do you imagine? Very. Right.
BackTrack is a very widely used Linux distribution for ethical hackers. Its sole purpose is testing the security of a network, and it's completely open source.

My summary. Do too many cooks spoil the broth when it comes to security? Not really, as hackers at the end of the day can always reverse engineer the source code anyway. It doesn't tend to happen that way, but it's a possible way to go. It's better to have more people look at the source and audit it for bugs. Open source has the potential to be inherently more secure by design, but it obviously relies on the diligence of the people leading the project. You need to review your code and check your patches; don't just write a function and leave it for three years. Come back to it and keep checking your code. Excuse me. Should code be locked down and hidden in order to hide vulnerabilities? No. It's much easier for a hacker to find an undisclosed vulnerability than it is for a programmer to patch it, and that leaves quite a big hole. Do the open source organizations help security as a whole? Yeah. The large majority of security-based projects are open source or have open source elements built into them. Any questions?
Of course, I'm sure you remember the recent Debian exploit, the entropy problem? Was that with SSH? Yes. Was that found by nasty hackers? That was at DEFCON, was it not? Yeah, they got owned. Last year at DEFCON they basically used that SSH vulnerability. The vulnerability had been highlighted in the code, so if you'd got the patched packages from that time you were fine, but you could still have the weak keys in play. Yeah, and they attacked them. So what happened is, with a load of the packages they were talking about, obviously they'd had their packets sniffed at some point, and the attackers managed to get their SSH passwords. I remember them saying it basically took out all of the big names on that side at that point. But yeah, it's the brand and the image that takes the hit. Anybody else?
Thank you for listening to Hacker Public Radio. HPR is sponsored by Caro.net, so head on over to C-A-R-O dot N-E-T for all of your hosting needs. Thank you.