Episode: 1329
Title: HPR1329: TGTM Newscast for 2013-13-08
Source: https://hub.hackerpublicradio.org/ccdn.php?filename=/eps/hpr1329/hpr1329.mp3
Transcribed: 2025-10-17 23:38:23
---
Let's go. You are listening to TGTM News No. 101, recorded for Tuesday, August 13, 2013.
You're listening to the tech-only Hacker Public Radio edition. To get the full podcast,
including political commentary and other controversial topics,
please visit www.talkgeektome.us.
Here are the vital statistics for this program.
Your feedback matters to me.
Please send your comments to dg@deepgeek.us.
The web page for this program is at www.talkgeektome.us.
You can subscribe to me on Identica, as the username DeepGeek, or you could follow me on Twitter.
My username there is dgtgtm, as in DeepGeek Talk Geek to me.
Hi, this is Dann Washko, and now the tech roundup.
Lavabit, email service Snowden allegedly used, shuts down.
This is an open letter from the owner of Lavabit.com, a secure email service that Snowden
allegedly used, that was published on their site.
My fellow users, I have been forced to make a difficult decision: to become complicit
in crimes against the American people, or walk away from nearly 10 years of hard work
by shutting down Lavabit.
After significant soul searching, I've decided to suspend operations.
I wish that I could legally share with you the events that led to my decision.
I cannot.
I feel you deserve to know what's going on.
The first amendment is supposed to guarantee me the freedom to speak out in situations
like this.
Unfortunately, Congress has passed laws that say otherwise.
As things currently stand, I cannot share my experience over the last six weeks, even
though I have twice made the appropriate requests.
What's going to happen now?
We've already started preparing the paperwork needed to continue to fight for the Constitution
in the Fourth Circuit Court of Appeals.
A favorable decision would allow me to resurrect Lavabit as an American company.
This experience has taught me one very important lesson.
Without congressional action or strong judicial precedent, I would strongly recommend against
anyone trusting their private data to a company with physical ties to the United States.
The letter is signed Ladar Levison, owner and operator, Lavabit LLC.
Five-dimensional glass memory can store 360 terabytes per disc, rugged enough to outlive
the human race.
This recording is from the command line podcast by Thomas Gideon about a new storage technology.
ExtremeTech was one of several outlets covering research on a new, five-dimensional storage
medium.
I was skeptical at first, as was the author, Sebastian Anthony, who encouraged the reader
to read past what at first seems like pseudoscience.
The first three dimensions under discussion sound similar to other optical storage media.
This one is based right now on plain old silica glass, so not too different from the plastics
involved with the optical disks with which we're familiar, and also not too dissimilar
from what I understand IBM has been looking into for the last couple of decades in
the form of holographic storage in 3D optical lattices.
Even multiple-layer versions of current optical discs, and that's actually not uncommon with
Blu-ray in particular, are technically three dimensions: two for the placement of the
bit, along a circular track and then from the inside to the outside of the disc, as either
a dot on a dye-based, re-writeable or consumer-writeable disc, or a pit in the surface
for the ones that are commercially produced.
The third, albeit small in these multi-layer discs, is the dimension for the particular
layer within the disc itself, from the top to the bottom layer.
In this scheme, the extra two dimensions for this new approach arise from manipulating
refraction and polarization, so not technically physical dimensions, but definitely dimensions
in the sense of mapping out a space of possibilities.
With very fine placement on the Z-axis, it sounds like the traditional third dimension
here also is a bit more dense than the discs with which we're familiar, more than just layering;
we're talking about nanometers, if not smaller, within the density of the disc.
Those two additional dimensions are apparently made possible through careful manipulation
of the nanostructure of the glass itself.
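To make the five dimensions concrete, here is a toy sketch in Python; the level counts and field names are my own illustrative assumptions, not figures from the Southampton paper, but they show how two extra optical dimensions multiply the number of distinguishable states per physical spot:

    from dataclasses import dataclass

    # Hypothetical level counts, chosen only to illustrate how a handful of
    # distinguishable optical states per spot yields whole bits.
    ORIENTATION_LEVELS = 4  # distinguishable slow-axis (polarization) angles
    RETARDANCE_LEVELS = 2   # distinguishable retardance (refraction) strengths

    @dataclass(frozen=True)
    class Voxel:
        x: int            # position along the track (dimension 1)
        y: int            # position across the track (dimension 2)
        z: int            # layer depth within the glass (dimension 3)
        orientation: int  # nanograting slow-axis level (dimension 4)
        retardance: int   # nanograting retardance level (dimension 5)

    spot = Voxel(x=12, y=34, z=2, orientation=1, retardance=0)
    states_per_spot = ORIENTATION_LEVELS * RETARDANCE_LEVELS
    bits_per_spot = states_per_spot.bit_length() - 1  # log2 for powers of two
    print(f"{states_per_spot} optical states per spot -> {bits_per_spot} bits")

Under these assumed numbers, each physical spot carries three bits, which is how density can rise without shrinking the spots themselves.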
It made me think a bit about the diffusion glass story that I shared a while back where
that same structure was used as a valuable source of randomness, one that could be altered
with a slight application of heat, for keys in crypto systems, in a durable yet very secure
way.
You might remember that as a discussion about a novel approach, a novel physical approach
to the idea of a one-time pad.
The difference here is that the glass involved for this storage purpose is apparently super
stable, maybe not as susceptible to the fine-grained, minor alterations from applying
a little bit of heat that the crypto application relied on, so there might be different kinds of glass.
The researchers, from the University of Southampton in the UK, note that the necessary structures
remain stable up to a thousand degrees Celsius, so that suggests that the silica here is
a bit different from the diffusion glass in the other story.
They also didn't observe any noticeable degradation of the medium over time, although they clearly
have not been doing this research for very long, so it seems like they may be jumping
to a bit of a conclusion here, although maybe a valuable one we'll see, of this particular
medium potentially lasting for millennia.
Similar claims were probably made, though, in hindsight, for CDs and DVDs when they
were first developed, until specific studies done over longer periods, multiple
years' worth of time, looked at the actual effects of degradation and suggested
disc lifetimes of decades, if not hundreds of years, centuries, let alone millennia.
Regardless, the actual provable benefit is the storage density, some 3,000 times that of
a Blu-ray disc, at a whopping 360 terabytes worth of storage.
That's even 18 times the capacity of the latest generation of new hard drive technology
mentioned in the article, which has not seen commercial release yet.
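As a quick sanity check on those multipliers, this short sketch works backward to the baselines they imply; the baseline capacities below are inferred, not stated in the article:

    # Work backward from the claimed multipliers to the implied baselines.
    GLASS_DISC_TB = 360  # claimed capacity of one 5D glass disc

    implied_blu_ray_gb = GLASS_DISC_TB * 1000 / 3000  # "3,000 times Blu-ray"
    implied_hdd_tb = GLASS_DISC_TB / 18               # "18 times the new hard drive tech"

    print(f"Implied Blu-ray baseline: {implied_blu_ray_gb:.0f} GB")  # 120 GB
    print(f"Implied hard drive baseline: {implied_hdd_tb:.0f} TB")   # 20 TB

The 120 GB figure suggests the comparison is against a quad-layer BDXL disc rather than a common 25 GB consumer disc, though that reading is my own inference.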
The limiter, as is usually the case with these sorts of research driven efforts, is the
size of the machinery needed to use this glass.
Right now data is recorded with a femtosecond-pulse laser, not really the resolution
that we see in the optical drives we already have in our desktops and servers, and read
with a high-quality microscope, which is definitely not the way the optical discs we're
used to dealing with are read back.
Anthony speculates, I think correctly, that the first applications are likely archival.
This conclusion in particular reminded me of the graphene paper that I talked about a
few weeks ago, though here, the article is clear, this is for digital storage and if
you're curious, there's a really good description of actually how that information is directly
encoded into this glass.
Anthony does provide, as I said, a good amount of clear detail about how this works, leveraging
those very physical attributes, the ones familiar and unfamiliar, which are used to both store and
retrieve three bits for, I think, every two coordinates in that five-dimensional space.
It's a little fuzzy on that point, but it's definitely worth a read; I think it does a far better
treatment than I'm doing justice to here, regardless.
The Southampton researchers are definitely looking at commercialization next.
The article notes that they are already looking for industrial partners to help.
From TechDirt.com by Glyn Moody, dated August 7, 2013: US government war on hackers backfires.
Now top hackers won't work with the US government.
From the what-did-they-expect department.
TechDirt has noticed the increased demonization of hackers, not to be confused with crackers,
who break into systems for criminal purposes.
For example, by trying to add an extra layer of punishment to other crimes if they were
done on a computer. High-profile victims of this approach include Bradley Manning, Aaron
Swartz, Jeremy Hammond, Barrett Brown, and of course Edward Snowden.
But as this Reuters story reports, that crass attempt to intimidate an entire community,
in case anyone there might use computers to embarrass the US government or reveal its
wrongdoings, is now starting to backfire.
From the article, the US government's efforts to recruit talented hackers could suffer
from the recent revelations about its vast domestic surveillance programs as many private
researchers expressed disillusionment with the National Security Agency.
Though hackers tend to be anti-establishment by nature, the NSA and other intelligence
agencies had made major inroads in recent years in hiring some of the best and brightest
and paying for information on software flaws that helped them gain access to target computers
and phones.
Much of that goodwill has been erased after the NSA's classified programs to monitor phone
records and internet activity were exposed by former NSA contractor Edward Snowden, according
to prominent hackers and cyber experts.
The article goes on.
Closest to home for many hackers are the government's aggressive prosecutions under the
Computer Fraud and Abuse Act, which has been used against internet activist Aaron Swartz,
who committed suicide in January, and US soldier Bradley Manning, who leaked classified files
to anti-secrecy website WikiLeaks.
A letter circulating at DEF CON and signed by some of the most prominent academics in computer
security said the law was chilling research in the public interest by allowing prosecutors
and victim companies to argue that violations of electronic terms of service constitute
unauthorized intrusions.
This latest development also exposes a paradox at the heart of the NSA's spying program.
Such total surveillance, things like GCHQ's Tempora that essentially downloads and stores
all internet traffic for a while, is only possible thanks to advances in digital technology.
Much of the most innovative work there is being done by hackers.
It's significant that the NSA's massive XKeyscore program runs on Linux clusters,
but as the NSA is now finding out, those same hackers are increasingly angry with the legal
assault on both them and their basic freedoms.
That may make it much harder to keep up the pace of technological development within the spying
programs in the future, unless the US government takes steps to address hackers' concerns, something
that seems unlikely.
From torrentfreak.com by Ernesto dated August 7, 2013, Hollywood keeps censoring pirate
bay documentary, director outraged.
Over the past few months, several Hollywood studios have asked Google to remove links to
the free-to-share Pirate Bay documentary TPB AFK.
The film's director, Simon Klose, has contacted the search engine in an attempt to have the
links put back online, but thus far without success.
Meanwhile, film studios continue to submit new DMCA requests to censor this documentary.
After a long wait, the Pirate Bay documentary TPB AFK was released to the public earlier
this year.
The film, created by Simon Klose, follows the three Pirate Bay founders during their trial
in Sweden; true to the nature of the site, it can be downloaded for free.
Through this model, the film's director hopes to reach the broadest audience possible.
However, several film studios are obstructing this goal by sending DMCA notices to Google,
asking the search giant to remove the links to the documentary.
Klose was quite shocked when he found out about the unwarranted censorship and initially
decided to lawyer up and sue the movie studios.
That plan was later aborted when the lawyers at Chilling Effects advised the director that
his efforts would be futile.
Quote: The lawyers saw no use in suing the movie studios for filing false DMCA claims and
seeking damages unless I could prove subjective intent and bad faith. Instead they advised
me to file a counter-notice once I had found out whether Google had actually taken down
the links or not.
End quote, Klose explains.
The director decided to take up this advice and contacted Google instead, hoping to get
the censored links put back up.
Klose managed to get in touch with Google's Nordic policy counsel, David Mothander, who
said he would follow up on the case, but today, two months and several reminders later,
the director has still received no reply.
This leads Klose to believe that Google is more interested in helping Hollywood censor
the web than in assisting independent creators to correct DMCA takedown abuses.
Quote: To me, it's a depressing lesson that Google rather acts as a private proxy for
dinosaur copyright enforcement than helping indie filmmakers experiment with sustainable
distribution models.
End quote, Klose says. While the automated takedown requests in question may not be intentional,
they are certainly not an isolated incident.
After our initial report back in May, copyright holders have sent in several new takedown
requests for TPB AFK.
Below is one that was sent on behalf of HBO in June.
It claims to protect The Pacific, but also lists a link to the Pirate Bay documentary.
A quick search on Google reveals that the result has indeed been removed.
Now go to the article to see the actual HBO takedown notice, which includes a number
of entries; number 62 is The Pirate Bay Away From Keyboard documentary.
The same is true for a DMCA request that was sent in Lionsgate's name recently.
This notice lists The Haunting in Connecticut 2: Ghosts of Georgia as the copyrighted work;
as can be seen below, the notice also affects TPB AFK and a wide variety of other unrelated
titles.
Again, go to the website in the article's notes to see the Lionsgate takedown notice and
a portion of it that clearly shows TPB AFK is included in there.
Klose is pretty upset by the unwarranted censorship, which he says hurts his business model, and he
urges Google to also protect those who gladly give away their work for free.
Quote: It sucks to be arbitrarily censored by Google's and Hollywood's bot system.
By making it harder for us to share the film, they are harming our freemium distribution
experiment. End quote, Klose tells TorrentFreak. Quote: It's bizarre to be punished for
experimenting with distribution models by an industry doing so little for the filmmakers
they claim to protect. End quote, he adds. While it is unrealistic to expect Google
to catch all errors made by copyright holders, Klose's problem does bring up one important
issue.
There is currently no easy way to check whether a link has been removed from Google.
In addition, it is not clear how third parties can send counter notices to reinstate content
on websites that they don't own.
First and foremost, however, copyright holders should make sure that their automated tools
don't take down legitimate content that is not theirs.
Update: Simon Klose tells TorrentFreak that after he posted his complaint in public,
Google's Nordic policy counsel, David Mothander, offered to reinstate the links. Quote:
David from Google just called me up and said his reply to me had gotten stuck in his
outbox.
He said it was a really lame excuse and he said he was sorry. Then he offered to put the
links that had been taken down back. End quote, Klose says.
From torrentfreak.com by Andy, dated August 6, 2013: DMCA notices to search engines won't
mitigate piracy, tech giants say.
A new research paper seriously downplays the importance of search engine traffic to
sites that offer unauthorized downloads.
The CCIA, which counts Google, Yahoo and Microsoft among its members, says that making items
disappear from search results via DMCA notices is not the key to substantially reducing piracy.
General-purpose search engines are not part of the average infringer's toolbox,
the companies note, adding that entertainment companies should focus on their own SEO.
One of the hottest piracy-related topics in recent times is the role search engines
play in the discovery of unauthorized copyrighted material.
Rights holders in their thousands have already sent Google more than 100 million DMCA takedown
notices this year, in the belief that removing search engine listings will go a long way
towards making illicit content harder to find.
But is that really the case?
According to a new research paper titled The Search Fixation: Infringement, Search Results
and Online Content,
the emphasis rights holders are placing on censoring search engine results is actually achieving
very little, and those valuable resources might be better spent elsewhere.
The paper, published by the Computer and Communications Industry Association, CCIA, which counts
Google, Yahoo, Microsoft, and Facebook among its members, says that entertainment industry
companies have become fixated on the role search engines play in the unauthorized distribution of content.
The focus is so great, there was even an attempt to legislate site censorship via the controversial
Stop Online Piracy Act.
Quote: This might lead to the conclusion that search engines are a prominent tool in
the infringer's toolbox.
In fact, available evidence suggests that search is not a particularly relevant tool for
infringers seeking to find sites, such as The Pirate Bay, or for sites to find users.
End quote, the report states.
The CCIA cites research from BAE Systems Detica, which found that users are far more likely
to return to infringing sites via direct browser entry or via social networks.
Furthermore, it appears that users looking for illicit material already know where they
want to obtain it from even before they start searching.
Quote: As of August 2013, over 20% of queries that result in traffic being directed to
The Pirate Bay consist of words comprising The Pirate Bay's domain name.
This suggests that users are quite aware of their intended destination before they arrive
at the search engine, and that any facilitation was minimal.
End quote, the CCIA explained.
When criticizing Google over its search results, the RIAA has previously noted that searches
including the terms download, MP3, or torrent often turn up links to infringing content.
However, in their report, the CCIA says that such searches are infrequent when compared
to straightforward lookups on artists' names, which are actually more likely to turn
up links to authorized content.
So why not improve the usefulness of those?
Quote: The fixation on demoting responsive but undesirable search results overlooks
a more viable strategy, promoting desirable search results. End quote, the paper notes.
The CCIA suggests that if the entertainment industry wants their content to appear in search
results when users type objectionable terms such as those listed above, then they
will have to start using those terms on pages offering legal content. Noting that legitimate sites
aren't currently employing such a strategy, the CCIA comes to two conclusions.
Quote: This suggests either, A, a deficiency in otherwise robust online marketing strategies,
or, B, that these terms are judged to be unworthy of optimizing because they will drive a trivial
amount of commercial traffic.
Stated otherwise, if search terms such as MP3 and download were likely to lead to
sales or subscriptions, a rational, profit-minded online platform engaging in basic search engine
optimization, SEO, would attempt to incorporate those terms in site content. End quote.
The CCIA concludes by noting that while DMCA notices might be a useful tool, they are
unlikely to achieve the desired result of substantially reducing piracy. Concentrating
on improving the visibility of legitimate content, even if that means utilizing objectionable
terms, would be a more robust strategy.
From perspectives.mvdirona.com by James Hamilton, dated July 16, 2013: Counting servers
is hard.
At the Microsoft Worldwide Partner Conference, Microsoft CEO Steve Ballmer announced that,
quote, we have something over a million servers in our data center infrastructure.
Google is bigger than we are. Amazon is a little bit smaller.
You get Yahoo and Facebook, and then everybody else is 100,000 units probably or less, end quote.
That's a surprising data point for a variety of reasons.
The most surprising is that the data point was released at all.
Just about nobody at the top of the server world chooses to boast with a server count
data point, partly because it's not all that useful a number, but mostly because a single
data point is open to a lot of misinterpretation, even by skilled industry observers.
Basically, it's pretty hard to see the value of talking about server counts, and it's
very easy to see the many negative implications that follow from such a number.
The first question when thinking about this number is, where does the comparative data
actually come from?
I know for sure that Amazon has never released server count data.
Google hasn't either, although estimates of their server footprint abound.
Interestingly, the estimates of Google's server count five years ago were one million servers,
whereas current estimates have them only in the 900,000 to one million range.
The Microsoft number is surprising when compared against past external estimates, data
center build rates, or ramp rates from previous hints and estimates.
But as I said, little data has been released by any of the large players, and what's out
there is typically nothing more than speculation.
Counting servers is hard, and credibly comparing server counts is close to impossible.
Assume that each server runs 150 to 300 watts, including all server components, with a weighted
average of, say, 250 watts per server.
As a power usage effectiveness estimate, we will use 1.2; that is, only 16.7% of the power
is lost to data center cooling and power distribution losses.
With these scaling points, the implied total power consumption is over 300 megawatts, 300
million watts. Looking at annual megawatt hours,
we get an annual consumption of over 2,629,743 megawatt hours, or 2.6 terawatt hours.
That's a hefty slice of power, even by my standards.
As a point of scale, since these are big numbers, the US Energy Information Administration
reports that in 2011 the average US household consumed 11.28 megawatt hours.
Using that data point, 2.6 terawatt hours is just a bit more than the power consumed by 230,000
US homes.
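For readers who want to check the arithmetic, here is a minimal sketch that reproduces it; the inputs are the assumptions stated above, and the 8,766 average hours per year is my own constant, so the annual figure differs very slightly from the one quoted:

    # Reproduce the power estimate: 1M servers at 250 W each, PUE 1.2.
    servers = 1_000_000
    watts_per_server = 250   # weighted average across the fleet
    pue = 1.2                # total facility power / IT power

    total_mw = servers * watts_per_server * pue / 1e6   # 300 MW
    annual_mwh = total_mw * 8766                        # average hours per year
    household_mwh = 11.28                               # EIA 2011 average household

    print(f"Total draw: {total_mw:.0f} MW")
    print(f"Annual use: {annual_mwh:,.0f} MWh (~{annual_mwh / 1e6:.1f} TWh)")
    print(f"Equivalent US homes: {annual_mwh / household_mwh:,.0f}")  # ~233,000

Note that the overhead share implied by a PUE of 1.2 is 0.2 / 1.2, or about 16.7%, matching the figure quoted earlier.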
Continuing through the data and thinking through what follows from the over 1 million servers
figure, the capital expense of the servers would be $1.45 billion, assuming a very inexpensive $1,450
per server.
Assuming a mix of different servers with an average cost of $2,000 per server, the overall
capital expense would be $2 billion, before looking at the data center infrastructure and
networking costs required to house them.
With the overall power consumption computed above of 300 megawatts, which is 250 megawatts
of critical power using a PUE of 1.2, and assuming data center build costs at a low
$9 per watt of critical load (the Uptime Institute estimates numbers closer to $12 per watt), we
would have a data center cost of $2.25 billion.
The implication of over 1 million servers is an infrastructure investment of $4.25 billion,
including the servers.
That's a hefty expense, even for Microsoft, but certainly possible.
How many data centers would be implied by the more than 1 million servers?
Ignoring the small points of presence since they don't move the needle and focusing on
the big centers, let's assume 50,000 servers in each facility.
That assumption would lead to 20 major facilities.
As a cross check, if we instead focus on power consumption as a way to compute facility
count, and assume a total data center power consumption of 20 megawatts each against the previously
computed 300 megawatts total power consumption, we would have roughly 15 large facilities,
not an unreasonable number in this context.
The summary, following from these data and the over 1 million servers number: facilities,
15 to 20 large facilities; capital expense, $4.25 billion; total power, 300 megawatts;
power consumption, 2.6 terawatt hours annually. Over 1 million servers is a pretty big
number, even in a web-scale world.
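The capital-expense and facility-count steps can be checked the same way; this sketch simply restates Hamilton's assumptions as quoted above, $2,000 average per server, $9 per critical watt, and 50,000 servers or 20 megawatts per large facility:

    # Reproduce the capex and facility-count estimates.
    servers = 1_000_000
    server_capex = servers * 2_000          # $2.0B in servers at $2,000 each
    critical_watts = 250e6                  # 250 MW critical of 300 MW total
    dc_capex = critical_watts * 9           # $2.25B at $9 per critical watt
    total_capex = server_capex + dc_capex   # $4.25B all-in

    facilities_by_servers = servers / 50_000   # 20 sites at 50k servers each
    facilities_by_power = 300 / 20             # 15 sites at 20 MW each

    print(f"Total capex: ${total_capex / 1e9:.2f}B")
    print(f"Facilities: {facilities_by_power:.0f} to {facilities_by_servers:.0f} large sites")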
Staffed and produced by the TGTM News team. Editorial selection by DeepGeek. Views of the story
authors reflect their own opinions and not necessarily those of TGTM News.
News from techdirt.com, eoionline.org, perspectives.mvdirona.com, and allgov.com used
under arranged permission.
News from eff.org and torrentfreak.com used under permission of the Creative Commons
Attribution license.
News from rhrealitycheck.org used under terms published on their website.
News from lavabit.com is an open letter.
News from fair.org used under permission of the Creative Commons Attribution-NonCommercial-NoDerivatives
license.
News from thecommandline.net used under permission of the Creative Commons Attribution-ShareAlike
license.
News sources retain their respective copyrights.
Thank you for listening to this episode of Talk Geek to Me.
Here are the vital statistics for this program.
Your feedback matters to me.
Please send your comments to dg@deepgeek.us.
The web page for this program is at www.talkgeektome.us.
You can subscribe to me on Identica as the username DeepGeek or you could follow me on Twitter.
My username there is dgtgtm, as in DeepGeek Talk Geek to Me.
This episode of Talk Geek to Me is licensed under the Creative Commons Attribution-ShareAlike
3.0 Unported license.
This license allows commercial reuse of the work, as well as allowing you to modify
the work, so long as you share alike the same rights you have received under this license.
Thank you for listening to this episode of Talk Geek to Me.
You have been listening to Hacker Public Radio at HackerPublicRadio.org.
We are a community podcast network that releases shows every weekday Monday through Friday.
Today's show, like all our shows, was contributed by an HPR listener like yourself.
If you ever considered recording a podcast, then visit our website to find out how easy
it really is.
Hacker Public Radio was founded by the Digital Dog Pound and the Infonomicon Computer Club.
HPR is funded by the Binary Revolution at binrev.com; all binrev projects are crowd-sponsored
by Lunarpages.
From shared hosting to custom private clouds, go to lunarpages.com for all your hosting
needs.
Unless otherwise stated, today's show is released under a Creative Commons Attribution-ShareAlike
3.0 Unported license.