Episode: 1050
Title: HPR1050: TGTM Newscast for 2012/8/8 DeepGeek
Source: https://hub.hackerpublicradio.org/ccdn.php?filename=/eps/hpr1050/hpr1050.mp3
Transcribed: 2025-10-17 17:59:04
---
You are listening to Talk Geek To Me News, number 71, recorded for Wednesday, August 8, 2012.
You are listening to the tech-only Hacker Public Radio edition. To get the full podcast,
including political commentary and other controversial topics,
please visit www.talkgeektome.us.
Here are the vital statistics for this program.
Your feedback matters to me.
Please send your comments to dg@deepgeek.us.
The webpage for this program is at www.talkgeektome.us.
You can subscribe to me on Identica as the username DeepGeek, or you could follow me on Twitter.
My username there is dgtgtm, as in DeepGeek Talk Geek To Me.
Before I get into tech news, I want to once again thank my listeners for sticking by me through my
need to take two months of vacation time from the podcast.
My family and I thank all of you for sticking with me.
The time was just needed.
My life is not completely my own.
What else can I say?
The HOPE conference was absolutely amazing.
If you go to my website www.talkgeektome.us and click the blog link,
you'll find a little post where I gave a link to the whole list of conference talks in MP3 format
on the web, and I gave hot links directly to five that I know to be excellent.
If you want to take advantage of that, please help yourself.
And now the tech roundup.
From eff.org, dated July 4, 2012, by Eva Galperin: open source developer Bassel Khartabil
detained in Syria.
This weekend, the EFF learned that Bassel Khartabil, a longtime member of the open source community
and Creative Commons volunteer, has been detained in Syria since March 15, 2012, as part
of a wave of arrests made in the Mezzeh district of Damascus.
For months, Bassel's family had no knowledge of his whereabouts or the reason for
his arrest; only recently have they heard from previous detainees at Kafr Sousa
that he is being held at that location.
This news is especially alarming in light of the recent Human Rights Watch report documenting
the use of torture in 27 detention facilities run by Syrian intelligence agencies.
In addition to his work as technical director on several projects, including efforts to
restore Palmyra and the publication of Forward Syria Magazine,
Bassel contributed his time and expertise to countless organizations, including Creative
Commons, Mozilla Firefox, Wikipedia, the Open Clip Art Library, Fabricatorz, Sharism,
and the Arab bloggers community.
More than 1,000 people from all over the world have already signed a letter of support
for Bassel, addressed to the Syrian government requesting information about his detention,
health and psychological state, and demanding his immediate release.
They are also tweeting their support using the hashtag #freebassel. The EFF joins Bassel's friends,
family, and colleagues in calling for his release, and condemns the Syrian government,
which has held him for almost four months without charges or a trial.
From torrentfreak.com, dated June 28, 2012, by Ben Jones: Megaupload search warrants
ruled illegal by high court. A case that seemed just five months ago to be a veritable David
and Goliath fight is certainly living up to its billing.
The battle between Megaupload (David) and the U.S. government and MPAA (Goliath)
started out with a flurry of blows against the New Zealand-based site's staff, but in recent
weeks, the blows have all been falling stateside.
Today, the New Zealand High Court ruled that the search warrants used to raid Dotcom's
mansion were illegal, casting uncertainty over the entire Mega conspiracy case.
An earlier ruling by High Court Justice Judith Potter concluded that a previous search and
seizure order was invalid because of improper paperwork; the documents were later corrected.
In the ruling, Chief Justice Helen Winkelmann declared the warrants illegal, noting that
they were not adequately descriptive of the offenses Dotcom was accused of.
Indeed, they fell short of that.
They were general warrants, and as such, are invalid. In addition, the data removed from
New Zealand by the FBI, which they claim was not stolen since it was only data, was also
ruled to be illegally obtained and should not have been taken out of the country.
Quote: the release of the cloned hard drives to the FBI for shipping to the United States
was contrary to the February 16 direction given under Section 49(2) of the MACMA (Mutual
Assistance in Criminal Matters Act) that the items seized were to remain in the custody and
control of the Commissioner of Police.
This dealing with the cloned hard drives was therefore in breach of Section 49(3)
of the MACMA. End quote.
Winkelmann also voiced concerns over police conduct, questioning if their actions in January
amounted to unreasonable search and seizure, with her preliminary view being that they did. Along
with these concerns came a note that the raid could be considered trespass by the police,
not something the elite anti-terrorist team used for the raid will want on its record.
Perhaps the biggest setback for any prosecution relates to the evidence that was collected. An
independent and appropriately experienced High Court lawyer will now conduct a review
of the evidence to determine what is and is not relevant to the charges Dotcom faces.
Anything deemed not relevant will be returned to Dotcom and not provided to the U.S.; anything
deemed relevant will be copied to both Dotcom and U.S. authorities for use in court.
While the ruling does not amount to the unequivocal quashing of the search warrants and the invalidation
of any evidence collected through them, it is a significant win for Dotcom.
Meanwhile a request for the cloned-hard drives to be returned, presumably without being copied,
has been made to U.S. authorities.
The amount of respect for the New Zealand legal system held by the U.S. authorities may
be inferred by the time it takes to comply with the request.
As for the extradition hearing, that's still going ahead.
From techdirt.com, dated July 2, 2012, by Mike Masnick: Congress plays see no evil, pretends
there's no evil, lets the evil continue with NSA domestic spying.
We're still completely perplexed at how anyone in Congress could recognize that the NSA
has refused to tell Congress how often it's violated the privacy of Americans without
a warrant under the FISA Amendments Act (the FAA), and then still vote to renew it.
What kind of oversight is that?
As Julian Sanchez recently wrote, it's no oversight at all.
As he notes, the law requires the NSA to prevent spying on folks when both parties
to a communication are in the U.S., but here the NSA is admitting that it has no mechanism
to do that: either (a) it's lying, or (b) it's admitting that it cannot do what the law requires.
Extended quote: if we care about the spirit as well as the letter of that constraint being
respected, it ought to be a little disturbing that the NSA has admitted it doesn't have
any systematic mechanism for identifying communications with U.S. endpoints.
Similar considerations apply to the minimization procedures which are supposed to limit the
retention and dissemination of information about U.S. persons.
How meaningfully can these be applied if there's no systematic effort to detect when a U.S.
person is party to a communication?
End block quote. Normally this should be the point at which Congress steps in and says
no more to the NSA; instead it chastises those who even ask the basic questions,
and, as in the case of Representative Dan Lungren, pretends that as long as no one proves
to them that the NSA is abusing its power, there is simply no reason to demand evidence.
That's not oversight, that's willful ignorance.
And given that they're choosing to ignore their own oversight obligations over the NSA spying
on Americans, it should come as no surprise that the House Intelligence Committee unanimously
voted to extend the FAA for five more years.
Why not?
It's not like Congress is actually going to make sure that the NSA is playing by the rules.
The NSA apparently just needs to say that it would be too much work to do
what the law requires, and Congress says: here, have a gift of five more years to spy on Americans
against the specifics of the law.
And once again, as Sanchez points out, there are plenty of ways that the NSA could at
least estimate how many Americans they're spying on.
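To make that concrete: one way such an estimate could work (this is purely an illustrative
sketch of mine, not anything the article or Sanchez specifies) is statistical sampling, where
you examine a random subset of intercepted records and extrapolate. All names and numbers
below are invented:

    import random

    # Hypothetical sketch: estimate what share of a large record collection
    # involves a U.S. endpoint by sampling, rather than scanning everything.

    def is_us_endpoint(record):
        # Placeholder check; a real system might consult IP geolocation,
        # phone country codes, or subscriber records. Pure fiction here.
        return record["country"] == "US"

    def estimate_us_fraction(records, sample_size=1000, seed=42):
        # Draw a random sample and extrapolate the hit rate to the whole set.
        rng = random.Random(seed)
        sample = rng.sample(records, min(sample_size, len(records)))
        hits = sum(1 for r in sample if is_us_endpoint(r))
        return hits / len(sample)

    # Toy data: 100,000 fake records, 7 percent with a U.S. endpoint.
    records = [{"country": "US" if i % 100 < 7 else "XX"} for i in range(100_000)]
    print(f"Estimated U.S. share: {estimate_us_fraction(records):.1%}")

Even a rough sample like this would answer the "how many" question that the NSA claims is
infeasible to answer, which is exactly the point being made.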
But why would it do that?
As Sanchez points out, the NSA seems to redact anything even remotely embarrassing from
its reports, including data on how often it failed to follow the law.
Extended quote: more generally, these reports contain a good deal of redacted statistical
information that there is simply no plausible excuse for keeping secret.
A table of statistical data relating to compliance incidents, for example, is included, but entirely
blocked out.
Are we to believe that the national security of the United States would be imperiled
if the public knew the number of times the NSA had difficulty following the law?
The review has concluded that the number of compliance incidents remains small, particularly
when compared with the total amount of activity.
But is there any legitimate reason for barring the public from knowing what counts as a small
number, or just how massive the total amount of activity truly is?
How do folks in Congress who vote for this kind of thing defend such actions?
They can't say it's to protect Americans when they refuse to even seek the
data on whether or not Americans are being illegally spied on.
From techdirt.com, dated July 2, 2012, by Mike Masnick: you don't own what you buy, part
15,332; Cisco forces questionable new firmware on routers.
One of the things that we keep learning in a connected digital age is that what you think
you bought you often don't really own.
Companies who sell you products seem to feel a certain freedom to unilaterally change the
terms of your purchase after the fact.
I'm reminded of Sony removing key features of the PS3, though there are plenty of other
examples.
A new one is the story of Cisco pushing out a firmware update to routers without customer
approval, and even worse, having that firmware update block people from logging in directly
to their own routers.
Apparently, if you don't like it, too bad. Quote: Cisco has started automatically pushing
the company's new Cloud Connect firmware update to consumer routers without customer
approval.
Annoyed users note that the update won't let customers directly log into their routers anymore.
They have to register for a new Cloud Connect account.
The only way to revert to directly accessing the device you paid for is to unplug
it from the internet.
Oh, and registering for such an account means you have to agree to give up your data so
that Cisco can sell it.
As per the terms: we may keep track of certain information related to your use of the service,
including but not limited to the status and health of your network and network products,
which apps relating to the service you are using, which features you are using within
the service infrastructure, network traffic (e.g., megabytes per hour), internet history,
how frequently you encounter errors on the service system, and other related information
("Other Information").
We use this Other Information to help us quickly and efficiently respond to inquiries and
requests, and to enhance or administer our overall service for our customers.
We may also use this Other Information for traffic analysis,
for example, determining when most customers are using the service, and to determine which
features within the service are most or least effective and useful to you.
In addition, we may periodically transmit system information to our servers in order to
optimize your overall experience with the service.
We may share aggregate and anonymous user experience information with service providers,
contractors, and other third parties.
End quote.
Seems like a good way to drive people into buying routers from other companies.
I can see how a cloud service could have value, but it should be presented to users as
a choice, where the actual benefit to them, if there is one, is clearly presented.
This one seems designed solely to benefit Cisco and its partners rather than the people
who bought (or so they thought) their routers.
From perspectives.mvdirona.com, dated July 13, 2012, by James Hamilton:
why are there data centers in New York, Hong Kong, and Tokyo?
Why are there so many data centers in New York, Hong Kong, and Tokyo?
These urban centers have some of the most expensive real estate in the world.
The cost of labor is high, the tax environment is unfavorable, power costs are high, and
construction is difficult to permit and expensive.
Urban data centers are incredibly expensive facilities, and yet a huge percentage of
the world's computing is done in expensive urban centers.
One of my favorite examples is the 111 8th Avenue data center in New York.
Google bought this data center for $1.9 billion.
They already have facilities on the Columbia River where the power and land are cheap.
Why go to New York when neither is true?
Google is innovating in cooling technologies at their Belgium facility, where they are using
waste water cooling.
Why go to New York, where the facility is conventional, the power source predominantly
coal-sourced, and the opportunity for energy innovation is restricted by legacy design and the
lack of real estate available in the area around the facility?
It's pretty clear that 111 8th Avenue isn't going to be wind farm powered.
A solar array could likely be placed on the roof, but that wouldn't have the capacity to run
the interior lights in this large facility.
See "I love solar power but..." for more on the space challenges of solar power at data
center power densities.
There isn't space to do anything relevant along these dimensions.
Google has some of the most efficient data centers in the world running on some of the
cleanest power sources in the world and custom engineered from the ground up to meet their
needs.
Why would they buy an old facility in a very expensive metropolitan area with a legacy
design?
Are they nuts?
Of course not.
Google is in New York because many millions of Google customers are in New York or nearby.
Companies site data centers near the customers of those data centers.
Why not serve the planet from Iceland where the power is both cheap and clean?
When your latency budget to serve customers is 200 milliseconds, you can't give up three
quarters of that time budget to speed-of-light delays traveling long distances.
Just crossing the continent from California to New York is 74 milliseconds round trip
time.
New York to London is 70 milliseconds round trip time.
The speed of light is unbending.
Actually, it's even worse than that, in that the speed of light in a fiber
is about two thirds of the speed of light in a vacuum.
See "communicating beyond the speed of light."
Because of the cruel realities of the speed of light, companies must site data centers where
their customers are.
That's why companies that operate worldwide often need to have data centers all over the world.
That's why the Akamai content distribution network has over 12,000 points of presence
worldwide.
To serve customers competitively, you need to be near those customers.
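To put rough numbers on that speed-of-light floor, here is a small back-of-the-envelope
sketch (mine, not the article's; the distances are approximate great-circle figures, and
real fiber routes are longer and pass through equipment, which is why the 70 to 74
millisecond figures quoted above exceed these theoretical minimums):

    # Best-case round-trip time over fiber, where light travels at
    # roughly two thirds of its speed in a vacuum.
    C_VACUUM_KM_S = 299_792                  # speed of light in a vacuum, km/s
    C_FIBER_KM_S = C_VACUUM_KM_S * 2 / 3     # approximate speed in fiber

    def fiber_rtt_ms(distance_km):
        # One-way propagation time, doubled for the round trip, in milliseconds.
        return distance_km / C_FIBER_KM_S * 2 * 1000

    # Approximate great-circle distances in km (illustrative values).
    routes = {
        "California to New York": 4_100,
        "New York to London": 5_600,
        "New York to Iceland": 4_200,
    }
    for route, km in routes.items():
        print(f"{route}: ~{fiber_rtt_ms(km):.0f} ms minimum round trip")

Real routes add substantially more than these ideal figures, so a distant data center can
easily consume most of a 200 millisecond budget before any server-side work begins.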
The reason data centers are located in Tokyo, New York, London, Singapore, and other expensive
metropolitan locations is they need to be near customers or near data that is in those
locations.
Maintaining data centers all over the world is considerably expensive, but there is little alternative.
Many articles recently have been quoting the Greenpeace open letter asking Ballmer, Bezos,
and Cook to go to Iceland.
See, for example, the letter to Ballmer, Bezos, and Cook to go to Iceland.
Having come across many of these articles recently, it seemed worth stopping and reflecting on why
this hasn't already happened.
It's not like companies just love paying more or using less environmentally friendly power
sources for their data centers.
Google is in New York because it has millions of customers in New York.
If it were physically possible to serve those customers from an already built, hyper-efficient
data center like Google's facility in The Dalles, they certainly would, but that facility
is 70 milliseconds round trip away from New York.
What about Iceland?
Roughly the same distance.
It simply doesn't work.
Consequently, companies build near their users, because the physics of the speed of light
is unbending and uncaring.
So what can we do?
It turns out that many workloads are not latency-sensitive.
The right strategy is to house latency-sensitive workloads near customers, or near the data
needed at low latency, and house latency-insensitive workloads elsewhere, optimizing on other
dimensions.
This is exactly what Google does, but to do that you need to have many data centers all
over the world, so the appropriate facility can be selected on a workload-by-workload basis.
This isn't a practical approach for many smaller companies with only one or two data centers
to choose from.
This is another area where cloud computing can help.
Cloud computing can allow mid-size and even small companies to have many different data
centers optimized for different goals all over the world.
Using AWS, a company can house workloads in Singapore, Tokyo, Brazil,
or Ireland to be close to their international customers. Being close to these customers
makes a big difference in the overall quality of the customer experience,
as well as allowing a company to cost-effectively have an international presence.
Cloud computing also allows companies to make careful decisions on where they locate workloads
in North America.
Again, using AWS as an example, customers can place workloads in Virginia to serve the
East Coast, or use Northern California to serve the population-dense California region.
If a workload is not latency-sensitive, or is serving customers near the Pacific Northwest,
it can be housed in the AWS Oregon region, where it can be hosted coal-free and
less expensively than in Northern California.
The reality is that physics is uncaring and many workloads do need to be close to users.
Cloud computing allows all companies to have access to data centers all over the world
so they can target individual workloads to the facilities that most closely meet their
goals and the needs of their customers.
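As a concrete illustration of this placement logic, here is a hypothetical sketch (the
workloads and mappings are invented; the region names are 2012-era AWS regions, used only
as labels):

    from dataclasses import dataclass

    @dataclass
    class Workload:
        name: str
        users: str               # where the workload's users are concentrated
        latency_sensitive: bool

    # Nearest AWS region per user population (illustrative mapping),
    # plus a cheap-and-clean-power default for latency-insensitive work.
    NEAREST_REGION = {
        "us-east-coast": "us-east-1",      # Virginia
        "california": "us-west-1",         # Northern California
        "europe": "eu-west-1",             # Ireland
        "japan": "ap-northeast-1",         # Tokyo
    }
    CHEAP_CLEAN_REGION = "us-west-2"       # Oregon: no coal power, inexpensive

    def place(workload):
        # Latency-sensitive work goes near its users; everything else goes
        # wherever power is cheapest and cleanest.
        if workload.latency_sensitive:
            return NEAREST_REGION[workload.users]
        return CHEAP_CLEAN_REGION

    jobs = [
        Workload("web-frontend", "us-east-coast", latency_sensitive=True),
        Workload("nightly-analytics", "us-east-coast", latency_sensitive=False),
    ]
    for job in jobs:
        print(f"{job.name} -> {place(job)}")

In this sketch, the user-facing frontend lands in Virginia while the overnight batch job
lands in Oregon, which is the workload-by-workload trade-off the article describes.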
Some computing will have to stay in New York, even though it is mostly coal-powered, expensive,
and difficult to expand.
But some workloads will run very economically in the AWS West Oregon region, where there
is no coal power, expansion is cheap, and power is inexpensive.
Workload placement decisions are more complex than "move to Iceland."
News from techdirt.com, perspectives.mvdirona.com, and maggiemcneill.wordpress.com, used under
arranged permission; news from torrentfreak.com and eff.org, used under permission of the
Creative Commons Attribution license.
News from democracynow.org and wlcentral.org, used under permission of the Creative Commons
Attribution NonCommercial NoDerivatives license.
News from jillstein.org is a press release.
News sources retain their respective copyrights.
Thank you for listening to this episode of Talk Geek To Me.
Here are the vital statistics for this program.
Your feedback matters to me.
Please send your comments to dg@deepgeek.us.
The webpage for this program is at www.talkgeektome.us.
You can subscribe to me on Identica as the username DeepGeek or you could follow me on
Twitter.
My username there is dgtgtm, as in DeepGeek Talk Geek To Me.
This episode of Talk Geek To Me is licensed under the Creative Commons Attribution-ShareAlike
3.0 Unported license.
This license allows commercial reuse of the work, as well as allowing you to modify the
work, so long as you share alike the same rights you have received under this license.
Thank you for listening to this episode of Talk Geek To Me.
You have been listening to Hacker Public Radio at HackerPublicRadio.org.
We are a community podcast network that releases shows every weekday, Monday through Friday.
Today's show, like all our shows, was contributed by an HPR listener like yourself.
If you ever considered recording a podcast, then visit our website to find out how easy
it really is.
Hacker Public Radio was founded by the Digital Dog Pound and the Infonomicon Computer Club.
HPR is funded by the Binary Revolution at binrev.com.
All binrev projects are proudly sponsored by Lunarpages.
From shared hosting to custom private clouds, go to lunarpages.com for all your hosting
needs.
Unless otherwise stated, today's show is released under a Creative Commons Attribution-ShareAlike
3.0 Unported license.