Episode: 3793 Title: HPR3793: RE: Zen_Floater2 Source: https://hub.hackerpublicradio.org/ccdn.php?filename=/eps/hpr3793/hpr3793.mp3 Transcribed: 2025-10-25 05:25:20

---

This is Hacker Public Radio Episode 3,793 for Wednesday, the 15th of February 2023. Today's show is entitled RE: Zen_Floater2. It is hosted by some guy on the internet and is about 19 minutes long. It carries an explicit flag. The summary is: God probably will use a Chromebook.

Hello and welcome to another episode of Hacker Public Radio. I'm your host, some guy on the internet. Today's show is done as a response to Zen Floater's "God probably will use a Chromebook" show. I want to use Zen Floater's argument and some sound bites from his show to assist me in this show.

First, let's start with an introduction of Zen Floater himself. Zen Floater is your favorite magical forest squirrel: a former human being converted into a squirrel by aliens in the 1960s, and an atheist. It's important that we understand what type of hardware Mr. Zen Floater is using for this show. "I'm on the Google Chromebook Go."

Our magical forest friend discusses a show that he overheard from host Klaatu of the GNU World Order, where he discusses relying on web-based services. I believe I have also spoken out about this, about how humanity is just relying too much on web-based services, especially that software-as-a-service model. I believe that this is dangerous from a freedom standpoint, that you do not own your software or, most importantly, your data, and that this puts you at a severe disadvantage in the technological world.

In this episode, Zen Floater discusses his fondness for the Chromebook. I personally would not use Chromebooks because they have this built-in obsolescence, or whatever. You know, they're designed to fail after a certain point, thus creating e-waste. I understand that Mr. Floater is saving the device from becoming e-waste by putting another operating system onto the device, which is beneficial for all of us and himself. However, I still do not support Google's effort in making a device that's designed to fail. I believe it's called planned obsolescence. "First, I started with one. Then I bought the Google Chromebook Go, which I dearly love. This thing's got a beautiful screen. It's really fast. It's the best Chromebook ever."

Now, following that up, Zen Floater gave us some interesting information around cloud services and his view on them. "We should all be careful about investing too much of our lives in cloud services or Chromebooks. However, when you're as old as I am, there's not much left to risk. So, you know, I'm the pilot that doesn't wear the seatbelt in the airplane."

From here, things begin to get a little bit interesting, as we start to introduce God and Zen's beliefs about God. "God is coming, believe it or not, and I'm an atheist. It's a strange thing for me to say, but he's going to come in the form of AI someday." And there you have it: God will come in the form of AI someday, according to Zen Floater.

Now, there's no doubt machines are impressive. They have extraordinary capabilities. And I do think that, among all the humans on planet Earth, there is a segment, a rather small segment, that will eventually adopt this belief that, you know, AI is sentient. And once they adopt that, once they come to believe that a computer or a piece of software is the same as a human being...
I mean, what is then to stand in the way of them believing that this thing, which is the same as a human yet somehow smarter, somehow better, because it doesn't have all the selfishness and all the other elements that Zen has mentioned... what stops them from saying that it is God? I personally would not adopt this understanding, but I do understand where Zen's coming from. And yes, I do believe we will eventually hear about some human being setting up some sort of church or place of worship or ideology around this belief that AI is God.

Zen then goes forward to provide some more insight into how limited the AI is today. And I find it fascinating, because his explanation does sort of defeat the argument. I think it's sort of like a battle within himself: as he's explaining how some will adopt the understanding that AI will become God, he's also explaining how it can't be that way because of how limited the technology is. "We have not mastered how the human brain works to where it can originate its own thoughts. All the things that we call AI today are basically just machines that will speak in a language and listen in a language and access a database."

And that's true. All a computer is, is a smart rock. You know, we've tricked a rock into thinking. We hit it with some electricity, and now it is able to perform some complex math for us. And they're wonderful. I love computers. I'm sure you do too. However, I'm not willing to take one and, you know, hold it up as my belief symbol.

Zen then begins to build more on the limitations of our current technology, as well as explain how things will eventually evolve, concluding with the understanding that AI will become God. "But when we finally develop a thinking machine, when we master the ability to think like the human brain does in an electronic form or some other high-speed form, then we will really have true artificial intelligence. In other words, a true intelligence that's artificial, not just a database access machine."

Zen then followed with the idea of mankind creating God in his own image. "You'll have supercomputers that will be beating each other to rule the world. That's when things will get very interesting."

Imagine the world when one group of people begins to accept one supercomputer as God: this is the smartest one, this is the best one, this is the one that should be God. And then another one comes along. Well, which one is actually God now? There's two of them. And then another one comes along. Now, imagine when society accepts these different things, like, okay, so we can have more than one God then. What happens when attackers, as usual, get a hold of one of these supercomputers and then start warping the messages that these computers bring forth? So you've got a group of gullible human beings that take AI as their God, and then it gets hacked. So now this hacked AI is spewing some sort of terrible message that people will begin to adopt, because that's what their God tells them to do. That kind of thing. I think it's interesting. I don't think that it's likely to cause a massive problem, but I think it's very interesting to discuss.

"We're currently surviving using governments and lobbyists and congresses, sometimes dictators; around the world there are kings and princes, sometimes queens." I found this point to be the most interesting, because he's right. The way our system is set up, we govern everything with these bodies.
And we have these elected officials or monarchies or whatever. And this is a select group of people whose job, basically, is to tell us all what to do. And we accept this because we feel like this is the best way to go forward.

Now, with that understanding, imagine when someone says: okay, well, if we can remove all the biases, if we can remove all of the different things that we don't like, like the racism and the sexism and all of the other bigotries and whatever, and pour it all into something that isn't emotionally tied to the species, then wouldn't it be a better governing body? I mean, wouldn't it be better if it could think logically without going, okay, we need to have X for the Black folks, X for the white folks, X for this group; rather than having all that, have a system designed to govern humanity on a larger scale, as a whole? I think that's how this belief will ultimately come into play, where some will say, okay, we can call it God. Or maybe they won't call it God; they'll probably give it another name. But for all intents and purposes, that's what it'll become to some people. It will become their God, one that they will dutifully listen to and obey to provide a better way forward.

Zen then points out some of the weaknesses and flaws in society with capitalism and communism, and how that's going to shape up to bring about the AI God. "We have to use capitalism because, you know, humans get lazy and they quit working, and there has to be motivation in the form of payment or food to eat or something for them to continue to work. There has to be motivation. That's one of the reasons why communism failed."

So after discovering all the limitations of mankind and our societal structure, he then goes into how the AI God will be better, how it will provide for us in place of a human-based government or a human-based governing system. "The AI God will be providing you with food. It will provide you with clothing, a place to live, transportation, whatever that turns out to be years from now; could be robotic cars, you know, airplane rides, boat rides, you name it, even trips to outer space. All of this will be controlled by the AI God. People won't be working anymore. The human race will be reduced to basically furthering their education, with the notion that they're bettering the world, when the reality of it is that the AI God is so far ahead of them in intelligence that there are no humans left on Earth who can comprehend what it's trying to do."

So with a system that is developed around AI, and human beings who just take their hands completely off the wheel and allow computers to drive humanity forward, this obviously comes with some sort of caveat, some sort of consequence of just not being aware of your own situation as a species, especially when we're supposed to be the stewards of the planet. Zen then gives us his take on what those consequences may be. "Basically, the human race will become similar to a farm animal, except they won't notice. They will be clothed, fed, taken care of, encouraged to exercise, encouraged to maybe take trips; they'll be free to do whatever they want to do. And of course, population control is going to be a part of that. You just know that's going to happen. There's limits to everything."

This reminds me of a scene in the movie The Matrix where Neo was speaking with Morpheus, and Morpheus shows him what became of human beings, what role human beings play in a machine-driven world.
He holds up an AA battery: basically, human beings become the power source for the machines and nothing more, a disposable power source on top of it. From what I can gather from Zen, he believes basically the same thing: you will serve no purpose other than to help the machines go forward.

The population control argument is an interesting one, because we have seen this kind of thing before in certain countries, like China. They once had a limitation on how many children you could have, because they felt the country was becoming overpopulated, so the government would disincentivize its citizens from having more children. You know, there would be a benefit for limiting the number of children you would have, and then there would be sort of a penalty for having more than the allowed amount. We hear the same arguments here in the United States whenever poor families have multiple children, especially when they're single parents with multiple children; there's a lot of scrutiny around that kind of thing. I can believe him when he says that population control is going to be a factor in a world where human beings accept AI as their God. The AI will just use logic: there are too many mouths to feed and not enough resources, so in order to benefit both sides and stop depleting the limited amount of resources, well, you know, you've got to kind of cut down on those that are using the resources. Now, how would it go about that? I don't think it will be anything extreme like genocide, but in America we have had a very dark history when it comes to population control over here as well. There was a system that was used in the past; I believe it was called eugenics. So we have a dark history, and AI would probably just dig that up.

"Finally, communism will begin to work, because machines don't tire. They don't die and they don't get resentful, and they'll sit there and produce clothing and food and building materials and anything else we need. And the human race will basically be farmed like animals; we'll turn into marshmallows. We won't be able to write the software to do the three-dimensional thing that the AI God does, because it's way above our heads. And in the end, our control of our data and our lives will basically come to an end; it will be in the hands of an artificial intelligence."

Now, when you think about it, it does sound a little bit possible. I mean, human beings rely on technology the way that we do now. How many of you can actually walk away from your phone? Everything we do in today's world is centered around our technology. You'll have people sitting next to each other on a bench, and they'd rather text each other than physically talk to each other. So much of our lives, our social lives, our health care, everything is governed by technology. And once we just make that final step and hand it all over, you know, the actual governing of our lives, to the technology, I could see where this could possibly be a thing.

Now, Zen has mentioned this would obviously be maybe a few hundred years in the future, five, six hundred years, something like that. And I would have to agree. It wouldn't happen today. It wouldn't happen tomorrow. But it'd be sometime far enough away, when people are so used to the model that's being developed today. But think about it.
How many people even question the fact that Apple, Google, Meta, and all these other companies are just hoovering up all your data, and you have literally no control over it? On top of which, they create these licenses where, and I'm pointing specifically at Apple here, but other companies do it as well, you can purchase a twelve-hundred-dollar device, and due to the licensing behind it, you don't even own the device, because the device is designed to only work with their software. And because you can't manipulate the software, you can't even install new software on the device; that breaks the terms of the agreement. It would break the device if you tried to. So you own literally nothing. And people are okay with that, just as long as they get to walk around and use the device.

So we're already heading in that direction. Now, I'm admitting that we're super far away from it today. There are still too many, you know, people like you out there, too many of you weirdos out here on Hacker Public Radio listening to this right now. Put your tinfoil hats on; by the way, I should have given you that warning.

So let's pop back in with Zen. You know, the AI God decides: "Look, I've got enough subordinate machines here that can repair themselves and raise food. I'll just take off." And he does, and 2,000 years later they have a huge machine collapse, because we had a solar flare or something that took them all out.

And there you go, ladies and gentlemen: the fall of humanity. We got too close to the sun, our wax wings burned up, and we came crashing down to earth, only to perish under the weight of our own design. I think it's very interesting. I don't think it's likely to happen anytime soon. But I think it's worth discussing, especially with the way that we're conditioning future generations to be okay with the things we're fighting against today.

Now, there are many crusaders out there. You've got the folks over at Nextcloud, the Free Software Foundation, and just tons of other organizations pushing free software, just making sure that people understand: you have rights; this data is yours. But it's kind of hard to get that argument over the convenience factor of what a company like Apple could provide for you. You know, your watch syncing with your phone, syncing with your TV, syncing with your tablet, syncing with your computer, all of that creating this wonderful little world, this ecosystem you could live inside of, and just ignore the fact that you're being taken advantage of. It's kind of hard to break people out of that and then tell them: look, wake up. Take this pill. Wake up in the Matrix, where you have shabby clothes on and you have to fight every day and you have to be afraid, and that steak isn't really steak anymore. You know, it's actually just data pushed into your brain to make you believe that you're actually eating something.

I don't know. Maybe I'm off the deep end too. But if I am, why don't you go ahead and leave a comment and let me know. Maybe you even do a show. Maybe we can get together and discuss it in Mumble. Who knows. But that's all I got time for today, folks. I'm going to go ahead and head out here, but first I got to tuck away this tinfoil hat. I think you should do the same thing before you leave. I'm some guy on the internet, and this is Hacker Public Radio. Take it easy.

You have been listening to Hacker Public Radio at HackerPublicRadio.org.
Today's show was contributed by an HPR listener like yourself. If you ever thought of recording a podcast, then click on our contribute link to find out how easy it really is. Hosting for HPR has been kindly provided by AnHonestHost.com, the Internet Archive, and rsync.net. Unless otherwise stated, today's show is released under a Creative Commons Attribution 4.0 International license.