Episode: 4057 Title: HPR4057: Raspberry Pi and astro imaging Source: https://hub.hackerpublicradio.org/ccdn.php?filename=/eps/hpr4057/hpr4057.mp3 Transcribed: 2025-10-25 19:07:15 --- This is Hacker Public Radio Episode 4057 for Tuesday, the 20th of February 2024. Today's show is entitled "Raspberry Pi and astro imaging". It is hosted by Andrew Conway and is about 31 minutes long. It carries a clean flag. The summary is: how to build a cheap astro imager using a Raspberry Pi.

Hello Hacker Public Radio people, it is mcnalu here, also known as Andrew, and I want to tell you about some adventures which I think I mentioned I was going to embark on in a previous Hacker Public Radio episode. Let me just check — this is live, exciting action. Yes, it was Hacker Public Radio episode 3857, "Yesterday I saw a solar flare". Now, if you listened to that and recall it — and I can't blame you if neither applies — what happened in it was that I realised I could see solar flares, as well as many other things, through this small solar telescope that I have access to. I also saw an unusual plane. It was a US military plane, possibly returning from Ukraine, maybe — a C-17 Globemaster, a four-engine transport-type plane. I forget.

Anyway, what I wanted to do after seeing that was: well, could I hook up a rudimentary camera so that if I see something like that again, I can record it for posterity? Well, yes — the answer is that in the months following I did exactly as I intended to, and I got the camera working. I saw more planes passing across the sun through the camera, but unfortunately I just happened not to be recording at the time, or if I have recorded one, I haven't noticed it yet, because I haven't sifted through all the frames of recording that I've done — and that's sort of a problem: yes, you can record stuff, but how do you find the interesting stuff? Anyway, what I wanted to tell you about is the rather more interesting adventure that this turned out to be.

So what I did was: initially I took a Raspberry Pi 2 or Raspberry Pi 3 that I had lying around, and then I had the idea of taking the Raspberry Pi camera and stuffing it into where you would put the eyepiece in the telescope. I have seen people do this — in fact, this is exactly what you do with many cameras and telescopes. I mean, there are better ways; you can just screw them on, and that's the better way. But any telescope has a barrel, traditionally one and a quarter inches, and you usually put eyepieces in there, but with a suitable adapter you can put a camera in there instead. Unfortunately, the Raspberry Pi camera has no suitable adapter, at least not one that you can buy off the shelf. So I approached Dave Morriss, and he very kindly offered to print me, from a 3D-printed design someone else had created, an eyepiece adapter for a Raspberry Pi camera. This was done, and he sent me two, in fact: a prototype in blue, and one in black — black obviously being the best colour when you're dealing with telescope optics. So I gave the blue one to a friend of mine, who's also into Raspberry Pis and astronomy, and I kept the black one for myself. I attached the Raspberry Pi camera to it and popped it into the solar telescope — though I can't remember which one I tried first; I possibly tested it in a normal astronomical night-time telescope first, trained on a tree so I could focus it.
And then, once I figured out the commands on the command line to get an image out of it, I took it to the solar telescope. Now, I don't actually remember the commands I used back in those early days — was this June, or was this July last year? I can't remember, but it was summer time last year, so there wasn't much prospect of trying it out at night time, but there was plenty of sunlight — well, for Glasgow anyway; there was more sunlight than there were stars at that time of year in Glasgow, because our nights are very short and they don't get very dark either.

So yeah, I stuffed it into the eyepiece holder of the solar telescope, and then focusing was tricky actually, as was getting the sun into the field of view — in fact, that was the first problem: the Raspberry Pi camera's sensor is actually rather small, so it can't quite fit the whole of the sun at once, I don't think — or it nearly can, but you lose bits that get chopped off at the sides. Anyway, that's not a big deal. After I figured out how to focus it, I got some acceptable images, and I could certainly see sunspots.

I couldn't really explore it much at this point. I think I was using — I forget what the command was — one of the old suite of commands, which I've completely forgotten now, where you put in the option -t 0 and it gives you this kind of infinite preview stream on the screen. But I quickly found that if I wanted to use the latest Raspberry Pi OS, then I had to embrace the new libcamera-based Picamera2, and also the libcamera tools. The equivalent tool for what I was doing was libcamera-hello, which is very like a hello-world thing: you just type it in and it shows you something from the camera, and gives you very little control over anything else. Then I graduated to libcamera-still, which allows you to take stills, libcamera-vid, which allows you to take video, and then, the most hardcore among them, libcamera-raw, which enables you to extract raw files. I had quite a lot of success with that, and I got some half-decent images of the sun.
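Just to give a flavour of what those captures look like in code: here is a minimal sketch using the Picamera2 Python library that sits on top of libcamera. The exposure time, gain and filename are illustrative values picked for this example, not a record of the exact settings I used.

```python
# Minimal Picamera2 still capture, roughly what libcamera-still does
# with a fixed shutter time. All values here are illustrative.
from picamera2 import Picamera2

picam2 = Picamera2()

# A full-resolution still configuration; the library picks a sensible sensor mode.
config = picam2.create_still_configuration()
picam2.configure(config)

# Fix exposure and gain rather than letting auto-exposure hunt around.
# 200 ms (200,000 microseconds) is the sort of "short" exposure mentioned above.
picam2.set_controls({"ExposureTime": 200_000, "AnalogueGain": 1.0})

picam2.start()
picam2.capture_file("sun.jpg")   # JPEG straight off the camera
picam2.stop()
```

The libcamera-still command-line tool does essentially the same job with its shutter and gain options, if you'd rather not touch Python at all.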
But then I quickly found that the small sensor of the Raspberry Pi camera, although not a big deal when imaging the sun, was a problem when I put it in my astronomical telescope: the field of view was so tiny — I don't know, maybe a fifth or a sixth of the area of the moon — that for finding the moon and maybe a bright planet you could, with a bit of patience, just about get away with it. But if you were looking for something that you couldn't readily see in a short exposure — and by short exposure I mean, in terrestrial terms, quite a long one: what, 200 milliseconds, a fifth of a second, something like that — if I couldn't see the object at that exposure, then with such a tiny field of view, dictated by the sensor, it was very hard to use the Raspberry Pi camera.

So I scratched my head about this. I could switch to another telescope, but I didn't actually have access to a telescope with a wider field of view — what you need is a short focal length telescope, and all the telescopes I have access to through the Astronomical Society of Glasgow have long focal lengths: two metres, and one's got a four-metre focal length. What I needed was one well under a metre — ideally a few hundred millimetres of focal length — but I didn't have one. So the other thing I could do was look at other Raspberry Pi cameras, and I discovered that the Raspberry Pi HQ camera has a higher resolution sensor, and it's also physically larger in area — I think it's 6.3 millimetres across; I forget what the normal Raspberry Pi sensor is, maybe about four millimetres, but it's much smaller. That extra sensor area was very valuable for increasing my field of view, so I went with that, and it gets me to maybe a third of the area of the full moon on the astronomical telescope that I was using, which is a Meade ETX-90 EC, if anyone cares — the 90 refers to the diameter of the aperture, 90 millimetres, so a small telescope, but quite a capable one.

So I ordered the Raspberry Pi High Quality camera, and I also ordered a Raspberry Pi Zero — this was for portability, but also just for low power. Sorry, it was a Pi Zero W; I would rather have gone for the second generation, the Pi Zero 2, is that right? But you couldn't get them at the time — stock shortages — whereas the Pi Zero W I could get, so I went with one of those. It came with a nice little case, red and white — I think it's the official case — and it's really neat: I could just strap it on the side of the telescope with velcro, Blu-Tack or an elastic band of some kind. It was that easy, and it's so light it's not going to interfere with the motion of the telescope and its tracking as it tries to follow the earth's rotation, which is essential when you've got a telescope with a rather small field of view and fairly high magnification.

Another good thing about the Raspberry Pi HQ camera is that it's got a screw-threaded mount, so you can screw lenses into it. So I bought a lens, which I've also played with and taken terrestrial and night-time photos with, and it's very good; its focal length varies between 2.8 millimetres and 12 millimetres, so you get a ridiculously large field of view but a very small aperture. It's not great for astronomy — it's like a teeny-weeny mini telescope — so it's good for, say, photographing the whole of a constellation like Orion, and then you don't need to worry about tracking so much, because you can maybe take a 10-second exposure and the stars won't trail as the earth rotates. Anyway, the Raspberry Pi HQ camera, as I was saying, has a thread on it, and you can actually buy an adapter: you can screw the lens in, or you can screw this adapter in, and the adapter is basically a cylinder that you can just shove into the eyepiece holder of any telescope. So that's what I did.
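To see why the bigger sensor matters so much, here is a rough back-of-the-envelope field-of-view calculation. The sensor widths and the ETX-90 focal length below are approximate figures assumed for illustration, so treat the output as ballpark numbers rather than precise specifications.

```python
import math

def field_of_view_deg(sensor_mm: float, focal_length_mm: float) -> float:
    """Angular field of view across one sensor dimension, in degrees."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_length_mm)))

# Approximate figures, for illustration only.
hq_sensor_width_mm = 6.3     # Raspberry Pi HQ camera sensor width, roughly
std_sensor_width_mm = 3.7    # standard Pi camera module, roughly
etx90_focal_mm = 1250        # assumed focal length for the ETX-90

for name, sensor in [("HQ camera", hq_sensor_width_mm),
                     ("standard camera", std_sensor_width_mm)]:
    fov = field_of_view_deg(sensor, etx90_focal_mm)
    # The full moon is about 0.5 degrees across, for comparison.
    print(f"{name}: {fov:.2f} degrees across ({fov / 0.5:.1f} moon widths)")
```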
Incidentally, the thread is a slightly non-standard, unusual, shall we say, mounting for a lens: it's called a C mount, or a CS mount — I think they're the same thread size, about an inch in diameter. Most of your DSLRs, your proper daytime cameras, have a much larger lens attachment — if it's a screw thread, is it 42 millimetres? I'm not sure, I think it is — but this is much smaller, about 25 millimetres or so I've read somewhere, about an inch, and it's C or CS. I think they're the same size of thread but different depths or something, because sometimes you need a different — it's called back focus, the distance between... actually, I don't really know exactly what back focus is; I sort of know it's not the focus of the telescope or the lens, it's something happening at the back, near the sensor. Anyway, I've not been too concerned about whatever the back focus may be, because the Raspberry Pi HQ camera also comes with a little knurled ring, which is a back-focus adjustment, so if the focusing of your lens or your telescope isn't enough, you can also focus at the back of your optical assembly, at the camera, with this ring that's on the Raspberry Pi HQ camera.

It's a 12-megapixel camera with tiny little pixels — it's 4,056 by 2,028... though I've discovered, in my playing with it, that two of those rows or columns are always zero, so it's actually... sorry, no, I've got that completely wrong: it's 4,056 by 3,040, and it's the last four pixels of the 4,056 that won't display. Right, yeah. Now, the reason I got confused there is not only because I'm slightly scatterbrained and forgetful, but because I rarely use it at that resolution, for two reasons. One, the Raspberry Pi Zero really struggles to run at any kind of decent frame rate, or to cope with the memory, or to write the enormous file sizes. And secondly, none of the optics that I have really justify it — bigger telescopes might — but in any case, for astronomical objects you're also up against the motions of the atmosphere, what astronomers call the seeing. If you've seen a star twinkling, that's what I'm talking about: it may make the star look nice and pretty to you, but the more a star twinkles, the worse the seeing is — the more disrupted the light is as it passes through the atmosphere. So instead of seeing what should be a perfect pinprick of light that lands exactly inside one pixel on your sensor, you get a disc: the star dances about, and over any exposure it will create a disc on your image. That's not the image of the star; that is just the star dancing around because it's twinkling, as its light moves through the earth's atmosphere and gets refracted and wiggled about all over the place. That is what limits you, so there's no point using this Raspberry Pi HQ camera at its full resolution — much better to use it as a three-megapixel camera, or maybe even bin it further, because you're not gaining anything. In fact, you're losing things, because the Raspberry Pi can't cope with it, your storage fills up, and it's not buying you anything — don't do it. So I usually use it in a two-times-two binning mode, which puts it at 2,028 by 1,520 — and then the last two pixels of the 2,028 are always zero, I discovered.
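If you end up working with raw frames rather than letting the camera do it, 2x2 binning is also easy to do afterwards in software; here is a little numpy sketch of the idea. This is software averaging of pixel blocks, which is not identical to binning on the sensor itself, and the array sizes are just the HQ camera figures used for illustration.

```python
import numpy as np

def bin2x2(frame: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of pixels, halving width and height."""
    h, w = frame.shape[:2]
    h, w = h - h % 2, w - w % 2           # trim odd edges if present
    f = frame[:h, :w].astype(np.float32)
    return (f[0::2, 0::2] + f[1::2, 0::2] +
            f[0::2, 1::2] + f[1::2, 1::2]) / 4.0

# e.g. a dummy full-resolution mono frame, 3040 x 4056, becomes 1520 x 2028
frame = np.zeros((3040, 4056), dtype=np.uint16)
print(bin2x2(frame).shape)   # (1520, 2028)
```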
So that was great, you know, and I was able to produce JPEGs and other formats from the libcamera suite on the command line very easily. I got quite good at it, and I started to take some OK-ish pictures of the sun and sunspots, but hadn't really got anywhere with any astronomical objects at this point. That took me up to some point during the summer — maybe mid to late August — when I got to this stage, and then I was accompanied by a couple of people out at the observatory that the Astronomical Society has.

On the first visit, a nice chap called Frank helped me find the setting on the solar telescope that brought out all the features — the sunspots, the active regions, the prominences — by adjusting the etalon, which is what selects the precise wavelength that you're looking at. When you look at the sun, you need to stop down the light tremendously — I mean, I guess you're throwing away over 99% of the light — but if you're clever, you don't just throw it away: what you do is filter it down to one wavelength, and that wavelength is hydrogen alpha, which brings out features that involve hydrogen, which is what the sun and all stars are mostly made of. So, using this hydrogen alpha filter — I think it's a wavelength of 656 nanometres, shoot me if I'm wrong, but that's what I recall; anyway, it's a long wavelength, at the red end of the visible spectrum — Frank twiddled the etalon adjustment collar on the solar telescope until we had a great image, and I was really pleased with that.

Then, a while afterwards, another chap in the Astronomical Society, our most expert imager, Sinclair, came along; he wanted a go on the solar telescope. He had his own camera, which wasn't colour, and it wasn't even as high resolution as the Raspberry Pi camera, but the one thing it could do better was that it had no infrared filter inside it. Unfortunately the Raspberry Pi HQ camera doesn't come with a NoIR option, so you're stuck with an infrared filter, which chops out stuff at the red end of the spectrum — not great when you're looking at a red wavelength in a solar telescope. So his astronomical planetary camera didn't have an infrared filter, and also it was capable, over its USB 3 connection to a much faster — well, fairly bog-standard laptop, to be honest — of a much higher throughput of data compared to what the Raspberry Pi Zero could do. With that he was able to get 30 to 60 frames per second, so you could very quickly suck down a thousand frames of the sun, and you could do so with exposures as short as, I think, six milliseconds. Now, the reason that's important is that the shorter your exposure, the greater your chance of happening to catch a moment when the atmosphere is still — a moment of clarity when there's not a lot of turbulence in the atmosphere. And then the idea is that you take maybe a thousand images, throw away, say, seven hundred of them, and from the three hundred that are left you align them so they're all on top of each other, stack them, and then you can process them to bring out one image that's far clearer than any individual image — or indeed than the seven hundred images distorted by the atmosphere that you threw away. With that technique he produced a fantastic image of the sun.
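That take-lots-of-frames-and-keep-the-sharpest approach is usually called lucky imaging, and the core of it is simple enough to sketch. The snippet below is a bare-bones illustration in Python, assuming you've already loaded your frames as greyscale numpy arrays; it skips the alignment step, which dedicated stacking software does far better, and the sharpness measure is just one crude choice among many.

```python
import numpy as np

def sharpness(frame: np.ndarray) -> float:
    """Crude sharpness score: variance of the image gradient.
    Turbulence-smeared frames score lower than crisp ones."""
    gy, gx = np.gradient(frame.astype(np.float32))
    return float(np.var(gx) + np.var(gy))

def lucky_stack(frames: list[np.ndarray], keep_fraction: float = 0.3) -> np.ndarray:
    """Keep the sharpest fraction of frames and average them.
    Real tools also align each frame before stacking; that step is omitted here."""
    ranked = sorted(frames, key=sharpness, reverse=True)
    keep = ranked[:max(1, int(len(ranked) * keep_fraction))]
    return np.mean(keep, axis=0)

# e.g. stacked = lucky_stack(frames, keep_fraction=0.3)  # best ~300 of 1000
```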
The other thing that I learned was that the solar telescope could do better. When I examined the specs of his camera, I found it had a Sony IMX sensor, monochrome, so it didn't have Bayer pixels, which give you the colours — which was both an advantage and a disadvantage — but the pixels were bigger and the resolution lower, which didn't really matter that much, I think, for what we were doing, because it was near the resolving power of that small solar telescope. So I was thinking: why were his images so much better than the ones I was getting with the Raspberry Pi camera? One explanation could simply be that I was struggling to get a decent image below 20 milliseconds, where he could go all the way down to six. I can maybe get to 10, but my images were getting quite dim and polluted in places by the random read noise of the sensor. That, I think, was purely a consequence of having this annoying infrared filter, which you can remove with surgery — there are videos of people doing it online — but I don't fancy it. Maybe I'll buy a second Raspberry Pi HQ camera and see if I can do it on that. Fifty quid, though: if you destroy it in a very delicate operation, that's 50 pounds, which is not to be sniffed at. For an astronomical camera, though, 50 quid is extremely cheap — I haven't said that yet. Also, what does the Raspberry Pi cost? Well, next to nothing compared to any laptop or PC. So don't forget: rather than spending hundreds on astrophotography equipment, if you've got a telescope, you could get yourself up and running for well under 100 quid with the Raspberry Pi stuff. That's part of the point of what I'm doing — to find a way for members of the society to get into astrophotography with the equipment we already had, without having to spend hundreds on cameras — and also because playing with the Raspberry Pi and HQ camera is fun, to me anyway, and you can do things with it that you can't do with other cameras, where you're constrained by the proprietary drivers that are supplied with them, and the software in some cases.

Anyway, back to the story: why was Sinclair able to get such fantastic images of the sun, and I couldn't? Well, the answer was, in the end, because he used me to focus the telescope, whereas most of the time — except on the odd occasion when somebody else was with me — I had to focus it by myself, and it's very difficult to focus when you're looking at a screen and standing by the telescope: the glare of the sun means you can't see the screen very well, and also the focus adjustments needed to get it perfectly in focus were, I didn't realise, right at the limit of what your fingers can really do. I could have done with a better focuser.
So once I realised that I was just not focusing the solar telescope as well as I could have done, I knew what I had to do: I had to write some software where I make a change at the telescope, take a picture, save the picture, and then press a button on the screen to say I've advanced another eighth of a turn of the focuser; take a picture, press the button to say I've advanced another eighth of a turn; or maybe I've gone too far and want to backtrack, so press another button to say I'm going back by a quarter turn — something like that. So that's what I did: I wrote a little bit of software, and what it does is run a web server in Python on the Raspberry Pi. I can then connect my phone to it, see the picture on the screen, and I have all the controls and buttons in the display alongside a stream of what the camera is showing me. Using this technique I was able to get results from my Raspberry Pi camera which were almost as good as the ones Sinclair got with his, say, over-100-quid astrophotography-level camera. I say almost as good — I suspect there's still a slight difference, because I still need to get rid of that infrared filter somehow; that's still a barrier, but not as big a one as I had thought. And also I think Sinclair is much better at processing the images afterwards, and can bring up details in them which I can't, because he really spends a lot of time on it and has a lot of expertise with the image processing software, which frankly I don't know so well, and probably don't have the patience to learn — impatient, I want to go on to the next thing. So I think I'm nearly there.
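My actual code is, as I mention later, still too messy to share, but the sketch below gives the general flavour of that web-based focusing helper, assuming Flask and Picamera2 are available on the Pi. The route names, port number and eighth-of-a-turn buttons are illustrative choices for this example, not a copy of what I wrote.

```python
# A stripped-down focusing helper: serves the latest frame to a phone browser
# and records which way the focuser was turned before each shot.
# Assumes Flask and Picamera2; routes, port and labels are illustrative.
import io
import time

from flask import Flask, send_file
from picamera2 import Picamera2

app = Flask(__name__)
picam2 = Picamera2()
picam2.configure(picam2.create_still_configuration())
picam2.start()

focus_log = []   # e.g. [("plus", 1700000000.0), ...]

@app.route("/")
def index():
    # Phone-friendly page: current frame plus buttons to log focuser moves.
    return """
      <img src="/frame" width="100%">
      <form action="/log/plus" method="post"><button>+1/8 turn</button></form>
      <form action="/log/minus" method="post"><button>-1/8 turn</button></form>
    """

@app.route("/frame")
def frame():
    buf = io.BytesIO()
    picam2.capture_file(buf, format="jpeg")   # grab a fresh JPEG into memory
    buf.seek(0)
    return send_file(buf, mimetype="image/jpeg")

@app.route("/log/<direction>", methods=["POST"])
def log(direction):
    focus_log.append((direction, time.time()))
    return index()

if __name__ == "__main__":
    # Listen on all interfaces so a phone on the same network can connect.
    app.run(host="0.0.0.0", port=8000)
```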
Since then I've also successfully managed to put the Raspberry Pi HQ camera, with the Raspberry Pi, on the back of a telescope — the small Meade ETX, and also a bigger 8-inch, so what's that, a 200 mm telescope? Yeah — with much more light-collecting power, but a longer focal length, so an even smaller field of view. With it I successfully managed to image Jupiter and its moons — I saw Ganymede go into and out of eclipse using it — and I've been able to live-stream from that camera across the web so other people can watch. What else have I imaged? I've found the planet Uranus, which is pretty straightforward really if you know where to look; I've imaged Venus, which is pretty dull; Mars, which is too small; Saturn's not been favourably placed yet; the moon — yeah, I've done the moon; and I've managed to do the Pleiades star cluster and a few other things. What I haven't managed to do yet is capture any galaxies or nebulae with it. I've also tried, instead of using a telescope, screwing on the small camera lens with the short focal length, and with that I've managed to photograph constellations — famously Orion, the notable winter constellation here in the northern hemisphere.

So with all of this — the Raspberry Pi Zero W and the HQ camera, and a little bit of know-how to put together some software and make the Raspberry Pi work — you can do a lot for under 100 quid if you already have a telescope. If you don't have a telescope, then for only another 40 pounds or so, I think I spent, you can buy one of these cheap lenses and have a lot of fun with that, and you don't even need tracking: you can still take 10-second exposures and see some really nice stuff. I'm in a light-polluted place and I'm still doing something with it; if I take it out somewhere darker, like where our observatory is — still near the city — you can go further. So, especially if you're in the countryside, away from cities, this is a really capable setup, even without a telescope, just with this little lens.

Where am I going to go next with this? Well, I haven't quite pushed it to its limits yet. One thing I discovered recently is that even in this light-polluted suburb that I live in, outside the city of Glasgow, I can, with a small telescope — it's only 90 millimetres in aperture — see stars down to 16th magnitude in exposures of, well, maybe five minutes in total: 10-second exposures, five-second exposures, stacked over the space of about five minutes, and I can see 16th-magnitude stars. Now, what does that mean if you're not an astronomer? Well, the brightest stars in the sky are about magnitude zero, and as the number gets bigger, the stars get dimmer; by magnitude six, you're really not going to see any stars with the naked eye that are dimmer than that. Through a telescope, depending on where you are and what kind of telescope you've got, you may be getting down to 10 or so. But to get to magnitude 16, in a light-polluted suburb of Glasgow, with a 90-millimetre aperture telescope, is just blowing my mind, to be honest. I had no idea — when I was growing up, that was the kind of thing you did with giant telescopes on mountain tops; I had no idea you could do this. And to give you an idea of how dim this is: for every five magnitudes you go down, the energy in the light — the power, technically watts per square metre — drops by a factor of 100. So if I go from the limit of the human eye, which is magnitude six, ten magnitudes down to magnitude 16, that's two lots of five, a hundred times a hundred — a factor of 10,000 drop in watts per square metre. It's phenomenal; it blows my mind that this is possible.
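Put as arithmetic, that magnitude-to-brightness relationship is just a power of 100 for every five magnitudes; here's a quick sketch of the sums (the magnitude values are just the ones discussed above).

```python
def flux_ratio(brighter_mag: float, fainter_mag: float) -> float:
    """How many times more light arrives from the brighter object.
    Five magnitudes is defined as a factor of exactly 100."""
    return 100 ** ((fainter_mag - brighter_mag) / 5)

print(flux_ratio(6, 16))   # naked-eye limit vs magnitude 16: 10000.0
print(flux_ratio(6, 20))   # vs a hoped-for magnitude 20: roughly 400,000
```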
And I don't think I've reached the limit yet. My calculations, if I'm right, mean that if I'm looking somewhere near the top of the sky, where light pollution is most favourable, I could probably push the setup down to magnitude 20 — which, frankly, if you had told the 10-year-old me that later in my life, from a suburb of Glasgow, I would use a small telescope and get down to magnitude 16 or dimmer, I just would not have believed you. So it's remarkable.

So I'm not going to go for the pretty pictures. I've done a little bit of pretty pictures; my pictures aren't that pretty — that's not really where my talents lie, and other people can do that better than me. But I'm now interested in what's called photometry, which is measuring the brightness of things. What can you do with that? You can find exoplanets — planets going around other stars. That is now within shooting range of the small rig that I've got: not discovering new ones, I don't think, I'd have to be very lucky to do that, but certainly I would be able to detect existing ones, and maybe find asteroids and comets — that's just a matter of taking enough pictures. The other thing that I'm keen to do is develop the software that drives it, which I've written in Python and improved and improved; now it can actually steer the telescope, with the help of free software like Stellarium or KStars — but I don't even need that software, I can manually slew it around the sky myself now, which is quite a handy thing to be able to do, all from my phone using the web interface that I've created. It's great fun, I've really enjoyed it. So if you're into astronomy, or even slightly interested in it, and you're good with Raspberry Pis and you can do a bit of coding, this is actually a fun thing to do — it's something where you could really get somewhere in a fairly short space of time.

So I hope that wasn't too detailed; I hope it gave you a flavour of what I was doing. I could probably keep talking indefinitely about this, but I think what I'll do is stop there, and maybe if there's demand, if there's interest, I might go into a bit more detail about what I've done. I should share the source code; the usual reason I haven't is that it's all a bit messy and I need to tidy it up — I'm a bit embarrassed to share it as it is — but I will do soon, and I'd welcome other people getting involved, improving upon what I've done or taking it off in a different direction; that would be great, that's really what I'm looking for. So I hope you enjoyed it. Please do a show of your own if you're interested, or leave comments — I'd love to read them. Okay, thanks, bye bye!

You have been listening to Hacker Public Radio at hackerpublicradio.org. Today's show was contributed by an HPR listener like yourself. If you ever thought of recording a podcast, then click on our contribute link to find out how easy it really is. Hosting for HPR has been kindly provided by anhonesthost.com, the Internet Archive and rsync.net. Unless otherwise stated, today's show is released under a Creative Commons Attribution 4.0 International license.