
This week's astrophoto: Stars, and something else...


  • This week's astrophoto: Stars, and something else...

    Last night was very dark and transparent, so I decided to take a few wide-angle shots with just the camera on the drive. I don't usually point my equipment north because of the sky glow in that direction from Williams Lake.

    Last night I did, and in the image you can make out Ursa Major (the Big Dipper) in the center of the image, standing with its handle up. If you look closely at the bottom left quarter you can also make out a faint satellite track. That's pretty common, since there are thousands of objects in low Earth orbit (LEO).

    However, upon close inspection and some serious image fiddling there is much more to see. In just two minutes I also imaged six satellite passes over the pole or close to it. That is the preferred path for some weather satellites and especially for military spy satellites.

    There has been a lot of speculation over the years about the imaging capability of the US National Reconnaissance Office. They are responsible for satellite intelligence, and it has long been speculated that they are using clusters of satellites to perform synthetic aperture imaging at visual wavelengths. This is very recent technology in the civilian astronomy arena, since the entire subject was locked down tight by the US Department of Defense for reasons of national security. Only recently have some of the security classifications been relaxed, allowing some observatories to begin using such imaging techniques.

    In this severely enhanced image you can see a parallel pair of tracks that passed over at the same time. That is almost certainly a military satellite pair, or even a triad, using synthetic aperture imaging. Never mind the old saw about them being able to read your license plate. With an effective aperture of miles they can count your nose hairs if you look up.
    Privacy? Fuggedaboutit.

    Free software for calculating bolt circles and similar: Click Here

  • #2
    Yes, but can they see through my tinfoil hat?


    • #3
      So it's not atmospheric conditions that limit the imaging capability of satellites then?


      • #4
        No, they can easily correct for that. All they need to do is shoot a laser beam at the ground, or even one up to the satellites, and the distortions in the wavefront of the beam are used to calculate a deconvolution algorithm that undoes the distortions in the image.
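        The real pipeline behind that is classified and far more involved, but the core idea of dividing a known blur back out of an image can be sketched in a few lines of numpy. Everything here is an illustrative assumption, not the actual method: a 1D "scene", a Gaussian PSF standing in for the measured atmospheric distortion, and a Wiener-style regularization constant K.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 256

# A sharp 1D "scene": two point sources 10 samples apart
truth = np.zeros(N)
truth[100] = 1.0
truth[110] = 0.7

# Blur with a known Gaussian PSF (a stand-in for the measured
# wavefront distortion), plus a little sensor noise.
x = np.arange(N)
psf = np.exp(-0.5 * ((x - N // 2) / 4.0) ** 2)
psf /= psf.sum()
H = np.fft.fft(np.fft.ifftshift(psf))          # PSF transfer function
blurred = np.real(np.fft.ifft(np.fft.fft(truth) * H))
blurred += rng.normal(0.0, 1e-6, N)

# Wiener deconvolution: divide the PSF back out in the frequency
# domain; K keeps the noise from exploding where H is tiny.
K = 1e-8
restored = np.real(np.fft.ifft(np.fft.fft(blurred) * np.conj(H) / (np.abs(H) ** 2 + K)))
```

        In `blurred` the two sources merge into one lump; in `restored` they separate into two peaks again. The hard part on orbit is that the PSF is not known in advance, which is what the laser measurement is for.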


        • #5
          The GPS bird cage is very populated now, too. All in polar orbit. You should be able to catch the debris cloud from that recent collision if you know when to look for it. It's getting more sparse by the hour, but the main orbital components are still in polar orbit.


          • #6
            Originally posted by Evan
            No, they can easily correct for that. All they need to do is shoot a laser beam at the ground, or even one up to the satellites, and the distortions in the wavefront of the beam are used to calculate a deconvolution algorithm that undoes the distortions in the image.
            Deconvolution is my new word for the day.

            Thanks for the cool images Evan. Is there any way to identify the satellites (the non military ones)?




              • #8
                Thanks for the great description,
                AND the photos.
                You make it sound so simple... and we all know it's not!



                • #9

                  I found a Wiki article on synthetic aperture radar, but nothing about using the technique at visual wavelengths. I'm curious how this works, and wondering whether the technique might be usable for ground-based photography of distant scenes (like the Grand Canyon).

                  The two images can be combined to yield a 3D image, but I gather that SAR is a lot more than stereoscopy.
                  Allan Ostling

                  Phoenix, Arizona


                  • #10
                    You will have very little luck finding much on synthetic aperture imaging at visible wavelengths. It has been highly classified until the last few years, and even now publication is strictly limited. Here, however, is a clue to what is going on. This is what the military has, but pointed in instead of out.

                    Future astronomical space missions will comprise a constellation of several optical telescopes to detect exo-planets by interferometric nulling of starlight. The Darwin mission of the European Space Agency (ESA) and NASA's Terrestrial Planet Finder Interferometer both consist of a free-flying collection of telescopes and a beam combiner. As such, the constellation provides a co-phased array of telescopes that can also be used
                    for aperture synthesis imaging. This imaging technique relies on recording intensity interference patterns, in which the layout of the beam combination optics and the detector play a key role. Several designs for beam combination have been proposed in the literature. In this article, we compare these beam combiners by rigorously simulating the imaging process of a weak stellar source, taking into account the photon arrival statistics, an imperfect detection process and the image reconstruction from the recorded data. The results are presented as the expected reconstruction error in the luminous intensity distribution function of a wide-field stellar source versus the provided amount of photons. Using these results, the optimum design of the beam combiner and detector array can be identified.


                    • #11
                      A single camera in motion taking pictures is a synthetic aperture. The problem with SAR and SAP (synthetic aperture radar / synthetic aperture photography) is that the subject is viewed from a changing perspective over time, which allows for a lot of image creep. That would be motion of the object or its environment, and the obscuring and revealing of object details caused by complex shapes and viewing-angle distortions. Several cameras spread out over a span create a very long array similar to the VLA radio telescopes. VLA cameras in motion produce synthetic VLA apertures. Three satellites in close orbit shooting the same object over time produce extreme hi-def 3D imaging. The ESA Mars Express is capable of SAP and their site has a number of spectacular images.

                      SAP is used in a lot of TV commercials to create special effects, like 360-degree panning around a fixed or moving object where the cameras surround the object. SAP can also be used to remove intervening objects from an image foreground. If the moving camera is held very tight on a remote object, foreground objects will move relative to it. Photo stitching software can then remove them.

                      The inverse is a stationary camera shooting a stationary object with intervening moving objects. A freeway interchange, for example. Take enough pictures and stitch them together without the temporary objects and you will have an empty interchange at rush hour. This effect can also be reproduced using very long exposures with very small apertures.
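                      The freeway-interchange trick is easy to demonstrate: take many frames of a static scene, each with a transient in a different place, and a per-pixel median throws the transients away. A toy numpy sketch (the flat 8x8 "scene" and the single bright pixel per frame are stand-ins for the interchange and the passing cars):

```python
import numpy as np

rng = np.random.default_rng(0)

# Static scene: a flat background (the empty interchange)
scene = np.full((8, 8), 100.0)

# 15 exposures, each with one bright transient (a passing car)
# at a random position.
frames = []
for _ in range(15):
    frame = scene.copy()
    r, c = rng.integers(0, 8, size=2)
    frame[r, c] = 255.0
    frames.append(frame)

# A per-pixel median keeps only what is present in most frames,
# so anything that appears in just a few exposures vanishes.
stacked = np.median(np.stack(frames), axis=0)
```

                      A pixel would have to be "occupied" in at least 8 of the 15 frames to survive the median, which is why enough exposures of a busy scene yield an empty one.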

                      Anyway - Google "synthetic aperture photography -radar" to see some examples.


                      • #12
                        That is a different type of synthetic aperture imaging. All phase information has been lost, so it really isn't any different from the technique I use when stacking images taken over a period of time. It certainly is capable of increasing resolution, since each image contains slightly different information and when combined the information from each image is additive.

                        Example: right is one image, left is a stack of maybe 20 images
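                        The gain from stacking is easy to quantify: averaging N frames of the same scene cuts the random noise by roughly the square root of N, which is what lets faint detail emerge from a stack. A quick numpy check (the sine-wave "scene", the 20 frames, and the noise level are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

truth = np.sin(np.linspace(0.0, 2.0 * np.pi, 500))   # the underlying scene
n_frames = 20

# Each exposure is the scene plus independent sensor noise.
frames = truth + rng.normal(0.0, 0.5, size=(n_frames, 500))

single_err = np.std(frames[0] - truth)               # noise in one frame
stacked_err = np.std(frames.mean(axis=0) - truth)    # noise after stacking

# The improvement should come out close to sqrt(20), about 4.5
print(single_err / stacked_err)
```

                        The same arithmetic is why a stack of 20 frames shows stars that a single frame buries in the noise floor.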

                        However, true synthetic aperture imaging is a different creature. By collecting light simultaneously at multiple apertures and combining it with accurate wavefront timing, a large aperture is simulated, and the usual limit on resolution imposed by physics and the wavelength of light now applies to the much larger synthetic aperture. Normally the resolving power of a camera is limited by the aperture of the camera (or telescope). The relationship is dead simple:

                        resolution in arc seconds ≈ 0.25 x (wavelength of light in µm / diameter of aperture in m)

                        Using a camera to take pictures in two different locations does not satisfy the condition of synthesizing a large effective aperture, since the light from both locations must be combined in phase, as it would be in a single aperture of that size. If we use the above formula to calculate the resolution theoretically available with an aperture of a kilometer, you can see that they really can count your nose hairs. That wasn't an exaggeration.
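                        Putting numbers on that formula (a back-of-the-envelope sketch: the 500 km altitude and the 2.4 m and 1 km apertures are illustrative assumptions, and the Rayleigh criterion 1.22 λ/D is used directly, which is where the 0.25 coefficient above comes from once the units are mixed):

```python
import math

def resolution_arcsec(wavelength_m: float, aperture_m: float) -> float:
    """Rayleigh diffraction limit, converted from radians to arc seconds."""
    return math.degrees(1.22 * wavelength_m / aperture_m) * 3600.0

def ground_sample_m(res_arcsec: float, altitude_m: float) -> float:
    """Smallest resolvable feature on the ground from a given altitude."""
    return math.radians(res_arcsec / 3600.0) * altitude_m

wavelength = 550e-9   # green light, metres
altitude = 500e3      # a typical low-orbit altitude, metres

# A single 2.4 m mirror vs. a 1 km synthetic aperture
for d in (2.4, 1000.0):
    res = resolution_arcsec(wavelength, d)
    print(f"D = {d:6.1f} m: {res:.6f} arcsec, "
          f"{ground_sample_m(res, altitude) * 1000:.2f} mm on the ground")
```

                        A lone 2.4 m mirror resolves roughly 14 cm from 500 km; a 1 km synthetic aperture gets down to a fraction of a millimetre, which is nose-hair territory.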

                        The process depends on being able to calculate and adjust the precise location of the multiple apertures with a precision of better than about 1/10 wavelength of light over a distance of kilometers. Makes microns seem positively enormous.