Google Maps Captures Plane in Flight Over Chicago

What happens when a plane flies through a Google Maps satellite photo? TheAtlantic.com has the answer:

See for yourself. Geekosystem.com notes that the rainbow effect could have been caused by a satellite imaging process that consisted of “several colored exposures [taken] over a few seconds and then combined”; failing that, unicorn magic.

47 Responses

  1. A Physicist

    A few seconds? Each individual exposure (B, G, R, and possibly intensity/gamma or some such) must be very, very quick. In fact, with the size and speed of the plane known, you could work out the individual exposure times.

  2. Optical Engineer

    It’s not chromatic aberration, as that is when the colors do not focus in the same plane. These colors all focused in the same plane, meaning it is probably just the different color filters.

  3. Frollard

    It is separate exposures taken FRACTIONS of a second apart, one after another.

    Given the size of the plane relative to the ground, it’s flying at very low altitude; otherwise everything else in the shot would be dwarfed. AND there is no jet exhaust, which is visible when planes are up in cold air at cruising altitude.
    From this we can surmise it’s probably landing/taking off, and thus going much slower than usual, say 300 instead of 700 knots. Note also how the exposures are EXTREMELY quick, freezing a jet plane still, yet each exposure is taken at an offset time.

    Now look way below, at the cars: note how the white cars have the SAME blue aberration on their rear bumpers? Same exact effect, but at 50 mph instead of 500. It’s just a temporal shift caused by a moving object being photographed several times, once for each colour.

    Simple.

    AWESOME timing, nonetheless.

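Frollard’s temporal-shift explanation can be sketched in a few lines of Python: three monochrome exposures taken fractions of a second apart become the red, green, and blue channels of one composite. The positions and pixel values below are illustrative assumptions, not data from the actual image.

```python
# Minimal 1-D sketch of the temporal-shift effect: three monochrome
# exposures, taken in quick succession, become the R, G, and B channels
# of one composite. A moving object sits in a different position in each
# exposure, so it splits into three coloured ghosts; stationary pixels
# get equal R, G, B values and stay neutral.

WIDTH = 12

def exposure(object_pos):
    """One monochrome frame: dim ground (0.2) with a bright object (1.0)."""
    frame = [0.2] * WIDTH
    frame[object_pos] = 1.0
    return frame

# The object moves one pixel between successive exposures (R first, then G, B).
red, green, blue = exposure(3), exposure(4), exposure(5)

# Merge the three frames into one RGB composite.
composite = list(zip(red, green, blue))

for x, (r, g, b) in enumerate(composite):
    if (r, g, b) == (1.0, 0.2, 0.2):
        print(f"pixel {x}: red ghost")      # → pixel 3: red ghost
    elif (r, g, b) == (0.2, 1.0, 0.2):
        print(f"pixel {x}: green ghost")    # → pixel 4: green ghost
    elif (r, g, b) == (0.2, 0.2, 1.0):
        print(f"pixel {x}: blue ghost")     # → pixel 5: blue ghost
```

The background pixels come out (0.2, 0.2, 0.2) in every position, which is why only the mover shows the rainbow trail.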
  4. shawn

    Kinda weird… if you look at the first cutout of the plane you can see through it… might be something to that… needs some attention :)

  5. will

    It would be interesting to see what happened to the shadow of the airplane.

  6. Jeff

    If you look at the map, Chicago Midway Int’l Airport is ~8 miles due west of the plane’s location.

  7. enrique

    There is a plane that looks the same way in Google Maps near Cancún, Mexico, NW of the city, over the jungle.

  8. yls

    “It’s not chromatic aberration as that is when the colors do not focus in the same plane.”

    Yes indeed, the plane is higher than the ground, therefore it’s not on the same plane.

  9. Anonymouse

    Google “Harris shutter effect” to see what causes the colours.

  10. zngga

    Anyone take the time to take a closer look at the structure of the plane itself? Are there any planes flying commercially that look like that… I think not! Good Photoshop though!

  11. Mentat

    “Anyone take the time to take a closer look at the structure of the plane itself? Are there any planes flying commercially that look like that… I think not! Good Photoshop though!”

    Someone really had to go and call it shopped…

    And to answer your question: http://en.wikipedia.org/wiki/Boeing_737

  12. cody

    Yeah, this is not real. I looked it up on Google Maps and went to the exact location and there’s no plane there.

  13. billy ray

    this is fake. i took the picture and sent it to google after i changed it. payed em fitty bucks to put it on google maps

  14. Todd

    41°47’7.79″N 87°34’43.78″W

    Those are the coordinates.

  15. cormac

    There was one near my house in Cork, Ireland, until recently, when they replaced the photo.

  16. Right

    Basically different wavelengths of light will travel at different speeds through air. It’s got nothing to do with exposure times.

  17. Travis

    Does anyone remotely realize that this has nothing to do with any chromatic aberration or the rest of the technical stuff? Has it crossed your minds that it’s the same colors as the Google logo and they probably added it in as an Easter egg???

  18. jacob

    I agree with Travis. Otherwise shouldn’t the cars on the road have the same color trail? They show no signs of it.

  19. Rocky

    Seriously, Travis/jacob? Did you even read the previous posts? Those aren’t the “same colors as the Google logo”; it’s called RGB. The cars don’t display this effect because they’re going at a fraction of the plane’s speed.

  20. Aidan

    I’m wondering if the distance between the plane, low to the ground, and the satellite in orbit is enough that the difference of around 200 nm between blue and red light had enough effect that a single exposure lasting a fraction of a second caught the blue light from the plane at its rearmost position and the faster red light at its forwardmost position.

  21. Phil

    Mhhh… The real reason for this colour change is due to the fact that different colours in the light spectrum move at different wavelengths. Couple this with the atmospheric density and the refraction of light going through the atmosphere.

    The atmosphere then acts as a prism, refracting different colours’ (red, green, blue, which most digital imaging equipment use) wavelengths at different angles, creating the rainbow effect.

    Now you ask, “why don’t all light sources do this on Google Maps?”. The reason is that the plane moved too fast underneath the satellite for the refracted light to ‘sync’ up as one image.

    I think this would be a good guess.

  22. me

    I just Googled the Bobby Franks house in Chicago,–5052 S Ellis Ave, Chicago, Cook, Illinois 60615–and found a plane nearby at the corner of E. 50th St and S. Drexel Blvd. I’ve seen them before, but dummy me always thought they were parked, because they were usually in fields (knowing people use old airplanes as houses). After looking at street view and seeing no plane, it finally dawned on me….oh, they are flying!

  23. The Right Answer

    The images are from a linear sensor, like a fax machine. Multiple collection arrays (black/white, blue, red, green) are used to gather all the image information, and these arrays are physically offset from each other. To present a composite image, they must be re-aligned as a whole. The re-alignment assumes a fixed relationship between the sensors, so non-moving objects all line up correctly.
    Check out the Daytona Raceway. Look for a car at about the 8:00 position. The same color offset occurs: it’s moving faster than the re-alignment can account for.
    “I am the eye in the sky, I can read your mind” – Alan Parsons

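The re-alignment argument above can be sketched with invented numbers: the line arrays trail each other by a fixed count of scanlines, so re-alignment can cancel that offset for the ground but not for anything that moved in between. `LINE_OFFSET` and the speeds here are assumptions for illustration, not the actual sensor geometry.

```python
# Push-broom sketch: the blue, green, and red line arrays cross a given
# ground row 0, 2, and 4 line-times apart (assumed geometry). For an
# object moving across-track, the column where each channel records it
# is simply where the object is when that channel's array sweeps past.

LINE_OFFSET = {"blue": 0, "green": 2, "red": 4}   # line-times behind the lead array (assumed)

def recorded_column(start_col, speed_px_per_line):
    """Column at which each channel catches an across-track mover."""
    return {channel: start_col + speed_px_per_line * dt
            for channel, dt in LINE_OFFSET.items()}

print(recorded_column(10, 0))    # parked car → {'blue': 10, 'green': 10, 'red': 10}
print(recorded_column(10, 1.5))  # fast plane → {'blue': 10.0, 'green': 13.0, 'red': 16.0}
```

A stationary object lands in the same column in all three channels, so the composite stays aligned; the mover is displaced by a different amount in each channel, which is exactly the three-colour ghosting in the photo.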
  24. Johnny the kite flyer

    It sure was a hoot reading all the attempts at a scientific explanation above! Too bad almost every one of them is completely wrong! So much for American science education.

    Optical Engineer – close.

    Frollard – wrong, the plane is high up. This image was taken with a super-telephoto lens from hundreds of miles away. In such a setup, the few miles between the plane and the ground are meaningless: the light rays from the ground and from the plane to the camera are almost parallel, so the scales of the two images are essentially equal. In fact the plane is many miles up in the air.

    Anonymouse – correct. More below.

    Andrew – LOL! Good one.

    Right – Wrong. If what you said were correct, or rather consequential, the image would be a blur rather than 3 distinct monochromatic images.

    Travis – No, it’s a real effect.

    Aidan – good example of “enough information to be dangerous”. Totally illogical explanation.

    Phil – ditto. “… different colors move at different wavelengths” ??? Maybe you mean “speeds”. Your reasoning of refraction doesn’t hold up. The speed of light is so much greater than the speed of the plane that the plane might as well be still as far as any motion based refraction effect is concerned. Any refraction would therefore apply equally to the plane and the still ground. And what you mean by “sync up” is beyond the realm of scientific logic.

    The correct answer is the “Harris shutter effect”. This image was created by a camera that took 3 separate images (red, green, blue) spaced apart in time and then merged them together into one photo. But why? Because the image sensor is a super high resolution device that is very expensive. To get the best possible resolution at the best price, instead of using 3 expensive sensors, the camera uses one sensor and a low cost rotating color wheel that successively passes red, green, and blue light onto the sensor. This allows the sensor to provide 3x the resolution that it could provide if it took a single image for all colors. For example, if the sensor is a 4 bit sensor, then this technique provides 4 bits red + 4 bits green + 4 bits blue = 12 bits which is high enough quality for most purposes (4096 colors), whereas a 4 bit color image would be very poor quality (only 16 colors).
    Because the 3 monochromatic images are taken at different times, and the plane is moving between each image, the plane appears to be in 3 places in the composite image, one place for each color.

    The plane is probably flying around 500 mph, which translates to about 750 fps. The distance between the images is roughly equal to the width of the plane, which, if it’s a 737, would be about 12 feet. That means the 3 images were taken about .016 sec apart, meaning the color wheel is spinning about 20 rps.

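Johnny’s back-of-envelope arithmetic checks out; here it is spelled out, with every input being his stated assumption (a 737 at 500 mph, images spaced one fuselage width, about 12 ft, apart, and a three-sector colour wheel):

```python
# Reproducing the back-of-envelope timing estimate. All inputs are the
# commenter's assumptions, not measured values.

MPH_TO_FPS = 5280 / 3600          # feet per mile / seconds per hour

speed_fps = 500 * MPH_TO_FPS      # ≈ 733 ft/s
gap_ft = 12.0                     # spacing between the coloured images
dt = gap_ft / speed_fps           # time between successive exposures
wheel_rps = 1 / (3 * dt)          # one rotation passes 3 filter windows

print(f"{speed_fps:.0f} ft/s, {dt:.3f} s between exposures, {wheel_rps:.0f} rps")
# → 733 ft/s, 0.016 s between exposures, 20 rps
```

500 mph is closer to 733 ft/s than the 750 ft/s quoted, but the rounded results (about 0.016 s between exposures, a wheel spinning about 20 rps) come out the same.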
