October 10, 2013
It’s interesting to see camouflage, by which I mean the pattern on clothes. I’m guessing the fashion versions people wear are mostly symbolic. Exactly what they symbolize to someone…you’ll have to ask them. I came across an interesting camouflage design that’s been around since the early 1900s. It’s not symbolic but was used on warships. The basic purpose of any camouflage is to confuse an observer, but dazzle camouflage was apparently most effective when the enemy observer used a rangefinder. The dazzle patterns were a bit like a zebra’s stripes and were designed to make a ship’s position and heading difficult to estimate when aiming. Radar and other higher-tech methods have pretty much eliminated these fun dazzle-painted ships. Too bad!
Another thing about dazzle camouflage is that it can confuse facial recognition programs. Yeah, Facebook and Google recognition algorithms apparently don’t see a face when it’s painted in this dazzle style, which is basically face painting plus some hair over the face. Every time I upload a picture to Google+, Google tries to identify the people in my pictures and asks me if I want to tag them. I don’t think dazzle face painting after the fact will work for this, but there is an Android app called ObscuraCam if you want to “camouflage” faces in your pics. YouTube also has a tool to blur faces…probably something for iOS as well. Maybe we like identifying people because it makes us feel like we’re helping a computer, or because we might forget who they are?
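At heart, these tools just blur or pixelate the pixels inside a face’s bounding box so a detector can’t find the fine detail it relies on. Here’s a minimal sketch of that idea in plain Python, treating an image as a 2D list of grayscale values and assuming the face rectangle is already known (a real app like ObscuraCam would first locate faces with a detector; the function name and toy image here are just for illustration):

```python
def pixelate_region(img, x, y, w, h, block=2):
    """Pixelate a w-by-h rectangle at (x, y) in a grayscale image.

    The image is a 2D list of ints. Each block-by-block tile inside the
    rectangle is replaced by its average value, destroying the fine
    detail that face recognition depends on. Returns a new image.
    """
    out = [row[:] for row in img]  # work on a copy, leave the input intact
    for by in range(y, y + h, block):
        for bx in range(x, x + w, block):
            # gather the pixels in this tile, clipped to the rectangle
            tile = [img[r][c]
                    for r in range(by, min(by + block, y + h))
                    for c in range(bx, min(bx + block, x + w))]
            avg = sum(tile) // len(tile)
            for r in range(by, min(by + block, y + h)):
                for c in range(bx, min(bx + block, x + w)):
                    out[r][c] = avg
    return out

# a tiny 4x4 "image" with a high-contrast 2x2 "face" at (1, 1)
img = [[0,   0,   0, 0],
       [0, 255,   0, 0],
       [0,   0, 255, 0],
       [0,   0,   0, 0]]
blurred = pixelate_region(img, 1, 1, 2, 2, block=2)
# the 2x2 face region is now a flat average (127), the rest is untouched
```

The same averaging trick is what a “mosaic” blur does in any photo editor; bigger blocks mean more anonymity.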
As humans we see patterns everywhere, and our visual system sorts out shapes and colors through some complex processing. But we can be confused by shapes, colors, and shadows. Even two simple lines can be drawn in a way that changes our sense of size and dimension. We can sometimes fool nature, but it’s hard to fool ourselves unless we’re looking at one of those visual/optical illusions. One researcher believes that because there’s a time lag between the eye receiving an input and the brain interpreting it, we have an automatic response that helps compensate for the delay. But that automatic prediction may not match what actually happens, which may be why these illusions work. Either way, they do work, and they can trick our brains.
As I was reading up on dazzle camouflage, I came across way too many interesting things to mention, as happens when googling. If you’re interested in the topic, check out a couple of the links below or just google on your own.
September 7, 2013
You may have seen the news recently about UPS Flight 6, a cargo flight that didn’t land successfully. Most of the causes have been identified. The problem started with a fire in cargo containing lithium batteries, which can go into thermal runaway. You might recall the lithium battery problems with the Boeing 787 Dreamliner.
What eventually happened is that smoke filled the cockpit and the pilot could no longer see out the window. That has to be completely disorienting when, over thousands of flight hours before, the pilot always had a clear view of his instruments. Perhaps there’s training for this situation, but on this flight a large task load was put on one person in an emergency.
Many of you know that airplanes have an autopilot, and many of these planes can take off and land themselves. Pilots are very used to letting a plane fly itself. There’s a lot of news recently about cars that can drive themselves, and it sounds like one day we might be riding in those. I don’t know what it’s like to sit in a cockpit or a car and let the machine do the critical driving, but I wonder how the machine will handle an emergency. Perhaps UPS Flight 6 is an extreme example of how quickly events can outrun our understanding of what’s happening, and if our vision is taken away we are left with either the comfort or the discomfort of hoping the machine can handle the situation.
I am all for trying autonomous cars when they become available, but I’m guessing there will be a lot of interesting sensations the first time I let a car do the driving. I just hope a lot of thought goes into those situations where there’s no way I could handle things myself, like driving in heavy fog or other severe conditions. Google’s driverless cars have over 300,000 miles of accident-free driving. There have been a few incidents, but they haven’t been blamed on Google’s system; one involved a rear-end collision.
What we’re going to have to do is trust the machine and not our eyes, and I know that’s hard to do even when another person is driving. As a passenger, you might see something and want to react, but you have to let the driver make that decision. Can we leave it up to a computer to make all those decisions? I wouldn’t let Windows or my iPhone do it for me. I suppose we’ll need a lot of selling before we trust a car to drive us to work. A plane doesn’t have to dodge the kinds of obstacles we see on the road every day.