This afternoon, Jeky abruptly asked me, “Can we develop a program that uses the iPhone’s camera to make the phone look transparent?” I answered “probably yes” and went back to coding something about perspective warp. Afterwards I thought about that question properly, and it is interesting.
I have seen a lot of pictures on the internet of transparent phones, like this:
But most of these are artworks. The funny thing is, every time before a new-generation iPhone press conference, people guess what the new iPhone will look like, and someone will Photoshop a transparent iPhone as “spy shots” 🙂
However, it is said that a Taiwanese company named Polytron Technologies is currently working on this, and they claim to have developed a next-generation transparent mobile phone which they say will be on the market soon.
What? Finally someone did it? That is awesome!
Polytron Technologies phone video from YouTube
What we are talking about now is: can we achieve this idea in software?
I think the answer is partially yes, but with an iPhone 5, it is hard.
To do this, the iPhone needs to know two things (and needs to know them in real time):
- the position of the user’s eye (assume our user is Mike from Monsters University, who has only one eye);
- the angle between Mike’s line of sight and the iPhone’s screen.
The second thing actually follows from the first: there are motion sensors inside the phone, so it is entirely feasible to know the phone’s position (x, y, z) and orientation (yaw, pitch, roll), and once we also have the position of Mike’s eye, we can calculate that angle.
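To make the “calculate that angle” step concrete, here is a minimal sketch. The function name, coordinate frame, and sample numbers are all my own assumptions for illustration, not anything from an Apple API: given the eye position, the screen centre, and the screen’s normal vector (which the phone can derive from its yaw/pitch/roll), the angle is just a dot product.

```python
import math

def sight_angle(eye, screen_center, screen_normal):
    """Angle in degrees between the line of sight (screen -> eye)
    and the screen's normal vector. All arguments are (x, y, z) tuples
    in some shared coordinate frame (an assumption for this sketch)."""
    # Vector pointing from the screen centre toward the eye
    v = tuple(e - c for e, c in zip(eye, screen_center))
    dot = sum(a * b for a, b in zip(v, screen_normal))
    norm_v = math.sqrt(sum(a * a for a in v))
    norm_n = math.sqrt(sum(a * a for a in screen_normal))
    return math.degrees(math.acos(dot / (norm_v * norm_n)))

# Eye straight in front of the screen: sight is parallel to the normal
print(sight_angle((0, 0, 30), (0, 0, 0), (0, 0, 1)))  # 0.0
# Eye off to the side at 45 degrees
print(sight_angle((30, 0, 30), (0, 0, 0), (0, 0, 1)))  # ~45.0
```

A real implementation would get the normal vector by rotating (0, 0, 1) through the gyroscope’s yaw/pitch/roll, but the angle itself is computed exactly like this.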
But how do we get the position of Mike’s big eye? With the iPhone’s front camera? No way.
First: only one front camera is not enough; we need at least two! There are algorithms that measure distance from a dual camera, and even those do not always work well; adding an IR depth camera would be better (like the Kinect).
Second: the iPhone 5’s front camera is only 1.2 megapixels, which is too weak to be used for this. By the way, it is Sep. 10 today, and the iPhone 5S press conference starts in a few hours. Wow us, Apple!
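For the record, the dual-camera distance measurement mentioned above boils down to one formula from stereo vision: depth = focal length × baseline / disparity. The numbers below are made-up illustration values, not real iPhone specs:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Pinhole stereo model: a point seen by two parallel cameras
    shifts by `disparity_px` pixels between the two images; the shift
    shrinks as the point gets farther away."""
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 1000 px focal length, cameras 6 cm apart,
# a face producing 100 px of disparity -> 0.6 m away
print(stereo_depth(1000, 0.06, 100))  # 0.6
```

This also shows why one camera is hopeless: with baseline zero there is no disparity, hence no depth.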
If, and I say if, we already had the two things above, what would we do next? Definitely some maths, some kind of matrix transformation: we must warp the images captured by the iPhone’s back camera into what Mike’s eye would actually see. Maybe the perspective warp I wrote today would do (it is easy and effective).
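The matrix transformation behind a perspective warp is a 3×3 homography applied to every pixel position. I do not know what the author’s own warp code looks like, so here is a minimal, self-contained sketch of just that mapping (in a real app one would use something like OpenCV’s `warpPerspective` on the whole image):

```python
def apply_homography(H, pt):
    """Map a 2-D point through a 3x3 homography H (a list of lists).
    The point is lifted to (x, y, 1), multiplied by H, then divided
    by the resulting w to come back to 2-D."""
    x, y = pt
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xs / w, ys / w)

# Identity homography leaves points where they are
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(apply_homography(I, (10, 20)))  # (10.0, 20.0)

# Scale by 2 and shift x by 1
H = [[2, 0, 1], [0, 2, 0], [0, 0, 1]]
print(apply_homography(H, (3, 4)))    # (7.0, 8.0)
```

The “transparent phone” renderer would recompute H every frame from the eye position and phone orientation, then warp the back-camera frame through it.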
With the maths done, we could perhaps get what we need, but some other problems remain:
- 1. What most people see is not what Mike sees. Most people have two eyes, and what we think we see is actually a composite built by the brain. Maybe with that little phone screen at a proper distance from the eyes, our left eye and right eye will not see very different images, but to some degree, what we see will still not look real.
- 2. Even the most advanced digital camera is much worse than the human eye, and combined with the limitations of the screen, what we see will look somewhat unreal.
- 3. As the graph above shows, what our eyes see lies between the two black lines, what the phone camera captures lies between the two blue lines, and what the phone needs to render on screen is everything between the red lines behind the phone. Unfortunately, blind zones exist, such as the red zones just behind the phone, so objects there will not be rendered on screen. (Most likely we could not even see our own hand holding the phone, which would make us unhappy. Really transparent?)
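Problem 3 can be put in numbers with two similar-triangle formulas. The figures below (screen width, eye distance, camera field of view) are invented for illustration; the point is only that just behind the phone, the screen needs to show more scene than the camera can possibly capture:

```python
import math

def needed_width(screen_w, eye_dist, z):
    """Width of scene, at depth z behind the phone, that the screen must
    display for the phone to look transparent (rays from the eye through
    the screen edges, by similar triangles)."""
    return screen_w * (eye_dist + z) / eye_dist

def camera_width(fov_deg, z):
    """Width the back camera actually captures at depth z,
    for a given horizontal field of view."""
    return 2 * z * math.tan(math.radians(fov_deg) / 2)

# 6 cm screen, eye 30 cm away, object 1 cm behind the phone,
# hypothetical 60-degree camera FOV:
print(needed_width(0.06, 0.30, 0.01))  # ~0.062 m needed on screen
print(camera_width(60, 0.01))          # ~0.0115 m actually captured
```

Near z = 0 the camera’s cone collapses to a point while the screen still occludes a full screen-width of scene, so the blind zone (your own hand, for instance) is unavoidable with a single back camera.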
That is all I thought of, and my curiosity drove me to search for this in the App Store. Amazingly, there is an app named “Transparent Screen” there, priced at $0.99. I figured that if someone had really done all of the above, that price would be well deserved, so I made my payment, downloaded it… and opened it…
WHAT THE F*CK IS THAT!!!
It is cheating: it just pastes a fake hand onto a picture captured by the camera.
Since it is hard to get a refund, the only thing I can do is enjoy it.