With the advent of the iPhone 16 (non-Pro version), Apple stacked the two lenses vertically instead of on a diagonal so you can shoot spatial photos. Like your two eyes (assuming you have two functional eyeballs), shooting with two lenses spaced apart adds depth information to what you see (or photograph). Sure, the two lenses on the iPhone aren't spaced as far apart as your eyes, but it's still enough separation to capture a stereo pair and build a depth map from it.
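(Nerdy aside: the depth trick is plain old triangulation... a nearby object shifts more between the two lenses than a faraway one, and depth falls out of that shift. Here's a toy sketch of the math, with completely made-up numbers, since Apple doesn't publish the exact focal length or lens spacing:)

```swift
// Toy illustration of stereo triangulation: how two lenses spaced apart
// yield depth. All numbers below are made-up assumptions, not Apple's specs.
import Foundation

/// Depth (in meters) from stereo disparity: z = f * B / d
/// - f: focal length in pixels
/// - baseline: distance between the two lenses in meters
/// - disparity: horizontal pixel shift of a feature between the two images
func depthFromDisparity(focalLengthPixels f: Double,
                        baselineMeters baseline: Double,
                        disparityPixels disparity: Double) -> Double {
    return f * baseline / disparity
}

// Hypothetical values: ~2600 px focal length, ~2 cm lens spacing.
// A feature shifted 50 px between the two images would be about 1 m away.
let z = depthFromDisparity(focalLengthPixels: 2600, baselineMeters: 0.02, disparityPixels: 50)
print(String(format: "Estimated depth: %.2f m", z))  // ≈ 1.04 m
```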
The Pro iPhones have had aligned lenses for a while, but until the new 16 Pro you could only shoot spatial video. Now you can shoot spatial photos as well. These photos are meant to be displayed on Apple's astronomically expensive $3,500 Vision Pro.
Apple gives you no way of viewing spatial images on your iPhone. So if you don't have a Vision Pro, the only way I know to see them in 3D is, unbelievably, to post them to Facebook, which has been able to create 3D photos for a while. It's just that the depth data the iPhone embeds makes them better than usual, since depth doesn't have to be guessed from a flat image.
But having to go through FACEBOOK?!? This is really fucking shitty of Apple.
They could easily build 3D viewing into the iPhone's Photos app... drag your finger on the screen to change the angle... or pivot your iPhone to change it. It's about the biggest no-brainer there is, but nope. Apple demands you spend an additional $3,500 to view what you shot on your iPhone.
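To show just how little work the interaction would be, here's a minimal SwiftUI sketch of the drag-to-change-angle idea. It only tilts a flat photo (the asset name is a placeholder); a real spatial viewer would re-project pixels using the depth map, but the gesture plumbing really is this simple:

```swift
// Minimal SwiftUI sketch of the drag-to-change-angle idea. This just tilts
// a flat image; a real viewer would re-project pixels using the depth map.
// "spatial-test" is a placeholder asset name.
import SwiftUI

struct AngleViewer: View {
    @State private var tilt: CGSize = .zero

    var body: some View {
        Image("spatial-test")
            .resizable()
            .scaledToFit()
            // Map horizontal drag to rotation about the y-axis,
            // vertical drag to rotation about the x-axis.
            .rotation3DEffect(.degrees(Double(tilt.width) / 15),
                              axis: (x: 0, y: 1, z: 0))
            .rotation3DEffect(.degrees(Double(-tilt.height) / 15),
                              axis: (x: 1, y: 0, z: 0))
            .gesture(
                DragGesture()
                    .onChanged { tilt = $0.translation }
                    .onEnded { _ in withAnimation(.spring()) { tilt = .zero } }
            )
    }
}
```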
So to share them here, I took some test shots of my cats, uploaded them to Facebook, then screen-recorded my mouse dragging around to change the angle so you can see the 3D effect of the spatial images.
The result is very good (no thanks to Apple). When you play this shot of Jake showing off his junk, note how his foot in the air and the folds in the blanket have really good separation...
This shot of Jake is both impressive and a big ol' mess at the same time. The LiDAR sensor didn't register his whiskers, so they are merged with the background and look nuts. But the rest of him? How cool is this?
Shame about the whiskers. Maybe in the future Apple will add an improved LiDAR sensor to pick up smaller details. I wonder if there's a way to manually edit the depth map so I could paint the whiskers back in? Something I should look into...
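For what it's worth, Apple does expose a photo's depth/disparity map to developers through ImageIO and AVFoundation, so pulling it out for hand-editing looks doable. Here's a rough sketch of just the extraction step, assuming the map is stored as an auxiliary image the way portrait photos do it... writing an edited map back into the file is the harder half, and I haven't tried any of this:

```swift
// Sketch of pulling the depth/disparity map out of a photo with ImageIO,
// as a first step toward hand-editing it (e.g., painting whiskers back in).
// Writing an edited map back into the file is more involved and not shown.
import AVFoundation
import CoreImage
import ImageIO

func extractDepthMap(from url: URL) -> CIImage? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }

    // Portrait/spatial photos usually carry disparity; some carry depth.
    let types = [kCGImageAuxiliaryDataTypeDisparity, kCGImageAuxiliaryDataTypeDepth]
    for type in types {
        if let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, type) as? [AnyHashable: Any],
           let depthData = try? AVDepthData(fromDictionaryRepresentation: info) {
            // A grayscale image of the map: for disparity, white = near, black = far.
            return CIImage(cvPixelBuffer: depthData.depthDataMap)
        }
    }
    return nil
}
```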
Do I want an Apple Vision Pro so I can see my spatial photos in real 3D? Of course I do. But do I have $3,500 to spend on one? No I do not. Hence Facebook. My guess is that Apple will eventually release an Apple Vision (minus the Pro) which is actually affordable. Until then I'm hoping that Apple (or a third party) will at least give us a way of looking at 3D images on our iPhones somehow. It's fucking embarrassing that Apple hasn't done something already.
Though given some of the curious decisions they made with my new iPhone 16 Pro Max, that's pretty much Apple in a nutshell.