Final Cut Pro 11 includes Spatial Video editing features, but it assumes the files are properly aligned and good to go.
(It allows convergence adjustment, but has no tools for correcting vertical, rotational, or size disparity.) Folks like me using two separate cameras will need another solution. (I'm using the discontinued Dashwood Cinema Stereo Toolbox LE.)
The Apple Compressor app allows converting separate left and right files into the Spatial Video format/container, but both tracks have to start at the same time and be the same length. That's fine if you have a professionally synced and perfectly aligned 3D camera setup, but not so much for what I'm doing, where each file is started manually (although now synced by timecode).
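If the clips are close but not identical, a little scripting can pre-trim them before Compressor sees them. Here's a minimal sketch, assuming ffmpeg is installed and that you've already worked out each clip's offset to a common sync point; the file names, offsets, and duration are placeholders.

```python
# Rough sketch: trim left/right clips to a common start and duration
# before handing them to Compressor. Assumes ffmpeg is on the PATH and
# that the offsets (seconds from each file's start to the sync point)
# were worked out beforehand. File names and numbers are placeholders.
import subprocess

clips = {
    "left_trimmed.mov":  ("left_cam.mov",  2.35),   # (source file, offset to sync point in seconds)
    "right_trimmed.mov": ("right_cam.mov", 0.80),
}
duration = 95.0  # common length in seconds; neither source may be shorter than this

for out_name, (src, offset) in clips.items():
    subprocess.run([
        "ffmpeg", "-y",
        "-ss", f"{offset:.3f}",   # seek to the sync point
        "-i", src,
        "-t", f"{duration:.3f}",  # keep the same duration for both eyes
        "-c", "copy",             # stream copy is fast but snaps to keyframes;
                                  # re-encode (e.g. to ProRes) if you need frame accuracy
        out_name,
    ], check=True)
```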
For side-by-side files straight out of the Qoocam Ego (and, I assume, the XREAL Beam Pro), just import the clips as normal.
Select the clip(s), and in the Inspector properties set Stereoscopic Conform to "Side By Side".
Create a spatial video project.
In the Viewer, choose View > Show Stereoscopic as Anaglyph.
Once the clip is on the timeline, there is a "Stereoscopic Convergence" adjustment. Put on anaglyph glasses to dial it in.
One cool thing is that you can now stabilize SBS clips directly in Final Cut Pro.
When your masterpiece is complete, you can export the video as full SBS (using any of the codecs) or as Spatial Video (MV-HEVC) for the Apple Vision Pro or any device that natively plays the MV-HEVC format.
Pretty nice! I just wish they had more support for two-camera setups.
If you are only doing straight Spatial Videos from the iPhone, then the new workflow is great.
Someone recommended I check out the Atomos UltraSync Blue as a way to synchronize a pair of iPhones for 3D video capture.
The UltraSync Blue generates timecode that is passed via Bluetooth to up to six devices (iPhones and supported audio recorders).
The Mavis Pro Camera app reads that timecode and embeds it in the video clips.
It makes editing in Final Cut Pro X so much easier, as I don't have to manually massage the clips. (The sync is frame perfect, I think.)
Downsides? It maxes out at 30fps timecode; it cannot sync faster than that. The UltraSync can let the Mavis Pro Camera app know what frame rate you have selected.
However, you can choose to shoot at 60fps and only take the timecode from the box.
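To make that concrete, here's the arithmetic for turning two start timecodes into a frame offset, assuming non-drop-frame timecode. The timecode values are made up, and this is just a sketch of the math behind the 30fps-timecode / 60fps-recording combination, not anything the Mavis app or Final Cut runs for you.

```python
# Sketch of the timecode arithmetic for lining up two clips by hand,
# assuming non-drop-frame timecode (drop-frame needs extra handling).
# The start timecodes below are made-up examples.

def tc_to_seconds(tc: str, tc_fps: int) -> float:
    """Convert HH:MM:SS:FF timecode to seconds at the timecode frame rate."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return hh * 3600 + mm * 60 + ss + ff / tc_fps

# The UltraSync Blue timecode maxes out at 30fps, but the clips themselves
# can be recorded at 60fps; the offset in seconds is the same either way.
left_start  = tc_to_seconds("01:02:10:15", tc_fps=30)
right_start = tc_to_seconds("01:02:12:03", tc_fps=30)

offset_seconds = right_start - left_start
offset_frames_at_60 = round(offset_seconds * 60)  # slip needed at the 60fps recording rate

print(f"Right camera started {offset_seconds:.3f}s after the left")
print(f"That's a {offset_frames_at_60}-frame slip at 60fps")
```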
The iPhone 6s stutters trying to record 4K 30fps with the Mavis app. (Update: 4K 24fps at 30 Mbit works, but not 4K 30fps.)
One bug with the older version on the iPhone 6s: it prefers 23.98 over straight 24p(?). (It's best to manually select the frame rate in the app, not on the UltraSync Blue.) You don't want to mix it up and have the phones record at different frame rates.
The only iOS video recording app that supports the UltraSync Blue is Mavis Pro Camera. I wish more recording applications would support it.
(Unrelated, but the Blackmagic Camera app seems to have some synchronization features built in; I do not have another iPhone to test with.)
Here is some sample footage with a pair of iPhone 6s running the Mavis Pro Camera app.
The shutter speed is locked on both phones.
Strobing is an issue with higher shutter speeds, so ND filters should be used outside to smooth it out.
At 1080p 60fps with matched shutters on both phones, it works great.
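For reference, the ND math is simple, assuming you're aiming for roughly a 180-degree shutter; the bright-daylight shutter speed below is just an example number.

```python
# Rough arithmetic behind the ND-filter advice, assuming you want roughly
# a 180-degree shutter (shutter time = 1 / (2 * frame rate)) to avoid strobing.
# The "sunny day" shutter speed below is just an example.
import math

fps = 60
target_shutter = 1 / (2 * fps)        # 1/120s for 60fps
sunny_shutter = 1 / 1000              # what the phone might pick in bright sun

# Slowing the shutter from 1/1000 to 1/120 lets in ~8x more light,
# so you need about 3 stops of ND to compensate.
stops_of_nd = math.log2(target_shutter / sunny_shutter)
print(f"Target shutter: 1/{round(1 / target_shutter)}s")
print(f"ND needed: about {stops_of_nd:.1f} stops (ND8 is 3 stops)")
```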
My biggest issue is that I don't want the ultra wide fisheye look.
Yes, you can defish the video in editing, or record in LINEAR mode and save a lot of money, but I was looking at other options.
One option is to swap out the lens on the GoPro. Earlier GoPros use M12 lenses, I believe.
(Newer GoPros make this impossible without a spendy conversion, at which point you might as well have bought a proper camera.)
The lens on the GoPro Hero 7 Black unscrews right off, so I searched the web and found that the 8K Production Store on AliExpress has a 3.4mm no-distortion lens for the Hero 7 Black.
The biggest issue after installation is setting focus. (You only have to do this once.)
You have to connect the GoPro via HDMI to a monitor and slowly focus the lens. Once it's dialed in, that's it.
Here's a link to the item. (Not an affiliate link; I spent my own money on the lenses.)
This seems like a very good candidate for a 3D video rig. With one camera flipped and the metal strap ring removed, it appears to give you a lens separation of 72mm.
It only does 1080p video, but it's an affordable camera. Not bad. I would like 4K, though, since 1080p wouldn't be an improvement over my current Nikon P330 setup.
If set up side by side, it appears the lens separation is 105mm.
I usually look at cameras online and check their dimensions.
Here is one: the Canon EOS R100, an affordable mirrorless. I bring an image of the camera into Photoshop and do some measuring after mocking up a configuration.
4K recording is cropped, so that might be good for hyper-stereo video.
The camera is supposed to be around 4.6 inches wide.
The distance between the two lenses would still be 9.5 inches or so. Hyper!
A straight side-by-side layout is around 115mm between lenses. I might as well stick with my Panasonic G7s. I think the GoPro with the modified lens is still better.
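For what it's worth, here's the back-of-the-envelope math behind those numbers, using the rough 4.6-inch width above and assuming the lens sits near the middle of the body (only approximately true for the R100).

```python
# Back-of-the-envelope interaxial math from published camera dimensions.
# Assumes the lens sits roughly in the middle of the body; the width
# comes from the rough 4.6-inch figure above.

MM_PER_INCH = 25.4

body_width_in = 4.6
body_width_mm = body_width_in * MM_PER_INCH   # ~117mm

# Two bodies side by side, same orientation, lenses centered:
# the lens centers end up roughly one body-width apart.
side_by_side_mm = body_width_mm
print(f"Body width: {body_width_mm:.0f}mm")
print(f"Side-by-side interaxial (centered lens): ~{side_by_side_mm:.0f}mm")

# For comparison, the 9.5-inch "hyper" arrangement mentioned above:
hyper_in = 9.5
print(f"Hyper arrangement: ~{hyper_in * MM_PER_INCH:.0f}mm")
```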
I've also been looking at a Mokose 4K camera online.