Welcome back to our 3D Viewer series. In this episode, we interviewed our CTO Felix Chang for a review of the new iPhone 12 Pro & Pro Max and his thoughts on how it could influence the VR/AR industry.
Q1: What were you most excited about and what’s your first impression?
Prior to the launch, I was really excited about the big advancement in the camera, and also the fact that it's their first 5G phone. During the product launch event, they mentioned that the iPhone 12 Pro Max incorporates a sensor-shift image stabilisation system, as found in many dedicated cameras, and can record video with Dolby Vision HDR at up to 60 fps. So I had the feeling that Apple is targeting not just ordinary consumers, but also professionals. In terms of design, the Pacific Blue color looks really high quality. I also like the squared-off edge design; it's easier to hold. The only downside is that it picks up fingerprints easily.
Q2: What do you think about its camera?
In terms of photography, I like the wide-angle lens and the low-light performance, but I wouldn't say I was super amazed by it. I'm more stoked about its video recording: footage captured with just the phone and uploaded directly to YouTube already looks really professional. I have to say that photo capture is not better than the Huawei Mate 40 Pro's, but in video recording it definitely outperforms it.
Q4: How is the performance of the phone?
Performance is better than the previous model's. However, the battery capacity (mAh) is lower than the iPhone 11's, so with normal usage it can last around 17-18 hours; if you run heavy applications or play games, that's not enough.
Q5: Can you explain more about its LiDAR sensor and why it is a valuable feature?
Many smartphones have depth sensors; what's special about Apple's is that instead of indirect time-of-flight (iToF), it uses direct time-of-flight (dToF). The difference is that this LiDAR approach is more accurate, which is why the iPhone 12 Pro & Pro Max are well suited for interior design: their accuracy supports content generation. AR applications are also enhanced, because with the depth information already detected, virtual elements fit the floor and walls naturally. (For more explanation about LiDAR, please refer to our previous episode here.)
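To make the dToF vs. iToF distinction concrete, here is a minimal sketch (purely illustrative, not Apple's implementation): direct ToF times a light pulse's round trip, while indirect ToF infers distance from the phase shift of a continuously modulated wave.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def direct_tof_distance(round_trip_seconds: float) -> float:
    """Direct ToF (LiDAR-style): time the pulse's round trip, halve it."""
    return C * round_trip_seconds / 2.0

def indirect_tof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Indirect ToF: distance from the phase shift of a modulated wave.
    Note the range is ambiguous beyond c / (2 * mod_freq_hz)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# A pulse returning after ~13.34 ns corresponds to roughly 2 m:
print(round(direct_tof_distance(13.34e-9), 2))  # → 2.0
```

Measuring nanosecond round trips directly is hard, which is why dToF hardware is less common in phones; the payoff is per-pulse accuracy without the phase-wrap ambiguity of iToF.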
Q6: How do you think it will affect the VR/AR industry in general?
Previously, we needed an expensive device (around US$23,000) or Google Tango in order to produce AR content. The iPhone achieves it in such a lightweight way, without hurting power consumption, and that's amazing. Nowadays everyone has a phone; when people can easily capture three-dimensional space with their own device, generating AR/VR content becomes much easier.
Q7: How can it be used for a virtual tour?
Given the iPhone's advantages in measuring space and remodeling applications, we are now working on using the wide-angle lens to capture panoramas and combining that spatial and depth information to create a 3D virtual tour.
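The core geometric step behind this idea can be sketched as follows. This is a hypothetical helper for illustration (not our production pipeline): it maps one pixel of an equirectangular panorama, together with a depth reading for that pixel, to a 3D point around the camera.

```python
import math

def panorama_pixel_to_3d(u: float, v: float, width: int, height: int,
                         depth_m: float) -> tuple:
    """Back-project an equirectangular panorama pixel (u, v) with a
    depth reading (in metres) to a 3D point (x, y, z) around the camera.
    Illustrative only; a real pipeline also handles sensor alignment."""
    # Pixel coordinates -> longitude (-pi..pi) and latitude (pi/2..-pi/2).
    lon = (u / width) * 2.0 * math.pi - math.pi
    lat = math.pi / 2.0 - (v / height) * math.pi
    # Spherical -> Cartesian, scaled by the measured depth.
    x = depth_m * math.cos(lat) * math.sin(lon)
    y = depth_m * math.sin(lat)
    z = depth_m * math.cos(lat) * math.cos(lon)
    return (x, y, z)
```

Running this over every pixel of the panorama yields a coloured point cloud of the room, which is the raw material for a walkable 3D virtual tour; for the centre pixel, the point lands straight ahead of the camera at the measured depth.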
Q8: Overall how do you feel about the phone and what surprised you the most?
Apple is now targeting professional users, from photographers to designers, and that makes generating content much more efficient. The only downsides are lens flare, which is quite annoying when shooting outdoors, and the battery life. (So remember to bring a portable charger when you plan on running heavy applications all day.) But overall, I'm really impressed and excited to combine it with different VR/AR applications.
Clearly we have a lot to look forward to; this is definitely a big jump forward. In the next episode, we will demonstrate how we combine the captured panoramas with the depth data to create a 3D virtual tour. Stay tuned!