It’s been almost a month since Apple announced the iPhone 12 series, and two of the four models include a LiDAR sensor. The iPhone 12 mini and the standard iPhone 12 have good cameras, but the so-called Pro models have all the camera features you’d ever want in a phone.
I specifically want to talk about the LiDAR present in the iPhone 12 Pro, iPhone 12 Pro Max, and the iPad Pro. LiDAR stands for Light Detection and Ranging. It determines the distance between itself and an object by sending a pulse of light and monitoring how long it takes to bounce back. It is similar to radar, except instead of radio waves, LiDAR uses light.
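The time-of-flight idea is simple enough to sketch in a few lines of code. This is a simplified illustration of the principle, not Apple's implementation, and the example pulse timing is a made-up value:

```python
# Time-of-flight distance estimation: light travels to the object and back,
# so the one-way distance is (speed of light * round-trip time) / 2.
SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Return the estimated one-way distance in meters for a light pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A hypothetical pulse returning after roughly 33 nanoseconds corresponds
# to an object about 5 meters away -- around the iPhone sensor's quoted range.
print(round(distance_from_round_trip(33.36e-9), 2))  # prints 5.0
```

The halving is the key step: the measured time covers the trip to the object and back, so only half of it corresponds to the distance the camera actually cares about.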
More sophisticated versions of LiDAR are used by satellites orbiting our Earth, generating precise 3D information about the shape of the planet’s surface. But it also works at a much smaller scale on a phone, where it can work out distances and object sizes accurately at short range.
How does Apple use LiDAR on the iPhone 12?
Apple uses LiDAR a bit differently: the sensor has a range of around 16 feet (5 meters) in both the iPhone 12 Pro models and the iPad Pro.
The primary purpose of LiDAR in the iPhones and iPad is to improve augmented reality. It gives the device more reliable information about its surroundings, which leads to better AR results.
If you’ve never used AR before, it lets the device’s camera overlay interactive features in apps like Snapchat, or preview how furniture and other objects would fit in your living room.
You might have used AR extensively if you played the once-popular game Pokemon GO. Using AR, the game allows you to capture creatures in the real-world environment.
Ikea has implemented something similar in its app, which lets you preview furniture from the company’s catalog placed in your own home.
The sensor in the iPhones and iPad is not that accurate when measuring large structures like buildings. Sebastiaan de With, who developed the popular iPhone camera app Halide, suggested that it isn’t well suited to buildings, though smaller objects can be captured as accurate 3D models with these devices.
Apple also plans to use LiDAR to improve camera performance in low light. The company introduced phase-detect autofocus with the launch of the iPhone XS, but that technology depends on available light, so it doesn’t work well in dark environments. Because LiDAR measures distance directly, it can tell the camera where to focus even in the dark. Combined with Night mode, this can improve your iPhone’s ability to capture better pictures at night.
Is LiDAR necessary for all future smartphones?
Presently, only the latest “Pro” models of Apple devices have the sensor, but with AR support built into its software development kit, Apple plans to build more AR products and software around it.
Developers will also be able to push the latest software builds to all devices running iOS, which means your current iPad Pro should be able to do much of what the 2022 iPad will do.
The biggest benefit is that when a company like Apple does this, it has the resources to make plenty of mistakes and correct them. The whole Apple ecosystem can also be optimized through software to give you a more seamless experience.
Apple also has plans to take AR beyond smartphones. According to an article on Macrumors, Apple might introduce AR-enabled glasses.
If this concept becomes reality, you might be able to remodel your house virtually using your iPhone and the glasses together. More devices mean more apps on the App Store that benefit from AR.
But LiDAR also has its downsides
One of the main disadvantages is that the LiDAR sensor doesn’t work well in fog, rain, snow, or dusty conditions. It also struggles to detect a glass wall or door, which is why self-driving car systems that rely on LiDAR pair it with regular cameras. Even the iPad Pro pairs its LiDAR sensor with a 12MP wide-angle camera and a 10MP ultrawide camera.
What other gadgets have a LiDAR sensor?
LiDAR is popping up everywhere now, thanks to improvements in VR and AR tech. It is already used in self-driving cars, robots, and drones. Microsoft’s HoloLens 2 augmented reality headset uses a similar depth sensor, which maps out a room before layering 3D objects into it.
Even the old Microsoft Xbox console had a depth-scanning camera accessory called Kinect. In fact, PrimeSense, the company that helped Microsoft develop that tech, was acquired by Apple in 2013. Since then we’ve seen face-scanning depth cameras and now rear LiDAR sensors in Apple devices.
Hello, my name is Parth Suthar. A background in engineering and a never-ending interest in technology drove me to start this blog, where I cover everything from leaks about upcoming smartphones to tips and tricks. Reach out to me if you have any questions.