
Apple’s Deep Fusion comes to iOS 13 dev beta, said to make iPhone 11 camera better

Apple Deep Fusion camera is coming soon to the new iPhone 11, iPhone 11 Pro and iPhone 11 Pro Max smartphones.

tech Updated: Oct 02, 2019 14:48 IST
HT Correspondent
Hindustan Times
iPhone 11’s Deep Fusion camera comes to iOS 13 dev beta (Bloomberg)

Apple showcased a new camera technology called Deep Fusion at its iPhone 11 launch event. Powered by the Neural Engine of the A13 Bionic chip, Deep Fusion leverages the latest machine learning algorithms to deliver high-quality images. The technology now appears ready for a wide rollout, as it has made its way to the latest iOS 13 developer beta.

Apple’s new iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max rely on Smart HDR to combine multiple images and deliver a photo that’s rich in detail and dynamic range. The technology allows the camera to automatically detect when it’s too dark and switch on Night Mode. Deep Fusion is said to take this to a new level. Apple says the new feature will process photos pixel by pixel, optimise for texture, and enhance the details in each photo.

The Verge further explains how the new Deep Fusion camera technology works. The first step involves preparing four frames at a fast shutter speed to freeze motion in the picture; this happens before you even tap the shutter button. Once you’ve tapped the shutter button, the camera takes a long-exposure photo to add detail.
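The pre-capture step described above can be sketched with a rolling buffer: the camera continuously retains its most recent short-exposure frames, so that four are already on hand the instant the shutter is tapped. This is a conceptual illustration only, not Apple’s implementation; the class and method names are hypothetical.

```python
from collections import deque

# Hypothetical sketch of Deep Fusion's pre-capture step: keep a small
# rolling buffer of short-exposure frames so that, by the time the user
# taps the shutter, four recent frames are already available.
class FrameBuffer:
    def __init__(self, size=4):
        # deque with maxlen silently drops the oldest frame when full
        self.frames = deque(maxlen=size)

    def on_new_frame(self, frame):
        # Called continuously while the camera preview is running.
        self.frames.append(frame)

    def on_shutter_tap(self, long_exposure_frame):
        # The pre-captured short frames plus one long exposure
        # form the input to the fusion step.
        return list(self.frames), long_exposure_frame

buf = FrameBuffer()
for i in range(10):          # preview runs; only the last 4 frames survive
    buf.on_new_frame(f"short_{i}")
shorts, long_exp = buf.on_shutter_tap("long_0")
```

Because the buffer is bounded, tapping the shutter costs no extra wait for the short frames: they were captured before the tap, exactly as The Verge describes.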


Next, Apple creates a “synthetic long” by combining three regular frames with the long-exposure shot, unlike how Smart HDR works. Deep Fusion then kicks in to merge the short-exposure image with the synthetic long shot, optimising the result pixel by pixel to increase detail. The final image is then produced.
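The pixel-by-pixel merge can be illustrated with a toy weighted blend: where the sharp short-exposure frame shows strong local detail, trust it more; elsewhere, lean on the cleaner synthetic long. This is a simplified sketch under assumed behaviour, not Apple’s actual algorithm, and the detail measure here is deliberately crude.

```python
# Conceptual sketch (not Apple's algorithm): fuse a sharp short-exposure
# frame with a "synthetic long" frame pixel by pixel, weighting toward
# the short frame where local detail (edge contrast) is high.
def local_detail(img, x, y):
    # Toy detail measure: absolute difference from the left neighbour.
    return abs(img[y][x] - img[y][x - 1]) if x > 0 else 0

def fuse(short, synthetic_long):
    h, w = len(short), len(short[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # More detail in the short frame -> trust it more at this pixel.
            weight = min(1.0, local_detail(short, x, y) / 255.0)
            out[y][x] = weight * short[y][x] + (1 - weight) * synthetic_long[y][x]
    return out

# Tiny 2x3 grayscale example: the short frame has a bright detail at (0, 1).
short = [[10, 200, 10], [10, 10, 10]]
long_ = [[20, 20, 20], [20, 20, 20]]
result = fuse(short, long_)
```

In flat regions the output follows the low-noise synthetic long, while high-contrast pixels keep the short frame’s values, which mirrors the texture-preserving behaviour the article attributes to Deep Fusion.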

According to CNET, Deep Fusion-enabled processing takes roughly a second to complete. The technology is also capable of identifying in-photo content such as sky, textures, and even finer details such as hair.

While Deep Fusion won’t work in burst mode, it is likely to come in handy in low-light and dim settings. In brighter settings, the Apple iPhone 11 and iPhone 11 Pro will continue to use Smart HDR. The telephoto lens on the iPhone 11 will rely on the new technology, whereas the ultrawide sensor will use Smart HDR.