This post applies to the Project Indigo iOS camera app.
Adobe Labs is excited to share an early look at Project Indigo, an iPhone camera app we've started to develop, to get feedback from the photography community. The app offers full manual controls, a more natural ("SLR-like") look, and high image quality in both JPEG and raw formats. It also introduces some new photographic experiences not available in other camera apps. For more information on the underlying technology, please refer to this Project Indigo blog post.
Before you start with Project Indigo
Recipes for success when using Project Indigo
To get the most out of images captured with the app, follow these guidelines:
Sending feedback
Please try the app and share feedback in this community forum thread. If you report a problem you encountered, it would help to include details like which device you are running Project Indigo on, what kind of scene you were trying to capture, what you were trying to achieve with the camera, and as much information as possible about what you like or do not like about the resulting photo quality. Our team will continually monitor this thread to track issues and improve future experiences.
To improve Project Indigo's performance and results, please include examples of images that do not meet your expectations with your report. A wide variety of file formats is allowed as attachments in these forum posts; the best option is to attach your image's raw file directly to your feedback post. Note that there is a 50 MB limit on an attachment's file size. If your raw file is too large to attach, share it via a file-sharing service (Dropbox or similar) and include the link in your feedback post. Thank you for continuing to provide feedback on the Project Indigo camera!
Posted by: Boris Ajdin, Product Manager, NextCam
Would it be possible for Indigo to still do 10x digital zoom, since the sensor is quite large vs the previous gen, and then upscale it back to 12MP? I do not know how hard this is to achieve, but Topaz upscaling does quite an amazing job with things of that nature.
By @nhan_8084
This is not about sensor size, but about resolution. With 10x digital zoom, we get only 0.12MP to work with... one cannot upscale that more than 2x (i.e., to 0.5MP) without hallucinating. Topaz upscaling uses hallucination for high upscale ratios, and that is OK when you are editing because you can always undo. But for a camera we do not (at least not yet) hallucinate pixels we are not confident should be there.
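The arithmetic behind that reply can be sketched in a few lines. The 12 MP sensor figure comes from the thread; the helper names are mine:

```python
# Sketch of the resolution arithmetic behind digital zoom.
def cropped_megapixels(sensor_mp: float, zoom: float) -> float:
    """Digital zoom crops the sensor: an N-x zoom keeps 1/N of each
    linear dimension, so only 1/N^2 of the pixels survive."""
    return sensor_mp / (zoom ** 2)

def linear_upscale_needed(source_mp: float, target_mp: float = 12.0) -> float:
    """Linear upscale factor needed to reach the target pixel count."""
    return (target_mp / source_mp) ** 0.5

mp_at_10x = cropped_megapixels(12.0, 10.0)      # 0.12 MP remain at 10x
factor = linear_upscale_needed(mp_at_10x)       # 10x linear upscale needed
print(f"{mp_at_10x:.2f} MP left; reaching 12 MP needs {factor:.0f}x linear upscale")
```

A 2x linear upscale of that 0.12 MP crop gives only about 0.5 MP, which matches the limit stated in the reply.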
I am a new user to Indigo, having just upgraded from an old iPhone XR to the iPhone 17 Pro. I have started testing using the iPhone on a tripod, comparing the native phone app to Indigo. On a 14mm photo, I noticed a subtle white band in the sky. The shape of a rainbow, but not a rainbow, just white. Is there a way of sharing the file with Adobe that is not public?
On a 14mm photo, I noticed a subtle white band in the sky... Is there a way of sharing the file with Adobe that is not public?
By @Sweet Light
Thank you for reporting this - we are aware of some problems on 17-series devices with lens vignetting calibration and are working with Apple to resolve them. It is possible that the problem is fixed in iOS 26.1, which we are trying to verify. As for sharing files privately, I will look into that, as we currently do not have a mechanism for it. As you can imagine, we cannot publicly share team members' emails.
I am a new user to Indigo, having just upgraded from an old iPhone XR to the iPhone 17 Pro. I have started testing using the iPhone on a tripod, comparing the native phone app to Indigo. Super Resolution tests: at 200 mm Indigo gives better detail than the Apple app. However, at 48 mm, Indigo smears fine detail. The Apple app has much better detail at 48 mm (12 mp file). I will do more tests to see if these results are typical for the 200 mm and 48 mm settings.
Just a note: I have personally tested SR at 2x and 8x (17 Pro Max) and at 10x (15 Pro Max), and it causes severe mushing and over-denoising, along with artifacts of incomplete imaging (merging and alignment issues), if motion is detected in the scene. Even very slight wind movement of foliage causes mush in the JPEG AND the raw. I have brought this up with the team and they will work on improving the AI algorithm.
Super Resolution tests: at 200 mm Indigo gives better detail than the Apple app. However, at 48 mm, Indigo smears fine detail...
By @Sweet Light
Thank you for doing such rigorous testing. Please note that for Super-Resolution to work, there is an expectation that the device is *not* perfectly stable (i.e., that it is not on a steady tripod). When there is a small amount of hand-shake (low frequency enough not to introduce motion blur in each individual frame), we "extract" the extra information that is essentially hidden in each raw image to do the 2x upscale without hallucination. If you put the camera on a tripod, the algorithm's ability to extract such information may be reduced.
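The sub-pixel idea described above can be shown with a toy 1-D shift-and-add sketch (noise-free, and not Indigo's actual algorithm): frames captured at different sub-pixel offsets sample different phases of the scene, and interleaving them recovers the finer grid.

```python
import numpy as np

# Toy 1-D illustration of shift-and-add multi-frame super-resolution.
rng = np.random.default_rng(0)
hi = rng.random(64)                      # "scene" sampled at 2x resolution

def capture(offset: int) -> np.ndarray:
    """Simulate one low-res frame: shift the scene by `offset` hi-res
    samples (half a low-res pixel per unit), then decimate by 2."""
    return np.roll(hi, -offset)[::2]

# Hand shake yields frames whose sub-pixel phases differ.
frame_a = capture(0)                     # phase 0
frame_b = capture(1)                     # phase 0.5 (in low-res pixels)

# Shift-and-add: interleave the phase-shifted frames onto the 2x grid.
recon = np.empty(64)
recon[0::2] = frame_a
recon[1::2] = frame_b
assert np.allclose(recon, hi)            # hi-res signal recovered exactly
```

With the phone locked down on a tripod, both frames would have the same phase, and the second frame would add no new samples, which is the effect described in the reply.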
There are various reasons for using a tripod or monopod. Perhaps Indigo's algorithm could recognize that there is no camera shake, and therefore that it is on a tripod, and take that into account. Or there could be a simple toggle for handheld vs. tripod, which Indigo would take into account when performing SR.
In case you missed it, when Indigo launched back in June it was accompanied by a blog post that goes into some detail on the technology behind the app: LINK . There is a section on super-resolution that is relevant to your questions, but the whole blog is quite informative and more detailed than large companies typically disclose.
This technique needs these small shakes to capture the extra information used for SR; they are an inherent part of how SR works, so this would not help at all.
Just bought a 16 Pro and have been testing the Indigo camera app. Here are my findings:
Pros:
Fantastic low-light performance; great landscape photos
Excellent dynamic range and detail
No other app currently delivers such impressive 2× and 10× (SR) shots
Clean design
Quick jpg development option in the gallery when shooting DNG only
Cons:
Touch-to-focus is broken; the camera always chooses the focus point on its own
No AE/AF lock
No magnified focus assist in manual mode
No portrait mode
The interface needs work and optimization:
No way to hide the histogram
Too many taps to access exposure compensation
No customizable/favorite buttons (the histogram area could be repurposed for this)
The “Camera” button in the top-right corner is pointless (repeats Photo/Night switch?)
Space left and right of the Photo/Night switches could be used for custom buttons
The app could really use a proper icon (Pi?)
Overall, it’s a great app to experiment with and has huge potential. But in its current state, there’s no way it can replace the stock camera app as the primary shooter.
I never understood this obsession with megapixels.
I'd take color rendition over megapixels any day, and in that regard Indigo reminds me of Foveon color rendition (the Sigma DP2 being a prime example).
I agree that color reproduction is essential no matter the megapixel count. Indigo definitely provides better color reproduction than the Apple camera app, particularly in the shadow areas. Whether megapixels matter depends on your final output. For social media, or up to around a 16 x 24 inch print, I find 12 MP can be sufficient. Above those sizes, the 48 MP mode can provide noticeable detail that is lacking from the 12 MP files. According to Adobe, the computational photography that gives Indigo its improved color reproduction is not currently possible with the 48 MP sensor output; hopefully at some point that will change. For now, I know the iPhone 17 Pro at 12 MP or 48 MP is not a substitute for the full-frame 61 MP sensor I normally use for landscape work. However, I only have the full-frame camera with me for planned shoots; I have the iPhone with me all of the time, which opens up a new world of photography opportunities.
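The print-size arithmetic above works out as follows; the 4:3 pixel dimensions (4032 x 3024 and 8064 x 6048) are common iPhone output sizes and an assumption here:

```python
# Pixels-per-inch for a given print size from 12 MP vs 48 MP captures.
def print_ppi(px_long_edge: int, inches_long_edge: float) -> float:
    """Resolution along the long edge of the print."""
    return px_long_edge / inches_long_edge

ppi_12mp = print_ppi(4032, 24.0)   # 12 MP (4032 x 3024) on a 16 x 24 inch print
ppi_48mp = print_ppi(8064, 24.0)   # 48 MP (8064 x 6048) on the same print
print(f"{ppi_12mp:.0f} ppi vs {ppi_48mp:.0f} ppi")
```

At 24 inches, 12 MP lands around 168 ppi, near the lower bound of what is usually considered acceptable for close viewing, while 48 MP doubles that, which is consistent with the comment above.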
Just bought a 16 Pro and have been testing the Indigo camera app. Here are my findings: ...
By @multipraktik
Thank you for sharing your thoughtful pros and cons for Project Indigo. The team is very happy when we get detailed feedback on what our customers think of the app (even if we may disagree at times :)). To go point by point on the cons side:
@BorisTheBlade, how do you think Apple generates the 48MP ProRaw for the main camera? I think the reason they don't allow 48MP single-frame DNGs is that these quad-Bayer sensors cannot output 48MP of both color and luma resolution at the same time: a single frame would be heavily interpolated and look very bad in terms of artifacts, so they need stacking to get the missing color data from other frames. But what if there is a moving subject, or many moving subjects? Do you think super-resolution applied to a full picture (not a crop) on a 12MP Bayer sensor like the 13 Pro can match Apple's 48MP ProRaw in terms of quality and sharpness? What is your team's take on this, especially Marc Levoy's?
@BorisTheBlade how do you think apple generates the 48mp proraw for the main camera? ...
By @powerful_Elixir5E29
Apple captures multiple frames using the full-resolution quad-Bayer sensor and then uses their flavor of computational photography to try to extract a 48MP image out of it. Because they are using a quad-Bayer sensor, each frame does not actually contain a 48MP-equivalent amount of detail, which is why you can sometimes see artifacts in the ProRaw result. We are experimenting with using our Super-Resolution on a burst of 12MP Bayer raw images, and once we are happy with how it operates (quality and speed) we will likely release it so users can try it.
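Why a quad-Bayer sensor natively delivers a quarter-resolution image can be shown with a toy color-filter-array sketch (an illustration of the layout only, not Apple's or Indigo's pipeline): each 2x2 block of photosites shares one color filter, so 2x2 binning yields an ordinary Bayer mosaic, while full-resolution readout has only half the chroma sampling of a true 48MP Bayer sensor.

```python
import numpy as np

# Quad-Bayer color-filter layout and 2x2 binning, as a toy model.
def quad_bayer_cfa(h: int, w: int) -> np.ndarray:
    """Filter layout as a 4x4 tile of 2x2 same-color blocks.
    Codes: 0 = R, 1 = G, 2 = B."""
    base = np.array([[0, 0, 1, 1],
                     [0, 0, 1, 1],
                     [1, 1, 2, 2],
                     [1, 1, 2, 2]])
    return np.tile(base, (h // 4, w // 4))

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Average each 2x2 block into one quarter-resolution pixel."""
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

cfa = quad_bayer_cfa(8, 8)
# Every 2x2 block is uniform, so binning the filter map is lossless and
# the result is a standard RGGB Bayer pattern at half the linear size:
binned_cfa = bin_2x2(cfa.astype(float)).astype(int)
print(binned_cfa)
```

The binned map comes out as alternating R/G and G/B rows, i.e. a normal Bayer mosaic, which is why quarter-resolution output is the sensor's "honest" mode and full-resolution output needs interpolation or frame stacking.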
@BorisTheBlade But if they merge multiple frames, why don't I see any blurriness, even with moving subjects, in these ProRaws? Also, regarding the SR algorithm: I think that in moving areas and high-frequency detail where your alignment fails, you are denoising the image too much, resulting in very soft areas and smears. Do you think SR applied to a 12MP Bayer sensor can ever match the 48MP quad-Bayer ProRaw, even when there are moving subjects? Thanks!
It is frustrating trying to do a systematic evaluation of Indigo because it does not record proper metadata for the 35mm-equivalent focal length. I try to be systematic in evaluating photographic equipment; in the case of Indigo, this means organizing files by the 35mm-equivalent focal length. For some reason Indigo records this important metadata incorrectly. The result is a frustrating amount of time spent cross-referencing files imported to a desktop computer and viewed in Lightroom against the Indigo app's per-file information about what the digital zoom was. For example, I want to organize all of the images shot at 48 mm SR into a single folder so I can compare them to 48 mm images taken with the Apple camera app or a different camera app. Indigo incorrectly shows 24 mm as the 35mm-equivalent focal length for both 24 mm and 48 mm shots, and the same is true of 100 mm and 200 mm shots. If you want a systematic and proper evaluation of the Indigo app, it would be very helpful for the metadata to be recorded accurately. In analysing the iPhone 17 Pro camera apps, I treat the iPhone as three different cameras, and I further differentiate between 24 mm vs 48 mm and 100 mm vs 200 mm photos. This is the basis for evaluating a camera app's ability to record images at the 14 mm, 24 mm, 48 mm, 100 mm, and 200 mm settings.
It is frustrating trying to do a systematic evaluation of Indigo because it does not record proper metadata for the Focal Length 35mm. ...
By @Sweet Light
Thank you for sharing your thoughts. It is definitely clear how this may create difficulties in the workflow you describe... we'll look into the EXIF/DNG tags to see if there is perhaps a better way to note the camera characteristics in the image. This is, however, a bit of a philosophical question: how do you record focal length for a photo that is a combination of the optical properties of the lens AND digital zoom obtained by cropping the sensor? Typically focal length entails not just the FOV of the lens, but also characteristics such as minimum focus distance, depth of field, etc. When digitally cropping, only the FOV changes while everything else remains the same, so we opted to record the optical and digital factors separately. I read many articles that came out when the iPhone 17 was announced (example LINK) criticizing this blurring of the line between claiming a focal length and claiming "optical quality", which goes to the heart of what we are talking about. Happy to hear any suggestions you may have on ways to handle this.
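If the optical and digital factors are recorded separately, the 35mm-equivalent framing can still be reconstructed from the standard Exif tags FocalLengthIn35mmFilm and DigitalZoomRatio. A minimal sketch, with illustrative values rather than actual Indigo output:

```python
# Reconstruct equivalent framing from separately recorded Exif factors.
def effective_35mm_equivalent(focal_35mm: float, digital_zoom: float) -> float:
    """Digital zoom narrows the field of view by the zoom ratio, so the
    35mm-equivalent focal length scales linearly with it."""
    return focal_35mm * digital_zoom

# Example: a 24 mm main camera with a 2x digital crop frames like 48 mm,
# and a 100 mm tele with a 2x crop frames like 200 mm.
print(effective_35mm_equivalent(24.0, 2.0))    # 48 mm framing
print(effective_35mm_equivalent(100.0, 2.0))   # 200 mm framing
```

A cataloging tool that reads both tags could sort files by this computed value even if the recorded focal length itself stays optical-only.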
Thanks for your response. I think the philosophical should give way to the practical. I want to know what equivalent focal length setting a photo was taken with. Indigo and the Apple app both have settings for selecting 48 mm and 200 mm (crops of the 24 mm and 100 mm lenses). If I select 48 mm or 200 mm, then I want the metadata to show what my selection was. Why hide this information from the user? Apple records it, and Indigo should also. This is not a philosophical consideration but a simple practical request that will help photographers understand the capabilities of their iPhone cameras and apps. Then they can use this knowledge to make better photographs.
@BorisTheBlade Apple just released iOS 26.1; will we see an Indigo update with a working selfie camera soon? Also, can you tell us what the focus of the next Indigo update will be?
Took some shots of the moon today and got very strange results. The first two have artifacts, and I noticed that if I align to center, the final image shifts left, so I have to align to the right for the result to end up centered. This is really annoying, and I suspect it has to do with EIS and OIS. Also, the white balance can be seen shifting in the viewfinder between grey and the proper moon color, a tad yellowish white. The blue Indigo image came out even when the viewfinder showed a yellow moon. The first two Indigo shots use Night mode and Pro mode, 1 frame; the blue Indigo shot is 20 frames in the same modes. Stock 20x included for reference.
I did another experiment, adjusting ISO and shutter speed in Night mode and Pro mode, 6 frames, exposing to the right (ETTR) so that the viewfinder clips but the final JPEG and raw are still properly exposed. I want to ETTR so that when I edit the raw in HDR, the SDR base isn't as dark in LRM. I have attached the SOOC JPEG and the post-processed raw, 20x zoom again.
With images still being 12MP, I wonder if an upgrade from a 15 Pro to a 17 Pro would make much of a difference for Indigo alone, regardless of the general improvements to the iPhone camera.
I took the exact same path last month, upgrading from a 15 Pro Max to a 17 Pro Max.
I am heavy into Indigo usage (only shooting raw), and in my opinion, yes, the 17 Pro is a huge upgrade over the 15.
I have been shooting professionally for 40 years, but I still feel that 12 megapixels is more than adequate for many image applications.
For the main lens, the ultrawide, and especially the tele: yes, since the tele now has a much bigger sensor with newer tech in it vs. the old 12MP one. You will get better imaging from the new processing algorithms adapted to that sensor, and I can tell you that Apple's tele JPEG rendering is now much, much better than the typical heavy denoising and oversharpening.