
Introducing the Project Indigo camera app

Adobe Employee
May 23, 2025

This post applies to the Project Indigo iOS camera app. 

 

Adobe Labs is excited to share an early look at Project Indigo, an iPhone camera app we've started developing, to get feedback from the photography community. The app offers full manual controls, a more natural ("SLR-like") look, and high image quality in both JPEG and raw formats. It also introduces new photographic experiences not available in other camera apps. For more information on the underlying technology, please refer to this Project Indigo blog post.

 

Before you start with Project Indigo 

  • We recommend using Project Indigo on iPhone 15 Pro/Pro Max or newer devices.
    (Also supported are 12 Pro/Pro Max, 13 Pro/Pro Max, and all 14-series devices.)
  • You should have at least 1 GB of free storage for the app, the downloadable AI Models inside the app, and your captured photos.

 

Recipes for success when using Project Indigo 

To get the most out of images captured with the app, follow these guidelines:

  • When reviewing the results, focus on Project Indigo's more natural look (in both SDR and HDR). If you haven’t done this before, try viewing the images on your laptop or desktop device, preferably on an HDR screen. 
  • Capture with both JPEG and raw (DNG) file saving enabled. Project Indigo produces computational-photography DNG files, which have the same natural look as its JPEGs but much more latitude for editing after capture.
  • Take control of the camera with the built-in Pro Controls, including controls that are exclusive to a computational camera: Frames to Merge and Merge Method (a toy sketch of frame merging follows this list). These may be intimidating for beginners, but with Project Indigo you can try them for free and nothing will break; you can always reset the settings to 'Auto' and let the camera take back control.
  • Go to the Indigo Labs page and play with the latest innovations our team can offer. These are only available on mobile via Indigo! 
  • Be patient! Project Indigo is doing a lot of heavy lifting under the hood, and it will reward you with great photos. In return, it may ask you for a bit of time to set up captures when needed, and to wait a few seconds for the image processing to finish. 
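For intuition about what Frames to Merge and Merge Method control, here is a minimal sketch of burst merging, referenced from the Pro Controls bullet above. This is not Project Indigo's actual pipeline; the frame count, noise level, and the mean/median merge options are illustrative assumptions.

```python
# Minimal burst-merge sketch -- illustrative only, not Indigo's pipeline.
import numpy as np

def merge_burst(frames: list[np.ndarray], method: str = "mean") -> np.ndarray:
    """Merge a burst of aligned frames into one lower-noise image.

    Averaging N frames reduces zero-mean sensor noise by roughly sqrt(N);
    a median merge trades some of that gain for robustness to outliers
    (e.g., something moving through one frame of the burst).
    """
    stack = np.stack(frames).astype(np.float32)
    if method == "mean":
        return stack.mean(axis=0)
    if method == "median":
        return np.median(stack, axis=0)
    raise ValueError(f"unknown merge method: {method}")

# Simulate a burst: one static scene plus independent per-frame read noise.
rng = np.random.default_rng(0)
scene = rng.uniform(0, 255, size=(64, 64))
burst = [scene + rng.normal(0, 10, scene.shape) for _ in range(8)]

merged = merge_burst(burst, method="mean")
print("single-frame noise:", np.std(burst[0] - scene))  # ~10
print("merged noise:", np.std(merged - scene))          # ~10 / sqrt(8), ~3.5
```

This is also why raising Frames to Merge tends to help in low light (more noise averaging) at the cost of longer capture and processing time, which ties into the "Be patient!" note above.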

 

Sending feedback 

Please try the app and share feedback in this community forum thread. If you report a problem, it helps to include details such as which device you are running Project Indigo on, what kind of scene you were trying to capture, what you were trying to achieve with the camera, and as much information as possible about what you like or do not like about the resulting photo quality. Our team will continually monitor this thread to track issues and improve future experiences.

 

To improve the performance and results of Project Indigo, please forward examples of images that do not meet your expectations to the team with your report. A wide variety of file formats is allowed as attachments in these forum posts; the best option is to attach your image's raw file directly to your feedback post. Note that there is a 50 MB limit on an attachment's file size. If your raw file is too large to attach, share it via a file-sharing service (Dropbox or similar) and include the link in your feedback post. Thank you for continuing to provide feedback on the Project Indigo camera!

 

Boris Ajdin: Product Manager, NextCam 
 
Posted by: Rikk Flohr, Adobe Photography Org
882 Replies
Adobe Employee
2 hours ago
quote

Yes! Much better on the resource usage for my 16 Pro, from my perspective. Macro toggle is nice to have - makes it feel like you have six lenses at your disposal by the time you account for that, native cameras, and the two super res focal lengths. DNG-only mode seems like a worthwhile addition but I shoot mostly JPEGs unless I know I'll be using the Remove Reflections feature, so not quite applicable to me.  

 

Just being greedy with my question above - love to see new things the team is coming up with! 🙂


By @Moonboots22

No problem - just keeping the mood light in the forum. We've also added capturing into a 'Project Indigo' album, so you can find all your Indigo shots more easily in Apple Photos; mirroring of the front camera (yes, this one is really small, but still); and greatly improved quality of the Remove Reflections ML model. But the bulk of the effort went toward performance and stability. To answer your original question more directly: the primary goal is iPhone 17 support, and as soon as that is ready we will ship a new version. Whether any other features make it in will depend on the timing of that release.

Community Beginner
3 hours ago
quote

we need to recalibrate many parameters of the processing pipeline due to small but impactful changes in ultrawide and wide cameras


By @BorisTheBlade

Would you say that there have been opaque changes to those cameras not obvious from the launch keynote / spec sheets, or did Apple incorporate the processing changes they talked about at such a low level that 3rd party apps cannot get around it?

Adobe Employee
2 hours ago
quote

Would you say that there have been opaque changes to those cameras not obvious from the launch keynote / spec sheets, or did Apple incorporate the processing changes they talked about at such a low level that 3rd party apps cannot get around it?

By @nnhuy

Indigo is unlike any other 3rd party camera app out there, since we do all of our own processing from a very low level (including things like the auto-exposure algorithm and Electronic Image Stabilization (EIS)). Those depend heavily on things that go beyond APIs, like sensor and lens characteristics. So it is nothing we cannot handle, but it takes some time, and we would have needed prototype devices quite early to be ready on iPhone 17 launch day. That didn't happen, which is why we are now in the situation where 17-series devices don't work...
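For a feel of what "doing our own processing from a very low level" involves, below is a toy auto-exposure loop. It is not Indigo's algorithm; the mid-gray target, damping exponent, and exposure limits are made-up assumptions for illustration.

```python
# Toy auto-exposure loop -- illustrates the control problem, not Indigo's
# algorithm. The target and limits below are made-up assumptions.
import numpy as np

TARGET_LUMA = 0.18                             # classic mid-gray target (linear, 0..1)
MIN_EXPOSURE, MAX_EXPOSURE = 1 / 8000, 1 / 4   # seconds; sensor-dependent in reality

def next_exposure(frame_linear: np.ndarray, current_exposure: float) -> float:
    """Propose the next exposure time from the current frame's mean brightness.

    Assumes linear (pre-tone-map) pixel values. A real AE also weighs faces,
    highlight preservation, and motion (shorter exposures make EIS and frame
    merging easier), which is where sensor/lens characteristics come in.
    """
    measured = float(np.mean(frame_linear))
    if measured <= 0.0:
        return MAX_EXPOSURE
    # Brightness scales roughly linearly with exposure time, so correct by the
    # ratio, damped (square root) to avoid oscillating from frame to frame.
    ratio = TARGET_LUMA / measured
    proposal = current_exposure * ratio ** 0.5
    return float(np.clip(proposal, MIN_EXPOSURE, MAX_EXPOSURE))

underexposed = np.full((4, 4), 0.05)            # dim metering frame
print(next_exposure(underexposed, 1 / 500))     # suggests a longer exposure
```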

Participant
2 hours ago

Hi @BorisTheBlade, just a question on how the team at Indigo goes about setting image quality. As you are probably aware, most OEM stock cameras are based on machine learning/AI models, and that tends to be hit and miss, with Apple being the most consistent of all when it comes to color science, HDR, and white balance across all lenses. HDR is mostly local tone mapping (LTM) for all the OEMs, with obvious halo/glowing edges and faux sky colors: cyan instead of the true lavender-blue/dark blue of city skies under light pollution, etc. Speaking in regards to JPEGs, of course. Lately, Oppo and Vivo have set new image-quality boundaries with their tunings, thanks to their own custom imaging chips, which render high-frequency detail very naturally instead of the clumpy or worm-like artifacts often seen.

 

Also, iOS is very limited when it comes to proper third-party camera apps, so I'm guessing comparing against them is kind of pointless, since most aren't properly done and mainly use ProRAW pipelines. Do you base the JPEG tuning against a database of images from DSLRs or different OEMs, or how is the baseline established?

Adobe Employee
an hour ago
quote

Do you base the JPEG tuning against a database of images from DSLRs or different OEMs, or how is the baseline established?


By @nhan_8084

We are not using any 3rd party information whatsoever. Since we control most of the end-to-end capture and processing pipeline, we capture our own raws with iPhones, using our own AE, then edit them the way we like, and that becomes the ML model's training set. That covers the "look" (i.e., the tone and color rendition of the image). Other things like noise and sharpness are a bit more involved and are a mix of traditional and AI-based technologies. As always, the problem is finding a balance, since if you optimize for one use case, you'll make another worse. That is why IQ changes usually take a long time to happen and are often made in very small increments, after a lot of testing across a huge variety of scenes. Even for companies such as Apple, which probably has more than 1,000 people working on the camera stack (HW, SW, firmware, algorithms), it can take years to introduce meaningful changes to image quality tuning.
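To make that data flow concrete, here is a toy version of learning a "look" from (raw, edited) pairs. The real model is surely far richer (spatial context, tone curves, a neural network); this sketch substitutes a least-squares polynomial color fit on synthetic data, so the feature set and the stand-in "editor" grade are assumptions for illustration only.

```python
# Toy "learned look": fit a per-pixel color transform from (raw, edited)
# training pairs. Synthetic data and a simple polynomial fit stand in for
# real raws and a real ML model -- the point is the data flow.
import numpy as np

def features(rgb: np.ndarray) -> np.ndarray:
    """Expand linear RGB (N, 3) into simple polynomial features (N, 10)."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    return np.stack([r, g, b, r * g, g * b, b * r,
                     r**2, g**2, b**2, np.ones_like(r)], axis=1)

rng = np.random.default_rng(1)
raw = rng.uniform(0, 1, size=(10_000, 3))        # stand-in for captured raws

# Stand-in "editor": a warm, slightly lifted grade serves as ground truth --
# in the described workflow this is the team hand-editing its own captures.
edited = np.clip(raw**0.9 * np.array([1.08, 1.0, 0.94]), 0, 1)

# "Training": one least-squares solve per output channel.
W, *_ = np.linalg.lstsq(features(raw), edited, rcond=None)

def apply_look(rgb: np.ndarray) -> np.ndarray:
    """Apply the learned tone/color rendition to new pixels."""
    return np.clip(features(rgb) @ W, 0, 1)

test = rng.uniform(0, 1, size=(5, 3))
print(apply_look(test))  # new raw pixels rendered with the learned look
```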

Participant
54m ago

Thank you for the insights!  Why is this approach different from what Marc Levoy did when Google Camera was created, training ML on a database of thousands of images?  I am not well versed in the math behind image processing and work more with value-based adjustments, but my images fare pretty well across the vast majority of scenes; of course it will never be an infinite set, since the possibilities are truly endless.  But most scenes, from motion to artificial lighting, look pretty good and much more usable than what stock provides.  Yes, Apple has had its own imaging team of over 1,000 employees since its purchase of LinX and others!  But oddly, they still come out subpar compared to Oppo and Vivo, based on mass reviews across multiple blind tests on YouTube.

Adobe Employee
31m ago
quote

Why is this approach different from what Marc Levoy did when Google Camera was created, training ML on a database of thousands of images?


By @nhan_8084

I cannot speak to how Apple decides on their flavor of image processing. Usually there are taste-makers who lead this effort and can take it in various directions. Often there is user testing, which Apple certainly does a ton of, and which asks a lot of casual people what they prefer, not just the photo aficionados on various forums. There are HW and SW limitations, industrial design choices impacting things, etc.

Participant
a minute ago
LATEST

I did not even know Apple asks users for their input on the camera side!  Even if they do, typical users tend to like bright, saturated images, as Marques mentioned in his videos, and they don't care to look at technical aspects like SNR.
