This post applies to the Project Indigo iOS camera app.
Adobe Labs is excited to share an early look at Project Indigo, an iPhone camera app we've started to develop, so we can get feedback from the photography community. The app offers full manual controls, a more natural ("SLR-like") look, and high image quality in both JPEG and raw formats. It also introduces some new photographic experiences not available in other camera apps. For more information on the underlying technology, please refer to this Project Indigo blog post.
Before you start with Project Indigo
Recipes for success when using Project Indigo
To get the most out of images captured with the app, follow these guidelines:
Sending feedback
Please try the app and share feedback in this community forum thread. If you report a problem, please include details such as which device you are running Project Indigo on, what kind of scene you were trying to capture, what you were trying to achieve with the camera, and as much information as possible about what you like or dislike about the resulting photo quality. Our team continually monitors this thread to track issues and improve future releases.
To improve Project Indigo's performance and results, please forward examples of images that do not meet your expectations along with your report. A wide variety of file formats is allowed as attachments in these forum posts; the best option is to attach your image's raw file directly to your feedback post. Note that attachments are limited to 50 MB. If your raw file is too large to attach, share it via a file-sharing service (Dropbox or similar) and include the link in your feedback post. Thank you for continuing to provide feedback on the Project Indigo camera!
Posted by: Boris Ajdin, Product Manager, NextCam
Yes! Much better resource usage on my 16 Pro, from my perspective. The macro toggle is nice to have - it makes it feel like you have six lenses at your disposal once you account for that, the native cameras, and the two super-res focal lengths. DNG-only mode seems like a worthwhile addition, but I shoot mostly JPEGs unless I know I'll be using the Remove Reflections feature, so it's not quite applicable to me.
Just being greedy with my question above - love to see new things the team is coming up with! 🙂
By @Moonboots22
No problem - just keeping the mood light in the forum. We've also added saving captures into a 'Project Indigo' album, so you can find all your Indigo shots more easily in Apple Photos, added mirroring for the front camera (yes, this one is really small, but still), and greatly improved the quality of the Remove Reflections ML model. But the bulk of the effort went toward performance and stability. To answer your original question more directly, the primary goal is iPhone 17 support - as soon as that is ready we will ship a new version. Whether any other features make it in will depend on the timing of that release.
we need to recalibrate many parameters of the processing pipeline due to small but impactful changes in ultrawide and wide cameras
By @BorisTheBlade
Would you say that there have been opaque changes to those cameras not obvious from the launch keynote / spec sheets, or did Apple incorporate the processing changes they talked about at such a low level that 3rd party apps cannot get around it?
By @nnhuy
Indigo is unlike any other 3rd party camera app out there, since we do all of our own processing from a very low level (including things like the auto-exposure (AE) algorithm and Electronic Image Stabilization (EIS)). Those depend heavily on things that go beyond APIs, like sensor and lens characteristics. So it is nothing we cannot handle, but it takes some time, and we would have needed prototype devices well in advance to be ready on iPhone 17 launch day. That didn't happen, which is why we are in the current situation where 17-series devices don't work...
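To make "processing from a very low level" concrete, here is a minimal sketch of one step of a center-weighted auto-exposure loop in Swift. Everything below - the ExposureParams type, the metering weights, the ±0.5 EV damping - is an illustrative assumption, not Indigo's actual algorithm or API:

```swift
import Foundation

// Hypothetical exposure state; a real AE also tracks aperture-fixed
// constraints, flicker periods, etc.
struct ExposureParams {
    var shutterSeconds: Double
    var iso: Double
}

// Center-weighted average: pixels near the frame center count more.
// Assumes a normalized luminance buffer with values in [0, 1].
func meterFrame(_ luma: [[Double]]) -> Double {
    let h = luma.count, w = luma.first?.count ?? 0
    guard h > 1, w > 1 else { return 0 }
    var weighted = 0.0, total = 0.0
    for y in 0..<h {
        for x in 0..<w {
            let dy = Double(y) / Double(h - 1) - 0.5
            let dx = Double(x) / Double(w - 1) - 0.5
            let weight = exp(-8.0 * (dx * dx + dy * dy))
            weighted += weight * luma[y][x]
            total += weight
        }
    }
    return weighted / total
}

// Nudge exposure toward mid-gray, clamping the per-frame change so the
// preview doesn't visibly oscillate.
func updateExposure(_ params: ExposureParams,
                    measured: Double,
                    target: Double = 0.18) -> ExposureParams {
    let errorEV = log2(target / max(measured, 1e-6))
    let step = min(0.5, max(-0.5, errorEV))  // limit to ±0.5 EV per frame
    var next = params
    next.shutterSeconds *= pow(2.0, step)
    return next
}
```

A production AE is far more involved: it meters highlights for HDR bracketing, trades shutter time against ISO to limit motion blur, and depends on per-sensor calibration data - exactly the kind of dependency that makes supporting a new camera module take time.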
Hi @BorisTheBlade, just a question on how the team at Indigo goes about setting image quality. As you are probably aware, most OEM stock cameras are based on machine learning/AI models, and that tends to be hit and miss, with Apple being the most consistent of all when it comes to color science, HDR, and white balance across all lenses. The HDR is mostly local tone mapping (LTM) for all of the OEMs, with the obvious halo/glowing edges and faux sky colors: skies rendered cyan versus that true lavender blue/dark blue in city areas due to light pollution, etc. Speaking in regard to JPEG, of course. Lately, Oppo and Vivo have set new image-quality boundaries with their tunings, thanks to their own custom imaging chips that render high-frequency detail very naturally versus the clumpy or worm-like detail seen elsewhere.
Also, iOS is very limited in proper third-party camera apps, so I'm guessing comparing to them is kind of pointless, since most of them aren't properly done and mainly use ProRAW pipelines. Do you base the JPEG tuning against a database of images from DSLRs or different OEMs, or how is the baseline established?
By @nhan_8084
We are not using any 3rd party information whatsoever. Since we control most of the end-to-end capture and processing pipeline, we capture our own raws with iPhones using our AE, then we edit them the way we like, and those edits form the ML model's training set. That covers the "look" (i.e., the tone and color rendition of the image). Other things like noise and sharpness are a bit more involved and are a mix of traditional and AI-based technologies. As always, the problem is to find a balance, since if you optimize for one use case, you'll make another worse. That is why IQ changes usually take a long time to happen and are often done in very small increments, after a lot of testing with a huge variety of scenes. Even for companies such as Apple, which probably has more than 1,000 people working on the camera stack (HW, SW, firmware, algorithms), it can take years to introduce meaningful changes to image-quality tuning.
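As a drastically simplified picture of the "capture raws, edit them, train on the pairs" idea: the real system trains an ML model, but a toy version could fit a 1D tone curve from (camera luminance, hand-edited luminance) pairs. The functions below are hypothetical illustrations under that assumption, not Indigo's pipeline:

```swift
// Fit a tone curve from (camera, hand-edited) luminance pairs in [0, 1]
// by averaging the edited targets in input bins, then enforcing
// monotonicity so the curve never inverts tones.
func fitToneCurve(samples: [(input: Double, edited: Double)],
                  bins: Int = 64) -> [Double] {
    var sums = [Double](repeating: 0, count: bins)
    var counts = [Double](repeating: 0, count: bins)
    for (x, y) in samples {
        let b = min(bins - 1, max(0, Int(x * Double(bins))))
        sums[b] += y
        counts[b] += 1
    }
    var curve = [Double](repeating: 0, count: bins)
    var last = 0.0
    for b in 0..<bins {
        let v = counts[b] > 0 ? sums[b] / counts[b] : last
        last = max(last, v)        // keep the curve non-decreasing
        curve[b] = last
    }
    return curve
}

// Apply the fitted curve with linear interpolation between bins.
func applyToneCurve(_ curve: [Double], to x: Double) -> Double {
    let pos = min(max(x, 0), 1) * Double(curve.count - 1)
    let lo = Int(pos), hi = min(lo + 1, curve.count - 1)
    let t = pos - Double(lo)
    return curve[lo] * (1 - t) + curve[hi] * t
}
```

An actual look model must also handle color, local contrast, and scene dependence, which is why tuning it is the slow, incremental process described above.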
Thank you for the insights! Why is this approach different from what Marc Levoy did when Google Camera was created, training ML on a database of thousands of images? I am not well versed in the log-space math used in image processing, working instead with value-based adjustments, but my approach fares pretty well across the vast majority of scenes; of course it will never cover an infinite set, since the possibilities are truly endless. But most scenes, from motion to artificial lighting, look pretty good and much more usable than what stock provides. Yes, Apple has had its own imaging team of over 1,000 employees since its purchase of LinX and others! But oddly that still leaves them subpar compared to Oppo and Vivo, based on mass reviews seen across multiple blind tests on YouTube.
By @nhan_8084
I cannot speak to how Apple decides on their flavor of image processing. Usually there are taste-makers who lead this effort and can take it in various directions. Often there is user testing, which Apple certainly does a ton of, asking many casual users what they prefer, not just photo aficionados on various forums. There are HW and SW limitations, industrial design choices impacting things, etc.
I did not even know Apple asks users for their input on the camera side! Even if they do, typical users tend to like bright, saturated images, as Marques mentioned in his videos, and they don't care to look at technical aspects like SNR.