I would like to see some built-in functionality that uses the depth data captured by the iPhone camera (and by Android cameras, for that matter) to adjust the bokeh, like in the native Apple Photos app. This would complement the depth-map support already present in masking, although I think it belongs in a more accessible UI location than the masking panel.
This would help my current workflow: I often snap a photo in iOS, and when I go to edit it in Lightroom I decide the bokeh is not quite what I wanted.
To adjust it, I currently have to:
1. Delete the image from Lightroom
2. Go to iOS Photos and adjust the bokeh
3. Reimport the photo into Lightroom
It would be great to handle all of this in Lightroom.
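
For what it's worth, the embedded depth data is already readable through public Apple APIs, so the heavy lifting wouldn't need to be reinvented. Below is a minimal Swift sketch of the kind of depth-driven blur this request describes; the file name, the assumption that the disparity map is normalized to 0-1, and the radius value are all illustrative, and Lightroom's real pipeline would of course differ:

```swift
import Foundation
import CoreImage

// Sketch only: "portrait.heic" stands in for any Portrait-mode capture.
let url = URL(fileURLWithPath: "portrait.heic")

// Load the photo and the auxiliary disparity (depth) map that Apple
// embeds in Portrait-mode HEIC files.
guard let photo = CIImage(contentsOf: url),
      let rawDisparity = CIImage(contentsOf: url,
                                 options: [.auxiliaryDisparity: true]) else {
    fatalError("no embedded depth data in this file")
}

// The disparity map is stored at a lower resolution than the photo,
// so scale it up to match before using it as a blur mask.
let scale = photo.extent.height / rawDisparity.extent.height
let mask = rawDisparity
    .transformed(by: CGAffineTransform(scaleX: scale, y: scale))
    // Assumes disparity is normalized 0-1; inverting makes far pixels
    // (low disparity) bright, i.e. more blurred.
    .applyingFilter("CIColorInvert")

// Blur strength now varies with depth: the background melts while the
// subject stays sharp. inputRadius is effectively the "bokeh amount"
// slider this request is asking for.
let adjusted = photo.applyingFilter("CIMaskedVariableBlur", parameters: [
    "inputMask": mask,
    "inputRadius": 20.0
])
```

The point of the sketch is just that a single user-facing slider (the blur radius) on top of the depth map that Lightroom can already read for masking would cover this use case without any round trip through iOS Photos.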