I have always liked non-destructive editing, and smart objects can play a big part in this. One issue I have is file sizes when working with smart object layers: they are always much bigger than they should be. My conclusion is that the Photoshop .psd file stores an unnecessary rendered copy of each smart object layer.
To demonstrate the issue try this:
1. Open a file as a smart object and save.
Look at the file size. You would expect a small overhead over the size of the original file, but it is about twice the size.
2. Make it a linked smart object (right click on the layer "Relink to file ..." and select the original file)
Our psd file now contains just a link to the external original file, no real data that can't be derived from the original file, but it is still quite big!
3. Duplicate the linked smart object layer, save.
Two layers containing no real data should amount to virtually nothing, yet the file has grown again.
4. Now drastically shrink the image size, e.g. to 1% width and height (Image -> Image size … ) and save.
Look at the file size - it is minute!
Try some other image sizes. You will find that whatever size you set, the resulting file size roughly tracks the image area. Hence my conclusion: the .psd file is unnecessarily storing a rendered version of each layer.
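A quick back-of-the-envelope check supports the "tracks image area" observation. This is plain arithmetic assuming uncompressed 16-bit RGB renders; real .psd files apply compression, so actual sizes will differ, but the scaling behaviour is the same:

```python
def rendered_bytes(width, height, channels=3, bytes_per_channel=2):
    """Uncompressed size in bytes of one rendered layer (16-bit RGB by default)."""
    return width * height * channels * bytes_per_channel

full = rendered_bytes(4000, 3000)   # a 12 Mpixel image: 72,000,000 bytes per copy
tiny = rendered_bytes(40, 30)       # the same image at 1% width and height
print(full // tiny)                 # 10000: the estimate shrinks with image area
```

Shrinking width and height each to 1% cuts the area, and hence the estimated stored render, by a factor of 10,000, which matches the "minute" file observed in step 4.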
Even if it made files slower to open, I would very much like an option to create any rendered views needed on file opening rather than storing them.
(Have I missed such an option?)
Regards,
Mark.
Even if it made files slower to open, I would very much like an option to create any rendered views needed on file opening rather than storing them.
I disagree; that every instance of the Smart Object is actually rendered looks like a necessity for some things,
• like opening in older versions (or other editors) or
• addressing Layers in layout applications like Indesign or Illustrator or
• opening an image with missing linked Smart Objects,
so it seems beneficial to me.
To me the file size consideration seems negligible compared to the potential opening-time differences.
Edit: And the calculation effort could become very high if the SO instances are not only transformed but also have multiple Filters applied to them, all of which would have to be calculated anew on opening if the results were not already stored in the file.
I think this feature would be quite useful as a document-size optimization; it could have further benefits when transferring over a network or when versioning. Imagine just a slight change like position or scale, or wanting to compare against multiple backgrounds: you have to save a huge document each time. But I don't think .PSD can implement this, because of compatibility. Maybe a future ".PSP" could; it would make a document more like a project. BTW, After Effects can add multiple effects, much like filters, but it caches them or uses hardware acceleration for quick previewing. As for image files, I think space shouldn't be wasted packing everything in if you are not sending the file to anyone. It is not about how much disk space you have, but about using it the right way and not simply duplicating data.
As an aside: One thing that can make a huge difference in certain cases is whether the SO is flattened or layered.
Anyway, you may want to post a Feature Request over at
Photoshop Family Customer Community
As mentioned earlier I see no benefit in the approach you envision (maybe one could call it »procedural referencing«?) for Photoshop at present, but others may share your opinion and some Adobe employees might chime in.
Photoshop does indeed store a rendered layer for each duplicated smart object. If the duplicated smart object contains 52 layers, the internals are not duplicated; the file size only increases by the rendered size.
Why do you believe that this is an issue?
Dave
And if Photoshop had to place, transform and filter all the SO instances’ pixel content anew when opening a file, that could take a while …
Thanks for your thoughts, c.pfaffenbichler.
Compatibility with other applications is something I was not considering; I only use Photoshop and Lightroom (thinking about it, I don't know whether Lightroom needs these rendered layers).
BTW I don't normally use linked smart objects; I only used one while experimenting, and it demonstrates the issue well, so I mentioned it above.
A real-world example: I had an 85 Mpixel stitched image, a 0.5GB smart object .psd to start with (16-bit RGB). I was expecting smart object duplicates to help me out, but after using 3 of them I ended up over 2GB, so I could not even save as .psd and had to use .psb!
Thinking of changing my camera soon, I may end up with ~50 Mpixels per image. I can easily see that if I want to stick with non-destructive editing, I will be looking at typically 1GB+ per edit. That will soon add up to a lot of disk space.
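The sizes in that example line up with the rendered-copy interpretation. A rough estimate, assuming uncompressed 16-bit RGB renders (actual .psd files compress, so real numbers vary):

```python
def render_mb(mpixels, channels=3, bytes_per_channel=2):
    """Uncompressed rendered-layer estimate in MB (16-bit RGB by default)."""
    return mpixels * 1_000_000 * channels * bytes_per_channel / 1e6

one_layer = render_mb(85)      # ~510 MB per rendered 85 Mpixel copy
four_layers = one_layer * 4    # original SO layer plus 3 duplicates
print(one_layer, four_layers)  # 510.0 2040.0
```

One rendered copy comes out near the 0.5GB starting size quoted above, and four copies land just past the 2 GB ceiling of the .psd format, matching the forced switch to .psb.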
For me, when I use Photoshop (as opposed to quick edits in Lightroom), I open a file and edit for quite a while (often hours!), so file opening time is a low priority.
I was thinking of a feature request, but thought it better to sound this out on the forum first; I will wait a while for other input.
mjr1342 wrote
..........., I ended up over the 2GB so could not even save as .psd, had to use .psb!
That brings up something I definitely support. In these days of higher-megapixel images, we definitely need psb support in Lightroom.
Dave
Lightroom's library system should catalog PSB files in its databases. But Lightroom will never really support PSB, PSD or layered TIFF files: Lightroom does not have layer support. I do not install Lightroom; I'm not that organized and do not need its library system, Adobe Bridge does for us. Lightroom for me would only be a waste of disk space. Photoshop can edit images, and layers are where Photoshop's power is.
Lightroom, IMO, is a library system with an image development feature that supports camera RAW files and uses the same Adobe raw conversion engine as the Adobe Camera Raw plug-in does. In fact, ACR opens Lightroom's developed RAW files in Photoshop; Photoshop uses ACR to open camera RAW files. Lightroom's Develop module is not a plug-in; it is only available in Lightroom. To work on RAW file data in Photoshop, only the ACR interface is available; Lightroom's image development is not. The idea was posted 8 years ago on the Adobe feedback site. Adobe support, IMO, is not what it should be: too many issues are never addressed, and bugs remain in Photoshop release after release.
Hi JJ
Lightroom support for PSB should be the same as for TIFF and PSD, in that the images should appear in the catalogue and the Develop module should work on the resulting file (not the internal layers). I agree: 8 years and still no support for Adobe's own file format is shocking.
Dave
Yes, most will agree; my opening line was that Lightroom's library system should catalog PSB files in its databases. Adobe support is not what it should be, IMO...
BTW I did find a workaround for including my 2.5G .psb file in Lightroom: I just converted the layers to a smart object and made it a linked file!
(With Maximize Compatibility off, you should be able to catalog a .psb indirectly in Lightroom like this, up to 340 Mpixels @ 16-bit or 680 Mpixels @ 8-bit RGB in the wrapper .psd. (Yes, I know I am wasting disk space again doing this!))
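Those ceilings follow from the same area arithmetic, assuming uncompressed RGB renders against the .psd format's 2 GB file-size limit (an estimate only; compression and overhead shift the exact pixel counts):

```python
def uncompressed_mb(mpixels, bytes_per_channel, channels=3):
    # megapixels * channels * bytes-per-channel gives MB directly
    return mpixels * channels * bytes_per_channel

print(uncompressed_mb(340, 2))  # 2040 MB at 16-bit: right at the ~2 GB ceiling
print(uncompressed_mb(680, 1))  # 2040 MB at 8-bit: same ceiling, twice the pixels
```

Halving the bytes per channel doubles the pixel budget, which is why the 8-bit figure is exactly twice the 16-bit one.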
mjr1342 wrote
I have always liked non-destructive editing, and smart objects can play a big part in this. One issue I have is file sizes when working with smart object layers: they are always much bigger than they should be. My conclusion is that the Photoshop .psd file stores an unnecessary rendered copy of each smart object layer.
Your conclusion is simply wrong. If Photoshop did not have pixels for an image layer, the layer would be empty. It is additionally wrong because smart object layers can share an object: many smart object layers may be sharing the same smart object's pixels. Each smart object layer has an associated transform for its object or shared object. These transforms take the pixels rendered for the smart object and scale, position and distort them for how that layer uses the image. So a picture-package PSD may have one smart object shared by 10 smart object layers: the pixels rendered for the smart object are transformed by the ten layer transforms to the sizes and positions of the ten image locations. Each layer has its own transform and its own transformed image pixels, while the PSD has only one copy of the source pixels rendered for the object. Without pixels the layers would have no pixel content; they would be empty.
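A minimal sketch of the layout this post describes, using hypothetical names (this is not Photoshop's actual file format): one shared source object stored once, plus a per-layer transform and a per-layer rendered result, so each additional layer adds the size of one render:

```python
from dataclasses import dataclass

@dataclass
class SmartObject:
    source_bytes: int       # embedded source data, stored once

@dataclass
class SOLayer:
    obj: SmartObject        # possibly shared with other layers
    transform: tuple        # scale / position / distort for this layer
    rendered_bytes: int     # this layer's transformed, rendered pixels

def file_cost(layers):
    # each distinct source counted once; each layer's render counted per layer
    sources = {id(layer.obj): layer.obj.source_bytes for layer in layers}
    return sum(sources.values()) + sum(layer.rendered_bytes for layer in layers)

src = SmartObject(source_bytes=500)
layers = [SOLayer(src, (1.0, 0.0, 0.0), rendered_bytes=500) for _ in range(10)]
print(file_cost(layers))  # 5500: one shared source plus ten rendered copies
```

This makes the disagreement in the thread concrete: the source is deduplicated, but the per-layer renders are what make the file grow with every duplicate.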
Your conclusion is simply wrong.
I think the approach the OP wishes to see implemented is possible for image editing in principle, though probably not in Photoshop.
Something like Illustrator where a placed image is actually a reference to a separate file and its position, scale etc.
Of course this poses challenges for painting, clone stamping etc. and efficiency.
I don't even like linked Smart Objects, but I'm sure they have their place in some developed project workflows. At least the smart object pixels rendered and stored in the PSD can recover the image should the master object file become corrupt, deleted or lost so that it cannot be re-linked or updated. I would not like to see any additional external file introduced; I feel they need to be embedded in the PSD. IMO, Place is even messed up.
" I would not like to see any additional external file introduced."
Well guess what... to SAVE HUNDREDS OF MBs PER FILE, or even over a GB per file... IT WOULD BE WORTH it to many... I swear...
"I would not like to see any additional external file introduced. I feel they need to be embedded in the PSD. IMO Place is even messed up."
The Adobe fanboy spirit at its best. THAT'S exactly the principle of data linking and why people invented it. If you do not want it, don't use file linking, and stop blocking people who rely on such features! At least programs can have an editor or setting to choose how the host application should handle this. Also, you should always use a version control system to keep your files safe. If file corruption is a permanently present factor in your thinking, you definitely should rethink your working pipeline!
".. Master object file become corrupt, .." If that happens, it will be your PSB file that dies first, because it takes much more space and so has a higher chance that a hardware error occurs there rather than in one of the smaller linked files. Logical thinking, where are you?
You forgot that you are in an Adobe forum. Here you are a freak if you use 2 screens, have a mouse with more than two buttons, ask for a hierarchy/node/link editor, etc.
The major problem with this issue (yes, issue) is that Adobe never really understood the concept of data flow. When Photoshop got the "smart objects" feature, this thing ("instancing", "referencing") had been common practice for decades in all 3D and compositing programs, even in open-source and free ones. And they had editors, so you always had an overview of your project: which element is embedded, which is linked and where, where its instances are, and which manipulation nodes were sitting on those nodes to produce the final result. Data flow, as it is called. A root object (a file on disk or a layer with pixel information, it makes no difference) -> some nodes changing the appearance, like transform, blend, multiply/divide/add/sub, etc. -> the final image rendered on the canvas. Change one thing in the chain, and everything after this "dirty" node will be updated.
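The dirty-node idea can be sketched in a few lines. This is a toy graph under my own naming, not any actual Adobe or node-editor API: each node caches its result, and invalidating one node marks everything downstream dirty, so only that part of the chain is recomputed:

```python
class Node:
    def __init__(self, fn, *inputs):
        self.fn, self.inputs = fn, inputs
        self.outputs, self.cache, self.dirty = [], None, True
        for i in inputs:
            i.outputs.append(self)

    def invalidate(self):
        self.dirty = True
        for o in self.outputs:   # propagate downstream only
            o.invalidate()

    def value(self):
        if self.dirty:           # recompute only when something upstream changed
            self.cache = self.fn(*(i.value() for i in self.inputs))
            self.dirty = False
        return self.cache

root = Node(lambda: 10)                   # source: a file on disk / pixel layer
scaled = Node(lambda v: v * 2, root)      # a transform node
final = Node(lambda v: v + 1, scaled)     # a filter node
print(final.value())                      # 21
root.fn = lambda: 100                     # change the root...
root.invalidate()                         # ...downstream becomes dirty
print(final.value())                      # 201
```

Nodes untouched by an invalidation keep serving their cache, which is the efficiency argument for storing renders; the question in this thread is whether those caches belong inside the saved file.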
Nearly no Adobe user will understand this concept; don't waste your time. It is a wonder how Substance Designer will fit into the Adobe portfolio, because it is data-flow / node-network driven by definition 😉
So I now have several textures at 8k by 8k in 32-bit float, and wonder why the source textures are 60MB while the same thing inside a PSB file takes 550MB of my disk space. Thanks, Adobe and your fanboys!
No... YOUR conclusion is "simply wrong"... WHERE DO YOU THINK IT GETS THOSE PIXELS FROM IN THE FIRST PLACE?! It gets that information FROM THE RAW FILE! My god... I can't believe anybody would say something as stupid as what you just said.
I just want to add here because I have the exact same "issue". I have a file with just two raw layers as smart objects, and something like 15 copies of them inside folders and masks. Now, why on earth should you save all these pixels? I am only working with Photoshop and nothing else, so backward compatibility and such isn't an issue; if we leave that out, there is no reason to store the SAME data 15 times!
This is what it is: the exact same data. PS could open the project and simply restore all of the layer data (which it probably does anyway) by loading the smart object once and transforming it using the layer transform settings. This is exactly what every 3D program does, by the way, using instancing. You can make 100 instances of a mesh, move, skew and rotate them, and the file size won't change.
Why else do you think I can edit the smart object once while the file is open and it updates all the other objects referencing the same source?
Again, I understand the possible reasons, but I would like an option to disable this because it adds unnecessary file size. Simply saving the file takes much longer than it should; that is an issue. Load-up time can't be an argument here: once you make a change, everything has to be updated anyway, and then Photoshop is using the smart object's content, not what is stored on disk.
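The 3D-style instancing this post refers to can be costed out in a toy model. The numbers are hypothetical (a 60 MB source, 500 MB per render, ~1 KB per transform), but they show why instancing keeps files nearly flat while per-instance renders grow linearly:

```python
SOURCE_MB, RENDER_MB, TRANSFORM_KB = 60, 500, 1

def stored_with_renders(n):
    """Current behavior: one rendered copy saved per instance."""
    return SOURCE_MB + n * RENDER_MB

def stored_with_instancing(n):
    """Instancing: only a tiny transform record saved per instance."""
    return SOURCE_MB + n * TRANSFORM_KB / 1024

print(stored_with_renders(15))     # 7560 MB
print(stored_with_instancing(15))  # ~60.01 MB
```

With 15 instances the rendered approach costs over a hundred times more disk space in this model, which is the trade-off the thread keeps circling: disk space versus the recompute cost on open.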
@c.pfaffenbichler Generating the pixels from the source PSD might take a long time upon opening a composite PSD file, but this is not necessary.
The cached data / render of the smart object could be stored in the smart object itself. Then if you have 15 copies of a PSD they could all look at the same cached render. Added advantage: no more dirty nodes, because the composite PSD does not need to store a copy.
And if you want to keep the PSD clean (because it does not need to know it is used as a smart object in other PSDs), then you can store the cache in a separate file .smartobjectcache.
The cached data / render of the smart object could be stored in the smart object itself.
If the data for each Smart Object instance (and each one might be at a different scale, rotation, warp, … and have different filters applied) were stored in the SO itself, the file size issue would remain, wouldn't it?
Quite frankly I don’t seem to fully understand the point you are trying to make.
Oh right, I missed the part where every smart object instance might be different; that complicates things.
I was thinking of my own situation: in >95% of cases where I reuse smart objects, they have the same scale, rotation, filters, everything. So my 15 composite PSD files each literally have the same data cached.
Makes me wonder how other people use smart objects. It might still make sense to improve this special case (same everything) if it is the most common use case.
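The special case could be handled generally with a content-addressed cache. This is only a sketch of the idea, with hypothetical names (`cached_render` and the `render` callback stand in for whatever Photoshop actually does internally): key each cached render by the source plus its transform and filter stack, so identical instances share one entry while differing ones get their own:

```python
import hashlib

def cache_key(source_id, transform, filters):
    # hash of everything that affects the rendered result
    blob = repr((source_id, transform, tuple(filters))).encode()
    return hashlib.sha256(blob).hexdigest()

cache = {}

def cached_render(source_id, transform, filters, render):
    key = cache_key(source_id, transform, filters)
    if key not in cache:
        cache[key] = render(source_id, transform, filters)  # compute once
    return cache[key]

fake_render = lambda s, t, f: f"pixels({s},{t},{f})"
for _ in range(15):   # 15 identical instances share a single cache entry
    cached_render("bg.psd", (1.0, 0, 0), ["blur"], fake_render)
cached_render("bg.psd", (2.0, 0, 0), ["blur"], fake_render)  # different scale
print(len(cache))     # 2
```

Under this scheme the "same everything" case collapses to one stored render automatically, while the earlier objection still holds for instances that genuinely differ: each distinct transform/filter combination costs its own cache entry.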