On my new website I've got a few images with a lot of fine detail that JPEG compression struggles with, resulting in files that are too big for the site.
Website image size is 2040 x 1360, though if I wanted the images to read optimally on a 4K or 5K monitor I'd be looking at 2500 x 1700 pixels. The recommended maximum file size is 550 kB.
I have two particular files in mind, attached. One is looking down at a running track with an athlete: a red and black pebbled surface with lots of detail that the algorithm is obviously having trouble compressing. The other is a strongly backlit landscape looking through tree leaves, with a lot of midrange and highlight detail in the leaves that has the same issue.
I'm using Photoshop 2021, sRGB, JPEG compression level 2. Even at approx 2040 x 1360 and compression level 2 I'm getting file sizes not much shy of 1 MB, and even dropping to level 1 isn't getting me much better compression. These are pretty critical images on my site; flow and sequencing is an important part of my work, so I'm hoping there are some solutions I can look at.
I first thought it was metadata, but it actually looks like you've hit the limit of JPEG compression. As such, it's an excellent demonstration of how it works.
In your case I would read "recommended size" as just that. If it goes a little over, so be it. Don't lose any sleep.
Since you can't do any meaningful tests on images that have already been JPEG-compressed (the compression artifacts are just treated as even more "detail" by the next compression pass), I tried with some of my own detail-rich images. One I got down to 579 kB at quality level 20, using Save For Web. But your examples are probably even trickier, with more local contrast.
I would recommend Save For Web/Export over Save As, since you can directly preview the final result.
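If you ever want to automate this trial-and-error outside Photoshop, here's a minimal sketch using the Pillow library (my choice of tool, not something from your workflow): it encodes the image in memory at different quality levels and binary-searches for the highest quality that still fits a byte budget like your 550 kB limit. The function and variable names are my own.

```python
from io import BytesIO

from PIL import Image


def jpeg_size_at_quality(img, quality):
    """Encode img as JPEG in memory and return the resulting byte count."""
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=quality, optimize=True)
    return buf.tell()


def find_quality_for_budget(img, max_bytes, lo=1, hi=95):
    """Binary-search the highest JPEG quality whose output fits max_bytes.

    Returns (quality, size_in_bytes), or (None, None) if even the lowest
    quality level overshoots the budget.
    """
    best = (None, None)
    while lo <= hi:
        mid = (lo + hi) // 2
        size = jpeg_size_at_quality(img, mid)
        if size <= max_bytes:
            best = (mid, size)     # fits: try a higher quality
            lo = mid + 1
        else:
            hi = mid - 1           # too big: try a lower quality
    return best


if __name__ == "__main__":
    # A synthetic noise image stands in for a detail-rich photo here;
    # swap in Image.open("your_photo.tif") for a real test.
    detail_rich = Image.effect_noise((2040, 1360), 64).convert("RGB")
    quality, size = find_quality_for_budget(detail_rich, 550 * 1024)
    print(f"quality {quality} -> {size} bytes")
```

On a genuinely noisy image you may find that no quality level fits the budget at full resolution, which mirrors what you're seeing in Photoshop: at that point downscaling or slightly blurring the busiest areas is what buys back file size, not lower quality settings.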