I don't see any problem running it on a 4K screen unless you have bad hardware or outdated graphics card drivers.
Yeah, the short answer is "it's probably fine".
The long answer is that it's completely dependent on how the game was written. Full-screen at a resolution of 1024x768 means you're cranking out 786,432 pixels per frame at some refresh rate, like 30 or 60 frames per second. At 60 fps, that's 47,185,920 pixels per second. Full-screen at a resolution of 4096x2304 is 9,437,184 pixels per frame, which is 566,231,040 pixels per second at 60 fps. So you're basically talking about 12x more pixels that get calculated, but also, computers are pretty fast.
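Back-of-the-envelope, that math looks like this (a quick sketch, assuming 60 fps for both resolutions):

```python
# Rough pixel-throughput comparison between the two resolutions.
def pixels_per_second(width, height, fps):
    """Total pixels drawn per second at a given resolution and framerate."""
    return width * height * fps

old = pixels_per_second(1024, 768, 60)   # 47,185,920
new = pixels_per_second(4096, 2304, 60)  # 566,231,040

print(f"1024x768  @ 60fps: {old:,} pixels/s")
print(f"4096x2304 @ 60fps: {new:,} pixels/s")
print(f"Ratio: {new // old}x")           # 12x
```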
To deliver high-quality graphics at fast framerates on really nice displays, most computers include specialized graphics hardware to handle that processing. The trick is, the software developer writing the game actually has to take advantage of the available hardware graphics acceleration. That's not guaranteed, and it represents a massive difference in the workload required from your CPU. If the game uses the hardware, you're golden. If it doesn't... maybe you're good?
The other question is, how much other "stuff" is the game doing between frames? There's a lot of code running in a game that isn't about drawing pixels. We don't know what that code is doing, whether it's well written, whether the number of pixels on the screen has any relationship to what's happening behind the scenes, or whether that processing is already super demanding on the system without the extra load from the much larger set of pixels that need to be rendered.
So yeah, "it's probably fine"? If you're really sweating it, go down to Best Buy and fire up your favorite game before you buy the machine. That's probably the closest thing you're going to get to a guarantee.