What's the technical difference between Total RMS Amplitude and Average RMS Amplitude?
Can't tell you for certain, because Adobe has never published the formulae used to calculate these. But to a first approximation, the Average value is an attempt to derive something approximating a Leq value from the measurement. Leq measurements are based on how long a signal stays at a particular level (or small range of levels). So if you subtract the Minimum RMS value of a selection from the Maximum value, halve the result and add it back to the Minimum, you'll get within spitting distance of what they call the 'Average' value. You won't get there exactly, though, because the true figure is weighted by the time spent at each level. And all of these numbers will vary slightly if you alter the sampling window width.
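To make the idea concrete, here's a minimal sketch of windowed RMS measurement. The signal, the 50 ms window width and the "midpoint of min and max" estimate are all assumptions for illustration; Audition's actual window and averaging formula are unpublished, as noted above.

```python
import numpy as np

# Hypothetical illustration: windowed RMS over a synthetic signal.
sr = 48000
t = np.arange(sr) / sr
# 440 Hz tone whose amplitude alternates between loud and quiet sections
signal = 0.5 * np.sin(2 * np.pi * 440 * t) * np.where(t % 0.5 < 0.25, 1.0, 0.2)

win = int(0.050 * sr)  # assumed 50 ms analysis window
frames = signal[: len(signal) // win * win].reshape(-1, win)

# RMS of each non-overlapping window, expressed in dB
rms_db = 20 * np.log10(np.sqrt((frames ** 2).mean(axis=1)))

min_db, max_db = rms_db.min(), rms_db.max()
# the rough 'split the difference' estimate described above --
# NOT time-weighted, so it only approximates a Leq-style average
midpoint_db = min_db + (max_db - min_db) / 2
print(f"min {min_db:.1f} dB, max {max_db:.1f} dB, midpoint {midpoint_db:.1f} dB")
```

Changing `win` shifts all three figures slightly, which is the window-width dependence mentioned above.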
All I can say in general is that these numbers don't really have any practical value at all. For most people these days, the LUFS value at the bottom means rather more. That's also based on a type of Leq measurement, but unlike Adobe's figures, the calculation for it is actually specified (in ITU-R BS.1770), so you can look it up.
I need this to know the real RMS value of the entire file length that will be received by the audio amplifiers delivering the output to the loudspeakers, which is loudness-agnostic.
You will only get an accurate RMS value for a steady tone or noise, and that would be a Total RMS value. Anything else like music or speech will not give you a meaningful value at all. And since the level that's fed to your amplifier and loudspeaker can be adjusted outside Audition, the exercise is pointless, I'm afraid - dB values are relative, not absolute.
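A short sketch of why the exercise doesn't help: the Total RMS of a steady tone is well defined, but any gain applied after export simply shifts the figure by a constant number of dB. The 1 kHz tone and gain value here are illustrative assumptions.

```python
import numpy as np

sr = 48000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 1000 * t)  # steady full-scale 1 kHz sine

def total_rms_db(x):
    # RMS over the whole file, relative to full scale (dBFS)
    return 20 * np.log10(np.sqrt(np.mean(x ** 2)))

print(total_rms_db(tone))        # ~ -3.01 dBFS (sine RMS is 1/sqrt(2))
print(total_rms_db(0.5 * tone))  # ~ -9.03 dBFS: same tone, 6 dB lower
```

The second figure is just the first minus ~6 dB: a volume knob anywhere downstream does exactly this, which is why a dBFS reading inside Audition says nothing absolute about the voltage that finally reaches the loudspeaker.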