Question
WriteMultiByte and "unicode"
I think I may be misunderstanding how to use it, but I'm calling
writeMultiByte as follows to build a network packet:
var ba:ByteArray = new ByteArray();
ba.writeMultiByte("Bob", "unicode");
ba.writeMultiByte("Jones", "unicode");
// view the buffer for debugging
for (var i:int = 0; i < ba.length; i++)
{
    trace(ba[i]); // dump each byte value
}
I expected the buffer length to be 16 bytes ("BobJones" = 8 letters, 2 bytes each), but instead it's 8 bytes (1 byte per letter, i.e. UTF-8). I also expected to see zeros between the letters when I trace the buffer.
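For reference, here's a rough sketch of what I expected the output to look like, writing each character code as a 16-bit value by hand. (writeUTF16LE is just a helper name I made up for this illustration, and I'm assuming little-endian order and plain ASCII input.)
import flash.utils.ByteArray;
import flash.utils.Endian;
// Illustrative helper: write each character code as a 16-bit value,
// which is what I assumed "unicode" would do for ASCII strings.
function writeUTF16LE(dest:ByteArray, value:String):void
{
    for (var i:int = 0; i < value.length; i++)
    {
        dest.writeShort(value.charCodeAt(i)); // 2 bytes per character
    }
}
var expected:ByteArray = new ByteArray();
expected.endian = Endian.LITTLE_ENDIAN; // low byte first, then the zero high byte
writeUTF16LE(expected, "Bob");
writeUTF16LE(expected, "Jones");
trace(expected.length); // 16, the length I was expecting
That gives the 16-byte buffer with the interleaved zeros I was expecting, so I don't understand why the "unicode" charset produces something different.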
What am I doing wrong?
