I think I may be misunderstanding the use, but I'm calling
writeMultiByte as follows for a network packet:
var ba:ByteArray = new ByteArray();
ba.writeMultiByte("Bob", "unicode");
ba.writeMultiByte("Jones", "unicode");
// view the buffer for debugging
for (var i:int = 0; i < ba.length; i++)
{
    trace(ba[i]); // print each byte's value
}
I expected the buffer length to be 16 bytes ("BobJones" = 8
letters, 2 bytes each), but instead it's 8 bytes (1 byte per letter,
i.e. UTF-8). I also expected to see zeros between the letters when I
trace the buffer.
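In case a workaround helps in the meantime: you can write the 16-bit
code units yourself with writeShort. A minimal sketch, assuming the
packet format wants UTF-16LE and the text stays in the BMP (surrogate
pairs are not handled); writeUTF16LE is just my own helper name:

import flash.utils.ByteArray;
import flash.utils.Endian;

// Hypothetical helper: writes each char code as one little-endian 16-bit unit.
function writeUTF16LE(dest:ByteArray, s:String):void
{
    var oldEndian:String = dest.endian;
    dest.endian = Endian.LITTLE_ENDIAN;
    for (var i:int = 0; i < s.length; i++)
    {
        dest.writeShort(s.charCodeAt(i)); // 2 bytes per character
    }
    dest.endian = oldEndian;
}

var out:ByteArray = new ByteArray();
writeUTF16LE(out, "Bob");
writeUTF16LE(out, "Jones");
trace(out.length); // 16, with zero high bytes between the ASCII letters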
var src:String = String.fromCharCode(0xCCEC);
trace(src.charCodeAt(0).toString(16)); // ccec
var tmpbytearr:ByteArray = new ByteArray();
tmpbytearr.writeMultiByte(src, "unicode");
trace(tmpbytearr.length); // 1
tmpbytearr.position = 0; // rewind, or readUnsignedByte() throws an EOFError
trace(tmpbytearr.readUnsignedByte().toString(16)); // 3f
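For comparison, writeUTFBytes always writes UTF-8, so dumping that next
to the writeMultiByte result shows what actually got emitted. A quick
sketch reusing src from above; note that 0x3f is just an ASCII '?',
which suggests a substitution character rather than UTF-8 (UTF-8 for
U+CCEC would be the three bytes ec b3 ac):

var utf8:ByteArray = new ByteArray();
utf8.writeUTFBytes(src); // writeUTFBytes is always UTF-8
trace(utf8.length); // 3
for (var j:int = 0; j < utf8.length; j++)
{
    trace(utf8[j].toString(16)); // ec, b3, ac
}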
quote:
Originally posted by Alkahest:
ba.writeMultiByte("Bob", "unicode");
Yes, indeed. And the compiler will not complain if you do:
ba.writeMultiByte("Bob", "anything whatsoever");
It just happily puts out UTF-8 regardless and ignores the nonsense
argument. I don't know the web address for reporting bugs. I guess
the authors let that one slip by, or didn't have time to implement it.
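One way to pin that down is to write the same string under several
charSet labels and dump the bytes: if the argument were honored,
"unicode"/"utf-16" should give 6 bytes for "Bob" (8 with a BOM), and
if it's ignored every row should come out identical. A rough test sketch:

var labels:Array = ["unicode", "utf-16", "utf-8", "iso-8859-1", "anything whatsoever"];
for each (var cs:String in labels)
{
    var buf:ByteArray = new ByteArray();
    buf.writeMultiByte("Bob", cs);
    var dump:String = "";
    for (var k:int = 0; k < buf.length; k++)
    {
        dump += buf[k].toString(16) + " ";
    }
    trace(cs + " -> " + buf.length + " bytes: " + dump);
}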
Hrm... so you are saying that it ignores the second argument, or
are you saying that "unicode" is not the correct value I should be
passing? I've tried just about every combination from the docs...
Alkahest, I tried a nonsense argument on the theory that if no
error is thrown, the argument is probably being ignored; that seemed
quicker than trying them all. I had previously only tried the alias
for 'unicode' (which was 'utf-16'), and that hadn't worked either.
I don't think we can see the source anywhere. That would settle
for certain whether the argument is ignored, at least until support
gets implemented in an update.