Quote:
Originally Posted by qrange
utf8 uses more bytes for a single letter (what a waste).
It's not a waste.
It uses just one byte for ASCII characters and adds more bytes only for the characters that need them.
If not for encodings like UTF-8 that have variable-length characters, every character would have to be big enough to store the highest possible value. Now that would be a waste.
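
Just as a quick sketch of what that means in practice (Python here, purely as an illustration): encoding a few characters to UTF-8 shows the byte count growing from 1 up to 4 only when needed, while a fixed-width encoding like UTF-32 spends 4 bytes on every character, including plain ASCII.

Code:
# Illustration: UTF-8 byte counts vs. a fixed-width encoding (UTF-32)
samples = ["A", "é", "€", "😀"]

for ch in samples:
    utf8_len = len(ch.encode("utf-8"))       # 1, 2, 3, 4 bytes respectively
    utf32_len = len(ch.encode("utf-32-be"))  # always 4 bytes per character
    print(f"{ch!r}: UTF-8 = {utf8_len} byte(s), UTF-32 = {utf32_len} bytes")

For mostly-ASCII text (source code, config files, English prose), that means UTF-8 is no larger than plain ASCII would be.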