How many bits are used to represent Unicode, ASCII, UTF-16, and UTF-8 characters in Java?

+1 vote
629 views
posted Mar 24, 2014 by Yogeshwar Thakur


1 Answer

0 votes

In Java, a char is a 16-bit UTF-16 code unit, so Unicode characters are represented with 16 bits, while ASCII requires only 7 bits. Although the ASCII character set uses just 7 bits, it is usually stored in 8 bits. UTF-8 encodes characters using 8-, 16-, 24-, or 32-bit patterns (1 to 4 bytes). UTF-16 uses 16-bit units, with characters outside the Basic Multilingual Plane taking two units (a 32-bit surrogate pair).
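One way to see these sizes from Java itself is to encode a few sample characters with each charset and count the resulting bytes. A minimal sketch (the class name and the sample characters are only illustrative):

import java.nio.charset.StandardCharsets;

public class EncodingSizes {
    public static void main(String[] args) {
        // 'A' fits in ASCII; 'é' takes 2 bytes in UTF-8; '€' takes 3 bytes in UTF-8.
        // Note: US-ASCII cannot encode 'é' or '€', so getBytes() replaces them with '?' (1 byte).
        String[] samples = {"A", "é", "€"};
        for (String s : samples) {
            System.out.printf("%s -> US-ASCII: %d, UTF-8: %d, UTF-16BE: %d byte(s)%n",
                    s,
                    s.getBytes(StandardCharsets.US_ASCII).length,
                    s.getBytes(StandardCharsets.UTF_8).length,
                    s.getBytes(StandardCharsets.UTF_16BE).length);
        }
    }
}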

answered Mar 25, 2014 by Sanjay Kumar
Similar Questions
+8 votes

Write a function that swaps the two bytes of an unsigned short; for example, 00000001 11001100 becomes 11001100 00000001:

unsigned short swap_bytes(unsigned short x) {

}
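That thread is about C, but since this page is Java-focused, a rough equivalent in Java could look like the sketch below (swapBytes is only an illustrative name; Short.reverseBytes is the standard library call):

public class SwapBytes {
    // Illustrative helper: swap the two bytes of a 16-bit value held in an int.
    static int swapBytes(int x) {
        return ((x & 0xFF) << 8) | ((x >> 8) & 0xFF);
    }

    public static void main(String[] args) {
        int value = 0b00000001_11001100;                                      // 0x01CC
        System.out.printf("%04X%n", swapBytes(value));                        // CC01
        System.out.printf("%04X%n", Short.reverseBytes((short) value) & 0xFFFF); // CC01
    }
}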

+1 vote

I am looking for an example of Unicode to ASCII conversion that removes diacritics while keeping the base characters (e.g., Klüft to Kluft) and that also handles the conversion of other characters, such as große to grosse.

There used to be a program called any2ascii.py (http://www.haypocalc.com/perso/prog/python/any2ascii.py) that worked well, but the link is now broken and I can't seem to locate it.

I have seen the page "Unicode strings to ASCII ...nicely" (http://www.peterbe.com/plog/unicode-to-ascii), but am looking for a working example.
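No working script is included here, but in Java (the language of this page) the usual approach is to decompose the string with java.text.Normalizer and strip the combining marks; ß has to be mapped by hand because NFD does not decompose it. A minimal sketch (toAscii is only an illustrative name):

import java.text.Normalizer;

public class ToAscii {
    // Illustrative helper: strip diacritics and map ß to ss.
    static String toAscii(String s) {
        String mapped = s.replace("ß", "ss");  // NFD leaves ß unchanged, so handle it explicitly
        String decomposed = Normalizer.normalize(mapped, Normalizer.Form.NFD);
        return decomposed.replaceAll("\\p{InCombiningDiacriticalMarks}+", "");
    }

    public static void main(String[] args) {
        System.out.println(toAscii("Klüft"));  // Kluft
        System.out.println(toAscii("große"));  // grosse
    }
}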

...