I know how to check whether a char is a digit or not like this:
(c > 57) || (c < 48)
Does this actually compare ASCII values?
If the incoming character is a Unicode character, how do I check whether it is a digit or not?
Jerry
Wednesday, September 27, 2006 3:05 PM
The char C# data type is a Unicode character. Comparing a Unicode character with an integer value that represents anything other than a Unicode code point is not a good idea. For example, valid values for the char data type are 0x0000 to 0xFFFF.
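Since a .NET char is already Unicode, the framework's own classification methods are usually the right tool rather than raw ASCII arithmetic. A minimal sketch contrasting the two approaches (the method name IsAsciiDigit is my own, written for illustration):

```csharp
using System;

class DigitCheck
{
    // ASCII-only range test, equivalent to the original C-style check
    static bool IsAsciiDigit(char c) => c >= '0' && c <= '9';

    static void Main()
    {
        // Char.IsDigit understands Unicode digit categories,
        // e.g. Arabic-Indic digits like U+0663, not just '0'..'9'
        Console.WriteLine(char.IsDigit('7'));        // True
        Console.WriteLine(IsAsciiDigit('7'));        // True
        Console.WriteLine(char.IsDigit('\u0663'));   // True (Arabic-Indic three)
        Console.WriteLine(IsAsciiDigit('\u0663'));   // False
    }
}
```

If you only ever care about the Latin digits '0' through '9', comparing against the char literals directly (rather than the raw numbers 48 and 57) reads better and means the same thing.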
Maybe you can explain what you're trying to accomplish?
Thursday, September 28, 2006 4:08 PM
I have some old C code that implements math algorithms such as HMM or Winkler. The code checks whether a char falls within certain ranges (e.g. to exclude Latin characters from the comparison).
I am trying to write a C# version. So if it is an ASCII character, should I compare it the old way, and if it is a Unicode char, compare it against Unicode characters?
Thanks
Monday, October 02, 2006 7:50 AM
Daxu, what you want to do is quite complicated. In C it may appear easy because there's no encoding or decoding between ASCII and Unicode involved; a char is equivalent to a byte.
In .NET, that's not the case. A char in .NET is Unicode, not ASCII, meaning it's 16 bits, not 8. You need to be processing byte data, and with the byte data you need to be able to tell whether you are dealing with ASCII data or Unicode data. If you're dealing with ASCII data, you need to decode it into Unicode if you want to work with chars.
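The decode step above can be sketched with the framework's Encoding class; the byte values here are made-up sample input assumed (for the sake of the example) to be ASCII:

```csharp
using System;
using System.Text;

class DecodeExample
{
    static void Main()
    {
        // Sample input: raw bytes assumed to be ASCII ("HMM1")
        byte[] data = { 0x48, 0x4D, 0x4D, 0x31 };

        // Decode the bytes into Unicode chars before doing char comparisons
        char[] chars = Encoding.ASCII.GetChars(data);

        foreach (char c in chars)
            Console.WriteLine($"{c}: digit = {char.IsDigit(c)}");
    }
}
```

Once the bytes are decoded into chars, all the comparisons happen in Unicode terms, so the ASCII/Unicode split disappears from the comparison logic itself.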
As Mattias points out, the ASCII characters map directly onto the first 128 Unicode code points; but that alone doesn't let you identify an encoding: there's no reliable way to tell whether byte data is ASCII or some Unicode encoding from the data alone. Either you make assumptions about the data, or you have some extra metadata that identifies its encoding.
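The most you can check from the bytes themselves is whether they all fall in the 7-bit range; that's a heuristic of my own suggestion, not a real encoding detector, since valid UTF-8 or UTF-16 text can also happen to contain only such bytes:

```csharp
using System;
using System.Linq;

class AsciiSniff
{
    // Heuristic only: true means every byte is in the 7-bit ASCII range,
    // which is necessary but not sufficient proof the data is ASCII text.
    static bool LooksLikeAscii(byte[] data) => data.All(b => b < 0x80);

    static void Main()
    {
        Console.WriteLine(LooksLikeAscii(new byte[] { 0x41, 0x42 })); // True
        Console.WriteLine(LooksLikeAscii(new byte[] { 0xC3, 0xA9 })); // False (UTF-8 'é')
    }
}
```

In practice, a byte-order mark, an HTTP header, or simply knowing where the data came from is a far more reliable signal than inspecting the bytes.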
If you provide some more detail, someone might be able to provide more guidance.
Monday, October 02, 2006 1:38 PM