I've run into some odd Python 2 behavior involving Unicode and string variables:
This is the expected result, but I want to control the first part (the u"\u2730" literal) dynamically.
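One way to make that first part dynamic is to build the character from its code point instead of hard-coding the literal. A minimal sketch (the `code` variable is a hypothetical input; `chr()` is Python 3, on Python 2 use `unichr()` instead):

```python
code = '2730'            # hypothetical input: the code point as hex text
ch = chr(int(code, 16))  # unichr(int(code, 16)) on Python 2
print(repr(ch))          # the same character as the u"\u2730" literal
```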
Good, so the first part is cast to unicode. Now I declare a string variable and cast it to unicode:
>>> print myvar
It seems that now I can use the variable in my original code, right?
The result, as you can see, is not the original one. It seems that Python is treating myvar as a byte string instead of unicode. Am I missing something?
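A likely explanation, sketched below: the \u escape is only interpreted inside unicode literals (u"..."). In a Python 2 byte-string literal "\u2730" the backslash stays literal, so the variable holds six characters, and unicode(myvar) merely widens them instead of producing U+2730. The usual fix is the 'unicode_escape' codec; this round-trip also runs on Python 3 (where an extra .encode('ascii') is needed because str has no .decode):

```python
raw = '\\u2730'  # six characters: backslash, u, 2, 7, 3, 0 -- what "\u2730" stores in Python 2
# On Python 2: raw.decode('unicode_escape'); Python 3 equivalent:
decoded = raw.encode('ascii').decode('unicode_escape')
print(decoded == '\u2730')  # the escape is now interpreted as one character
```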
Anyway, my final goal is to loop over the Unicode code points from \u0000 to \uFFFF, convert each to a string, and convert that string to hex. Is there an easy way?
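A minimal sketch of that goal, assuming "cast as HEX" means the hex digits of each character's UTF-8 encoding (an assumption, since the question doesn't pin down the target encoding). Written for Python 3; on Python 2, replace chr(i) with unichr(i) and bytes.hex() with .encode('hex'):

```python
def codepoint_to_hex(i):
    """UTF-8 bytes of code point i as a hex string, or None for the
    surrogate range U+D800..U+DFFF, which cannot be UTF-8-encoded."""
    try:
        return chr(i).encode('utf-8').hex()
    except UnicodeEncodeError:
        return None

for i in range(0x0000, 0x10000):
    h = codepoint_to_hex(i)
    # ... use h here; it is None only for the surrogate code points
```

Note the surrogate gap: code points U+D800 through U+DFFF are not valid characters on their own, so any encoding-based approach has to skip or special-case them.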