Hello everyone!

I'm using LLVM to instrument code, and I need to know whether there is a way to distinguish between signed and unsigned integer Values during an optimization pass. I know this was possible in LLVM 1.x, but I'd like to work with LLVM 2.x. Could the debug information be used for this? Is it available during an optimization pass, and how do I access it?

Thanks in advance,
Alberto
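
P.S. To make the question concrete, here is a rough sketch of the kind of pass I'm writing. The pass name SignednessCheck and the printed message are placeholders of mine, and details such as the FunctionPass constructor differ slightly between 2.x releases. As far as I can tell, the IntegerType of an operand only gives me the bit width, not whether the front end treated the value as signed or unsigned:

#include "llvm/Pass.h"
#include "llvm/Function.h"
#include "llvm/DerivedTypes.h"
#include "llvm/Support/InstIterator.h"
#include <iostream>

using namespace llvm;

namespace {
  // Placeholder analysis pass: walks every instruction and reports the
  // width of each integer-typed operand.  The type alone does not tell me
  // whether the value was signed or unsigned in the source program.
  struct SignednessCheck : public FunctionPass {
    static char ID;
    SignednessCheck() : FunctionPass(&ID) {}  // constructor form varies across 2.x releases

    virtual bool runOnFunction(Function &F) {
      for (inst_iterator I = inst_begin(F), E = inst_end(F); I != E; ++I)
        for (unsigned i = 0, n = I->getNumOperands(); i != n; ++i) {
          Value *V = I->getOperand(i);
          if (const IntegerType *IT = dyn_cast<IntegerType>(V->getType()))
            // All I can see here is e.g. i32 -- no sign information.
            std::cerr << "integer operand, width " << IT->getBitWidth() << "\n";
        }
      return false;  // analysis only, the IR is not modified
    }
  };
}

char SignednessCheck::ID = 0;
static RegisterPass<SignednessCheck> X("signedness-check",
                                       "Report integer operands (sketch)");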