Timothee Cour
2015-Feb-19 05:52 UTC
[LLVMdev] recommended workaround for distinguishing signed vs unsigned integers
Since LLVM no longer distinguishes signed from unsigned integers, what is the recommended way to represent a language that does distinguish them? Is it to introduce new types, e.g.:

%SignedI32   = type { i32 }
%UnsignedI32 = type { i32 }

?
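For concreteness, here is a sketch of what the wrapper-type encoding proposed above would look like in LLVM IR. The function name `@add_signed` is hypothetical; note that every arithmetic operation has to unwrap and rewrap the struct:

```llvm
; Hypothetical wrapper types as proposed in the question.
%SignedI32   = type { i32 }
%UnsignedI32 = type { i32 }

define %SignedI32 @add_signed(%SignedI32 %a, %SignedI32 %b) {
  ; Unwrap both operands from the single-field struct.
  %av  = extractvalue %SignedI32 %a, 0
  %bv  = extractvalue %SignedI32 %b, 0
  ; The actual arithmetic is still on a plain i32.
  %sum = add i32 %av, %bv
  ; Rewrap the result.
  %r   = insertvalue %SignedI32 undef, i32 %sum, 0
  ret %SignedI32 %r
}
```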
Tim Northover
2015-Feb-19 17:30 UTC
[LLVMdev] recommended workaround for distinguishing signed vs unsigned integers
On 18 February 2015 at 21:52, Timothee Cour <timothee.cour at gmail.com> wrote:
> Since llvm doesn't distinguish signed vs unsigned integers anymore, what is
> the recommended way to represent a language that distinguishes them? Is that
> to introduce new types, eg:

It probably depends on whether types are dynamic or static.

If static, then the front-end should be keeping track of them anyway, so you should be able to stick with iN and emit the correct operations.

You *could* also do some kind of type aliases like that, but I wouldn't bother: without the struct they're just stripped by LLVM; with the struct you'd do more harm to readability and performance with the extra extractvalue/insertvalue instructions than you'd gain. And you'd probably want the optimisers to strip them as much as possible anyway.

If dynamic, then the actual representation has to include some way of determining at runtime whether a signed or unsigned operation is needed.

Cheers.

Tim.
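To illustrate the static-types approach Tim recommends: the integer type stays a plain iN, and signedness is expressed entirely by which instruction the front-end emits. A small sketch (function names are illustrative, not from the thread):

```llvm
; Signedness lives in the operation, not the type: both functions
; take plain i32 arguments, but emit different instructions.

define i32 @div_signed(i32 %a, i32 %b) {
  %q = sdiv i32 %a, %b        ; signed division
  ret i32 %q
}

define i32 @div_unsigned(i32 %a, i32 %b) {
  %q = udiv i32 %a, %b        ; unsigned division
  ret i32 %q
}

define i1 @lt_signed(i32 %a, i32 %b) {
  %c = icmp slt i32 %a, %b    ; signed less-than
  ret i1 %c
}

define i1 @lt_unsigned(i32 %a, i32 %b) {
  %c = icmp ult i32 %a, %b    ; unsigned less-than
  ret i1 %c
}

; Widening conversions differ too: sign-extend vs zero-extend.
define i64 @widen_signed(i32 %a) {
  %w = sext i32 %a to i64
  ret i64 %w
}

define i64 @widen_unsigned(i32 %a) {
  %w = zext i32 %a to i64
  ret i64 %w
}
```

Operations where signed and unsigned behaviour coincide (add, sub, mul, and, or, xor) have a single instruction; only division, remainder, right shift, comparison, and extension come in signed/unsigned pairs.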