Timothee Cour
2015-Feb-19 06:04 UTC
[LLVMdev] recommended workaround for distinguishing signed vs unsigned integers
Since LLVM doesn't distinguish signed vs unsigned integers anymore, what is the recommended way to represent a language that distinguishes them? Is the answer to introduce new types, e.g.:

    %SignedI32 = type { i32 }
    %UnsignedI32 = type { i32 }

?
David Blaikie
2015-Feb-19 06:16 UTC
[LLVMdev] recommended workaround for distinguishing signed vs unsigned integers
On Wed, Feb 18, 2015 at 10:04 PM, Timothee Cour <timothee.cour2+llvm at gmail.com> wrote:

> Since LLVM doesn't distinguish signed vs unsigned integers anymore, what
> is the recommended way to represent a language that distinguishes them?

Distinguishes them how? The IR doesn't have to distinguish everything your source language does. If, for example, your language supports overloading based on type (and unsigned and signed types are distinct types), much like Clang does for C++, you would mangle those source-level types into the function name to create distinct functions, regardless of the matching/non-matching nature of the actual argument types.

Where else do you need to distinguish them?

> Is the answer to introduce new types, e.g.:
> %SignedI32 = type { i32 }
> %UnsignedI32 = type { i32 }
> ?
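As a rough sketch of the mangling idea (the f(int)/f(unsigned) overloads and their Itanium-style mangled names below are only an illustrative example, not anything prescribed by LLVM), the signedness ends up encoded in the symbol names while the IR parameter type is the same plain i32 in both declarations:

    ; hypothetical C++ source:   void f(int);   void f(unsigned);
    ; after Clang/Itanium-style name mangling, signedness lives only
    ; in the symbol name -- both parameters are a plain i32 in the IR:
    declare void @_Z1fi(i32)   ; void f(int)
    declare void @_Z1fj(i32)   ; void f(unsigned int)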
Bruce Hoult
2015-Feb-19 06:27 UTC
[LLVMdev] recommended workaround for distinguishing signed vs unsigned integers
That might be useful if your *CPU* distinguished them (which no current CPUs do -- there are signed and unsigned *operations*, not signed and unsigned values). It is irrelevant to your programming language. You do your type checking in your compiler and emit LLVM code with i32 values, the same as you'd emit machine code with 32-bit values that are neither signed nor unsigned.

On Thu, Feb 19, 2015 at 7:04 PM, Timothee Cour <timothee.cour2+llvm at gmail.com> wrote:

> Since LLVM doesn't distinguish signed vs unsigned integers anymore, what
> is the recommended way to represent a language that distinguishes them?
> Is the answer to introduce new types, e.g.:
> %SignedI32 = type { i32 }
> %UnsignedI32 = type { i32 }
> ?
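As a minimal sketch of what that looks like in the IR (the function names here are made up for illustration), the frontend encodes signedness by choosing the operation, while every value stays a plain i32:

    ; hypothetical helpers -- both sides take plain i32s; only the
    ; instructions chosen by the frontend encode the signedness.

    define i32 @divide_signed(i32 %a, i32 %b) {
      %q = sdiv i32 %a, %b          ; signed division
      ret i32 %q
    }

    define i32 @divide_unsigned(i32 %a, i32 %b) {
      %q = udiv i32 %a, %b          ; unsigned division
      ret i32 %q
    }

    define i1 @less_than_signed(i32 %a, i32 %b) {
      %c = icmp slt i32 %a, %b      ; signed compare
      ret i1 %c
    }

    define i1 @less_than_unsigned(i32 %a, i32 %b) {
      %c = icmp ult i32 %a, %b      ; unsigned compare
      ret i1 %c
    }

    define i64 @widen_signed(i32 %a) {
      %w = sext i32 %a to i64       ; sign-extend
      ret i64 %w
    }

    define i64 @widen_unsigned(i32 %a) {
      %w = zext i32 %a to i64       ; zero-extend
      ret i64 %w
    }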