Displaying 6 results from an estimated 6 matches for "fextended".
2017 Jan 23 (2 replies): Changes to TableGen in v4.0?
I am trying to upgrade to the LLVM v4.0 branch, but I am seeing failures in
my TableGen descriptions for conversion from FP32 to FP16 (scalar and
vector).
The patterns I have are along the lines of:
[(set (f16 RF16:$dst), (fround (f32 RF32:$src)))]
or:
[(set (v2f16 VF16:$dst), (fround (v2f32 VF32:$src)))]
and these now produce errors such as:
error: In CONV_f32_f16: Type inference...
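If the failures stem from the v4.0-era TableGen rename in which the fround node became fpround and fextend became fpextend (fround now naming the round-to-nearest-integer operation), the patterns above would be updated along these lines; a sketch reusing the RF16/RF32/VF16/VF32 register classes from the excerpt:
[(set (f16 RF16:$dst), (fpround (f32 RF32:$src)))]
[(set (v2f16 VF16:$dst), (fpround (v2f32 VF32:$src)))]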
2015 Feb 12 (3 replies): [LLVMdev] half to float intrinsic promotion
Hi Guys,
I am trying to promote half to float for my intrinsic math operations; the following class and pattern are defined:
"
class S_HF__HF<string asmstr> : Intrinsic
    <[llvm_float_ty], [llvm_float_ty],
     [IntrNoMem],
     !strconcat(asmstr, "_f16")>;
def : Pat<(f16 (int_my_math_f16 f16:$src)), (F2Hsr (FEXTsr f16:$src))>;
"
where FEXTsr is...
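For reference, a promote-through-float lowering normally keeps the promoted operation between the extend and the truncate; a minimal sketch, where MY_MATH_F32 is a hypothetical float-typed instruction standing in for the actual math operation, and FEXTsr/F2Hsr are the extend/truncate instructions named in the excerpt:
def : Pat<(f16 (int_my_math_f16 f16:$src)),
          (F2Hsr (MY_MATH_F32 (FEXTsr f16:$src)))>;
Note that the pattern as quoted contains no math operation at all, so it would lower the intrinsic to a bare f16-to-f32 round trip.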
2012 Mar 31 (2 replies): [LLVMdev] Mangling of UTF-8 characters in symbol names
...>
> I think it's just so that we have a way to actually write out the
> symbol into the assembly file. What does gcc do?
>
> -Eli
>
>
It emits the high bits literally. The consequence is that UTF-8-encoded
identifiers come out in UTF-8:
scshunt@natural-flavours:~$ gcc -fextended-identifiers -std=c99 -x c -c -o test.o -
int i\u03bb;
scshunt@natural-flavours:~$ nm test.o
00000004 C iλ
scshunt@natural-flavours:~$
As you can see, the nm output includes the literal lambda.
Sean
2012 Mar 30 (2 replies): [LLVMdev] Mangling of UTF-8 characters in symbol names
Why is it that high (>127) bytes in symbol names get mangled by LLVM into
_XX_, where XX is the hex representation of the character? Is this required
by ELF or some similar standard? This behavior is inconsistent with GCC.
Sean
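Under the _XX_ scheme described here, an identifier such as iλ (UTF-8 bytes 0xCE 0xBB) would presumably be emitted as something like i_CE__BB_; this is a hypothetical illustration, as the thread never shows the escaped form, but it contrasts with GCC, which writes the bytes through unchanged.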
2012 Mar 30 (0 replies): [LLVMdev] Mangling of UTF-8 characters in symbol names
On Fri, Mar 30, 2012 at 12:12 PM, Sean Hunt <scshunt at csclub.uwaterloo.ca> wrote:
> Why is it that high (>127) bytes in symbol names get mangled by LLVM into
> _XX_, where XX is the hex representation of the character? Is this required
> by ELF or some similar standard? This behavior is inconsistent with GCC.
I think it's just so that we have a way to actually write out...
2012 Mar 31 (0 replies): [LLVMdev] Mangling of UTF-8 characters in symbol names
...that we have a way to actually write out the
>> symbol into the assembly file. What does gcc do?
>>
>> -Eli
>>
>
> It emits the high bits literally. The consequence is that UTF-8-encoded
> identifiers come out in UTF-8:
>
> scshunt@natural-flavours:~$ gcc -fextended-identifiers -std=c99 -x c -c -o test.o -
> int i\u03bb;
> scshunt@natural-flavours:~$ nm test.o
> 00000004 C iλ
> scshunt@natural-flavours:~$
>
> As you can see, the nm output includes the literal lambda.
Okay... then we should probably support that as well. Might nee...