Displaying 6 results from an estimated 6 matches for "fextend".
2017 Jan 23 · 2 · Changes to TableGen in v4.0?
...[(set (v4f16 VF16:$dst), (fround (v4f32 VF32:$src)))]
and 'CONV_v4f32_v4f16'. What adjustments do I need to make to the TD
descriptions to make these work again? I know that FP16 is not hugely
common on the mainstream platforms, but it is vital to ours.
I did notice that 'fextend' was replaced by 'fpextend'; is there some new
ISD node type I should use for 'fround'?
Thanks,
MartinO
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.llvm.org/pipermail/llvm-dev/attachments/20170123/7335c...
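For the pattern quoted above, the v4.0 rename that turned 'fextend' into 'fpextend' also appears to have renamed the FP_ROUND node from 'fround' to 'fpround' (the name 'fround' now refers to ISD::FROUND instead). A minimal sketch of the adjusted pattern list, reusing the VF16/VF32 register classes and value types from the post:

```tablegen
// LLVM 4.0 node names: fpextend (was fextend), fpround (was fround).
// VF16/VF32 are the poster's register classes, not stock LLVM ones.
[(set (v4f16 VF16:$dst), (fpround (v4f32 VF32:$src)))]
```

The reverse direction would use 'fpextend' in the same position, assuming the 'CONV_v4f32_v4f16' instruction otherwise keeps its existing operand structure.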
2015 Feb 12 · 3 · [LLVMdev] half to float intrinsic promotion
...re defined.
"
class S_HF__HF<string asmstr> : Intrinsic<
    [llvm_float_ty], [llvm_float_ty],
    [IntrNoMem],
    !strconcat(asmstr, "_f16")>;

def : Pat<(f16 (int_my_math_f16 f16:$src)), (F2Hsr (FEXTsr f16:$src))>;
"
where FEXTsr implements the fextend type profile and F2Hsr implements the
float-to-half conversion. 'int_my_math_f16' implements the 'S_HF__HF'
profile above.
For testing, I am just trying to
(1) convert the $src from f16 to f32 using FEXTsr, and
(2) use F2Hsr to convert the f32 back to f16.
However, I always got the...
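The two-step round-trip described above collapses into the single pattern the poster quotes; restated here with per-step comments (FEXTsr, F2Hsr, and int_my_math_f16 are the poster's own definitions, not stock LLVM records):

```tablegen
// (1) FEXTsr widens the f16 operand to f32 (fextend type profile).
// (2) F2Hsr truncates the f32 result back to f16.
def : Pat<(f16 (int_my_math_f16 f16:$src)),
          (F2Hsr (FEXTsr f16:$src))>;
```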
2012 Mar 31 · 2 · [LLVMdev] Mangling of UTF-8 characters in symbol names
...>
> I think it's just so that we have a way to actually write out the
> symbol into the assembly file. What does gcc do?
>
> -Eli
It emits the high bits literally. The consequence is that UTF-8-encoded
identifiers come out in UTF-8:
scshunt at natural-flavours:~$ gcc -fextended-identifiers -std=c99 -x c -c -o
test.o -
int i\u03bb;
scshunt at natural-flavours:~$ nm test.o
00000004 C iλ
scshunt at natural-flavours:~$
As you can see, the nm output includes the literal lambda.
Sean
2012 Mar 30 · 2 · [LLVMdev] Mangling of UTF-8 characters in symbol names
Why is it that high (>127) bytes in symbol names get mangled by LLVM into
_XX_, where XX is the hex representation of the character? Is this required
by ELF or some similar standard? This behavior is inconsistent with GCC.
Sean
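The _XX_ scheme described above can be sketched in a few lines. This is an illustrative reconstruction of the behavior being asked about (each byte above 127 written as an underscore-delimited hex pair), not LLVM's actual mangling code:

```python
def mangle(name: bytes) -> str:
    """Rewrite each high (>127) byte of a symbol name as _XX_ (hex)."""
    out = []
    for b in name:
        if b > 127:
            out.append(f"_{b:02X}_")  # high byte -> underscore-delimited hex
        else:
            out.append(chr(b))        # plain ASCII passes through unchanged
    return "".join(out)

# 'iλ' is 0x69 0xCE 0xBB in UTF-8, so the lambda's two bytes each get
# rewritten, matching the behavior the question describes.
print(mangle("iλ".encode("utf-8")))  # i_CE__BB_
```

GCC, as the thread notes, instead emits the high bytes literally, which is why nm shows the lambda intact.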
2012 Mar 30 · 0 · [LLVMdev] Mangling of UTF-8 characters in symbol names
On Fri, Mar 30, 2012 at 12:12 PM, Sean Hunt <scshunt at csclub.uwaterloo.ca> wrote:
> Why is it that high (>127) bytes in symbol names get mangled by LLVM into
> _XX_, where XX is the hex representation of the character? Is this required
> by ELF or some similar standard? This behavior is inconsistent with GCC.
I think it's just so that we have a way to actually write out
2012 Mar 31 · 0 · [LLVMdev] Mangling of UTF-8 characters in symbol names
...that we have a way to actually write out the
>> symbol into the assembly file. What does gcc do?
>>
>> -Eli
>>
>
> It emits the high bits literally. The consequence is that UTF-8-encoded
> identifiers come out in UTF-8:
>
> scshunt at natural-flavours:~$ gcc -fextended-identifiers -std=c99 -x c -c -o
> test.o -
> int i\u03bb;
> scshunt at natural-flavours:~$ nm test.o
> 00000004 C iλ
> scshunt at natural-flavours:~$
>
> As you can see, the nm output includes the literal lambda.
Okay... then we should probably support that as well. Might n...