Displaying 4 results from an estimated 4 matches for "gradiations".
2018 Apr 04
0
[RFC] Adding function attributes to represent codegen optimization level
...e way special semantically distinct properties
(optnone being "good for debugging" (or good for debugging compilers -
what's the baseline behavior before optimizations are applied), optsize
being "make this fit into something it wouldn't otherwise fit into") but
that the gradations of -ON didn't fit into this kind of model & wouldn't
ever be implemented as function attributes.
CC'd Chandler & Eric who I think had opinions/were involved in those
previous discussions.
>
> Assuming the argument is reasonable (it makes sense to me), I was hoping
> t...
2018 Apr 04
2
[RFC] Adding function attributes to represent codegen optimization level
...e way special semantically distinct properties (optnone being "good for debugging" (or good for debugging compilers - what's the baseline behavior before optimizations are applied), optsize being "make this fit into something it wouldn't otherwise fit into") but that the gradations of -ON didn't fit into this kind of model & wouldn't ever be implemented as function attributes.
CC'd Chandler & Eric who I think had opinions/were involved in those previous discussions.
Assuming the argument is reasonable (it makes sense to me), I was hoping
to solicit fee...
2018 Apr 03
5
[RFC] Adding function attributes to represent codegen optimization level
All,
A recent commit, D43040/r324557, changed the behavior of the gold plugin
when compiling with LTO. The change now causes the codegen optimization
level to default to CodeGenOpt::Default (i.e., -O2) rather than use the
LTO optimization level. The argument was made that the LTO optimization
level should control the amount of cross-module optimizations done by
LTO, but it should not
2018 Apr 04
0
[RFC] Adding function attributes to represent codegen optimization level
...ly distinct
> properties (optnone being "good for debugging" (or good for debugging
> compilers - what's the baseline behavior before optimizations are
> applied), optsize being "make this fit into something it wouldn't
> otherwise fit into") but that the gradations of -ON didn't fit into
> this kind of model & wouldn't ever be implemented as function attributes.
>
> CC'd Chandler & Eric who I think had opinions/were involved in those
> previous discussions.
>
>
> Assuming the argument is reasonable (it makes sense...