On Tue, 17 Apr 2012 08:38:18 -0400
Rafael Espíndola <rafael.espindola at gmail.com> wrote:

> > It also has the problem that there's nothing preventing a
> > pass like GVN from merging a !fpmath-decorated operation with an
> > undecorated operation, and nothing requiring it to strip
> > the !fpmath tag when it does so. (This is a general problem
> > with LLVM metadata semantics.)
>
> Can it? My impression from the last discussion on alias analysis is
> that a pass can drop metadata it doesn't know, but it cannot ignore
> it, so GVN would have to drop the metadata in this case.

I agree; I think that GVN should keep metadata that is the same and
drop metadata that differs. GVN might want to understand how to merge
certain kinds of metadata (line numbers in debug info?), but that is
not strictly necessary. More generally, metadata should probably have
merging rules and a defined API so that metadata merging can be used
by a number of passes (vectorization will need this too). For example,
merging fp-accuracy data should choose the more stringent bound rather
than removing the data altogether.

 -Hal

> > Dan
>
> Cheers,
> Rafael

--
Hal Finkel
Postdoctoral Appointee
Leadership Computing Facility
Argonne National Laboratory
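Hal's merge rule can be sketched in a few lines of C++, assuming (purely for illustration) that an instruction's !fpmath tag is modelled as an optional maximum error in ULPs; the helper name and types here are hypothetical and are not LLVM's actual API:

    #include <algorithm>
    #include <iostream>
    #include <optional>

    // Hypothetical stand-in for an instruction's !fpmath tag: the maximum
    // error, in ULPs, that the operation is allowed to introduce.  An empty
    // optional means "no tag", i.e. full precision is required.
    using FPMathBound = std::optional<float>;

    // Sketch of the merge rule from the thread: when two otherwise-equivalent
    // operations are combined (e.g. by GVN), the survivor must satisfy every
    // user, so the more stringent (smaller) bound wins; if either operation
    // carries no tag, the merged operation must not carry one either.
    FPMathBound mergeFPMath(FPMathBound A, FPMathBound B) {
      if (!A || !B)
        return std::nullopt;      // one side demands full precision
      return std::min(*A, *B);    // keep the tighter accuracy bound
    }

    int main() {
      std::cout << *mergeFPMath(2.5f, 0.5f) << "\n";                    // 0.5
      std::cout << mergeFPMath(2.5f, std::nullopt).has_value() << "\n"; // 0
    }

Under this rule a tag survives a GVN-style merge only when both operations carry one, and the merged operation honours the tighter of the two bounds, which is the strongest guarantee any user of either original value could have been relying on.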
2012/4/17 Hal Finkel <hfinkel at anl.gov>:
> More generally, metadata should probably have merging rules and a
> defined API so that metadata merging can be used by a number of
> passes (vectorization will need this too). For example, merging
> fp-accuracy data should choose the more stringent bound rather than
> removing the data altogether.

+1!

The original intent of metadata was to represent ancillary concepts,
and the rule that it is always safe to ignore it was there more to
avoid complex semantics than out of any specific intent.

Debug data should never be discarded if the user specified -g. It can
be merged or changed, but never discarded. Otherwise, what's the point?

Vectorisation is the same: if you do part of a transformation based on
some piece of metadata, rely on a subsequent pass to finish that
transformation, and the metadata is destroyed in between, you get
incorrect results.

With FP precision, you might be able to use different instructions or
ignore specific traps under more relaxed models, and the back-end can
only know that if the metadata is kept until the end.

I appreciate the cost of strict metadata semantics to the IR, but I
think we're at the point where we either use a decent metadata engine
or none at all.

--
cheers,
--renato

http://systemcall.org/
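The back-end decision Renato alludes to can be illustrated with a small sketch: a hypothetical target that has both an exact divide and a cheaper estimate sequence with a known worst-case error, where picking the fast form is only legal while the !fpmath bound is still attached. The names and the 2-ULP figure below are illustrative, not taken from any real target:

    #include <iostream>
    #include <optional>

    // Hypothetical lowering decision: a back-end that has both an exact
    // divide and a cheap reciprocal-estimate sequence accurate to a known
    // number of ULPs may pick the cheap form only if the instruction still
    // carries an !fpmath bound loose enough to allow it.
    enum class DivLowering { ExactDivide, ReciprocalEstimate };

    DivLowering selectDivLowering(std::optional<float> fpmathUlps,
                                  float estimateErrorUlps) {
      // No metadata (or metadata dropped by an earlier pass) forces the
      // conservative choice: a full-precision divide.
      if (!fpmathUlps)
        return DivLowering::ExactDivide;
      return *fpmathUlps >= estimateErrorUlps ? DivLowering::ReciprocalEstimate
                                              : DivLowering::ExactDivide;
    }

    int main() {
      const float estimateError = 2.0f;  // assumed accuracy of the fast form
      std::cout << (selectDivLowering(2.5f, estimateError) ==
                    DivLowering::ReciprocalEstimate)
                << "\n";  // 1: relaxed model, fast form is legal
      std::cout << (selectDivLowering(std::nullopt, estimateError) ==
                    DivLowering::ExactDivide)
                << "\n";  // 1: tag was lost, must stay exact
    }

If an earlier pass silently drops the tag, the only safe answer is the exact form, which is exactly the lost optimization Renato is worried about.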
On Tue, Apr 17, 2012 at 9:16 AM, Renato Golin <rengolin at systemcall.org> wrote:
> With FP precision, you might be able to use different instructions or
> ignore specific traps under more relaxed models, and the back-end can
> only know that if the metadata is kept until the end.
>
> I appreciate the cost of strict metadata semantics to the IR, but I
> think we're at the point where we either use a decent metadata engine
> or none at all.

The point isn't whether it's a good idea to discard FP precision data
(or vectorization data, or debug data), but rather whether IR
transformations are allowed to treat a metadata-annotated instruction
as having the same semantics as an unannotated instruction. If
transformation passes which aren't metadata-aware aren't allowed to
reason about an annotated instruction, there's no point to metadata in
the first place: we can just introduce new intrinsics/instructions, or
change the definition of an existing instruction/intrinsic.

-Eli
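Eli's distinction can be made concrete with a toy value-numbering sketch, assuming (hypothetically) a table keyed only on base semantics, so a pass that is not metadata-aware treats a tagged and an untagged fdiv as the same value; none of the names below come from LLVM's implementation:

    #include <iostream>
    #include <map>
    #include <optional>
    #include <string>
    #include <tuple>
    #include <vector>

    // Toy model: a GVN-style pass that is not metadata-aware numbers
    // instructions by base semantics only (opcode + operand value numbers),
    // so an !fpmath-tagged fdiv and an untagged fdiv land on the same key.
    struct ToyInst {
      std::string opcode;
      int lhs, rhs;                    // value numbers of the operands
      std::optional<float> fpmathUlps; // metadata is not part of the key
    };

    int main() {
      std::vector<ToyInst> insts = {
          {"fdiv", 1, 2, 2.5f},          // tagged: up to 2.5 ULPs of error
          {"fdiv", 1, 2, std::nullopt},  // untagged: full precision required
      };

      std::map<std::tuple<std::string, int, int>, std::size_t> leader;
      for (std::size_t i = 0; i < insts.size(); ++i) {
        auto key = std::make_tuple(insts[i].opcode, insts[i].lhs, insts[i].rhs);
        auto [it, isNew] = leader.emplace(key, i);
        if (!isNew) {
          // The two are "equal" by base semantics.  Keeping the 2.5-ULP tag
          // on the survivor would silently weaken what the untagged use was
          // promised, so the differing metadata is dropped.
          insts[it->second].fpmathUlps.reset();
          std::cout << "merged instruction " << i << " into " << it->second
                    << "; fpmath metadata dropped\n";
        }
      }
    }

The pass still reasons about the annotated instruction as if it were unannotated, but only because the differing metadata is dropped from the survivor, which is the conservative policy discussed earlier in the thread.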