Quentin Colombet via llvm-dev
2018-Nov-16 17:25 UTC
[llvm-dev] [RFC] Tablegen-erated GlobalISel Combine Rules
Hi Daniel,

I finally read the proposal. I have a couple of comments that reinforce what I was thinking initially: it is too early, IMHO, to develop a full tablegen backend for this. I believe we still lack ways to represent things that matter (see my comments below) and I am afraid we still don't know everything that needs to be represented.

> It can reach across basic blocks when it's safe to do so. We may need to introduce a means to limit the inter-basic-block reach as I can foresee situations where it may pull expensive operations into a loop if it's not careful about this

I think we would need a syntax for that, and in particular I don't think a generic heuristic based on frequency can solve the problem on its own (i.e., what you're hinting at in your RFC).

Another thing that we miss has to do with the number of users. E.g., should a pattern only be applied if there is only one user? Is it valid if there is more than one user? If we cannot move something into the destination block, can we still apply the pattern if we move the result into the source block? Etc. The number of users may not be that relevant in itself, but it gives us a notion of profitability (that is just how SDISel does it). This actually ties back to the inter-basic-block case: is the pattern profitable if it spans basic blocks with different frequencies?

In other words, I believe we should start to think in terms of how profitable a pattern is. E.g., the source pattern has some cost and the destination pattern has some other cost. Applying the pattern should give us a profitability score, which would be higher when it produces dead code (i.e., what the oneUse check tries to identify in SDISel).

Following that logic, we may get away without all the disabling-specific-patterns kind of thing. I.e., if something is not profitable, it does not get applied.

All in all, the syntax you're suggesting is reasonable (though IMHO MIR is going to limit our ability to share more logic with IR, but maybe that's fine to give up on that goal?), but I feel it actually distracts us from thinking about the problems we are solving, and thus I am hesitant to stamp work that would write a new tablegen backend.

That said, again, I am not the one doing the work!

Cheers,
-Quentin

[snip]
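For context, the "oneUse check" referred to here is the single-use guard that DAGCombiner-style folds place on intermediate values before rewriting them. A minimal C++ sketch follows; the specific fold (fmul + fadd into fma) is only a stand-in chosen for illustration, and the function is a fragment rather than an actual DAGCombiner entry point. SDValue::hasOneUse() and SelectionDAG::getNode() are the real APIs.

  #include "llvm/CodeGen/SelectionDAG.h"
  using namespace llvm;

  // Sketch only: why SDISel folds guard on hasOneUse(). If the inner node
  // still has other users after the fold, it stays live and the combine
  // creates extra work instead of producing dead code -- exactly the
  // profitability question raised above.
  static SDValue tryFormFMA(SelectionDAG &DAG, SDNode *FAdd) {
    SDValue Mul = FAdd->getOperand(0);
    SDValue Addend = FAdd->getOperand(1);
    if (Mul.getOpcode() != ISD::FMUL)
      return SDValue();
    // Only profitable if the multiply dies once it is folded away.
    if (!Mul.hasOneUse())
      return SDValue();
    return DAG.getNode(ISD::FMA, SDLoc(FAdd), FAdd->getValueType(0),
                       Mul.getOperand(0), Mul.getOperand(1), Addend);
  }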
Daniel Sanders via llvm-dev
2018-Nov-26 17:59 UTC
[llvm-dev] [RFC] Tablegen-erated GlobalISel Combine Rules
Hi Quentin,

Sorry for the slow reply.

> On Nov 16, 2018, at 09:25, Quentin Colombet <quentin.colombet at gmail.com> wrote:
>
> Hi Daniel,
>
> I finally read the proposal. I have a couple of comments that reinforce what I was thinking initially: it is too early, IMHO, to develop a full tablegen backend for this. I believe we still lack ways to represent things that matter (see my comments below) and I am afraid we still don't know everything that needs to be represented.
>
>> It can reach across basic blocks when it's safe to do so. We may need to introduce a means to limit the inter-basic-block reach as I can foresee situations where it may pull expensive operations into a loop if it's not careful about this
>
> I think we would need a syntax for that, and in particular I don't think a generic heuristic based on frequency can solve the problem on its own (i.e., what you're hinting at in your RFC).

That's something I'm intending to look into properly as part of the algorithm work. The syntax can support whatever mechanism we need for this through GIMatchPredicate subclasses, and there's room for syntactic sugar by defining new operator classes.

> Another thing that we miss has to do with the number of users. E.g., should a pattern only be applied if there is only one user? Is it valid if there is more than one user? If we cannot move something into the destination block, can we still apply the pattern if we move the result into the source block? Etc.

The existing hasOneUse() checks can be done by defining a GIMatchPredicate like so:

  def hasOneUse : GIMatchPredicate<(ins reg:$A), bool, [{
    return MRI.hasOneUse(${A});
  }]>;

and then using it in the match section. Improving on hasOneUse() to support the cases where some/all uses are combinable will require algorithm changes, and I'm intending to look into that as part of that work.

> The number of users may not be that relevant in itself, but it gives us a notion of profitability (that is just how SDISel does it). This actually ties back to the inter-basic-block case: is the pattern profitable if it spans basic blocks with different frequencies?
>
> In other words, I believe we should start to think in terms of how profitable a pattern is. E.g., the source pattern has some cost and the destination pattern has some other cost. Applying the pattern should give us a profitability score, which would be higher when it produces dead code (i.e., what the oneUse check tries to identify in SDISel).
>
> Following that logic, we may get away without all the disabling-specific-patterns kind of thing. I.e., if something is not profitable, it does not get applied.

I agree. We'll need to decide on precise means to measure profitability, but the underlying mechanism to support this is the metadata. I'd expect a portion of the cost measurement to be gathered implicitly (more nodes => higher cost, some contribution from the scheduler model, etc.) and a portion to be gathered explicitly via GIMatchPredicate. Those values would then be delivered to the apply step via metadata.

> All in all, the syntax you're suggesting is reasonable (though IMHO MIR is going to limit our ability to share more logic with IR, but maybe that's fine to give up on that goal?), but I feel it actually distracts us from thinking about the problems we are solving, and thus I am hesitant to stamp work that would write a new tablegen backend.

I think on that last point we're drawing opposite conclusions from the same thing :-). I think you see the syntax as defining the Combine algorithm (is that right?), but I see the syntax as abstracting it away from the rules. One of my key goals with this syntax is to allow the reorganization of the match and apply steps so that I can try out a different Combine algorithm without blocking the development of Combines with the current algorithm.

> That said, again, I am not the one doing the work!
>
> Cheers,
> -Quentin
> [snip]
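To make the hasOneUse predicate and the match/apply split above concrete, here is a hand-written C++ sketch of roughly what a generated matcher could boil down to. The rule shape (a sign-extend-of-load combine), the ExtLoadMatchInfo struct, and the function name are assumptions chosen for illustration, not output described in the RFC; the real APIs used are MachineRegisterInfo::getVRegDef() and MachineRegisterInfo::hasOneUse().

  #include "llvm/CodeGen/MachineInstr.h"
  #include "llvm/CodeGen/MachineRegisterInfo.h"
  #include "llvm/CodeGen/TargetOpcodes.h"
  using namespace llvm;

  // Hypothetical metadata carried from the match step to the apply step.
  struct ExtLoadMatchInfo {
    MachineInstr *Ext = nullptr;
    MachineInstr *Load = nullptr;
  };

  // Roughly what a matcher for "fold G_SEXT of a G_LOAD into an extending
  // load" might check by hand. The hasOneUse GIMatchPredicate from the mail
  // corresponds to the MRI.hasOneUse() call below.
  static bool matchSExtOfLoad(MachineInstr &MI, MachineRegisterInfo &MRI,
                              ExtLoadMatchInfo &Info) {
    if (MI.getOpcode() != TargetOpcode::G_SEXT)
      return false;
    MachineInstr *Def = MRI.getVRegDef(MI.getOperand(1).getReg());
    if (!Def || Def->getOpcode() != TargetOpcode::G_LOAD)
      return false;
    // Only combine if the load's result has no other users, so the original
    // load becomes dead once the apply step rewrites the sign-extend.
    if (!MRI.hasOneUse(Def->getOperand(0).getReg()))
      return false;
    Info.Ext = &MI;
    Info.Load = Def;
    return true;
  }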
Quentin Colombet via llvm-dev
2018-Nov-26 18:11 UTC
[llvm-dev] [RFC] Tablegen-erated GlobalISel Combine Rules
Hi Daniel,

Thanks for the reply!

On Mon, Nov 26, 2018 at 10:00, Daniel Sanders <daniel_l_sanders at apple.com> wrote:
>
> Hi Quentin,
>
> Sorry for the slow reply.
>
> > On Nov 16, 2018, at 09:25, Quentin Colombet <quentin.colombet at gmail.com> wrote:
> >
> > Hi Daniel,
> >
> > I finally read the proposal. I have a couple of comments that reinforce what I was thinking initially: it is too early, IMHO, to develop a full tablegen backend for this. I believe we still lack ways to represent things that matter (see my comments below) and I am afraid we still don't know everything that needs to be represented.
> >
> >> It can reach across basic blocks when it's safe to do so. We may need to introduce a means to limit the inter-basic-block reach as I can foresee situations where it may pull expensive operations into a loop if it's not careful about this
> >
> > I think we would need a syntax for that, and in particular I don't think a generic heuristic based on frequency can solve the problem on its own (i.e., what you're hinting at in your RFC).
>
> That's something I'm intending to look into properly as part of the algorithm work. The syntax can support whatever mechanism we need for this through GIMatchPredicate subclasses, and there's room for syntactic sugar by defining new operator classes.

That's the kind of thing that worries me. My cynical view makes me think that more syntactic sugar sounds like developing a hack on top of the suggested syntax, and thus I'm pushing for more forward thinking. I admit I may be wrong and that additional syntactic sugar would fit well, but I haven't seen evidence one way or the other. Therefore, again, I am hesitant to stamp a new tablegen backend based on the few elements we have.

> > Another thing that we miss has to do with the number of users. E.g., should a pattern only be applied if there is only one user? Is it valid if there is more than one user? If we cannot move something into the destination block, can we still apply the pattern if we move the result into the source block? Etc.
>
> The existing hasOneUse() checks can be done by defining a GIMatchPredicate like so:
>
>   def hasOneUse : GIMatchPredicate<(ins reg:$A), bool, [{
>     return MRI.hasOneUse(${A});
>   }]>;
>
> and then using it in the match section. Improving on hasOneUse() to support the cases where some/all uses are combinable will require algorithm changes, and I'm intending to look into that as part of that work.
>
> > The number of users may not be that relevant in itself, but it gives us a notion of profitability (that is just how SDISel does it). This actually ties back to the inter-basic-block case: is the pattern profitable if it spans basic blocks with different frequencies?
> >
> > In other words, I believe we should start to think in terms of how profitable a pattern is. E.g., the source pattern has some cost and the destination pattern has some other cost. Applying the pattern should give us a profitability score, which would be higher when it produces dead code (i.e., what the oneUse check tries to identify in SDISel).
> >
> > Following that logic, we may get away without all the disabling-specific-patterns kind of thing. I.e., if something is not profitable, it does not get applied.
>
> I agree. We'll need to decide on precise means to measure profitability, but the underlying mechanism to support this is the metadata. I'd expect a portion of the cost measurement to be gathered implicitly (more nodes => higher cost, some contribution from the scheduler model, etc.) and a portion to be gathered explicitly via GIMatchPredicate. Those values would then be delivered to the apply step via metadata.
>
> > All in all, the syntax you're suggesting is reasonable (though IMHO MIR is going to limit our ability to share more logic with IR, but maybe that's fine to give up on that goal?), but I feel it actually distracts us from thinking about the problems we are solving, and thus I am hesitant to stamp work that would write a new tablegen backend.
>
> I think on that last point we're drawing opposite conclusions from the same thing :-). I think you see the syntax as defining the Combine algorithm (is that right?)

No, I really worry that we would tie ourselves to a syntax that is not the right abstraction to define the work. Again, where I would like to go is a description that can be used for high-level combines as well, and MIR seems wrong for this.

> but I see the syntax as abstracting it away from the rules. One of my key goals with this syntax is to allow the reorganization of the match and apply steps so that I can try out a different Combine algorithm without blocking the development of Combines with the current algorithm.
>
> > That said, again, I am not the one doing the work!
> >
> > Cheers,
> > -Quentin
> > [snip]
>
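As a closing illustration of the profitability-score idea discussed in this exchange (source cost versus destination cost, weighted by block frequency, with dead code as the payoff), here is a toy C++ sketch. The scoring model itself is an assumption, not existing GlobalISel machinery; only MachineBlockFrequencyInfo and the MachineRegisterInfo queries are real APIs.

  #include "llvm/ADT/ArrayRef.h"
  #include "llvm/ADT/STLExtras.h"
  #include "llvm/CodeGen/MachineBlockFrequencyInfo.h"
  #include "llvm/CodeGen/MachineInstr.h"
  #include "llvm/CodeGen/MachineRegisterInfo.h"
  using namespace llvm;

  // Toy model: each matched instruction that would become dead is a saving
  // weighted by its block's frequency; each instruction the apply step emits
  // is a cost weighted by the root block's frequency. Positive => profitable.
  static int64_t scoreCombine(ArrayRef<MachineInstr *> Matched,
                              unsigned NumEmitted,
                              const MachineBlockFrequencyInfo &MBFI,
                              const MachineRegisterInfo &MRI) {
    int64_t Score = 0;
    for (MachineInstr *MI : Matched) {
      uint64_t Freq = MBFI.getBlockFreq(MI->getParent()).getFrequency();
      // Crude stand-in for the oneUse check: assume an instruction only dies
      // (and so only counts as a saving) if each of its results has a single
      // user, presumably the matched consumer that is being rewritten.
      bool Dies = all_of(MI->defs(), [&](const MachineOperand &Def) {
        return MRI.hasOneUse(Def.getReg());
      });
      if (Dies)
        Score += static_cast<int64_t>(Freq);
    }
    uint64_t RootFreq =
        MBFI.getBlockFreq(Matched.front()->getParent()).getFrequency();
    Score -= static_cast<int64_t>(NumEmitted) * static_cast<int64_t>(RootFreq);
    return Score;
  }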