I’m working on a backend <https://github.com/avr-llvm/llvm> for the 8-bit AVR microcontroller architecture. As you can imagine, there is no support for hardware division. Because of this, LLVM automatically lowers all divisions into calls to the runtime library, be it libgcc or compiler-rt.

The runtime function in libgcc that performs 16-bit unsigned division/remainder is called __udivmodhi4. LLVM by default assumes this function to be named __udivhi3 (see lib/CodeGen/TargetLoweringBase.cpp <https://github.com/llvm-mirror/llvm/blob/master/lib/CodeGen/TargetLoweringBase.cpp#L70>). __udivhi3 does not exist any longer; it has been superseded by __udivmodhi4, which returns both the quotient and the remainder, and therefore has a different signature.

I can change the name of the function in TargetLoweringBase.cpp so that it points to the right function, but LLVM will not know about the change in calling convention (a custom one is used in the AVR implementation of the function), or the differing result type. I cannot figure out how to tell LLVM about the differing signature of the function.

What is the best way to fix this? I can think of several solutions:

- Create a pseudo instruction which pattern matches on integer divisions, and write a pass to expand it into the appropriate runtime library calls.
- Manually write an implementation of the original __udivhi3 function and add it to compiler-rt. This would render binaries compiled with AVR-LLVM incompatible with libgcc.
- Somehow modify LLVM to use the right calling convention, arguments, and result type for the function.

What is the best way to go about solving this?

Thanks,
Dylan
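For context, here is a minimal sketch of the semantics __udivmodhi4 provides. The real avr-libgcc routine is hand-written assembly that returns the quotient and remainder in registers under its own convention; packing both halves into a 32-bit value here (low 16 bits = quotient, high 16 bits = remainder) is purely illustrative, not the actual ABI.

```cpp
#include <cstdint>
#include <cassert>

// Illustrative stand-in for __udivmodhi4: one 16-bit unsigned division
// that yields both the quotient and the remainder.  The packed return
// layout (quotient in the low half, remainder in the high half) is an
// assumption made for this sketch only.
static uint32_t udivmodhi4_sketch(uint16_t num, uint16_t den) {
    uint16_t quot = static_cast<uint16_t>(num / den);
    uint16_t rem  = static_cast<uint16_t>(num % den);
    return static_cast<uint32_t>(quot) | (static_cast<uint32_t>(rem) << 16);
}
```

The point is that a single libcall produces two results, which is what LLVM's default single-result libcall descriptions cannot express.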
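For the third option, a partial sketch of what the target could do in its TargetLowering constructor, assuming the existing TargetLoweringBase hooks setLibcallName and setLibcallCallingConv; CallingConv::AVR_BUILTIN is a hypothetical stand-in for whatever custom convention the AVR routine actually uses.

```cpp
// Hypothetical fragment of an AVRTargetLowering constructor: point the
// 16-bit unsigned division/remainder libcalls at the libgcc divmod
// helper and record its (assumed) custom calling convention.
setLibcallName(RTLIB::UDIV_I16, "__udivmodhi4");
setLibcallName(RTLIB::UREM_I16, "__udivmodhi4");
setLibcallCallingConv(RTLIB::UDIV_I16, CallingConv::AVR_BUILTIN);
setLibcallCallingConv(RTLIB::UREM_I16, CallingConv::AVR_BUILTIN);
```

Note that this only fixes the name and convention: LLVM still models two separate single-result libcalls, and has no way here to learn that one call yields both quotient and remainder, which is exactly the gap the question is about.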