Hi Mikaël!
The 16-bit byte was a major pain back in the day, and we never fixed all of
the known failures. In part, that's because the C standard really wants 8-bit chars.
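Concretely, the standard only guarantees CHAR_BIT >= 8, and a huge amount of
code hard-codes 8 anyway. A minimal sketch of the pattern (plain host C++,
nothing LLVM-specific):

```cpp
#include <climits>
#include <cstdint>
#include <cstdio>

int main() {
  // CHAR_BIT is only guaranteed to be >= 8; on a DCPU16-style target it
  // would be 16, and sizeof would count 16-bit bytes.
  std::printf("CHAR_BIT = %d\n", CHAR_BIT);

  // The classic 8-bit assumption: turning a byte count into a bit count.
  unsigned hardcoded = sizeof(std::uint32_t) * 8;        // right only if CHAR_BIT == 8
  unsigned portable  = sizeof(std::uint32_t) * CHAR_BIT; // right everywhere
  std::printf("hardcoded: %u, portable: %u\n", hardcoded, portable);
  return 0;
}
```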
Btw, why is DCPU16 still a thing? :)
On Jan 11, 2017 12:06 PM, "Mikaël Fourrier via llvm-dev" <llvm-dev at lists.llvm.org> wrote:
Hi.
I'm working on a backend for the [DCPU16](https://github.com/techcompliant/TC-Specs/blob/master/CPU/DCPU.md),
a fictional CPU. The main subtlety is that its bytes are 16 bits instead of 8.
There is already a [working backend](https://github.com/krasin/llvm-dcpu16),
but it does a lot of source modification to support 16-bit words. I tried to
update it to the latest LLVM, but it fails because the new code assumes
1 word == 8 bits. Any ideas on a robust way to write such a backend?
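To make the failure mode concrete, here is a hedged sketch (the helpers are
hypothetical, not LLVM API) of the hard-coded conversion that breaks once a
byte is not 8 bits, next to the parameterized form a 16-bit-byte port has to
thread through the tree:

```cpp
#include <cassert>

// Hypothetical helper, not real LLVM code: the pattern of converting a
// store size to a bit width with a literal 8 shows up throughout backends.
unsigned bytesToBits(unsigned Bytes) {
  return Bytes * 8; // silently wrong on a target whose bytes are 16 bits
}

// The robust form: the byte width comes from the target description
// instead of a literal. On the DCPU16 it would be 16.
unsigned bytesToBits(unsigned Bytes, unsigned BitsPerByte) {
  return Bytes * BitsPerByte;
}

int main() {
  assert(bytesToBits(2) == 16);     // host assumption: 8-bit bytes
  assert(bytesToBits(2, 16) == 32); // DCPU16: 16-bit bytes
  return 0;
}
```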
Have a good day,
Mikaël