On Wed, Jan 11, 2017 at 12:16 PM, Mikaël Fourrier <mikael.fourrier at laposte.net> wrote:
>> 16-bit byte was a major pain back in the day, and we never fixed all
>> known failures. In part, it's because the C standard really wants 8-bit
>> chars.
>
> So no real solution?
My memory is cloudy, since that fun was in 2012, but if I remember
correctly, the major mistake I made with that backend was the decision to
pack two 8-bit chars into one 16-bit memory word. It made it impossible to
have pointers to the odd chars in a string, and complicated everything.
The port might have been cleaner if we had stored one 8-bit char per
16-bit word. In that case, half of the memory used for strings is wasted,
but some things would have been easier.
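To make the trade-off concrete, here is a rough sketch (the names and the
RAM model are mine, not code from the actual port) of what a char load
looks like under the two layouts:

    #include <stdint.h>

    uint16_t ram[0x10000]; /* DCPU16-style word-addressed memory */

    /* Packed layout: two 8-bit chars per 16-bit word. A bare word
     * address can only name the even char; selecting the odd one
     * needs an extra bit carried somewhere in the pointer. */
    static uint8_t load_char_packed(uint16_t word_addr, int odd)
    {
        uint16_t w = ram[word_addr];
        return odd ? (uint8_t)(w >> 8) : (uint8_t)(w & 0xFF);
    }

    /* Unpacked layout: one char per word. Half of each word is
     * wasted, but a char pointer is just a word address, uniform
     * with every other pointer. */
    static uint8_t load_char_unpacked(uint16_t word_addr)
    {
        return (uint8_t)(ram[word_addr] & 0xFF);
    }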
Another issue was pointer arithmetic, and there's no good answer to that:
the fixes were intrusive and non-upstreamable, and they would be the same
if this port were done again.
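For illustration, this is roughly the "fat" char pointer that the packed
layout forces on the compiler (again a hypothetical sketch, not the real
backend's representation): the low bit selects the byte within the word,
so every dereference expands into shift-and-mask code, and char pointers
and word pointers differ by a shift, which is where the intrusive fixes
came from:

    #include <stdint.h>

    extern uint16_t ram[]; /* the word-addressed RAM from the sketch above */

    /* (word_addr << 1) | byte_within_word; note this also halves the
     * range a 16-bit pointer can cover. */
    typedef uint16_t byteptr;

    static uint8_t deref(byteptr p)
    {
        uint16_t w = ram[p >> 1];
        return (p & 1) ? (uint8_t)(w >> 8) : (uint8_t)(w & 0xFF);
    }

    static byteptr char_add(byteptr p, uint16_t n) /* p + n on a char* */
    {
        return p + n; /* cheap, but p can no longer index ram directly */
    }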
The real solution would be to modify the DCPU16 to be friendlier to C
compilers. One way to achieve that is to make the registers 32-bit and
allow addressing memory at 8-bit boundaries. It's okay to keep the amount
of available RAM low, if that adds to the fun.
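With that change the emulation layer simply disappears; in the same
hypothetical RAM model as above, a char load becomes one plain
byte-addressed access:

    #include <stdint.h>

    uint8_t ram8[0x20000]; /* same amount of storage, now byte-addressed */

    static uint8_t deref8(uint32_t p) /* a char* is just an address */
    {
        return ram8[p];
    }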
>> Btw, why is DCPU16 still a thing? :)
>
> https://github.com/techcompliant/. It's a separate team, not related to
> Mojang, which picked up the idea. They are in alpha now.
>
Oh yes, I heard of them
<https://github.com/llvm-dcpu16/llvm-dcpu16/pull/196>
half a year ago. Are they rigid about using the pristine DCPU16? If not,
changes like the ones I mentioned above would make delivering a decent
LLVM backend much easier.
>
> Also because https://github.com/FrOSt-Foundation/cFrOSt ;)
>
Oh!