similar to: [LLVMdev] Preferring to use GCC instead of LLVM

Displaying 20 results from an estimated 40000 matches similar to: "[LLVMdev] Preferring to use GCC instead of LLVM"

2008 May 11
0
[LLVMdev] Preferring to use GCC instead of LLVM
On May 10, 2008, at 7:55 PM, kr512 wrote: >> You are seriously ignorant of what LLVM is all about. >> Please go inform yourself. > > Alright, I read some more on llvm.org and it confirmed what > I was saying: > http://www.llvm.org/docs/GettingStarted.html#tutorial > > See at the end where it says: > ----------- > 6.Compile the program to native assembly using
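A rough sketch of the pipeline that quoted tutorial step describes (the file names and exact commands here are assumptions for illustration, not taken from the message): llc lowers LLVM bitcode to native assembly, and gcc is then invoked only as a driver for the assembler and linker, which is the dependency the whole thread argues about.
--
// Sketch only: a tiny C++ driver mirroring the quoted tutorial steps.
// "hello.bc" and the output names are hypothetical.
#include <cstdlib>
#include <iostream>

int main() {
    // Step 1: lower LLVM bitcode to a native assembly (.s) file with llc.
    if (std::system("llc hello.bc -o hello.s") != 0) {
        std::cerr << "llc failed\n";
        return 1;
    }
    // Step 2: hand the .s file to gcc, which only drives the assembler and linker.
    if (std::system("gcc hello.s -o hello.native") != 0) {
        std::cerr << "assembling/linking failed\n";
        return 1;
    }
    return 0;
}
--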
2008 May 11
0
[LLVMdev] Preferring to use GCC instead of LLVM
Is this thread supposed to be a bad joke? 2008/5/10 kr512 <kr512 at optusnet.com.au>: > > Chris Lattner wrote: >> If you'd prefer to use GCC, go for it. No one is forcing >> you to use LLVM. > > No, we would prefer to use LLVM, but a missing part in LLVM > makes it difficult. It would be wonderful if this missing > part could be supplied. > >> You
2008 May 11
9
[LLVMdev] Preferring to use GCC instead of LLVM
Not that I sympathize with the OP's manners but... Bill Wendling <isanbard at gmail.com> writes: > On May 10, 2008, at 7:55 PM, kr512 wrote: > >> See how gcc is invoked to generate the final executable >> file. This means LLVM is an incomplete backend, >> unfortunately. >> > That's only a convenience. GCC generates assembly code too and calls
2008 May 13
9
[LLVMdev] Preferring to use GCC instead of LLVM
Jon Harrop wrote: > Can you explain why you would like to generate DLLs on the > customer's computer rather than using LLVM as a JIT > compiler? Customers/clients unhappy with the inefficiency, extra CPU and RAM usage, and performance penalty of JIT. They require a faster, more efficient solution. The solution is to fully compile programs to native code at the time of
2008 May 11
0
[LLVMdev] Preferring to use GCC instead of LLVM
On May 11, 2008, at 9:36 AM, Óscar Fuentes wrote: > > Not that I sympathize with the OP's manners but... > > Bill Wendling <isanbard at gmail.com> writes: >> >> That's only a convenience. GCC generates assembly code too and calls >> the assembler and linker as part of it's execution. You are perfectly >> able to call the assembler & linker
2008 May 11
1
[LLVMdev] Preferring to use GCC instead of LLVM
On May 10, 2008, at 8:41 PM, Emílio Wuerges wrote: > Is this thread supposed to be a bad joke? I thought jokes were funny? ;-) -Chris
2008 May 10
4
[LLVMdev] Preferring to use GCC instead of LLVM
Oh another thing, consider this question that some people will be asking: Why not use GCC to do what LLVM does, and skip the hassle of using LLVM entirely? ESPECIALLY considering that LLVM cannot be used without GCC. Even if you are using LLVM as a back-end only, for compiling LLVM bytecode only, GCC is still required to convert the "llc" output assembly .S file into a
2008 May 13
5
[LLVMdev] Preferring to use GCC instead of LLVM
me22.ca wrote: > You said that if I have to install GCC, you might as well > just use it for everything. That statement very clearly > doesn't apply anymore, since it's binutils that's the > dependency. Or if you still stand by it, it means that > you consider GCC to also be "incomplete". How do I get the necessary binutils on Windoze? Install MinGW or
2008 May 13
7
[LLVMdev] LLVM as a DLL
Michael T. Richter wrote: > Apparently the APIs in the LLVM docs missed your > attention. They're sneaky that way because, you know, > they just form the bulk of available documentation. I began my original message saying that I was providing "constructive criticism". That means I want to HELP if I can. Your sarcastic attitude is unprofessional. > The
2008 May 13
3
[LLVMdev] Preferring to use GCC instead of LLVM
Owen Anderson wrote: > There's nothing particularly stopping you from having your > installation package include copies of gas and ld, I disagree. gas and ld are not available on Windoze, except via MinGW. Yes, I can make my customers install MinGW, or tell them to, but if MinGW is installed, then I don't need LLVM. (More about this further ahead) > You're welcome to think
2008 May 13
4
[LLVMdev] Preferring to use GCC instead of LLVM
Jon Harrop wrote: > So LLVM has relatively poor support for Windows, no direct > support for DLL generation and the exact opposite of your > performance requirements. I see. This news is disappointing to me. > I appreciate that you have customer demands but those > demands are very unusual (and, frankly, absurd!) but you > must try to meet them regardless. Very unusual?
2008 May 13
1
[LLVMdev] Preferring to use GCC instead of LLVM
I wrote: > The Solution: Make LLVM usable as a DLL or SLL in Windoze, > capable of generating a finished ready-to-execute .EXE or > .DLL file, without requiring that MinGW or Cygwin be > installed first. Michael T. Richter replied: > You will be welcomed with open arms by the LLVM community > when you write this. I look forward to your announcement > with bated breath.
2007 Aug 16
2
[LLVMdev] Changing basic blocks
On Wed, 15 Aug 2007, [ISO-8859-1] Emílio Wuerges wrote: > -- > int total = BB->size(); > std::vector<MachineInstr*> positionmap(total); > for (int i = 0; i< total; ++i) > positionmap.push_back(BB->remove(BB->begin())); > for(int i = 0; i< total; ++i) > BB->push_back(positionmap[i]); > -- This doesn't do what you think. This line:
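The excerpt cuts off right where the bug is identified, but the follow-up below confirms the real code used .reserve(total). The difference can be shown with plain std::vector and no LLVM at all; this sketch is illustrative, not from the thread:
--
#include <cassert>
#include <vector>

int main() {
    const int total = 3;
    int a = 1, b = 2, c = 3;

    // std::vector<T*> v(total) value-initializes `total` elements, so the
    // vector already holds three null pointers before any push_back runs.
    std::vector<int*> sized(total);
    sized.push_back(&a);
    sized.push_back(&b);
    sized.push_back(&c);
    assert(sized.size() == 6);     // 3 nulls followed by the 3 real pointers
    assert(sized[0] == nullptr);   // reading these back is the bug

    // reserve(total) only preallocates capacity; the vector stays empty,
    // so push_back fills slots 0..2 as intended.
    std::vector<int*> reserved;
    reserved.reserve(total);
    reserved.push_back(&a);
    reserved.push_back(&b);
    reserved.push_back(&c);
    assert(reserved.size() == 3);
    assert(reserved[0] == &a);
    return 0;
}
--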
2007 Aug 17
0
[LLVMdev] Changing basic blocks
Yup, you are right. That exploded. I missed some lines in between; there was a .reserve(total) in the actual code. But there is some side effect I still could not find. 2007/8/16, Chris Lattner <sabre at nondot.org>: > > On Wed, 15 Aug 2007, [ISO-8859-1] Emílio Wuerges wrote: > > -- > > int total = BB->size(); > > std::vector<MachineInstr*>
2008 May 13
0
[LLVMdev] LLVM as a DLL
On Tue, 2008-05-13 at 16:30 +1000, kr512 wrote: > Michael T. Richter wrote: > > Apparently the APIs in the LLVM docs missed your > > attention. They're sneaky that way because, you know, > > they just form the bulk of available documentation. > I began my original message saying that I was providing > "constructive criticism". That means I want to
2007 Aug 10
2
[LLVMdev] Changing basic blocks
For adding the nop: TII->insertNoop(*BB, BB->end()); 2007/8/9, Chris Lattner <sabre at nondot.org>: > > On Thu, 9 Aug 2007, [ISO-8859-1] Emílio Wuerges wrote: > > I too believe it should not be complicated. > > But I was not being able to do it. > > Finally, after some thinking (and tinkering), this worked like a charm: > > > > MachineInstr* mi =
2008 May 13
0
[LLVMdev] Preferring to use GCC instead of LLVM
On Tuesday 13 May 2008 06:49:34 kr512 wrote: > Jon Harrop wrote: > > Can you explain why you would like to generate DLLs on the > > customer's computer rather than using LLVM as a JIT > > compiler? > > Customers/clients unhappy with the inefficiency, extra CPU > and RAM usage, and performance penalty of JIT. They require > a faster, more efficient solution.
2007 Aug 09
4
[LLVMdev] Changing basic blocks
Hi Tanya and everybody, thanks for your support. I too believe it should not be complicated. But I was not able to do it. For instance, I tried to run this code below: BB->push_back(&(BB->front())); BB->pop_front(); But it did not work (kinda obvious why). Nor this: BB->push_back(BB->begin()); BB->pop_front(); That also did not work. It seems the same
2007 Aug 09
0
[LLVMdev] Changing basic blocks
On Thu, 9 Aug 2007, [ISO-8859-1] Emílio Wuerges wrote: > I too believe it should not be complicated. > But I was not being able to do it. > Finally, after some thinking (and tinkering), this worked like a charm: > > MachineInstr* mi = BB->remove(BB->begin()); > BB->push_back(mi); > > But, is there a better way to do it? This is a good way to do a single
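Putting these messages together, a plausible reconstruction of the corrected reordering loop looks like the following. This is an inference from the quoted snippets plus the .reserve(total) remark above, and it assumes the 2007-era MachineBasicBlock interface shown in the thread (size()/remove()/push_back()); it is not code from the archive itself.
--
int total = BB->size();
std::vector<MachineInstr*> positionmap;
positionmap.reserve(total);               // capacity only; the vector stays empty
for (int i = 0; i < total; ++i)
    positionmap.push_back(BB->remove(BB->begin()));   // detach each instruction in order
for (int i = 0; i < total; ++i)
    BB->push_back(positionmap[i]);        // re-insert (here, in the original order)
--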
2008 May 13
0
[LLVMdev] Preferring to use GCC instead of LLVM
> This means that LLVM requires an assembler and linker. Call it > GCC or binutils, it is irrelevant. The OP point is that LLVM > is not a self-sufficient tool on this aspect. > > Of course, if this is a serious problem for the OP, the > correct way of dealing with it is to take constructive, polite > actions for correcting it :-) I know one compiler (Free Pascal) that