Talin
2009-Oct-21 03:18 UTC
[LLVMdev] A few more questions about DIFactory and source-level debugging.
Well, I am much happier now that I understand about dsymutil, and can actually step through my program in gdb. However, there are still some issues that are puzzling me.

1) First off, the debugger appears to stop at odd points. The IR for my main function looks correct to me:

  define i32 @"main(tart.core.Array[tart.core.String])->int"(%"tart.core.Array[tart.core.String]"* %args) {
  entry:
    call void @llvm.dbg.func.start({ }* bitcast (%llvm.dbg.subprogram.type* @llvm.dbg.subprogram to { }*))
    call void @llvm.dbg.stoppoint(i32 6, i32 22, { }* bitcast (%llvm.dbg.compile_unit.type* @llvm.dbg.compile_unit to { }*))
    %testModuleReflection = call { } @testModuleReflection() ; <{ }> [#uses=0]
    call void @llvm.dbg.stoppoint(i32 7, i32 19, { }* bitcast (%llvm.dbg.compile_unit.type* @llvm.dbg.compile_unit to { }*))
    %testModuleMethods = call { } @testModuleMethods() ; <{ }> [#uses=0]
    call void @llvm.dbg.stoppoint(i32 8, i32 16, { }* bitcast (%llvm.dbg.compile_unit.type* @llvm.dbg.compile_unit to { }*))
    %testFindMethod = call { } @testFindMethod() ; <{ }> [#uses=0]
    call void @llvm.dbg.stoppoint(i32 9, i32 10, { }* bitcast (%llvm.dbg.compile_unit.type* @llvm.dbg.compile_unit to { }*))
    call void @llvm.dbg.region.end({ }* bitcast (%llvm.dbg.subprogram.type* @llvm.dbg.subprogram to { }*))
    ret i32 0
  }

However, when I single-step into the function, it stops at the *second* stop point. In fact, it appears to do this fairly consistently with all functions - the very first statement is always skipped. Here's the original source (with line numbers) for reference:

   1 import tart.reflect.Module;
   2 import tart.reflect.Method;
   3
   4 @EntryPoint
   5 def main(args:String[]) -> int {
   6   testModuleReflection();
   7   testModuleMethods();
   8   testFindMethod();
   9   return 0;
  10 }

2) Another weird thing is that I can't seem to declare function variables that are not lvalues. The DIFactory::InsertDeclare method seems to require that the Storage parameter be the result of an alloca.
However, what about function arguments that are passed by value, such as ints?

3) The same issue holds for immutable local variables. My language supports the concept of an "assign once" variable (like 'final' in Java), for which I use SSA values directly rather than storage created via alloca(). Does this mean that there is no way to debug such variables?

4) There seems to be something weird going on with DW_TAG_inheritance: when I print out the type in the debugger, I see empty <> symbols:

  2 = {
    <> = {
      <> = {
        __tib = 0x0
      },
    members of tart.reflect.Member:
    _name = 0x0,
    _fullName = 0x0,
    _kind = 0,
    .. etc ...

I'd like to see an example of DW_TAG_inheritance in the doc.

5) I can't seem to get the structure offsets to work. Here's what my generated IR for a struct member looks like (reformatted somewhat):

  @llvm.dbg.derivedtype104 = internal constant %llvm.dbg.derivedtype.type {
    i32 458765,
    { }* bitcast (%llvm.dbg.compile_unit.type* @llvm.dbg.compile_unit to { }*),
    i8* getelementptr inbounds ([10 x i8]* @.str103, i32 0, i32 0),
    { }* bitcast (%llvm.dbg.compile_unit.type* @llvm.dbg.compile_unit to { }*),
    i32 27,
    i64 mul (i64 ptrtoint (%1** getelementptr (%1** null, i32 1) to i64), i64 8),
    i32 mul (i32 ptrtoint (%1** getelementptr (%38* null, i32 0, i32 1) to i32), i32 8),
    i64 mul (i64 ptrtoint (%1** getelementptr (%tart.reflect.Member* null, i64 0, i32 2) to i64), i64 8),
    i32 0,
    { }* bitcast (%llvm.dbg.derivedtype.type* @llvm.dbg.derivedtype102 to { }*) },
    section "llvm.metadata" ; <%llvm.dbg.derivedtype.type*> [#uses=1]

You can see the offset calculation on line 9. However, here's what dwarfdump reports for the entry:

  0x00001dcd: member [10]
    name( "_fullName" )
    type( {0x00001bbb} ( tart.core.String* ) )
    decl file( "/Users/talin/Projects/tart/trunk/stdlib/tart/reflect/Module.tart" )
    decl line( 27 )
    data member location( +0 )  <-- huh?

6) It might be good to mention in the source-level debugging docs that line and column numbers are 1-based, not 0-based.
I know that might seem obvious, but it threw me off for a bit.

7) Another thing that confused me for a while is that the debugging APIs all want size, offset, and alignment values to be in bits, not bytes - but ConstantExpr::getSizeOf() et al. return a size in bytes (which makes sense, since it behaves like C sizeof). More confusing, however, is that the comment for getSizeOf() doesn't say what units its result is in - I automatically assumed that getSizeOf() and DIFactory were compatible. Probably the simplest thing would be to add a "getSizeOfInBits" method (and the same for align and offset) which could be used directly with DIFactory.

Note: All of the above results were produced with the current 2.6 branch head.

-- Talin
Devang Patel
2009-Oct-21 18:52 UTC
[LLVMdev] A few more questions about DIFactory and source-level debugging.
Hi!

On Tue, Oct 20, 2009 at 8:18 PM, Talin <viridia at gmail.com> wrote:

> Well, I am much happier now that I understand about dsymutil, and can
> actually step through my program in gdb. However, there are still some
> issues that are puzzling me.
>
> 1) First off, the debugger appears to stop at odd points. The IR for my
> main function looks correct to me:
> [... IR elided; see the original message above ...]
> However, when I single step into the function, it stops at the *second*
> stop point.
> In fact, it appears to do this fairly consistently with all functions -
> the very first statement is always skipped.

gdb stops at the second stop point because it thinks that the first stop point marks the beginning of the arguments and the second stop point marks the opening "{" in C-based languages.

> 2) Another weird thing is that I can't seem to declare function variables
> that are not lvalues. The DIFactory::InsertDeclare method seems to require
> that the Storage parameter be the result of an alloca. However, what about
> function arguments that are passed by value, such as ints?
> 3) The same issue holds for immutable local variables. My language supports
> the concept of an "assign once" variable (like 'final' in Java), for which
> I use SSA values directly rather than storage created via alloca(). Does
> this mean that there is no way to debug such variables?

This means someone needs to implement
http://nondot.org/~sabre/LLVMNotes/DebugInfoVariableInfo.txt

Interested in volunteering? I can help break this up into small tasks.

> 4) There seems to be something weird going on with DW_TAG_inheritance:
> when I print out the type in the debugger I see <> symbols.
> I'd like to see an example of DW_TAG_inheritance in the doc.

For the following C++ code:

  class A { };
  class B : public A { };
  B b;

llvm-gcc from trunk produces the following IR, which works (there is an extra empty DW_AT_name that needs to be fixed):
  %struct.B = type <{ i8 }>
  @b = global %struct.B zeroinitializer ; <%struct.B*> [#uses=0]

  !llvm.dbg.gv = !{!0}
  !0 = metadata !{i32 458804, i32 0, metadata !1, metadata !"b", metadata !"b", metadata !"b", metadata !1, i32 4, metadata !2, i1 false, i1 true, %struct.B* @b}; [DW_TAG_variable ]
  !1 = metadata !{i32 458769, i32 0, i32 4, metadata !"a.cc", metadata !"/tmp/", metadata !"4.2.1 (Based on Apple Inc. build 5653) (LLVM build 00)", i1 true, i1 false, metadata !"", i32 0}; [DW_TAG_compile_unit ]
  !2 = metadata !{i32 458771, metadata !1, metadata !"B", metadata !1, i32 3, i64 8, i64 8, i64 0, i32 4, null, metadata !3, i32 0}; [DW_TAG_structure_type ]
  !3 = metadata !{metadata !4}
  !4 = metadata !{i32 458780, metadata !1, metadata !"", null, i32 0, i64 0, i64 0, i64 0, i32 0, metadata !5}; [DW_TAG_inheritance ]
  !5 = metadata !{i32 458771, metadata !1, metadata !"A", metadata !1, i32 2, i64 8, i64 8, i64 0, i32 4, null, metadata !6, i32 0}; [DW_TAG_structure_type ]
  !6 = metadata !{i32 0}

> 5) I can't seem to get the structure offsets to work. Here's what my
> generated IR for a struct member looks like (reformatted somewhat):
> [... IR elided; see the original message above ...]
> You can see the offset calculation on line 9.
> However, here's what dwarfdump reports for the entry:
>
> 0x00001dcd: member [10]
>   name( "_fullName" )
>   type( {0x00001bbb} ( tart.core.String* ) )
>   decl file( "/Users/talin/Projects/tart/trunk/stdlib/tart/reflect/Module.tart" )
>   decl line( 27 )
>   data member location( +0 )  <-- huh?

Looks like a bug. Try debugging the DwarfDebug.cpp code where this offset info is transferred into the DWARF DIE, or file a bugzilla using IR from svn trunk (which has changed significantly for debug info).

> 6) It might be good to mention in the source-level debugging docs that
> line and column numbers are 1-based, not 0-based. I know that might seem
> obvious but it threw me off for a bit.
> 7) Another thing that confused me for a while is that the debugging APIs
> all want size, offset, and alignment values to be in bits, not bytes - but
> ConstantExpr::getSizeOf() et al return a size in bytes (which makes sense,
> since it behaves like C sizeof). More confusing, however, the comments for
> getSizeOf() don't say what units its result is in - I automatically
> assumed that getSizeOf and DIFactory were compatible. Probably the
> simplest thing would be to add a "getSizeOfInBits" method (and the same
> for align and offset) which could be used directly with DIFactory.
> Note: All of the above results were produced with the current 2.6 branch
> head.
> -- Talin

- Devang