Displaying 3 results from an estimated 3 matches for "cmp_func".
2018 Feb 09
retpoline mitigation and 6.0
... retl # Early boot, so it hasn't been turned into a proper retpoline yet
----------------
IN:
0xc136fefb: 8d 34 7e leal (%esi, %edi, 2), %esi
(gdb) list *0xc136fef6
0xc136fef6 is in sort (lib/sort.c:87).
82 if (c < n - size &&
83 cmp_func(base + c, base + c + size) < 0)
84 c += size;
85 if (cmp_func(base + r, base + c) >= 0)
86 break;
87 swap_func(base + r, base + c, size);
88 }
89 }
90
91 /* sort */
You're pushing the target (-0x20(%ebp)) onto the stack and then
*calling* __x86_indirect_thunk. So it lo...
2018 Feb 09
retpoline mitigation and 6.0
On Fri, 2018-02-09 at 01:18 +0000, David Woodhouse wrote:
>
> For now I'm just going to attempt to work around it like this in the
> kernel, so I can concentrate on the retpoline bits:
> http://david.woodhou.se/clang-percpu-hack.patch
32-bit doesn't boot. Built without CONFIG_RETPOLINE and with Clang 5.0
(and the above patch) it does. I'm rebuilding a Release build of
2018 Feb 09
retpoline mitigation and 6.0
...-------------
> IN:
> 0xc136fefb: 8d 34 7e leal (%esi, %edi, 2), %esi
>
>
> (gdb) list *0xc136fef6
> 0xc136fef6 is in sort (lib/sort.c:87).
> 82 if (c < n - size &&
> 83 cmp_func(base + c, base + c + size) < 0)
> 84 c += size;
> 85 if (cmp_func(base + r, base + c) >= 0)
> 86 break;
> 87 swap_func(base + r, base + c, size)...