x86/emul: Calculate not_64bit during instruction decode
author     Andrew Cooper <andrew.cooper3@citrix.com>
           Fri, 13 Jan 2017 13:23:42 +0000 (13:23 +0000)
committer  Andrew Cooper <andrew.cooper3@citrix.com>
           Mon, 16 Jan 2017 17:37:26 +0000 (17:37 +0000)
commit     9b1f6622b68145931d6ff93ff4f37e6666bbcae1
tree       0b8939f87f3a5205274d2456f6084461dffdfa59
parent     90051a4ce58531a7dbcf193ded8091d26d16ea13
x86/emul: Calculate not_64bit during instruction decode

... rather than repeating "generate_exception_if(mode_64bit(), EXC_UD);" in
the emulation switch statement.
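
As a minimal sketch of the shape of the change (illustrative only: the
opcode cases shown and the exact field holding the flag follow the
pattern of the patch, but are simplified here):

  /* Before: each affected opcode case repeated the check. */
  case 0x37: /* aaa */
      generate_exception_if(mode_64bit(), EXC_UD);
      /* ... emulate AAA ... */
      break;

  /* After: x86_decode_onebyte() tags the instruction once ... */
  case 0x37: /* aaa */
  case 0x3f: /* aas */
      state->not_64bit = true;
      break;

  /* ... and a single check ahead of the main switch raises #UD. */
  generate_exception_if(state->not_64bit && mode_64bit(), EXC_UD);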

Bloat-o-meter shows:

  add/remove: 0/0 grow/shrink: 1/2 up/down: 8/-495 (-487)
  function                                     old     new   delta
  per_cpu__state                                98     106      +8
  x86_decode                                  6782    6726     -56
  x86_emulate                                57160   56721    -439

The reason x86_decode() gets smaller is that this change alters the
x86_decode_onebyte() switch statement from a chain of if()/else's into a
jump table.  The jump table adds 250 bytes of data, which bloat-o-meter
can't see as it only accounts for named symbols.
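
For context, a dense run of case values typically lets the compiler
replace the compare-and-branch chain with an indirect jump through an
address table, moving cost from code into anonymous read-only data.  A
rough approximation of the generated dispatch, expressed in GCC's
computed-goto notation (hypothetical; this is compiler output, not
anything in the source):

  /* Illustrative only: one table load plus one indirect jump. */
  static const void *const tbl[] = { &&op_27, &&op_2f, /* ... */ };
  goto *tbl[b - 0x27];    /* 'b' is the opcode byte */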

Signed-off-by: Andrew Cooper <andrew.cooper3@citrix.com>
Reviewed-by: Jan Beulich <jbeulich@suse.com>
xen/arch/x86/x86_emulate/x86_emulate.c