commit 5ccd35287e
show_ldttss() shifts desc.base2 left by 24 bits, but base2 is an 8-bit
bitfield inside a u16.
Due to the really great idea of integer promotion in C99, base2 is promoted
to an int, because that is the standard-defined behaviour when all values
which can be represented by base2 fit into an int.
Now if bit 7 is set in desc.base2, the shift left by 24 makes the resulting
integer negative, and the following conversion to unsigned long legitimately
sign-extends first, causing the upper 32 bits to be set in the result.
Fix this by casting desc.base2 to unsigned long before the shift.
Detected by CoverityScan, CID#1475635 ("Unintended sign extension")
[ tglx: Reworded the changelog a bit as I actually had to look up
the standard (again) to decode the original one. ]
Fixes: