ndelay(): switch to C function to avoid 64-bit division

We should be able to do ndelay(some_u64), but that can cause a call to
__divdi3() to be emitted because the ndelay() macro does a divide.

Fix it by switching to a static inline, which will force the u64 arg to be
converted to unsigned long (udelay() takes an unsigned long arg).

[bunk@kernel.org: reported m68k build breakage]
Cc: Adrian Bunk <bunk@kernel.org>
Cc: Evgeniy Polyakov <johnpol@2ka.mipt.ru>
Cc: Martin Michlmayr <tbm@cyrius.com>
Cc: Herbert Xu <herbert@gondor.apana.org.au>
Cc: Ralf Baechle <ralf@linux-mips.org>
Cc: Thomas Gleixner <tglx@linutronix.de>
Cc: Ingo Molnar <mingo@elte.hu>
Signed-off-by: Andrew Morton <akpm@linux-foundation.org>
Signed-off-by: Linus Torvalds <torvalds@linux-foundation.org>
Andrew Morton 2008-03-04 14:28:45 -08:00 committed by Linus Torvalds
parent daa49ff50a
commit 5cba6d22e3


@@ -7,6 +7,8 @@
  * Delay routines, using a pre-computed "loops_per_jiffy" value.
  */
 
+#include <linux/kernel.h>
+
 extern unsigned long loops_per_jiffy;
 
 #include <asm/delay.h>
@@ -32,7 +34,11 @@ extern unsigned long loops_per_jiffy;
 #endif
 
 #ifndef ndelay
-#define ndelay(x)	udelay(((x)+999)/1000)
+static inline void ndelay(unsigned long x)
+{
+	udelay(DIV_ROUND_UP(x, 1000));
+}
+#define ndelay(x) ndelay(x)
 #endif
 
 void calibrate_delay(void);