ARM: kernel: avoid "p" constraint in prefetch()

GCC appears to have a bug where using the "p" constraint in inline
assembly may cause the -mword-relocations command line option to be
ignored. When building for armv7-a or later, this may
result in the use of movw and movt instructions to refer to absolute
symbol addresses, which is not supported by the PIE linker (due to
the fact that ELF does not define dynamic relocation types for
relocating the immediate field in such instructions), breaking the
KASLR build.

We cannot switch to the "m" or "Q" constraint, since that could cause
GCC to incorrectly infer that the pointer is valid, and elide subsequent
NULL checks on it, under the assumption that the dereference performed
by prefetch() would have crashed otherwise. Commit 16f719de6280
("[ARM] 5196/1: fix inline asm constraints for preload") has more details.

So instead, switch to the "r" constraint, which does not trigger this
issue. Note that this is unlikely to result in worse code, given that
GCC does not appear to use offset addressing when using the "p" constraint,
and so it already allocates a register to hold the argument to pld.

Note that GCC's __builtin_prefetch is equally affected, so avoid that
as well, and use our own definitions unconditionally.

Signed-off-by: Ard Biesheuvel <ard.biesheuvel@linaro.org>
diff --git a/arch/arm/include/asm/processor.h b/arch/arm/include/asm/processor.h
index 3667b39..ede058c 100644
--- a/arch/arm/include/asm/processor.h
+++ b/arch/arm/include/asm/processor.h
@@ -102,30 +102,32 @@
 /*
  * Prefetching support - only ARMv5.
  */
-#if __LINUX_ARM_ARCH__ >= 5
-
 #define ARCH_HAS_PREFETCH
+#define ARCH_HAS_PREFETCHW
+
 static inline void prefetch(const void *ptr)
 {
+#if __LINUX_ARM_ARCH__ >= 5
 	__asm__ __volatile__(
 		"pld\t%a0"
-		:: "p" (ptr));
+		:: "r" (ptr));
+#endif
 }
 
-#if __LINUX_ARM_ARCH__ >= 7 && defined(CONFIG_SMP)
-#define ARCH_HAS_PREFETCHW
 static inline void prefetchw(const void *ptr)
 {
+#if __LINUX_ARM_ARCH__ >= 7 && defined(CONFIG_SMP)
 	__asm__ __volatile__(
 		".arch_extension	mp\n"
 		__ALT_SMP_ASM(
 			WASM(pldw)		"\t%a0",
 			WASM(pld)		"\t%a0"
 		)
-		:: "p" (ptr));
+		:: "r" (ptr));
+#else
+	prefetch(ptr);
+#endif
 }
-#endif
-#endif
 
 #define HAVE_ARCH_PICK_MMAP_LAYOUT