author     Trent Piepho <tpiepho@impinj.com>          2019-05-10 17:48:20 +0000
committer  Simon Glass <sjg@chromium.org>              2019-05-21 17:33:23 -0600
commit     b061ef39c350c288542536b09dc01d9e984a12ac (patch)
tree       d8bab333c9261a53eb0669f8d2595d4de3028e4a /include/dm
parent     347ea0b63eb5143bf0e48aba65a41f50999367f0 (diff)
core: ofnode: Have ofnode_read_u32_default return a u32
It was returning an int, which doesn't work if the u32 being read, or the default value, overflows a signed int.

While it could be made to work, with careful casting on a C standard/compiler where converting between negative signed values and unsigned has defined behavior, it seems clear that signed values are meant to use ofnode_read_s32_default() instead.

Cc: Simon Glass <sjg@chromium.org>
Signed-off-by: Trent Piepho <tpiepho@impinj.com>
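For context, here is a minimal standalone sketch (not U-Boot code; read_u32_old, read_u32_new, and the values used are hypothetical stand-ins for the old and new prototypes) showing why an int return type cannot carry every u32: a property value above INT_MAX is mangled on the way back to the caller, while a u32 return preserves it.

    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical stand-in for the old prototype: the value is squeezed
     * through an int on its way back to the caller. */
    static int read_u32_old(uint32_t stored, uint32_t def)
    {
            (void)def;              /* default unused in this sketch */
            return stored;          /* implementation-defined conversion when
                                     * stored > INT_MAX; typically -1 for
                                     * 0xFFFFFFFF on two's-complement targets */
    }

    /* Stand-in for the fixed prototype returning u32: the value survives. */
    static uint32_t read_u32_new(uint32_t stored, uint32_t def)
    {
            (void)def;
            return stored;
    }

    int main(void)
    {
            uint32_t mask = 0xFFFFFFFFu;    /* e.g. a full 32-bit mask from the DT */

            printf("old: %d\n", read_u32_old(mask, 0));             /* typically -1 */
            printf("new: %" PRIu32 "\n", read_u32_new(mask, 0));    /* 4294967295 */
            return 0;
    }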
Diffstat (limited to 'include/dm')
-rw-r--r--  include/dm/ofnode.h | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/include/dm/ofnode.h b/include/dm/ofnode.h
index 1be5ba4b45..4ab2ae1ba5 100644
--- a/include/dm/ofnode.h
+++ b/include/dm/ofnode.h
@@ -224,7 +224,7 @@ static inline int ofnode_read_s32(ofnode node, const char *propname,
* @def: default value to return if the property has no value
* @return property value, or @def if not found
*/
-int ofnode_read_u32_default(ofnode ref, const char *propname, u32 def);
+u32 ofnode_read_u32_default(ofnode ref, const char *propname, u32 def);
/**
* ofnode_read_s32_default() - Read a 32-bit integer from a property