In this solution to convert a dot-notation string to a nested dictionary and set a value, I couldn't understand why the desired result was achieved. Line 4 in particular didn't look right. So I tried:
obj, path, value = {}, 'test.test1', 10
*path, last = path.split(".")
for bit in path:
    obj = obj.setdefault(bit, {})
obj[last] = value
print(obj)
Output
{'test1': 10}
This confirmed my suspicions, as the expected result is {'test': {'test1': 10}}.
But when the same code runs inside a function (I added print statements to see what is happening), the dictionary shows the desired result only outside the function call.
def setv(obj, path, value):
    *path, last = path.split(".")
    for bit in path:
        obj = obj.setdefault(bit, {})
    print(obj)
    obj[last] = value
    print(obj)
obj, path, value = {}, 'test.test1', 10
setv(obj, path, value)
print(obj)
Output
{} # inside the function call
{'test1': 10} # inside the function call
{'test': {'test1': 10}} # after the function call
What subtle difference am I missing here that explains this behavior?
CodePudding user response:
Inside setv, at the time it is printed, obj is a local name, and the assignment obj = obj.setdefault(bit, {}) rebinds it. After that, the local obj no longer refers to the dictionary passed in as the argument (which is the same object as the global obj); it refers to the nested dictionary instead.
You can verify it by printing id(obj) as well.
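For example, here is a minimal sketch of that check (the name outer and the labels in the print calls are mine, for illustration; the actual id values will differ on every run):

def setv(obj, path, value):
    print("before loop:", id(obj))    # same id as the dict passed in
    *path, last = path.split(".")
    for bit in path:
        obj = obj.setdefault(bit, {})
    print("after loop:", id(obj))     # different id: obj now names the nested dict
    obj[last] = value

outer = {}
print("outer:", id(outer))            # matches the "before loop" id
setv(outer, 'test.test1', 10)
print(outer)                          # {'test': {'test1': 10}}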
The fact that the global obj is a dictionary that happens to contain a reference to the dictionary the local obj ends up naming makes this more confusing, but it doesn't actually matter. The same behavior can be observed with simpler objects:
>>> def f(a):
...     print(a)
...     a = 7
...     print(a)
...
>>> a = 10
>>> f(a)
10
7
>>> print(a)
10
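Applying the same idea back to the first snippet in the question: if you bind a second name to the original dictionary before the loop rebinds obj, you can see that the nested structure is built all along (root is just an illustrative name added here):

obj, path, value = {}, 'test.test1', 10
root = obj                            # keep a reference to the original dict
*path, last = path.split(".")
for bit in path:
    obj = obj.setdefault(bit, {})     # rebinds obj to the nested dict
obj[last] = value
print(obj)     # {'test1': 10}            -> the dict obj was rebound to
print(root)    # {'test': {'test1': 10}}  -> the original dict, mutated in place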