Json does not support Mapping and MutableMapping

Have you read my example in this reply?

In the example, I added a custom key to store type information. It is used when deserializing.

    if isinstance(obj, MyMapping):
        d = {"__type__": "MyMapping"}
        d.update(obj._data)
        return d

If default is not called, this example is broken.
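Filled out, the round trip that snippet relies on might look like the sketch below. The `MyMapping` name, the `_data` attribute, and the `"__type__"` key come from the example above; the rest (the `encode`/`decode` helper names, the minimal ABC implementation) is assumed for illustration.

```python
import json
from collections.abc import Mapping

class MyMapping(Mapping):
    """Minimal sketch of the custom mapping from the example above."""
    def __init__(self, data=None):
        self._data = dict(data or {})
    def __getitem__(self, key):
        return self._data[key]
    def __iter__(self):
        return iter(self._data)
    def __len__(self):
        return len(self._data)

def encode(obj):
    # called by json.dumps only for objects it cannot serialize natively
    if isinstance(obj, MyMapping):
        d = {"__type__": "MyMapping"}
        d.update(obj._data)
        return d
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

def decode(d):
    # object_hook: recreate MyMapping when the marker key is present
    if d.get("__type__") == "MyMapping":
        return MyMapping({k: v for k, v in d.items() if k != "__type__"})
    return d

s = json.dumps(MyMapping({"a": 1}), default=encode)
restored = json.loads(s, object_hook=decode)
```

If `json.dump` started serializing `Mapping` instances natively, `encode` would never be called, the `"__type__"` marker would never be written, and `decode` would silently return plain dicts.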

Excuse me, but adding a custom key to the map to signal how to deserialize the JSON does not seem like good practice to me. This way, if you want to deserialize it to a plain dict, you have to know that the JSON contains that strange key and remove it manually. If I saw something like that in code, I'd tease the coder :smiley: And if the code breaks because the company was so bold as to update the Python version on its machines, well, I think they will simply stay on the previous Python version, or fix the code.

Seriously, deserialization can’t know the type you want to deserialize to. json gives you a set of standard conversions, and this is really handy. But if you want to deserialize to a custom type, you have to tell the deserializer about that custom type, as any decent JSON library in any language does.

If you want your data magically deserialized into a custom object, you chose the wrong data format. You should use XML + XSD, or a typed JSON dialect, or something similar.

So, IMHO the correct way to deserialize the JSON to MyMap is simply:

    def custom_deserialize(d):
        return MyMap(d)

or, simpler:

    d = json.loads(json_data)
    mm = MyMap(d)
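For completeness, a runnable sketch of both options (here `MyMap` is assumed to be a dict subclass and `json_data` a placeholder). Note the difference: passing `custom_deserialize` as `object_hook` converts every nested JSON object too, while wrapping the result converts only the top level.

```python
import json

class MyMap(dict):
    """Placeholder for the custom mapping type; a dict subclass here."""

def custom_deserialize(d):
    return MyMap(d)

json_data = '{"a": 1, "nested": {"b": 2}}'

# Option 1: object_hook applies to every JSON object, nested ones included
mm = json.loads(json_data, object_hook=custom_deserialize)
assert isinstance(mm, MyMap) and isinstance(mm["nested"], MyMap)

# Option 2: wrapping the result converts only the top level
mm2 = MyMap(json.loads(json_data))
assert isinstance(mm2, MyMap) and not isinstance(mm2["nested"], MyMap)
```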

This is because, unluckily, json does not permit you to customize the conversion table. If this were possible, not only could you convert JSON objects directly to MyMap, but also arrays to tuples or numpy.ndarrays, and so on. But I suppose this is another problem.

Whether it is good practice or not is not the problem here.
Removing the ability to customize the serialization of custom mapping types is a backward incompatible change. That is the point.

json.loads provides the object_hook callback for this, as I demonstrated in the example code.

The object_hook example in the official documentation also uses this “bad practice”.

# https://docs.python.org/3/library/json.html
>>> import json
>>> def as_complex(dct):
...     if '__complex__' in dct:
...         return complex(dct['real'], dct['imag'])
...     return dct
...
>>> json.loads('{"__complex__": true, "real": 1, "imag": 2}',
...     object_hook=as_complex)
(1+2j)

There are many such customizations in the world. The proposed change will break some of them.
Backward compatibility is a strong reason to keep the status quo.
If you want to break backward compatibility, you need to show a stronger and clearer benefit.


By the way, if you are thinking about frozendict, I’m +1 on supporting it by default when it is added to the stdlib or as a built-in.
There is no backward compatibility issue for a new type.

I don’t think a developer should have to worry about making incompatible changes that break bad code, or no change will ever be made. I mean, there are people who use “private” attributes of classes. If the language changes and that “private” attribute changes or goes away, well, it’s your fault.

An alternative could be simply extending custom serialization to builtin types as well. Indeed, I don’t know why I shouldn’t be free to customize the default serializations too. This would break nothing, since in the past no builtin serialization was ever customized, simply because it was impossible.

Nope, but thank you :slight_smile:

I realize you think that, and you are perfectly entitled to in your own libraries. But the Python devs are more conservative than you are.

This is an area where we’d be okay breaking existing code: if it uses an undocumented interface. But even then, we’d need to have a good reason to break such code.


@ericvsmith

Yes, Python 3 is the perfect example… :smiley:

My last proposal will break no old code. To recap:

  1. let the default parameter of json.dump also handle objects natively supported by json
  2. add native support for collections.abc classes and subclasses, and for any object that looks like a supported one, even with a slower, generic algorithm
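As a sketch of what point 2 could look like through the existing `default` hook today (the `abc_default` name and its conversion choices are assumptions for illustration, not a proposed API):

```python
import json
from collections.abc import Mapping, Set

def abc_default(obj):
    # generic fallback: any Mapping becomes a JSON object, any Set an array
    if isinstance(obj, Mapping):
        return dict(obj)
    if isinstance(obj, Set):
        return sorted(obj)
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

class MyMapping(Mapping):
    """A Mapping that is not a dict, to exercise the fallback."""
    def __init__(self, data):
        self._data = dict(data)
    def __getitem__(self, key):
        return self._data[key]
    def __iter__(self):
        return iter(self._data)
    def __len__(self):
        return len(self._data)

print(json.dumps(MyMapping({"a": 1}), default=abc_default))   # {"a": 1}
print(json.dumps(frozenset({3, 1, 2}), default=abc_default))  # [1, 2, 3]
```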

“If it walks like a duck, it’s probably a duck” :slight_smile:


To the best of my estimation https://bugs.python.org/issue34858 is the open bugs.python.org issue for the problem being discussed here.


See also https://bugs.python.org/issue30343 about extensibility of JSONEncoder.


I, too, believe json.dump should respect the ABCs. Part of the point of e.g. Mapping is precisely to allow people not to have to use dict. A variety of other APIs in the stdlib respect them. Some special-case the builtin types and use their internal APIs to gain performance, and then fall back to the more generic ABC APIs. This is very reasonable behavior.
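To illustrate the current behavior being argued against (the `ChainView` class is hypothetical, just a Mapping that is not a dict):

```python
import json
from collections.abc import Mapping

class ChainView(Mapping):
    """Hypothetical read-only mapping; a Mapping but not a dict."""
    def __init__(self, data):
        self._data = dict(data)
    def __getitem__(self, key):
        return self._data[key]
    def __iter__(self):
        return iter(self._data)
    def __len__(self):
        return len(self._data)

# Today, json.dumps refuses any Mapping that is not a real dict:
try:
    json.dumps(ChainView({"a": 1}))
except TypeError as exc:
    error = exc
    print("rejected:", exc)

# ...even though a generic conversion is trivial:
print(json.dumps(dict(ChainView({"a": 1}))))  # {"a": 1}
```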

For reference: