How to serialize a SQLAlchemy result to JSON?

Issue

Django has good automatic serialization of ORM models returned from the DB into JSON format.

How do I serialize a SQLAlchemy query result to JSON format?

I tried jsonpickle.encode, but it encodes the query object itself.
I tried json.dumps(items), but it raises:

TypeError: <Product('3', 'some name', 'some desc')> is not JSON serializable

Is it really so hard to serialize SQLAlchemy ORM objects to JSON/XML? Isn't there a default serializer for it? Serializing ORM query results is a very common task nowadays.

What I need is just to return a JSON or XML data representation of the SQLAlchemy query result.

The SQLAlchemy query result is needed in JSON/XML format so it can be used in a JavaScript datagrid (jqGrid, http://www.trirand.com/blog/).
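A minimal sketch of the failing call, assuming a typical Product model and an open session (the names are placeholders based on the traceback above):

import json

# Hypothetical query mirroring the error; `session` and `Product` come
# from your own SQLAlchemy setup.
items = session.query(Product).all()
json.dumps(items)  # TypeError: <Product(...)> is not JSON serializable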

Solution

A flat implementation

You could use something like this:

import json

from sqlalchemy.ext.declarative import DeclarativeMeta

class AlchemyEncoder(json.JSONEncoder):

    def default(self, obj):
        if isinstance(obj.__class__, DeclarativeMeta):
            # an SQLAlchemy class
            fields = {}
            for field in [x for x in dir(obj) if not x.startswith('_') and x != 'metadata']:
                data = obj.__getattribute__(field)
                try:
                    json.dumps(data) # this will fail on non-encodable values, like other classes
                    fields[field] = data
                except TypeError:
                    fields[field] = None
            # a json-encodable dict
            return fields

        return json.JSONEncoder.default(self, obj)

and then convert to JSON using:

c = YourAlchemyClass()
print(json.dumps(c, cls=AlchemyEncoder))

It will ignore fields that are not encodable (it sets them to None).

It doesn’t auto-expand relations (since this could lead to self-references, and loop forever).
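The same encoder also works on a whole query result, since json.dumps calls it for every item in the list. A minimal sketch, assuming the hypothetical Product model and session from the question:

items = session.query(Product).all()
print(json.dumps(items, cls=AlchemyEncoder))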

A recursive, non-circular implementation

If, however, you’d rather loop forever, you could use:

import json

from sqlalchemy.ext.declarative import DeclarativeMeta

def new_alchemy_encoder():
    _visited_objs = []

    class AlchemyEncoder(json.JSONEncoder):
        def default(self, obj):
            if isinstance(obj.__class__, DeclarativeMeta):
                # don't re-visit self
                if obj in _visited_objs:
                    return None
                _visited_objs.append(obj)

                # an SQLAlchemy class
                fields = {}
                for field in [x for x in dir(obj) if not x.startswith('_') and x != 'metadata']:
                    fields[field] = obj.__getattribute__(field)
                # a json-encodable dict
                return fields

            return json.JSONEncoder.default(self, obj)

    return AlchemyEncoder

And then encode objects using:

print(json.dumps(e, cls=new_alchemy_encoder(), check_circular=False))

This will encode all children, all their children, and all their children's children… potentially your entire database, basically. When it reaches something it has already encoded, it will encode it as None.
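As a sketch of what that expansion looks like, assume a hypothetical parent/child pair of models (the names and the Base/session setup below are placeholders, not part of the original answer):

from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import relationship

class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    children = relationship('Child', back_populates='parent')

class Child(Base):
    __tablename__ = 'child'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    parent_id = Column(Integer, ForeignKey('parent.id'))
    parent = relationship('Parent', back_populates='children')

p = session.query(Parent).first()
print(json.dumps(p, cls=new_alchemy_encoder(), check_circular=False))
# Each Child in p.children is expanded, but its back-reference to the
# already-visited Parent is emitted as null instead of recursing forever.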

A recursive, possibly-circular, selective implementation

Another alternative, probably better, is to specify the fields you want to expand:

def new_alchemy_encoder(revisit_self=False, fields_to_expand=[]):
    _visited_objs = []

    class AlchemyEncoder(json.JSONEncoder):
        def default(self, obj):
            if isinstance(obj.__class__, DeclarativeMeta):
                # don't re-visit self
                if revisit_self:
                    if obj in _visited_objs:
                        return None
                    _visited_objs.append(obj)

                # go through each field in this SQLalchemy class
                fields = {}
                for field in [x for x in dir(obj) if not x.startswith('_') and x != 'metadata']:
                    val = obj.__getattribute__(field)

                    # is this field another SQLalchemy object, or a list of SQLalchemy objects?
                    if isinstance(val.__class__, DeclarativeMeta) or (isinstance(val, list) and len(val) > 0 and isinstance(val[0].__class__, DeclarativeMeta)):
                        # unless we're expanding this field, stop here
                        if field not in fields_to_expand:
                            # not expanding this field: set it to None and continue
                            fields[field] = None
                            continue

                    fields[field] = val
                # a json-encodable dict
                return fields

            return json.JSONEncoder.default(self, obj)

    return AlchemyEncoder

You can now call it with:

print(json.dumps(e, cls=new_alchemy_encoder(False, ['parents']), check_circular=False))

This expands only SQLAlchemy fields called 'parents', for example.
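Reusing the hypothetical Parent/Child models from the earlier sketch, passing ['children'] expands just that relationship while every other ORM-typed field is emitted as null:

p = session.query(Parent).first()
print(json.dumps(p, cls=new_alchemy_encoder(False, ['children']), check_circular=False))
# p.children is expanded; each Child's own 'parent' field is not in
# fields_to_expand, so it comes back as null.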

Answered By – Sasha B

This answer was collected from Stack Overflow and is licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
