Friday, 23 August 2013

Python dict deserialization works in python2.7, fails in 3.3

I'm using a sqlite3 table to store python dicts (utf8 content) and
serialization is done with JSON. It works fine in python2.7 but fails in
3.3.
Schema:

CREATE TABLE mytable (
    id INTEGER,
    book TEXT NOT NULL,
    d JSON NOT NULL,
    priority INTEGER NOT NULL DEFAULT(3),
    PRIMARY KEY (id, book)
)
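For context, the insert side (which works under both versions) can be sketched roughly as follows; the database path, book id, and dict contents here are illustrative, not taken from the original code:

```python
import json
import sqlite3

# Illustrative sketch of the insert path; sample data is made up.
conn = sqlite3.connect(':memory:')
conn.execute('''CREATE TABLE mytable
                (id INTEGER, book TEXT NOT NULL, d JSON NOT NULL,
                 priority INTEGER NOT NULL DEFAULT(3),
                 PRIMARY KEY (id, book))''')

d = {'title': 'Gödel, Escher, Bach', 'pages': 777}

# The dict is serialized to a JSON string before it is stored.
conn.execute('INSERT INTO mytable (id, book, d) VALUES (?, ?, ?)',
             (1, 'geb', json.dumps(d)))
conn.commit()
```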
When inserting values, the dict is serialized with json.dumps(d). The
faulty part is retrieving the previously saved values.
import sys
import sqlite3
import json

filename = 'mydb.db'
sqlite3.register_converter('JSON', json.loads)
conn = sqlite3.connect(filename,
                       detect_types=sqlite3.PARSE_DECLTYPES | sqlite3.PARSE_COLNAMES)
c = conn.cursor()
c.execute('''SELECT book, id, d, priority FROM mytable''')
print(c.fetchall())
The above script works fine when executed with Python 2.7. However, with
Python 3.3 a TypeError occurs:
Traceback (most recent call last):
  File "tests/py3error_debug.py", line 15, in <module>
    c.execute('''SELECT book, id, d, priority FROM mytable''')
  File "/usr/local/Cellar/python3/3.3.2/Frameworks/Python.framework/Versions/3.3/lib/python3.3/json/__init__.py", line 319, in loads
    return _default_decoder.decode(s)
  File "/usr/local/Cellar/python3/3.3.2/Frameworks/Python.framework/Versions/3.3/lib/python3.3/json/decoder.py", line 352, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
TypeError: can't use a string pattern on a bytes-like object
I can't spot an essential difference between the json modules of 2.7 and
3.3 (especially regarding json.loads), and I'm running out of ideas.
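The traceback suggests the difference lies in sqlite3 rather than json: in Python 3, sqlite3 hands the raw column value to a registered converter as a bytes object, and json.loads in 3.3 only accepts str (it started accepting bytes in 3.6), whereas in Python 2 the raw value is a str that json.loads accepts directly. A minimal sketch of a converter that decodes first (assuming the stored JSON is UTF-8, sqlite3's default text encoding):

```python
import json
import sqlite3

def json_converter(raw):
    # sqlite3 passes converters a bytes object in Python 3;
    # decode before parsing (assumes UTF-8 encoded JSON).
    return json.loads(raw.decode('utf-8'))

sqlite3.register_converter('JSON', json_converter)

# In-memory database and sample data for illustration only.
conn = sqlite3.connect(':memory:', detect_types=sqlite3.PARSE_DECLTYPES)
conn.execute('CREATE TABLE t (d JSON NOT NULL)')
conn.execute('INSERT INTO t VALUES (?)', (json.dumps({'k': 'vä'}),))
row = conn.execute('SELECT d FROM t').fetchone()
# row[0] is deserialized back into a dict by the converter
```

The same wrapper also works unchanged on Python 2.7, since decoding there yields a unicode object that json.loads likewise accepts.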
