DataJoint 0.12 adds full support for all native Python data types in blobs: tuples, lists, sets, dicts, strings, bytes, `None`, and all their recursive combinations.
The new blobs are a superset of the old functionality and are fully backward compatible.
In previous versions, only MATLAB-style numerical arrays were fully supported.
Some Python datatypes, such as dicts, were coerced into numpy recarrays and then fetched as such.

However, since some Python types were coerced into MATLAB types, old blobs and new blobs may now be fetched as different types of objects even if they were inserted the same way.
For example, new `dict` objects will be returned as `dict`, while the same objects inserted with DataJoint 0.11 will be fetched as recarrays.

Since this is a big change, we chose to disable full blob support by default as a temporary precaution; this restriction will be removed in version 0.13.

You may enable it by setting the `enable_python_native_blobs` flag in `dj.config`:

```python
import datajoint as dj
dj.config["enable_python_native_blobs"] = True
```

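Once the flag is enabled, native Python values round-trip through blob attributes unchanged. Below is a minimal sketch; the schema name (`my_pipeline`) and the `Session` table are hypothetical stand-ins for your own pipeline:

```python
import datajoint as dj

dj.config["enable_python_native_blobs"] = True
schema = dj.schema("my_pipeline")  # hypothetical schema name


@schema
class Session(dj.Manual):  # hypothetical table
    definition = """
    session_id : int
    ---
    params : longblob   # may now hold dicts, lists, strings, None, etc.
    """


Session.insert1({"session_id": 1, "params": {"rate": 30.0, "channels": [1, 2, 3]}})
params = (Session & "session_id = 1").fetch1("params")
# With native blobs enabled, `params` is returned as a dict rather than a recarray.
```
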
You can safely enable this setting if both of the following are true:

* The only kinds of blobs your pipeline has inserted previously were numerical arrays.
* You do not need to share blob data between Python and MATLAB.

Otherwise, read the following explanation.

DataJoint v0.12 expands DataJoint's blob serialization mechanism with
improved support for complex native Python datatypes, such as dictionaries
and lists of strings.

Prior to DataJoint v0.12, certain Python native datatypes such as
dictionaries were 'squashed' into numpy structured arrays when saved into
blob attributes. This facilitated easier data sharing between MATLAB
and Python for certain record types. However, it created a discrepancy
between insert and fetch datatypes which could cause problems in other
portions of users' pipelines.

DataJoint v0.12 removes the squashing behavior, instead encoding native Python datatypes in blobs directly.
However, this change creates a compatibility problem for pipelines
that previously relied on the type-squashing behavior, since records
saved via the old squashing format will continue to fetch
as structured arrays, whereas new records inserted in DataJoint 0.12 with
`enable_python_native_blobs` enabled will be returned as the
appropriate native Python type (`dict`, etc.).
Furthermore, DataJoint for MATLAB does not yet support unpacking native Python datatypes.

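For pipelines that must read records written both before and after the upgrade, downstream code can normalize fetched blobs before using them. The helper below is only a sketch: its name is made up, and it assumes that the pre-0.12 squashed form stored dict keys as structured-array field names.

```python
import numpy as np


def blob_to_dict(blob):
    """Return a dict for blobs fetched in either format (hypothetical helper)."""
    if isinstance(blob, dict):
        # Record written with DataJoint 0.12 native blobs.
        return blob
    if isinstance(blob, np.ndarray) and blob.dtype.names:
        # Record written pre-0.12: a structured array / recarray whose
        # field names are assumed to be the original dict keys.
        rec = blob[()] if blob.shape == () else blob[0]
        return {name: rec[name] for name in blob.dtype.names}
    return blob  # plain numerical arrays and other values pass through unchanged
```
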
With `dj.config["enable_python_native_blobs"]` set to `False` (the default),
any attempt to insert a datatype other than a numpy array results in an exception.
This is meant to prompt users to read this explanation and to allow proper testing
and migration of pre-0.12 pipelines to 0.12 in a safe manner.

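For example, with the flag left at its default value, inserting a dict into a blob attribute fails loudly instead of being silently squashed. The sketch below uses hypothetical schema and table names and assumes the exception raised is `DataJointError`:

```python
import datajoint as dj

dj.config["enable_python_native_blobs"] = False  # the 0.12 default
schema = dj.schema("my_pipeline")  # hypothetical schema name


@schema
class Settings(dj.Manual):  # hypothetical table
    definition = """
    setting_id : int
    ---
    value : longblob
    """


try:
    # A plain numpy array would insert fine; a dict is rejected while the flag is off.
    Settings.insert1({"setting_id": 1, "value": {"rate": 30.0}})
except dj.DataJointError as err:  # exception type assumed here
    print("insert blocked:", err)
```
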
The exact process to update a specific pipeline will vary depending on
the situation, but generally the following strategies may apply:

* Alter code to directly store numpy structured arrays or plain
  multidimensional arrays. This strategy is likely the best one for those
  tables requiring compatibility with MATLAB.
* Adjust code to deal with both structured-array and native fetched data
  for those tables that were populated with `dict`s in blobs before 0.12.
  In this case, insert logic is not adjusted, but downstream consumers
  are adjusted to handle records saved under the old and new schemes.
* Migrate data into a fresh schema, fetching the old data, converting blobs to
  a uniform data type, and re-inserting (see the sketch after this list).
* Drop/Recompute imported/computed tables to ensure they are in the new
  format.

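As a rough sketch of the migration strategy above, the old data can be read through `dj.create_virtual_module`, converted to a uniform type, and re-inserted into a fresh schema. All schema, module, and table names below are hypothetical, and the assumed layout of pre-0.12 squashed blobs (dict keys as field names) should be verified against your own data:

```python
import datajoint as dj
import numpy as np

dj.config["enable_python_native_blobs"] = True

# Access the existing pipeline without importing its original module.
old = dj.create_virtual_module("old", "my_pipeline")  # hypothetical existing schema
new_schema = dj.schema("my_pipeline_v2")               # hypothetical fresh schema


@new_schema
class Session(dj.Manual):  # hypothetical table mirroring old.Session
    definition = """
    session_id : int
    ---
    params : longblob
    """


# fetch/convert/insert loop
for key, params in zip(*old.Session.fetch("KEY", "params")):
    if isinstance(params, np.ndarray) and params.dtype.names:
        # pre-0.12 squashed record: rebuild the dict from field names (assumed layout)
        rec = params[()] if params.shape == () else params[0]
        params = {name: rec[name] for name in params.dtype.names}
    Session.insert1(dict(key, params=params), skip_duplicates=True)
```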