Move to pydantic2 #312
base: master
Conversation
@djbios this is the idea. I still need to add the tests for pydantic v2, but as you can see the implementation is simpler, and we can basically copy the v1 tests and adapt them for v2.
@@ -19,7 +19,7 @@ def read(filename):

 extras_require = {
     "marshmallow": ["marshmallow>=2.15.0"],
-    "pydantic:python_version >= '3.6'": ["pydantic>=1.6.1"],
+    "pydantic:python_version >= '3.6'": ["pydantic>=2.0.0"],
Oh interesting. If I understand correctly, this suggests that Pydantic v2 includes a `v1` package so that users can upgrade to v2 without needing to migrate their usage to the v2 `BaseModel` directly. Is that right?
Correct
They can just use `from pydantic.v1 import BaseModel` and they're done.
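For illustration, a minimal sketch of what that looks like in a downstream project (assumes pydantic>=2.0 is installed; the model and field names are made up):

```python
# pydantic >= 2.x bundles the legacy v1 API under the pydantic.v1 namespace.
from pydantic.v1 import BaseModel as BaseModelV1  # v1-style models, served by the v2 package
from pydantic import BaseModel                     # native v2 models


class LegacyUser(BaseModelV1):
    # v1 semantics: .parse_obj(), .dict(), v1 validators keep working
    id: int
    name: str


class User(BaseModel):
    # v2 semantics: .model_validate(), .model_dump()
    id: int
    name: str


legacy = LegacyUser.parse_obj({"id": 1, "name": "Ada"})
new = User.model_validate({"id": 1, "name": "Ada"})
print(legacy.dict(), new.model_dump())
```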
@leiserfg correct me if I misunderstood:
If my project (which depends on uplink) uses pydantic v2, the upgrade will happen seamlessly.
But if I use pydantic v1, then to upgrade uplink I need to upgrade pydantic in my project to v2, upgrade uplink, and change all imports of my pydantic models to `pydantic.v1.BaseModel`.
Correct, it's some work to do, but way less than having to port all your models to use pydantic 2 all at once.
Well, this highlights the core problem—this isn’t true backward compatibility, it’s more of a workaround. Requiring Pydantic v2 and then suggesting users import v1.BaseModel feels inconsistent. We’re essentially telling users, “We require v2, but you can sneak in your v1 models if you adjust your imports.”
From a library design perspective, it’s messy. Either we fully require v2 and force proper migration to v2 models, or we stay agnostic about which version is installed and handle both scenarios gracefully (that's how I initially understood your suggestion: we don't require v2, we detect the installed version and handle it). The current approach leaves users in a limbo where they think they’re upgrading, but they’re actually dragging legacy dependencies behind them. Feels like a recipe for confusion and potential bugs down the line imho
I hear @djbios' point.
Seems like `v1.BaseModel` is there to make the v2 upgrade easier: https://docs.pydantic.dev/latest/migration/#using-pydantic-v1-features-in-a-v1v2-environment

Thinking out loud. Let's say we do not support `v1.BaseModel`. Then:

- I'm assuming some portion of uplink users are using pydantic v1, since our last release supported it. (Can we query this?)
- If those users want to upgrade uplink, they MUST upgrade to pydantic v2. Is it easy for those users to upgrade to v2 (assuming we did not support `v1.BaseModel`)?

If supporting `v1.BaseModel` makes the pydantic v2 upgrade easier for a notable portion of users, I'm inclined to say we support it. But, I also understand @djbios' concern.

Some possible options:

1. `uplink` provides a converter for `v1.BaseModel` and v2 `BaseModel`. (The current state of this PR.)
2. `uplink` provides a converter only for v2 `BaseModel`, and we provide a `v1.BaseModel` converter in a separate library/repo (e.g., `uplink-pydantic-v1`) that the user can install if they need it.
3. `uplink` provides a converter only for v2 `BaseModel`.

Thoughts?
> I'm assuming some portion of uplink users are using pydantic v1, since our last release supported it. (Can we query this?)

All uplink users use pydantic v1, because v2 is not supported currently :)

> If those users want to upgrade uplink, they MUST upgrade to pydantic v2. Is it easy for those users to upgrade to v2 (assuming we did not support v1.BaseModel)?

v2 was pretty breaking tbh, but I believe most projects have been upgraded already (including the ones that might want to use uplink but don't, because it currently supports only v1).

About the options: I guess option 1 is already implemented, and it's good enough. What I suggested is actually kind of an "option 4": we seamlessly support both v1 and v2 by detecting the installed pydantic version and using the matching converter. But again, option 1 is good enough.
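For what it's worth, a rough sketch of what that "option 4" detection could look like; the converter class and module names are taken from earlier revisions of this PR and are used here only for illustration:

```python
# Hypothetical sketch: detect the installed pydantic major version and
# construct the matching converter.
try:
    import pydantic
except ImportError:  # pydantic is an optional dependency of uplink
    pydantic = None

PYDANTIC_MAJOR = int(pydantic.VERSION.split(".")[0]) if pydantic is not None else None


def _make_pydantic_converter():
    # Returns None when pydantic is not installed at all.
    if PYDANTIC_MAJOR is None:
        return None
    if PYDANTIC_MAJOR >= 2:
        from uplink.converters.pydantic_v2 import PydanticV2Converter
        return PydanticV2Converter()
    from uplink.converters.pydantic_v1 import PydanticV1Converter
    return PydanticV1Converter()
```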
Sounds good. Let's go with option #1 for now and see the feedback: e.g., if users have trouble upgrading.
> All uplink users use pydantic v1, because v2 is not supported currently :)
I was thinking about the proportion of uplink users who are using pydantic vs not -- as pydantic is an optional import for uplink. As you noted, every one of those will need to upgrade to v2.
p.s. A potential option #5 is to bring back pydantic 1.X support via a separate library.
I noticed that pydantic 2 requires Python 3.8, and I honestly think it's time to drop support for older Python versions, targeting the versions that are still supported per https://devguide.python.org/versions/.
uplink/converters/pydantic_v2.py
Outdated
@@ -1,5 +1,5 @@
 """
-This module defines a converter that uses :py:mod:`pydantic` models
+This module defines a converter that uses :py:mod:`pydantic.v1` models
v2 right?
uplink/converters/pydantic_v2.py
Outdated
@@ -48,7 +35,7 @@ def convert(self, response):
         return self._model.parse_obj(data)


-class PydanticConverter(Factory):
+class PydanticV2Converter(Factory):
Any objections to just naming this `PydanticConverter`?
uplink/converters/__init__.py
Outdated
from uplink.converters.typing_ import TypingConverter
# fmt: on

__all__ = [
    "StandardConverter",
    "MarshmallowConverter",
    "PydanticConverter",
Removing this might be a breaking change, since it is being exported by the `uplink.converters` package.
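If the rename to `PydanticV2Converter` from earlier in this PR stuck, one way to avoid breaking existing imports might be to keep the old name as an alias (a sketch only, not what this PR currently does):

```python
# uplink/converters/__init__.py (hypothetical excerpt)
from uplink.converters.pydantic_v2 import PydanticV2Converter

# Keep the historical name importable so existing user code does not break.
PydanticConverter = PydanticV2Converter

__all__ = [
    "StandardConverter",
    "MarshmallowConverter",
    "PydanticConverter",
    "PydanticV2Converter",
]
```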
uplink/converters/pydantic_v1.py
Outdated
        return self._model.parse_obj(data)


class PydanticV1Converter(Factory):
Would it be too complicated to handle both v1 and v2 models in a single factory? I think keeping them separate also makes sense, just thinking out loud.

For example, a "unified" `PydanticConverter` instance could create the appropriate request and response body converters based on whether the model is a v1 or a v2 model (see the sketch below).
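Something along these lines, assuming pydantic>=2 is installed so both base classes are importable (the helper name is hypothetical, not the class in this PR):

```python
# Sketch only: dispatch on the model's base class.
import pydantic
import pydantic.v1


def _make_response_converter(model_cls):
    """Return a callable that turns decoded JSON into a model instance,
    or None if model_cls is not a pydantic model. Hypothetical helper."""
    if isinstance(model_cls, type) and issubclass(model_cls, pydantic.BaseModel):
        return model_cls.model_validate   # native v2 API
    if isinstance(model_cls, type) and issubclass(model_cls, pydantic.v1.BaseModel):
        return model_cls.parse_obj        # legacy v1 API
    return None
```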
✔️
I also removed some duplicated tests between v1 and v2.
uplink/converters/pydantic_.py
Outdated
     @classmethod
     def register_if_necessary(cls, register_func):
-        if cls.pydantic is not None:
+        if (cls.pydantic and cls.pydantic_v1) is not None:
nit: A little bit of a confusing line - we register only if both are not None, right? Maybe refactor it to be more explicit/readable.
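For example, something like this might read more clearly (a sketch with stand-in attribute defaults, not the exact code in this PR):

```python
class PydanticFactoryStub:
    # Stand-ins for the lazily imported modules (None when the import failed).
    pydantic = None
    pydantic_v1 = None

    @classmethod
    def register_if_necessary(cls, register_func):
        # Spell out both conditions instead of folding them into a single
        # `(a and b) is not None` expression.
        if cls.pydantic is not None and cls.pydantic_v1 is not None:
            register_func(cls)  # assumed to mirror the existing registration call
```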
Basically it was there for the tests.
Edit: now that I removed that test for one of the two, I will simplify it.
def _encode_pydantic_v2(model):
    return model.model_dump(mode="json")
Why do we do primitive checks in v1 (is it a JSON atom, is it a JSON container), but not here?
In v1 there is no way to take a model instance and generate from it an object that can be fed directly to `json.dump`, because `model.dict()` can return values like `datetime`. That's why, when we implemented the pydantic integration in uplink, we had to do it that way (I did it, so I remember 😄). In pydantic 2 there is no need for such a hack, as `model_dump` in `json` mode does that for us.
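To make the contrast concrete, a small sketch (the model names are made up; assumes pydantic>=2 is installed):

```python
from datetime import datetime

from pydantic import BaseModel              # v2
from pydantic.v1 import BaseModel as BaseModelV1


class EventV1(BaseModelV1):
    when: datetime


class EventV2(BaseModel):
    when: datetime


v1_obj = EventV1(when=datetime(2023, 7, 1))
v2_obj = EventV2(when=datetime(2023, 7, 1))

# v1: .dict() keeps rich Python types, so json.dumps() would choke on them
# without extra encoding logic (hence the recursive helper in uplink).
print(v1_obj.dict())                   # {'when': datetime.datetime(2023, 7, 1, 0, 0)}

# v2: model_dump(mode="json") already returns JSON-safe primitives.
print(v2_obj.model_dump(mode="json"))  # {'when': '2023-07-01T00:00:00'}
```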
For context:
4f8538a
Ah, alright, I get it now :) `_encode_pydantic_v1`'s input type is anything JSON-able, because we re-use it recursively, while `_encode_pydantic_v2`'s input is only a pydantic model instance, called once. Maybe we should add type hints just to avoid the confusion?
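Something like the following could capture that distinction (a sketch only; the actual signatures in this PR may differ, and the v1 body is elided):

```python
from typing import Any, Dict

from pydantic import BaseModel


def _encode_pydantic_v2(model: BaseModel) -> Dict[str, Any]:
    # Single call: a v2 model instance in, JSON-safe primitives out.
    return model.model_dump(mode="json")


def _encode_pydantic_v1(obj: Any) -> Any:
    # Accepts any JSON-able value: it is applied recursively to the output
    # of v1's .dict(), so it may see atoms, lists, and dicts.
    ...  # body elided; see the existing v1 helper
```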
Guys! Is there any other change needed here? Let's keep this moving so we can finally migrate to pydantic 2.
Thanks for the ping! I'll be able to take a look again tomorrow.
My main concern here: #312 (comment)
We can keep it fully backward compatible if we keep both
I added a reply in #312 (comment) before seeing this thread. TL;DR: thoughts on going with one of the options I listed there?

All of those options assume we only support pydantic v2. The question is whether we also support `v1.BaseModel`. If neither of you has a strong preference, I suggest we go with option 3 and re-evaluate options 1 and 2 after our next release, especially if we get feedback that supporting `v1.BaseModel` would help users upgrade.
Well, I'm one of the users wanting to move from pydantic v1 to v2, and I really feel like option 1 makes it smoother. If you think it's a big deal, then option 2 will also work, but it implies more work, and such a library would need to be created before (or very close to) a release, as its absence would indeed stop people who are still using pydantic v1 from migrating.
I’m happy with option 1, since we know at least one user would benefit from it and it smooths the migration.

@djbios – I hear your points in this discussion. One thing that makes option 1 appealing is that it aligns with Pydantic’s v1→v2 migration guide (https://docs.pydantic.dev/latest/migration/), so it helps users take the leap to v2 without a massive overhaul. Wdyt?

Thanks to both of you for the thorough discussion and thoughtful input on this! Let's commit to an approach soon so we can move forward decisively.
Fixes # .
Changes proposed in this pull request:
Attention: @prkumar