
fix: numpy precision #226

Open
wants to merge 12 commits into main

Conversation


@jhug12 jhug12 commented Jun 7, 2024

Fixes #225

Summary

Changes the precision used when converting a polars DataFrame to a numpy ndarray from float32 to float64.
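
A minimal sketch of what the change amounts to, assuming the conversion goes through polars' `DataFrame.to_numpy()`; the column name is made up and this is not functime's actual code:

```python
import numpy as np
import polars as pl

df = pl.DataFrame({"value": [1.0, 2.0, 3.0]})

# Before: casting to Float32 before conversion yields a float32 ndarray,
# which loses precision and (per #225) breaks model fitting downstream.
X32 = df.select(pl.col("value").cast(pl.Float32)).to_numpy()
assert X32.dtype == np.float32

# After: cast to (or keep) Float64 so the resulting ndarray is float64.
X64 = df.select(pl.col("value").cast(pl.Float64)).to_numpy()
assert X64.dtype == np.float64
```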

vercel bot commented Jun 7, 2024

functime-docs deployment: ✅ Ready (updated Jun 7, 2024 10:43am UTC)

@jhug12 jhug12 changed the title from "Fix/numpy precision" to "fix: numpy precision" on Jun 7, 2024
@baggiponte baggiponte (Collaborator) commented Jun 9, 2024

Ciao @jhug12, sorry for being late and thanks for the great PR.

I merged a few things in the codebase: would you mind rebasing your PR?

Also: I see you committed other fixes and improvements (thanks once more 🤗), but I am not sure whether they relate to this PR. I can help you split those changes into separate pull requests, as they will be easier to review.

(I was actually wondering whether we should just map the incoming datatypes, respect them when we perform the conversion, and raise a warning if they're not Float64.)
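
A hypothetical sketch of that alternative; the function name `to_numpy_preserving_dtypes` and the warning wording are illustrative only, not functime's actual API:

```python
import warnings

import polars as pl


def to_numpy_preserving_dtypes(df: pl.DataFrame):
    # Map the incoming dtypes and warn about any column that is not Float64.
    non_f64 = [name for name, dtype in df.schema.items() if dtype != pl.Float64]
    if non_f64:
        warnings.warn(
            f"Columns {non_f64} are not Float64; converting without casting.",
            stacklevel=2,
        )
    # No cast here: numpy infers its dtype from the preserved polars dtypes.
    return df.to_numpy()
```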

@baggiponte baggiponte added this to the 0.9.X milestone on Jun 9, 2024
@baggiponte baggiponte added the bug (Something isn't working) and forecasting (Forecasters and adapters) labels on Jun 9, 2024
Development

Successfully merging this pull request may close these issues.

Reduced conversion precision prevents model fitting