Update supervised_pca.py #1

Open: wants to merge 4 commits into master

Conversation

@picost (Owner) commented Aug 3, 2016

Changes made in the code

In both classes

Added docstrings and comments. Changed a few names for readability.

In SupervisedPCA

  • Removed the unused alpha argument.
  • Changed the shapes of the X and Y arrays to follow the sklearn convention (n_samples, n_features).
  • Enforced the maximum number of components to be at most the smallest axis size of X (i.e. rank(X), assuming X has full rank).
  • Changed the gamma parameter so that it matches the docstring.
  • Removed the unused X_scale variable (the scaling was not done in place; it is better left to the user via a Pipeline, as sketched below).
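
A minimal sketch of the Pipeline-based scaling mentioned above. The import path and constructor arguments of SupervisedPCA are assumptions about this repository; only Pipeline and StandardScaler are standard sklearn:

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

from supervised_pca import SupervisedPCA  # hypothetical import path

X = np.random.rand(100, 20)  # (n_samples, n_features), sklearn convention
Y = np.random.rand(100, 3)   # targets, also (n_samples, n_targets)

# Scaling is left to the user instead of being done inside the estimator.
model = Pipeline([
    ("scale", StandardScaler()),
    ("spca", SupervisedPCA(n_components=5)),  # n_components assumed <= min(X.shape)
])
model.fit(X, Y)
X_reduced = model.transform(X)
```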

In Kernel SPCA:

  • Corrected the call to the _get_kernel method in the _fit method (the kernel arguments were inside the wrong parentheses).
  • Corrected the parameters passed to pairwise_kernels in the _get_kernel method (issue with the metric and kernel keyword arguments).
  • Corrected the call to eigsh (it now uses keyword arguments); see the sketch after this list.
  • TO BE CHECKED: correction of the transform method.
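
A minimal sketch of what the corrected calls look like, under assumed parameter names (kernel, gamma, n_components); pairwise_kernels and eigsh are the actual sklearn/SciPy functions, everything else is illustrative:

```python
import numpy as np
from scipy.sparse.linalg import eigsh
from sklearn.metrics.pairwise import pairwise_kernels

X = np.random.rand(50, 8)                    # (n_samples, n_features)
kernel, gamma, n_components = "rbf", 0.1, 4  # hypothetical parameter values

# pairwise_kernels takes the kernel name through the `metric` keyword;
# extra kernel parameters such as gamma are forwarded as keyword arguments.
K = pairwise_kernels(X, metric=kernel, filter_params=True, gamma=gamma)

# eigsh is called with keyword arguments: the k eigenpairs of largest magnitude.
lambdas, alphas = eigsh(K, k=n_components, which="LM")
```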

Next steps

  • Finish debugging and change the X and Y shapes in KernelSPCA.
  • Separate both classes into distinct modules.
  • Create a bunch of reference tests.

picost added 4 commits August 3, 2016 14:23
# In both classes
    Added docstrings and comments. Changed a few names for readability.

# In SupervisedPCA
   - Removed the unused alpha argument
   - Changed the shapes of the X and Y arrays to follow the sklearn convention (n_samples, n_features)
   - Enforced the maximum number of components to be at most the smallest axis size of X (i.e. rank(X), assuming X has full rank)
   - Changed the gamma parameter so that it matches the docstring
   - Removed the unused X_scale variable (the scaling was not done in place; it is better left to the user via a Pipeline)

# In Kernel SPCA:
   - Corrected the call to the _get_kernel method in the _fit method. Corrected the parameters passed to pairwise_kernels in the _get_kernel method (issue with the metric and kernel keyword arguments).
   - Corrected the call to eigsh (use of keyword arguments)
   - TO BE CHECKED: correction of the transform method

Next steps: finish debugging and change the X and Y shapes in KernelSPCA.
Separate both classes into distinct modules.
Added a parameter that lets the user choose whether or not the negative eigenvalues should be dropped (see the sketch after this commit list). When performing trials, it can be useful to keep all eigenvalues so that the span of the initial and transformed vectors stays the same.

Also added a little more documentation about alphas_ and lambdas_
Create a separate module to define KernelSupervisedPCA.
Removes KernelSupervisedPCA from this module as it is implemented in a separate module now. (+ added the DOI of the paper in the header)
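
A minimal sketch of the eigenvalue-dropping option mentioned in the second commit; the switch name remove_neg_eig is hypothetical, only the filtering logic and the span argument are illustrated:

```python
import numpy as np
from scipy.sparse.linalg import eigsh

# Any symmetric (not necessarily positive definite) matrix will do here.
B = np.random.rand(30, 30)
K = (B + B.T) / 2

# Largest-magnitude eigenpairs, as in the kernel SPCA fit.
lambdas, alphas = eigsh(K, k=6, which="LM")

remove_neg_eig = True                    # hypothetical switch
if remove_neg_eig:
    keep = lambdas > 0                   # drop non-positive eigenvalues
    lambdas, alphas = lambdas[keep], alphas[:, keep]
# Otherwise all eigenpairs are kept, so the span of the transformed vectors
# matches the span of the initial ones.
```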