Added one hot encoding to transformers #243
base: master
Conversation
        The name of the source that will be transformed.
        """
    def __init__(self, data_stream, num_classes, source_name='targets',
You don't need an additional source_name kwarg, as which_sources will already filter out the sources to be transformed. In fact, I think having a targets default value might not be worth it either.
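To illustrate the reviewer's point, here is a minimal, self-contained sketch of source-wise filtering. The function name transform_batch and its arguments are hypothetical stand-ins for Fuel's transformer machinery, not the actual API: the idea is only that a which_sources list already determines which entries of a batch get encoded, so a separate source_name kwarg is redundant.

```python
import numpy as np

def transform_batch(batch, sources, which_sources, num_classes):
    # Encode only the sources listed in which_sources; everything else
    # passes through unchanged, so no extra source_name argument is needed.
    out = []
    for name, data in zip(sources, batch):
        if name in which_sources:
            # 1-of-K encode integer labels by indexing the identity matrix.
            data = np.eye(num_classes, dtype=np.int64)[np.asarray(data)]
        out.append(data)
    return tuple(out)
```

Only the 'targets' entry below is transformed; 'features' is returned untouched.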
@markusnagel sorry about the delay, and thanks a lot for your contribution!
Thanks for your feedback. Indeed, the source_name was completely unnecessary. I implemented your suggestions; what do you think?
@vdumoulin , I squashed the corrections into one commit. Any idea what's going on with travis?
No idea. I restarted the build yesterday and had the same error, I'm trying again today. |
…ests for input validation.
Conflicts: tests/transformers/test_transformers.py
We added a OneHotEncoding class that transforms a given integer-valued source into a one-hot encoding (also known as 1-of-K encoding), along with a corresponding test.
One-hot encoding is useful when (1) a problem has many classes, so space can be saved by storing only the class number and expanding it on the fly, and (2) a problem can be treated as different types of problems (e.g. ordinal regression or classification).
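The encoding itself can be sketched in a few lines of NumPy. This is a standalone illustration of the 1-of-K transformation described above, not the PR's actual implementation; the function name one_hot and the input-validation check are assumptions for the example.

```python
import numpy as np

def one_hot(labels, num_classes):
    # Map each integer class label to the matching row of the identity
    # matrix, yielding an array of shape labels.shape + (num_classes,).
    labels = np.asarray(labels)
    if labels.min() < 0 or labels.max() >= num_classes:
        raise ValueError("labels must lie in [0, num_classes)")
    return np.eye(num_classes, dtype=np.int64)[labels]
```

For example, one_hot([0, 2, 1], 3) yields the rows [1, 0, 0], [0, 0, 1], and [0, 1, 0], which shows why storing the bare class numbers is the more compact on-disk representation.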