
Conversation

@AgatheZ (Contributor) commented Oct 17, 2024

Linked Issue(s)

#76

Summary of changes

I have implemented the code for the NLP template as a standard multi-class classifier leveraging a BERT-based transformer.

This includes:
- training script
- datamodule
- dataset
- network
- wrapper
- configs
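For illustration, a minimal sketch of what such a BERT-based classifier network might look like, assuming a Hugging Face backbone; the class name `NLPNetwork` and its arguments are hypothetical, not the template's actual code:

```python
# Minimal sketch of a BERT-based multi-class classifier (illustrative names).
import torch
from torch import nn
from transformers import AutoModel


class NLPNetwork(nn.Module):
    def __init__(self, pretrained_name: str = "bert-base-uncased", n_classes: int = 3):
        super().__init__()
        self.backbone = AutoModel.from_pretrained(pretrained_name)  # BERT encoder
        hidden = self.backbone.config.hidden_size
        self.classifier = nn.Linear(hidden, n_classes)  # multi-class head

    def forward(self, input_ids: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
        out = self.backbone(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] token representation
        return self.classifier(cls)  # raw logits, one per class
```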

I created fake data (in tests/data/)

Key features I introduced:
- K-fold cross-validation (can be enabled/disabled in the config file)
- Saving of the best model (can be enabled/disabled in the config file)
- F1 score and other metrics are logged
- A data manifest is logged as an MLflow artifact, recording the class distributions in the training and validation sets (see the sketch after this list)
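A minimal sketch of how such a class-distribution manifest could be logged as an MLflow artifact; `log_data_manifest` and the manifest keys are illustrative, not the template's actual API:

```python
# Sketch of logging a class-distribution manifest as an MLflow artifact.
# Assumes an MLflow run is active (e.g. inside mlflow.start_run()).
from collections import Counter
import mlflow


def log_data_manifest(train_labels, val_labels):
    manifest = {
        "train_class_distribution": dict(Counter(train_labels)),
        "val_class_distribution": dict(Counter(val_labels)),
    }
    # mlflow.log_dict serialises the dict and stores it as a run artifact
    mlflow.log_dict(manifest, artifact_file="data_manifest.json")
```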

FYI: I had to change the Dockerfile so that the default folder is project_NLP, which might cause some clashes in the future.

I have tested it on the DGX; it runs properly and smoothly.

@AgatheZ linked an issue on Oct 17, 2024 that may be closed by this pull request: PTD - NLP Classifier Template (#76)
@AgatheZ (Contributor, Author) commented Oct 17, 2024

@heyhaleema Don't hesitate to reach out if you have any questions about how to leverage this piece of code in a more elegant manner (e.g. a factory depending on the type of project).
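As a rough illustration of the factory idea: pick the datamodule class based on the project type declared in the config. Class names other than NLPDataModule are placeholders, not actual template code.

```python
# Sketch of a project-type factory (placeholder classes, illustrative only).
class NLPDataModule: ...      # the NLP datamodule added in this PR
class ImagingDataModule: ...  # placeholder for the existing imaging datamodule

DATAMODULE_REGISTRY = {
    "nlp": NLPDataModule,
    "imaging": ImagingDataModule,
}


def build_datamodule(project_type: str, **kwargs):
    try:
        return DATAMODULE_REGISTRY[project_type](**kwargs)
    except KeyError:
        raise ValueError(f"Unknown project type: {project_type!r}")
```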

@heyhaleema commented

> Don't hesitate to reach out if you have any questions about how to leverage this piece of code in a more elegant manner (e.g. a factory depending on the type of project).

@AgatheZ Firstly - this is amazing! 🔥 One (initial) question: is NLPDataset like a preprocessing step before NLPDataModule?

Also, I've just realised that since we run `mlops run scripts/train.py -c config/config.cfg`, some of the changes (related to #86) I've made to train.py (main.py in my local branch) might be better placed in the csc-mlops package, e.g. `mlops run local imaging`, `mlops run dgx nlp`, etc. Having said that, I'm not yet sure about this approach either...

@mikewoodward94 If you have some time for a quick chat about this next week, that'd be great 🙏🏼 My thought is that the diagram for #86 could then show the proposed updated workflow, and could either be used to update the diagram in the csc-mlops repo or be added as another, separate asset.

@AgatheZ (Contributor, Author) commented Oct 23, 2024

@heyhaleema NLPDataset is about wrapping the data and labels in a PyTorch Dataset (more info here). This step is also performed for image-based projects (but I think they do it in the DataModule class instead of having a dedicated Dataset class).
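For illustration, a minimal sketch of such a Dataset wrapper, assuming a Hugging Face tokenizer; field and argument names are illustrative rather than the exact NLPDataset implementation:

```python
# Sketch of wrapping texts and labels in a PyTorch Dataset (illustrative names).
import torch
from torch.utils.data import Dataset
from transformers import AutoTokenizer


class NLPDataset(Dataset):
    def __init__(self, texts, labels, pretrained_name="bert-base-uncased", max_length=128):
        self.texts = list(texts)
        self.labels = list(labels)
        self.tokenizer = AutoTokenizer.from_pretrained(pretrained_name)
        self.max_length = max_length

    def __len__(self):
        return len(self.texts)

    def __getitem__(self, idx):
        # Tokenise one example and return tensors ready for the model
        enc = self.tokenizer(
            self.texts[idx],
            truncation=True,
            padding="max_length",
            max_length=self.max_length,
            return_tensors="pt",
        )
        return {
            "input_ids": enc["input_ids"].squeeze(0),
            "attention_mask": enc["attention_mask"].squeeze(0),
            "label": torch.tensor(self.labels[idx], dtype=torch.long),
        }
```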

As for your second question, we could run `mlops run scripts/train.py -c config/config_image.cfg` or `mlops run scripts/train.py -c config/config_NLP.cfg` and only modify the config files. This would not require any update to the csc-mlops package.
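A minimal sketch of how train.py might read whichever config is passed with `-c`, so that switching between imaging and NLP only changes the config file; the section and option names here are hypothetical, not the template's actual config schema:

```python
# Sketch of config-driven training setup (hypothetical sections/options).
import argparse
import configparser

parser = argparse.ArgumentParser()
parser.add_argument("-c", "--config", default="config/config_NLP.cfg")
args = parser.parse_args()

config = configparser.ConfigParser()
config.read(args.config)

# Feature flags and model settings come entirely from the chosen .cfg file
use_kfold = config.getboolean("training", "kfold", fallback=False)
save_best = config.getboolean("training", "save_best_model", fallback=True)
n_classes = config.getint("model", "n_classes", fallback=2)
```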

