Initial codebase (#1)
* Add project code
* Logger improvements
* Improvements to web demo code
* added create_wlasl_landmarks_dataset.py and xtract_mediapipe_landmarks.py
* Fix rotation augmentation
* fixed error in docstring, and removed unnecessary replace -1 -> 0
* Readme updates
* Share base notebooks
* Add notebooks and unify for different datasets
* requirements update
* fixes
* Make evaluate more deterministic
* Allow training with clearml
* refactor preprocessing and apply linter
* Minor fixes
* Minor notebook tweaks
* Readme updates
* Fix PR comments
* Remove unneeded code
* Add banner to Readme

Co-authored-by: Gabriel Lema <gabriel.lema@xmartlabs.com>
train.sh (Executable file, 24 lines added)
@@ -0,0 +1,24 @@
#!/bin/sh
python -m train \
    --save_checkpoints_every -1 \
    --experiment_name "augment_rotate_75_x8" \
    --epochs 10 \
    --optimizer "SGD" \
    --lr 0.001 \
    --batch_size 32 \
    --dataset_name "wlasl" \
    --training_set_path "WLASL100_train.csv" \
    --validation_set_path "WLASL100_test.csv" \
    --vector_length 32 \
    --epoch_iters -1 \
    --scheduler_factor 0 \
    --hard_triplet_mining "in_batch" \
    --filter_easy_triplets \
    --triplet_loss_margin 1 \
    --dropout 0.2 \
    --start_mining_hard=200 \
    --hard_mining_pre_batch_multipler=16 \
    --hard_mining_pre_batch_mining_count=5 \
    --augmentations_prob=0.75 \
    --hard_mining_scheduler_triplets_threshold=0
    # --normalize_embeddings
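For reference, a minimal sketch of the argument parser that `train` presumably exposes, reconstructed only from the flags used in `train.sh`. The argument names come from the script; all types, defaults, and the `argparse` structure itself are assumptions, not the repository's actual implementation.

```python
import argparse

def build_parser():
    # Hypothetical reconstruction of the CLI implied by train.sh.
    # Types and defaults are guesses based on the values passed there.
    p = argparse.ArgumentParser(description="Train the embedding model")
    p.add_argument("--experiment_name", type=str, required=True)
    p.add_argument("--epochs", type=int, default=10)
    p.add_argument("--optimizer", type=str, default="SGD")
    p.add_argument("--lr", type=float, default=0.001)
    p.add_argument("--batch_size", type=int, default=32)
    p.add_argument("--dataset_name", type=str, default="wlasl")
    p.add_argument("--training_set_path", type=str)
    p.add_argument("--validation_set_path", type=str)
    p.add_argument("--vector_length", type=int, default=32)
    p.add_argument("--epoch_iters", type=int, default=-1)            # -1: assumed to mean full epoch
    p.add_argument("--save_checkpoints_every", type=int, default=-1) # -1: assumed to mean no periodic saves
    p.add_argument("--scheduler_factor", type=float, default=0)
    p.add_argument("--hard_triplet_mining", type=str, default="in_batch")
    p.add_argument("--filter_easy_triplets", action="store_true")
    p.add_argument("--triplet_loss_margin", type=float, default=1)
    p.add_argument("--dropout", type=float, default=0.2)
    p.add_argument("--start_mining_hard", type=int, default=200)
    p.add_argument("--hard_mining_pre_batch_multipler", type=int, default=16)
    p.add_argument("--hard_mining_pre_batch_mining_count", type=int, default=5)
    p.add_argument("--augmentations_prob", type=float, default=0.75)
    p.add_argument("--hard_mining_scheduler_triplets_threshold", type=int, default=0)
    p.add_argument("--normalize_embeddings", action="store_true")
    return p

if __name__ == "__main__":
    # Parse a subset of the flags from train.sh to show the resulting namespace.
    args = build_parser().parse_args([
        "--experiment_name", "augment_rotate_75_x8",
        "--epochs", "10",
        "--lr", "0.001",
        "--filter_easy_triplets",
    ])
    print(args.experiment_name, args.epochs, args.filter_easy_triplets)
```

Note that boolean flags such as `--filter_easy_triplets` are passed with no value in the script, which is consistent with `store_true` actions; `--normalize_embeddings` is commented out, so it would fall back to its default of `False` here.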