List of commits:
Subject Hash Author Date (UTC)
eval_adamw1_bigtail13i_t1_shb_on_sha 6a4d823b75e3a90b452e528ef0e71b803666b161 Thai Thien 2020-08-04 17:34:33
add evaluation only d86462407743bd2ac65c778aeb7e172faea806e1 Thai Thien 2020-08-04 17:26:45
s d2c5d384ed61127c4fd3351692381b80a30c74aa Thai Thien 2020-08-04 17:02:33
WIP fce5736b0f6104fa3549884fbaa1f6508086af06 Thai Thien 2020-08-03 17:14:11
debug sha keepful e14c3307499356c7a7ed0d5f33818bd798276d94 Thai Thien 2020-08-03 15:44:02
adamw2_bigtail13i_t3_sha f769948ddd511865f62cddc80957663508e25976 Thai Thien 2020-08-02 04:25:08
g 56048b30fb4a19ae52c641b07754756f43ddcefb Thai Thien 2020-08-02 04:00:46
a d4e0e0d63613536a18f7ae0f39e4798f12abe7e8 Thai Thien 2020-08-02 03:59:14
adamw1_bigtail13i_t4_sha bc07d19cfcc88c4271d1c6f1984c1c96ff5a9413 Thai Thien 2020-08-01 16:19:17
adamw1_bigtail13i_t3_sha 17c169d6ef648e8c4f2c5383fb9226cfce48d626 Thai Thien 2020-08-01 16:11:33
adamw1_bigtail13i_t2_sha 8bc23f1401eacdc2e28f30a825503f0dd96bacc9 Thai Thien 2020-08-01 15:59:29
adamw1_bigtail13i_t2_shb aadc7244daabdad64a548342865a22beda073d94 Thai Thien 2020-08-01 15:47:08
gpu4 95c356f61c49d9f1927da7f86135a9bb60d21a59 Thai Thien 2020-08-01 14:46:30
adamw1_bigtail13i_t1_sha 34434bd013a4b05134c5ce42aa8f9956d424467d Thai Thien 2020-08-01 14:44:56
a 140443bb072f8086b6cef253bcadfb0d925ba606 Thai Thien 2020-08-01 12:09:09
adamw 57f61e52a024e91f57347c85a8c14813df16d15d Thai Thien 2020-08-01 12:05:36
adam1_bigtail13i_t1_shb 713dd1d5a0f13a2c881b00687fabb1174383da5b Thai Thien 2020-08-01 10:22:17
g1_BigTail14i_t8_sha eb361a5e64d38062fa6e5aa61ae21adc5a536aef Thai Thien 2020-07-31 15:45:50
g1_BigTail14i_t7_sha 32ff746fa0c272af36eadcd713cda94ef8f1b8e2 Thai Thien 2020-07-31 14:57:15
g1_BigTail14i_t6_sha 0656f2ac088bd2790b560931839bfa1d4bed98c4 Thai Thien 2020-07-30 19:32:13
Commit 6a4d823b75e3a90b452e528ef0e71b803666b161 - eval_adamw1_bigtail13i_t1_shb_on_sha
Author: Thai Thien
Author date (UTC): 2020-08-04 17:34
Committer name: Thai Thien
Committer date (UTC): 2020-08-04 17:34
Parent(s): d86462407743bd2ac65c778aeb7e172faea806e1
Signing key:
Tree: c7711fe4ccb97faee3b5602dada7c1dff0dff632
File Lines added Lines deleted
debug/verify_sha.py 34 0
train_script/learnstuff/l1/eval_adamw1_bigtail13i_t1_shb_on_sha.sh 5 3
File debug/verify_sha.py added (mode: 100644) (index 0000000..f91f783)
1 import torch
2 from models.meow_experiment.ccnn_tail import BigTail11i, BigTail10i, BigTail12i, BigTail13i, BigTail14i, BigTail15i
3 from hard_code_variable import HardCodeVariable
4 from data_util import ShanghaiTechDataPath
5 from visualize_util import save_img, save_density_map
6 import os
7 import numpy as np
8 from data_flow import get_train_val_list, get_dataloader, create_training_image_list
9 import cv2
10
11 def visualize_evaluation_shanghaitech_keepfull(path=None):
12 HARD_CODE = HardCodeVariable()
13 if path==None:
14 shanghaitech_data = ShanghaiTechDataPath(root= HARD_CODE.SHANGHAITECH_PATH)
15 shanghaitech_data_part_a_train = shanghaitech_data.get_a().get_train().get()
16 path = shanghaitech_data_part_a_train
17 saved_folder = "visualize/verify_dataloader_shanghaitech"
18 os.makedirs(saved_folder, exist_ok=True)
19 train_list, val_list = get_train_val_list(path, test_size=0.2)
20 test_list = None
21 train_loader, val_loader, test_loader = get_dataloader(train_list, val_list, test_list, dataset_name="shanghaitech_keepfull", visualize_mode=True,
22 debug=True)
23
24 # do with train loader
25 train_loader_iter = iter(train_loader)
26 for i in range(len(train_loader)):
27 img, label, count = next(train_loader_iter)
28 save_img(img, os.path.join(saved_folder, "train_img_" + str(i) +".png"))
29 save_path = os.path.join(saved_folder, "train_label_" + str(i) +".png")
30 save_density_map(label.numpy()[0][0], save_path)
31
32
33 visualize_evaluation_shanghaitech_keepfull()
34
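The added script relies on `get_train_val_list(path, test_size=0.2)` from `data_flow` to carve a validation split off the training images. As a rough illustration only (the real helper's signature, shuffling, and return types are assumptions here, and `get_train_val_list_sketch` is a hypothetical name), an 80/20 split can be sketched as:

```python
import random

def get_train_val_list_sketch(image_list, test_size=0.2, seed=0):
    """Hypothetical stand-in for get_train_val_list: shuffle the image
    paths deterministically, then carve off test_size of them for validation."""
    items = list(image_list)
    random.Random(seed).shuffle(items)
    n_val = int(len(items) * test_size)
    # First n_val shuffled items become the validation list,
    # the remainder stay in the training list.
    return items[n_val:], items[:n_val]

train_list, val_list = get_train_val_list_sketch(
    [f"img_{i}.jpg" for i in range(10)], test_size=0.2)
```

With ten inputs and `test_size=0.2` this yields eight training paths and two validation paths, which is the proportion the debug script above then iterates over via the returned dataloaders.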
File train_script/learnstuff/l1/eval_adamw1_bigtail13i_t1_shb_on_sha.sh copied from file train_script/learnstuff/l1/adamw1_bigtail13i_t1_shb.sh (similarity 64%) (mode: 100644) (index cd3f8e4..b7d9f51)
-task="adamw1_bigtail13i_t1_shb"
+task="eval_adamw1_bigtail13i_t1_shb_on_sha"

 CUDA_VISIBLE_DEVICES=3 OMP_NUM_THREADS=2 PYTHONWARNINGS="ignore" HTTPS_PROXY="http://10.60.28.99:86" nohup python experiment_main.py \
 --task_id $task \
 --note "adamW with extrem high lr and decay, msel1mean" \
 --model "BigTail13i" \
---input /data/rnd/thient/thient_data/shanghaitech_with_people_density_map/ShanghaiTech_3/part_B \
+--input /data/rnd/thient/thient_data/shanghaitech_with_people_density_map/ShanghaiTech_3/part_A \
 --lr 1e-3 \
 --decay 0.1 \
 --loss_fn "MSEL1Mean" \
 --batch_size 5 \
---datasetname shanghaitech_non_overlap \
+--datasetname shanghaitech_keepfull \
 --optim adamw \
+--eval_only \
+--load_model "saved_model_best/adamw1_bigtail13i_t1_shb/adamw1_bigtail13i_t1_shb_checkpoint_valid_mae=-7.574910521507263.pth" \
 --cache \
 --epochs 1201 > logs/$task.log &
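The new `--eval_only` and `--load_model` flags suggest that `experiment_main.py` restores a saved checkpoint and runs evaluation without training. That flag handling is not shown in this diff, so the following is a minimal, hypothetical sketch of such a load-and-eval path (using a stand-in `nn.Linear` where the real run would build `BigTail13i`):

```python
import torch
import torch.nn as nn

# Stand-in model; the real script would construct BigTail13i instead.
model = nn.Linear(8, 1)

# Simulate a saved checkpoint like the adamw1_bigtail13i_t1_shb .pth file.
ckpt_path = "/tmp/demo_checkpoint.pth"
torch.save(model.state_dict(), ckpt_path)

# Eval-only path: restore weights, switch layers such as dropout and
# batch norm to inference behaviour, and forward without gradients.
model.load_state_dict(torch.load(ckpt_path))
model.eval()
with torch.no_grad():
    pred = model(torch.zeros(1, 8))
```

Note that `model.eval()` only changes layer modes; the `torch.no_grad()` context is still needed to skip gradient bookkeeping during the evaluation pass.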
Hints:
Clone this repository using HTTP(S):
git clone https://rocketgit.com/user/hahattpro/crowd_counting_framework

Clone this repository using ssh (do not forget to upload a key first):
git clone ssh://rocketgit@ssh.rocketgit.com/user/hahattpro/crowd_counting_framework

Clone this repository using git:
git clone git://git.rocketgit.com/user/hahattpro/crowd_counting_framework
