List of commits:
Subject Hash Author Date (UTC)
jhu float32 c0b004d6a733f83e469bb3d48f7e20a9c7113957 Thai Thien 2020-09-10 16:14:09
fix float 8867f5d5e160c38a1a7d67b2fac728cf60e6649e Thai Thien 2020-09-10 16:03:15
try catch empty file e7d392cd6ec807f3ae242ef5090f5725ced9f897 Thai Thien 2020-09-10 15:40:14
case no label at all 51b423b2ee9271e3639a332ecab092944636d8f2 Thai Thien 2020-09-09 13:49:29
remove experiment 06478164cb8e00ed5512a0d4db0dbc6edc1b5ad1 Thai Thien 2020-09-08 15:10:33
val train test 5a43d57ba16667cb96f9514dcf4b81167ff8bd5a Thai Thien 2020-09-08 15:01:51
parallel, try catch excemption 9ac6a36880a113c1820b1e6a702ed5c08ebcb03f Thai Thien 2020-09-08 14:55:08
now show the number of point in log 54a1a16adf32c866ed65808d16da5f997d27b54f Thai Thien 2020-09-07 17:19:18
gt to ground-truth 3996d1c19d68f3669b5f54e5b621635f0b87a9fc Thai Thien 2020-09-07 17:06:15
change data path 5e6605625ba130310fbeaade259e4a5b98987dad Thai Thien 2020-09-07 17:04:43
test train val e817f1d6f4743286735811fff60f64080a8e76ed Thai Thien 2020-09-07 17:02:57
add profiler fac9aa609d6de1e8cd9cb8a8a97f00063aaae310 Thai Thien 2020-09-07 16:59:36
generate dataset again 7475799e0ff4bdc51ce86ca98751535c841d49c9 Thai Thien 2020-09-07 16:52:00
try to fix cannot convert float infinity to integer b29a994c2f3e73e78ec2ca7d1ad808049f1008e9 Thai Thien 2020-09-07 16:49:44
jhu adamw1_bigtail13i_t1_jhu 77bb789fa2ffadfc07ec6f99e1bcf44bde24bfc3 Thai Thien 2020-09-07 16:38:13
train_dev_test cbfc478aaf7db3accb41a763b0818910999d25a5 Thai Thien 2020-09-06 17:29:37
seem working c44af597b7adf98ecaf3a71e565de8688f7ff6ba Thai Thien 2020-09-06 15:55:28
done single sample cee46d309e9bb91ac4185b0e1b74deddefbc8553 Thai Thien 2020-09-06 15:24:39
a ec2103c6617ad84375e5a7a0f3cde3b32ab012a0 Thai Thien 2020-08-23 15:39:51
pred_count d9168903486d7a88c0763a8e93b18d1ed89d2d40 Thai Thien 2020-08-23 15:37:02
Commit c0b004d6a733f83e469bb3d48f7e20a9c7113957 - jhu float32
Author: Thai Thien
Author date (UTC): 2020-09-10 16:14
Committer name: Thai Thien
Committer date (UTC): 2020-09-10 16:14
Parent(s): 8867f5d5e160c38a1a7d67b2fac728cf60e6649e
Signing key:
Tree: d9b7a4ac186453f55c561d7659f57e3a5b047f50
File Lines added Lines deleted
data_flow.py 2 2
train_script/learnstuff/l3/adamw1_bigtail13i_t2_jhu.sh 4 4
File data_flow.py changed (mode: 100644) (index c9960d9..b250689)
@@ def load_data_jhucrowd_256(img_path, train=True, debug=False):
     gt_path = img_path.replace('.jpg', '.h5').replace('images', 'ground-truth-h5')
     img_origin = Image.open(img_path).convert('RGB')
     gt_file = h5py.File(gt_path, 'r')
-    target = np.asarray(gt_file['density'])
+    target = np.asarray(gt_file['density']).astype('float32')
     target_factor = 8
     crop_sq_size = 256
     if train:

@@ def load_data_jhucrowd_256(img_path, train=True, debug=False):
     interpolation=cv2.INTER_CUBIC) * target_factor * target_factor
     # target1 = target1.unsqueeze(0) # make dim (batch size, channel size, x, y) to make model output
     target1 = np.expand_dims(target1, axis=0) # make dim (batch size, channel size, x, y) to make model output
-    return img, target1.astype('float32')
+    return img, target1


 def data_augmentation(img, target):
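The change above moves the float32 cast from the return statement up to the point where the density map is first read from HDF5. A minimal sketch of the idea, using a hypothetical helper name (not the repository's loader): density maps written with NumPy defaults are typically float64, and torch.from_numpy on a float64 array yields a DoubleTensor that does not match float32 model weights, so casting at load time keeps every downstream array and tensor in float32.

import h5py
import numpy as np
import torch

def load_density_float32(gt_path):
    # Hypothetical helper: read the density map and cast to float32 immediately,
    # so later resizing/expand_dims keep the dtype and the resulting tensor is a
    # FloatTensor rather than a DoubleTensor.
    with h5py.File(gt_path, 'r') as gt_file:
        target = np.asarray(gt_file['density']).astype('float32')
    return torch.from_numpy(target)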
File train_script/learnstuff/l3/adamw1_bigtail13i_t2_jhu.sh copied from file train_script/learnstuff/l3/adamw1_bigtail13i_t1_jhu.sh (similarity 71%) (mode: 100644) (index b6dc13c..b5a6e93)
-task="adamw1_bigtail13i_t1_jhu"
+task="adamw1_bigtail13i_t2_jhu"

-CUDA_VISIBLE_DEVICES=3 OMP_NUM_THREADS=2 PYTHONWARNINGS="ignore" HTTPS_PROXY="http://10.60.28.99:86" nohup python experiment_main.py \
+CUDA_VISIBLE_DEVICES=3 OMP_NUM_THREADS=6 PYTHONWARNINGS="ignore" HTTPS_PROXY="http://10.60.28.99:86" nohup python experiment_main.py \
     --task_id $task \
     --note "adamW with extrem high lr and decay, msel1mean on jhu" \
     --model "BigTail13i" \

@@ CUDA_VISIBLE_DEVICES=3 OMP_NUM_THREADS=2 PYTHONWARNINGS="ignore" HTTPS_PROXY="ht
     --lr 1e-3 \
     --decay 0.1 \
     --loss_fn "MSEL1Mean" \
-    --batch_size 40 \
+    --batch_size 70 \
     --datasetname jhucrowd_256 \
     --optim adamw \
-    --epochs 1201 > logs/$task.log &
+    --epochs 401 > logs/$task.log &

 echo logs/$task.log # for convenience
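The flags in this launch script are passed straight through to experiment_main.py. As a purely hypothetical illustration of how such an entry point might consume them (the repository's actual argument parser may differ), an argparse sketch could look like this:

import argparse

def parse_args():
    # Hypothetical parser mirroring the flags used in the shell script above.
    parser = argparse.ArgumentParser(description="crowd counting experiment")
    parser.add_argument("--task_id", type=str, required=True)
    parser.add_argument("--note", type=str, default="")
    parser.add_argument("--model", type=str, default="BigTail13i")
    parser.add_argument("--lr", type=float, default=1e-3)
    parser.add_argument("--decay", type=float, default=0.1)
    parser.add_argument("--loss_fn", type=str, default="MSEL1Mean")
    parser.add_argument("--batch_size", type=int, default=70)
    parser.add_argument("--datasetname", type=str, default="jhucrowd_256")
    parser.add_argument("--optim", type=str, default="adamw")
    parser.add_argument("--epochs", type=int, default=401)
    return parser.parse_args()

if __name__ == "__main__":
    args = parse_args()
    print(args)

With a parser along these lines, the script reduces to one long command line; nohup and the trailing & detach the run, and the echoed path lets you follow progress with tail -f logs/$task.log.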