List of commits:
Subject Hash Author Date (UTC)
WIP 42c7c8e1d772fbbda61a4bdf9e329f74e1efb600 tthien 2019-10-03 17:52:47
add readme 580cf43d1edddd67b1f6a2c57fdd5cee3dba925c Thai Thien 2019-10-02 17:44:49
update script, debug ddb68b95389be1c1d398118677dd227a8bb2b70b Thai Thien 2019-10-02 15:52:31
add d (output density map) to loss function) a0c71bf4bf2ab7393d60b06a84db8dfbbfb1a6c2 tthien 2019-09-30 16:32:39
fix the args, add save interval for model, so we don't save them all 9fdf9daa2ac4bd12b7b62521d81e520db0debd01 tthien 2019-09-30 16:30:00
meow 1ad19a22a310992e27a26471feeb37375124d075 tthien 2019-09-29 18:25:43
fix pacnn perspective map 453ece3ccb818889ba895bfc4285f7905d33cba5 Thai Thien 2019-09-25 17:20:33
apex not work so well da8c0dd57297f972201f31d57e66897177922f48 Thai Thien 2019-09-24 17:25:59
fix data loader pacnn so it will scale up with correct number of people 11d55b50d764511f2491291f0208fee0905dec49 Thai Thien 2019-09-24 15:40:56
add comet ml a9d4b89ce594f5e241168ccafdcdf0f150ea0ebb Thai Thien 2019-09-23 17:07:58
fix pacnn avg schema c2140a96886195782e5689c24aeeb4fe7a2db7ad Thai Thien 2019-09-22 17:35:01
debug number not divisible by 8 a568fd7f294a8bd31b3db78437b4b6b51b5b41b9 Thai Thien 2019-09-22 04:36:06
pacnn 967074890d14ab0eefc277801860270a468e8f9f Thai Thien 2019-09-22 03:54:48
wip: pacnn 2192d7c7b449fecf3868877d9cfbc09bb6f7ae98 Thai Thien 2019-09-22 03:44:56
wip: pacnn 37620e5a9bc0f9516ea964ec58d9bdaa1c40ff36 Thai Thien 2019-09-22 03:14:42
fix training flow 2b87b1b26c7296b64493fdc49fedb421b249dfa3 Thai Thien 2019-09-17 18:00:35
dataset script bc5c052f5f956510ab95ef9a45434fd486c57fae Thai Thien 2019-09-16 17:21:13
evaluator ffc5bf8290ae0c469a9a18a2d061cfd1bfeee822 Thai Thien 2019-09-14 04:56:35
some more test for data loader 25173578cde7d4e9fe6c6140d1ee01caa4fcfc32 Thai Thien 2019-09-14 02:51:58
some visualize to debug data loader e4f52007616acf307bddbde79c0fb4f8c649c785 Thai Thien 2019-09-13 17:35:45
Commit 42c7c8e1d772fbbda61a4bdf9e329f74e1efb600 - WIP
Author: tthien
Author date (UTC): 2019-10-03 17:52
Committer name: tthien
Committer date (UTC): 2019-10-03 17:52
Parent(s): 580cf43d1edddd67b1f6a2c57fdd5cee3dba925c
Signing key:
Tree: 77016789a8c1eb73a20d1b23751c9a0a3ab0850c
File Lines added Lines deleted
dataset_script/download_kaggle_crowd_dataset.sh 4 2
dataset_script/unzip_file.sh 7 3
main_pacnn.py 5 2
playground/__init__.py 0 0
playground/play_load_perspective_map.py 12 0
train_script/train_pacnn_shanghaitechA.sh 11 3
File dataset_script/download_kaggle_crowd_dataset.sh changed (mode: 100644) (index 739aca7..1d9a5fd)
-kaggle datasets download ucf-cc-50-with-people-density-map
+#kaggle datasets download ucf-cc-50-with-people-density-map
+#
+#kaggle datasets download shanghaitech-with-people-density-map
 
-kaggle datasets download shanghaitech-with-people-density-map
 
+kaggle datasets download perspective-shanghaitech
File dataset_script/unzip_file.sh changed (mode: 100644) (index 35daadd..113efa4)
-mkdir -p ../data/
+mkdir -p data/
 
-mv shanghaitech-with-people-density-map.zip ../data/ ; cd ../data/ ;unzip shanghaitech-with-people-density-map.zip
+# mv shanghaitech-with-people-density-map.zip data/ ; unzip data/shanghaitech-with-people-density-map.zip
 
 # mv ucf-cc-50-with-people-density-map.zip ../data/ ; cd ../data/ ;unzip ucf-cc-50-with-people-density-map.zip
+
+mv perspective-shanghaitech.zip data/
+cd data/
+unzip perspective-shanghaitech.zip
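
The path changes above (../data/ to data/) mean the script now assumes it is invoked from the repository root rather than from inside dataset_script/. As a working-directory-independent alternative (an illustrative sketch, not repository code; the dataset_script/ layout is assumed), the extraction could resolve paths relative to the script itself:

# Sketch: extract perspective-shanghaitech.zip into <repo>/data/ no matter
# what the caller's working directory is. Illustrative only; the repo uses
# the shell script above.
import zipfile
from pathlib import Path

repo_root = Path(__file__).resolve().parent.parent  # dataset_script/.. == repo root
data_dir = repo_root / "data"
data_dir.mkdir(parents=True, exist_ok=True)

with zipfile.ZipFile(repo_root / "perspective-shanghaitech.zip") as zf:
    zf.extractall(data_dir)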
File main_pacnn.py changed (mode: 100644) (index 24d3d1c..fc19585)
@@ -139,11 +139,14 @@ if __name__ == "__main__":
 pass
 loss_d = criterion_mse(d, d1_label) + criterion_ssim(d, d1_label)
 loss += loss_d
-loss.backward()
+
 # with amp.scale_loss(loss, optimizer) as scaled_loss:
 # scaled_loss.backward()
-optimizer.step()
+
 optimizer.zero_grad()
+loss.backward()
+optimizer.step()
+
 loss_sum += loss.item()
 sample += 1
 counting += 1
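
The reordering above puts the update calls in the conventional PyTorch sequence: zero the gradients, backpropagate, then step the optimizer; the commented amp.scale_loss block is apex's mixed-precision variant, in which scaled_loss.backward() replaces loss.backward(). A minimal self-contained sketch of the restored pattern (the toy model and random data are stand-ins, not the PACNN pipeline):

import torch
import torch.nn as nn

# Toy stand-ins; main_pacnn.py trains PACNN with MSE + SSIM losses instead.
model = nn.Linear(8, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

for _ in range(3):
    x, y = torch.randn(4, 8), torch.randn(4, 1)
    optimizer.zero_grad()          # clear gradients from the previous iteration
    loss = criterion(model(x), y)
    loss.backward()                # accumulate fresh gradients
    optimizer.step()               # apply the parameter update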
File playground/__init__.py added (mode: 100644) (index 0000000..e69de29)
File playground/play_load_perspective_map.py added (mode: 100644) (index 0000000..d03dfbc)
import scipy.io
import h5py
import numpy as np

if __name__ == "__main__":
    mat_path = "../data/perspective-ShanghaiTech/A/train_pmap/IMG_1.mat"
    # mat = scipy.io.loadmat(mat_path)
    # print(mat)
    with h5py.File(mat_path, 'r') as f:
        target = np.asarray(f['pmap'])
        print(target.shape)
        print(target)
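
The commented-out scipy.io.loadmat call suggests why h5py is used here: MATLAB v7.3 .mat files are HDF5 containers, and scipy.io.loadmat raises NotImplementedError on them. A loader tolerating both formats might look like this sketch (the 'pmap' key comes from the script above; the helper itself is illustrative, not repository code):

import h5py
import numpy as np
import scipy.io

def load_pmap(mat_path, key="pmap"):
    # Load `key` from a .mat file, pre-v7.3 (scipy) or v7.3 (HDF5).
    try:
        return scipy.io.loadmat(mat_path)[key]
    except NotImplementedError:
        # scipy cannot parse MATLAB v7.3 files; fall back to HDF5.
        with h5py.File(mat_path, "r") as f:
            return np.asarray(f[key])

One caveat: HDF5 exposes MATLAB's column-major arrays in reversed axis order compared to loadmat, so the two branches may disagree on shape unless one result is transposed.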
File train_script/train_pacnn_shanghaitechA.sh changed (mode: 100644) (index b8ab1df..72ae080)
@@ -14,9 +14,17 @@
 
 # trained 30
 
+#python main_pacnn.py \
+#--input data/ShanghaiTech/part_A \
+#--load_model saved_model/train_state1_attemp3_30_checkpoint.pth.tar \
+#--epochs 151 \
+#--lr 1e-7 \
+#--task_id train_state1_attemp4
+
+
 python main_pacnn.py \
 --input data/ShanghaiTech/part_A \
---load_model saved_model/train_state1_attemp3_30_checkpoint.pth.tar \
+--load_model saved_model/train_state1_attemp4_35_checkpoint.pth.tar \
 --epochs 151 \
---lr 1e-7 \
---task_id train_state1_attemp4
+--lr 1e-8 \
+--task_id train_state1_attemp5
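
This change resumes training from the attempt-4 checkpoint saved at epoch 35 and lowers the learning rate from 1e-7 to 1e-8 for attempt 5. For orientation, a typical PyTorch resume behind a --load_model flag looks roughly like the sketch below; the checkpoint key names are assumptions, not verified against main_pacnn.py:

import torch

def resume(model, optimizer, path):
    # Assumed checkpoint layout: {'state_dict', 'optimizer', 'epoch'}.
    checkpoint = torch.load(path, map_location="cpu")
    model.load_state_dict(checkpoint["state_dict"])
    optimizer.load_state_dict(checkpoint["optimizer"])
    return checkpoint.get("epoch", 0)  # epoch to continue from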