List of commits:
Subject Hash Author Date (UTC)
wip : perspective map eac63f2671dc5b064753acc4f40bf0f9f216ad2a Thai Thien 2019-10-04 16:26:56
shell script f2106e700b6f6174d4dd276f25ec6f3d9ff239bb thient 2019-10-04 07:42:51
WIP 42c7c8e1d772fbbda61a4bdf9e329f74e1efb600 tthien 2019-10-03 17:52:47
add readme 580cf43d1edddd67b1f6a2c57fdd5cee3dba925c Thai Thien 2019-10-02 17:44:49
update script, debug ddb68b95389be1c1d398118677dd227a8bb2b70b Thai Thien 2019-10-02 15:52:31
add d (output density map) to loss function) a0c71bf4bf2ab7393d60b06a84db8dfbbfb1a6c2 tthien 2019-09-30 16:32:39
fix the args, add save interval for model, so we don't save them all 9fdf9daa2ac4bd12b7b62521d81e520db0debd01 tthien 2019-09-30 16:30:00
meow 1ad19a22a310992e27a26471feeb37375124d075 tthien 2019-09-29 18:25:43
fix pacnn perspective map 453ece3ccb818889ba895bfc4285f7905d33cba5 Thai Thien 2019-09-25 17:20:33
apex not work so well da8c0dd57297f972201f31d57e66897177922f48 Thai Thien 2019-09-24 17:25:59
fix data loader pacnn so it will scale up with correct number of people 11d55b50d764511f2491291f0208fee0905dec49 Thai Thien 2019-09-24 15:40:56
add comet ml a9d4b89ce594f5e241168ccafdcdf0f150ea0ebb Thai Thien 2019-09-23 17:07:58
fix pacnn avg schema c2140a96886195782e5689c24aeeb4fe7a2db7ad Thai Thien 2019-09-22 17:35:01
debug number not divisible by 8 a568fd7f294a8bd31b3db78437b4b6b51b5b41b9 Thai Thien 2019-09-22 04:36:06
pacnn 967074890d14ab0eefc277801860270a468e8f9f Thai Thien 2019-09-22 03:54:48
wip: pacnn 2192d7c7b449fecf3868877d9cfbc09bb6f7ae98 Thai Thien 2019-09-22 03:44:56
wip: pacnn 37620e5a9bc0f9516ea964ec58d9bdaa1c40ff36 Thai Thien 2019-09-22 03:14:42
fix training flow 2b87b1b26c7296b64493fdc49fedb421b249dfa3 Thai Thien 2019-09-17 18:00:35
dataset script bc5c052f5f956510ab95ef9a45434fd486c57fae Thai Thien 2019-09-16 17:21:13
evaluator ffc5bf8290ae0c469a9a18a2d061cfd1bfeee822 Thai Thien 2019-09-14 04:56:35
Commit eac63f2671dc5b064753acc4f40bf0f9f216ad2a - wip : perspective map
Author: Thai Thien
Author date (UTC): 2019-10-04 16:26
Committer name: Thai Thien
Committer date (UTC): 2019-10-04 16:26
Parent(s): f2106e700b6f6174d4dd276f25ec6f3d9ff239bb
Signing key:
Tree: 708816d327e404c4564f072795a65513febbb358
File                          Lines added  Lines deleted
.gitignore                    2            0
data_flow.py                  48           0
dataset_script/unzip_file.sh  8            1
File .gitignore changed (mode: 100644) (index 35eb2d2..169fe2b)
  __pycache__/
  data/

  visualize/
+ *.zip
+ saved_model/
File data_flow.py changed (mode: 100644) (index 2d67c33..19347e3)
  def load_data_shanghaitech_pacnn(img_path, train=True):
      ...
      return img, (target1, target2, target3)


+ def load_data_shanghaitech_pacnn_with_perspective(img_path, train=True):
+     """
+     # TODO: TEST this
+     :param img_path: should contain sub folder images (contain IMG_num.jpg) and ground-truth-h5;
+                      the perspective map path (IMG_num.mat in p_map) is derived from it
+     :param train:
+     :return:
+     """
+     gt_path = img_path.replace('.jpg', '.h5').replace('images', 'ground-truth-h5')
+     p_path = img_path.replace(".jpg", ".mat").replace("images", "p_map")
+     img = Image.open(img_path).convert('RGB')
+     gt_file = h5py.File(gt_path, 'r')
+     target = np.asarray(gt_file['density'])
+     # read the perspective-map dataset, not the File object itself
+     # (dataset key "pmap" assumed; adjust to the .mat file's actual variable name)
+     perspective = np.asarray(h5py.File(p_path, "r")["pmap"])
+
+     if train:
+         crop_size = (int(img.size[0] / 2), int(img.size[1] / 2))
+         if random.randint(0, 9) <= -1:  # branch intentionally disabled
+             dx = int(random.randint(0, 1) * img.size[0] * 1. / 2)
+             dy = int(random.randint(0, 1) * img.size[1] * 1. / 2)
+         else:
+             dx = int(random.random() * img.size[0] * 1. / 2)
+             dy = int(random.random() * img.size[1] * 1. / 2)
+
+         img = img.crop((dx, dy, crop_size[0] + dx, crop_size[1] + dy))
+         target = target[dy:crop_size[1] + dy, dx:crop_size[0] + dx]
+         # crop the perspective map, not the density map a second time
+         perspective = perspective[dy:crop_size[1] + dy, dx:crop_size[0] + dx]
+         if random.random() > 0.8:
+             target = np.fliplr(target)
+             img = img.transpose(Image.FLIP_LEFT_RIGHT)
+             perspective = np.fliplr(perspective)
+
+     target1 = cv2.resize(target, (int(target.shape[1] / 8), int(target.shape[0] / 8)),
+                          interpolation=cv2.INTER_CUBIC) * 64
+     target2 = cv2.resize(target, (int(target.shape[1] / 16), int(target.shape[0] / 16)),
+                          interpolation=cv2.INTER_CUBIC) * 256
+     target3 = cv2.resize(target, (int(target.shape[1] / 32), int(target.shape[0] / 32)),
+                          interpolation=cv2.INTER_CUBIC) * 1024
+
+     perspective_s = cv2.resize(perspective, (int(perspective.shape[1] / 16), int(perspective.shape[0] / 16)),
+                                interpolation=cv2.INTER_CUBIC) * 256
+
+     perspective_m = cv2.resize(perspective, (int(perspective.shape[1] / 8), int(perspective.shape[0] / 8)),
+                                interpolation=cv2.INTER_CUBIC) * 64
+
+     return img, (target1, target2, target3, perspective_s, perspective_m)
+
  def load_data_ucf_cc50_pacnn(img_path, train=True):
      """
      dataloader for UCF-CC-50 dataset
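The multipliers in the new loader (64, 256, 1024) are the squares of the downsampling factors (8, 16, 32): averaging during a resize shrinks the sum of a density map by roughly the square of the factor, so multiplying it back keeps the total count intact. A minimal sketch, using a numpy block average as a stand-in for cv2.resize:

```python
import numpy as np

# Toy density map: each pixel holds a density value; the sum is the crowd count.
rng = np.random.default_rng(0)
density = rng.random((64, 64)).astype(np.float32)
count = float(density.sum())

# Downsample by 8 with block averaging (stand-in for cv2.resize here),
# then rescale by 8**2 = 64 -- the same factor the loader applies --
# so the total count is preserved.
small = density.reshape(8, 8, 8, 8).mean(axis=(1, 3)) * 64

print(round(count, 2), round(float(small.sum()), 2))  # the two sums match
```

With cv2.INTER_CUBIC the preservation is only approximate (cubic interpolation is not an exact block average), which is why the loader treats it as a rescaled target rather than an exact invariant.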
File dataset_script/unzip_file.sh changed (mode: 100644) (index c40824f..b389f77)
  mv perspective-shanghaitech.zip data/
  mv shanghaitech-with-people-density-map.zip data/
  cd data/
  unzip perspective-shanghaitech.zip
+ unzip shanghaitech-with-people-density-map.zip
+ cd ..
+
+ # put perspective maps in place
+ mv data/perspective-ShanghaiTech/A/train_pmap/ data/ShanghaiTech/part_A/train_data/pmap/
+ mv data/perspective-ShanghaiTech/A/test_pmap/ data/ShanghaiTech/part_A/test_data/pmap/
+ mv data/perspective-ShanghaiTech/B/train_pmap/ data/ShanghaiTech/part_B/train_data/pmap/
+ mv data/perspective-ShanghaiTech/B/test_pmap/ data/ShanghaiTech/part_B/test_data/pmap/
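Given a layout like the one this script assembles, the loader in data_flow.py derives its ground-truth and perspective-map paths from the image path purely by string replacement. A minimal sketch of that convention (folder name p_map as written in the loader; note the script above moves the maps into pmap/, so the two names must be kept in sync):

```python
# Path derivation as done in load_data_shanghaitech_pacnn_with_perspective:
# swap the extension and the folder name to find the companion files.
img_path = "data/ShanghaiTech/part_A/train_data/images/IMG_1.jpg"

gt_path = img_path.replace('.jpg', '.h5').replace('images', 'ground-truth-h5')
p_path = img_path.replace('.jpg', '.mat').replace('images', 'p_map')

print(gt_path)  # data/ShanghaiTech/part_A/train_data/ground-truth-h5/IMG_1.h5
print(p_path)   # data/ShanghaiTech/part_A/train_data/p_map/IMG_1.mat
```

Because str.replace substitutes every occurrence, this only works while "images" and ".jpg" appear exactly once in the path.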
Clone this repository using HTTP(S):
git clone https://rocketgit.com/user/hahattpro/crowd_counting_framework

Clone this repository using ssh (do not forget to upload a key first):
git clone ssh://rocketgit@ssh.rocketgit.com/user/hahattpro/crowd_counting_framework

Clone this repository using git:
git clone git://git.rocketgit.com/user/hahattpro/crowd_counting_framework
