List of commits:
Subject | Hash | Author | Date (UTC)
data_script run seem ok | 67420c08fc1c10a66404d3698994865726a106cd | Thai Thien | 2020-02-01 03:33:18
add perspective | 642d6fff8c9f31e510fda85a7fb631fb855d8a6d | Thai Thien | 2019-10-06 16:54:44
fix padding with p | 86c2fa07822d956a34b3b37e14da485a4249f01b | Thai Thien | 2019-10-06 02:52:58
pacnn perspective loss | fb673e38a5f24ae9004fe2b7b93c88991e0c2304 | Thai Thien | 2019-10-06 01:38:28
data_flow shanghaitech_pacnn_with_perspective seem working | 91d350a06f358e03223966297d124daee94123d0 | Thai Thien | 2019-10-06 01:31:11
multiscale loss and final loss only mode | c65dd0e74ad28503821e5c8651a3b47b4a0c7c64 | Thai Thien | 2019-10-05 15:58:19
wip : perspective map | eac63f2671dc5b064753acc4f40bf0f9f216ad2a | Thai Thien | 2019-10-04 16:26:56
shell script | f2106e700b6f6174d4dd276f25ec6f3d9ff239bb | thient | 2019-10-04 07:42:51
WIP | 42c7c8e1d772fbbda61a4bdf9e329f74e1efb600 | tthien | 2019-10-03 17:52:47
add readme | 580cf43d1edddd67b1f6a2c57fdd5cee3dba925c | Thai Thien | 2019-10-02 17:44:49
update script, debug | ddb68b95389be1c1d398118677dd227a8bb2b70b | Thai Thien | 2019-10-02 15:52:31
add d (output density map) to loss function) | a0c71bf4bf2ab7393d60b06a84db8dfbbfb1a6c2 | tthien | 2019-09-30 16:32:39
fix the args, add save interval for model, so we don't save them all | 9fdf9daa2ac4bd12b7b62521d81e520db0debd01 | tthien | 2019-09-30 16:30:00
meow | 1ad19a22a310992e27a26471feeb37375124d075 | tthien | 2019-09-29 18:25:43
fix pacnn perspective map | 453ece3ccb818889ba895bfc4285f7905d33cba5 | Thai Thien | 2019-09-25 17:20:33
apex not work so well | da8c0dd57297f972201f31d57e66897177922f48 | Thai Thien | 2019-09-24 17:25:59
fix data loader pacnn so it will scale up with correct number of people | 11d55b50d764511f2491291f0208fee0905dec49 | Thai Thien | 2019-09-24 15:40:56
add comet ml | a9d4b89ce594f5e241168ccafdcdf0f150ea0ebb | Thai Thien | 2019-09-23 17:07:58
fix pacnn avg schema | c2140a96886195782e5689c24aeeb4fe7a2db7ad | Thai Thien | 2019-09-22 17:35:01
debug number not divisible by 8 | a568fd7f294a8bd31b3db78437b4b6b51b5b41b9 | Thai Thien | 2019-09-22 04:36:06
Commit 67420c08fc1c10a66404d3698994865726a106cd - data_script run seem ok
Author: Thai Thien
Author date (UTC): 2020-02-01 03:33
Committer name: Thai Thien
Committer date (UTC): 2020-02-01 03:33
Parent(s): 642d6fff8c9f31e510fda85a7fb631fb855d8a6d
Signing key:
Tree: 7638cbed4da5b04760aee3d450df28f6f30797c7
File | Lines added | Lines deleted
README.md | 8 | 0
dataset_script/README.md | 20 | 0
dataset_script/download_kaggle_crowd_dataset.sh | 0 | 3
dataset_script/unzip_file.sh | 12 | 11
hard_code_variable.py | 5 | 1
File README.md changed (mode: 100644) (index bfae287..b2f2bf7)
 conda install -c comet_ml -c conda-forge comet_ml
 conda install ignite -c pytorch
 conda install h5py
 conda install scikit-learn
+```
+
+### make data folder
+Let make `/data` folder at root
+```
+cd /
+sudo mkdir data
+sudo chown -R tt /data/
 ```
File dataset_script/README.md changed (mode: 100644) (index 82f573e..c7aafda)
 conda install -c conda-forge kaggle

 Set Kaggle api to ~/.kaggle/kaggle.json as [instructed](https://github.com/Kaggle/kaggle-api)

+### Set up dataset
+
+Download dataset from kaggle
+
+```shell script
+sh download_kaggle_crowd_dataset.sh
+```
+
+Unzip and set name for dataset
+
+```shell script
+sh unzip_file.sh
+```
+
+Default, dataset will be save to `/data`. However, you have to manually create `/data` folder.
+
+```shell script
+sudo mkdir data
+sudo chown -R tt /data/
+```
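The README section added above boils down to three steps: create `/data`, download from Kaggle, then unzip into place. A minimal sketch of that sequence as one script (the guards around the helper scripts and the silent `mkdir` fallback are additions for safety, not part of the committed README; `tt` in the README is the author's user name, so substitute your own when running `chown`):

```shell
#!/bin/sh
DATAPATH=/data

# 1. Create the data folder the scripts expect (one-time setup; if this
#    fails, create it manually: sudo mkdir -p /data && sudo chown -R "$USER" /data)
mkdir -p "$DATAPATH" 2>/dev/null || true

# 2. Download the datasets from Kaggle (requires ~/.kaggle/kaggle.json)
if [ -f download_kaggle_crowd_dataset.sh ]; then
    sh download_kaggle_crowd_dataset.sh
fi

# 3. Unzip and lay the datasets out under $DATAPATH
if [ -f unzip_file.sh ]; then
    sh unzip_file.sh
fi
```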
File dataset_script/download_kaggle_crowd_dataset.sh changed (mode: 100644) (index e9df98e..ce48f8d)
 #kaggle datasets download ucf-cc-50-with-people-density-map
-#
 kaggle datasets download shanghaitech-with-people-density-map
-
-
 kaggle datasets download perspective-shanghaitech
File dataset_script/unzip_file.sh changed (mode: 100644) (index b389f77..64fe5c7)
-mkdir -p data/
+PWD="$(pwd)"
+DATAPATH=/data
+# if DATAPATH is not exist, please manually make it

 # mv shanghaitech-with-people-density-map.zip data/ ; unzip data/shanghaitech-with-people-density-map.zip

 # mv ucf-cc-50-with-people-density-map.zip ../data/ ; cd ../data/ ;unzip ucf-cc-50-with-people-density-map.zip

-mv perspective-shanghaitech.zip data/
-mv shanghaitech-with-people-density-map.zip data/
-cd data/
-unzip perspective-shanghaitech.zip
-unzip shanghaitech-with-people-density-map.zip data/
-cd ..
+# mv perspective-shanghaitech.zip $DATAPATH
+# mv shanghaitech-with-people-density-map.zip $DATAPATH
+
+unzip perspective-shanghaitech.zip $DATAPATH
+unzip shanghaitech-with-people-density-map.zip $DATAPATH

 # put perspective
-mv data/perspective-ShanghaiTech/A/train_pmap/ data/ShanghaiTech/part_A/train_data/pmap/
-mv data/perspective-ShanghaiTech/A/test_pmap/ data/ShanghaiTech/part_A/test_data/pmap/
-mv data/perspective-ShanghaiTech/B/train_pmap/ data/ShanghaiTech/part_B/train_data/pmap/
-mv data/perspective-ShanghaiTech/B/test_pmap/ data/ShanghaiTech/part_B/test_data/pmap/
+mv $DATAPATH/perspective-ShanghaiTech/A/train_pmap/ $DATAPATH/ShanghaiTech/part_A/train_data/pmap/
+mv $DATAPATH/perspective-ShanghaiTech/A/test_pmap/ $DATAPATH/ShanghaiTech/part_A/test_data/pmap/
+mv $DATAPATH/perspective-ShanghaiTech/B/train_pmap/ $DATAPATH/ShanghaiTech/part_B/train_data/pmap/
+mv $DATAPATH/perspective-ShanghaiTech/B/test_pmap/ $DATAPATH/ShanghaiTech/part_B/test_data/pmap/
File hard_code_variable.py changed (mode: 100644) (index 3f5b17a..b844ba1)
 class HardCodeVariable():
+    """
+    Where did you put your data ?
+    set it here, so you don't have to set in every script
+    """
     def __init__(self):
         self.UCF_CC_50_PATH = "/data/cv_data/UCFCrowdCountingDataset_CVPR13_with_people_density_map/UCF_CC_50"
-        self.SHANGHAITECH_PATH = "data/ShanghaiTech/part_A/train_data"
+        self.SHANGHAITECH_PATH = "/data/ShanghaiTech/part_A/train_data"
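The `hard_code_variable.py` change above centralizes dataset paths in one class so each script does not hard-code its own copy (and fixes the relative `data/...` path to the absolute `/data/...`). A minimal sketch of how such a class might be consumed from a training script; this is a simplified stand-in showing only the two attributes visible in the diff, and the `images` subdirectory is a hypothetical example, not a path from the repository:

```python
import os

class HardCodeVariable:
    """Single shared place for dataset paths (simplified stand-in)."""
    def __init__(self):
        self.UCF_CC_50_PATH = "/data/cv_data/UCFCrowdCountingDataset_CVPR13_with_people_density_map/UCF_CC_50"
        self.SHANGHAITECH_PATH = "/data/ShanghaiTech/part_A/train_data"

# Usage in a script: read paths from the shared object instead of
# repeating string literals in every file.
hard = HardCodeVariable()
train_images = os.path.join(hard.SHANGHAITECH_PATH, "images")  # hypothetical subdir
print(train_images)
```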
Clone this repository:
git clone https://rocketgit.com/user/hahattpro/crowd_counting_framework