Subject | Hash | Author | Date (UTC) |
---|---|---|---|
update documentation | feaaa4095d89e99f71f0684eec302d01e421708d | Thai Thien | 2020-04-30 15:35:38 |
change gpu | c7c6aa29ccee5736c2d4cb290c2ff71c6d897d57 | Thai Thien | 2020-04-30 12:31:11 |
reduce batch size | 2b2820095211e117c6d02236ef85adb89b70098f | Thai Thien | 2020-04-30 12:28:32 |
forget .to(device) | a641d5cdf82eb89c9034253785e80dd920c1f5d6 | Thai Thien | 2020-04-30 12:25:22 |
reduce batch size of shb keepfull | 044f6785fedcf442def0fdb24109413a79274289 | Thai Thien | 2020-04-30 12:20:21 |
ccnn v7 with ssim loss | 40079e5354eef95e0e89cf23a7f2025ee362e232 | Thai Thien | 2020-04-30 12:16:19 |
local | 1d4e08754bd493e76cf09076de7c76adf1a30a0a | Thai Thien | 2020-04-30 12:10:15 |
mse-ssim loss function | 052de7c879bff7690a7cfc1905c8376bf8605c45 | Thai Thien | 2020-04-30 11:41:24 |
delete misinformation note | ca07533d585def04eec086685f2e72eacc330ddc | Thai Thien | 2020-04-29 16:59:47 |
typo | 3505077301af5349665f12121862db2512ad450a | Thai Thien | 2020-04-29 16:59:03 |
change, adapt, survive | 6755c8375302af2146cb63adc967631d53a7b1c8 | Thai Thien | 2020-04-29 16:48:10 |
gpu | 861ec0d41cea2eec359c6ddfe207e1ed6b583369 | Thai Thien | 2020-04-28 18:08:11 |
ccnn_v7_t4 | a84f63e64fe8b31fe22c94d383de5ed4e1a27fe4 | Thai Thien | 2020-04-28 18:07:33 |
2500 | d37516f53d21f3dfc05143ba5ffda80fc5a07825 | Thai Thien | 2020-04-28 17:56:57 |
change max epoch | 246ae30e6dd2b4a15b7dc70a3dc05592ac1c48f2 | Thai Thien | 2020-04-28 17:55:23 |
h1 t3 h1 t4 | fbcd13dd240e06a982a1ce48f27cd1d542a26b63 | Thai Thien | 2020-04-28 17:53:06 |
h1 | 96204cb5262500020371637741131d24e3fa91d0 | Thai Thien | 2020-04-27 17:35:51 |
typo | adfb213c2564bc90b8b69811469534b004808644 | Thai Thien | 2020-04-27 17:17:58 |
batch size 8 for shb | c90fa9a5d725a1ef0d29ed23f947ee05b9aa7894 | Thai Thien | 2020-04-27 17:10:34 |
change proxy | 30cd53782eb17b416c471502f1e6c6e9975a644b | Thai Thien | 2020-04-27 17:06:28 |
File | Lines added | Lines deleted |
---|---|---|
README.md | 16 | 1 |
dataset_script/README.md | 9 | 1 |
dataset_script/download_kaggle_crowd_dataset.sh | 2 | 2 |
File README.md changed (mode: 100644) (index c3455df..bfe4add) | |||
1 | # WORK IN PROGRESS | ||
2 | I am trying to publish a paper in order to graduate. I really need to publish. | ||
3 | |||
4 | <b> Please do not copy my work and my ideas to use as your own. </b> I really need to publish (a) paper(s); I hope you understand. | ||
5 | <br><br> | ||
6 | |||
7 | However, you can use my framework as starter code for your own work, or use my re-implementations of other works. | ||
8 | |||
9 | |||
1 | 10 | ### Environment | ### Environment |
2 | 11 | ``` | ``` |
3 | 12 | conda create -n env python=3.7 anaconda | conda create -n env python=3.7 anaconda |
... | ... | sudo chown -R tt /data/ | |
30 | 39 | When you use comet.ml | When you use comet.ml |
31 | 40 | ```shell script | ```shell script |
32 | 41 | conda install -c comet_ml comet_ml | conda install -c comet_ml comet_ml |
33 | ``` | ||
42 | ``` | ||
43 | |||
45 | TODO: | ||
46 | https://github.com/kornia/kornia/blob/master/kornia/losses/ssim.py | ||
47 | |||
48 | add SSIM loss from here | ||
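The TODO above plans to adapt kornia's SSIM for the `mse-ssim loss function` commit. A minimal NumPy sketch of a combined MSE + SSIM loss is below; it uses a simplified *global* SSIM (single mean/variance over the whole map) rather than kornia's sliding Gaussian window, and the function names and the `alpha` weight are illustrative assumptions, not the repository's actual implementation:

```python
import numpy as np

def ssim_global(x, y, c1=0.01 ** 2, c2=0.03 ** 2):
    # Simplified global SSIM: one mean/variance/covariance over the whole array,
    # assuming inputs are scaled to [0, 1]. A real implementation (e.g. kornia)
    # computes this per pixel over a sliding Gaussian window.
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )

def mse_ssim_loss(pred, target, alpha=0.5):
    # Weighted sum of MSE and SSIM dissimilarity; alpha is a hypothetical weight.
    mse = ((pred - target) ** 2).mean()
    return mse + alpha * (1.0 - ssim_global(pred, target))
```

For identical prediction and target the SSIM term is exactly 1 and the loss is 0, which is a quick sanity check before wiring a windowed version into training.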
File dataset_script/README.md changed (mode: 100644) (index c7aafda..0900f05) | |||
1 | 1 | # Everything in this folder handles data | # Everything in this folder handles data |
2 | 2 | ||
3 | The datasets, which are publicly available on the internet, belong to their original authors. I only re-uploaded and processed them for my own project. I make them publicly available to save you some time. | ||
4 | |||
5 | Shanghaitech dataset [Single-Image Crowd Counting via Multi-Column Convolutional Neural Network](https://pdfs.semanticscholar.org/7ca4/bcfb186958bafb1bb9512c40a9c54721c9fc.pdf) | ||
6 | |||
7 | UCF-CC-50 dataset [Multi-Source Multi-Scale Counting in Extremely Dense Crowd Images](http://openaccess.thecvf.com/content_cvpr_2013/papers/Idrees_Multi-source_Multi-scale_Counting_2013_CVPR_paper.pdf) | ||
8 | |||
3 | 9 | ### Kaggle dataset | ### Kaggle dataset |
4 | 10 | ||
5 | First, install https://anaconda.org/conda-forge/kaggle | ||
11 | You will need a Kaggle account and a working installation of Conda on your machine. | ||
12 | |||
13 | First, install the [Kaggle](https://anaconda.org/conda-forge/kaggle) package with conda: | ||
6 | 14 | ```bash | ```bash |
7 | 15 | conda install -c conda-forge kaggle | conda install -c conda-forge kaggle |
8 | 16 | ``` | ``` |
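Before the Kaggle CLI can download anything, it needs an API token. A setup fragment is sketched below; the Downloads path is an assumption and depends on where your browser saved the token:

```shell
# Create an API token at kaggle.com (Account -> Create New API Token),
# then place the downloaded kaggle.json where the CLI expects it:
mkdir -p ~/.kaggle
mv ~/Downloads/kaggle.json ~/.kaggle/kaggle.json
# Restrict permissions; the CLI warns if the token is world-readable.
chmod 600 ~/.kaggle/kaggle.json
```

After this, the `kaggle datasets download` commands in the script below should authenticate without further prompting.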
File dataset_script/download_kaggle_crowd_dataset.sh changed (mode: 100644) (index ce48f8d..46c84d5) | |||
1 | 1 | #kaggle datasets download ucf-cc-50-with-people-density-map | #kaggle datasets download ucf-cc-50-with-people-density-map |
2 | kaggle datasets download shanghaitech-with-people-density-map | ||
3 | kaggle datasets download perspective-shanghaitech | ||
2 | kaggle datasets download tthien/shanghaitech-with-people-density-map | ||
3 | kaggle datasets download tthien/perspective-shanghaitech |