Add pypi link to README

plus some *free* empty lines :)
pull/31/head
Viktor Szépe 2020-07-25 16:19:14 +02:00 committed by GitHub
parent edd7df5dc6
commit cbb5c1b6c9
No known key found for this signature in database
GPG key ID: 4AEE18F83AFDEB23
1 changed file with 6 additions and 2 deletions

@@ -10,6 +10,7 @@ If you would like to use Fawkes to protect your identity, please check out our b
Copyright
---------
This code is intended only for personal privacy protection or academic research.
We are currently exploring the filing of a provisional patent on the Fawkes algorithm.
@@ -39,11 +40,13 @@ when --mode is `custom`:
`fawkes -d ./imgs --mode mid`
### Tips
- Perturbation generation takes ~60 seconds per image on a CPU machine and is much faster on a GPU machine. Use `batch-size=1` on CPU and `batch-size>1` on GPUs.
- Turn on separate target if the images in the directory belong to different people; otherwise, turn it off.
- Run on GPU. The current fawkes package and binary do not support GPU. To use a GPU, clone this repository, install the required packages in `setup.py`, and replace tensorflow with tensorflow-gpu. Then you can run fawkes with `python3 fawkes/protection.py [args]`.
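
The GPU workflow described in the last tip can be sketched as the shell steps below. This is a sketch under assumptions: the repository URL and the exact flag names (e.g. `--batch-size`) are taken from the project's public GitHub page and the tips above, not verified here; adjust to your environment.

```shell
# Clone the repository (URL assumed from the project's GitHub page)
git clone https://github.com/Shawn-Shan/fawkes.git
cd fawkes

# Install the dependencies declared in setup.py
pip install -e .

# Swap the CPU TensorFlow build for the GPU build, as the tip suggests
pip uninstall -y tensorflow
pip install tensorflow-gpu

# Run fawkes from the source tree; on a GPU a batch size > 1 is worthwhile
python3 fawkes/protection.py -d ./imgs --mode mid --batch-size 8
```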
### How do I know my images are secure?
We are actively working on this. A Python script that tests the protection's effectiveness will be ready shortly.
Quick Installation
@@ -57,9 +60,10 @@ pip install fawkes
If you don't have root privileges, install into your user site-packages instead: `pip install --user fawkes`.
[pypi_fawkes]: https://pypi.org/project/fawkes/
### Citation
```
@inproceedings{shan2020fawkes,
title={Fawkes: Protecting Personal Privacy against Unauthorized Deep Learning Models},