秀人美女网爬虫 (Xiuren Beauty Site Spider)【Windows】【22.12.03】

F:\Pycharm_Projects\meitulu-spider\venv\Scripts\python.exe F:\Pycharm_Projects\meitulu-spider\xrmnw.py 
****************************************************************************************************
秀人美女网爬虫
Version: 22.12.03
Blog: http://www.h4ck.org.cn
****************************************************************************************************
USAGE:
spider -h <help> -a <all> -q <search>
Arguments:
     -a <download all site images>
     -q <query the image with keywords>
     -h <display help text, just this>
Option Arguments:
     -p <image download path>
     -r <random index category list>
     -c <single category url>
     -e <early stop, work in site crawl mode only>
     -s <site url eg: http://www.xiurenji.vip (no last backslash "/")>
****************************************************************************************************
