wget

Cronjob with password protected site (.htaccess)

Submitted by 有些话、适合烂在心里 on 2020-01-12 18:47:17
Question: I want to create a cronjob that opens a webpage every X amount of time. This webpage is password protected by .htaccess (user=admin, password=pass). The command I use is the following: wget --user=admin --password='pass' http://www.mywebsite.com/test.php But cron gives me the following error: --2012-05-02 10:14:01-- http://www.mywebsite.com/test.php Resolving www.mywebsite.com... IP Connecting to www.mywebsite.com|IP|:80... connected. HTTP request sent, awaiting response... 401 Authorization
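The excerpt cuts off before any resolution; as a minimal sketch only, a crontab entry for such a job could look like the following (the schedule, the wget path, and discarding the output to /dev/null are assumptions, not from the question):

# Hedged sketch of a crontab entry: run every 15 minutes, send HTTP Basic Auth
# credentials, suppress normal output, and discard the downloaded page.
*/15 * * * * /usr/bin/wget -q --user=admin --password='pass' -O /dev/null http://www.mywebsite.com/test.php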

How do I request a file but not save it with Wget? [closed]

Submitted by 风流意气都作罢 on 2020-01-11 14:51:45
Question: I'm using Wget to make HTTP requests to a fresh web server in order to warm the MySQL cache. I do not want to save the files after they are served. wget -nv -do-not-save-file $url Can I do something like -do-not-save-file with wget? Answer 1: Use the -q flag for quiet mode, and tell wget to output to stdout with -O
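A minimal sketch of what the quoted answer describes ($url is the asker's placeholder, not a defined variable):

# -q suppresses wget's output; -O- writes the body to stdout, which is then discarded,
# so nothing is written to disk.
wget -qO- "$url" > /dev/null
# Alternatively, write the body straight to /dev/null:
wget -q -O /dev/null "$url"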

Installing nginx with wget

Submitted by 痞子三分冷 on 2020-01-11 14:34:24
# Download:
wget http://nginx.org/download/nginx-1.8.0.tar.gz
# Extract:
tar -zxvf nginx-1.8.0.tar.gz
# Install build dependencies:
yum install -y gcc gcc-c++ ncurses-devel perl pcre pcre-devel zlib gzip zlib-devel
# Enter the directory, configure, and build:
cd nginx-1.8.0
./configure
make && make install
Source: https://www.cnblogs.com/waibizi/p/12179625.html
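Not part of the original note, but as a quick sanity check, assuming the stock ./configure prefix of /usr/local/nginx (no --prefix was passed above), you could start nginx and confirm it answers locally:

# Assumed default install prefix /usr/local/nginx.
/usr/local/nginx/sbin/nginx          # start nginx
/usr/local/nginx/sbin/nginx -v       # print the installed version
curl -I http://127.0.0.1/            # expect an HTTP 200 for the nginx welcome page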

Mirroring a site with wget

Submitted by 那年仲夏 on 2020-01-08 11:53:46
Download an entire website with wget:
$ wget -r -p -np -k www.baidu.com
wget option notes:
-r, --recursive        download recursively.
-p, --page-requisites  download all elements (images and the like) needed to display the HTML pages.
-np, --no-parent       do not ascend to the parent directory.
-k, --convert-links    make links in the downloaded HTML or CSS point to local files.
For more common options see: https://www.cnblogs.com/ftl1012/p/9265699.html
Source: https://www.cnblogs.com/xlizi/p/12165248.html
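Not in the original post, but a commonly combined variant of the same command; --wait and -e robots=off are standard wget options, and the host is just the example from above:

# Hedged variant: same recursive mirror, but wait 1 second between requests
# and ignore robots.txt (use with care on sites you do not control).
wget -r -p -np -k --wait=1 -e robots=off https://www.baidu.com/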

Is there any javascript (and client-side) wget implementation?

Submitted by 感情迁移 on 2020-01-06 09:03:49
Question: In order to provide a service for webmasters, I need to download the public part of their site. I'm currently doing it using wget on my server, but it introduces a lot of load, and I'd like to move that part to the client side. Does an implementation of wget exist in Javascript? If it exists, I could zip the files and send them to my server for processing, which would allow me to concentrate on the core business of my app. I know some compression libraries exist in Js (such as zip.js), but I

Script for a changing URL

Submitted by 懵懂的女人 on 2020-01-06 04:21:14
Question: I am having a bit of trouble coding a process or script that would do the following: I need to get data from the URL nomads.ncep.noaa.gov/dods/gfs_hd/gfs_hd20140430/gfs_hd_00z But the file URLs change (the days and model runs change), so the script has to assume this base structure with variables: Y - Year M - Month D - Day C - Model Forecast/Initialization Hour F - Model Frame Hour Like so: nomads.ncep.noaa.gov/dods/gfs_hd/gfs_hdYYYYMMDD/gfs_hd_CCz This script would run, and then import that date
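The excerpt ends before any solution; purely as a sketch, a shell script could assemble the dated URL from the current date and fetch it with wget. The fixed 00z run hour, the output filename, and fetching the URL directly are assumptions, not from the question:

#!/bin/bash
# Hypothetical sketch: build today's gfs_hd URL (YYYYMMDD and CC from the pattern above).
BASE="http://nomads.ncep.noaa.gov/dods/gfs_hd"
DATE=$(date -u +%Y%m%d)   # YYYYMMDD in UTC
RUN="00"                  # model initialization hour (CC), assumed here
URL="${BASE}/gfs_hd${DATE}/gfs_hd_${RUN}z"
wget -O "gfs_hd_${DATE}_${RUN}z" "$URL"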

wget: where does it look for certificates?

Submitted by 坚强是说给别人听的谎言 on 2020-01-03 16:56:56
Question: I have an HTTPS site that needs an intermediate certificate to verify the server's SSL certificate. If I put the intermediate cert into /etc/ssl/certs (and make the hash link) then openssl s_client -connect IP:PORT works. Otherwise I get a verification error. Where does wget look for certificates? I can only make it work if I explicitly set --ca-directory in wget. So it seems openssl looks into /etc/ssl/certs and wget does not. Thanks! EDIT If I run wget with -d then I see without --ca
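The workaround the asker already mentions, written out as a sketch (paths and hostname are placeholders): point wget at the CA directory explicitly, or at a bundle file that includes the intermediate certificate.

# --ca-directory and --ca-certificate are standard wget options.
wget --ca-directory=/etc/ssl/certs https://www.example.com/
wget --ca-certificate=/etc/ssl/certs/chain.pem https://www.example.com/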

One-liner to split very large directory into smaller directories on Unix

Submitted by 假如想象 on 2020-01-03 11:49:00
Question: How do you split a very large directory, containing potentially millions of files, into smaller directories with some custom-defined maximum number of files, such as 100 per directory, on UNIX? Bonus points if you know of a way to have wget download files into these subdirectories automatically. So if there are 1 million .html pages at the top-level path of www.example.com , such as /1.html /2.html ... /1000000.html and we only want 100 files per directory, it will download them to folders
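The excerpt ends before any answer; as a minimal sketch (not from the thread), already-downloaded files could be moved into numbered subdirectories of at most 100 files each. The .html pattern and the 100-per-directory limit follow the question; the directory names are an assumption:

# Move files into subdirectories 0/, 1/, 2/, ... with at most 100 files each.
i=0
for f in *.html; do
  d=$(( i / 100 ))        # directory index grows every 100 files
  mkdir -p "$d"
  mv -- "$f" "$d/"
  i=$(( i + 1 ))
done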
