
Displaying 20 results from an estimated 100000 matches similar to: "When download.file fails..."

2008 Oct 01
1
changing 'https' to 'http' when using download.file(), any side effects or just use RCurl?
Dear R-Help, From reading the help file, it is my understanding that the download.file() function does not support HTTPS connections. Therefore, understandably, the following produces an error: ### R Code > url <- "https://stat.ethz.ch/pipermail/r-help/2008-October/thread.html" > destfile <- "//PFO-SBS001/Redirected/tonyb/Desktop/R_web_test/tmp.txt"
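
A minimal sketch of the RCurl workaround, assuming the stated goal of fetching that HTTPS page (at the time download.file()'s internal method had no HTTPS support; current R handles https:// via method = "libcurl" or "wininet"):

  library(RCurl)
  url <- "https://stat.ethz.ch/pipermail/r-help/2008-October/thread.html"
  page <- getURL(url)          # fetch the page over HTTPS
  writeLines(page, "tmp.txt")  # save the text locally
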
2008 Mar 31
1
download.file error
Dear all, I am looking for a way to work out whether a file on the internet exists before attempting to download it with download.file(). For example, using a URL that does not exist: url <- "http://finance.yahoo.com/ftse.csv" destfile <- tempfile() download.file(url = url, destfile = destfile) # gives the following response ... trying URL
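
One hedged approach: RCurl::url.exists() can check first, or tryCatch() turns a failed download into a logged skip rather than an abort:

  url <- "http://finance.yahoo.com/ftse.csv"
  destfile <- tempfile()
  ok <- tryCatch(download.file(url, destfile) == 0,
                 error = function(e) FALSE,
                 warning = function(w) FALSE)
  if (!ok) message("file not available: ", url)
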
2010 Sep 16
2
FTP Download
Hi, I have problems downloading complete folders via FTP with R. Single files work fine. I tried RCurl, but it does not work. This is my code: url = "ftp://disc2.nascom.nasa.gov/data/TRMM/Gridded/Derived_Products/3B42_V6/Daily/2009/" filenames = getURL(url, ftp.use.epsv = FALSE, ftplistonly = TRUE, crlf = TRUE) filenames = paste(url, strsplit(filenames, "\r*\n")[[1]], sep =
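
For reference, a sketch finishing the pattern the post starts: list the directory, then loop over the names with download.file() (mode = "wb" is an assumption for the binary TRMM files):

  library(RCurl)
  url <- "ftp://disc2.nascom.nasa.gov/data/TRMM/Gridded/Derived_Products/3B42_V6/Daily/2009/"
  listing <- getURL(url, ftp.use.epsv = FALSE, ftplistonly = TRUE, crlf = TRUE)
  filenames <- strsplit(listing, "\r*\n")[[1]]
  for (f in filenames) {
    download.file(paste(url, f, sep = ""), destfile = f, mode = "wb")
  }
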
2009 Jul 22
0
download.file() help! setting the proxy user/password
I would like to download climate data files from the PCMDI website using R. I tried the line below but was not able to get the file, mainly due to the user name and password requirements. I am looking for help setting up the user and password within R (or elsewhere). I have read the FAQ, but unfortunately I am a newbie in R and couldn't figure out how to do it. Many thanks in advance
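
A hedged sketch based on ?download.file: the internal method reads the proxy and its credentials from environment variables (host, port, user, and password below are placeholders; setting http_proxy_user to "ask" prompts interactively instead):

  Sys.setenv(http_proxy = "http://proxyhost:8080/",
             http_proxy_user = "user:password")
  download.file("http://example.org/climate.nc",  # placeholder URL
                destfile = "climate.nc", mode = "wb")
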
2011 Nov 08
2
download.file
I am downloading, say, 100 files from the UCSC website and storing them in a dest folder. download.file() creates a file in the destination folder even if the remote file is not present, which is something I don't want. So I wrote an if condition to remove the file when download.file() returns a nonzero value. Now the script exits when there is an error or the file is not present. How can I use "try" and "if"
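
A minimal sketch of that logic, with urls and destdir as placeholders: log the failure, remove the stray empty file, and keep looping:

  for (u in urls) {
    destfile <- file.path(destdir, basename(u))
    status <- tryCatch(download.file(u, destfile),
                       error = function(e) 1L)
    if (status != 0) {
      unlink(destfile)          # drop the empty file left behind
      message("skipped: ", u)
      next
    }
  }
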
2009 Feb 26
2
ftp fetch using RCurl?
Hi everyone, I have to fetch about 300 to 500 zipped archives from a remote FTP server. Each archive is about 1 MB. I know I can get it done with download.file() in R, but I am curious whether there is a faster way using RCurl. For example, are there parameters I can set so that the connection does not need to be rebuilt, etc.? An even simpler question is: how can I
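
A hedged sketch of the usual answer: create one handle with getCurlHandle() and pass it to every request so the FTP connection is reused across the 300-500 fetches (base and files are placeholders):

  library(RCurl)
  h <- getCurlHandle(ftp.use.epsv = FALSE)
  base <- "ftp://server.example.org/archives/"
  for (f in files) {
    bin <- getBinaryURL(paste(base, f, sep = ""), curl = h)  # reuses the connection
    writeBin(bin, f)
  }
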
2012 Aug 28
1
R Download 'Permission Denied'
Hello, I am receiving a 'Permission Denied' error when trying to use the R download function. I am wondering whether this is an error in the code, a permissions issue at the source, or a permissions issue at the destination. If you could shed any light on this it would be very much appreciated; below is a sample of the code.
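
Without the full code this is only a guess, but the most common cause of 'Permission denied' from download.file() is a destfile that names a folder, or a path the user cannot write to, rather than a file; a sketch of the safe pattern:

  url  <- "http://example.org/report.csv"      # placeholder URL
  dest <- file.path(tempdir(), basename(url))  # a writable *file* path
  download.file(url, destfile = dest, mode = "wb")
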
2005 Jul 06
0
download.file() yields incomplete files with method="internal" (PR#7991)
Summary: When I use method="wget" with download.file(), I consistently get a download of the entire file. When I use method="internal", I infrequently get the entire file, but usually get only part of it. This behavior occurs with .cdf files (a weather file format, essentially binary) from a UCAR site. I am not sure this is a bug, since it could be some internet
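
A hedged guess for reports like this is text-mode mangling of the binary payload; forcing binary transfer is the usual first test:

  # placeholder URL standing in for the UCAR .cdf file
  download.file("http://example.ucar.edu/weather.cdf",
                destfile = "weather.cdf",
                method = "internal", mode = "wb")  # "wb" forces binary mode
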
2012 Apr 12
1
using wildcards in download.file?
Hi, Do you know whether it is possible to use wildcards in download.file()? For example: url = "ftp://abc.com/*.*" # to download all the files in the ftp folder download.file(url,destfile=...) # does not work, any solutions? Thanks! JIng
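
download.file() takes a single literal URL, so the standard answer is to emulate the wildcard: list the FTP directory with RCurl, then loop; a sketch:

  library(RCurl)
  url <- "ftp://abc.com/"
  listing <- getURL(url, ftp.use.epsv = FALSE, ftplistonly = TRUE)
  files <- strsplit(listing, "\r*\n")[[1]]  # one file name per line
  for (f in files)
    download.file(paste(url, f, sep = ""), destfile = f, mode = "wb")
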
2010 Nov 17
1
Problem downloading and opening netcdf file
I am trying to download and open an online netCDF file. I'm using Windows XP and R 2.11.1. Here's my script: library(ncdf) link <- "http://ibis.grdl.noaa.gov/SAT/SeaLevelRise/slr/slr_sla_gbl_free_all_66.nc" dest <- "C:/temp/slr_sla_gbl_free_all_66.nc" download.file(url=link,destfile=dest) nc1 <- open.ncdf(dest) The file appears in my C:/temp
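
A hedged completion of the script: netCDF is binary, and on Windows the internal method downloads in text mode by default, so mode = "wb" is very likely the missing piece:

  library(ncdf)
  link <- "http://ibis.grdl.noaa.gov/SAT/SeaLevelRise/slr/slr_sla_gbl_free_all_66.nc"
  dest <- "C:/temp/slr_sla_gbl_free_all_66.nc"
  download.file(url = link, destfile = dest, mode = "wb")  # "wb" for binary data
  nc1 <- open.ncdf(dest)
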
2008 Nov 14
1
Problems when I try to download.file pdfs
Hello, I have been trying to download a PDF file, but I only receive blank pages. I used the internet2 option on Windows (Rgui.exe --internet2) but received the same result. I use the following command: ulr2 <- "http://cran.r-project.org/doc/manuals/R-intro.pdf" download.file(url = ulr2, destfile = "D:\\users2\\r-intro.pdf",cacheOK = FALSE) Any ideas how I can resolve
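
Same hedged diagnosis as the other Windows threads here: a PDF is binary, so mode = "wb" should stop the blank pages:

  url2 <- "http://cran.r-project.org/doc/manuals/R-intro.pdf"
  download.file(url = url2, destfile = "D:\\users2\\r-intro.pdf",
                mode = "wb", cacheOK = FALSE)  # binary mode is the key change
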
2011 Aug 26
1
issue with available.packages() and download.file()
Dear R-Users, I think I have encountered a potential bug (or at least unwanted behavior), but I'm not sure, so I wanted to post here first. Lately I've been encountering an error when running a package I put together. The package is set up to check for updates when it loads, but this error occurs and stops the package from loading: Error : .onLoad failed in loadNamespace() for
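
Whatever the underlying bug, a defensive pattern keeps a network hiccup from blocking loadNamespace(); a minimal sketch, assuming the update check is the only networking done in .onLoad:

  .onLoad <- function(libname, pkgname) {
    tryCatch(
      utils::available.packages(),  # the networking call that may fail
      error = function(e) NULL,     # degrade silently when offline
      warning = function(w) NULL
    )
    invisible()
  }
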
2007 Jul 02
1
download.file - it works on my Mac but not on Windows.
Hi: I am working with someone remotely to allow them access to our data. The following command using "download.file" works perfectly on my Mac: > download.file(url="http://oceanwatch.pfeg.noaa.gov:8081/thredds/wcs/satellite/AG/ssta/14day?request=GetCoverage&version=1.0.0&service=WCS&format=NetCDF3&coverage=
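
The request returns NetCDF, so a hedged first check on the Windows side is binary mode; note the coverage parameter is truncated in the excerpt and must be filled in:

  u <- paste("http://oceanwatch.pfeg.noaa.gov:8081/thredds/wcs/satellite/AG/ssta/14day",
             "?request=GetCoverage&version=1.0.0&service=WCS&format=NetCDF3",
             "&coverage=",  # value elided in the post
             sep = "")
  download.file(url = u, destfile = "ssta.nc", mode = "wb")
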
2007 Sep 14
1
segfault in download.file
Hello, I was trying to use get.hist.quote in tseries, and got a segfault: > library(tseries) Loading required package: quadprog Loading required package: zoo 'tseries' version: 0.10-6 'tseries' is a package for time series analysis and computational finance. See 'library(help="tseries")' for
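
For context, get.hist.quote() builds a Yahoo URL and calls download.file() internally, so a crash here usually points at the download layer; a sketch of a typical call that exercises it (ticker and dates are placeholders):

  library(tseries)
  q <- get.hist.quote(instrument = "aapl",
                      start = "2007-01-01", end = "2007-09-01",
                      quote = "Close")
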
2016 Feb 18
1
default destfile in download.file()
A nice default value for the `destfile` argument in download.file() would be `basename(url)`, i.e. the name of the downloaded file. This would correspond to the default behavior of many other web/FTP clients and make code slightly more concise: download.file("https://svn.r-project.org/R/trunk/doc/CRAN_mirrors.csv") mirrors <- read.csv("CRAN_mirrors.csv")
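
Until such a default exists, a small wrapper gives the proposed behaviour; a sketch:

  download2 <- function(url, destfile = basename(url), ...) {
    download.file(url, destfile = destfile, ...)
    destfile
  }
  mirrors <- read.csv(download2("https://svn.r-project.org/R/trunk/doc/CRAN_mirrors.csv"))
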
2007 Oct 16
1
try / tryCatch for download.file( ) within a for loop when URL does not exist
I am trying to download a bunch of files from a server, for which I am using download.file() within a for loop. The script works fine until download.file() hits a URL that has no file, at which point it exits. I want to change this behavior to simply log the failure, keep the loop's state, and move on to the next iteration. I read about try / tryCatch but am having trouble
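
A minimal sketch of the requested pattern, assuming urls holds the file URLs: try() absorbs the error, the failure is logged, and the loop moves on:

  for (u in urls) {
    res <- try(download.file(u, destfile = basename(u)), silent = TRUE)
    if (inherits(res, "try-error")) {
      message("no file at: ", u)  # log it and keep the loop alive
      next
    }
  }
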
2011 Sep 16
1
download files using ftp: avoid error
I am planning to download a large number of files from a website. I am using the following script. files2down = c('aaa', 'bbb', ................) for (i in 1:len) { print(paste('downloading file', i, ' of total ', len)); url = paste(urlPrefix, files2down[i], sep='') destfile = paste(dest, 'inDir', files2down[i], sep='/')
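
Beyond skipping failures, FTP errors are often transient, so a small retry loop helps; a sketch built on the post's variables (urlPrefix, files2down, dest), with three attempts as an assumed choice:

  for (i in seq_along(files2down)) {
    url      <- paste(urlPrefix, files2down[i], sep = "")
    destfile <- paste(dest, "inDir", files2down[i], sep = "/")
    for (attempt in 1:3) {
      ok <- tryCatch(download.file(url, destfile, mode = "wb") == 0,
                     error = function(e) FALSE)
      if (ok) break
      Sys.sleep(2)  # brief pause before retrying
    }
  }
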
2009 Feb 07
1
Yahoo data downloading problem
Hi, I ran into problems while trying to download data from Yahoo using the yahoo.get.hist.quote() function. My script is as follows: app <- yahoo.get.hist.quote("aapl", start="02/07/09", end="02/07/06", quote="close") However, I got the following error: trying URL
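
Two hedged guesses at the failure: the dates are not in ISO form (and start falls after end), and quote should be capitalised; with the tseries function the call would look like:

  library(tseries)
  app <- get.hist.quote("aapl", start = "2006-02-07", end = "2009-02-07",
                        quote = "Close")  # ISO dates, start before end
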
2018 May 03
0
download.file does not process gz files correctly (truncates them?)
Dear all, I've been diving a bit deeper into this at the request of Tomas Kalibera, and found the following: - the lock on the file appears only after trying to read it with oligo, so that is not an R problem in itself; the problem is independent of external packages. - using Windows' fc utility and cygwin's cmp utility I found out that every so often the download.file() function inserts
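
While the root cause was being investigated, a hedged workaround discussed in threads of this era was to bypass the default Windows wininet method for binary files:

  download.file("http://example.org/chip.CEL.gz",  # placeholder URL
                destfile = "chip.CEL.gz",
                method = "libcurl", mode = "wb")   # avoid wininet for binaries
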
2010 Sep 21
1
diagnosing download.file() problems
I'm accessing around 95 tar files on an FTP server, ranging in size between 10 and 40 MB apiece. While I can certainly click on them and download them outside of R, I'd like to have my script do it. Retrieving the FTP directory with RCurl works fine (about 90% of the time), but downloading the files by looping through them is hit-or-miss. I may get 1-8 files downloaded and then
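
A sketch of making the loop diagnosable rather than random: lengthen the timeout and record a status per file (tar_urls is a placeholder vector of the 95 URLs):

  options(timeout = 300)  # seconds; the default is 60
  status <- vapply(tar_urls, function(u)
    tryCatch(download.file(u, basename(u), mode = "wb"),
             error = function(e) -1L),
    integer(1))
  table(status)           # 0 means success
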