similar to: RCurl - HTTP request of header ONLY

Displaying 20 results from an estimated 900 matches similar to: "RCurl - HTTP request of header ONLY"
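The query these results match - fetching only the HTTP response header with RCurl - is usually done by pairing a header gatherer with the nobody option. A minimal sketch, using www.r-project.org as a stand-in URL:

    library(RCurl)
    h <- basicHeaderGatherer()
    getURL("http://www.r-project.org/",
           headerfunction = h$update,   # collect the response header
           nobody = TRUE)               # HEAD-style request, no body transferred
    h$value()                           # named vector: status, Date, Content-Type, ...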

2011 Aug 30
1
Why does loading saved/cached objects add significantly to RAM consumption?
Dear list, I make use of cached objects extensively for time-consuming computations, and yesterday I happened to notice some very strange behavior in that respect: when I execute a given computation whose result I'd like to cache (tried both saving it as '.Rdata' and via package 'R.cache', which uses its own file type '.Rcache'), my R session consumes about 200 MB of
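A minimal sketch of the caching pattern being described, using R.cache; the cache key and the long-running step are placeholders, and object.size()/gc() are one way to compare the object's own size with the memory the session actually reports:

    library(R.cache)
    key <- list("my_analysis", version = 1)            # hypothetical cache key
    result <- loadCache(key)
    if (is.null(result)) {
      result <- lapply(1:10, function(i) rnorm(1e5))   # stand-in for the expensive computation
      saveCache(result, key = key)
    }
    print(object.size(result), units = "MB")           # size of the object itself
    gc()                                               # memory actually held by the session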
2012 Jun 07
1
How to set cookies in RCurl
Hi, I am trying to access a website and read its content. The website is a restricted-access site that I reach through a proxy server (which therefore requires me to enable cookies). I am having trouble getting RCurl to receive and send cookies. The following lines give me: library(RCurl) library(XML) url <- "http://www.theurl.com" content <- readHTMLTable(url) content
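A minimal sketch of wiring cookies and a proxy into one curl handle; the proxy host/port and cookie file below are placeholders, and the real site may need additional login steps:

    library(RCurl)
    library(XML)
    curl <- getCurlHandle(cookiefile = "cookies.txt",            # read cookies from here
                          cookiejar  = "cookies.txt",            # write received cookies back
                          proxy      = "proxy.example.com:8080", # placeholder proxy
                          followlocation = TRUE)
    page    <- getURL("http://www.theurl.com", curl = curl)
    content <- readHTMLTable(htmlParse(page, asText = TRUE))
    rm(curl); gc()    # finalizing the handle flushes the cookie jar to disk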
2013 Apr 24
0
string size limits in RCurl
Hi All, I am running into what appears to be a character size limit in a JSON string when trying to retrieve data from either `curlPerform()` or `getURL()`. Here is non-reproducible code [1], but it should shed some light on the problem. # Note that .base.url is the base URL for the API, q is a query, user # is specified, etc. session = getCurlHandle() curl.opts <- list(userpwd
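A minimal sketch of collecting a long response chunk by chunk with basicTextGatherer(), so nothing depends on a single fixed-size buffer; the URL and credentials are placeholders standing in for the API described above:

    library(RCurl)
    reader <- basicTextGatherer()
    curl.opts <- list(userpwd = "user:password")        # placeholder credentials
    curlPerform(url = "https://api.example.com/search?q=term",
                .opts = curl.opts,
                writefunction = reader$update)           # append each chunk as it arrives
    json <- reader$value()                               # the full JSON string
    nchar(json)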
2012 May 28
1
Rcurl, postForm()
Dear colleagues, Could I get some assistance using postForm() to scrape the business names and addresses at this website: http://www.brantford.ca/business/LocalBusinessCommunity/Pages/BusinessDirectorySearch.aspx I've read through (http://www.omegahat.org/RCurl/RCurlJSS.pdf) and scoured the web for tutorials, but I can't crack it. I'm aware that this is probably a pretty basic
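A minimal sketch of a postForm() call against that page. Note that ASP.NET search pages usually also require hidden fields such as __VIEWSTATE and __EVENTVALIDATION scraped from the page first, and the visible field name below is hypothetical:

    library(RCurl)
    library(XML)
    url  <- "http://www.brantford.ca/business/LocalBusinessCommunity/Pages/BusinessDirectorySearch.aspx"
    page <- postForm(url,
                     "txtBusinessName" = "restaurant",        # hypothetical form field name
                     style = "POST",
                     .opts = list(followlocation = TRUE))
    doc  <- htmlParse(page, asText = TRUE)
    # business names and addresses would then be extracted with xpathSApply(doc, ...)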
2010 Nov 23
0
[R] Catching a RCurl error?
> -----Original Message----- > From: r-help-bounces at r-project.org [mailto:r-help-bounces at r-project.org] > On behalf of Tal Galili > Sent: Tuesday, 23 November 2010 14:18 > To: r-help at r-project.org > Subject: [R] Catching a RCurl error? > > Hi all, > > I'm running a complex script which accesses the internet, and sometimes > it >
2010 Nov 23
0
Catching a RCurl error?
Hi all, I'm running a complex script which accesses the internet, and sometimes it stops with the error: Error in curlPerform(url = url, headerfunction = header$update, curl = curl, : Failure when receiving data from the peer. Is there a way to make the script "wait" longer, or not crash when this error happens? (I'm wondering if this should be done at the level of
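A minimal sketch of the two usual remedies: raise the libcurl timeouts and wrap the call in tryCatch() with a retry, so a transient "Failure when receiving data from the peer" does not abort the whole script. The URL and the retry/backoff numbers are placeholders:

    library(RCurl)
    fetch <- function(url, tries = 3) {
      for (i in seq_len(tries)) {
        res <- tryCatch(
          getURL(url, connecttimeout = 30, timeout = 300),   # be more patient than the defaults
          error = function(e) {
            message("attempt ", i, " failed: ", conditionMessage(e))
            NULL
          })
        if (!is.null(res)) return(res)
        Sys.sleep(5)                                         # back off before retrying
      }
      stop("all ", tries, " attempts failed for ", url)
    }
    html <- fetch("http://example.com/")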
2010 Nov 14
1
RCurl and cookies in POST requests
Hello. I know that it's usually possible to write cookies to a cookie file by removing the curl handle and doing a gc() call. I can do this with getURL(), but I just can't obtain the same results with postForm(). If I use: curlHandle <- getCurlHandle(cookiefile=FILE, cookiejar=FILE) and then do: getURL("http://example.com/script.cgi", curl=curlHandle) rm(curlHandle) gc() it's
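A minimal sketch mirroring the getURL() pattern with postForm(): pass the same handle through the curl argument, then drop it and call gc() so libcurl finalizes the handle and writes the jar. The form fields are placeholders:

    library(RCurl)
    FILE <- "cookies.txt"
    curlHandle <- getCurlHandle(cookiefile = FILE, cookiejar = FILE)
    postForm("http://example.com/script.cgi",
             user = "me", pass = "secret",      # hypothetical form fields
             style = "POST",
             curl = curlHandle)
    rm(curlHandle)
    gc()        # finalizing the handle is what writes cookies.txt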
2008 Aug 27
1
RCurl: using netrc with curlPerform
Hello, I am having trouble getting the curlPerform function to authenticate using the .netrc file. From the documentation I've read, it certainly seems as though this function should be able to authenticate via the .netrc file. The example I am using here comes from the "R as a Web Client - the RCurl package" paper and demonstrates using the .netrc file to access the
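A minimal sketch of pointing curlPerform() at the .netrc file via the netrc curl option; the host below is a placeholder for whichever machine entry the .netrc actually contains:

    library(RCurl)
    reader <- basicTextGatherer()
    curlPerform(url = "ftp://ftp.example.com/pub/data.txt",   # placeholder host
                netrc = 1L,                                   # take credentials from ~/.netrc
                writefunction = reader$update)
    reader$value()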
2011 Jun 06
1
RCurl and kerberos
Dear list, I would like to call a Kerberos-authenticated web-service from within R. Curl can do it: $ curl --negotiate -u : "http://my.web.service/" so I would expect that RCurl also has the capability, but I have not been able to find the correct options to set. listCurlOptions() does not return anything with negotiate, and searching the source of RCurl, the only thing I found was
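A minimal sketch of how `curl --negotiate -u :` is usually reproduced in RCurl: there is no option literally named "negotiate"; the mechanism is selected through the httpauth option (CURLAUTH_GSSNEGOTIATE, numeric value 4 in libcurl) together with an empty user:password, assuming RCurl is linked against a GSSAPI-enabled libcurl:

    library(RCurl)
    getURL("http://my.web.service/",
           httpauth = 4L,        # CURLAUTH_GSSNEGOTIATE
           userpwd  = ":")       # empty credentials, as in `curl -u :`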
2010 Apr 16
0
RCurl slow when sending data over 1kb
I am using RCurl's curlPerform command to send an XML string to an HTTP server running on the localhost. The command is something like this: reader <- basicTextGatherer() curlPerform(url="http://127.0.0.1/", httpheader=c('Content-Type' = "text/xml; charset=utf-8"), postfields=toString.XMLNode(xmlRoot(xdoc)), writefunction=reader$update,
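A minimal sketch of the usual fix for exactly this symptom: once the POST body exceeds roughly 1 KB, libcurl adds an "Expect: 100-continue" header and waits for the server's interim reply; sending an empty Expect header suppresses that wait. Here xdoc is assumed to be the XML document from the excerpt above:

    library(RCurl)
    library(XML)
    reader <- basicTextGatherer()
    curlPerform(url = "http://127.0.0.1/",
                httpheader = c('Content-Type' = "text/xml; charset=utf-8",
                               'Expect' = ""),                    # disable the 100-continue wait
                postfields = toString.XMLNode(xmlRoot(xdoc)),
                writefunction = reader$update)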
2008 May 07
1
[BioC] RCurl loading problem with 64 bit linux distribution
Martin, Well, thanks for jumping in! We need all the help we can get ;) I changed the execute bit as you suggested and recompiled; no luck, still the same error message. Below is the output you wanted me to look at; it's a bit beyond me, so I include both a brief grep summary and then the whole enchilada. I do note that my output is different from yours, but I'm not sure how to interpret it. I
2009 Sep 17
1
RCurl and Google Scholar's EndNote references
Hi! I've performed a Google Scholar Search using a query, let's say "Frank Harrell", and parsed the links to the EndNote references from the resulting HTML code. Now I'd like to download all the references automatically. For this, I have tried to use RCurl, but I can't seem to get it working: I always get error code "403 Forbidden" from the web server.
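A minimal sketch of the usual workaround for the 403: send a browser-like User-Agent and enable the cookie engine, since Google Scholar rejects the default libcurl agent. The EndNote link below is a placeholder, and bulk downloading may still be blocked (or disallowed) by Google:

    library(RCurl)
    ua  <- "Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0"
    ref <- getURL("http://scholar.google.com/scholar.enw?q=...",   # placeholder EndNote link
                  useragent = ua,
                  cookiefile = "",            # "" turns on the in-memory cookie engine
                  followlocation = TRUE)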
2009 Jun 02
1
Problem downloading webpages using batchfiles and RCurl from command line in Vista Basic - couldn't connect to host
Dear all, I am having a problem downloading webpages through R when I run it in the DOS window under Windows Vista Basic. I have downloaded the batchfiles from http://code.google.com/p/batchfiles/ and have successfully set the PATH. I open up 'Command Prompt' in Vista and type (after the C:\...> stuff): ### START ### C:\Users\Karen>Rscript -e "library(RCurl);
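One common cause of "couldn't connect to host" from a command-line script is a network proxy that has to be handed to libcurl explicitly; a minimal sketch with a placeholder host and port:

    library(RCurl)
    txt <- getURL("http://www.r-project.org/",
                  proxy = "proxy.example.com:8080",      # placeholder proxy
                  proxyuserpwd = "user:password")        # omit if the proxy needs no login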
2005 Jan 10
0
SSOAP and Rcurl with proxy server
Hi, I'm trying to use a Bioconductor package (KEGGSOAP) which relies on RCurl and SSOAP. As an example, a function exists: > list.organisms function () { orgs <- matrix(unlist(.SOAP(KEGGserver, "list_organisms", "", action = KEGGaction, xmlns = KEGGxmlns), use.names = FALSE), ncol = 2, byrow = TRUE) temp <- orgs[, 2] names(temp)
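Whatever the SSOAP/KEGGSOAP call ultimately passes to libcurl, the proxy itself is expressed with ordinary RCurl options; a minimal sketch of checking that the proxy settings can reach the KEGG server at all (host, port, and credentials are placeholders):

    library(RCurl)
    opts <- curlOptions(proxy = "proxy.example.com:8080",
                        proxyuserpwd = "user:password")   # omit if not required
    getURL("http://soap.genome.jp/", .opts = opts)        # placeholder endpoint check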
2010 Jul 03
1
XML and RCurl: problem with encoding (htmlTreeParse)
Hi All, First method: > library(XML) > theurl <- "http://home.sina.com" > download.file(theurl, "tmp.html") > txt <- readLines("tmp.html") > txt <- htmlTreeParse(txt, error=function(...){}, useInternalNodes = TRUE) > g <- xpathSApply(txt, "//p", function(x) xmlValue(x)) > head(grep(" ", g, value=T)) [1] " |
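A minimal sketch of the usual remedy: pass the page's declared encoding through to libxml2 when parsing. "GB2312" below is an assumption about home.sina.com; the real value should be read from the page's <meta http-equiv="Content-Type"> tag:

    library(RCurl)
    library(XML)
    theurl <- "http://home.sina.com"
    raw <- getURL(theurl)
    doc <- htmlParse(raw, asText = TRUE, encoding = "GB2312")   # encoding is an assumption
    g   <- xpathSApply(doc, "//p", xmlValue)
    head(g)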
2010 Aug 04
2
Finding the right url for RCurl
Hi all, I am using RCurl to try and download data from a website, but I'm having trouble finding out what URL to use. Here is the site: http://www.invescopowershares.com/products/holdings.aspx?ticker=PGX See how in the upper right, above the displayed sheet, there's a link to download the data as a .csv file? When I hit "copy url" and paste into getURL in R, it doesn't
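A minimal sketch, assuming the download link resolves to an export endpoint that takes the ticker as a query parameter; the path below is hypothetical and would have to be read off the real link (right-click the link, or watch the request in the browser's network panel):

    library(RCurl)
    csv <- getForm("http://www.invescopowershares.com/products/holdingscsv.aspx",  # hypothetical path
                   ticker = "PGX")
    holdings <- read.csv(text = csv)
    head(holdings)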
2009 Jan 19
3
download/retain text file structure with RCurl/getURL()
Dear list, I'm trying to download a text file directly from the internet using the RCurl package and the command getURL. Duncan Lang graciously helped me solve the first step in this problem using the following command: ################# txtfile <- getURL('ftp://ftp.wcc.nrcs.usda.gov/data/snow/snow_course/table/history/idaho/13e19.txt', ftp.use.epsv = FALSE) #################
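A minimal sketch of keeping the line structure: getURL() returns the whole file as one long string, so split it on newlines before parsing the columns:

    library(RCurl)
    txtfile <- getURL('ftp://ftp.wcc.nrcs.usda.gov/data/snow/snow_course/table/history/idaho/13e19.txt',
                      ftp.use.epsv = FALSE)
    lines <- strsplit(txtfile, "\r?\n")[[1]]     # one element per line of the original file
    head(lines)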
2009 Jan 26
2
RCurl unable to download a particular web page -- what is so special about this web page?
Dear R-help, There seems to be a web page I am unable to download using RCurl. I don't understand why it won't download: > library(RCurl) > my.url <- "http://www.nytimes.com/2009/01/07/technology/business-computing/07program.html?_r=2" > getURL(my.url) [1] "" Other web pages are ok to download but this is the first time I have been unable to download a
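A minimal sketch of the usual explanation for an empty string here: the article URL answers with a redirect and wants cookies, and getURL() follows neither by default; enabling followlocation, the cookie engine, and a user agent usually returns the HTML:

    library(RCurl)
    my.url <- "http://www.nytimes.com/2009/01/07/technology/business-computing/07program.html?_r=2"
    html <- getURL(my.url,
                   followlocation = TRUE,              # follow the 30x redirect
                   cookiefile = "",                    # "" enables in-memory cookies
                   useragent  = "Mozilla/5.0 (compatible)")
    nchar(html)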
2013 Aug 25
2
RCurl cookiejar
R-helpers, When I use cURL in the Terminal: curl --cookie-jar cookie.txt --url "http://corpusdelespanol.org/x.asp" --user-agent "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:16.0) Gecko/20100101 Firefox/23.0" --location --include a cookie file "cookie.txt" is saved to my working directory. However, when I try what I think is the equivalent command in R with RCurl:
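A minimal sketch of what is usually taken to be the equivalent RCurl call; as in the earlier cookie threads above, the jar is only written out when the handle is finalized, hence the rm()/gc() at the end:

    library(RCurl)
    curl <- getCurlHandle(cookiejar = "cookie.txt",
                          useragent = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:16.0) Gecko/20100101 Firefox/23.0",
                          followlocation = TRUE,       # --location
                          header = TRUE)               # --include: return headers with the body
    page <- getURL("http://corpusdelespanol.org/x.asp", curl = curl)
    rm(curl); gc()                                     # flushes cookie.txt to the working directory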