similar to: Script auto-detecting its own path

Displaying 20 results from an estimated 2000 matches similar to: "Script auto-detecting its own path"

2009 Mar 11
3
Matrix Construction; Subdiagonal
I'm trying to enter a vector into the subdiagonal of a matrix but cannot find a command in R which corresponds to the MatLab version of diag(vec, k), where vec = the vector of interest, and k = the diagonal (k=0 for the diagonal; k=-1 for the subdiagonal; k=1 for superdiagonal, etc.) Is there an equivalent command in R? I'm looking for something like this: vec = seq(1, 5, 1)
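Base R has no direct equivalent of MATLAB's diag(vec, k), but an index trick gets the same effect; a minimal sketch (the use of a 6 x 6 zero matrix is an assumption about what the poster wants):

    vec <- seq(1, 5, 1)
    n <- length(vec) + 1                    # a k = -1 band of length 5 needs a 6 x 6 matrix
    m <- matrix(0, n, n)
    m[row(m) == col(m) + 1] <- vec          # rows one below the diagonal: the subdiagonal (k = -1)
    m[row(m) == col(m) - 1] <- vec          # superdiagonal (k = 1), for comparison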
2003 Jul 31
5
Wu-ftpd FTP server contains remotely exploitable off-by-one bug
Hello, I see in BugTraq that there's yet another problem with Wu-ftpd, but I see no mention of it in the freebsd-security mailing list archives...I have searched the indexes from all of June and July. Wu is pretty widely used, so I'm surprised that nobody seems to have mentioned this problem in this forum. The notice on BugTraq mentioned only Linux, not FreeBSD, but that's no
2005 Jul 19
1
Minor "bug" in source()
For R v2.1.1 patched and R v2.2.0 devel: Calling source(file, chdir=TRUE) with is.character(file) != TRUE, that is, with 'file' as a connection, will generate an error. Example: > file <- textConnection("cat('Hello world\n')") > source(file, chdir=TRUE) Error in source(file, chdir = TRUE) : Object "ofile" not found Of course, it does not make
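As a minimal sketch of the point being made, chdir only makes sense for a real file path, so a guard like the hypothetical wrapper below (not the eventual fix in R itself) avoids the error:

    src <- function(file, ...) {
      # chdir = TRUE needs a path to change into; a connection has none
      if (is.character(file)) source(file, chdir = TRUE, ...) else source(file, ...)
    }
    con <- textConnection("cat('Hello world\n')")
    src(con)      # runs without touching the working directory
    close(con)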
2007 Dec 19
1
Function reference
Hi. I'm looking for an R equivalent to something like function pointers in C/C++. I have a search procedure that evaluates the fitness of each point it reaches as it moves along, and decides where to move next based on its fitness evaluation. I want to be able to pass different fitness functions to this procedure. I am trying to find a good way to do this. I was thinking of passing in
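Functions are ordinary values in R, so they can be passed as arguments directly; a minimal sketch (the search routine and fitness functions below are made up for illustration):

    fitness_sq <- function(x) sum(x^2)                 # one possible fitness function

    hill_search <- function(start, fitness, steps = 100) {
      best <- start
      for (i in seq_len(steps)) {
        cand <- best + rnorm(length(best), sd = 0.1)
        if (fitness(cand) < fitness(best)) best <- cand   # move only if fitness improves
      }
      best
    }

    hill_search(c(1, 1), fitness_sq)                    # pass the function itself, no quotes
    hill_search(c(1, 1), function(x) sum(abs(x)))       # or an anonymous function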
2005 Jul 01
4
Lines for plot (Sweave)
Dear List: I am generating a series of plots iteratively using Sweave. In short, a dataframe is subsetted row by row and variable graphics are created conditional on the data in each row. In this particular case, this code ends up generating 17,000 individual plots. In some cases, all student data (this is working with student achievement data) are available and my code below works very well in
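The row-by-row subsetting and plotting step might look roughly like the sketch below (hypothetical score columns; the poster's actual Sweave code is not shown in the snippet):

    scores <- data.frame(id = 1:3, math = c(450, 520, 610), reading = c(480, 500, 590))
    for (i in seq_len(nrow(scores))) {
      row <- scores[i, ]
      pdf(sprintf("student-%03d.pdf", row$id))          # one file per student
      barplot(unlist(row[c("math", "reading")]),
              main = paste("Student", row$id), ylim = c(0, 700))
      dev.off()
    }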
2010 Mar 10
1
Strange result in survey package: svyvar
Hi R users, I'm using the survey package to calculate summary statistics for a large health survey (the Demographic and Health Survey for Honduras, 2006), and when I try to calculate the variances for several variables, I get negative numbers. I thought it may be my data, so I ran the example on the help page: data(api) ## one-stage cluster sample dclus1<-svydesign(id=~dnum, weights=~pw,
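For reference, the help-page example referred to looks roughly like this (reconstructed from memory of the survey package documentation, so treat the details as an assumption):

    library(survey)
    data(api)                                     # school survey data shipped with the package
    dclus1 <- svydesign(id = ~dnum, weights = ~pw, data = apiclus1, fpc = ~fpc)
    svyvar(~api00 + api99, dclus1)                # variance-covariance estimate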
2014 Dec 03
2
[PATCH] test_compression.sh
* Use `mktemp` instead of playing with date(1). * Use -f instead of removing the file every time. * "echo ERROR; exit 1" is what die() is for. * Some cosmetic renamings ('k' to 'comp' for compression etc). * Remove the MacOSX comment. It's not MacOSX specific, and it's not a problem anyway. The number behaves just right. * Remove the $((${size}+10)). It's
2013 Jan 23
1
Evaluating the significance of the random effects in GLMM
Hi all! I am working with a GLMM using the binomial family. I use the following code: I dropped non-significant terms, refitting the model and comparing the changes with a likelihood-ratio test: G.1<-lmer(data$Ymat~stu+spi+stu*spi+(1|ber),data=data,family="binomial") G.1b<-lmer(data$Ymat~stu+spi+(1|ber),data=data,family="binomial") anova(G.1, G.1b) But, when I want to evaluate the
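One common (if rough) approach to the random-effect question is a likelihood-ratio comparison against the model without the grouping term; a minimal sketch with made-up data, assuming a current lme4 where the binomial GLMM fitter is glmer():

    library(lme4)
    d <- data.frame(y   = rbinom(200, 1, 0.5),
                    stu = rnorm(200), spi = rnorm(200),
                    ber = factor(rep(1:20, each = 10)))
    m.mixed <- glmer(y ~ stu * spi + (1 | ber), data = d, family = binomial)
    m.fixed <- glm(y ~ stu * spi, data = d, family = binomial)
    lr <- as.numeric(2 * (logLik(m.mixed) - logLik(m.fixed)))
    pchisq(lr, df = 1, lower.tail = FALSE) / 2   # halved: the variance is tested on its boundary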
2000 Jul 06
1
R 1.1.0 dev.print()
Hi, I just upgraded to 1.1.0 from 1.0.1 this morning on my OSF/1 machine. I now have problems with the following code: %E /tmp 43% R --vanilla Version 1.1.0 (June 15, 2000) ... > test2 <- function () { plot(runif(30)) ofile <- "/tmp/newfile.ps" dev.print(file = ofile) } + + + + + > > test2() Error in device(...) : Object "ofile" not found However, if
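One workaround for that lazy-evaluation problem (a sketch, not the official fix) is to open the file device directly instead of going through dev.print():

    test2 <- function() {
      plot(runif(30))
      ofile <- "/tmp/newfile.ps"
      dev.copy(postscript, file = ofile)   # copy the screen plot to a postscript device
      dev.off()                            # close the postscript device, writing the file
    }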
2010 Jun 11
4
setting the current working directory to the location of the source file
AFAIK a script run through source() does not have any legit way to learn about its own location. I need this to make sure that the script will find its datafiles after I move the whole directory. (The datafiles are in the same directory.) Here is a hack I invented to work around it: print(getwd()) source_pathname = get("ofile",envir = parent.frame()) source_dirname =
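Two sketches of the idea. The cleanest route is for the caller to use source(file, chdir = TRUE), so getwd() inside the script is already the script's directory; when that is not possible, the hack above can be wrapped as below (note that 'ofile' is an internal variable of source(), not a documented interface, and the data file name is hypothetical):

    this_script_dir <- function() {
      # walk the call stack looking for the frame that source() created
      frames <- sys.frames()
      for (i in rev(seq_along(frames))) {
        ofile <- frames[[i]]$ofile
        if (!is.null(ofile) && is.character(ofile))
          return(dirname(normalizePath(ofile)))
      }
      getwd()                               # fallback when not running via source()
    }
    dat <- file.path(this_script_dir(), "data.csv")   # hypothetical data file in the same dir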
2016 May 23
7
[PATCH 1/5] mllib: make external_command echo the command executed
Add an optional parameter to disable this behaviour, so the Curl module in v2v won't print user-sensitive data (like passwords). --- builder/checksums.ml | 1 - builder/downloader.ml | 1 - builder/sigchecker.ml | 1 - mllib/common_utils.ml | 4 +++- mllib/common_utils.mli | 7 +++++-- v2v/curl.ml | 2 +- 6 files changed, 9 insertions(+), 7 deletions(-) diff --git
2009 Jun 16
3
The most straightforward way to write a function that sums over the rows of a matrix
Hello! I am trying to write a function with vector and data.frame parameters that uses the sum() function and values from the rows of the data.frame. I need to pass this function as a parameter to optim(). My starting point is: observs <- data.frame(y, x1, x2, x3) Fn <- function(par, observs) { sum( (y - (par[1] * (x1 + 1) * x2^(-par[2]) * x3^par[3])^2 ) }
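A minimal sketch of the objective written so it pulls columns from the data frame and returns a single number (the placement of the squaring and the starting values are assumptions, since the original snippet has unbalanced parentheses):

    Fn <- function(par, observs) {
      with(observs, sum((y - par[1] * (x1 + 1) * x2^(-par[2]) * x3^par[3])^2))
    }
    observs <- data.frame(y  = runif(20),
                          x1 = runif(20), x2 = runif(20) + 1, x3 = runif(20) + 1)
    optim(c(1, 0.5, 0.5), Fn, observs = observs)   # extra arguments are passed through to Fn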
2011 Jun 15
12
Create or Update Model with JSON?
Having an issue updating a model with JSON. I simply do *"model.attributes=json"*, which does update the attributes but not the id, or uuid in this case. I want to save or create the record (if its id exists). Seems like this would be documented, but I'm not finding anything. I'm using a MOM to pass JSON between apps. Thanks.
2019 Apr 16
2
Measure network bandwidth per process
Hi, Is there a way to measure network bandwidth per process in CentOS Linux release 7.6.1810 (Core) using any utility? I was reading about nethogs, but it does not have the option to run in daemon mode so that we can look at historical data to figure out which process was consuming high network bandwidth; instead, it is a good tool for live monitoring. Please suggest. Thanks in
2009 Jun 16
2
Trouble with optim on a specific problem
Hello! I am getting the following errors when running optim() [I tried optim() with 3 different methods as you can see]: Error in optim(c(0.66, 0.999, 0.064), pe, NULL, method = "L-BFGS-B") : objective function in optim evaluates to length 6 not 1 > out <- optim( c(0.66, 0.999, 0.064), pe, NULL, method = "Nelder-Mead") Error in optim(c(0.66, 0.999, 0.064),
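That error just means the objective returned a vector instead of one number; a minimal illustration (the poster's pe() function itself is not shown in the snippet):

    bad_obj  <- function(par) (par - c(0.66, 0.999, 0.064))^2        # length 3: optim() errors
    good_obj <- function(par) sum((par - c(0.66, 0.999, 0.064))^2)   # length 1: optim() works
    optim(c(0, 0, 0), good_obj, method = "L-BFGS-B")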
2010 Nov 29
1
map() and pdf clipping
Hello, Below is a function (test.map) that permits drawing the same map using three different devices. The "pdf" device doesn't clip polygons to the plot region, as I see is done by both the native device (in my case "Quartz") and the "png" device. test.map("pdf") # produces "test-map.pdf" with no clipping test.map("png") #
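The poster's test.map() is not shown in full; a minimal stand-in for comparing clipping across devices might look like this (the polygon deliberately extends past the plot region):

    test_clip <- function(dev = c("pdf", "png")) {
      dev <- match.arg(dev)
      if (dev == "pdf") pdf("clip-test.pdf") else png("clip-test.png")
      plot(0:1, 0:1, type = "n")
      # triangle pokes well outside the plot region; a clipping device should cut it off
      polygon(c(-0.5, 1.5, 0.5), c(-0.5, -0.5, 1.5), col = "grey", border = "black")
      dev.off()
    }
    test_clip("pdf"); test_clip("png")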
2007 Apr 23
1
Bug in R 2.4.1 ?
Hello everybody, I'm using hdf5 files to store results from intermediate calculations. These are usually part of a list, called "res". As I want the hdf files to contain all the members of res in their top "directory", I used to do attach(res) do.call("hdf5save", args=c(fileout=file.path(dir, ofile), as.list(names(res)))) detach(res) which did what I
2014 Dec 03
7
[PATCH] Improve LPC order guess
Hi, This patch improves compression a very tiny bit on average, but up to 0.1 percentage point for classical music. I haven't found any tracks that show worsening compression with this patch. [Attachment scrubbed: 0001-Improve-LPC-order-guess.patch, text/x-patch]
2007 Dec 02
3
documenting your progress
Hello all: I have a function that writes a fairly elaborate report based on some survey data. For documentation and bookkeeping purposes, I'd like to write out in the report the function call that produced the report, or at least enough information to help me recreate the steps that led to that report. I've been generating all the reports with scripts, in order to be able to recreate
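One lightweight approach is to capture the call with match.call() (or sys.call()) inside the report function and write its deparsed text into the output; a minimal sketch with hypothetical arguments and file names:

    make_report <- function(survey_data, title = "Survey report", outfile = "report.txt") {
      header <- c(paste("Generated by:", deparse(match.call())),   # the exact call used
                  paste("Generated on:", format(Sys.time())))
      writeLines(c(header,
                   paste("Title:", title),
                   paste("Rows of survey data:", nrow(survey_data))), outfile)
    }
    make_report(mtcars, title = "Demo run")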
2005 May 30
4
Very simple traffic shaping script for H.323
Hello - What I want to do seems very simple - I want to make sure any H.323 traffic gets processed before anything else entering or leaving this network. The network has a videoconferencing device on the LAN at 192.168.16.4. A Linux firewall NATs an external IP Address to this internal address and I have appropriate SNAT and DNAT rules that work. The NAT and connection tracking rules all work