Displaying 20 results from an estimated 100 matches similar to: "Currency return calculations"
2011 Jan 07
0
Odp: Currency return calculations
My mistake, sir. I was so engrossed in my own logic that I overlooked the simple and very effective solution you had offered. Sorry once again, sir; I will certainly try to be more careful in future.
Thanks again, and have a great weekend, sir.
Regards
Amelia
--- On Fri, 7/1/11, Petr PIKAL <petr.pikal@precheza.cz> wrote:
From: Petr PIKAL
2011 Jan 06
1
Calculating returns
Dear R forum helpers,
I have the following data:
trans <- data.frame(currency_transacted = c("EURO", "USD", "USD", "GBP", "USD", "AUD"),
                    position_amt = c(10000, 25000, 20000, 15000, 22000, 30000))
date <- c("12/31/2010", "12/30/2010", "12/29/2010", "12/28/2010", "12/27/2010",
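Since the mail is cut off here, the following is a minimal sketch of what a per-currency return calculation might look like; the fx data frame, its rate column, and all of its numbers are illustrative placeholders rather than values from the thread.
fx <- data.frame(
  date = as.Date(c("12/27/2010", "12/28/2010", "12/29/2010", "12/30/2010", "12/31/2010"),
                 format = "%m/%d/%Y"),
  rate = c(1.3120, 1.3110, 1.3220, 1.3250, 1.3380)   # hypothetical EUR/USD-style quotes
)
fx <- fx[order(fx$date), ]                             # make sure dates are ascending
fx$return <- c(NA, diff(fx$rate) / head(fx$rate, -1)) # simple daily return: r_t = p_t / p_{t-1} - 1
fx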
2011 Jan 07
1
Calculating Returns: (Extremely sorry for earlier incomplete mail)
Dear R forum helpers,
I am extremely sorry that you received my incomplete mail yesterday. There was a connectivity problem at my end, so I sent the mail from my cell phone, only to realize today how the mail was actually transmitted. I am now resending my complete mail through the regular channel and sincerely apologize for the inconvenience caused.
## Here is my actual mail
Dear R
2009 Feb 08
0
Initial values of the parameters of a GARCH model
Dear all,
I'm using R 2.8.1 under Windows Vista on a 2.4 GHz dual-core machine with 4 GB
of RAM.
I'm trying to reproduce a result out of "Analysis of Financial Time
Series" by Ruey Tsay.
In R I'm using the fGarch library.
After fitting an AR(3)-GARCH(1,1) model
> model<-garchFit(~arma(3,0)+garch(1,1), analyse)
I'm saving the results via
> result<-model
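As a rough sketch of the fitting step being described (not the book's or the poster's exact data), one way to reproduce the call with the fGarch package on a simulated series would be:
library(fGarch)
set.seed(1)
# simulate an AR(3)+GARCH(1,1) process; the parameter values are arbitrary choices
spec <- garchSpec(model = list(ar = c(0.3, -0.1, 0.05),
                               omega = 1e-5, alpha = 0.1, beta = 0.85))
analyse <- garchSim(spec, n = 1000)
model <- garchFit(~ arma(3, 0) + garch(1, 1), data = analyse, trace = FALSE)
coef(model)   # fitted parameters (mu, ar1-ar3, omega, alpha1, beta1)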
2006 Jun 18
13
Currency calculation
I'm thinking of experimenting with some currency conversion. However,
I'd like the conversions to stay in sync with the current rates.
Does anyone know (and this may be out in left field) if there is some online
(perhaps XML) or other data stream I can connect to in my code to
output values based on a user's selection?
TIA
Stuart
2007 Sep 18
0
[LLVMdev] 2.1 Pre-Release Available (testers needed)
On Fri, Sep 14, 2007 at 11:42:18PM -0700, Tanya Lattner wrote:
> The 2.1 pre-release (version 1) is available for testing:
> http://llvm.org/prereleases/2.1/version1/
>
> [...]
>
> 2) Download llvm-2.1, llvm-test-2.1, and the llvm-gcc4.0 source.
> Compile everything. Run "make check" and the full llvm-test suite
> (make TEST=nightly report).
>
> Send
1998 Nov 16
5
Solaris make for 0.63 failing
I shouldn't try to do this on Monday morning. Can anyone suggest why the make
for R 0.63 is failing for me under Solaris (SunOS 5.6)?
Paul Gilbert
...
creating src/scripts/html2dos
creating tests/Makefile
creating tests/Examples/Makefile
creating src/include/Platform.h
R is now configured for sparc-sun-solaris2.6
Source directory: .
Installation directory: /usr/local
C
2008 Feb 03
0
[LLVMdev] 2.2 Prerelease available for testing
Target: FreeBSD 6.2-STABLE on i386
autoconf says:
configure:2122: checking build system type
configure:2140: result: i386-unknown-freebsd6.2
[...]
configure:2721: gcc -v >&5
Using built-in specs.
Configured with: FreeBSD/i386 system compiler
Thread model: posix
gcc version 3.4.6 [FreeBSD] 20060305
[...]
objdir != srcdir, for both llvm and gcc.
Release build.
llvm-gcc 4.2 from source.
2008 Apr 30
2
fft: characteristic function to distribution
The characteristic function is the Fourier transform of the probability
density function, and the density is recovered from it by the inverse
transform. The characteristic function of a standard normally distributed
random variable is exp(-t^2/2).
x=seq(-2,2,length=100)
fft(pnorm(x),inverse=T)/length(x)
exp(-x^2/2)
Why isn't the result of the inverse fft the same as the function mentioned above?
Thanks for help,
Thomas
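For what it is worth, a minimal sketch of a numerical inversion that does recover the standard normal density from exp(-t^2/2) is shown below; the grid sizes n and dx are arbitrary illustrative choices, and the (-1)^k factors simply re-centre both FFT grids on zero.
n   <- 1024
dx  <- 0.05
x   <- (seq_len(n) - 1 - n/2) * dx           # spatial grid centred on 0
dt  <- 2 * pi / (n * dx)
tt  <- (seq_len(n) - 1 - n/2) * dt           # frequency grid centred on 0
phi <- exp(-tt^2 / 2)                        # characteristic function of N(0, 1)
sgn <- (-1)^(seq_len(n) - 1)                 # origin shift for both grids
dens <- Re(sgn * fft(sgn * phi)) / (n * dx)  # discrete version of (1/2pi) integral e^{-itx} phi(t) dt
max(abs(dens - dnorm(x)))                    # should be very close to zero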
2008 Jan 24
6
[LLVMdev] 2.2 Prerelease available for testing
LLVMers,
The 2.2 prerelease is now available for testing:
http://llvm.org/prereleases/2.2/
If anyone can help test this release, I ask that you do the following:
1) Build llvm and llvm-gcc (or use a binary). You may build release
(default) or debug. You may pick llvm-gcc-4.0, llvm-gcc-4.2, or both.
2) Run 'make check'.
3) In llvm-test, run 'make TEST=nightly report'.
4) When
2008 Mar 06
2
How to hold a value (Mean Sq) with a string
Hi all:
Can someone advise me on how to store the residual Mean Sq value in a variable
so it can be used in other calculations?
I was trying something like this:
Msquare <- dfr$Mean sq
but it fails. Thanks.
dfr <- read.table(textConnection("percentQ Efficiency
1.565 0.0125
1.94 0.0213
0.876 0.003736
1.027 0.006
1.536 0.0148
1.536 0.0162
2.607 0.02
1.456 0.0157
2.16 0.0103
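Assuming the "Mean Sq" in question comes from an ANOVA table of a model fitted to these data, a minimal sketch would be the following; the lm() formula is a guess, not from the original mail, and only the rows visible above are used.
dfr <- read.table(textConnection("percentQ Efficiency
1.565 0.0125
1.94  0.0213
0.876 0.003736
1.027 0.006
1.536 0.0148
1.536 0.0162
2.607 0.02
1.456 0.0157
2.16  0.0103"), header = TRUE)
fit  <- lm(Efficiency ~ percentQ, data = dfr)
atab <- anova(fit)                          # a data frame with a "Mean Sq" column
msq_resid <- atab["Residuals", "Mean Sq"]   # names with spaces need [ , "Mean Sq"] or atab$`Mean Sq`
msq_resid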
2011 Mar 16
2
Removing Bad Data
I created a couple of timeSeries objects; when I was merging them, I got an
error.
Looking at the data, I see that one of the time series has
06/30/2007 0.0028 0.0183 0.0122 0.0042 0.0095 -
07/31/2007 -0.0111 0.0255 0.0096 -0.0069 -0.0024 0.0043
08/31/2007 -0.0108 -0.0237 -0.0062 -0.0138 -0.0173 -0.0065
09/30/2007
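One guess at a workaround (an assumption, not something from the original thread): read the raw series with the stray "-" entries treated as missing values, so the offending column stays numeric before the timeSeries objects are merged.
txt <- "06/30/2007  0.0028  0.0183  0.0122  0.0042  0.0095  -
07/31/2007 -0.0111  0.0255  0.0096 -0.0069 -0.0024  0.0043
08/31/2007 -0.0108 -0.0237 -0.0062 -0.0138 -0.0173 -0.0065"
raw <- read.table(text = txt, na.strings = "-",
                  col.names = c("date", paste0("s", 1:6)))   # s1..s6 are placeholder names
library(timeSeries)
ts1 <- timeSeries(as.matrix(raw[, -1]),
                  charvec = format(as.Date(raw$date, format = "%m/%d/%Y")))
ts1   # the bad value is now NA instead of "-"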
2008 Mar 25
3
Output of order() incorrectly ordered?
Hello,
I have a data frame consisting of four columns and would like to sort
based on the first column and then write the sorted data frame to a
file.
> df <- read.table("file.txt", sep="\t")
where file.txt is simply a tab-delimited file containing four columns of
data (the first two numeric, the last two character). I then do,
> df_ordered <- df[order(df$V1), ]
OR,
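A common cause of this surprise is V1 being read as character/factor, in which case order() sorts lexicographically ("10" before "2"). A minimal sketch under that assumption (the colClasses and output file name are illustrative):
df <- read.table("file.txt", sep = "\t",
                 colClasses = c("numeric", "numeric", "character", "character"))
df_ordered <- df[order(df$V1), ]             # numeric ordering once V1 really is numeric
write.table(df_ordered, "file_sorted.txt", sep = "\t",
            quote = FALSE, row.names = FALSE, col.names = FALSE)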
2012 Aug 07
3
NADA Package: Referencing Data Frame Columns
The sample data sets that come with the NADA package are limited to one or
two variables and a censored-measurement indicator column. I have tried to mimic the
examples using my own data but keep missing the target.
My water chemistry data is available in two formats: long (as seen in a
database table) and wide (as seen in a spreadsheet). The two structures are:
str(chem)
'data.frame': 65349 obs. of
2012 Jan 19
8
sumarizar
Hi! I have some exchange-rate data ordered by date (days), which I have
converted to a YYYY-MM-DD format where DD is always 01:
EUR.resto$date<-as.Date(EUR.resto$date)
EUR.resto$mo <- substr(EUR.resto$date,6,7)
EUR.resto$yr <- substr(EUR.resto$date, 1,4)
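Assuming the goal is a monthly summary of the series, a minimal sketch using the mo/yr columns built above (the value column and its numbers are hypothetical):
EUR.resto <- data.frame(date  = as.Date(c("2011-01-03", "2011-01-04", "2011-02-01")),
                        value = c(1.29, 1.30, 1.37))
EUR.resto$mo <- format(EUR.resto$date, "%m")   # equivalent to the substr() calls above
EUR.resto$yr <- format(EUR.resto$date, "%Y")
aggregate(value ~ yr + mo, data = EUR.resto, FUN = mean)   # one mean per year-month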
2004 Apr 27
1
'R CMD build' fails when there are spaces in the path (PR#6830)
Full_Name: Byron Ellis
Version: R 1.9.0 (and 2.0.0)
OS: Linux (Redhat Fedora Core)
Submission from: (NULL) (140.247.241.197)
It appears that `R CMD build` cannot handle spaces in the path when building
packages for distribution. For instance:
[ellis@net-78815 ~/Bayesian Networks]$ R CMD build bnsl
* checking for file 'bnsl/DESCRIPTION' ... OK
* preparing 'bnsl':
* cleaning src
*
2010 Mar 29
1
Question about 'logit' and 'mlogit' in Zelig
I'm running a multinomial logit in R using the Zelig package. According to str(trade962a), my dependent variable is a factor with three levels. When I run the multinomial logit I get an error message; however, when I run it with 'model = logit' it works fine. Any ideas on what's wrong?
## MULTINOMIAL LOGIT
anes96two <- zelig(trade962a ~ age962 + education962 + personal962 + economy962 +
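Without the rest of the Zelig call it is hard to say what fails, but as a stand-in (this is nnet::multinom, not the poster's Zelig code) a three-level factor response can be fitted like this; the data frame and its values are simulated for illustration.
library(nnet)
set.seed(1)
d <- data.frame(trade962a    = factor(sample(c("oppose", "neutral", "favor"), 200, replace = TRUE)),
                age962       = rnorm(200, 45, 12),
                education962 = rnorm(200, 13, 2))
fit <- multinom(trade962a ~ age962 + education962, data = d, trace = FALSE)
summary(fit)   # one set of coefficients per non-baseline level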
2007 May 23
1
I made some progress on my previous "systemfit" question but still not quite there
Surprisingly, when I played around with some test code, the code below actually
creates equations that look correct.
tempmat <- matrix(10, nrow = 6, ncol = 6)
restrictmat <- diag(6)
colnames(tempmat) <- c("AUD.l1", "CHF.l1", "CAD.l1", "GBP.l1", "EUR.l1", "JPY.l1")
2010 May 11
1
kernel density to smooth plots
Hi r-sers,
I have data of relative frequencies for the intervals 0-20, 20-40, ..., 380-400. I would like to plot the two series on the same graph with a common x-axis. My question is how to get a smooth curve using a kernel density approach, if that is possible for this kind of data.
> cbind(rel_obs,rel_gen)
rel_obs rel_gen
[1,] 0.000000000 0.0000
[2,] 0.092534175 0.0712
[3,] 0.105152471 0.1092
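One possible approach (an assumption about what is wanted): treat each bin midpoint as a point mass weighted by its relative frequency and smooth with density(..., weights = ...). Only the first three weights below come from the snippet; the rest are made-up fillers chosen so each vector sums to 1.
mids    <- seq(10, 390, by = 20)                     # midpoints of 0-20, 20-40, ..., 380-400
rel_obs <- c(0, 0.0925, 0.1052, rep(0.8023/17, 17))  # hypothetical tail values
rel_gen <- c(0, 0.0712, 0.1092, rep(0.8196/17, 17))
d_obs <- density(mids, weights = rel_obs, bw = 20, from = 0, to = 400)
d_gen <- density(mids, weights = rel_gen, bw = 20, from = 0, to = 400)
plot(d_obs, main = "Smoothed relative frequencies", xlab = "interval midpoint")
lines(d_gen, lty = 2)
legend("topright", legend = c("rel_obs", "rel_gen"), lty = 1:2)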
2008 Mar 24
1
Great difference for piecewise linear function between R and SAS
Dear Rusers,
I am using both R and SAS to fit piecewise linear functions, and what
surprised me is that they give very different results. See below.
# R code -- knots for distance are 16.13 and 24, respectively, and knots for y
# are -0.4357 and -0.3202
m.glm<-glm(mark~x+poly(elevation,2)+bs(distance,degree=1,knots=c(16.13,24))
+bs(y,degree=1,knots=c(-0.4357,-0.3202
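Since the model statement is cut off, here is a minimal self-contained sketch of the piecewise linear idea for the distance term only (simulated data, not the poster's): bs(degree = 1) with interior knots spans the same space as explicit hinge terms, so the two fits below should give essentially identical fitted values.
library(splines)
set.seed(1)
distance <- runif(200, 0, 40)
mark <- rbinom(200, 1, plogis(-1 + 0.1 * pmin(distance, 24)))   # toy binary response
m1 <- glm(mark ~ bs(distance, degree = 1, knots = c(16.13, 24)), family = binomial)
m2 <- glm(mark ~ distance + pmax(distance - 16.13, 0) + pmax(distance - 24, 0),
          family = binomial)
max(abs(fitted(m1) - fitted(m2)))   # should be near zero: same model, different basis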