search for: uncoupledlhs

Displaying 3 results from an estimated 3 matches for "uncoupledlhs".

2017 May 27
2
Latin Hypercube Sampling when parameters are defined according to specific probability distributions
...tributions with different rates?
> Here is the code used to perform a LHS when the parameter "dispersal distance" is defined by one default value in the model:
> library(pse)
> factors <- c("distance")
> q <- c("qexp")
> q.arg <- list( list(rate=1/30) )
> uncoupledLHS <- LHS(model=NULL, factors, 50, q, q.arg)
> head(uncoupledLHS)
> Thanks a lot for your time.
> Have a nice day
> Nell

Nell, I would like to suggest a slightly different method for generating the sample using the lhs library, then I will try using the pse library. Generally when you h...
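The lhs-based alternative the reply mentions is cut off at the end of this snippet, so what follows is only a minimal sketch of that general approach, assuming the lhs package's randomLHS() plus a quantile transform with the same rate = 1/30 as the quoted pse call; it is not the replier's actual code.

library(lhs)

set.seed(1)                       # reproducible draw, purely for illustration
n <- 50                           # same sample size as the quoted pse call
unif_sample <- randomLHS(n, 1)    # n stratified uniform(0,1) draws, one column per factor
# Map the uniform margin onto the exponential "dispersal distance" distribution
distance <- qexp(unif_sample[, 1], rate = 1/30)
head(distance)

The idea is the one behind both packages: the Latin hypercube is drawn on uniform(0,1) margins and each column is pushed through the matching quantile function (here qexp) to obtain the desired marginal distribution.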
2017 Jun 01
1
Latin Hypercube Sampling when parameters are defined according to specific probability distributions
...o one model simulation, I should have a value generated by the LHS for all distance classes at the first line of the data frame.

library(pse)
q <- list("qexp", "qunif", "qunif")
q.arg <- list(list(rate=exponential_rate), list(min=0, max=1), list(min=0, max=1))
uncoupledLHS <- LHS(model=model_function, input_parameters, N, q, q.arg)
hist(uncoupledLHS$data$dispersal_distance, breaks=10)
tabLHS <- get.data(uncoupledLHS)

Sorry, it's the first time that I perform a sensitivity analysis using the LHS. Thank you very much for your time. Have a nice day Nell...
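In the quoted call, model_function, input_parameters, N and exponential_rate are the poster's own placeholders, so the snippet does not run as shown. A minimal self-contained version in the same spirit, with hypothetical factor names and a toy model standing in for the real simulation, could look like this:

library(pse)

factors <- c("dispersal_distance", "p_establish", "p_survive")   # hypothetical factor names
q       <- c("qexp", "qunif", "qunif")
q.arg   <- list(list(rate = 1/30),                                # assumed rate, for illustration
                list(min = 0, max = 1),
                list(min = 0, max = 1))

# pse passes the whole sample to the model as a data.frame, one row per
# parameter combination, and expects one response per row back.
toy_model <- function(dat) {
  dat$dispersal_distance * dat$p_establish * dat$p_survive        # stand-in for the real model
}

coupledLHS <- LHS(model = toy_model, factors = factors, N = 50, q = q, q.arg = q.arg)

tabLHS <- get.data(coupledLHS)            # the N x 3 data.frame of sampled parameter values
hist(tabLHS$dispersal_distance, breaks = 10)

Each row of tabLHS is the parameter set used for one model run, which is what the poster describes wanting: a value for every factor on each line of the data frame.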
2017 Jun 01
0
Latin Hypercube Sampling when parameters are defined according to specific probability distributions
...a value generated by the LHS for all distance classes at the first line of the data frame.
>
> library(pse)
> q <- list("qexp", "qunif", "qunif")
> q.arg <- list(list(rate=exponential_rate), list(min=0, max=1), list(min=0, max=1))
> uncoupledLHS <- LHS(model=model_function, input_parameters, N, q, q.arg)
> hist(uncoupledLHS$data$dispersal_distance, breaks=10)
>
> tabLHS <- get.data(uncoupledLHS)
>
> Sorry, it's the first time that I perform a sensitivity analysis using the LHS.
>
> Thank you very m...