similar to: Maximum Likelihood estimation of KB distribution

Displaying 20 results from an estimated 100 matches similar to: "Maximum Likelihood estimation of KB distribution"

2012 Jul 11
0
declaring negative log likelihood of a distribution
Hi everyone! I already posted a question (http://r.789695.n4.nabble.com/Declaring-a-density-function-with-for-loop-td4635699.html) on finding density values of a new binomial-like distribution, which has the following pmf: http://r.789695.n4.nabble.com/file/n4636134/kb.png Thankfully, Berend Hasselman and
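The pmf itself is only in the linked image, so the following is just a generic sketch of declaring a negative log likelihood for a discrete distribution, with dbinom() standing in for the actual KB pmf; the pattern is the same for any pmf that can be evaluated with log = TRUE or wrapped in log():

    # stand-in pmf: the real KB pmf from the linked image would replace dbinom()
    nll <- function(par, x, size) {
      p <- plogis(par)                        # map the free parameter into (0, 1)
      -sum(dbinom(x, size, p, log = TRUE))    # negative log likelihood
    }
    x <- rbinom(50, size = 10, prob = 0.3)    # toy data
    fit <- optimize(nll, c(-5, 5), x = x, size = 10)
    plogis(fit$minimum)                       # ML estimate of the probability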
2009 May 12
2
Kumaraswamy distribution
Dear R users, Does anyone know how to write functions for the Kumaraswamy distribution in R? Functions such as dkumar, pkumar, etc. are not available in base R. Please help. Thanks a lot, Debbie
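The Kumaraswamy(a, b) distribution on (0, 1) has density f(x) = a*b*x^(a-1)*(1-x^a)^(b-1) and CDF F(x) = 1 - (1-x^a)^b, and the CDF inverts in closed form, so the usual d/p/q/r quartet is short to write by hand. A minimal sketch following R's naming convention (contributed packages may also provide these):

    dkumar <- function(x, a, b) a * b * x^(a - 1) * (1 - x^a)^(b - 1)
    pkumar <- function(q, a, b) 1 - (1 - q^a)^b
    qkumar <- function(p, a, b) (1 - (1 - p)^(1 / b))^(1 / a)  # closed-form inverse CDF
    rkumar <- function(n, a, b) qkumar(runif(n), a, b)         # sampling by inversion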
2003 Dec 11
0
Re: [R] chisq.test freezing on certain inputs (PR#5701)
On Thu, 11 Dec 2003, Jeffrey Chang wrote: > Hello everybody, > > I'm running R 1.8.1 on both Linux and OS X, compiled with gcc 3.2.2 and > 3.3, respectively. The following call seems to freeze the interpreter > on both systems: > > chisq.test(matrix(c(233, 580104, 3776, 5786104), 2, 2), > simulate.p.value=TRUE) > > By freeze, I mean the function call never
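For reference, simulate.p.value draws random tables with the same margins (B = 2000 of them by default), so with marginal totals in the millions the call can take a very long time even when it is not strictly hung; lowering B is a quick way to distinguish slow from frozen (a diagnostic sketch, not a fix for the reported bug):

    chisq.test(matrix(c(233, 580104, 3776, 5786104), 2, 2),
               simulate.p.value = TRUE, B = 100)   # far fewer replicates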
2014 Sep 07
0
format(object.size(...), units): KB, MB, and GB instead of Kb, Mb, and Gb?
I cannot remember if this has already been discussed or not, and I'm a bit worried I'm setting off an endless debate. If it's already settled, no need to discuss it further. TOPIC #1: Shouldn't R use KB, MB and GB when reporting sizes in kilobytes, megabytes and gigabytes? More specifically, format() for object_size objects (returned by object.size()) uses Kb, Mb and Gb, which
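For context, a quick illustration of the spelling being discussed (exact output is approximate and version-dependent):

    x <- numeric(1e5)
    format(object.size(x), units = "Kb")    # "781.3 Kb" -- the lower-case-b spelling
    format(object.size(x), units = "auto")  # lets format() pick a convenient unit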
2007 Jan 10
2
problems with optim, "for"-loops and machine precision
Dear R experts, I have been encountering problems with the "optim" routine when used inside "for" loops. I am determining the optimal parameters of several nested models by minimizing the negative log-likelihood (NLL) of a dataset. The aim is to find the model which best describes the data. To this end, I am simulating artificial data sets based on the model with the least number
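A minimal, self-contained sketch of the general pattern being described, with a Gaussian NLL standing in for the nested models (all names here are illustrative):

    # NLL of a toy model; sd is optimized on the log scale to keep it positive
    nll <- function(par, x) -sum(dnorm(x, mean = par[1], sd = exp(par[2]), log = TRUE))
    fits <- vector("list", 10)
    for (i in 1:10) {
      x <- rnorm(100, mean = 5, sd = 2)        # one artificial data set
      fits[[i]] <- optim(c(0, 0), nll, x = x)  # minimize the NLL
    }
    sapply(fits, `[[`, "value")                # optimal NLL for each data set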
2006 Aug 09
1
scaling constant in optim("L-BFGS-B")
Hi all, I am trying to find estimates for the 7 parameters of a model that should fit real data. I have a function for the negative log-likelihood (NLL) of the data. With optim(method="L-BFGS-B", lower=0) I am now minimizing the NLL to find the best-fitting parameters. My problem is that the algorithm does not converge for certain data sets. I have read that one should scale the fn
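For what it's worth, optim()'s control list has parscale exactly for this: optimization is carried out on par/parscale, so parameters of very different magnitudes can be brought onto a comparable scale. A toy sketch with illustrative numbers:

    # one parameter near 1e-3, another near 1e3: poorly scaled for L-BFGS-B
    nll <- function(p) (p[1] - 1e-3)^2 * 1e6 + (p[2] - 1e3)^2 * 1e-6
    optim(c(1e-2, 1e2), nll, method = "L-BFGS-B", lower = c(0, 0),
          control = list(parscale = c(1e-3, 1e3)))  # optim works on p/parscale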
2006 May 29
2
Convert bytes to kb or mb in words
Using File.size(myfile) I can get the size of the uploaded file in _bytes_. Does anybody know of a function to convert bytes into a more human readable format? If I had a wish list for a "file_size_in_words()" function, it would do this: 10752 bytes becomes "10.5 Kilobytes". 2213814 bytes becomes "2.1 Megabytes". 238 bytes becomes "Less Than 1 Kilobyte".
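The question is about another environment, but the requested behaviour is easy to sketch; here is a version in R using the wording and cut-offs from the wish list above (the function name is the hypothetical one the poster proposed):

    file_size_in_words <- function(bytes) {
      if (bytes < 1024) return("Less Than 1 Kilobyte")
      kb <- bytes / 1024
      if (kb < 1024) return(sprintf("%.1f Kilobytes", kb))
      sprintf("%.1f Megabytes", kb / 1024)
    }
    file_size_in_words(10752)    # "10.5 Kilobytes"
    file_size_in_words(2213814)  # "2.1 Megabytes"
    file_size_in_words(238)      # "Less Than 1 Kilobyte"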
2006 Jan 19
0
OS X Finder - Zero KB Available, Write Problems, Uggh
Dear list- On all of our Mac OS X (Samba) clients (10.3, 10.4) I have been experiencing the following problem: users are not able to write to published Samba shares UNLESS the global parameter "Max Disk Size = xxxxxx" is set in the Samba server's config file. I first discovered this issue while setting up a Linux server (using the 2.4 kernel and version 3.0 of Samba) last
2008 Sep 18
0
domU cpuinfo shows only 16 KB ater upgrading to Xen-3.3.0!
Hi folks! After upgrading my Xen-3.2.0 to the new Xen-3.3.0/Linux-2.6.18.8-xen-3.3.0, my domU /proc/cpuinfo shows only:

administrativo@vsrvXX:~$ cat /proc/cpuinfo
processor       : 0
vendor_id       : GenuineIntel
cpu family      : 15
model           : 6
model name      : Intel(R) Pentium(R) D CPU 3.40GHz
stepping        : 4
cpu MHz         : 3391.500
*cache size     : 16 KB*
physical id     : 0
siblings
2004 Oct 04
1
Error: cannot allocate vector of size 1125522 Kb, Reached total allocation of 510Mb
R-help, I'm trying to apply the 'dist' function to a data set consisting of 16975 observations and 5 variables (2 quantitative and 3 categorical). If I call the function on a subset of the data frame everything works fine, but when I go above 3000 observations R either crashes or gives the following error message. Error: cannot allocate vector of size 1125522 Kb In addition: Warning
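The arithmetic behind the failure is straightforward: dist() stores the lower triangle of the distance matrix as doubles, which for 16975 observations is exactly the vector the error message names:

    n <- 16975
    n * (n - 1) / 2             # 144066825 pairwise distances
    n * (n - 1) / 2 * 8 / 1024  # 1125522 Kb, the size in the error message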
2008 May 17
0
autocorrelation in nlme: Error: cannot allocate vector of size 220979 Kb
Dear R community, Below you may find the details of my model (lm11). I receive the error message "Error: cannot allocate vector of size 220979 Kb" after adding the autocorrelation structure with update(lm11, corr=corAR1()). lm11 <- lme(Soil.temp ~ Veg*M + Veg*year, data=a, random = list(Site=pdDiag(~Veg), Plot=pdDiag(~Veg)))
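A rough back-of-the-envelope check (an assumption about the cause, not a diagnosis): corAR1 leads nlme to build dense within-group correlation matrices, and a single m-by-m double matrix for a group of roughly 5300 observations is on the order of the failed allocation:

    m <- 5300
    m^2 * 8 / 1024   # about 219453 Kb, the same order as the 220979 Kb in the error

Restructuring the grouping so that no single group contains thousands of observations is one common way around this.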
2006 Nov 13
0
No mouse and KB activity in FC5 guest OS in FC6 xen host setup
Here is the configuration of an unmodified FC5 guest in an FC6 Xen host setup. I have an iMac running FC6 Xen in x86 mode with FC5 as an unmodified guest. I get a graphical login screen for the unmodified guest which allows me to log in, but then no mouse or keyboard is active after logging into the FC5 guest. What is wrong? I also tried adding the nographic=1, usb=1 and sdl=1 options, re-creating the guest each time. But
2017 May 12
0
Message body is too big: 247741 bytes with a limit of 40 KB
On 11.05.17 17:35, Jesse Molina wrote: > A 40KB limitation on messages is probably inappropriate in the year 2017. Sending mail over 40KB is inappropriate on this kind of mailing list. If you have an attachment, share it via the web; plain text is welcome at pastebin or paste.debian.org (since this list runs on lists.debian.org). > On 5/9/2017 3:09 PM, nut-upsuser-owner at lists.alioth.debian.org wrote:
2008 Jul 09
2
CentOS Patch for http://www.kb.cert.org/vuls/id/800113
Will there be a BIND patch available for this vulnerability, for CentOS 3.9? http://www.kb.cert.org/vuls/id/800113
2006 Dec 05
0
Quotas: KB or MB
Hi, Did Dovecot ever implement quotas in MB, or has it always used quotas in KB? I have RC15 using quotas in KB and beta8 somehow using quotas in MB. Am I going mad? Regards, Richard
1997 Jul 15
0
http://www.microsoft.com/kb/articles/q165/4/03.htm
extract from the above article: With this update installed, connecting to older SMB servers using the Client for Microsoft Networks is no longer possible, because these older servers do not support encryption of passwords sent over the network. The following SMB servers are known not to support password encryption over the network: Samba, LAN Manager for UNIX
1997 Jul 15
0
http://www.microsoft.com/kb/articles/Q166/7/30.htm
this article is technically correct; however, it fails to point out that Samba can be upgraded to support encrypted passwords. See: 15th July 97: Samba and Encrypted Passwords. http://samba.canberra.edu.au/pub/samba/SambaEncryption.html luke extract: After upgrading your Windows NT 4.0 computer to Service Pack 3 (SP3), you are unable to connect to SMB servers (such as Samba or
2004 Sep 30
1
nlme: cannot allocate vector of size 126064 Kb
I have around 4000 observations of a time series. I am trying to fit a regression with an ARMA error structure using gls from the nlme package. I have encountered the error: cannot allocate vector of size 126064 Kb. I know this has come up many times before and I will check the suggestions in the mail archive. I was wondering, though, if there is an alternative package that will fit such a
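One possibility, depending on the model: arima() in base R fits a regression with ARMA errors through its xreg argument using a Kalman filter, which avoids the dense n-by-n correlation matrix that gls() with a corARMA structure allocates (for n around 4000, one such double matrix is already about 125000 Kb, the order of the error above). A sketch with simulated data:

    set.seed(1)
    x <- rnorm(4000)
    y <- 2 * x + arima.sim(list(ar = 0.5, ma = 0.3), n = 4000)
    fit <- arima(y, order = c(1, 0, 1), xreg = x)  # ARMA(1,1) errors, O(n) memory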
2006 Oct 03
1
help: Error: cannot allocate vector of size 12079 Kb
Dear All, I'm running the latest R on WinXP using Rgui.exe --max-mem-size=30000Mb. After reading a huge data file into a data matrix, I tried to take a subset of the data matrix but failed with: Error: cannot allocate vector of size 12079 Kb Any tips for getting out of it? If it helps: > memory.limit() [1] 3145728000 > memory.size() [1] 842735624 Thanks! Best, Cao
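The printed values appear to be in bytes (as older versions of R on Windows reported them), and they suggest the --max-mem-size request was silently capped near the 32-bit Windows ceiling of about 3 GB rather than honoured at 30000 Mb:

    3145728000 / 1024^2   # 3000 Mb -- the limit actually granted
    842735624 / 1024^2    # roughly 804 Mb already in use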
2008 May 16
1
autocorrelation error: cannot allocate vector of size 220979 Kb
Dear R community, I used a linear mixed model (named lm11) to model daily soil temperature as a function of vegetation cover and air temperature. I have almost 17,000 observations covering six years. I cannot account for autocorrelation in my model, since I receive the following error message after applying update(lm11, corr=corAR1()): Error: cannot allocate vector of size 220979 Kb Do