search for: lakhs

Displaying 4 results from an estimated 4 matches for "lakhs".

2003 Aug 14
1
Re: Samba vs. Windows : significant difference in timestamp handling?
...e clear that reiserfs is by far the best on all counts. I've read postings that xfs excels with very large files, e.g. movies, but I couldn't see any difference - reiserfs was just as fast). It's incredibly fast at directory manipulation, especially for very large directories with lakhs of files. Very parsimonious too. It doesn't compress data, not yet, but it doesn't waste space like other filesystems. A full reiserfs volume could probably not be restored onto an equally large ext3 volume. It has worked impeccably for about a year. > As I am about to upgrade our nt4 domain...
2010 May 14
0
Garbage collection
...rts and deployed on the server; the reports have some hierarchy levels..... Initially we had a small volume of data in the database table and the reports ran with fine performance... after a few months, the reports got very slow, and we thought this could be because the data in the database table had grown (nearly 11 lakhs of records in the table)..... Then I created an index and the reports showed better performance than before.... after 1 year, the reports are again showing poor performance..... So we decided to monitor memory-related issues and used ''top'' and ''free'' commands to monitor m...
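The post describes watching memory with ''top'' and ''free''. As an illustration only, and assuming purely for the sketch that the long-running process is an R session (the post itself does not say so, and ''free -m'' assumes a Linux host), the same kind of check can be made from inside R:

    x <- matrix(rnorm(1e6), ncol = 10)   # throwaway ~8 Mb object, just for the demo
    print(object.size(x), units = "Mb")  # memory taken by one object
    gc(verbose = TRUE)                   # force a collection and print Ncells/Vcells usage
    system("free -m")                    # same host-level view as the 'free' command above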
2003 Oct 01
2
Help for creating a package
Hi, I created a package for OpenSSH 3.7.1p2 on UNIX (OS: Sun Solaris); the newly created package includes binaries, man pages, libraries and configuration files. I am planning to install this package on all of my Sun servers (approximately 200 servers). An older version of SSH/OpenSSH is already installed and running on all of the Sun boxes. I want to install the
2011 Jun 11
1
Memory(RAM) issues
Dear All, I have been working with R (desktop version) on Vista. I have the latest version, R 2.13.0. I have been working with a few data-sets of size 12 lakh * 15, and my code is quite computing-intensive (applying an MCMC Gibbs sampler to a posterior of 144 variables), so it often runs into a memory issue where memory cannot allocate the vector at full size (shows to have reached something like 1.5 GB)
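For context, the arithmetic behind that limit is simple: one copy of a 12 lakh x 15 numeric matrix already needs on the order of 140 MB, and every extra copy made inside a sampler loop multiplies that. The sketch below is a rough illustration, not the poster's code; the 1000-draw result matrix is a made-up placeholder (only the 144-variable count comes from the post):

    n_rows <- 12e5                         # 12 lakh = 1,200,000 rows
    n_cols <- 15
    bytes  <- n_rows * n_cols * 8          # 8 bytes per double
    cat(sprintf("one copy of the data: %.0f MB\n", bytes / 1024^2))   # roughly 137 MB

    ## Preallocating the result (rather than growing it each iteration) and
    ## calling gc() between chains keeps copies from stacking up toward the
    ## ~1.5 GB figure mentioned in the post.
    draws <- matrix(NA_real_, nrow = 1000, ncol = 144)   # hypothetical: 1000 draws x 144 variables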