Displaying 20 results from an estimated 31 matches for "512x512".
2015 Sep 07
2
[PATCH mesa 2/3] nv30: Fix color resolving for nv3x cards
May I ask why you're doing 512x512 instead of 1024x1024? These are
already scaled-up coordinates, so 1024x1024 should work, no? Or is it
because of the seams on the edges? Do those not also appear with
512x512, or does it sample outside of the box?
Separately, why not use this approach on nv40 as well? I can't imagine
the blitter...
2003 Nov 24
2
Questions on Random Forest
Hi, everyone,
I am a newbie to R. Now I want to do image pixel classification by random
forest, but I do not have a clear understanding of random forests. Here are
some questions:
Take an image, for example of size 512x512, with only one variable
-- the gray level. The histogram of the image looks like a Gaussian mixture model,
say Gaussian distributions (u1,sigma1), (u2,sigma2), (u3,sigma3). The image is
classified by K-means or the EM algorithm, so the class label image is also
512x512 and takes the values 0, 1, 2.
I read the bina...
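The question maps directly onto a per-pixel classifier. A minimal Python sketch, with scikit-learn's RandomForestClassifier standing in for R's randomForest, and a small synthetic image (three Gaussian components, K-means-style labels 0/1/2) standing in for the 512x512 one:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for the gray-level image: three Gaussian
# components (u1,sigma1)... and a label image with values 0/1/2
labels = rng.integers(0, 3, size=(128, 128))
means = np.array([60.0, 128.0, 200.0])
sigmas = np.array([10.0, 12.0, 8.0])
image = means[labels] + sigmas[labels] * rng.standard_normal(labels.shape)

# One feature per pixel: its gray level; targets: the K-means/EM labels
X = image.reshape(-1, 1)
y = labels.ravel()

clf = RandomForestClassifier(n_estimators=25, random_state=0)
clf.fit(X, y)
pred = clf.predict(X).reshape(labels.shape)
print("training accuracy:", (pred == labels).mean())
```

With only one feature the forest is essentially learning thresholds on the gray level, which is why well-separated components classify almost perfectly.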
2018 Jul 15
3
x265 and NHW comparison images posted to https://nhwcodec.blogspot.com/?
Hello,
Ok, I will post them in a folder on Google Drive and will give the link on
my blog page (because, among other things, I cannot display 2 bitmap
512x512 images side by side on my blog page)... Hope it is ok!
Cheers,
Raphael
2018-07-15 1:41 GMT+02:00 J.B. Nicholson <jbn at forestfield.org>:
> Raphael Canut wrote:
>
>> I downloaded 20 images from the Internet with rather good quality (mainly
>> faces), I compressed them at high compression...
2018 Jul 14
2
NHW Project - some results
...oject
currently.
I downloaded 20 images from the Internet with rather good quality (mainly
faces), I compressed them at the high compression -l7 setting with the NHW
Project, and on 19 images out of 20, I visually prefer the results of NHW
compared to x265 (HEVC)!!! (I can make these 20x3=60 512x512
24bit bitmap images available for those who want them, just let me know.)
I did not select these 20 images; they were the first 20 rather-good-quality
images that Google Images gave me (I entered the name of an actress...).
So I am quite satisfied with these results, and I think now that the NHW
Project is (visual...
2012 Jul 25
4
NHW Image codec - improvement of precision
...ying to improve the precision of my codec, while keeping my
low-complexity (fast) approach.
I do not fully use the reference (and impressive) block prediction with
different modes + residual coding scheme; I just apply residual coding
on the first order wavelet "image" (for example, for a 512x512 image, I just
code the errors on the 256x256 wavelet "high-resolution" part), hence the
lack of precision. I try to compensate for it with a little more neatness of the
wavelet 5/3 filterbank, but it's true that precision seems visually more
important.
Any opinion on this approach or o...
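For readers unfamiliar with the 5/3 filterbank mentioned above, here is a minimal 1-D integer LeGall 5/3 lifting step in Python. Periodic boundary handling is a simplifying assumption here (the codec itself presumably uses a different extension), but the predict/update structure and perfect reconstruction carry over:

```python
import numpy as np

def cdf53_forward(x):
    """One level of the integer LeGall 5/3 lifting transform (1-D)."""
    x = np.asarray(x, dtype=np.int64)
    even, odd = x[0::2].copy(), x[1::2].copy()
    # Predict: high-pass = odd - floor((left + right) / 2)
    odd -= (even + np.roll(even, -1)) >> 1
    # Update: low-pass = even + floor((prev_hp + hp + 2) / 4)
    even += (odd + np.roll(odd, 1) + 2) >> 2
    return even, odd  # (low, high)

def cdf53_inverse(low, high):
    # Undo the lifting steps in reverse order
    even = low - ((high + np.roll(high, 1) + 2) >> 2)
    odd = high + ((even + np.roll(even, -1)) >> 1)
    out = np.empty(even.size + odd.size, dtype=np.int64)
    out[0::2], out[1::2] = even, odd
    return out

sig = np.array([3, 7, 1, 8, 2, 9, 4, 6])
lo, hi = cdf53_forward(sig)
assert np.array_equal(cdf53_inverse(lo, hi), sig)  # perfect reconstruction
```

The "errors on the high-resolution part" the author describes would then be residuals coded on the `hi` band of the first decomposition level.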
2009 Aug 26
3
changing equal values on matrix by same random number
Dear all,
I have about 30,000 matrices (512x512), with values from 1 to N.
Each value in a matrix represents a habitat patch in my
matrix (i.e. my landscape). Non-habitat cells are stored as ZERO.
Now I need to change each of the 1-to-N values to the same random
number.
Just suppose my matrix is:
mymat<-matrix(c(1,1,1,0,0,0,0,0,0,0,0,
0,0,0,0,2,2,2,0,0,0,0...
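The relabeling described can be done with a lookup table from each patch id to one random number. A Python/numpy sketch (the original question is about R, so this is an illustrative translation with a small example matrix):

```python
import numpy as np

rng = np.random.default_rng(42)
mymat = np.array([[1, 1, 1, 0, 0, 0],
                  [0, 0, 0, 0, 2, 2],
                  [2, 0, 0, 0, 0, 3]])

ids = np.unique(mymat[mymat > 0])          # patch ids 1..N (0 = non-habitat)
new_vals = rng.permutation(len(ids)) + 1   # one random number per patch id
lookup = dict(zip(ids, new_vals))

relabeled = mymat.copy()
for old, new in lookup.items():
    relabeled[mymat == old] = new          # every cell of a patch gets
                                           # the same new value

# Patches keep their shape; only ids change, and 0 stays 0
assert np.array_equal(relabeled == 0, mymat == 0)
```

The key point is that the random draw happens once per id, not once per cell, so all cells of a patch stay consistent.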
2010 Oct 15
0
tessellation from biological data in spatstat
Hi,
I'm new to this mailing list, so apologies if this is too basic. I have
512x512 confocal images from which I have extracted the x,y
coordinates of labelled cells, exported from ImageJ as a .csv file. I also
have images that define an underlying pattern in the tissue, given as
areas of different pixel values, 0 or 255 (also 512x512); I've exported
these images as .txt files. I...
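The setup above (cell coordinates plus a 0/255 mask image) maps naturally onto array indexing. A Python/numpy sketch of the lookup step; the question concerns R's spatstat, so the synthetic mask and random points here are purely illustrative stand-ins for the ImageJ exports:

```python
import numpy as np

rng = np.random.default_rng(1)
mask = np.zeros((512, 512), dtype=np.uint8)
mask[:, 256:] = 255                           # right half: one tissue region
points = rng.uniform(0, 512, size=(100, 2))   # (x, y) cell positions

# Look up each point's region by indexing the mask (row = y, col = x)
cols = np.clip(points[:, 0].astype(int), 0, 511)
rows = np.clip(points[:, 1].astype(int), 0, 511)
region = mask[rows, cols]                     # 0 or 255 per cell
print("cells in 255-region:", (region == 255).sum())
```

In spatstat terms this corresponds to attaching the tessellation tile (here, the mask value) to each point of the point pattern as a mark.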
2015 Sep 07
5
[PATCH mesa 1/3] nv30: Fix max width / height checks in nv30 sifm code
The sifm object has a limit of 1024x1024 for its input size and 2048x2048
for its output. The code checking this was trying to be clever, resulting
in it seeing a surface of e.g. 1024x256 as outside of the input size
limit.
This commit fixes this.
Signed-off-by: Hans de Goede <hdegoede at redhat.com>
---
src/gallium/drivers/nouveau/nv30/nv30_transfer.c | 4 ++--
1 file changed, 2
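A sketch of the per-dimension check the commit message describes. The function name and constants here are illustrative, not the actual nv30_transfer.c code; the limits are the ones stated in the message:

```python
# Limits from the commit message: sifm input <= 1024x1024,
# output <= 2048x2048. Each dimension must be checked independently,
# so an elongated surface like 1024x256 is (correctly) accepted.
SIFM_MAX_IN, SIFM_MAX_OUT = 1024, 2048

def sifm_fits(src_w, src_h, dst_w, dst_h):
    return (src_w <= SIFM_MAX_IN and src_h <= SIFM_MAX_IN and
            dst_w <= SIFM_MAX_OUT and dst_h <= SIFM_MAX_OUT)

assert sifm_fits(1024, 256, 1024, 256)      # the case the patch fixes
assert not sifm_fits(1025, 256, 1025, 256)  # genuinely over the limit
```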
2006 Nov 03
3
identify extremes positions of values into matrix
An embedded text with no character set specified was scrubbed...
Name: not available
URL: https://stat.ethz.ch/pipermail/r-help/attachments/20061103/39e6883b/attachment.pl
2017 Nov 27
2
NHW Project - speed comparison with x265
...y slowly working on the NHW Project. I recently made a speed
comparison with x265 and wanted to share it with you.
On my Intel Core i5-6400 processor, the NHW encoder (totally
unoptimized) is on average 10x faster to encode than x265 (png decoding time
removed), on average 30ms vs 300ms for a 512x512 24bit color image, and
the NHW decoder (totally unoptimized) is on average 3x faster to decode
than x265 (output to .ppm file), on average 10ms vs 30ms.
As a reminder, x265 is ultra optimized, so with good C optimization, SIMD
optimization and multithreading, the NHW codec will be x40 times...
2016 Oct 01
0
NHW codec
...little about the NHW codec and its state after
nearly 5 years on the Xiph forums.
I also know that if I would like to discuss I'd better go to the great
Doom9 and Encode.ru forums but for now I find that my codec is too
experimental for these professional forums:
- the codec is only for fixed 512x512 size, just a test tool.
- there is only mid compression
- for high compression, all the quantization scheme must be redesigned
- all the compression schemes (wavelet DC image, wavelet coeffs, residual
coding if I keep it...) must be improved
- change the chroma subsampling
- when all that is done,...
2018 May 02
0
NHW Project - new -l7 quality setting
...NHW Project a
-l7 high compression quality setting.
This setting is still experimental and can be improved, but currently (I am
certainly not objective) I find it competitive with x265 (HEVC), mainly
because it has a good neatness.
Update at: http://nhwcodec.blogspot.com/
You can also find sample 512x512 test bitmap images on the demo page.
Else I know that we can improve this setting, but I am starting to run short
of ideas (I still think of Chroma from Luma and efficient edge-preserving
denoising, but they are impressive research topics in their own right)...
Any help or ideas are very welcome!
-To finish, agai...
2018 May 15
0
NHW Project - A new release?
...the Doom9, encode.ru and embedded processors forums. I
will still continue to post my "daily" updates on the Xiph Theora channel
if I am not boring you...
What I fear with the other forums is, for example, that people will tell me
that my project is too amateur because it is restricted to a fixed 512x512
image size and so is not interesting... What do you think of it? I plan to
adapt the NHW codec to any size of image, but first I would like to finish
all the quality settings (lower and higher), and when the whole algorithm is
validated and I have a good demo version, then I will try to find a
sponso...
2018 Aug 04
0
NHW Project - Image comparison
...those who didn't read my previous post, here are salient
features and speed timings between the NHW Project and x265 codecs:
For encoding, the NHW Project (with all the expensive processing turned
on: pre-processing, feedback correction, residual coding, dithering,...)
takes 30ms to encode a 512x512 24bit bitmap image, and the x265 codec takes
300ms to encode that same ppm image.
For decoding, the NHW Project (with all the expensive processing turned on:
post-processing, UV comp sharpening,...) takes 10ms to decode the image,
and the x265 codec takes 30ms to decode that same ppm image.
As a...
2018 Aug 09
0
NHW Project - needed improvements
...is absolutely necessary to
interest a company, to be of interest to Xiph.org for example, to possibly
become a Xiph project,...
But it requires a lot of time and motivation that I don't have for now. My
idea has been, from the beginning, to create a good demo version of the NHW
Project for 512x512 size images, and if a company is interested, maybe
they'll want to sponsor me to adapt the codec to any image size for
professional use. I haven't asked Xiph, but maybe a company could also
sponsor me via Xiph.org (a non-profit organization)?
Yes, there is a big drawback for the
1997 Oct 23
0
R-beta: why restart()
...off the ground.
I accept that if you know what will cause the fatal fail you can program in a
test for it --- sometimes. What about this one: likely fatal error due to
solve(mat,x) having nonsingular matrix. Yes, yes, check there aren't any zeros
on the diagonal of the SVD ... but mine are 512x512 matrices.
It isn't efficient programming to completely re-write solve (or indeed to
re-write any built-in) with it's built-in fatal return changed to an error
return, and it isn't efficient to run solve if you're going to do an explicit
SVD in the first place. It *is* efficient...
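The pattern described (try the fast direct solve, recover rather than abort when the matrix turns out singular) has a direct Python/numpy analogue; R's solve() and try() would play the same roles. A sketch:

```python
import numpy as np

def safe_solve(mat, x, rcond=1e-12):
    """Try a direct solve; fall back to a least-squares (pseudo-inverse)
    solution when the matrix is singular, instead of aborting."""
    try:
        return np.linalg.solve(mat, x)
    except np.linalg.LinAlgError:
        sol, *_ = np.linalg.lstsq(mat, x, rcond=rcond)
        return sol

a = np.array([[1.0, 2.0], [2.0, 4.0]])   # rank-deficient: solve() fails
b = np.array([3.0, 6.0])
print(safe_solve(a, b))                   # minimum-norm least-squares answer
```

This avoids both rewriting the built-in solver and running an explicit SVD on every call: the expensive fallback path is taken only when the cheap one fails.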
2007 Oct 01
1
Speeding up image plots
Dear R-users,
I wonder whether it is possible to speed up plots of 2-dimensional
images (such as image() or levelplot()) in order to export lower quality
versions of the resulting plots.
I intend to export a large number of images in order to construct
animated gifs. For this purpose, I used the png() function in
connection with image() and levelplot(). But the underlying arrays are
so big
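One way to sidestep the plotting machinery entirely is to write the array straight to an image file, one pixel per cell. A Python sketch of that idea (the question is about R's png() + image(); matplotlib's imsave is used here as the closest analogue, and the random array is a stand-in for the real data):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")            # off-screen rendering, no display needed
import matplotlib.pyplot as plt

z = np.random.default_rng(0).random((512, 512))

# imsave writes the array directly as pixels, skipping the axes,
# ticks and layout work that makes repeated full plots slow
plt.imsave("frame_000.png", z, cmap="viridis")
```

For animation frames where axes and labels are not needed, this per-frame cost is essentially just the PNG encode.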
2013 May 29
1
NHW Image codec - 2 lower quality settings
Hello,
I have finally added 2 lower quality settings for the NHW codec: -l1 (-5Ko)
and -l2 (-10Ko). I use a quantization of 0.935 and 0.88 (a kind of
quantization), and I decrease residual coding on the first order wavelet
image.
I have updated the demo page: http://nhwcodec.blogspot.com/ .
These 2 lower quality settings are still experimental. If you could find
time, I would be interested in any
2013 Nov 14
0
NHW Image codec - advice
...and select now the -h2
setting. Is it a problem not to respect the originally indicated quality
setting (because in this rare case the result won't be good)? The overhead
in the compressed file won't be too much, as a quite blurred
image generally has a high compression ratio; typically, for a 512x512 image,
it will go from 28Ko at the normal setting to 31/32Ko at the -h2 setting,
just 3/4Ko more.
Could that switch in the quality setting be acceptable?
Many thanks,
Raphael
2008 Jun 11
1
Page breaks when encoding ogg/theora
...rs.
When using ogg_stream_pageout(), I'm making sure that all the
remaining pages are flushed at the end; it still doesn't work.
This is a hex dump of the page after the comment pages. This page
should contain 3 frames; no eos set. The frames are extremely simple
(Y, U, V all set to 0xc8, 512x512).
- header (30 bytes)
0x0012F33C 4f 67 67 53 00 00 42 00 00 00 00 00 00 00 29 00 OggS..B.......).
0x0012F34C 00 00 02 00 00 00 a2 31 5b d2 03 13 09 09 ......?1[?....
- body (37 bytes)
0x0196BBD8 30 00 0b 89 69 f7 ff ee 17 27 d3 ff b8 5c 9f 4f 0...i???.'???\?O
0x0196BBE8 ff 83 00 7...
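The 30-byte header in the dump can be decoded field by field. A small Python sketch parsing exactly those bytes (values transcribed from the hex dump above; Ogg page headers are little-endian):

```python
import struct

# The 30 header bytes from the dump: capture pattern, version,
# header type, granule position, serial, page sequence, CRC,
# segment count, and the lacing (segment size) table
header = bytes.fromhex(
    "4f 67 67 53 00 00 42 00 00 00 00 00 00 00 29 00"
    "00 00 02 00 00 00 a2 31 5b d2 03 13 09 09"
)

capture, version, htype = header[0:4], header[4], header[5]
granulepos, serial, seq, crc = struct.unpack_from("<qIII", header, 6)
nsegs = header[26]
lacing = header[27:27 + nsegs]

assert capture == b"OggS" and version == 0
print("header_type:", htype)      # 0 -> not continued, not BOS, not EOS
print("granulepos:", granulepos)  # 0x42 = 66
print("segments:", list(lacing), "body bytes:", sum(lacing))
```

Decoding it this way confirms what the poster says: the lacing table gives 3 segments of 19+9+9 = 37 body bytes, and the header type flag shows no EOS set.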