search for: worker_connected

Displaying 9 results from an estimated 9 matches for "worker_connected".

2011 Sep 16
3
Rainbows! or unicorn?
I'm putting together a small web frontend for a client to upload files into an existing application. It's trivial - there will never be more than a (small) handful of concurrent connections, but I need a streaming rack.input for upload progress on files up to 500MB or so. I was planning on using Rainbows! with ThreadSpawn and worker_connections=1, then noticed that unicorn is
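The setup described in that post can be sketched in a Rainbows! config file; this is a guess at the poster's intent, with the listen path and worker count as illustrative assumptions:

```ruby
# rainbows.conf.rb -- a sketch of the setup described above; the listen
# path and worker_processes value are illustrative assumptions
Rainbows! do
  use :ThreadSpawn        # one native thread per client, streaming rack.input
  worker_connections 1    # at most one client per worker process
end

worker_processes 4                 # plain unicorn directives still apply
listen "/tmp/rainbows.sock"
```

Since worker_connections is 1 here, total concurrency is bounded by worker_processes, which fits the "small handful of connections" requirement.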
2010 Feb 14
6
Nginx Sock And Rails Environment Error
Hi There, I'm running an Amazon instance with nginx proxying to a unicorn sock. For some reason, even though I specify the production environment, when visited through nginx, the site shows errors in development form. Interestingly, when running on a port rather than a sock, if I visit that port, the errors are rendered as normal with a 500 page; the same port, through nginx, shows errors
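A typical nginx-to-unicorn socket setup looks like the fragment below; the socket path, server_name, and header directives are illustrative. Development-style error pages despite a production setting often point at unicorn itself being started without `-E production`, not at the nginx side:

```nginx
# sketch of nginx proxying to a unicorn unix socket; paths are illustrative
upstream app_server {
    server unix:/tmp/unicorn.sock fail_timeout=0;
}

server {
    listen 80;
    server_name example.com;

    location / {
        proxy_set_header Host $http_host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_redirect off;
        proxy_pass http://app_server;
    }
}
```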
2011 Jul 08
2
Puppetmaster setup with separate CA server configuration help
Hi All, I am setting up puppetmaster with nginx and passenger and separating out the Puppetmaster primary CA server. I have 3 hosts: loadbalancer01 - Nginx doing LB on IP address and also running puppetmaster with passenger on 127.0.0.1 (port 8140). primaryca - Puppetmaster Primary CA pclient - Puppet Client I did the following steps: On Primary CA server: ---------------------------- cd
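One common way to separate the CA is to have the load balancer route only certificate traffic to the primary CA host and everything else to the local puppetmaster. A sketch, assuming Puppet's REST paths take the form /{environment}/certificate*, with the host names taken from the post:

```nginx
# sketch: route CA requests to primaryca, everything else to the local
# passenger-backed puppetmaster; the REST path pattern is an assumption
upstream puppetca    { server primaryca:8140; }
upstream puppetlocal { server 127.0.0.1:8140; }

location ~ ^/([^/]+)/certificate {
    proxy_pass https://puppetca;
}
location / {
    proxy_pass https://puppetlocal;
}
```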
2006 Dec 07
17
compress and max upload size?
I am using mongrel_cluster with mod_proxy_balancer and would like to enable compression (assuming it improves throughput) and limit file size upload. I configured mod_deflate and LimitRequestSize in Apache, but in my trials it looks like the proxied calls bypass those directives (the conf goes below). Is there a way to get this? -- fxn # Adapt this .example locally, as usual. # # To be
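For reference, Apache has no LimitRequestSize directive; the request-body cap is LimitRequestBody, and both it and mod_deflate can be scoped so they also cover the proxied paths, roughly like this (path and limit are illustrative):

```apache
# sketch: compression plus an upload cap scoped to the proxied location;
# the path and the ~500 MB figure are illustrative
<Location />
    SetOutputFilter DEFLATE          # mod_deflate on responses
    LimitRequestBody 524288000       # reject request bodies over ~500 MB
</Location>
```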
2017 Nov 07
2
Problem with getting restapi up&running
Hi, I am currently struggling with the gluster restapi (not heketi); somehow I am a bit stuck. During startup of the glusterrestd service it drops some Python errors; here's an error log output with increased loglevel. Maybe someone can give me a hint on how to fix this -- snip -- [2017-11-07 10:29:04 +0000] [30982] [DEBUG] Current configuration: proxy_protocol: False worker_connections: 1000
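The keys in that configuration dump (proxy_protocol, worker_connections) match gunicorn's settings, which suggests glusterrestd is gunicorn-based; if so, a config file reproducing those values would look like this sketch (the bind address and loglevel are assumptions):

```python
# gunicorn.conf.py -- a sketch mirroring the values in the log above;
# the bind address and loglevel are assumptions
bind = "0.0.0.0:8080"
proxy_protocol = False
worker_connections = 1000
loglevel = "debug"
```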
2011 Nov 15
3
Seg fault in dovecot/auth 2.0.15
...io = 0x829e2b0 tv = {tv_sec = 2147483, tv_usec = 0} msecs = 1 ret = 1 i = 0 j = 0 call = 192 #6 0xb785efa0 in io_loop_run (ioloop=0x82a6398) at ioloop.c:405 No locals. #7 0xb784abaa in master_service_run (service=0x82a62e8, callback=0x805c470 <worker_connected>) at master-service.c:481 No locals. #8 0x0805c828 in main (argc=2, argv=0x82a61c0) at main.c:298 c = <value optimized out> (gdb) --mhg
2013 May 30
0
HTTP 500 error page
Hello all, I'd like to have nginx+passenger show the custom (default) error pages when an HTTP 500 error occurs in my app. I am running a Rails 2.3 app on nginx 1.2 with passenger 3.0.7 Here is my nginx config file: #user nobody; worker_processes 16; error_log /opt/nginx/logs/error.log info; pid /opt/nginx/logs/nginx.pid; worker_rlimit_nofile 32768; events {
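A minimal sketch of serving the app's own static 500 page under nginx+passenger, assuming the page lives at public/500.html; the root path is illustrative:

```nginx
# sketch: show the app's static 500 page instead of Passenger's error page;
# the root path is an illustrative assumption
server {
    listen 80;
    passenger_enabled on;
    passenger_friendly_error_pages off;   # suppress Passenger's debug page

    error_page 500 502 503 504 /500.html;
    location = /500.html {
        root /path/to/app/public;
        internal;
    }
}
```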
2012 Dec 06
2
passenger does not start puppet master under nginx
On the server [root@bangvmpllDA02 logs]# ruby -v ruby 1.8.7 (2011-06-30 patchlevel 352) [x86_64-linux] [root@bangvmpllDA02 logs]# puppet --version 3.0.1 and [root@bangvmpllDA02 logs]# service nginx configtest nginx: the configuration file /apps/nginx/nginx.conf syntax is ok nginx: configuration file /apps/nginx/nginx.conf test is successful [root@bangvmpllDA02 logs]# service nginx status
2014 Feb 07
1
Dovecot 2.2.10 crash / infinite loop bug
...05 #12 0x0246f9c2 in io_loop_call_io (io=0x7d1ae8c0) at ioloop.c:388 #13 0x02470e97 in io_loop_handler_run (ioloop=0x7d1ae3c0) at ioloop-kqueue.c:151 #14 0x0246f928 in io_loop_run (ioloop=0x7d1ae3c0) at ioloop.c:412 #15 0x0241609d in master_service_run (service=0x7e433d00, callback=0x164e66a0 <worker_connected>) at master-service.c:566 #16 0x164e6cc8 in main (argc=Cannot access memory at address 0x0 ) at main.c:393 Current language: auto; currently asm Sincerely, Jani Hast