similar to: Problem while trying to set up an ipsec vpn

Displaying 20 results from an estimated 100 matches similar to: "Problem while trying to set up an ipsec vpn"

2005 Feb 08
15
Few questions
Hi, I have a few problems with my shorewall configuration. First of all, the option maclist seems not to be recognized. I have this:

ghostwheel /etc/shorewall # cat interfaces | grep -v '^#'
-       eth1    detect  dhcp,tcpflags,routefilter
loc     eth0    detect  tcpflags,maclist

When I look at shorewall-init.log, I found out:
2006 May 29
4
IpSec support with kernel 2.6.16.18
Hi all, I'm currently using ipsec with Shorewall 3.0.7 on a patched 2.6.10 kernel. Having heard that ipsec support was in the standard kernel starting from 2.6.16, I tried to upgrade to the latest kernel. My problem is that shorewall won't start anymore. I get this output in /var/log/shorewall-init.log: Starting Shorewall... Initializing... Shorewall has detected the
2004 Dec 30
19
OpenVPN tun Interface
I have a zone "rw" defined as tun0 in interfaces. From that zone, pings to zone "loc" succeed, but pings to remote networks (on IPsec VPNs) are rejected in the all2all chain. From my point of view, these pings should be in the rw2cctc chain. (rw to cctc is ACCEPTed in policy.) I must have a hole in my config; where would it be? Thanks, A.
2008 Jan 22
10
IPSEC VPN to VPN firewalling problem
Dear Shorewall Users :-) I've been playing with shorewall for some time now - I find it a really interesting and easy tool for organising all the rules and so on (before that I've been using simple iptables rules in a shell script ;-) Generally it's quite easy to use, but I have found one problem which I cannot handle myself - or in other words - cannot find an appropriate
2007 Sep 03
3
Shorewall + IPSec: help debugging why gw1<->gw2 SA works, but loc<->gw2 traffic doesn't trigger SA
Dear list, I'm running Shorewall on a dedicated Fedora 7 box. Shorewall is working well as an office DSL router (dynamic IP) with loc and dmz zones. I am now trying to configure IPSec to connect a VPS, "casp", with a static IP to both the firewall and to the loc network behind it. The host to host SA works fine. However, pings from "loc" to "casp" can be
2009 Nov 14
2
[LLVMdev] Very slow performance of lli on x86
> for -O3 results refer attachment.
>
> time   clang (-O0)   llvm-gcc(-O0)   gcc(-O0)
> real   0m10.247s     0m11.324s       0m10.963s
> user   0m2.644s      0m2.478s        0m2.263s
2005 Apr 27
5
26sec kame ipsec tunnel : packets leave unencrypted...
Hi everyone, First of all, this is my first post in this ML, so I'm not sure that this is the right place for my question (please don't shoot me down ;)). For the record, I've been reading and using LARTC for almost 3 years now, and it's a great help for anyone who wants to learn Linux networking. My problem: I want to set up a tunnel for the following
2009 Nov 14
0
[LLVMdev] Very slow performance of lli on x86
He is probably using the interpreter on a debug build.

Evan

On Nov 14, 2009, at 1:40 PM, Eric Christopher <echristo at apple.com> wrote:
>> for -O3 results refer attachment.
>> time   clang (-O0)   llvm-gcc(-O0)   gcc(-O0)
>> real   0m10.247s
2014 Jan 08
1
Some Speex AGC Questions
I'm attempting to use speex preprocess for automatic gain control in an application I'm working on and could use some help. I'm using Opus as my codec. In order to keep the number of packets down, I'm using 60msec frames. I'm sampling at 48KHz as is recommended for Opus. So, the frame length is 2880 samples and the sampling rate is 48000. The source of the data is a
2009 Nov 15
0
[LLVMdev] Very slow performance of lli on x86
Sorry, I really forgot to mention one thing. I downloaded the X86 binaries of llvm+clang and llvm-gcc from the LLVM download site. I hope that is not a debug build. Prasanth J

On Sun, Nov 15, 2009 at 1:22 PM, Prasanth J <j.prasanth.j at gmail.com> wrote:
> Hi all,
>
> LLVM is built without debug enabled. Also I am not forcing lli to use
> interpreter mode, so I don't think the
2009 Nov 15
5
[LLVMdev] Very slow performance of lli on x86
Hi all, LLVM is built without debug enabled. Also I am not forcing lli to use interpreter mode, so I don't think the reason is the debug build or interpreter mode.

*Step 1:* compiled the 3 files (generic_replica.c, xacc.c and dacc.c) with clang-cc to LLVM bytecode files using -emit-llvm-bc and (-O0/-O3) options.

*Step 2:* bytecode obtained from step 1 (generic_replica.bc, xacc.bc and
2011 Jan 18
1
Open virt-viewer with virsh start
Is it possible to have virt-viewer come up when a domain is started with virsh start, the same way that it does with virt-install? Thanks. --- Scott Lerman
2009 Oct 24
1
[LLVMdev] [PATCH] remove usage of RaiseAllocations pass from llvm-gcc
After LLVM rev 84987, the RaiseAllocations pass no longer exists. llvm-gcc needs to be patched:

Index: gcc/llvm-linker-hack.cpp
===================================================================
--- gcc/llvm-linker-hack.cpp  (revision 84984)
+++ gcc/llvm-linker-hack.cpp  (working copy)
@@ -80,7 +80,6 @@
   llvm::createJumpThreadingPass();
   llvm::createFunctionInliningPass();
2010 Sep 07
1
Is an R sub-session somehow possible?
I wrote the interface between R and TeXmacs. Recently, I added tab completion. However, there is one slight problem. In order to enable easy interaction with R, I (i.e. my program) interact with the command-line interface. This means that the user can invoke demo(), and then R will interact with the user and ask them to press enter. It also means that the user can enter a<-c(3,4 and then R will
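For what it's worth, a front end can tell complete input from incomplete input before handing it to R by trying to parse it first. The sketch below is only an illustration of that idea, not part of the interface being described; check_complete is a hypothetical helper name.

    ## Return FALSE when R would show a continuation prompt for this input,
    ## i.e. the text parses up to an unexpected end of input.
    check_complete <- function(code) {
      res <- tryCatch(parse(text = code), error = function(e) e)
      if (inherits(res, "error") &&
          grepl("unexpected end of input", conditionMessage(res))) {
        return(FALSE)        # incomplete: R is waiting for more input
      }
      TRUE                   # complete expression (or a genuine syntax error)
    }

    check_complete("a <- c(3, 4")    # FALSE
    check_complete("a <- c(3, 4)")   # TRUE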
2010 Sep 07
1
what is the best way for an external interface to interact with graphics, libraries
Another message about the R to TeXmacs interface. 1. Graphics The TeXmacs interface allows the user to directly insert graphics into the session. Since I am not very familiar with programming for R, I implemented the interaction with graphics in a very primitive way. It has two modes of working: with X11, and without (for example, when working remotely through ssh without forwarding X11). In
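One common way to handle those two modes is to test at run time whether X11 is usable and fall back to a file-based device otherwise. The sketch below is only an illustration of that idea; open_device and the output file name are hypothetical, not taken from the interface described above.

    ## Open an on-screen device when X11 is available, otherwise
    ## render to a PNG file the front end can pick up afterwards.
    open_device <- function(file = "Rplot.png") {
      if (capabilities("X11") && nzchar(Sys.getenv("DISPLAY"))) {
        x11()                # interactive window
      } else {
        png(file)            # headless fallback, e.g. ssh without X forwarding
      }
    }

    open_device()
    plot(rnorm(100))
    if (names(dev.cur()) == "png") dev.off()   # flush the file-based device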
2012 Apr 25
4
delayedAssign changing values
I'm not sure if this is a known peculiarity or a bug, but I stumbled across what I think is very odd behavior from delayedAssign. In the below example x switches values the first two times it is evaluated.

> delayedAssign("x", {x <- 2; x+3})
> x==x
[1] FALSE
> delayedAssign("x", {x <- 2; x+3})
> x
[1] 5
> x
[1] 2

The ?delayedAssign documentation says
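One plausible reading of this behaviour (my interpretation, not something stated in the preview): the first lookup of x forces the promise; evaluating the braced body runs x <- 2, which replaces the promise binding in the global environment with an ordinary value, while the value of the whole body, 5, is what that first lookup returns. Every later lookup simply sees 2. A minimal sketch of the same effect under a fresh name:

    delayedAssign("y", {y <- 2; y + 3})
    y   # forces the promise: the body assigns y <- 2, then evaluates to 5
    y   # the promise is gone; the ordinary binding left behind is 2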
2012 Nov 12
3
arrange data
Dear r-users, I have daily rainfall data from 1971 to 2000. I would like to extract November and December data only. I would also like to column-bind November and December, so I would like to delete 31 December from the December data so that the lengths of November and December are the same. Hope somebody can help me. I tried this below: > kuantan.dt.1 <-
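A minimal sketch of one way to do this (not from the thread), assuming a data frame rainfall with a Date column date and a numeric column rain; those names are hypothetical.

    mon <- as.integer(format(rainfall$date, "%m"))
    day <- as.integer(format(rainfall$date, "%d"))
    nov <- rainfall$rain[mon == 11]                    # all November days
    dec <- rainfall$rain[mon == 12 & day != 31]        # December without the 31st
    nov.dec <- cbind(november = nov, december = dec)   # equal lengths: 30 days per year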
2014 Jan 04
0
Some Speex AGC Questions
I'm attempting to use speex preprocess for automatic gain control in an application I'm working on and could use some help. I'm using Opus as my codec. In order to keep the number of packets down, I'm using 60msec frames. I'm sampling at 48KHz as is recommended for Opus. So, the frame length is 2880 samples and the sampling rate is 48000. The source of the data is a
2012 Apr 26
2
How to plot graph with different scale (y axis) on same graph?
Hi, I have my data in the format below.

position  var1  var2
2         .1    10
3         .29   89
12        .56   100
425       .34   1234
6546      .12   21
...       ...   ...
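With base graphics, one common approach is to draw the second series over the first with par(new = TRUE) and put its scale on the right-hand axis. This is only a sketch (not from the thread), assuming a data frame d holding the columns shown above.

    op <- par(mar = c(5, 4, 4, 4) + 0.1)       # leave room for the right axis
    plot(d$position, d$var1, type = "l", col = "blue",
         xlab = "position", ylab = "var1")
    par(new = TRUE)                            # overlay a second plot on the same region
    plot(d$position, d$var2, type = "l", col = "red",
         axes = FALSE, xlab = "", ylab = "")
    axis(side = 4)                             # second y scale on the right
    mtext("var2", side = 4, line = 3)
    par(op)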
2013 Jul 19
3
mails delivered to the wrong user when using lmtp_proxy and reject_unverified_recipient
Hi, looks like we detected a serious bug in dovecot's lmtp proxying where e-mails are delivered to the wrong user. The setup is:

*) Dovecot is configured with "lmtp_proxy=yes"

   # Support proxying to other LMTP/SMTP servers by performing passdb lookups.
   lmtp_proxy = yes

*) Postfix uses "dynamic recipient verification", so Postfix starts sending a (verify) mail by LMTP to