Archive for September, 2014


A bit behind as always… but I've been struggling with this one for a while, so I felt it worthy of sharing the solution.

I got everything working on the LAN, then set up port forwarding on my firewall for;

TCP Port 80
TCP Port 443
TCP Port 1494
TCP Port 1604
TCP Port 2598
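
(As an aside, if your firewall happens to be an EdgeRouter like the ones covered in the posts below, one of those forwards might look something like this sketch; the WAN interface name and internal address are purely illustrative;)

set port-forward wan-interface pppoe0
set port-forward rule 1 description "Citrix HTTPS"
set port-forward rule 1 protocol tcp
set port-forward rule 1 original-port 443
set port-forward rule 1 forward-to address 192.168.0.243
set port-forward rule 1 forward-to port 443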

Whilst I was able to access the StoreFront, when I tried to launch my application I received an error;

Connection to the server "192.168.0.243" (the server's internal / private IP address) was interrupted. Please check your network connection and try again.

All the documentation and discussions online suggest that remote access is only possible using NetScaler, and although I may deploy NetScaler at a later date, for the moment I want to skip that step!

I found that the default.ica file in C:\inetpub\wwwroot\Citrix\Store\App_Data contained an empty Address= entry (my installation uses mostly default values, and an excerpt looked something like this);

[ApplicationServers]
Application=

[Application]
Address=
TransportDriver=TCP/IP
DoNotUseDefaultCSL=On
BrowserProtocol=HTTPonTCP
LocHttpBrowserAddress=!
WinStationDriver=ICA 3.0
ProxyTimeout=30000
AutologonAllowed=ON

Simply adding my external IP address to the Address= line got me working!
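
For example (the address below is a documentation-range placeholder; substitute your own external IP or DNS name);

[Application]
Address=203.0.113.10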

Oddly, it’s quite hard to find Ubiquiti hardware in the UK, but I’ve previously sourced equipment from an eBay distributor (http://stores.ebay.co.uk/ubntshop/) and they were able to provide the best price for the ERL.

I fired the first EdgeRouter up and started getting to know the web UI. It didn’t take long, as it’s fairly basic; even for my relatively simple requirements I’d need to get to know the CLI. The official Ubiquiti EdgeMAX support forum (https://community.ubnt.com/t5/EdgeMAX/bd-p/EdgeMAX) was a great place to start.

An important thing to note is that the EdgeMAX operating system is based on Vyatta, so if you struggle to find an EdgeMAX-specific solution to a problem you may be able to find a Vyatta solution which will work on your Ubiquiti hardware.

IP Addresses

One of the first decisions was an IP addressing scheme. I decided to use 192.168.x.y;

  • where x represents the site (in increments of 10 to allow for future splitting) and
  • y will be the same at every site for the router & file server

The DHCP range will be from .1 to .199.
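
So site 1 lives on 192.168.10.0/24 and site 2 on 192.168.20.0/24 (the subnets that appear in the VPN config further down). As a rough sketch, the site 1 DHCP scope on the ERL could be configured like this (I’m assuming .254 for the router’s y value here, purely for illustration);

set service dhcp-server shared-network-name LAN subnet 192.168.10.0/24 start 192.168.10.1 stop 192.168.10.199
set service dhcp-server shared-network-name LAN subnet 192.168.10.0/24 default-router 192.168.10.254
set service dhcp-server shared-network-name LAN subnet 192.168.10.0/24 dns-server 192.168.10.254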

Firmware Upgrade

I should have done this a little earlier, but when configuring the system parameters I was reminded to check for a firmware upgrade and found the shipped unit was running a pretty outdated v1.1 (the current version at the time of posting is v1.5), so I went ahead and upgraded.
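
(If you’d rather not use the web UI, the same can be done from the operational-mode CLI; the path below is just a placeholder for wherever you’ve put the downloaded image;)

add system image /tmp/edgeos-firmware.tar
show system image
reboot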

Default Configuration (WAN + 2LAN)

The new firmware has a great wizard to get you started. I chose the WAN + 2LAN setup and was immediately up and running, with the router providing internet connectivity to the LAN. However, at this point double NAT was occurring, as the internet connection was provided by a BT HomeHub3 (which doesn’t support bridge mode).

ADSL Modem

To avoid the double NAT scenario it was necessary to purchase an ADSL modem. There don’t appear to be many to choose from; I opted for the Draytek Vigor 120. Absolutely no configuration of the modem was required: I simply plugged it in and set the ERL WAN connection to use PPPoE with login credentials;

username: bthomehub@btbroadband.com
password: BT

…and voila!
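
For reference, the equivalent WAN configuration via the CLI looks something like this (I’m assuming the modem is plugged into eth0; adjust to suit);

set interfaces ethernet eth0 pppoe 0 user-id bthomehub@btbroadband.com
set interfaces ethernet eth0 pppoe 0 password BT
set interfaces ethernet eth0 pppoe 0 default-route auto
commit
save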

VPN

Initially, during testing, I placed both EdgeRouters side by side, set static IP addresses (8.10.0.1 and 8.20.0.1) and connected them with an ethernet cable. Unfortunately I was unable to get an IPsec tunnel established using the web UI, but after looking at some sample configs on the forum I was able to get it working using the CLI.

I then had to modify 3 elements to get it working on-site;

  • the peer to use the dynamic hostname
  • the local-ip to use 0.0.0.0
  • the interface to use pppoe0

vpn {
    ipsec {
        auto-firewall-nat-exclude enable
        disable-uniqreqids
        esp-group FOO0 {
            compression disable
            lifetime 3600
            mode tunnel
            pfs enable
            proposal 1 {
                encryption aes128
                hash sha1
            }
        }
        ike-group FOO0 {
            dead-peer-detection {
                action restart
                interval 15
                timeout 30
            }
            lifetime 28800
            proposal 1 {
                dh-group 2
                encryption aes128
                hash sha1
            }
        }
        ipsec-interfaces {
            interface pppoe0
        }
        nat-traversal enable
        site-to-site {
            peer dynamic-hostname.com {
                authentication {
                    mode pre-shared-secret
                    pre-shared-secret secret
                }
                connection-type initiate
                default-esp-group FOO0
                ike-group FOO0
                local-ip 0.0.0.0
                tunnel 1 {
                    allow-nat-networks disable
                    allow-public-networks disable
                    esp-group FOO0
                    local {
                        subnet 192.168.20.0/24
                    }
                    remote {
                        subnet 192.168.10.0/24
                    }
                }
            }
        }
    }
}

This was working well, but if either internet connection dropped or a router got rebooted the VPN wouldn’t automatically come back up. Supposedly dead-peer-detection should take care of this, but it didn’t appear to be working, so I created a simple workaround using a cron script;

#!/bin/bash
# Restart the VPN if no IPsec security association is currently up.
run=/opt/vyatta/bin/vyatta-op-cmd-wrapper
if ! $run show vpn ipsec sa | grep -q "up"
then
    $run restart vpn
fi

The following commands create a cron job to run the script every 5 minutes;

set system task-scheduler task vpn_monitor executable path /config/scripts/vpn_monitor.sh
set system task-scheduler task vpn_monitor interval 5m

By placing the script in /config/scripts you ensure it remains after a firmware upgrade and is included in configuration backups.
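
One gotcha: make sure the script is executable, otherwise the scheduled task won’t run;

chmod +x /config/scripts/vpn_monitor.sh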

Static-Host-Mapping

We want to block a few websites (namely Facebook) and, rather than overcomplicating things with URL filtering / SquidGuard, we’ve simply set a few static host mappings;

set system static-host-mapping host-name facebook.com inet 127.0.0.1

We also set a static host mapping for the file server at the other site (as the DNS server on the local router doesn’t have any knowledge of the hostnames/IP addresses serviced by the other site). Maybe at a later date I’ll try to find out whether I can forward DNS requests to the other site before going out to the internet?
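
Such a mapping might look something like this (the hostname and address are made up for illustration);

set system static-host-mapping host-name fileserver2.local inet 192.168.20.250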

Backup

Every time I make a configuration change I download a config backup.

On one occasion the backup failed to download and the web UI became unresponsive (rebooting the router fixed things, but the backup still wouldn’t download). I later discovered this was due to the size of the /config folder after installing SquidGuard and downloading the category database. As I wasn’t going to be using this initially, I simply removed it.
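
If you hit something similar, it’s worth checking what’s eating the space under /config before anything else;

du -sh /config/*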

I was recently tasked with overhauling the “network” for a small, local not-for-profit. The company currently has 2 sites, with roughly a dozen desktops at each and a half dozen laptops which roam between the two.

The primary requirements were to provide;

  • networked file storage (preferably redundant)
  • centralised user management (single sign-on and access control)
  • site blocking/web filtering

If both sites had “reasonable” internet connections, I would have suggested a single server at the “central” location with a site-to-site VPN. Unfortunately the connections are ~3Mbit down, 0.3Mbit up (ADSL). This introduces a need for additional hardware (servers at every site) and a way of synchronising/replicating between the sites!

As always, everything should be designed with scalability in mind, but sticking to a tight budget.

The File Servers

My first purchase was the file servers. Many years back I used to “roll my own” with something like an HP MicroServer and Windows Home Server (or possibly FreeNAS/OpenFiler), but I’ve since made the transition to dedicated Synology appliances.

Whilst you lose some flexibility (being able to install any software on x86/x64 hardware like the MicroServer), you gain a huge amount of reliability and support by going with a dedicated appliance (not to mention the huge feature set and the ability to run many additional applications on the Synology product line).

One of the only hard requirements for the file server was redundancy (so at least 2 bays to support RAID 1). Wanting to stick with Synology, I used their product comparison tool (https://www.synology.com/en-uk/products/compare) to make a shortlist and then, after looking at the prices, settled on the DiskStation DS214.

Although storage requirements were pretty small, I got a great deal on WD Green 3TB disks so bought 5 (2 for each site and 1 spare).

The Routers

Had this been a “home solution” I’d probably have opted for something like the Asus RT-AC66U flashed with one of the open source firmwares such as OpenWRT or DD-WRT. But, needing a “business solution”, the most important requirement was reliability (the potential sacrifice being ease of use).

On top of reliability, the primary feature requirement for the routers was site-to-site VPN. After some research I decided to give the Ubiquiti EdgeRouter Lite 3 a try. Frustratingly, the ADSL connection coming in at both sites is provided by a BT HomeHub3. The HH3 doesn’t support bridge mode, and to avoid double NAT / further complications I decided to purchase 2 ADSL modems (there aren’t many to choose from… I went for the Draytek Vigor 120).

Documentation

I previously posted about some SharePoint issues I’ve been tackling; this is the medium I’ve chosen for documenting and sharing the how-to guides, configuration details and process documents. I’m yet to tackle these, but I may also use it for new user requests, password resets, support requests etc.

To be continued…

Similarly, I have already posted about getting OpenLDAP replication working; this was one tiny part of the project. I will be following up this post with a number of posts specifically tackling the implementation and configuration of the new solution.

Watch this space.

Hopefully the shortest post yet :)

I was previously using GeoScaling (http://www.geoscaling.com/) to provide DNS for my domain names, but they’ve had a few 24hr+ outages in the last year and this has caused havoc (mainly with my e-mail, as sending servers have been unable to resolve/deliver mail to my domain).

I anticipated it was time to cough up some cash for a professional, paid-for service with guaranteed uptime, but stumbled across another free option: CloudFlare (http://www.cloudflare.com). The switch was pretty seamless and (to the best of my knowledge) they’ve had no downtime since I migrated. They have a much larger infrastructure (presumably due to the paid-for services they also offer) and even the free service supports CDN-style caching if you wish to save your web server’s bandwidth.
