Archive for November, 2014


I provisioned one new Windows Server 2012 R2 VM to be used as a Domain Controller and another for VMware Update Manager and Veeam Backup & Replication.

Assign each a static IP address and install all Windows updates (this takes considerable time and numerous reboots).

Domain Controller

Follow the “wizard”. The main thing to note, as previously mentioned, is to follow best practice when choosing a domain name: use a subdomain of an internet-facing domain you own (so, if you own microsoft.com, your internal domain name might be ad.microsoft.com) rather than something.local or something.home. The NETBIOS name can be whatever you like; it is used when you log on as NETBIOS\User rather than user@ad.microsoft.com.

Now you can join the other Windows Server to the domain and configure the identity source in vCenter. This took me a little longer than anticipated: you must log in as administrator@vsphere.local (not root).

Update Manager

  • Install Update Manager (follow the “wizard”)
  • Log in to vCenter (using the vSphere client)
  • Ensure all virtual machines are moved off the host
  • Scan
  • Attach (patch and upgrade baselines)
  • Remediate (check both baselines and check all patches)
  • Repeat for each host

Veeam

  • Install Veeam
  • Connect to vCenter
  • Setup Backup Repository
  • Configure Backups (I stick roughly to the defaults: a weekly full backup with daily incrementals, retaining 14 restore points). *I added the entire datacenter to the job, so new VMs are automatically included as I add them; I can then create a separate datacenter for development machines and/or anything I don’t want included in the nightly backups*

vCenter

  • Deploy vCenter Virtual Appliance
  • Configure static IP address, hostname etc
  • Check for & install updates (this took quite a while and the web interface appeared to hang; be patient)
  • Reboot
  • Launch & complete the setup wizard
  • Login to the web interface
  • Create a datacenter & cluster and add your hosts
  • Create vSphere Distributed Switch
  • *This is the step I’ve often missed, which then causes loss of network connectivity; you then have to connect to the console (IPMI), reset the host networking and start over!* Edit the Distributed Port Group settings and, under “Teaming and failover”, move the uplink port(s) you intend to use to “Active uplinks”
  • Assign physical NICs to vDS
  • Migrate VMkernel network adapters to the vDS
  • *Assign SSL certificate (I’ve yet to do this, and imagine some detail will be required)*

I have some additional hardware to set up, so I will probably try to follow my own guide sometime in the coming weeks. I may add some screenshots if it seems like any detail is missing.

Following on from: https://tickett.wordpress.com/2014/11/24/building-hosting-environment-part-1-hardware/

  • Configure IPMI (either use a static IP or setup a static DHCP lease)
  • Tweak the BIOS (ensure options are optimised for performance rather than to minimise noise etc)
  • Add DNS* entries for your IPMI and ESX Management Interfaces
  • Install ESXi (I did everything without even needing to plug in a monitor/keyboard; IPMI is a life saver)
  • Configure your management interfaces (use the IP addresses you previously configured in DNS, and the domain name you previously selected)

Now you can log in with the vSphere client and configure a few more items:

  • NTP (on the Configuration tab under Software, Time Configuration)
  • Add your datastore (I’m using NFS, so I had to add a VMkernel interface first)
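If you prefer the command line (via SSH or the ESXi shell) to the vSphere client, the NFS datastore can also be mounted with esxcli; the NAS hostname, export path and datastore name below are placeholders:

```
# Mount an NFS export as a datastore (ESXi 5.x syntax); names are placeholders
esxcli storage nfs add -H nas.ad.example.com -s /volume1/datastore1 -v nfs-datastore1

# Confirm the datastore is mounted
esxcli storage nfs list
```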

Until we have our vCenter server up and running we will stick to a single NIC.

*If you don’t yet have a device which provides DNS (router), you can add entries to your hosts file for now.
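For example (the addresses and names here are made up for illustration), the entries in C:\Windows\System32\drivers\etc\hosts (or /etc/hosts) might look like:

```
10.0.0.10   ipmi1.ad.example.com   ipmi1
10.0.0.11   ipmi2.ad.example.com   ipmi2
10.0.0.20   esx1.ad.example.com    esx1
10.0.0.21   esx2.ad.example.com    esx2
```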

*Choosing a domain name: I’ve always gone with something.local or something.home in the past, but suffered as a result. A little research turned up articles suggesting best practice is to use a subdomain of an internet-facing domain you own (http://www.mdmarra.com/2012/11/why-you-shouldnt-use-local-in-your.html). So, if you own microsoft.com, your internal domain name might be ad.microsoft.com. The NETBIOS name can be whatever you like; it is used when you log on as NETBIOS\User rather than user@ad.microsoft.com.

Server/Network Monitoring

The problem is that we are spoilt for choice! My requirements are pretty basic, yet making a decision and wading through all the information/trialling etc was proving quite a challenge!

The feature set I’m looking for is pretty basic:

  • connectivity (probably through ping)
  • network throughput/bandwidth usage
  • website availability
  • SQL Server availability
  • disk space
  • CPU utilisation
  • disk queue length
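Most of these checks are simple enough that rolling my own wouldn’t be crazy; for instance, the connectivity check could be sketched in a few lines of .NET (the host addresses below are made up, and this is just an illustration rather than anything I’ve deployed):

```csharp
using System;
using System.Net.NetworkInformation;

class ConnectivityCheck
{
    static void Main()
    {
        // Hosts to monitor; these addresses are placeholders
        string[] hosts = { "10.0.0.20", "10.0.0.21" };

        using (var ping = new Ping())
        {
            foreach (var host in hosts)
            {
                try
                {
                    // 1000ms timeout; a real monitor would log the result to SQL Server
                    PingReply reply = ping.Send(host, 1000);
                    Console.WriteLine("{0}: {1} ({2}ms)", host, reply.Status, reply.RoundtripTime);
                }
                catch (PingException ex)
                {
                    Console.WriteLine("{0}: failed ({1})", host, ex.Message);
                }
            }
        }
    }
}
```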

The number of monitors/sensors/servers will initially be quite low, but I’m always nervous about purchasing something which will end up costing substantial amounts if/when it needs expanding (I always prefer unlimited licenses!).

And perhaps most importantly, I was really hoping to find something which sits on top of a Microsoft SQL Server backend database. This would make ad-hoc queries much simpler and allow me to pull data into existing SQL Server Reporting Services (SSRS) reports.

  • The top players in the industry appear to be WhatsUp Gold & ipMonitor; both carry a pretty hefty price tag.
  • We then have a few big players with no price tag: Spiceworks & Nagios. I’ve used Spiceworks before and find it too bloated (it does everything OK, but nothing very well… not to mention the adverts). Nagios sounds quite complex to get up and running, although it is a potential contender.
  • ServersAlive looked like an option with a very reasonable price tag. Although it doesn’t directly support a SQL Server backend, it does support logging “state changes” to SQL Server. Unfortunately the software looks incredibly dated; I’m not too sure it’s being actively maintained/developed.

And none of these use SQL Server as a backend database. Maybe I should just build something myself… my requirements are pretty basic… Luckily, before making a purchase or starting to roll my own, I found this page on Wikipedia:

http://en.wikipedia.org/wiki/Comparison_of_network_monitoring_systems

I took a quick look at the website for each tool listed as supporting a MS SQL Server backend database and quickly narrowed the field to NetXMS (http://www.netxms.org/).

I installed the server along with a Linux and Windows agent (fairly effortless) and it’s all looking rather promising. I’ve started by setting up a few basic monitors and will hopefully find time over the coming weeks to:

  • Add additional monitors
  • Set up alerts
  • Poke around in the database (pull some data into SSRS)

I hope to be installing some equipment in a local datacenter to offer some hosting services. First up, the hardware:

  • Ubiquiti Edgerouter Lite
  • Dell 8024 (24x 10GbE Switch)
  • Synology RS3614RPXS NAS (6x WD RED 3TB + 2x Samsung EVO 840 1TB + Intel X540-T2 10GbE NIC)
  • 2x Supermicro AS-2022TG-HIBQRF (each w/ four nodes w/ 64GB RAM & 2x Opteron 6176 + Intel X540-T2 10GbE NIC)

Initially I went for a combination of the Netgear ProSafe XS708E (8x 10GbE switch) paired with a Dell 24x 1GbE switch, but quickly found myself running out of 10GbE ports and was concerned about the lack of redundant power supplies.

Likewise, I had originally chosen the RS3614XS, but felt the additional cost of the RP model (with redundant power supplies) was justified.

And finally the servers themselves: initially the Supermicro AS-1042G-LTF (a single node with four sockets and a single power supply), but I switched to the AS-2022TG-HIBQRF (four nodes, each with two sockets and shared redundant power supplies).

I’ve tried to avoid single points of failure at the component level (redundant power supplies etc) but, without overkill, couldn’t avoid them at the device level (redundant switches, NAS etc).

Supplier wise… I got the switch from http://www.etb-tech.com/ and the NAS from http://elow.co.uk/ (admittedly I had my doubts about both when first placing the orders, as the prices seemed a little cheap, but the service was incredible; both dispatched same day using next-day couriers). The rest came from eBuyer and local suppliers.

Each device is connected to the switch using 2x10GbE LAG/LACP ports (I may go more into the configuration of this later).

VirtualBox High CPU Usage on OSX

I noticed the fan in my MacBook Air (running Yosemite 10.10) has been making a lot of racket lately. VirtualBox appears to be the culprit, even when my Windows 7 guest is idling (showing as 100%+ in Activity Monitor).

After reducing the number of assigned processors to 2 (previously 4), everything seems to be back to normal. I also set the max usage to 75% (although this had no impact when set to 4 processors, so I’m not sure whether it actually helps).
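Both settings can also be changed from the command line while the VM is powered off; “Windows 7” here is just whatever your VM is called:

```
VBoxManage modifyvm "Windows 7" --cpus 2
VBoxManage modifyvm "Windows 7" --cpuexecutioncap 75
```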

Looking forward to the increased battery life more than anything.

For our helpdesk/support/ticketing system we have a screen mounted on the wall with key stats (still a work in progress):

[Screenshot: tel_dash]

90% of the data is pulled from a SQL database (simple stuff!). However, we wanted to pull in the number of unanswered e-mails currently sitting in our support mailbox (circled in red).

Retrieving Data From the Inbox

The first challenge was to find a way to pull the number of e-mails. A small .NET app seemed like the way to go (we could then push the data into SQL and pull it into SSRS).

The first attempt used Microsoft.Office.Interop.Outlook:

var app = new Microsoft.Office.Interop.Outlook.Application();
var ns = app.GetNamespace("MAPI");
ns.Logon(null, null, false, false);
// Resolve the shared support mailbox, then count the items in its inbox
var recipient = ns.CreateRecipient("Tickett Enterprises Support");
recipient.Resolve();
var shared = ns.GetSharedDefaultFolder(recipient, Microsoft.Office.Interop.Outlook.OlDefaultFolders.olFolderInbox);
return shared.Items.Count.ToString();

This was straightforward, but it assumes the application is run on a machine with Outlook installed and configured with the inbox set up etc. As our e-mail is provided by Exchange (Office 365), the recommended approach appeared to be the Exchange Web Services Managed API (EWS). Writing this code took considerably longer: I got stuck troubleshooting an “Autodiscover service couldn’t be located” error, so I decided to point the API directly at the EWS URL, but that failed with a 401 Unauthorized exception.

I’m not 100% sure of the cause, but it seems my Windows credentials were either not being passed or not being recognised/interpreted correctly. Microsoft’s sample code uses

service.UseDefaultCredentials = true;

…but changing this to false fixed the 401. My final code was:

try
{
  ExchangeService service = new ExchangeService();
  service.Credentials = new WebCredentials("user@domain", "password", "");
  service.UseDefaultCredentials = false;
  service.Url = new Uri("https://pod51047.outlook.com/ews/exchange.asmx");
  Mailbox mb = new Mailbox("support@domain");
  return Folder.Bind(service, new FolderId(WellKnownFolderName.Inbox, mb)).TotalCount.ToString();
}
catch (Exception ex)
{
  return ex.Message;
}

A colleague then suggested calling this code directly from SSRS instead of pushing/pulling to/from SQL. So the .NET project was compiled as a class library; the rest should be easy, right?

Loading/Calling the DLL in SSRS

I expected this to be straightforward, but let’s face it: it never is! Fortunately, Google saved the day.

In my development environment (local machine) I had to:

  • Reference my custom dll (in BIDS / Visual Studio, on the report menu, under report properties, references)
  • Copy my custom dll to C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies (This is for Visual Studio 2010, your path may differ slightly)
  • Copy any additional dlls your dll references to C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies (for me, this was Microsoft.Exchange.WebServices.dll)
  • Modify C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\RSPreviewPolicy.config (under each Codegroup node, set the PermissionSetName to FullTrust)
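Rather than setting every existing CodeGroup to FullTrust, a slightly tighter alternative is to add a dedicated CodeGroup for just your assembly; a sketch (the Name, Description and dll filename are placeholders):

```
<CodeGroup class="UnionCodeGroup"
           version="1"
           PermissionSetName="FullTrust"
           Name="MyMailboxAssembly"
           Description="Grants the mailbox-count assembly full trust">
  <IMembershipCondition class="UrlMembershipCondition"
                        version="1"
                        Url="C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\MyMailboxAssembly.dll" />
</CodeGroup>
```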

This had me up and running! I was hoping the same process would hold for deployment to the report server. Think again! It turns out SSRS 2012 only supports .NET 3.5 and earlier for custom assemblies (my code was compiled against .NET 4). Fortunately I was able to recompile my dll targeting .NET 3.5 without any drama.

Then roughly the same process in my production environment (SSRS 2012):

  • Copy my custom dll and dependencies to c:\Program Files\Microsoft SQL Server\MSRS11.MSSQLSERVER\Reporting Services\ReportServer\bin (for SQL Server Reporting Services 2012)
  • Modify c:\Program Files\Microsoft SQL Server\MSRS11.MSSQLSERVER\Reporting Services\ReportServer\rssrvpolicy.config (under each Codegroup node, set the PermissionSetName to FullTrust)
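Once the dll is referenced and trusted, the report itself just calls the public static method from a textbox expression, something like (the namespace/class/method names here are hypothetical):

```
=MyMailboxAssembly.MailboxStats.UnansweredCount()
```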