Tag Archive: SQL

Further to a post over 3 years ago https://tickett.wordpress.com/2011/09/27/updateable-excel-frontend-for-sql-view/ – I have finally given up the search for a ready-made solution and built something myself.

I work with a number of systems which have functionality gaps and/or need a simpler (quicker) interface for data entry. The solution is essentially a SQL view (or stored procedure) with a simple, Excel-like grid front-end. If any data needs to be captured that isn't currently handled by the system, a custom table is created to hold the fields.

My previous Excel-based solution works rather well, but is starting to show its age. I am now beta testing a web-based solution I have built:

[Screenshot: the grid front-end, 2014-12-11]

The application is extremely simple to configure: enter the SQL to retrieve and update the data:

[Screenshot: SQL configuration, 2014-12-11]

And list the columns with some attributes (ReadOnly, Hidden, Title, Width etc.):

[Screenshot: column configuration, 2014-12-11]

And you’re all set!
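To give a flavour of the configuration, the retrieve/update SQL pair might look something like the sketch below. The table and column names are invented for illustration, not taken from my actual config; the `@`-prefixed placeholders stand for the values the grid substitutes in.

```sql
-- Illustrative only: the SELECT behind the grid
SELECT e.ID, e.EVENT_NAME, c.BUDGET, c.NOTES
FROM EVENTS e
LEFT JOIN CUSTOM_DATA c ON c.EVENT_ID = e.ID;

-- Illustrative only: the UPDATE run for each edited row,
-- with the key and new values substituted in by the application
UPDATE CUSTOM_DATA
SET BUDGET = @BUDGET, NOTES = @NOTES
WHERE EVENT_ID = @ID;
```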

Functionality currently includes:

  • Easy configuration
  • Simple/fast data entry (with validation)
  • Column resizing (non-persistent)
  • Sort on any column

Next steps:

  • Test (find and fix bugs)
  • Optimise code
  • Allow parameters in select SQL

I have used this technique over and over; although many other options exist, it is simple and effective, so worth mentioning.

There is one prerequisite: Database Mail must be configured on the SQL Server.
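If Database Mail isn't already set up, a minimal configuration looks roughly like the sketch below. The account, e-mail address and SMTP server names are illustrative; the profile name matches the `'Test'` profile used later in this post.

```sql
-- Enable the Database Mail extended stored procedures
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'Database Mail XPs', 1; RECONFIGURE;

-- Create an account, a profile, and link the two
EXEC msdb.dbo.sysmail_add_account_sp
    @account_name = 'Test Account',
    @email_address = 'reports@example.com',   -- illustrative
    @mailserver_name = 'smtp.example.com';    -- illustrative
EXEC msdb.dbo.sysmail_add_profile_sp
    @profile_name = 'Test';
EXEC msdb.dbo.sysmail_add_profileaccount_sp
    @profile_name = 'Test',
    @account_name = 'Test Account',
    @sequence_number = 1;
```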

The first step is to build your query. Let's say:

select blah from some_table where exception_condition = 'yes'

Once we’re happy with our query we can build it into the sp_send_dbmail command:

EXEC msdb.dbo.sp_send_dbmail
@profile_name = 'Test',
@recipients = 'blah@example.com',
@query = 'select blah from some_table where exception_condition = ''yes''',
@subject = 'EBMS Exception Report',
@attach_query_result_as_file = 1;

Notice: you will have to replace any single quotes in your query with two single quotes. You will likely notice the output is a bit misaligned:

This is because SQL Server automatically pads each field to the length in the table definition. I normally truncate each field to no more than about 30 characters, and we're in business. If you don't want to keep running the sp_send_dbmail procedure, opening the e-mail, tweaking the query and re-running, you can run the SQL query in Management Studio with the "Results to Text" option enabled:
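For example, the truncation might look like this (the column names are invented for illustration; the table and condition come from the query above):

```sql
-- Cap each wide text column at ~30 characters so the plain-text
-- e-mail output lines up (column names illustrative)
SELECT LEFT(customer_name, 30) AS customer_name,
       LEFT(order_ref, 30) AS order_ref
FROM some_table
WHERE exception_condition = 'yes';
```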

After a bit of tweaking you should have it looking how you want:

Once you have tested this and are happy, you can create a SQL Agent job with a single T-SQL step containing the above code. Attach a schedule and you're good to go!
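If you prefer to script the job rather than click through the Agent UI, a sketch along these lines should work (the job name and schedule are illustrative, and the step command is the sp_send_dbmail call from earlier):

```sql
USE msdb;
EXEC dbo.sp_add_job @job_name = N'EBMS Exception Report';
EXEC dbo.sp_add_jobstep
    @job_name = N'EBMS Exception Report',
    @step_name = N'Send report',
    @subsystem = N'TSQL',
    @command = N'EXEC msdb.dbo.sp_send_dbmail ...';  -- paste the full call here
EXEC dbo.sp_add_jobschedule
    @job_name = N'EBMS Exception Report',
    @name = N'Daily 8am',
    @freq_type = 4,            -- daily
    @freq_interval = 1,
    @active_start_time = 080000;
EXEC dbo.sp_add_jobserver @job_name = N'EBMS Exception Report';
```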

If you receive an error when trying to execute the job, double-check that the SQL Server Agent account has adequate permissions on the msdb database and the databases involved in your query.


It has been far too long since I blogged. Work has been keeping me busy seven days a week, but here's a quick post about a recent challenge.

I have been working on several data migration projects and have been struggling in general with sluggish SQL query performance. Most of my research suggested the primary bottleneck was disk IO (rather than throughput or CPU speed), so I decided to try and find the fastest "reasonably priced" SSD to give me a boost. I found this list http://www.storagesearch.com/ssd-fastest.html and, after researching each of the items (for availability and cost), I chose the OCZ RevoDrive3 X2 PCI-e SSD.

I ordered the SSD and installed it in one of my spare servers (SuperMicro X8SIL-F, Intel X3470, 16GB Kingston RAM). Despite not being listed on the OCZ compatibility list http://www.ocztechnology.com/displaypage.php?name=revodrive_moboguide, the SSD was detected during POST. Again, despite being listed as only compatible with Windows 7, the SSD was detected in Windows 2008 R2 and, after installing the driver and initialising the disk in Server Manager, appeared in "My Computer".

Unfortunately, after installing SQL Server 2008 R2 and loading a few databases up, the machine mysteriously restarted and the drive was no longer listed in "My Computer" or Device Manager. After further reboots the SSD stopped showing during POST (various green and blue LEDs were still lit, so it wasn't completely dead). I tried the card in an old Dell 9150 I had kicking around: nothing. Then a spare HP MicroServer N36L: again, nothing.

Fairly convinced now that the SSD had died I called eBuyer up and started the RMA ball rolling along with placing an order for a new card (this was Thursday night and fortunately they were able to guarantee Saturday delivery). I also ordered a motherboard from the compatibility list- ASRock Z68 Extreme4 Gen3, an Intel i7 Quad-Core 2600 3.4GHz Sandy Bridge processor, and 16GB DDR3 RAM along with a no-name case all from Amazon (again, with the guarantee of Saturday delivery).

I put everything together and fired it up:

One of the first things I noticed was the different number of chips present on each of the SSD boards (compare this to the one at the top of the post). Regardless, the replacement was working and still going strong after 24 hours of solid use.

Keeping an eye on various counters in Windows, I've seen the card reach over 1000MB/sec: impressive! The disk no longer seems to be the bottleneck; it now appears to be the CPU. Unfortunately, it looks like some queries only utilise a single processor core, so the CPU is actually only 12.5% utilised.

That’s all for now! It looks like I was just very unfortunate with the first card but the replacement is blindingly fast and a great price (especially when compared with the competition).


Updateable Excel Frontend for SQL View

I am currently working on a Web UI: https://tickett.wordpress.com/2014/12/11/excel-like-web-frontend-to-sql-viewtablestored-procedure/

We often have a need to view/manipulate/analyse data from systems in Excel (because, let's face it, it's quick and easy). However, it is sometimes the case that we need to add data (it may be commentary, budget figures or any manner of data that for one reason or another cannot be keyed into the underlying system(s)). I hoped Excel's built-in functionality could cater for this… Unfortunately, while it's very easy to set a data source to pull data from SQL, it doesn't appear to have any functionality for pushing data back into the underlying database.

So this is what I came up with…

Step 1: The data is retrieved using VBA (I have a master sheet which is pre-formatted). The “template sheet” is then copied and used as a starting point each time the data is retrieved:

Sub RetreiveSheet()
    On Error GoTo ErrHandler
    Dim cnt As ADODB.Connection
    Dim rst As ADODB.Recordset
    Dim stSQL As String
    Dim i As Long, r As Long

    Const stADO As String = "Provider=SQLOLEDB.1;User ID=User;Password=Password;Persist Security Info=False;" & _
        "Initial Catalog=EBMS1920;Data Source=sql-01"

    ' The full SELECT was omitted from the original post; the view name
    ' comes from the description below (FORECAST_SPREADSHEET)
    stSQL = "SELECT * FROM FORECAST_SPREADSHEET"
    stSQL = stSQL & vbCrLf & "ORDER BY [Date] ASC"

    Set cnt = New ADODB.Connection
    With cnt
        .CursorLocation = adUseClient
        .CommandTimeout = 0
        .Open stADO
        Set rst = .Execute(stSQL)
    End With

    Application.EnableEvents = False

    ' Write the column headings
    For i = 0 To rst.Fields.Count - 1
        ActiveSheet.Cells(1, i + 1) = rst.Fields(i).Name
    Next i

    ' Write the data rows
    Application.ScreenUpdating = False
    For r = 1 To rst.RecordCount
        For i = 0 To rst.Fields.Count - 1
            ActiveSheet.Cells(r + 1, i + 1) = rst.Fields(i).Value
        Next i
        rst.MoveNext
    Next r
    Application.ScreenUpdating = True

    Set rst = Nothing
    Set cnt = Nothing

    ActiveSheet.Range("B1000").Value = "Last Retrieved: " & DateTime.Now()

    Application.EnableEvents = True
    ActiveSheet.Protect UserInterfaceOnly:=True

    Exit Sub

ErrHandler:
    Application.EnableEvents = True
    Application.ScreenUpdating = True
    MsgBox ("An error occurred")
End Sub

Here is an example with some made up data:

The gray cells indicate cells that are pulling from the system and CANNOT be updated. The pink cells are pulling from my custom table and CAN be updated, and the yellow cells are summary fields (a PivotTable). All cells are locked except those in pink, and the sheet is protected.

The SQL query is pulling from FORECAST_SPREADSHEET which is a view joining tables from the underlying system with my custom table FORECAST_DATA.

Now the clever bit… allowing updates to write back to the database!

Step 2: We create a trigger on our view:
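The trigger itself was shown as a screenshot in the original post, so the sketch below is a reconstruction of what an INSTEAD OF UPDATE trigger on the view might look like. ORG_CODE and ID are the keys used by the worksheet code; the BUDGET column is purely illustrative.

```sql
-- Sketch only: redirect updates against the view at the custom table
CREATE TRIGGER TR_FORECAST_SPREADSHEET_UPD
ON FORECAST_SPREADSHEET
INSTEAD OF UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE fd
    SET fd.BUDGET = i.BUDGET        -- repeat for each updateable column
    FROM FORECAST_DATA fd
    INNER JOIN inserted i
        ON i.ORG_CODE = fd.ORG_CODE
       AND i.ID = fd.ID;
END
```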


Step 3: We add a routine to the “worksheet_change” event:

Private Sub Worksheet_Change(ByVal Target As Range)
    If Application.ScreenUpdating = False Then Exit Sub
    If Target.Value <> "" And IsNumeric(Target.Value) = False Then
        Target.Value = ""
        MsgBox ("Value must be numeric!")
        Exit Sub
    End If

    Dim updateSQL As String
    Dim selectSQL As String
    Dim orgCode As String
    Dim eventID As String
    Dim columnName As String
    Dim newValue As String
    Dim i As Long

    orgCode = Range("A" & Target.Row).Value
    If orgCode = "" Then Exit Sub
    eventID = Range("D" & Target.Row).Value
    columnName = Cells(1, Target.Column).Value
    newValue = IIf(Target.Value = "", "NULL", Target.Value)

    updateSQL = "UPDATE FORECAST_SPREADSHEET SET " & columnName & " = " & newValue & _
        " WHERE ORG_CODE = '" & orgCode & "' AND ID = " & eventID
    ' Re-select the whole row so any calculated fields are refreshed
    ' (this definition was missing from the original post)
    selectSQL = "SELECT * FROM FORECAST_SPREADSHEET" & _
        " WHERE ORG_CODE = '" & orgCode & "' AND ID = " & eventID

    Dim cnt As ADODB.Connection
    Dim rst As ADODB.Recordset

    Const stADO As String = "Provider=SQLOLEDB.1;User ID=User;Password=Password;Persist Security Info=False;" & _
        "Initial Catalog=EBMS1920;Data Source=sql-01"

    Set cnt = New ADODB.Connection
    With cnt
        .CursorLocation = adUseClient
        .CommandTimeout = 0
        .Open stADO
        .Execute updateSQL
        Set rst = .Execute(selectSQL)
    End With

    Application.EnableEvents = False
    For i = 0 To rst.Fields.Count - 1
        ActiveSheet.Cells(Target.Row, i + 1) = rst.Fields(i).Value
    Next i
    Application.EnableEvents = True

    Set rst = Nothing
    Set cnt = Nothing
End Sub

You will notice in the code used to retrieve the data earlier that I set Application.EnableEvents = False to prevent this event being fired when my code is executing (I only want it to fire when the user is keying in data).

We’re done! RetreiveSheet() can be executed to retrieve the data, and keying into the pink cells will update the custom SQL table in real time! I chose to retrieve the latest copy of an entire row every time an update is committed to the database.

Sidenote: One trick I learnt whilst building this solution was how to avoid cell formula references being automatically updated (broken) when the sheet to which they refer is no longer valid (deleted). We do this using the INDIRECT() function. So =SUM('26_201109'!covers) becomes =SUM(INDIRECT("'26_201109'!covers"))


New Virtual Lab – Part 2

…continued from https://tickett.wordpress.com/2011/08/24/new-virtual-lab-part-1/

So, in came the first set of bits for new server #1 and I began piecing it together…

Issue #1: The motherboard doesn’t sit quite right on the spacers/chassis screws (because of the element of the CPU cooler which sits on the underside of the motherboard). Not really a problem; I just added a few washers (I expect I might have been able to find some slightly larger spacers too, if I’d looked hard enough).

Issue #2: My USB pen drive didn’t fit in the internal slot with the chassis together. Not to worry: I simply attached a header to the spare pins and plugged the USB stick into one of those ports, still inside the case.

Issue #3: When I powered up the machine it was pretty loud. I checked, and believe this is because the Akasa cooler (AK-CCE-7107BS) only has a 3-pin header, so it doesn’t support pulse-width modulation (PWM) and effectively runs at full speed all of the time! Fortunately the other cooler (Gelid Slim Silence iPlus low-profile Intel CPU cooler) had the correct 4-pin connector and, when hooked up, supported PWM and ran nice and quiet!

Issue #4: I intend to run the server “headless”, so one of the great features of the X8SIL-F motherboard is the on-board IPMI. Unfortunately, when I tried to connect with the default username/password “ADMIN”/“ADMIN”, access was denied. I downloaded a copy of the latest firmware from the Supermicro site and flashed it using:

dpupdate.exe -f SMT_SX_250.bin -r n

The -r n parameter tells it to reset to factory settings. Voila: I could now log in via the web interface or the Windows IPMI tool using the default credentials (“ADMIN”/“ADMIN”).

ESXi 4.1 installed like a charm, but I’ve had a little trouble trying to deploy my first virtual machine (the ESXi management network losing connectivity and/or the hypervisor crashing). I think this might be because I’m using old, knackered network cables! I will replace them and hopefully post an update tomorrow in Part 3.

The second of these servers is en route and should hopefully be delivered before the weekend.

My next question is: what virtual machines should I configure?

  • I definitely need a vCenter Server
  • I definitely need at least one SQL Server (possibly two or three, as I’d like to experiment with replication)
  • I definitely need at least one web server (IIS)
  • I definitely want to get trixbox back up and running
  • I am considering experimenting with pfSense, or possibly Untangle
  • I also need a local DNS server, but think that might best sit on physical hardware, or I’ll have problems with the hypervisor being unable to query DNS before the VM has started
  • My fileserver currently runs WHS2011, so I would also like a WHS2011 VM to test the “Drive Extender” replacements on (however, I realise I can’t really test performance here, so might have to give that a miss)
  • Also, I think OS X Server might run on ESXi, and I’d quite like to have a proper (non-hacked) Time Machine backup store configured, so this might be the right route to go down…


SQL Problem

I have written quite a few feeds/interfaces over the last few years which I’m now trying to “tighten up”. There’s one in particular which I’ve been struggling with. This is essentially all it does:

– Download and rename a bunch of files from an FTP
– Manipulate these files a little
– Execute a 3rd party application to import the files

There are 2 problems:

– No applications can be run on the SQL server
– The 3rd party application cannot import files from a network location (so they must be copied to a local area, imported, then the output files copied back to the network location)

This particular feed is a SQL Server Integration Services (SSIS) package. The package needs to be executed on demand and passed some variables.

Here’s how I achieved my goal (I don’t remember the details of all the problems I encountered trying to get this working, but there were a lot!):

– Created an .asp page which collects the variables and executes a stored procedure
– The stored procedure:
— creates a sql job with the SSIS package and relevant variables
— executes the sql job and waits for it to complete

 -- Poll until the job is no longer executing (status 1 = executing)
 WHILE EXISTS (
     SELECT * FROM OPENROWSET('sqloledb', 'server=servername;trusted_connection=yes',
         'exec msdb.dbo.sp_help_job @job_name = N''job name''')
     WHERE current_execution_status = 1)
     WAITFOR DELAY '00:00:02'

— deletes the sql job
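Sketched in T-SQL, the stored procedure's job lifecycle looks roughly like the block below. The job name, package path and variable name are hypothetical; the real procedure also passes in the variables collected by the .asp page.

```sql
-- Hypothetical sketch: create, run and tidy up a throwaway Agent job
DECLARE @job NVARCHAR(128) = N'RunFeed_' + CONVERT(NVARCHAR(36), NEWID());

EXEC msdb.dbo.sp_add_job @job_name = @job;
EXEC msdb.dbo.sp_add_jobstep
    @job_name = @job,
    @step_name = N'Run package',
    @subsystem = N'SSIS',
    @command = N'/FILE "\\server\share\feed.dtsx" /SET "\Package.Variables[User::FileName].Value";"somefile.csv"';
EXEC msdb.dbo.sp_add_jobserver @job_name = @job;
EXEC msdb.dbo.sp_start_job @job_name = @job;

-- ...poll sp_help_job until current_execution_status is no longer 1...

EXEC msdb.dbo.sp_delete_job @job_name = @job;
```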

The SSIS package contains a script task which takes care of the “copy files to local area”, “import files” and “copy files back to network location” steps. One of the tricky bits here was getting my code to wait for the remote process to complete:

' Launch the process on the remote machine via WMI
Dim path As New System.Management.ManagementPath("\\" & strMachineName & "\root\cimv2:Win32_Process")
Dim scope As New System.Management.ManagementScope(path)
Dim opt As New System.Management.ObjectGetOptions()
Dim classInstance As New System.Management.ManagementClass(scope, path, opt)
Dim inParams As System.Management.ManagementBaseObject = classInstance.GetMethodParameters("Create")
inParams("CommandLine") = strCommand
Dim outParams As System.Management.ManagementBaseObject = classInstance.InvokeMethod("Create", inParams, Nothing)

' Poll until the remote process exits (GetProcessById throws once it has gone)
Try
    Do While Process.GetProcessById(CInt(outParams("processId")), strMachineName).ProcessName.Length > 0
        System.Threading.Thread.Sleep(1000)
    Loop
Catch ex As Exception
    ' The process no longer exists: the remote import has finished
End Try

