Monday, 27 August 2012

Update TextBox using AppendText method in real time for WPF Applications

A few days ago I was working on an application that imports several different MS Dynamics CRM 2011 solutions to various environments. I had designed a simple form containing a big text box, which would be updated with progress, and a couple of buttons. The problem I had was that all the progress information appeared only at the end of the import process, which was completely useless, as I wanted the progress information displayed in real time.

After a lot of searching I hit upon the cause, which, without going into too much detail, is related to threads. In essence, the GUI thread is waiting for the result of an operation and is blocked until that operation finishes, which means that nothing is displayed. You have probably seen this when a window becomes unresponsive while the application is processing a long(-ish) running task. The way around this problem is to use the Dispatcher.

The code below shows a simple example of how this is accomplished, where Import_Click is the click event handler for a button named Import (XAML not displayed for simplicity).

The key is the WriteMessage method, which uses the dispatcher to invoke the AppendText method on the passed-in TextBox using a lambda expression. The rest of the code is, hopefully, fairly self-explanatory.

 public partial class MainWindow : Window
 {
     private readonly BackgroundWorker import = new BackgroundWorker();

     public MainWindow()
     {
         InitializeComponent();
         import.DoWork += new DoWorkEventHandler(import_DoWork);
     }

     // Runs on the background thread, off the UI thread
     private void import_DoWork(object sender, DoWorkEventArgs e) { Import(); }

     private void Import_Click(object sender, RoutedEventArgs e) { import.RunWorkerAsync(); }

     public void Import()
     {
         WriteMessage(MessagetTextBox, string.Format("Import started @ {0}.{1}", DateTime.Now, Environment.NewLine));
         // ... the actual import work goes here ...
         WriteMessage(MessagetTextBox, string.Format("Import finished @ {0}.{1}", DateTime.Now, Environment.NewLine));
     }

     // Marshals the AppendText call back onto the UI (dispatcher) thread
     private void WriteMessage(TextBox MessageBox, string Message)
     {
         Dispatcher.Invoke((Action)(() => MessageBox.AppendText(Message)));
     }
 }
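For completeness, a minimal XAML sketch that would go with the code above (the window class name and layout here are assumptions, not from the original project):

```xml
<!-- Assumed XAML sketch: a TextBox named MessagetTextBox plus the Import button -->
<Window x:Class="ImporterApp.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Title="Importer" Height="350" Width="525">
    <DockPanel>
        <Button DockPanel.Dock="Bottom" Content="Import" Click="Import_Click" />
        <TextBox Name="MessagetTextBox" IsReadOnly="True"
                 VerticalScrollBarVisibility="Auto" />
    </DockPanel>
</Window>
```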

Sunday, 19 August 2012

Installing and using Pen Load balancing software in RHEL/CentOS 6.x

Last week I was asked to provide alternatives to NLB, as we seem to be having problems getting delivery of a couple of switches for our test environment, or something like that. At any rate, NLB does not work too well with ESXi in our environment for various reasons, so I remembered PEN, which I had used in a development environment ages ago.

You can compile it from source if you want to, but there is an already compiled rpm, which can be downloaded from here (this is the EPEL repository for CentOS 5).

Installing it is a simple case of using yum:
yum install -y pen
At this point you can start load balancing with pen like this:

 /usr/bin/pen -l pen8080.log 8080

This will distribute traffic arriving at this server on port 8080 to port 8080 on .82 and .83 with sticky sessions. If you want round robin, stick a -r in, like this:

/usr/bin/pen -l pen8080.log -r 8080
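For reference, pen takes the listening port first, followed by the backend servers. A full invocation might look like this (the backend addresses below are placeholders, since the post only shows the .82/.83 suffixes):

```shell
# Listen on 8080 and balance across two backends (addresses are placeholders)
/usr/bin/pen -l pen8080.log 8080 192.168.0.82:8080 192.168.0.83:8080
```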

The only downside of using pen like this is that if the box goes down for any reason, so does pen, which means that we need a start-up script. I named the file /etc/init.d/penlb8080. The file name should match the servicename variable in the script:
#!/bin/bash
# Pen Starting Script
# chkconfig: 345 93 92
# Source function library
. /etc/init.d/functions

# The file name of this script should match $servicename
servicename=penlb8080
pen=/usr/bin/pen
PIDFILE=/var/run/$servicename.pid
lockfile=/var/lock/subsys/$servicename

start() {
    echo -n $"Starting $servicename: "
    # backend servers go at the end, as in the pen commands above
    $pen -l pen8080.log -p $PIDFILE 8080
    RETURNVALUE=$?
    [ $RETURNVALUE = 0 ] && touch $lockfile
    echo
}

stop() {
    echo -n $"Stopping $servicename: "
    kill -9 `cat $PIDFILE`
    RETURNVALUE=$?
    [ $RETURNVALUE = 0 ] && rm -f $lockfile
    echo
}

case "$1" in
    start) start ;;
    stop) stop ;;
    restart) stop; start ;;
    status) status $pen ;;
    *) echo "Usage: $servicename {start|stop|restart|status}"
       exit 1 ;;
esac

exit $?
Make the script executable:
chmod +x /etc/init.d/penlb8080
Add to list of services controlled by chkconfig:
chkconfig --add penlb8080
Start this pen load balancer instance with:
service penlb8080 start
If you also wanted to run a second pen instance you could use the same script as above, with certain modifications. Say you wanted to run a second load balancer on port 80: all you need to change are the servicename variable, the port, and the log and PID file names, as well as the script name (penlb80):
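A sketch of the changed values for that second instance (the exact names are assumptions that follow the pattern of the 8080 script):

```shell
# /etc/init.d/penlb80 - values changed from the 8080 script
servicename=penlb80
PIDFILE=/var/run/$servicename.pid
lockfile=/var/lock/subsys/$servicename
# and in start(), listen on port 80 with its own log:
# $pen -l pen80.log -p $PIDFILE 80
```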


Wednesday, 15 August 2012

Using Wix's harvest tool Heat to generate list of files from websites and directories

A while ago I was asked to write an installer for a project I'm working on and I immediately thought of Wix.

There was one problem though: thus far I had only used Wix for simple projects without that many files. Not this time; there were multiple websites and multiple projects, and I wasn't going to list 30-odd files by hand, which is where Heat came in.

In order to generate a wxs file containing all files in a website this is the command to use:
Heat website MyWebsite -gg -cg MyWebsite -o MyWebsite.wxs
where website tells Heat that it's harvesting a website
MyWebsite is the name of the website in IIS
-gg means that GUIDs will be generated now
-cg MyWebsite will generate a component group called MyWebsite
-o MyWebsite.wxs will output to MyWebsite.wxs

Note that in a default installation Heat can be found in this directory:

C:\Program Files\Windows Installer XML v3.5\bin\

In order to run the above command the website needs to exist, which normally means that it has been deployed from Visual Studio.

Heat is not limited to harvesting websites though, it can also harvest directories, like this:
Heat dir "C:\solutions\" -gg -cg solutions -out solutions.wxs
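Once harvested, each generated component group can be pulled into the main product file with a ComponentGroupRef. A minimal sketch (the Feature attributes here are assumptions):

```xml
<Feature Id="ProductFeature" Title="MyProduct" Level="1">
  <!-- component groups created by heat's -cg flag -->
  <ComponentGroupRef Id="MyWebsite" />
  <ComponentGroupRef Id="solutions" />
</Feature>
```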
Hope this helps in making the Wix experience easier.

Tuesday, 14 August 2012

The Thread was being aborted

A few weeks ago we started having an issue with a batch process. To be precise, it's a Windows service that calls a web method, which in turn calls MS Dynamics CRM to generate a couple of files from data stored in the Dynamics CRM database. At any rate, all of a sudden it stopped working properly: it would fail at some point reasonably early on in the process, probably after about 1000 records.

Annoyingly, this coincided with a new release, so naturally all suspicions fell on the build, despite my protestations that just because two events occur one after the other, it does not mean that the first caused the second.

At any rate, after sneakily modifying the code in production to add as much trace logging as we could [Cowboys'r's], we found out, to our surprise, that it was failing while updating the records in Dynamics CRM. Unfortunately, the error was not very helpful, as it seemed to vary between:

Unable to connect to the remote server


The thread was being aborted

After a lot of thinking and googling we thought we had come up with the answer: increase the number of connections.


It seems that when a client makes an authenticated call, a connection is opened and then closed, and since we were making loads of connections while processing our data, this seemed like the most likely culprit: the available connections get used up.

In order to increase the number of connections available, a registry hack and a reboot are needed:
  1. Open the registry editor (Start | Run | regedit).
  2. Navigate to this key: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters.
  3. Set the TcpTimedWaitDelay value to 1388. (You might need to create this value; it's a DWORD.)
  4. Set the MaxUserPort value to something high; anything above 1388 and below 10000 ought to do it. (You might need to create this value; it's a DWORD.)
  5. Reboot.
We did this on one of our servers and in the morning I was extremely disappointed to see that the process had failed, but encouraged that it had processed about 2500 records.

We ran the process manually a couple of times and it was failing at roughly the same point, around 2500 records (by this point we had built up a significant backlog of data), always with the same error:

The thread was being aborted

And then it hit me: it was timing out, so a simple fix was in order, i.e. increase the timeout in the web.config file of the web service.
<httpRuntime executionTimeout="300" />
I think that the default timeout is 90 seconds, and that was clearly not enough to go through the 5000 records it can process daily (yep, it's limited to 5000 records by MS Dynamics CRM; nope, I did not write the code).
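For context, the setting sits under system.web. Note that executionTimeout is measured in seconds and is only honoured when compilation is not in debug mode:

```xml
<configuration>
  <system.web>
    <!-- executionTimeout (seconds) only applies when debug="false" -->
    <httpRuntime executionTimeout="300" />
    <compilation debug="false" />
  </system.web>
</configuration>
```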