Wednesday, 30 November 2011

Install Scripts

At work nobody seems to bat an eyelid when presented with a 50 or 60 page document to install an application (this is a bit unfair; we seem to be doing better than before, when 100+ page documents were the norm). The last project I was involved in had a document of almost 70 pages describing how to install the application. Admittedly, it does contain a good number of pages on how to un-install the application; say, for the sake of argument, or because I've just checked, that the un-install part is 8 pages. That still leaves us with 60+ pages of documentation. Some of it is due to corporate templates, so remove another 5 pages and you are still left with a mammoth 55+ pages. Why?

Well, for starters Wix or something similar was not used, so there are a lot of steps documenting how to create and configure a website in IIS.
Secondly, some processes were not found to be reliable when automated. The solution? Add a myriad of manual steps instead. Heaven forbid that we should spend some time trying to understand the issues.
Finally, it covers two types of configuration, as the customer wasn't quite certain what they wanted and wanted the flexibility to switch between the two.

This type of install is never going to be a simple double-click-an-msi-and-click-next-until-it-installs affair, but it could be made so much simpler than it is by following a few simple rules:
  1. Use Wix. I'm partial to Wix even if it does have a very steep learning curve (there is a small sketch of what this looks like after this list).
  2. Script everything.
  3. If you can't script it, go back to rule 2. If you've already been to rule 2 a few times, then go to rule 4.
  4. Write an application that automates the install.
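As an illustration of rule 1, a fragment along these lines replaces pages of manual IIS instructions. This is only a rough sketch, assuming WiX 3.x with its IIS extension; the IDs and names are placeholders rather than anything from the real install:

<!-- goes inside a <Component>; requires xmlns:iis="http://schemas.microsoft.com/wix/IIsExtension" on the <Wix> element -->
<iis:WebSite Id="MyWebSite" Description="My Application" Directory="INSTALLDIR" AutoStart="yes" ConfigureIfExists="yes">
  <iis:WebAddress Id="AllUnassigned" Port="80" />
</iis:WebSite>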

Tuesday, 29 November 2011

The joys of SSL - A simple SSL Server

Is there anything that OpenSSL cannot do?

In this post, I showed how it can be used to test cipher levels. Today I'll show how it can be used as a simple SSL server to test... an SSL client, I guess. Since I already have a CA, I'm going to use that CA, which means that I generate a certificate request using the following command:
openssl req -new -newkey rsa:1024 -nodes -subj '/CN=RHEL6Blade/O=dev.com/C=UK/ST=Yorkshire/L=Here' -keyout key.pem -out myreq.pem
I submit the certificate request to said CA and come back to my Linux box with two files:
  • the certificate file (ensure that this was exported as a Base64 certificate) - redhat.cer
  • the PKCS#7 file, i.e. the certificate chain - redhat.p7b
I concatenate the key and the certificate with the following command:
cat key.pem redhat.cer > key-cert.pem
I then convert the PKCS#7 chain into a PEM file with the following command:
openssl pkcs7 -print_certs -in redhat.p7b -out test.pem
Finally, I can run the SSL server with the following command:
openssl s_server -cert key-cert.pem  -www -debug -CAfile test.pem
From a Windows machine running IE 7, using the hostname only:


If I use the fully qualified domain name, I get this error, as the URL and the certificate name don't match:


If I omit the -CAfile option from the server, i.e.:
openssl s_server -cert key-cert.pem -www -debug
I get the same error even if I use the hostname. This is because the CA is not trusted (if the CA is trusted then there is no need to add the -CAfile option):
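You can also test from the command line rather than a browser. By default s_server listens on port 4433 (you can change this with -accept), so, assuming the hostname resolves, something like this from another box should exercise it (expect verification errors unless the client is also given the CA certificate via -CAfile):
openssl s_client -connect RHEL6Blade:4433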

Monday, 28 November 2011

Merge Multiple PDF files using the iTextSharp library

I had a bit of an issue today. I'd been given an e-book in multiple PDF files, and when I say multiple, I mean multiple: one per chapter of a twenty-six-chapter book, plus a couple of appendices, the index and the table of contents; all in all, over 30 files.

I did a bit of searching trying to see whether I could do it with OpenOffice, and a link took me to the iText library, a free Java/C# library that can read, parse and do all manner of things with PDF files.

I know that there are a myriad of applications that join, merge, etc. PDF files, not least Adobe Acrobat, which you can use for free for 30 days, but sometimes it's fun to tinker a little bit.

I found this sample, but it looks like it's from 2006 and the code does not work with the current version (5.1.2.0). Somebody posted an example that looked more promising, but alas it did not work either, so I modified it to work with the current library. I also modified it so that it would join any number of files.

Here is the code (hopefully it is self-explanatory):

using System;
using System.Collections.Generic;
using System.IO;
using System.Windows.Forms;
using iTextSharp.text;
using iTextSharp.text.pdf;

public static void MergePdfFiles(string destinationfile, List<string> files)
{
    Document document = null;
    List<PdfReader> readers = new List<PdfReader>();

    try
    {
        // Open a reader for each source file, in the order they were passed in.
        foreach (string file in files)
        {
            readers.Add(new PdfReader(file));
        }

        // Size the output document from the first page of the first file.
        document = new Document(readers[0].GetPageSizeWithRotation(1));

        PdfWriter writer = PdfWriter.GetInstance(document, new FileStream(destinationfile, FileMode.Create));

        document.Open();

        // Copy every page of every source file into the output document.
        foreach (PdfReader reader in readers)
        {
            WritePage(reader, document, writer);
        }
    }
    catch (Exception)
    {
        MessageBox.Show("An error occurred");
    }
    finally
    {
        if (document != null)
        {
            document.Close();
        }
        foreach (PdfReader reader in readers)
        {
            reader.Close();
        }
    }
}

private static void WritePage(PdfReader reader, Document document, PdfWriter writer)
{
    try
    {
        PdfContentByte cb = writer.DirectContent;
        PdfImportedPage page;

        for (int i = 1; i <= reader.NumberOfPages; i++)
        {
            // Match the output page size and orientation to the source page.
            document.SetPageSize(reader.GetPageSizeWithRotation(i));
            document.NewPage();

            page = writer.GetImportedPage(reader, i);

            // Rotated pages need a transform so they are not drawn sideways.
            int rotation = reader.GetPageRotation(i);
            if (rotation == 90 || rotation == 270)
            {
                cb.AddTemplate(page, 0, -1f, 1f, 0, 0, reader.GetPageSizeWithRotation(i).Height);
            }
            else
            {
                cb.AddTemplate(page, 1f, 0, 0, 1f, 0, 0);
            }
        }
    }
    catch (Exception)
    {
        MessageBox.Show("An error occurred");
    }
}

The code will join the PDF files in the order that the files list stores them, so it would be possible to sort the files using the List<T>.Sort method, if you need to, before calling MergePdfFiles.
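For example, something like this (the folder path here is made up purely for illustration, and it assumes using System.IO and System.Linq):

List<string> files = Directory.GetFiles(@"C:\ebook", "*.pdf").ToList();
files.Sort(StringComparer.OrdinalIgnoreCase); // sort the chapters alphabetically before merging
MergePdfFiles(@"C:\ebook\merged.pdf", files);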

Saturday, 26 November 2011

Disabling Low ciphers in IIS 6.0

I discussed yesterday that we had done some security hardening for our IIS 6.0 servers. In essence, we have disabled Low ciphers, i.e. those with a key length shorter than 128 bits, as well as SSL 2.0.
For some annoying reason you have to actually disable the ciphers by creating the Enabled DWORD value and setting it to 0. If it's not there, the system will assume that it's OK to use the cipher.

You can copy and paste the registry entries below into a file, save it with a .reg extension and then simply double-click on it to apply it to your server.
Windows Registry Editor Version 5.00 
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers\DES 56/56]
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers\RC2 40/128]
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers\RC4 40/128]
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers\RC4 56/128]
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Server]
"Enabled"=dword:00000000

Friday, 25 November 2011

The joys of SSL - Cipher Levels

A few weeks ago we were tasked with ensuring that our IIS 6.0 server only allowed Medium and High ciphers, so in order to test which ciphers IIS accepted I used openssl from a Linux box in our dev environment, like this:
openssl s_client -connect 10.168.20.109:443 -cipher NULL
CONNECTED(00000003)
140587511035720:error:140790E5:SSL routines:SSL23_WRITE:ssl handshake failure:s23_lib.c:184:
---
no peer certificate available
---
No client certificate CA names sent
---
SSL handshake has read 0 bytes and written 61 bytes
---
New, (NONE), Cipher is (NONE)
Secure Renegotiation IS NOT supported
Compression: NONE
Expansion: NONE
---

Cool, we don't accept NULL ciphers.

Let's try now with low ciphers:

openssl s_client -connect 10.168.20.109:443 -cipher LOW
CONNECTED(00000003)
depth=0 C = US, ST = York, L = York, O = York, OU = York, CN = crmdevbox.dev.com
verify error:num=20:unable to get local issuer certificate
verify return:1
depth=0 C = US, ST = York, L = York, O = York, OU = York, CN = crmdevbox.dev.com
verify error:num=27:certificate not trusted
verify return:1
depth=0 C = US, ST = York, L = York, O = York, OU = York, CN = crmdevbox.dev.com
verify error:num=21:unable to verify the first certificate
verify return:1
---
Certificate chain
 0 s:/C=US/ST=York/L=York/O=York/OU=York/CN=crmdevbox.dev.com
   i:/DC=com/DC=dev/CN=TESTAuthority
---
Server certificate
-----BEGIN CERTIFICATE-----
MIIFgDCCBGigAwIBAgIKIkXHKgAAAAAAJzANBgkqhkiG9w0BAQUFADBCMRMwEQYK
CZImiZPyLGQBGRYDY29tMRMwEQYKCZImiZPyLGQBGRYDZGV2MRYwFAYDVQQDEw1U
RVNUQXV0aG9yaXR5MB4XDTExMTAyNTA4NTUwOFoXDTEzMTAyNDA4NTUwOFowajEL
MAkGA1UEBhMCVVMxEjAQBgNVBAgTCVlvcmtzaGlyZTESMBAGA1UEBxMJU2hlZmZp
ZWxkMQswCQYDVQQKEwJIUDELMAkGA1UECxMCSFAxGTAXBgNVBAMTEGRoejMwNzAx
LmRldi5jb20wgZ8wDQYJKoZIhvcNAQEBBQADgY0AMIGJAoGBANNYoEklD7Rgnoq0
Ld1+YkKNVcMy9CLb40kmQ0E5f+7SXPDMqNS12yk81k21PQoioOVPSKG2+pAls/vw
QhbjMBeRXeYkrH6V5+nuBYogRVZil0xMeK5+lOJYwig+rHkd5bo8OU/ep+m0VaPj
k2k9+Ns3ZZ1FwlRf+s0KiT1iKVY1AgMBAAGjggLSMIICzjALBgNVHQ8EBAMCBaAw
RAYJKoZIhvcNAQkPBDcwNTAOBggqhkiG9w0DAgICAIAwDgYIKoZIhvcNAwQCAgCA
MAcGBSsOAwIHMAoGCCqGSIb3DQMHMBMGA1UdJQQMMAoGCCsGAQUFBwMBMB0GA1Ud
DgQWBBS6N5Bqj8MOTu9CmsXJXcE+sWRDlDAfBgNVHSMEGDAWgBQiuWOMgDSyXO0d
pdlFswWLTcSM+TCB9gYDVR0fBIHuMIHrMIHooIHloIHihoGtbGRhcDovLy9DTj1U
RVNUQXV0aG9yaXR5LENOPW1hcnMsQ049Q0RQLENOPVB1YmxpYyUyMEtleSUyMFNl
cnZpY2VzLENOPVNlcnZpY2VzLENOPUNvbmZpZ3VyYXRpb24sREM9ZGV2LERDPWNv
bT9jZXJ0aWZpY2F0ZVJldm9jYXRpb25MaXN0P2Jhc2U/b2JqZWN0Q2xhc3M9Y1JM
RGlzdHJpYnV0aW9uUG9pbnSGMGh0dHA6Ly9tYXJzLmRldi5jb20vQ2VydEVucm9s
bC9URVNUQXV0aG9yaXR5LmNybDCCAQYGCCsGAQUFBwEBBIH5MIH2MIGoBggrBgEF
BQcwAoaBm2xkYXA6Ly8vQ049VEVTVEF1dGhvcml0eSxDTj1BSUEsQ049UHVibGlj
JTIwS2V5JTIwU2VydmljZXMsQ049U2VydmljZXMsQ049Q29uZmlndXJhdGlvbixE
Qz1kZXYsREM9Y29tP2NBQ2VydGlmaWNhdGU/YmFzZT9vYmplY3RDbGFzcz1jZXJ0
aWZpY2F0aW9uQXV0aG9yaXR5MEkGCCsGAQUFBzAChj1odHRwOi8vbWFycy5kZXYu
Y29tL0NlcnRFbnJvbGwvbWFycy5kZXYuY29tX1RFU1RBdXRob3JpdHkuY3J0MCEG
CSsGAQQBgjcUAgQUHhIAVwBlAGIAUwBlAHIAdgBlAHIwDQYJKoZIhvcNAQEFBQAD
ggEBAKPrAW4nxI8AaNWpWHGnEHsUBy9C9jFEVLLXxP7jyawmfEgcBBpl+osxH1hG
FSFxYRa5ZVMhj6wMBMHlleftDZ7y5TunKFkOMqHhB027/SmYoqWw/XyWJxW8/EUq
9i3L5Aw1PV+/oF6Nm05VehXRlXF0ngHcwC4EYFCjt4R+/f7prMZHdvIZUq9VhtcV
VHdOl2NE7QizXdebrlyU5cIoS98XySgDuSjGeIOYY/z03jFghKv62qK5ituqQWYY
S17m516w5fSebBYhVfL5YZoCg5OXo346iZU320a6i5dZWHDDq89GkNp3i8aPxSZL
GyHRoPJzDBzo6WR+212a4j3zTEE=
-----END CERTIFICATE-----
subject=/C=US/ST=York/L=York/O=York/OU=York/CN=crmdevbox.dev.com
issuer=/DC=com/DC=dev/CN=TESTAuthority
---
No client certificate CA names sent
---
SSL handshake has read 1556 bytes and written 251 bytes
---
New, TLSv1/SSLv3, Cipher is DES-CBC-SHA
Server public key is 1024 bit
Secure Renegotiation IS NOT supported
Compression: NONE
Expansion: NONE
SSL-Session:
    Protocol  : TLSv1
    Cipher    : DES-CBC-SHA
    Session-ID: B51700000B4CA5D162B9D228487DCCC53694945999BBE6F50BF1332ED8DA3D28
    Session-ID-ctx:
    Master-Key: E477BEE6E2A4D51D5FCAFB9AC385F89ED3FFDE06F4EEC5860A5080D655D18EDA3E483D7FA0CCE0EEDD454E9C3A847367
    Key-Arg   : None
    Krb5 Principal: None
    PSK identity: None
    PSK identity hint: None
    Start Time: 1322229571
    Timeout   : 300 (sec)
    Verify return code: 21 (unable to verify the first certificate)
Oops, we do. Incidentally, I think the error (unable to verify the first certificate) is due to the Linux box not trusting TESTAuthority.

So we did a little bit of hardening, which I'll post about separately, and now when we try again:
openssl s_client -connect 10.168.20.109:443 -cipher LOW
CONNECTED(00000003)
140587511035720:error:140790E5:SSL routines:SSL23_WRITE:ssl handshake failure:s23_lib.c:184:
---
no peer certificate available
---
No client certificate CA names sent
---
SSL handshake has read 0 bytes and written 61 bytes
---
New, (NONE), Cipher is (NONE)
Secure Renegotiation IS NOT supported
Compression: NONE
Expansion: NONE
---

Result.

You can try with Medium or High ciphers too.
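The equivalent openssl cipher strings are MEDIUM and HIGH, so the checks look like this:
openssl s_client -connect 10.168.20.109:443 -cipher MEDIUM
openssl s_client -connect 10.168.20.109:443 -cipher HIGH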

Wednesday, 23 November 2011

Post-Build actions in Visual Studio

I've been working on sorting out some issues with a plug-in in one of our applications. It all started easily enough, with me thinking that a few tweaks here and there would be enough, but it has ended up being a bit of a mess, and all the malarkey needed to register the plug-in and copy it to the assembly directory to enable debugging gets tiring quickly, so I thought I had better automate it.

To be honest, I've never had a look at build actions before, so I wasn't sure whether it was worth the effort. You know how it is: you can do something manually and it will take you an hour of tedious work, you can learn a better way of doing it, or you can write an app/script, which will take you a few hours. Normally I favour the last approach, as you always learn something, even if it is that sometimes it is better to bite the bullet and do it the long way.

At any rate, I wanted to copy the plug-in library to the assembly folder only for debug builds and it turns out that this is easily done, like so:
if $(ConfigurationName) == Debug (
copy /Y "$(TargetDir)plugin.dll" "C:\inetpub\wwwroot\isv\cbs\bin\plugins.dll"
copy /Y "$(TargetDir)plugin.pdb" "C:\inetpub\wwwroot\isv\cbs\bin\plugins.pdb"
)
Make sure that the opening parenthesis is on the same line as Debug.

This can actually be improved by doing the import at the same time as well. Starting from an already deployed plug-in (I know, I know, catch 22):
  1. Open the PluginRegistrationTool and open the organization you want to update plug-ins from.
  2. Hit the Import/Export button.
  3. Select Export Solution Xml.
  4. Ensure that you only select your custom plug-ins.
  5. Save this to your build directory (you can save it anywhere, but it'll save you typing if you use the build directory, e.g. $(TargetDir)).
  6. Copy the PluginRegistrationTool to your build directory (This is optional as well, but bear in mind that you will need to provide a path to the executable otherwise).
  7. Copy the connections.config file to your build directory.
In order to make my life easier, my user is both a deployment administrator and a system administrator, so I can leave the credentials empty and it will use my credentials. I've not managed to get it working with different credentials.
if $(ConfigurationName) == Debug (
copy /Y "$(TargetDir)plugins.dll" "C:\Program Files\Microsoft Dynamics CRM\Server\bin\assembly\plugins.dll"
copy /Y "$(TargetDir)plugins.pdb" "C:\Program Files\Microsoft Dynamics CRM\Server\bin\assembly\plugins.pdb"
PluginRegistration.exe /org:CDCC /op:import /f:ExportSolution.xml  /c:Connections.config /cl:DEBUG
)
Where CDCC is my org and DEBUG is the label on the connections.config file.

Unfortunately, sometimes it seems that an iisreset is needed for the plug-ins to be picked up properly, so it might be advisable to add an iisreset command too.
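For example, tacking iisreset onto the end of the same post-build block would look something like this (untested, and bear in mind that iisreset will interrupt anything else running on the box):

if $(ConfigurationName) == Debug (
REM copy and PluginRegistration.exe steps as above
iisreset
)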

Deletion Service is running

Following on from yesterday's post, I can confirm that making the changes to the timeout settings has had the desired effect and the deletion service is running again. It took a good 12 minutes to run, so we will be monitoring the running time to adjust the timeout, just in case it bites us in the back; it can't be good to have a timeout of 24 hours :), I imagine, but I suspect that 30 seconds won't be enough either. We'll see.

Tuesday, 22 November 2011

MS Dynamics CRM 4.0 Deletion Service is not running

We finally implemented a new Windows service that deletes calendars; I have talked about the joys of trying to delete calendars before. At any rate, the problem we now have is that the Deletion Service has not run to completion since the calendars were deleted. We get a timeout error:
Exception while trying to execute AsyncOperationId: {GUID} AsyncOperationType: 14 - System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
where GUID matches the ID of one of the entries in the scalegrouporganizationmaintenancejobs table.

After running a few SQL queries we found that we have over 1 million records marked for deletion, i.e. deletionstatecode=2, across various tables. I'm not sure how long this has been failing for, but it looks like it has been failing for a while; something to investigate for sure. After doing a bit of reading I found this post, which recommends this KB.

The KB in essence suggests creating two registry DWORD values:
  • OLEDBTimeout
  • ExtendedTimeout
OLEDBTimeout needs to be set to 86400 (it is in seconds, i.e. 24 hours).
ExtendedTimeout needs to be set to 1000000 and must be no larger than 2147483647. The sketch below shows how this might look as a .reg file.
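Following the same .reg format as the cipher post, something like this should do it. The hive path is from memory, so double-check it against the KB before applying it; note that dword values in a .reg file are hex, so 0x15180 is 86400 and 0xf4240 is 1000000:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSCRM]
"OLEDBTimeout"=dword:00015180
"ExtendedTimeout"=dword:000f4240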

In our case we have multiple application servers, but it seems that the error regularly occurs in one of the servers only, so we have made the registry changes on that one server only and we are waiting for it to run tonight.

Let's see if this does the trick.

Find duplicate rows in a SQL Server database

From time to time I've found myself going through the database trying to find duplicates of this or the other, and because this does not happen often enough I forget how to do it. I was at it again today and ran into the same hole in my memory, so in order to never forget again (as if), I've decided to document it here. Here we go:
select callid,count(callid) as numberofcalls from calls
group by callid
having ( count(callid) > 1 )
This will display the callid and the number of times it appears on the calls table, provided that callid appears more than once.
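If you then want to see the duplicate rows themselves rather than just the counts, you can join back onto the same query, something like:

select c.*
from calls c
inner join (
    select callid
    from calls
    group by callid
    having count(callid) > 1
) dups on c.callid = dups.callid
order by c.callid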

If you are wondering why there are duplicate callids, so am I. Oh, the joys of IT support.

Sunday, 6 November 2011

Null Ciphers in .NET Framework.

We get some strange requests at work from time to time. The last one? Can you use null ciphers for SSL traffic in IIS?

If you are wondering why you might want to do this, I don't have a ready answer to be honest. In *nix land, this isn't a major problem, but IIS does not support it at all (at least for versions 5-7.5).
To be able to handle a Null cipher, Schannel needs to have certain values set:

dwMinimumCipherStrength
Minimum bulk encryption cipher strength, in bits, allowed for connections. If this member is zero, SCHANNEL uses the system default.
If this member is -1, only the SSL3/TLS MAC-only cipher suites (also known as NULL cipher) are enabled.

dwMaximumCipherStrength
Maximum bulk encryption cipher strength, in bits, allowed for connections. If this member is zero, SCHANNEL uses the system default.
If this member is -1, only the SSL3/TLS MAC-only cipher suites (also known as NULL cipher) are enabled. In this case, dwMinimumCipherStrength must be set to -1.

However, it is not these properties alone. You cannot enable NULL ciphers through the registry only; the SCHANNEL caller has to opt in by passing -1 to the appropriate fields in the SCHANNEL cred. IIS does not allow NULL ciphers because it does not pass in -1 to the SCHANNEL cred.
This leaves you needing some sort of proxy server that accepts Null Ciphers for the SSL handshake so that the proxy completes the handshake and then forwards the request to IIS.

If you wanted to create your own proxy (probably a very bad idea, by the way), the .NET Framework can help you. From version 4.0, it is possible to allow null ciphers in the SslStream constructor by simply setting the encryption policy to AllowNoEncryption, like this:

SslStream sslStream = new SslStream(myclient.GetStream(), false, null, null, EncryptionPolicy.AllowNoEncryption);

where myclient is a TcpClient object.
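To put that constructor in context, here is a rough, untested sketch of what the server side of such a proxy might look like: accept a TCP connection, complete the handshake with the no-encryption policy allowed, and then relay the plaintext to IIS (the relaying bit is left out). The certificate path, password and port are just placeholders:

using System;
using System.Net;
using System.Net.Security;
using System.Net.Sockets;
using System.Security.Authentication;
using System.Security.Cryptography.X509Certificates;

class NullCipherProxy
{
    static void Main()
    {
        // Placeholder certificate and port - adjust for your environment.
        X509Certificate2 serverCertificate = new X509Certificate2(@"C:\certs\proxy.pfx", "password");
        TcpListener listener = new TcpListener(IPAddress.Any, 8443);
        listener.Start();

        while (true)
        {
            TcpClient myclient = listener.AcceptTcpClient();

            // AllowNoEncryption opts in to the MAC-only (NULL) cipher suites.
            SslStream sslStream = new SslStream(myclient.GetStream(), false, null, null, EncryptionPolicy.AllowNoEncryption);
            sslStream.AuthenticateAsServer(serverCertificate, false, SslProtocols.Ssl3 | SslProtocols.Tls, false);

            // sslStream now carries the application data (possibly unencrypted);
            // a real proxy would relay it to and from IIS here.
            byte[] buffer = new byte[4096];
            int read = sslStream.Read(buffer, 0, buffer.Length);
            Console.WriteLine("Read {0} bytes from the client", read);

            sslStream.Close();
            myclient.Close();
        }
    }
}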

I still have to ask why, though, why?

Install SQL Server 2008 HA Cluster - The current SKU is invalid.

I was trying to install an Active/Active SQL 2008 cluster on Friday and I hit a problem just before the install proper on the second node of the first resource group. I got the following error message:
The current SKU is invalid
I tried again and it worked fine, but when I was trying to add the second node to the second resource group, it failed three times, so after a little bit of googling I found two answers:

  • Add the second node manually, for which I have to run a command like this:
setup.exe /ACTION=AddNode /INSTANCENAME="MSSQLSERVER"
/SQLSVCACCOUNT="<Domain\Account>" /SQLSVCPASSWORD="<Password>"
/AGTSVCACCOUNT="<Domain\Account>" /AGTSVCPASSWORD="<Password>"
/ASSVCACCOUNT="<Domain\Account>" /ASSVCPASSWORD="<Password>" /INDICATEPROGRESS
  • Delete defaultsetup.ini. However, make sure that you keep the product key it contains, as you will need it.
While I relish a challenge as much as the next person, I did not have the time to go for option one, so went for the easy option and deleted defaultsetup.ini.

Hopefully I'll get some time this week to post about how to create an Active/Active SQL Server 2008 cluster.

HA Windows service with Windows Cluster Service

We support an MS Dynamics CRM application that uses a few SSIS jobs for tidying up data; however, they were awfully slow, so we decided to forgo them. In fairness, it is only the SSIS jobs that call a web service that were too slow, and I'm sure that, given enough time, we could make them run faster, but we decided to go a different route: a Windows service running on the backend. Our backend is clustered, though, which raises the problem of how to make sure that this service runs in high availability mode.
It turns out that it is surprisingly easy to do with Windows Cluster Service. There is a resource type called Generic Service, and this can be used to start the service on various nodes as needed.

Assuming that you have your Windows service installed on all the nodes in your cluster, this is what you need to do to have a clustered Windows service:
  1. Start the cluster console (start|run|cluadmin).
  2. Connect to the cluster if needed (it normally connects to the cluster running on the box by default).
  3. Right-click on Groups. Select New|Group. Follow the wizard, making sure that all nodes are available (as shown in the screenshot below), and click Finish.
  4. Right-click on the group you have just created. Select New|Resource.
  5. Make sure that you select Generic Service as the Resource Type and follow the wizard.
  6. When prompted for a service name, make sure that this is the service name and not the service's display name. If your service has start parameters, this is where you can add them.
  7. You can now bring the service online, either by bringing the whole group online or just the service.
That is all; you should now have an HA Windows service. Note that you don't actually have to create a new group, but I think it's just good practice to do so.
Also note that all the Windows cluster service will do is bring the service up after a failover. If your Windows service runs a long-running process and it stops halfway because the active node goes down, the service will be brought up on the other node, which may or may not cause issues.
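For what it's worth, the same steps can be scripted rather than clicked through. From memory, cluster.exe on Windows Server 2003 lets you do something along these lines; treat the exact switches as an assumption and check cluster /? before relying on them (the group, resource and service names here are placeholders):

cluster group "My Service Group" /create
cluster resource "My Windows Service" /create /group:"My Service Group" /type:"Generic Service"
cluster resource "My Windows Service" /priv ServiceName=MyServiceName
cluster resource "My Windows Service" /online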