Monday, 31 March 2014

Maximum Workflow Depth in MS Dynamics CRM 2011/2013, or how, according to Microsoft, infinite = 8

There is built-in protection against infinite loops in workflows in Dynamics CRM, which surfaces as the following error:
This workflow job was canceled because the workflow that started it included an infinite loop. Correct the workflow logic and try again.
Unfortunately, the error will be triggered even if there are no infinite loops. This is because the check to determine whether a workflow is in an infinite loop relies on the maximum execution depth, which by default is set to 8.

This means that if you have a chain of workflows/plugins that exceeds this number, you will get the above error, which can be confusing, as it might leave you trying to work out how your workflows/plugins are getting into an infinite loop.

The solution is to increase the maximum workflow depth, which can be done from PowerShell, like this:
Add-PSSnapin Microsoft.Crm.PowerShell -ErrorAction SilentlyContinue
# Read the current workflow settings, bump the maximum depth and write them back
$workflows = Get-CrmSetting -SettingType WorkflowSettings
$workflows.MaxDepth = 12
Set-CrmSetting $workflows
We ended up using 12, as we have long, intricate (potential) chains of workflows. I don't think it ever goes higher than 10, but better safe than sorry.
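
To confirm the change has taken effect, you can read the setting back; a quick sanity check along these lines (assuming, as above, that the CRM PowerShell snap-in is available on the server):

Add-PSSnapin Microsoft.Crm.PowerShell -ErrorAction SilentlyContinue
# Display the current maximum workflow depth
(Get-CrmSetting -SettingType WorkflowSettings).MaxDepth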

Tuesday, 25 March 2014

ID1038: The AudienceRestrictionCondition was not valid because the specified Audience is not present in AudienceUris.

After installing a new environment today, I got this error in one of our services:
ID1038: The AudienceRestrictionCondition was not valid because the specified Audience is not present in AudienceUris.
I thought: that's a new one.

This was a very simple one to solve, though. It turns out that the AudienceUri needs to match the text of the URI in the FederationMetadata file exactly, casing included; otherwise you get this error. Who knew?

Relevant extract from FederationMetadata.xml file:

entityID="https://testserver.test.local/UAT/Scheduling.Service/Scheduler.svc"

Relevant extract from the web.config file:

<microsoft.identityModel>
  <service name="SchedulingService.Scheduler">
    <audienceUris>
      <add value="https://testserver.test.local/uat/Scheduling.Service/Scheduler.svc" />
    </audienceUris>
    <issuerNameRegistry type="Microsoft.IdentityModel.Tokens.ConfigurationBasedIssuerNameRegistry, Microsoft.IdentityModel, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35">
      <trustedIssuers>
        <add thumbprint="abcdef790108485dee20eeca19c8132e11abcdef" name="http://adfs.test.local/adfs/services/trust" />
      </trustedIssuers>
    </issuerNameRegistry>
    <certificateValidation certificateValidationMode="None" />
  </service>
</microsoft.identityModel>

As you can see, uat is not capitalized in the audienceUri, which is enough to cause the issue.
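
The fix is simply to make the casing match the metadata:

<add value="https://testserver.test.local/UAT/Scheduling.Service/Scheduler.svc" />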

Saturday, 22 March 2014

Some companies take security best practices seriously, others not so much

I was trying to register for an online account with Sainsbury's Energy last night, so I navigated to https://www.sainsburysenergy.co.uk.

[Screenshot: certificate warning for https://www.sainsburysenergy.co.uk showing a certificate issued to www.sainsburysenergy.com]

So to sum up, I'm trying to navigate to www.sainsburysenergy.co.uk and the certificate is for www.sainsburysenergy.com.

Note that https is only mandatory for Log In and Register Now, but I'm too lazy to take the screenshots again. :)

This is www.sainsburysenergy.com; as I said above, it will only go over https for Log In and Register Now:

[Screenshot: www.sainsburysenergy.com home page, served over plain http]

Somebody should set up a proper redirect or get another certificate.

Monday, 17 March 2014

Create an Entity in MS Dynamics CRM 2011/2013 using the OData endpoint

This is really part of the brain dump series, but I haven't had the time until now.

Take two entities, Author (dab_author) and Book (dab_book), with a 1:N relationship between them.

The challenge is to create a new book for a particular author, from the author form.


var createNewBook = function () {
    // Build the OData URL for the Book entity set
    var url = Xrm.Page.context.getClientUrl() + "/XRMServices/2011/OrganizationData.svc/dab_bookSet";
    // Id of the author whose form we are on
    var author = Xrm.Page.data.entity.getId();

    var dab_book = {};
    dab_book.dab_name = "Odata entity creation test";

    // Point the relationship attribute at the current author
    var dab_BooksId = {};
    dab_BooksId.Id = author;
    dab_BooksId.LogicalName = "dab_author";
    dab_book.dab_BooksId = dab_BooksId;

    var book = window.JSON.stringify(dab_book);

    createBook(book, url).done(process);
};

var createBook = function (book, url) {
    return $.ajax({
        type: "POST",
        contentType: "application/json;charset=utf-8",
        dataType: "json",
        data: book,
        url: url,
        beforeSend: function (x) { x.setRequestHeader("Accept", "application/json"); }
    });
};

var process = function (data) {
    var entity = data.d;
    alert("Created new Book. Id:" + entity.dab_bookId);
};

Do bear in mind that casing is a bit funny. The relationship name is dab_booksid; however, it needs to be sort of title cased to dab_BooksId. In essence, the prefix needs to be kept in lower case and the rest Title Cased.

Also, jQuery needs to be loaded if you are using Dynamics CRM 2011.
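
One thing the PoC above glosses over is error handling: the jQuery promise returned by createBook also exposes a fail callback, so OData errors don't vanish silently. A minimal, untested sketch:

createBook(book, url)
    .done(process)
    .fail(function (xhr) {
        // Surface the raw response; real code would parse the OData error payload
        alert("Create failed: " + xhr.status + " " + xhr.statusText);
    });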

Monday, 10 March 2014

Using Memory Appenders in Log4net (All your logging are belong to us)

I have always been a keen advocate of trace logging as a useful tool for tracking down bugs in Test and Production environments, i.e. those where it is not possible to hook up a debugger to the web service or windows service.

There is an important problem, though, with trace logging as it's normally done in Production: it's not verbose enough. It is considered good practice to dial down the logging level so that only errors get logged, but this can be a big problem, as all that gets logged is the exception, which sometimes is simply not enough to diagnose the problem.

I do believe that this is a good practice for various reasons (less load on the server, no need to worry about log size, etc.), but it is also true that a production system logging at its highest verbosity can be useless due to the sheer amount of messages that get logged.

If an error happens that cannot be diagnosed from the limited logging, the logging level is normally turned up to 11, in the hope that the error happens again and can then be diagnosed.

In an ideal world we would want to log everything leading up to an error and there is a nice way of doing this with the log4net library, as it provides a way to log to memory.

The idea is to log everything to a memory listener and only dump it to the final log (file/Event Log/DB) in case of an exception, or any other condition that might require writing to file/DB, e.g. auditing.

Memory usage would need to be monitored in a real application to see how much of an impact this has, but seeing as memory is cheap and developers are expensive, I think there is a good case for having extra memory to accommodate this pattern.

Below is a PoC that I did to see if it would actually work. I will try to implement this pattern next time I'm writing a web service/windows service from scratch.

From a new Console project in Visual Studio

1. Add a reference to log4net (available via NuGet)
2. Edit AssemblyInfo.cs and add the following to the end of the file:
   [assembly: log4net.Config.XmlConfigurator(ConfigFile = "ConsoleApplication2.exe.config", Watch = false)]

Code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using log4net;
using log4net.Repository.Hierarchy;
using log4net.Appender;

namespace ConsoleApplication2
{
    class Program
    {
        static void Main(string[] args)
        {
            ILog log = log4net.LogManager.GetLogger("PoC");
 
            Hierarchy hierarchy = LogManager.GetRepository() as Hierarchy;
            MemoryAppender mappender = hierarchy.Root.GetAppender("MemoryAppender") as MemoryAppender;
 
            ImportantDateCalculator(log, mappender, "first");
            System.Threading.Thread.Sleep(1000);
            ImportantDateCalculator(log, mappender, "second");
            Console.ReadKey();
        }
 
        private static void ImportantDateCalculator(ILog log, MemoryAppender mappender, string msg)
        {
            mappender.Clear();
 
            try
            {
                log.InfoFormat("{0}", msg);
                //loads of code here, calculating the meaning of life
                log.InfoFormat("{0}", new Random().Next(42, 42));
                //even more code here calculating the approximate date and time for the heat death of the universe.
                log.InfoFormat("{0}", DateTime.Now.ToString("s"));
 
                throw new Exception("oh noes");
            }
            catch (Exception ex)
            {
                StringBuilder message = new StringBuilder();
                mappender.GetEvents().ToList().ForEach(x => message.AppendLine(x.RenderedMessage));
                message.AppendLine(ex.ToString());
                log.ErrorFormat("{0}",message);                
            }
            
            mappender.Clear();
        }
    }
}
Config File
<?xml version="1.0"?>
<configuration>
  <configSections>
    <section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler, log4net"/>   
  </configSections>
  <log4net>
    <appender name="RollingFileAppender" type="log4net.Appender.RollingFileAppender">
      <file value="PoC.log"/>
      <threshold value="ERROR"/>
      <appendToFile value="true"/>
      <rollingStyle value="Size"/>
      <maxSizeRollBackups value="10"/>
      <maximumFileSize value="1000KB"/>
      <staticLogFileName value="true"/>
       <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%date [%thread] %-5level %logger - %message%newline"/>
      </layout>
    </appender>
    <appender name="MemoryAppender" type="log4net.Appender.MemoryAppender">
    </appender>
    <root>
      <appender-ref ref="RollingFileAppender"/>
      <appender-ref ref="MemoryAppender"/>
    </root>
  </log4net>
  <startup>
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0"/>
  </startup>
</configuration>
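
As an aside, log4net can approximate this pattern declaratively with its BufferingForwardingAppender: configured as lossy with a LevelEvaluator, it keeps recent events in memory and only flushes them to the wrapped appender when an ERROR is logged. A minimal sketch of that alternative (not something I have tried in this PoC; the appender name is made up, RollingFileAppender is the one from the config above):

<appender name="LossyBuffer" type="log4net.Appender.BufferingForwardingAppender">
  <bufferSize value="100" />
  <lossy value="true" />
  <evaluator type="log4net.Core.LevelEvaluator">
    <threshold value="ERROR" />
  </evaluator>
  <appender-ref ref="RollingFileAppender" />
</appender>

You would then reference LossyBuffer from <root> instead of referencing the file appender directly.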

Monday, 3 March 2014

Add users to Local Groups using PowerShell

Today I had to add a new hire to the local Administrators group on all our dev and test servers.

So I wrote a script to do this:
# Bind to the local Administrators group on this machine
$group = [ADSI]"WinNT://$(hostname)/Administrators,group"
# Add the domain user to the group
$group.psbase.Invoke("Add", ([ADSI]"WinNT://contoso/bobsmith").path)
This could be improved in so many ways that I hesitate to even call it a script, but there you go.

Given the churn rate, it probably makes sense to actually write a proper script that takes values such as username and hostname and does it remotely (see the sketch below), but I just hope that it won't be me running this next time. :)
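
Something along these lines might do. A hedged sketch only: the domain/user/server names are placeholders, and it assumes PowerShell remoting is enabled on the target machines:

param(
    [Parameter(Mandatory = $true)] [string] $UserName,        # e.g. "contoso/bobsmith" (WinNT path format)
    [Parameter(Mandatory = $true)] [string[]] $ComputerName,  # servers to update
    [string] $Group = "Administrators"
)

foreach ($computer in $ComputerName)
{
    Invoke-Command -ComputerName $computer -ScriptBlock {
        param($user, $group)
        # Same ADSI trick as above, run remotely on each server
        $adsiGroup = [ADSI]"WinNT://$env:COMPUTERNAME/$group,group"
        $adsiGroup.psbase.Invoke("Add", ([ADSI]"WinNT://$user").path)
    } -ArgumentList $UserName, $Group
}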

Saturday, 1 March 2014

Mob Programming

I've still not worked out whether this is a moderately elaborate hoax or a genuine methodology; judge for yourself here: http://mobprogramming.org/