Pages

Friday, 1 February 2013

What's new with CRM 2011 Update Rollup 12

During my regular Internet roaming for new CRM information, I came across the following article detailing some of the best new additions to CRM 2011. As I enjoyed reading it I decided to repost it here.
http://blogs.msdn.com/b/crm/archive/2013/01/29/10-things-i-love-about-dynamics-crm-2011-december-2012-service-update-or-polaris-update.aspx

Wednesday, 30 January 2013

CRM 2011 Update Rollup 12 is finally out!

Following many delays, Microsoft have finally released CRM 2011 Update Rollup 12.
You can catch up on the release exploits here.

If you'd rather download the update or read about what's new in it, go right ahead.
Just a friendly tip, I suggest waiting at least 3-4 weeks before deploying this update to any production/live environment.

The updated SDK is also available and is now at build 5.0.13.

Monday, 28 January 2013

Running SQL queries in PowerShell

Every so often I come across the requirement to run a SQL query from a script; it is useful in a great many scripts.
Whenever I search for this information I come up with long code snippets of varying levels of optimisation.
On my last search I found an actual PowerShell function for executing SQL queries. I rejoiced, but not for long. While it worked very well, it required a local installation of the SQL management tools. Quite a tall order for something that should work practically anywhere.
So I decided to write my own (simplified) SQL query execution function.

Here's the result.
Function Execute-SqlQuery {
    [cmdletbinding()]
    Param (
        #SQL server name.
        [parameter(Mandatory=$true)]
        [validateNotNull()]
        [String]
        $Server,
        #SQL query to execute.
        [parameter(Mandatory=$true)]
        [validateNotNull()]
        [String]
        $Query,
        #Database name on the SQL server.
        [parameter(Mandatory=$true)]
        [validateNotNull()]
        [String]
        $Database
    )
    Process {
        #Only basic connection string support implemented. Always use Integrated Security.
        $ConnStr = "Server=$Server; Database=$Database; Integrated Security=SSPI;"

        #Create SQL connection object and pass connection string.
        $SqlConn = New-Object System.Data.SqlClient.SqlConnection
        $SqlConn.ConnectionString = $ConnStr
        $SqlConn.Open()

        #Create SQL command object and apply the connection to it.
        $SqlCmd = New-Object System.Data.SqlClient.SqlCommand
        $SqlCmd.CommandText = $Query
        $SqlCmd.Connection = $SqlConn

        #Create SQL adapter object to return query data to the end user.
        $SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
        $SqlAdapter.SelectCommand = $SqlCmd

        #Fill the SQL query results into a temporary DataSet object.
        $Results = New-Object System.Data.DataSet
        $SqlAdapter.Fill($Results) | Out-Null

        #Clean up
        $SqlCmd.Dispose()
        $SqlConn.Close()
        $SqlConn.Dispose()

        #Return
        Return $Results.Tables[0]
    }
}
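
Usage is then a one-liner. Here's a quick sketch; the server, database and query below are placeholders for illustration, so substitute your own.
Function usage example:

#Example only - substitute your own server, database and query.
$Accounts = Execute-SqlQuery -Server 'SQL01' -Database 'Contoso_MSCRM' -Query 'SELECT TOP 10 Name, CreatedOn FROM AccountBase'

#The function returns the first result table, so rows can be walked directly.
ForEach ($Row in $Accounts) {
    Write-Host $Row.Name '-' $Row.CreatedOn
}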


I hope you all find this useful. It's a big time saver for sure.

Friday, 25 January 2013

Writing PowerShell functions properly

My roots are in IT infrastructure. I have never been a full-on developer, so my understanding of syntax or the proper objects and methods to use is not complete. However, I do try to do whatever I do in the cleanest manner. This includes my ever growing addiction to PowerShell.

I've been using PowerShell for a few years now. Started out slowly with the odd command here and there and eventually got a big push forward when I forced myself to sit down and write my first actual script file. Happiest day of my scripting life!
Ever since then I've been trying to find various best practises and good examples for whatever I try to do. The most common search I perform is for proper PowerShell function declaration. To me, more than anything, this is the holy grail pinpointing the main difference between PowerShell and most other languages.
PowerShell is a language aimed at IT personnel and not developers. This target audience knows little of best practises, proper syntax or code hardening. They are usually under budgeted and do not have the time for proper coding. In addition, they usually also lack the expertise required to properly estimate a coding project.
The above audience make up the main bulk of PowerShell article and blog writers. And herein lies the problem. The blind are leading the blind.

Whenever I try searching for a properly written PowerShell function I usually find articles explaining how simple it is to write these in PowerShell. Those articles usually contain functions similar to my first example below.

The following function (and all those to follow) is a simple one that takes in 3 string values and combines them with some additional text. I deliberately picked a simple example.

First, this base function example.
Function WriteText ($a, $b, $c) {
    Write-Host $a 'was first,' $b 'was second,' $c 'was third'
}

Some of you may be asking yourself what's wrong with this function. Well, quite a lot. I'll get to that in a moment. Let me start off by executing the above function:
WriteText 'ABC' 'DEF' 'GHI'

With the following output:
ABC was first, DEF was second, GHI was third

Great! That's what we wanted isn't it?
Yes, but only on a very limited basis.
What would happen if instead of providing 3 strings we only provided 2?
WriteText 'ABC' 'GHI'

With the following output:
ABC was first, GHI was second, was third

Now hang on there PowerShell window! That's not what I wanted you to do!
Clearly 'ABC' was supposed to be the first string and 'GHI' was the third. I don't even know what I expected you to do with the missing second one. That should have returned an error, right?
Wrong. All wrong.

Scripts (like other code) only do what you tell them to do. Nothing more and nothing less. So where did we go wrong with this function?
Oh so many places! Let's start listing some of them.
  1. Our function's name does not follow PowerShell's hyphenated Verb-Noun naming format. Examples of properly named functions are Get-Item, Set-ExecutionPolicy etc.
  2. Our parameters are not type hardened. This could cause all sorts of issues in longer code segments where we expect something to work which only works with specific types.
  3. Our general syntax uses the simplest possible PowerShell syntax. While this is fast to write, it is less readable if we later on go back and want to make alterations or explain it to others.

My next example is the very same function with some syntax and type hardening alterations. Still not a very robust function but at least it's readable. It now also has a meaningful function name which follows proper PowerShell best practises for naming. I even added my own 'La' prefix to the function so it would not clash with other similarly named functions in the future.
Function Get-LaStringConcat1 {
    Param (
        [String]$FirstString,
        [String]$SecondString,
        [String]$ThirdString
    )
    Process {
        Write-Host $FirstString 'was first,' $SecondString 'was second,' $ThirdString 'was third'
    }
}

I can now easily execute this function in the same way I executed the previous one, but I'd like to show you an additional change to the way I like working. In order to avoid mishaps you should always strive to execute functions with named parameters. This is demonstrated here (and will be the guideline for all the following examples).
Get-LaStringConcat1 -FirstString 'ABC' -SecondString 'DEF' -ThirdString 'GHI'

This returns the following output:
ABC was first, DEF was second, GHI was third

If I were to take out one of the strings, I would still get undesired results.
Example:
Get-LaStringConcat1 -FirstString 'ABC' -ThirdString 'GHI'

Output:
ABC was first,  was second, GHI was third


Now let's move on and fix the undesired results. The first and easiest way to fix this is by requiring that all 3 parameters contain a value. Simple enough to do; we just need to make the following changes to our function.
Function Get-LaStringConcat2 {
    Param (
        [parameter(Mandatory=$true)]
        [validateNotNull()]
        [String]
        $FirstString,
        [parameter(Mandatory=$true)]
        [validateNotNull()]
        [String]
        $SecondString,
        [parameter(Mandatory=$true)]
        [validateNotNull()]
        [String]
        $ThirdString
    )
    Process {
        Write-Host $FirstString 'was first,' $SecondString 'was second,' $ThirdString 'was third'
    }
}

The above function adds a very simple pair of options to all 3 parameters. The first option indicates to PowerShell that the parameter is mandatory so it can no longer be omitted. The second option tells PowerShell that even when the parameter is supplied it must still contain a non-null value. The function can be executed in the same way as before. I will skip the valid execution and go straight to one with missing parameters. Let's provide the first and second parameters but leave out the third.
Get-LaStringConcat2 -FirstString 'ABC' -SecondString 'CDE'

Output:
cmdlet Get-LaStringConcat2 at command pipeline position 1
Supply values for the following parameters:
ThirdString:

That's good isn't it? PowerShell noticed I did not supply the ThirdString parameter and is now demanding it of me. What if I provide no value and just hit Enter? Well, this happens:
Get-LaStringConcat2 : Cannot bind argument to parameter 'ThirdString' because it is an empty string.
At line:1 char:1
+ Get-LaStringConcat2 -FirstString 'ABC' -SecondString 'CDE'
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidData: (:) [Get-LaStringConcat2], ParameterBindingValidationException
    + FullyQualifiedErrorId : ParameterArgumentValidationErrorEmptyStringNotAllowed,Get-LaStringConcat2
Why is that? Because I also asked PowerShell to make sure any supplied values are not null.


Right, 'are we done?' you may be asking. Far from it (and my apologies for the long post).
While the above example does harden our function so it can't be executed with missing parameters, I would prefer it be more flexible and forgiving. For this reason I came up with the following example.
Function Get-LaStringConcat3 {
    Param (
        [String]$FirstString,
        [String]$SecondString,
        [String]$ThirdString
    )
    Process {
        [String]$OutStr = ''

        If ($FirstString)  {$OutStr += $FirstString + ' was first, '}
        If ($SecondString) {$OutStr += $SecondString + ' was second, '}
        If ($ThirdString) {$OutStr += $ThirdString + ' was third'}
        
        If ($OutStr -eq '') {$OutStr = 'No input strings were provided'}
        
        Write-Host $OutStr
    }
}

Wait a second, the function just got a lot longer. Why? I made the following changes from the previous one:
  1. Removed the [parameter(Mandatory=$true)] and [validateNotNull()] options. I do want to allow running the function with missing parameters as this is a desired feature.
  2. As parameters can now be empty, I need to check each one and build up my output string accordingly. That's what the whole code block in Process{} does. The function will now create a blank output string variable and start adding to it for each non-null variable.
  3. If no parameters at all were provided the function will output the 'No input strings were provided' message.
Executing the above function with a missing parameter:
Get-LaStringConcat3 -FirstString 'ABC' -ThirdString 'GHI'

ABC was first, GHI was third

Now that was good! Our function finally looks good even with missing parameters! How about no parameters at all?
Get-LaStringConcat3


No input strings were provided

Perfect! Now my function is ready to be shipped out so the entire company can use it!

Well, not quite yet. There's just one more adjustment I want to make to it. It's a very subtle one this time. Note how our function uses Write-Host for outputting the compiled string. The string isn't returned as the function's value but is just printed to screen as text. What if I wanted to store that text in a variable for later use? Not easily done with our current function. Here's what happens if I try.
$SomeString = Get-LaStringConcat3 -FirstString 'ABC' -ThirdString 'GHI'


ABC was first, GHI was third

Note how assigning the value to the variable still output the string. But did it store it?
Write-Host $SomeString

Output:


No value. Our variable is empty because the function didn't return a value, it just printed to screen.

Now that's not good. I want to use this function in long scripts and must be able to store its output value.
So let's make a very small change to our function. In fact, it's only changing a single word. Can you spot it?
Function Get-LaStringConcat4 {
    Param (
        [String]$FirstString,
        [String]$SecondString,
        [String]$ThirdString
    )
    Process {
        [String]$OutStr = ''

        If ($FirstString)  {$OutStr += $FirstString + ' was first, '}
        If ($SecondString) {$OutStr += $SecondString + ' was second, '}
        If ($ThirdString) {$OutStr += $ThirdString + ' was third'}
        
        If ($OutStr -eq '') {$OutStr = 'No input strings were provided'}
        
        Return $OutStr
    }
}

Did you spot it? All I did was change the Write-Host command to a Return statement. Return writes the supplied value to the output pipeline and exits the function, which is what lets the caller capture that value as the function's result. Let's have a look at the previous example again:
$SomeString = Get-LaStringConcat4 -FirstString 'ABC' -ThirdString 'GHI'




Now that's more like it. Storing a value in a variable shouldn't output any data unless I specifically ask for it. Now how about the actual variable? Does it have the string stored?
Write-Host $SomeString

Output:
ABC was first, GHI was third

Success!
Now the function can be shipped off and used in other scripts by anyone. I'd call this hardened enough for a more distributed use.

To be fair, we're not really done. There are still a lot of small bug fixes to perform on our function. Our 3 strings aren't joined together smoothly enough, we haven't added any in-line help to our function, there is no proper documentation for what each line does (very important!) and many other small and annoying things remain. For the purpose of what I wanted to show you today I'll stop here. The rest you can do on your own. :)

***Update 30/01/2013
+Peter Kriegel suggested I also include the [cmdletbinding()] option in this blog post. Seeing as it does not add too much complication, I decided to.
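For reference, here's a minimal sketch of the change using an example function of my own. Adding [cmdletbinding()] above the Param() block turns the function into an 'advanced function', which automatically gains the common parameters such as -Verbose and -ErrorAction:

Function Get-LaStringEcho {
    #This single option enables the common parameters (-Verbose, -ErrorAction etc.).
    [cmdletbinding()]
    Param (
        [parameter(Mandatory=$true)]
        [validateNotNull()]
        [String]
        $Text
    )
    Process {
        #This message is only displayed when the caller adds -Verbose.
        Write-Verbose 'Echoing the input string back to the caller'
        Return $Text
    }
}

#Executing with -Verbose now shows the verbose message alongside the returned string.
Get-LaStringEcho -Text 'ABC' -Verbose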
Instead of reproducing other wonderful articles I decided to just link them here. I suggest a read through of these:

Wednesday, 16 January 2013

Picking apart CRM 2011's diagnostics page - Part 2

In the first part of this article I covered the basics of what goes on behind the scenes of CRM 2011's diagnostics page. If you missed it, catch up on it here: Picking apart CRM 2011's diagnostics page - Part 1.
I am now going to dive deeper into the JavaScript sections of the diagnostics tests. In total there are 8 JavaScript performance tests, with 5 of those nested together under the JavaScript DOM stress test. I call these stress tests as they are basically memory and CPU intensive performance tests designed to give an indication of client computer (and browser) performance.
When running these tests they produce the following output (numbers may differ):
=== Array Manipultaion Benchmark ===
Time: 286 ms
Client Time: Wed, 16 Jan 2013 13:59:08 UTC
=== Morph Benchmark ===
Time: 334 ms
Client Time: Wed, 16 Jan 2013 13:59:08 UTC
=== Base 64 Benchmark ===
Time: 35 ms
Client Time: Wed, 16 Jan 2013 13:59:08 UTC
=== DOM Benchmark ===
Total Time: 561 ms
Breakdown:
  Append:  20ms
  Prepend: 25ms
  Index:   465ms
  Insert:  22ms
  Remove:  29ms
Client Time: Wed, 16 Jan 2013 13:59:09 UTC
So what does all that mean? What does each of these tests actually do?
Now it all gets interesting. :)

I spent some time taking apart the exact JavaScript functions executed for each of these tests. I have detailed them below including the source snippet.

Array Manipulation Benchmark
function arrayBenchmark() {
    for (var ret = [], tmp, n = 2e3, j = 0; j < n * 15; j++) {
        ret = [];
        ret.length = n
    }
    for (var j = 0; j < n * 10; j++)
        ret = new Array(n);
    ret = [];
    for (var j = 0; j < n; j++)
        ret.unshift(j);
    ret = [];
    for (var j = 0; j < n; j++)
        ret.splice(0, 0, j);
    for (var a = ret.slice(), j = 0; j < n; j++)
        tmp = a.shift();
    for (var a = ret.slice(), j = 0; j < n; j++)
        tmp = a.splice(0, 1);
    ret = [];
    for (var j = 0; j < n; j++)
        ret.push(j);
    for (var a = ret.slice(), j = 0; j < n; j++)
        tmp = a.pop()
}
This test builds up an array in memory and then performs various manipulations on it. The manipulations performed are (in order) unshift, splice, shift, splice, push, pop. Just simple memory manipulation games.

Morph Benchmark
function morphBenchmark() {
    var loops = 30, nx = 120, nz = 120;
    function morph(a, f) {
        for (var PI2nx = Math.PI * 8 / nx, sin = Math.sin, f30 = -(50 * sin(f * Math.PI * 2)), i = 0; i < nz; ++i)
            for (var j = 0; j < nx; ++j)
                a[3 * (i * nx + j) + 1] = sin((j - 1) * PI2nx) * -f30
    }
    for (var a = Array(), i = 0; i < nx * nz * 3; ++i) a[i] = 0;
    for (var i = 0; i < loops; ++i) morph(a, i / loops)
}
This test performs mathematical manipulations in a loop on an array variable. This is again a memory and CPU stress test.

Base 64 Benchmark
function base64Benchmark() {
    var toBase64Table = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/", base64Pad = "=";
    function toBase64(data) {
        for (var result = "", length = data.length, i = 0; i < length - 2; i += 3) {
            result += toBase64Table[data[i] >> 2];
            result += toBase64Table[((data[i] & 3) << 4) + (data[i + 1] >> 4)];
            result += toBase64Table[((data[i + 1] & 15) << 2) + (data[i + 2] >> 6)];
            result += toBase64Table[data[i + 2] & 63]
        }
        if (length % 3) {
            i = length - length % 3;
            result += toBase64Table[data[i] >> 2];
            if (length % 3 == 2) {
                result += toBase64Table[((data[i] & 3) << 4) + (data[i + 1] >> 4)];
                result += toBase64Table[(data[i + 1] & 15) << 2];
                result += base64Pad
            }
            else {
                result += toBase64Table[(data[i] & 3) << 4];
                result += base64Pad + base64Pad
            }
        }
        return result
    }
    var toBinaryTable = [-1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 62, -1, -1, -1, 63, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, -1, -1, -1, 0, -1, -1, -1, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, -1, -1, -1, -1, -1, -1, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, -1, -1, -1, -1, -1];
    function base64ToString(data) {
        for (var result = "", leftbits = 0, leftdata = 0, i = 0; i < data.length; i++) {
            var c = toBinaryTable[data.charCodeAt(i) & 127], padding = data[i] == base64Pad;
            if (c == -1)
                continue;
            leftdata = leftdata << 6 | c;
            leftbits += 6;
            if (leftbits >= 8) {
                leftbits -= 8;
                if (!padding) result += String.fromCharCode(leftdata >> leftbits & 255);
                leftdata &= (1 << leftbits) - 1
            }
        }
        if (leftbits)
            throw Components.Exception("Corrupted base64 string");
        return result
    }
    for (var str = [], i = 0; i < 819; i++)
        str.push(String.fromCharCode(25 * Math.random() + 97));
    str = str.join("");
    for (var base64, loops = 1, i = 0; i <= loops; i++)
        base64 = toBase64(str);
    for (var i = 0; i <= loops; i++)
        base64ToString(base64)
}
A long and intimidating function that in essence converts values to and from base64 encoding. Once again a fairly simple memory and CPU stress test.

DOM Benchmark
This test is actually a combination of 5 sub tests. I have broken up the main function below so as to explain it more clearly.

function domBenchmark() {
    var count = 1500, divs = new Array(count);
This first JavaScript snippet creates the base structure within which the other functions operate. Basically creating the variables used by each sub function. Note that the 5 following functions are nested within the domBenchmark() function and are not root functions.
Later in this function the HTML DOM structure is created. The DIV element all functions relate to is created in just a few lines.

DOM Benchmark - Append
function testAppend(div) {
        for (var i = 0; i < count; i += 1) {
            var add = document.createElement("div");
            div.appendChild(add)
        }
    }
A function that creates 1500 DIV elements all nested under the source DIV.

DOM Benchmark - Prepend
function testPrepend(div) {
        for (var i = 0; i < count; i += 1) {
            var add = document.createElement("div");
            div.insertBefore(add, div.firstChild)
        }
    }
A function that creates an additional 1500 DIV elements, but instead of appending them as before it prepends them. They are added to the top of the parent DIV element instead of the bottom.

DOM Benchmark - Index
function testIndex(div) {
        for (var i = 0; i < count; i += 1)
            divs[i] = div.childNodes[count * 2 - i * 2 - 1]
    }
A function that indexes 1500 of the DIV elements created in the testAppend() and testPrepend() functions. It only indexes every second DIV.
This is considered indexing as it takes the DIV structure and stores it in an array which acts as an index table. Again, this is meant to stress memory and CPU.

DOM Benchmark - Insert
function testInsert(div) {
        for (var i = 0; i < count; i += 1) {
            var add = document.createElement("div");
            div.insertBefore(add, divs[i])
        }
    }
A function that takes the index table array variable and inserts each of its DIV elements into the HTML DOM. A very similar test to testPrepend() but working from array data instead of freshly created elements. A simple array insert test to stress memory and CPU.

DOM Benchmark - Remove
function testRemove(div) {
        for (var i = 0; i < count; i += 1)
            div.removeChild(divs[i])
    }
The final test function which removes 1500 of the DIV elements created earlier.

DOM Benchmark - Run Tests
var div = document.createElement("div");
    div.style.display = "none";
    div.setAttribute("id", "domBenchmarkDiv");
    document.body.appendChild(div);
    var start, end;
    start = new Date;
    testAppend(div);
    end = new Date;
    var appendTime = end - start;
    start = new Date;
    testPrepend(div);
    end = new Date;
    var prependTime = end - start;
    start = new Date;
    testIndex(div);
    end = new Date;
    var indexTime = end - start;
    start = new Date;
    testInsert(div);
    end = new Date;
    var insertTime = end - start;
    start = new Date;
    testRemove(div);
    end = new Date;
    var removeTime = end - start;
    document.body.removeChild(div);
    var total = appendTime + prependTime + insertTime + indexTime + removeTime, results = "Breakdown:\r\n  Append:  " + appendTime + "ms\r\n  Prepend: " + prependTime + "ms\r\n  Index:   " + indexTime + "ms\r\n  Insert:  " + insertTime + "ms\r\n  Remove:  " + removeTime + "ms\r\n";
    return [total, results]
}
And finally, actually executing all the above functions. They are run in turn, in order, right after creating the parent DIV element as well as the time tracking variables used for duration reporting of the results.
The final steps clean up the parent DIV element and return the stress test results as text.


As you can clearly see, these stress tests have very little to do with actual CRM performance on either the client or server. They are memory and CPU client stress tests designed to measure how browsers behave.
It is important to point out that these functions are rather short and simple and as such are not affected by antivirus scans (the dreaded McAfee for example, which can kill CRM through its ScriptScan feature).

I hope that sums it all up well enough.
I am open to any additional questions.

Tuesday, 15 January 2013

Picking apart CRM 2011's diagnostics page - Part 1

A few days ago I was called in to examine client-side CRM 2011 performance issues. I love a good examination process that lets me dig in!
After going through the obvious culprits such as update rollups, installed antivirus software, client and server specifications etc. it eventually turned out to be a server I/O problem which was causing longer than expected load times. I used HttpWatch and Wireshark to point me in that direction and then had some discussions with the IT team.
All that, however, wasn't the interesting part. Not even close.

During this examination I used CRM 2011's diagnostics page (http://<server>/<org>/tools/diagnostics/diag.aspx or http://<server>/tools/diagnostics/diag.aspx). This feature started out in CRM Online and came to the on-premise version in Update Rollup 4. More about that in Rhett Clinton's article. The reason I started picking apart the diagnostics page was that I was getting fairly good results despite CRM behaving worse than expected. Seeing that, I decided to check what exactly the diagnostics page does and measures. Below is a sample of this page from one of my test systems.

CRM 2011 Diagnostics Page
One thing I've noticed about the diagnostics figures is that they vary greatly between different networks and client computers. What does seem fairly constant (putting aside any real issues) is the ratio between the 4 JavaScript sections. While the numbers may vary, the DOM Benchmark will always be the longest and the Base64 test will always be the shortest.

The first thing I did was to understand what each of the tests stands for. These are as follows:
  1. Latency Test
    Not exactly what I'd call a latency test but as close as you can get with HTTP. This calculates the average time taken over 20 downloads of a very small text file (only 12 bytes long).
    The file downloaded is /_static/Tools/Diagnostics/smallfile.txt
  2. Bandwidth Test
    Performs downloads of image files in increasing size. Slightly similar in concept to many Internet based speed test sites. These download speeds are then averaged out (weighted of course) to give one average download speed value.
  3. Browser Info
    Basic JavaScript pull of the local browser details such as browser name, version, cookie status, platform (OS) and the user-agent string.
  4. IP Address
    Reports the IP address of the client computer as known to the server. This is passed as a variable in the diag.aspx file. The IP address is server-side dynamic and represents the IP address which was used to contact the server with.
  5. JavaScript tests (there are 4)
    Runs various timed JavaScript loops and returns their execution times. This is basically a memory/CPU stress test on the client machine. I will elaborate more on these in Part 2 of this article.
  6. Organization Info
    Basic server info such as organization name, time on server and URL.
These tests output quite a lot of useful data to the Results text area. When running these tests I highly recommend looking at that and not just the values in the results table.
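To make the latency calculation concrete, here's my own simplified sketch (not the actual diag.aspx source) of how timing a batch of small downloads and averaging them works:

```javascript
// Sketch only: average a set of timed download durations (in milliseconds),
// in the same spirit as the latency test's 20 downloads of smallfile.txt.
function averageLatency(samples) {
    var total = 0;
    for (var i = 0; i < samples.length; i++)
        total += samples[i];
    return total / samples.length;
}

// Invented example durations; the real test records one per download.
var durations = [32, 28, 35, 30, 31];
var avg = averageLatency(durations);
```

The real page times each request with Date objects (much like the DOM benchmark shown in Part 2) and feeds those durations into an average of this sort.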

In the second part of this article I will dive into the JavaScript tests phase. If you're like me, that's the most interesting part. :)

Monday, 7 January 2013

CRM 2011 Update Rollup 12 imminent

*** Updates below

There has just been a refresh on a lot of the CRM 2011 downloads on Microsoft's downloads page. If you click search and sort by 'newest to oldest' it'll bring up quite a few CRM 2011 downloads on the first 3 pages.
One of the downloads refreshed is the 5.0.13 SDK download. To me this points to an imminent Update Rollup 12 release which will, hopefully, also bring new Service Release functionality. This was previously meant to be released with the now non-existent Update Rollup 9 (as you may recall we jumped from Update Rollup 8 to 10).

I have my fingers crossed for this!


*** Update 14/01/2013
As of 07/01/2013 the Update Rollup 12 download link has been live with download links only for the client elements of the update. Going over the KB article linked from the download page it seems Microsoft was aiming at releasing the on-premise server update files on 10/01/2013. That was 4 days ago.
My personal estimate is that the client updates went live in order to support the server updates deployed on Microsoft's CRM Online hosting service. I am not sure why there's a delay in releasing the on-premise server updates and can only guess there have been some issues with the deployment to CRM Online. Hopefully we should be getting our much anticipated on-premise Update Rollup 12 very soon.

*** Update 15/01/2013
Going through the KB article this morning I noticed a slight change in wording. Microsoft are no longer promising the on-premise server components on 10 January as before but rather just in January 2013. The exact wording follows.
Update Rollup 12 for Microsoft Dynamics CRM 2011 will be available for on-premise customers in January 2013.

*** Update 21/01/2013
Yet more delays with the on premise server components of Update Rollup 12. It seems Microsoft found a critical bug with the Update Rollup 12 release just after publishing it and it was swiftly removed. The message about this went out on the 15th but has only found its way to my Google search feed today. The exact wording follows.
I'd like to give an update on the Microsoft Dynamics CRM 2011 Update Rollup (UR) 12 Release. We originally made UR12 available on Thursday, January 10th, to deliver multi-browser support for our On Premises customers. After discovering an issue that could potentially impact a customer's database, we withdrew the UR12 Server bits to ensure that no On Premises customers would be affected. Unfortunately, these bits were available on the Microsoft Download Center for a short period of time. If you downloaded the UR12 Server bits, please do not install them. We plan to repost the UR12 Server bits within the next week, and we will keep you informed as to when they are available. We have taken measures to improve our engineering processes and methodologies going forward, and we take your feedback very seriously. We apologize for any inconvenience this has caused. 

** Update 30/01/2013
Surprisingly, Microsoft kept the vague (and once altered) release date of January 2013. Update Rollup 12 for CRM 2011 has been officially and fully released today.
Finally!