{Dynamics CRM + Maintenance Page} How to pull down specific CRM Organization for maintenance in a multi-tenant CRM environment with IFD configured

I know the title of this blog is a bit of a mouthful, but if the scenario sounds familiar, I would suggest you read on.

So first, the requirement. Our client had two CRM organizations in a single installation of CRM, and both organizations are on IFD.

Over time, one organization, owing to its higher transaction volume and usage, grew significantly bigger, and the business decided to move it to a separate server altogether, on a newer version of CRM. To illustrate this better, let us put names to things.

Org1 – https://org1.contoso.com
Org2 – https://org2.contoso.com

Org1 was to be moved to a new server running a newer version of CRM. It was decided we would have three days of downtime for this, and users were informed accordingly that https://org1.contoso.com would be unavailable for three days. During this time, however, users should still be able to access https://org2.contoso.com.

To bring a CRM website down, we can usually create an HTML file with the appropriate markup, name it app_offline.htm and place it in the root folder of the website. Make sure you name it exactly like this; otherwise it will not work. However, in our case this would not do, since it would bring the whole website down and org2 would not be accessible either.
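For reference, app_offline.htm is just an ordinary HTML page; a minimal one (the wording below is only a placeholder) could look like this:

```html
<!DOCTYPE html>
<html>
<head>
    <title>Site under maintenance</title>
</head>
<body>
    <!-- ASP.NET serves this page for every request while the file exists in the web root -->
    <h1>We will be right back</h1>
    <p>The site is down for scheduled maintenance. Please check back later.</p>
</body>
</html>
```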

So we were left with no choice but to configure a URL rewrite rule.

Go to IIS –> Microsoft Dynamics CRM Website –> URL Rewrite. If you do not see this icon, the URL Rewrite module is not installed; you will need to install it first.



Click on “Add Rule” and select “Blank Rule” from the menu.



Your final rule would look like this.



And voila! You are done. Users will still be able to access org2.contoso.com, but when they try to access org1.contoso.com, they will get the maintenance page.

I am redirecting here to a local page called maintenance.htm, where the appropriate message with the downtime notification is shown to the user.
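For anyone who cannot see the screenshot, the equivalent rule expressed directly in the Dynamics website's web.config would look roughly like the following (host name and page name are from this example; treat it as a sketch, not the exact rule from the screenshot):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Org1 maintenance" stopProcessing="true">
        <!-- match every request... -->
        <match url=".*" />
        <conditions>
          <!-- ...but only when the request is for the org1 host name -->
          <add input="{HTTP_HOST}" pattern="^org1\.contoso\.com$" />
          <!-- and do not rewrite the maintenance page itself -->
          <add input="{REQUEST_URI}" pattern="maintenance\.htm" negate="true" />
        </conditions>
        <action type="Rewrite" url="maintenance.htm" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```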


Hope this helps!

{Dynamics CRM + Web API + Plugins} Can we make Web API calls from Plugins in Dynamics CRM 2016

After my last couple of blogs on the Web API, I have been asked this question repeatedly: can I use the Web API from plugins/custom workflow activities? And I keep asking back: why do you need that? After all, there is the almighty organization service, which virtually allows you to do anything and everything you would need. Honestly, I find very limited use for invoking the Dynamics Web API from plugins, unless you want to get out of the transaction scope under which your plugin is running.

That said, I wondered: why not give it a try? After all, learning something new never hurts. So let's check the feasibility of this.


CRM On-Premise (without IFD)

Honestly, before I tried this code, I had doubts whether it would work at all. However, it worked like a charm. In the code below, I am retrieving an account with a particular GUID.

using (WebClient client = new WebClient())
{
    client.UseDefaultCredentials = true;

    byte[] responseBytes = client.DownloadData(new Uri("<your web api url>/accounts(3C2D2712-E43F-E411-9402-005056AB452C)"));
    string response = Encoding.UTF8.GetString(responseBytes);

    // parse the JSON response here
}
This is a simple example, but the same construct would work for any Web API query. Just paste this code anywhere in your plugin and it should work. Make sure you set UseDefaultCredentials to true so that the Web API call is made under the context of the user the plugin is running as.

This code will work even if you are calling the Web API from a console application. The only difference is that there you would need to specify your credentials explicitly instead of using the default ones:

client.Credentials = new NetworkCredential(<username>, <password>, <domain>);



CRM Online or IFD

Unfortunately, I could not make it work for CRM Online when I wrote this blog. To see it working with the latest version of Dynamics Online, follow the link below.


Now, continuing with this post for the 2016 version with IFD.

The above code would not work for CRM Online, since the WebClient credentials are of type NetworkCredential, whereas you need to pass Office 365 credentials for CRM Online.

For calling the online Web API methods you would need to pass an OAuth token in the header, and for that you would need user interaction. Executing the WebClient code from above with the credentials gives you an unauthorized access error. I even tried to use the ADAL library in the plugin to get the authentication token, but since the plugin runs in sandbox mode, you end up with an error like the one below.

Unhandled Exception: System.ServiceModel.FaultException`1[[Microsoft.Xrm.Sdk.OrganizationServiceFault, Microsoft.Xrm.Sdk, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35]]: Unexpected exception from plug-in (Execute): XrmPlugins.TestEntityPostUpdate: System.TypeLoadException: Inheritance security rules violated while overriding member: 'Microsoft.IdentityModel.Clients.ActiveDirectory.AdalException.GetObjectData(System.Runtime.Serialization.SerializationInfo, System.Runtime.Serialization.StreamingContext)'. Security accessibility of the overriding method must match the security accessibility of the method being overriden. Detail:
<OrganizationServiceFault xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/xrm/2011/Contracts">
  <ErrorDetails xmlns:d2p1="http://schemas.datacontract.org/2004/07/System.Collections.Generic" />
  <Message>Unexpected exception from plug-in (Execute): XrmPlugins.TestEntityPostUpdate: System.TypeLoadException: Inheritance security rules violated while overriding member: 'Microsoft.IdentityModel.Clients.ActiveDirectory.AdalException.GetObjectData(System.Runtime.Serialization.SerializationInfo, System.Runtime.Serialization.StreamingContext)'. Security accessibility of the overriding method must match the security accessibility of the method being overriden.</Message>
  <InnerFault i:nil="true" />
</OrganizationServiceFault>


A pretty ugly error, but what it means is that to acquire the token, the Active Directory Authentication Library would need to run in full trust, whereas plugins registered in the sandbox (which is the only option for online) run only under partial trust.


As of now I am stuck with this. Would definitely like to know if anybody is able to get this working.


Hope this helps!

{Good to Know} Why you should never use a USER_ROLES variable in your Dynamics CRM form scripts

Sometimes simple things can give you nightmares, and the same happened with one of my colleagues. So this is the scenario.

My colleague just needed to identify whether a user has a particular security role, and for that he was using the OOB script method Xrm.Page.context.getUserRoles(). The funny thing is that this method was returning zero roles for him. After hours of cross-checking, he finally turned to me in frustration and we started debugging together.

For the first 15 minutes, I was totally confused. Indeed, Xrm.Page.context.getUserRoles() was returning an empty array. Finally I launched my favourite Chrome Dev Tools and started analyzing the definition of the function. So, in the console of the Chrome developer tools, instead of typing Xrm.Page.context.getUserRoles(), I typed Xrm.Page.context.getUserRoles. Notice the difference here: I did not include the parentheses.



And now you know. This function gets its value from the global variable USER_ROLES. And my developer friend had an array declared at the top of his form load script with exactly the same name. So this variable was getting initialized to an empty array, and no wonder Xrm.Page.context.getUserRoles() was returning an empty array.
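To see the clobbering in isolation, here is a small plain-JavaScript sketch; USER_ROLES and the getUserRoles body here are simplified stand-ins for what CRM actually ships, used only to illustrate the shadowing:

```javascript
// A simplified stand-in for what CRM loads with the form:
// a global holding the role ids, and a function that returns it.
var USER_ROLES = ["<role guid 1>", "<role guid 2>"];
function getUserRoles() {
    return USER_ROLES;
}

console.log(getUserRoles().length); // 2 - works as expected

// What the custom form script did: declare its own array with
// exactly the same name, clobbering CRM's variable.
var USER_ROLES = [];

console.log(getUserRoles().length); // 0 - the roles are "gone"
```

The second `var` does not create a separate variable; in the shared global scope of the form it simply reassigns the one CRM already populated.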

Believe me, you would run out of search phrases on Google trying to identify issues like this, and you may waste hours or days on them. Thanks to the wonderful developer tools in modern browsers, we were able to pinpoint the issue in no time.

There are some other internal user-specific variables which CRM uses. Just good to know, as this may come in handy.



Hope this helps!

{In Depth} Power BI and Dynamics CRM on-premise–When and how it works?

Microsoft has recently been focusing on a cloud-first strategy for Microsoft Dynamics CRM. Every new cool feature you can name, you see online first. However, as with everything, when you focus heavily on one side, some aspects get neglected, and big on-premise customers have been at the receiving end of this lately. Believe me, as a consultant I am faced every now and then with the perennial question from my on-premise customers: "When is that feature coming to on-premise?" And many times I do not have an answer either. That the same product has features which are not present in their version of CRM is something very difficult to explain to a customer.

The same goes for the Power BI integration with Dynamics CRM. Speaking of Power BI, it is an awesome product, and it feels really great when something like this works with Dynamics CRM. With Dynamics CRM Online, this integration simply wows.

However, the same cannot be said about Dynamics CRM on-premise. In this post, I will cover in depth the integration of Dynamics CRM on-premise with Power BI. Sit tight, because we are going to go deep here.

First of all, we need to understand the Power BI connection string for Dynamics CRM. Microsoft Dynamics CRM comes as one of the listed data sources in Power BI, so let us choose that.


Once we try to connect to it, we need to enter the OrganizationData.svc URL. You will be asked to enter your credentials. And voila, you are done!



Wonderful! Just a few clicks and you are all set up with the data, and you can start building wonderful reports. However, we are more interested in the CRM + Power BI connectivity itself, so let's explore the connection string Power BI uses to connect to Dynamics CRM Online. It looks something like the below.


If you explore the connection string, the first thing to note here is the Data Source property. Microsoft has really come up with an inventive name – $EmbeddedMashup(<some random guid>)

I am more interested in the Extended Properties section. It looks like Microsoft has hidden the query definition in there. Seems interesting, right? Let's explore it together.

From its format, the string seems to be base64 encoded. Oh, very simple it is. Let us quickly convert the base64 string to readable text. Below is the sample code to do that. I have written this code in a console application, and I have stored the ExtendedProperties value in the App.config file.

var str = ConfigurationManager.AppSettings["ConnectionString"];
var rawstring = System.Text.Encoding.UTF8.GetString(Convert.FromBase64String(str));


But alas! The rawstring gives the below output. Certainly not what we expected!

PK\0\0\b\0�|iHA�k��\0\0\0�\0\0\0\0\0Config/Package.xml �\0(�\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0���\n�@�_E���jb��&Rt�u�%]C���:�H�BAݺ�\f����v�tn��Aw&!\f<�(#�R�*!�=�I�\ny.*��f��A’���S:ML>t}E��1z̶��U[��\f�0R�/U�����{���C\b��\a�c�͢����<�?1n�Ǝ�ʸ��b�~~�’PK\0\0\b\0�|iH�髤\0\0\0�\0\0\0\0\0[Content_Types].xml �\0(�\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0m�K�0\fD�y���@\b5e܀\vD�����E�l,8W mw��g�y��zW�d\a�1��)�%\br��z�*���{8���(�uQA���tdu,| ��ƏVs>��6w�n�r��;&ǒ�PWgj�4���,��\aqZss��ĸ����?y�����$m�v!q^PK\0\0\b\0�|iH��r{�\0\0\0�\0\0\0\0\0Formulas/Section1.m �\0(�\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0m��\n�@��@��8�\b��F��R��!ȺY��N�6�|ws���if�����F���8�#߂�ZL��v�\vb�,�Rlą8�Ġ�viH�;`H�Du\"[�_e��iv�2E��i}7���O�WH>[��<����uh�=��l�|�’��%�F�<����@9+Tc�;7ğgY�ʰ��#e����PK-\0\0\0\b\0�|iHA�k��\0\0\0�\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0Config/Package.xmlPK-\0\0\0\b\0�|iH�髤\0\0\0�\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0�\0\0\0[Content_Types].xmlPK-\0\0\0\b\0�|iH��r{�\0\0\0�\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0�\0\0Formulas/Section1.mPK\0\0\0\0\0\0�\0\0\0�\0\0\0\0


Don't be upset. Even in this weird string we have some rays of hope. Just check the highlighted names: Config/Package.xml, [Content_Types].xml, Formulas/Section1.m. Do they look familiar? Well, from a simple guess, all of these seem to be file names, and what else other than an archive format like .zip can store multiple files within it? (The leading "PK" bytes are another giveaway: that is the signature of a zip file.) So let's save it as a zip file. Below is the code for the same.

var str = ConfigurationManager.AppSettings["ConnectionString"];
System.IO.File.WriteAllBytes(@"C:\Debajit\Personal\My Projects\Power BI\connection.zip", Convert.FromBase64String(str));

Now when I open the connection.zip, below is the screenshot.


And our guess was correct! Now let's quickly see the hidden connection string. It is actually stored in the file Section1.m inside the Formulas folder. When I open the file in Visual Studio, I can see the query below.

section Section1;

shared #"AccountSet (2)" = let
    Source = OData.Feed("
    AccountSet_table = Source{[Name="AccountSet",Signature="table"]}[Data]


As you can clearly see, Microsoft uses an internal function, OData.Feed, to connect to the Dynamics CRM Online feed. So there is no trick we can play on this connection string to make it work for on-premise; it is built-in stuff after all.

So how can we make it work for on-premise environments? Microsoft says you can use the OData feed. However, let's explore where it works and where it might not work for you. In both examples, I will make use of Power BI Desktop.

Scenario 1: On-Premise (non-IFD)

No brainer here. Instead of the Dynamics CRM Online data source, select OData Feed here. In most cases you need to be inside your network to connect to the Dynamics CRM on-premise OData URL.


Then enter the OData URL for your organization. It might ask you for credentials; if so, enter them. Once you are connected, you will need to select the tables for your report. Here I have selected RoleSet to create an absolutely useless pie chart, which shows the count of roles by business unit.


I save the file as crmdata.pbix. Then I go to https://app.powerbi.com and, in Get Data –> Files –> Local File, upload the recently saved Power BI file. As you can see, my Power BI workspace now shows both the report and the dataset.


So now what? You have uploaded your on-premise report, but how will Power BI refresh it when the underlying data changes in your on-premise CRM? For that you need to schedule a refresh of the dataset, crmdata here. But how is that going to happen? After all, how can Power BI, hosted in the cloud, connect to your on-premise data source? Well, Microsoft has your back here: for this you need to install the Power BI Personal Gateway.

You can download the Power BI Personal Gateway from the following link – https://www.microsoft.com/en-us/download/details.aspx?id=47753

A detailed explanation of how the Power BI Personal Gateway works can be found at the following link – https://powerbi.microsoft.com/en-us/documentation/powerbi-personal-gateway/

Just to give you a brief understanding:

  • You install the Power BI Personal Gateway on your laptop/desktop/a common server machine. It installs as a service (not always, though; please refer to the documentation) and keeps running as long as the machine it is installed on is up and running.
  • Power BI silently connects to the Personal Gateway at the scheduled refresh times (I will come to how to configure these).
  • The connection happens through an Azure Service Bus namespace. For this, some outbound ports need to be opened; please refer to the documentation above for the details.

Once you have installed and configured the Power BI Personal Gateway, open Power BI and go to Settings –> Datasets –> select the dataset that you want to configure.



If you expand Gateway connections, you will find that Power BI detects whether the gateway is online.


Now expand Data Source Credentials. You might see an error icon here. Don't worry; you are only interested in the OData connection, so set up the OData credentials here. Once successful, your screen should look like the below.


The final step is very easy: the refresh schedule. I think the below screenshot is good enough here; no need for words.


All set and done: as long as your Power BI Personal Gateway is running, your report will keep refreshing.


Scenario 2 – On-Premise with IFD

Not such good news here, especially if your IFD is configured with a proxy server, which is usually the case in most standard implementations. In that case, once you enter the OData URL, you will receive an error like this in Power BI.


This is because when you configure IFD with an ADFS proxy server, you have a home realm URL. Remember, at the beginning of the blog, when we explored the connection string, we found that Microsoft uses the OData.Feed API to connect. Unfortunately, the OData.Feed API will not validate against the home realm URL in this case.

All you can do in this case is use standalone Excel files and the Power Query add-in for Excel to refresh the data. Check out this wonderful article on the same topic. Please note that even after applying the trick, the Excel files will refresh; but if you try to upload the same workbook to Power BI, it won't work, because Power BI will not take the connection string embedded in the uploaded workbook.



Hope this helps the next time somebody wants Power BI integration with CRM on-premise!

{KnowHow} How to use Discovery Service Web API of Dynamics CRM 2016

Recently I posted on my blog about how to execute Web API queries from an external ASP.NET web application to retrieve data. Details can be found here – https://debajmecrm.com/2016/02/29/knowhow-how-to-execute-web-api-calls-to-microsoft-dynamics-crm-from-an-external-asp-net-web-application/.

After this post, people have been asking me how to leverage the discovery service Web API with Dynamics CRM 2016, the most common question being: "Why am I not getting any response when I query for instance data through the Web API?"

Well, let's jump to an example here. The following is code a blog reader sent me. In a web resource on a form, he simply invoked the Web API endpoint of the discovery service to get the instance details.

function getOrgs() {
    var req = new XMLHttpRequest();
    req.open("GET", "https://disco.crm.dynamics.com/api/discovery/v8.0/Instances");

    req.setRequestHeader("Accept", "application/json");
    req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
    req.setRequestHeader("OData-MaxVersion", "4.0");
    req.setRequestHeader("OData-Version", "4.0");
    req.onreadystatechange = function () {
        if (this.readyState == 4 /* complete */) {
            req.onreadystatechange = null;
            if (this.status == 200) {
                var discovery = JSON.parse(this.response);
                alert("User Id : " + discovery.Url);
            } else {
                var error = JSON.parse(this.response).error;
                alert(error.message);
            }
        }
    };
    req.send();
}
He informed me that he was getting an empty response, and no error either. I took his code, and when I inspected the traffic through Fiddler, I found the following error at the connection level:

"Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'https://xrmtr14.crm.dynamics.com' is therefore not allowed access"

https://xrmtr14.crm.dynamics.com is my CRM online instance here .

So what is ‘Access-Control-Allow-Origin’ header?

Access-Control-Allow-Origin is a CORS (Cross-Origin Resource Sharing) header.

When Site A tries to fetch content from Site B, Site B can send an Access-Control-Allow-Origin response header to tell the browser that the content of this page is accessible to certain origins. (An origin is a domain, plus a scheme and port number.) By default, Site B’s pages are not accessible to any other origin; using the Access-Control-Allow-Origin header opens a door for cross-origin access by specific requesting origins.
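To make this concrete, the failed exchange in our case looks roughly like the following (the host names are the ones from this example; the headers shown are illustrative rather than a full capture):

```
# Browser preflight, sent from the CRM form's origin:
OPTIONS /api/discovery/v8.0/Instances HTTP/1.1
Host: disco.crm.dynamics.com
Origin: https://xrmtr14.crm.dynamics.com
Access-Control-Request-Method: GET

# For the browser to allow the call, the response would need to contain:
HTTP/1.1 200 OK
Access-Control-Allow-Origin: https://xrmtr14.crm.dynamics.com
```

Since the discovery endpoint sends no such header, the browser blocks the response before our script ever sees it.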

So it is clear that if you want to access the discovery service Web API endpoint from within CRM, the discovery service endpoint will not allow you to do so, since the Access-Control-Allow-Origin header is not present in the server response. And if you really ask me, that makes sense as well: after all, why would we need to query for the instance we are already in?

The question now is: how, then, can we use the discovery service Web API? If you are well versed with the concept of OAuth, let me tell you that the discovery service Web API endpoints can also be accessed through OAuth 2.0.

So we will connect to the discovery Web API endpoint from an ASP.NET web application. Please follow the link – https://debajmecrm.com/2016/02/29/knowhow-how-to-execute-web-api-calls-to-microsoft-dynamics-crm-from-an-external-asp-net-web-application/ – as I am going to use the same example here. The only difference is the code to access the instances.

So all the steps are the same, except that I have added a new button called 'Get Instance Details' which, when clicked, fetches the instance details. Below is the code I wrote in the button's event handler.


if (Session["AuthResult"] != null)
{
    var authResult = (AuthenticationResult)Session["AuthResult"];
    var webRequest = (HttpWebRequest)WebRequest.Create(new Uri("https://disco.crm.dynamics.com/api/discovery/v8.0/Instances"));

    webRequest.Method = "GET";
    webRequest.ContentLength = 0;
    webRequest.Headers.Add("Authorization", String.Format("Bearer {0}", authResult.AccessToken));
    webRequest.Headers.Add("OData-MaxVersion", "4.0");
    webRequest.Headers.Add("OData-Version", "4.0");
    webRequest.ContentType = "application/json; charset=utf-8";

    using (var response = webRequest.GetResponse() as System.Net.HttpWebResponse)
    {
        // Get reader from response stream
        using (var reader = new System.IO.StreamReader(response.GetResponseStream()))
        {
            var instances = new List<Instance>();
            string responseContent = reader.ReadToEnd();

            dynamic dynamicObj = JsonConvert.DeserializeObject(responseContent);

            foreach (var data in dynamicObj.value)
            {
                var instance = new Instance
                {
                    Url = data.Url,
                    Id = data.Id,
                    FriendlyName = data.FriendlyName,
                    UniqueName = data.UniqueName
                };
                instances.Add(instance);
            }

            InstanceGrid.DataSource = instances;
            InstanceGrid.DataBind();
        }
    }
}

All I have done in the above code block is change the URL to point to the discovery service web API.


And this is final screenshot



Hope this helps!

Configure your Dynamics CRM as identity provider for an external web application

How common is it these days to land on a website where you can log in via Facebook or via your Google account? Many times in our projects we need to develop a custom ASP.NET web portal where users might need to authenticate against Microsoft Dynamics and fetch data. In this example, I will show you how to configure single sign-on between your external web application and Dynamics CRM Online.

I will achieve this using the Access Control Service (ACS) feature in Microsoft Azure.

  • The first step in using ACS is to create an Access Control namespace in Azure. Log in to the Azure portal and create an Access Control namespace.


  • Once the namespace is created successfully, click on ‘Manage’ at the bottom of the screen.


  • The management portal will open. Click on Management Service –> Symmetric Key –> copy the key. Store it, as you will need it later.





  • Now we have to add the Access Control namespace as an application in the CRM Active Directory. For this, go to the CRM AD and add a web application/web API with both the Sign-On URI and the App ID URI set to https://<service bus namespace>.accesscontrol.windows.net/. In case you are not aware of how to add an application to Azure AD, I would strongly suggest going through my blog post below, where I have explained in depth how to add and configure an application.


  • Once the application is added, open it and click on View Endpoints at the bottom of the screen. Copy the federation metadata URL from the pop-up.


  • Now go to the Access Control namespace management portal one more time (Access Control Namespace –> Manage) and set up Dynamics CRM as an identity provider. Enter the below information in the identity provider screen.


As you can see, in the WS-Federation Metadata, I have entered the federation metadata URL that I copied from the earlier step. Click on Save and you are done with the changes in the Azure Management portal.

  • Create an ASP.NET web application project with just a single page. I have named my page default.aspx. I will demo this using Visual Studio 2012, since I will use the Identity and Access Tool extension, which is available for VS 2012. My default.aspx page just contains a grid view.


  • Make sure in the project properties, the target framework is 4.5 (not 4.5.1 or 4.5.2)
  • In the default.aspx.cs, just enter the below code.

public partial class _default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (System.Threading.Thread.CurrentPrincipal.Identity is ClaimsIdentity)
        {
            var claimsIdentity = (ClaimsIdentity)System.Threading.Thread.CurrentPrincipal.Identity;
            var claimsData = new List<ClaimsData>();

            foreach (var claim in claimsIdentity.Claims)
            {
                var claimData = new ClaimsData
                {
                    ClaimValue = claim.Value,
                    ClaimType = claim.Type,
                    ClaimValueType = claim.ValueType
                };
                claimsData.Add(claimData);
            }

            this.ClaimsGrid.DataSource = claimsData;
            this.ClaimsGrid.DataBind();
        }
    }
}

class ClaimsData
{
    public string ClaimValue { get; set; }

    public string ClaimType { get; set; }

    public string ClaimValueType { get; set; }
}

  • Now right-click the project and select Identity and Access from the menu. If you do not see this option, download the Identity and Access Tool extension for Visual Studio. You may need to restart Visual Studio for the change to take effect. If even after installation you cannot see this menu option, make sure your project targets .NET Framework 4.5 exactly.
  • In the providers tab, select ‘Use the Azure Access Control Service’


  • Click the ‘Configure’ link


  • Enter the name of the access control namespace and the symmetric key that we copied at the very beginning of this article from the management portal.


  • Click on 'OK' and you are all set. Your web.config will be updated accordingly.
  • Now when you run the application, you are redirected to the Microsoft login page, and once you enter your correct CRM credentials, you are redirected to your default page with the appropriate claims. The code in the Page_Load of default.aspx.cs pasted above parses the claims and binds them to a grid view. Below is the final screenshot as it looks on my screen.
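Incidentally, the web.config entries the Identity and Access Tool writes are of roughly the following shape. This is only a sketch with placeholder values (realm, port, namespace); it is not the exact output the tool generated for my project:

```xml
<system.identityModel>
  <identityConfiguration>
    <!-- the realm/audience configured for this application -->
    <audienceUris>
      <add value="https://localhost:44300/" />
    </audienceUris>
  </identityConfiguration>
</system.identityModel>
<system.identityModel.services>
  <federationConfiguration>
    <!-- unauthenticated requests are redirected to the ACS namespace -->
    <wsFederation passiveRedirectEnabled="true"
                  issuer="https://<your namespace>.accesscontrol.windows.net/v2/wsfederation"
                  realm="https://localhost:44300/"
                  requireHttps="true" />
  </federationConfiguration>
</system.identityModel.services>
```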


As you can see from the above screenshot, we get multiple claims including the logon name which we can use to retrieve data in the context of the authenticated user.

So you have achieved single sign-on for your application with Dynamics CRM acting as the identity provider. Wasn't that easy? Well, Windows Azure ACS makes it easy for you.

Hope this helps!