Thursday, November 15, 2018

Azure Website–Allowing different file types for download

An Azure website out of the box will not allow you to download static content files such as JSON, MP4, etc. If you need to enable this feature, you need to add a web.config file in the site/wwwroot folder with the following contents:

<?xml version="1.0"?>
<configuration>
  <system.webServer>
    <staticContent>
      <!-- following needed only if extensions were defined at a different level -->
      <!-- <remove fileExtension=".json" /> -->
      <!-- for a good list of other mimetypes see: -->
      <mimeMap fileExtension=".json" mimeType="application/json" />
      <mimeMap fileExtension=".mp4" mimeType="video/mp4" />
      <mimeMap fileExtension=".ogg" mimeType="audio/ogg" />
      <mimeMap fileExtension=".m4a" mimeType="audio/mp4" />
      <mimeMap fileExtension=".flv" mimeType="video/x-flv" />
      <mimeMap fileExtension=".woff" mimeType="application/font-woff" />
      <mimeMap fileExtension=".woff2" mimeType="application/font-woff2" />
      <mimeMap fileExtension=".ttf" mimeType="application/font-ttf" />
      <mimeMap fileExtension=".csv" mimeType="text/plain" />
    </staticContent>
  </system.webServer>
</configuration>

You can pick and choose the values you need for file extensions. Note that the fileExtension values must include the leading dot.
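Incidentally, the extension-to-type pairs above can be sanity-checked against Python's stdlib MIME table:

```python
# Sanity-check common extension -> MIME type pairs using Python's stdlib table
import mimetypes

print(mimetypes.guess_type("clip.mp4")[0])   # video/mp4
print(mimetypes.guess_type("data.json")[0])  # application/json
```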

Wednesday, November 14, 2018

Dynamics CRM Plugin Registration Tool Log File Location

The Plugin Registration Tool stores its log files at: “C:\Users\{user}\AppData\Roaming\Microsoft\Microsoft Dynamics365© Plug-in Registration Tool\”

You can get to it via this shortcut: %appdata%\Microsoft\Microsoft Dynamics365© Plug-in Registration Tool\

And while we are at it, the path to XRM Toolbox’s logs is: %appdata%\MscrmTools\XrmToolBox\Logs\

Thursday, November 01, 2018

Currencies and Strong vs Weak terminology

Note to self:

1. When a currency is labelled as being strong, it means one unit of it buys more of the other currency (eg: the US$ being stronger than the Indian Rupee means you get more rupees per US$).

2. When a currency is labelled as being weak, it means one unit of it buys fewer of the other currency (eg: the US$ being weaker than the Indian Rupee means you get fewer rupees per US$).

So in this chart, the US$ has become stronger when compared to the INR (because you are getting more INR per $ today (INR 74.4/$) than 52 weeks ago (INR 63.2/$)).
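The note above is just arithmetic; here is a quick sketch with made-up rates (illustrative only, not quotes from the chart):

```python
# "Stronger" = one US$ buys more INR. Rates below are illustrative, not real quotes.
rate_then = 60.0  # INR per US$ at some earlier date
rate_now = 75.0   # INR per US$ today

usd = 100
print(usd * rate_then)  # 6000.0 INR then
print(usd * rate_now)   # 7500.0 INR now: more INR per $, so the US$ strengthened
```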


When a currency is stronger than its counterpart, different people benefit: the weaker side's exporters are happier as their exports are cheaper, and the stronger side's exporters are less happy as their exports cost more.

| Party | US$ stronger / INR weaker | US$ weaker / INR stronger | Examples |
|---|---|---|---|
| US Exporter | Exports cost more | Exports cost less | Caterpillar, Boeing, US software companies |
| US Importer | Imports cost less | Imports cost more | Walmart, World Market |
| US Currency Holder | More INR per $ | Fewer INR per $ | American tourist to India |
| Indian Exporter | Exports cost less | Exports cost more | Indian software companies, Indian manufacturers |
| Indian Importer | Imports cost more | Imports cost less | International companies that manufacture their goods elsewhere and import to India, Indian resellers |
| Indian Currency Holder | Fewer $ per INR | More $ per INR | Indian tourist/student travelling to US |

Thursday, October 18, 2018

Cannot convert the literal 'undefined' to the expected type 'Edm.Guid'

If you get this error when working in JavaScript with Dynamics CRM (or any system that uses OData), you most likely need to set up your object like this:

var request = {
    ObjectId: { guid: "E8C656B7-6AD1-E811-A967-000D3A30D5DB" }
};

Otherwise, the request that gets sent will contain the literal value 'undefined', and the response will contain an error with the following message:

message=An error occurred while validating input parameters: Microsoft.OData.ODataException: Cannot convert the literal 'undefined' to the expected type 'Edm.Guid'. ---> System.FormatException: Guid should contain 32 digits with 4 dashes (xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx).
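The server-side check that fails here is plain GUID parsing; the same failure mode can be reproduced with any GUID parser, e.g. Python's uuid module:

```python
# A well-formed GUID parses; the literal string 'undefined' (what an unset
# JavaScript variable ends up as in the payload) does not.
import uuid

good = uuid.UUID("E8C656B7-6AD1-E811-A967-000D3A30D5DB")
print(good)  # e8c656b7-6ad1-e811-a967-000d3a30d5db

try:
    uuid.UUID("undefined")
except ValueError as err:
    print("rejected:", err)
```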

Tuesday, October 16, 2018

Dynamics CRM–Business Rules or Workflows?

When should you use a business rule (BR) and when should you use workflows?

The first thing to understand is that a business rule is just another type of process (just like Workflows and Business Process Flows). You can validate this by performing an Advanced Find and setting the category to Business Rule. Other options for category include Workflow, Dialog, Action, and Business Process Flow.


Now when you create a business rule, there are 3 scope options (2 really): one applies it to forms (all forms or specific ones), and the other applies it to the entity. When you choose entity, create your BR, and activate it, you will find 2 entries in Processes with the same name. One of these handles the JavaScript part (which runs client side). The other is a synchronous workflow that runs on create and on update of the fields that appear in the business rule's conditions. You can look at the JavaScript that will run in the ClientData field on Process; the XAML field contains the workflow definition.

Why is this important to know? Here is why:

1. If your logic needs to run client side, then Business Rules are your best option. This is especially true if you want to show error messages or perform data validation.

2. But if you have an entity-level business rule that doesn't really contain validation or error messages, then you have a choice: if you use a Workflow, you can decide to run it in the background (asynchronously).

Some other things to consider:

Business Rules have a nicer UI. But every business rule that fires at the entity level creates a new synchronous workflow that fires on create and on update of a single field. If you use a workflow, you have more choices: you can create a single workflow that runs on create and does everything that needs to be done on create. This, in my opinion, is more efficient than having multiple workflows all firing one after the other.

Also, look at the limitations of Business Rules as described in the Microsoft docs.

Inner workings of Business Rule javascript:

Finally, if you are curious about the inner workings of business rules, see the code at:

But basically here are the main things to know:

1. The JavaScript for the business rule is injected into the form script for the form.
2. The function is set up to run on change of the fields that appear in the rule's conditions.
3. The function is run once at form startup.
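Those behaviours amount to the classic "subscribe to changes and fire once at init" pattern; here is a language-agnostic sketch in Python (field and handler names are made up for illustration):

```python
# Sketch of how a business rule's generated function is wired into a form:
# registered as an on-change handler for each field in its conditions,
# and invoked once at form startup.
class Form:
    def __init__(self):
        self.values = {}
        self.handlers = {}  # field name -> list of callbacks

    def on_change(self, field, callback):
        self.handlers.setdefault(field, []).append(callback)

    def set(self, field, value):
        self.values[field] = value
        for callback in self.handlers.get(field, []):
            callback(self)

runs = []
def business_rule(form):
    # e.g. "if status is 'closed', lock the form"
    runs.append(form.values.get("status"))

form = Form()
form.on_change("status", business_rule)  # fields in the rule's conditions
business_rule(form)                      # run once at form startup
form.set("status", "closed")             # re-run on change
print(runs)  # [None, 'closed']
```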

Wednesday, October 03, 2018

ODataException: An undeclared property 'aaa_xxxxxid' which only has property annotations in the payload but no property value was found in the payload

If you get the error below, then know this: case matters, and the field name is the schema name of the field!

An error occurred while validating input parameters: Microsoft.OData.ODataException: An undeclared property 'aaa_xxxxxid' which only has property annotations in the payload but no property value was found in the payload. In OData, only declared navigation properties and declared named streams can be represented as properties without values.

Please note that the field names are case sensitive and go by the “schema name”. In my case, the schema name was ‘aaa_XxxxxId’. Once that was fixed, the code began working!

var entity = {};
entity["customerid_account@odata.bind"] = "/accounts(21876381-dc67-e811-a954-000d3a378ca2)";
entity["aaa_XxxxxId@odata.bind"] = "/aaa_xxxxx(c9edf6c0-d670-e811-a958-000d3a3780dd)";
entity.caseorigincode = 1;
entity["msdyn_incidenttype@odata.bind"] = "/msdyn_incidenttypes(6760f919-77b0-e811-a95c-000d3a378f36)";

Tuesday, October 02, 2018

Creating and Using Tokens with Azure Service Bus

The following notes are with regard to the “.NET Standard” Microsoft.Azure.ServiceBus library and detail what each parameter is (because I found the documentation lacking in this regard).

Creating a token

Tokens are created using a TokenProvider. There are many options for instantiating a token provider; the one I am showing here uses a Shared Access Signature (SAS). But if you are running your code in Azure, a better option would be the ManagedServiceIdentityTokenProvider, as one doesn't need to share keys, etc.

To create a token using the “CreateSharedAccessSignatureTokenProvider(string keyName, string sharedAccessKey)”, you need a key name and its key. This is retrieved from the “Shared access policies” blade in Azure. Once you have these 2 values, you can create a tokenProvider.

TokenProvider tokenProvider = TokenProvider.CreateSharedAccessSignatureTokenProvider(keyName, sasKey);

The CreateSharedAccessSignatureTokenProvider has overloads, in which you can specify the TTL of the token and the scope (namespace vs entity).

To create the token, you need to use “GetTokenAsync(string appliesTo, TimeSpan timeout)” on the tokenProvider. The “appliesTo” defines either the namespace or the entity and should be defined as follows:

For access to the entire namespace:
For access to an entity in the namespace called myFirstQueue:

Using the token

The GetTokenAsync method returns a token. The token contains a property called TokenValue. You use this TokenValue to work with the servicebus.

var tokenProvider = TokenProvider.CreateSharedAccessSignatureTokenProvider(token.TokenValue);
var client = new QueueClient("", "myFirstQueue", tokenProvider);
await client.SendAsync(message);

Controlling of access to the service bus:

There are 2 things that are possible:
1. restrict the scope to the entity
2. specify whether the token can manage, send, or listen

The first one I showed you in “Creating a token”. The second is less obvious: it is controlled by the claims on the SAS policy you are using (the policy whose keyName and sasKey you pulled from the Shared access policies blade in Azure).
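For context, the token that GetTokenAsync hands back is just a signed string. Here is a minimal Python sketch of the standard Service Bus SAS token format (the namespace, policy name, and key below are made up; the real library builds this for you):

```python
# Sketch of building a Service Bus SAS token: HMAC-SHA256 over the
# URL-encoded resource URI plus an expiry timestamp.
import base64
import hashlib
import hmac
import time
import urllib.parse

def make_sas_token(resource_uri, key_name, key, ttl_seconds=3600):
    expiry = str(int(time.time()) + ttl_seconds)
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    to_sign = (encoded_uri + "\n" + expiry).encode("utf-8")
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"), to_sign, hashlib.sha256).digest()
    ).decode("utf-8")
    return ("SharedAccessSignature sr=" + encoded_uri
            + "&sig=" + urllib.parse.quote_plus(signature)
            + "&se=" + expiry
            + "&skn=" + key_name)

# made-up namespace, policy, and key
token = make_sas_token(
    "https://mynamespace.servicebus.windows.net/myFirstQueue",
    "RootManageSharedAccessKey",
    "not-a-real-key")
print(token[:40])  # SharedAccessSignature sr=https%3A%2F%2Fm
```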

When to use tokens:

Say you have a lot of different micro-services and they all need access to the Service Bus. You don't want to give each one a key, as you would then have to figure out what to do if a key got compromised. You also don't want all services to be able to do anything and everything on the Service Bus. In this case, you would stand up a token provider service. The microservices would authenticate with the token service, and the token service would determine what each microservice is authorized to do and return a token with the appropriate claims and scope.

Best practices:

As you can see, with the SharedAccessSignature option you still need to provide your token service the SAS key. With the release of managed service identity support on Azure Service Bus, one can do this without the SAS key. So if your service is going to be hosted in Azure, MSI is definitely the way to go.

Wednesday, September 26, 2018

LogicApp–HttpRequest trigger: retrieving query parameters

With an HTTP request based trigger, how do you retrieve the query parameter values (eg: ?queryParamName=hello%20world)? I did not find an answer easily on the internet, so here it is:



The above example will retrieve the query param value for a query param named “queryParamName” and return it as a response.
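For reference, the expression used in that example should be along these lines (queryParamName is the assumed parameter name; treat this as a sketch rather than gospel):

```
@{triggerOutputs()['queries']['queryParamName']}
```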

Another way to achieve the same thing, but in a simpler manner, is to use the RelativePath:


When you add the relative path, you will have to get the updated HTTP GET URL (it gets regenerated for you); the new URL will contain the relative path segment. To call the logic app, all you have to do is replace the parameter placeholder in that segment with a value (eg: invoke/queryParamName/hello%20world/).
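With the relative-path variant, the parameter likewise surfaces on the trigger output; if I remember correctly, an expression along these lines also works (in addition to the designer's dynamic content):

```
@{triggerOutputs()['relativePathParameters']['queryParamName']}
```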

I prefer the relative-path method because the Logic App designer makes the variable (in this case queryParamName) easily available to other actions, which is not just nicer but also protects you from typos.

Wednesday, September 12, 2018

Databricks–Using Databricks Secrets

Here are all the steps needed to set up a secret in Databricks (native Databricks secrets, not Key Vault backed). This works with the standard tier of Databricks:

Install Databricks CLI:
1. Install the Databricks CLI (needs Python)
     pip install databricks-cli
2. Set up the Databricks token (needs a token from user-settings in Databricks, plus the workspace host URL)
     databricks configure --token
3. Create a scope
     databricks secrets create-scope --scope dbSecretsScope --initial-manage-principal "users"
4. Add a key-value to scope
     databricks secrets put --scope dbSecretsScope --key mydbPassword --string-value myPasswordValue
5. List keys in scope
     databricks secrets list --scope dbSecretsScope

Create a Python notebook, add the following code in a cell, and run it:

sqlserver = '<yourserver>.database.windows.net'  # server hostname (elided in the original post)
port = '1433'
database = 'myFirstDb'
user = 'iamspecial'
pswd = dbutils.secrets.get("dbSecretsScope", "mydbPassword")
print("password is", pswd)  # will display [REDACTED] - very important!
table = 'myFirstTable'

## Load Data Frame ##
df1 = spark.read \
   .option('user', user) \
   .option('password', pswd) \
   .jdbc('jdbc:sqlserver://' + sqlserver + ':' + port + ';database=' + database, table)

df1  # display the dataframe to make sure we connected

Tuesday, September 04, 2018

Dynamics Error: Only owner can revoke access to the owner.

Error: Only owner can revoke access to the owner. CallerId: xxxxxx, OwnerId: yyyyyyy

I was getting this error when attempting to revoke access from yyyyyy via code on an account record. What I found out was that while yyyyyy was the owner of the record, I could not revoke its access.

Instead, I first had to assign the record to the new owner (xxxxxx), and only then could I revoke access from yyyyyy.