Monday, August 16, 2021

Microsoft work account doesn't send push notifications for sign-ins to the Authenticator app

 Push notifications suddenly stopped working for me. Here is how I fixed it:

  1. If you go to My Sign-Ins (microsoft.com), you can see all the ways in which you can sign in to your Microsoft account.
  2. In my case, "Microsoft Authenticator" was in the list, but on my phone within the Authenticator app, my account was grayed out.
  3. I deleted the "Microsoft Authenticator" item from the list and then added it back again. The website prompts you through a couple of dialogs and then finally provides a QR code to scan with the Authenticator app. Once I did this, the account was no longer grayed out.
  4. The final step was to mark the Authenticator app as the default sign-in method on the website.
That fixed my problem.

Tuesday, May 11, 2021

Add Git Commands to the Visual Studio Toolbar

 I wanted to be able to quickly access the Git "Manage Branches" window from the toolbar.


Here is how to do it:

  1. Go to View >> Toolbars >> Customize
  2. Switch to the "Commands" tab
  3. Select the "Toolbar" radio button and, in the drop-down, select "Standard" (in my case, I wanted to add it to the standard toolbar; you can add it to a new toolbar or a different toolbar).
  4. Click on the "Add Command" button
  5. Select "View" under categories
  6. Select "GitRepositoryWindow" and also optionally the "GitWindow"


  7. You now have easy access to the commands from the toolbar.



Wednesday, February 24, 2021

Databricks - Connecting to an Azure Storage Account using a SAS key

This post is about setting up a connection from Databricks to an Azure Storage Account using a SAS key.

This method is perfect when you need to provide temporary access with fine-grained permissions to a storage account. In this post, I will show how to set up read-only access for a temporary period of time.

Please note: you should try to use credential passthrough, but that functionality requires premium-tier Databricks. This post is a workaround that uses a Shared Access Signature.


  1. Create a Shared Access Signature Key for your Storage Account
    Go to your Storage Account and under "Settings", select "Shared access signature"
    Allowed Services: "Blob"
    Allowed Resource Types: "Container" and "Object"
    Permissions: "Read" and "List"


  2. Click "Generate SAS and connection string"
  3. Copy the "SAS token" value


  4. Python code for Databricks Notebook:

# `spark` (a SparkSession) is predefined in a Databricks notebook
container = "<<containerName>>"
storageAccountName = "<<storageAccountName>>"
# don't hard-code the token; prefer dbutils.secrets.get(scope="scopeName", key="keyName")
sasKey = "<<sas token from step 3>>"
spark.conf.set(f"fs.azure.sas.{container}.{storageAccountName}.blob.core.windows.net", sasKey)
spark.conf.set("spark.sql.execution.arrow.enabled", "true")
spark.conf.set("spark.sql.execution.arrow.fallback.enabled", "true")

inputfilepath = f"wasbs://{container}@{storageAccountName}.blob.core.windows.net/pathToFile/fileName.parquet"
dataframe = spark.read.option("mergeSchema", "true").parquet(inputfilepath)
display(dataframe.describe())  # summary statistics
display(dataframe)


Comments:

The SAS key doesn't seem to allow you to use the abfs[s]: endpoint from Databricks; instead, you need to use the wasb[s]: endpoint. The SAS key also doesn't seem to work with the DFS URL (e.g., {storageAccountName}.dfs.core.windows.net). So this isn't ideal, but it's a great way to connect using a temporary SAS key.
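As a quick sanity check on the endpoint note above, here is a minimal sketch (the helper name and the container/account values are placeholders of my own) of building the wasb[s]: path that works with a SAS key:

```python
def wasbs_path(container: str, storage_account: str, relative_path: str) -> str:
    """Build the wasbs:// URL that works with a SAS key.
    The abfss:// / dfs.core.windows.net forms do not accept this SAS setup."""
    return f"wasbs://{container}@{storage_account}.blob.core.windows.net/{relative_path}"

print(wasbs_path("mycontainer", "mystorageacct", "pathToFile/fileName.parquet"))
# wasbs://mycontainer@mystorageacct.blob.core.windows.net/pathToFile/fileName.parquet
```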

Tuesday, December 22, 2020

"this dataset won't be refreshed" error in PowerBi service when using Web.Contents

If you get the "This dataset won't be refreshed" error in PowerBi online while the dataset refreshes fine in PowerBi Desktop, one reason may be in how you are using Web.Contents.

The first thing you should do is check your "Data Source Settings" in PBI Desktop and see whether you get the warning "Some data sources may not be listed because of hand-authored queries". (See https://docs.microsoft.com/en-us/power-bi/connect-data/refresh-data#refresh-and-dynamic-data-sources for more info.)


For most users, this is most likely caused by how Web.Contents is used.

Here is some code that caused this error for me:

uri = "https://login.microsoftonline.com/" & tenantID & "/oauth2/token",
xxx = Web.Contents(uri, [Headers = authHeaders, Content = Text.ToBinary(authQueryString)])

The problem with the above code is that the PBI service does not like to refresh URLs that are computed on the fly. To fix the code, pass the dynamic parts as a RelativePath, like so:

uri = "https://login.microsoftonline.com/",
relativePath = tenantID & "/oauth2/token",
xxx = Web.Contents(uri, [RelativePath = relativePath, Headers = authHeaders, Content = Text.ToBinary(authQueryString)])


More info: https://docs.microsoft.com/en-us/powerquery-m/web-contents

Sunday, September 13, 2020

D365 Finance and Operations: OData query authentication: "No P3P Policy defined" error

 My background is with D365 Sales, and I have tons of code that works flawlessly using the "Service to service calls using client credentials" flow to get an authentication token. Recently, I have been working with F&O and was trying to use the same basic code to perform an OData call against Finance & Operations. But I was getting a "No P3P Policy defined" error. It's not a very helpful error.

After some trial and error, I was able to determine that the problem was the URL I was using to acquire the token via the "AcquireTokenAsync" method. This does not work: https://mydomain.sandbox.operations.dynamics.com/, whereas this does: https://mydomain.sandbox.operations.dynamics.com. The only difference is the trailing slash. Once I removed it, the error went away and all my queries began working flawlessly.
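The fix boils down to trimming the trailing slash from the resource URL before acquiring the token. A minimal sketch of that normalization (the helper name is my own; the domain is the placeholder from above):

```python
def normalize_resource_url(resource: str) -> str:
    """Strip the trailing slash that makes F&O token acquisition fail
    with a 401 / "No P3P policy defined" response."""
    return resource.rstrip("/")

resource = normalize_resource_url("https://mydomain.sandbox.operations.dynamics.com/")
print(resource)  # https://mydomain.sandbox.operations.dynamics.com
```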


Sample Code: https://gist.github.com/rajrao/455963f9bd5b9a16558b6085116b3c03

Error Details

StatusCode: 401, ReasonPhrase: 'Unauthorized', Version: 1.1, Content: System.Net.Http.HttpConnectionResponseContent, Headers:

P3P: CP="No P3P policy defined.  Read the Microsoft privacy statement at https://go.microsoft.com/fwlink/?LinkId=271135"

Other documents:

MsDocs: Service endpoints overview  

Saturday, July 18, 2020

Sharing borrowed kindle library books with kids



  1. Borrow the book (I use Libby, so after borrowing I need to make sure I click "Deliver to Kindle").
  2. Once you have claimed the book and added it to your account, go to account settings and click "Content and Devices" (this should be the link https://www.amazon.com/hz/mycd/myx?_encoding=UTF8&ref_=ya_aw_mycd#/home/content/booksAll/dateDsc/)
  3. Click on the "..." button next to your book
  4. Click on "Manage Family Library"
  5. Finally, add it to your kid's FreeTime library.

Thursday, April 09, 2020

PowerBi - Attaching a Common Data Model Folder to a DataFlow - Error: path doesn't contain model.json

If you get the "The path doesn't contain model.json. Please provide the full path to the CDM folder including its definition file (its Model.json file) and try again." error, there can be many reasons for it.

Here are some things to try:

  1. Make sure your URL doesn't contain blob.core.windows.net.

    1. When I used the Storage Explorer desktop app and copied the path to the model.json file, the path looked like: https://xxxxxx.blob.core.windows.net/cds-crmuat1/model.json
    2. The same file's path through the Azure portal's storage explorer was: https://xxxxxx.dfs.core.windows.net/cds-crmuat1/model.json.
      The second path worked, but the first one didn't.
  2. Make sure your permissions are correct:
    1. Two things I had missed:
      1. Adding myself to the Owner role on the storage account directly. It's not enough to be in the role via inheritance; you need to add yourself directly.
      2. I also had to add myself to the "Storage Blob Data Owner" role.
    2. The documentation lists the permissions you need to grant: https://docs.microsoft.com/en-us/power-bi/service-dataflows-connect-azure-data-lake-storage-gen2#grant-permissions-to-power-bi-services
Some other gotchas:
  1. The storage account used for CDM needs to be the exact same storage account used by PowerBi. As of writing this blog, this is a limitation of PowerBi that I hope they will remove.
  2. The storage account needs to be in the same tenant and also hosted in the same region as PowerBi.
  3. If you encounter an error when trying to save a DataFlow, make sure that there is a container/filesystem named PowerBi in that storage account.
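The endpoint fix from the first tip above can be sketched as a one-line rewrite (the helper name is my own; the account/container names are the placeholders from above):

```python
def to_dfs_url(blob_url: str) -> str:
    """Rewrite a blob.core.windows.net URL (as copied from the Storage
    Explorer desktop app) to the dfs.core.windows.net form that the
    PowerBi CDM-folder attachment accepts."""
    return blob_url.replace(".blob.core.windows.net", ".dfs.core.windows.net")

print(to_dfs_url("https://xxxxxx.blob.core.windows.net/cds-crmuat1/model.json"))
# https://xxxxxx.dfs.core.windows.net/cds-crmuat1/model.json
```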

Wednesday, March 25, 2020

Dynamics CRM - Alternate key fails with message "The statement has been terminated."

If you get the unhelpful message "The statement has been terminated.":


Look to see if you have a duplicate value in your key. Unfortunately, if you have more than 50k records you cannot use fetchXml, and you would need either to load the data into a database or to filter the list by date range (the latter isn't guaranteed to catch all the errors, so the database option is your best bet).

Here is a fetchXml aggregate query to find dups based on accountNumber on the account entity (a sketch; swap in the attribute that backs your key, and note that aggregates fail above 50k records):

<fetch aggregate="true">
  <entity name="account">
    <attribute name="accountnumber" alias="acctNum" groupby="true" />
    <attribute name="accountid" alias="dupCount" aggregate="count" />
    <filter>
      <condition attribute="accountnumber" operator="not-null" />
    </filter>
  </entity>
</fetch>

Any group whose dupCount is greater than 1 is a duplicate.



At one time, CRM used to throw an error message stating that the key creation failed because of a duplicate key. In the latest version, it's a rather unhelpful message.


 

Friday, March 20, 2020

git@github.com: Permission denied (publickey).

If you get the error "git@github.com: Permission denied (publickey).", try using HTTPS instead of SSH:
Under "Clone or download", choose "Use HTTPS" instead of "Use SSH". (The SSH error usually means your public key isn't registered with your GitHub account.)