Friday, February 23, 2024

SQL Server Integration Services Project template available for Visual Studio 2022

Since the end of 2022, there is also a SQL Server Integration Services project template available for Visual Studio 2022, which you can install from the Visual Studio Marketplace. You can install it from the direct download link here or search for it in the Visual Studio 2022 extension manager and install it from there.



Thursday, February 22, 2024

Classic Azure Application Insights deprecated on February 29th 2024 - 7 days to go

In case you missed it: classic Azure Application Insights will be deprecated on February 29th, 2024. Even if you missed the notification e-mails, you can quite easily spot the warning when you navigate to an Azure Application Insights resource in the Azure Portal.


Migration is actually quite easy - you just click on the link provided, which opens up the menu depicted below and allows you to associate your Azure Application Insights resource with a Log Analytics workspace. The good news is that there are no pricing changes when moving to the workspace-based model.




As indicated in the migration window, this is a one-way operation, so plan for it in advance - the points below might impact how you do the migration:

  • You can link multiple Application Insights resources to a single Log Analytics workspace or keep them separate - in most cases you will want to consolidate.
  • Instrumentation keys do not change during the migration, so you don't need to worry about this.
  • The export feature is not available on workspace-based Application Insights resources - use diagnostic settings for exporting telemetry instead.
  • There might be some schema changes, which is important to consider for existing KQL queries (see the sketch after this list) - check out query data across Log Analytics workspaces, applications and resources in Azure Monitor.
  • Existing log data will not immediately move to the Log Analytics workspace - only new logs generated after the migration will be stored in the new location.
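
To make the schema point concrete: in a workspace-based resource the classic requests table surfaces as AppRequests and timestamp becomes TimeGenerated, so existing queries may need small rewrites. Below is a minimal PowerShell sketch (assuming the Az.OperationalInsights module is installed; the workspace customer ID is a hypothetical placeholder) that runs a query against the migrated workspace:

    # Minimal sketch - requires Az.Accounts and Az.OperationalInsights;
    # the workspace customer ID below is a hypothetical placeholder.
    Connect-AzAccount

    $workspaceId = "00000000-0000-0000-0000-000000000000"

    # Classic resource:    requests | where timestamp > ago(1h)
    # Workspace-based:     AppRequests | where TimeGenerated > ago(1h)
    $query = "AppRequests | where TimeGenerated > ago(1h) | summarize count() by AppRoleName"

    $result = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query $query
    $result.Results | Format-Table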


Tuesday, December 26, 2023

Running SSIS packages in Azure Data Factory - scaling and monitoring

Lifting and shifting SSIS packages to Azure Data Factory (ADF) can provide several benefits. By moving your on-premises SSIS workloads to Azure, you can reduce operational costs and the burden of managing infrastructure that you have when you run SSIS on-premises or on Azure virtual machines. 

You can also improve availability thanks to the option of specifying multiple nodes per cluster, combined with the high availability features of Azure and of Azure SQL Database. Scalability increases as well, with the ability to specify multiple cores per node (scale up) and multiple nodes per cluster (scale out) - see Lift and shift SQL Server Integration Services workloads to the cloud

To lift and shift SSIS packages to ADF, you can use the SSIS Integration Runtime (IR) in ADF. The Azure-SSIS IR is a cluster of virtual machines for executing SSIS packages, and you can define the number of cores and the compute capacity during the initial configuration (Lift and shift SSIS packages using Azure Data Factory on SQLHack)

Even though there is a Microsoft article which explains how to Configure the Azure-SSIS integration runtime for high performance, there is not a lot of guidance on how to run it at the lowest possible cost while still being able to complete the jobs. So should you choose a higher sizing on a single node or a lower sizing on multiple nodes? Based on experience, it seems perfectly possible to run most jobs on a single node, and up until now we have been running all of them on a D4_v3 (4 cores, 16 GB, Standard edition). If you decide to run on a lower configuration, I would recommend monitoring failures, capacity usage and throughput. (See Monitor integration runtime in Azure Data Factory for more details)
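
For reference, the sketch below shows how the sizing discussed above could be set with the Az.DataFactory PowerShell module - resource group, factory and IR names are hypothetical placeholders, so treat this as a starting point rather than the definitive setup:

    # Minimal sketch - requires the Az.DataFactory module; names are hypothetical.
    # A single node of Standard_D4_v3 (4 cores, 16 GB) as mentioned above;
    # scale up via -NodeSize or scale out via -NodeCount.
    Set-AzDataFactoryV2IntegrationRuntime -ResourceGroupName "rg-data" `
        -DataFactoryName "adf-demo" `
        -Name "Azure-SSIS-IR" `
        -Type Managed `
        -Location "West Europe" `
        -NodeSize "Standard_D4_v3" `
        -NodeCount 1 `
        -Edition "Standard"

    # Start the runtime and verify its state (you are billed while it is running)
    Start-AzDataFactoryV2IntegrationRuntime -ResourceGroupName "rg-data" -DataFactoryName "adf-demo" -Name "Azure-SSIS-IR" -Force
    Get-AzDataFactoryV2IntegrationRuntime -ResourceGroupName "rg-data" -DataFactoryName "adf-demo" -Name "Azure-SSIS-IR" -Status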





Wednesday, November 29, 2023

Dynamics 365 and Power Platform monthly reading list November 2023

2023 Release Wave 2

Technical topics (Configuration, customization and extensibility)

Copilots, AI and machine learning

Topics for Dynamics 365 Business Applications Platform consultants, project managers and power users


Sunday, November 26, 2023

Implementing Azure Synapse Link for Dataverse: gotchas and tips

Azure Synapse Link for Dataverse allows you to easily export data from a Dataverse (or Dynamics 365) instance to Azure Data Lake Storage Gen2 (ADLS) and/or Azure Synapse. Azure Synapse Link for Dataverse provides a continuous replication of standard and custom entities/tables to Azure Synapse and Azure Data Lake. 

I highly recommend viewing the awesome YouTube playlist Azure Synapse Link and Dataverse - better together from Scott Sewell (@Scottsewell) as an introduction.


This blog post provides a number of tips & tricks but is not an exhaustive list - it is highly recommended to go through the links in the Microsoft documentation listed in the reference section below. You can also take a look at the presentation I delivered at Techorama in May 2023, which is available on GitHub - Azure Synapse Link for Dataverse from 0 to 100

1. Check the region of your Dataverse/Dynamics 365 instance

The configuration of Azure Synapse Link for Dataverse is done through the Power Platform maker portal, but before you can get started you should first set up Azure Data Lake Storage Gen2 and Azure Synapse in your Azure subscription.

It is best, however, to first check in the configuration screen in which region your instance is located, since the storage account and Synapse workspace must be created in the same region as the Power Apps environment for which you want to enable Azure Synapse Link. From the Power Platform admin center (PPAC) user interface it is currently not possible to create a Dataverse/Dynamics 365 instance in a specific region, but this is possible with PowerShell - see Creating a Dataverse instance in a specific Azure region using Power Apps Admin PowerShell module
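
As a quick sketch of the PowerShell route (module: Microsoft.PowerApps.Administration.PowerShell; display name, region, currency and language below are just placeholder values):

    # Minimal sketch - requires Microsoft.PowerApps.Administration.PowerShell.
    Add-PowerAppsAccount

    # List the regions you can deploy to
    Get-AdminPowerAppEnvironmentLocations

    # Create an environment with a Dataverse database in a specific region
    # (display name, region, currency and language are hypothetical values)
    New-AdminPowerAppEnvironment -DisplayName "contoso-dev" `
        -Location europe `
        -EnvironmentSku Production `
        -ProvisionDatabase -CurrencyName EUR -LanguageName 1033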

If you need to move a Dataverse or Dynamics 365 instance to a different Azure region, you can open a Microsoft support ticket. Based on recent experience, this specific type of Microsoft support request is handled fairly quickly (within 1-2 business days).

Azure Data Lake Storage is a set of capabilities built on Azure Blob Storage. When you create a storage account and check the "enable hierarchical namespace" checkbox on the advanced tab, you create an Azure Data Lake Storage Gen2 account.
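
The same can be scripted - a minimal sketch with the Az.Storage module, where resource group, account name and region are hypothetical:

    # Minimal sketch - requires the Az.Storage module; names are hypothetical.
    # -EnableHierarchicalNamespace is what makes this an ADLS Gen2 account.
    New-AzStorageAccount -ResourceGroupName "rg-data" `
        -Name "stdataverselake" `
        -Location "westeurope" `
        -SkuName Standard_LRS `
        -Kind StorageV2 `
        -EnableHierarchicalNamespace $true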


2. Make sure all prerequisites are in place before enabling Azure Synapse Link

Definitely make sure that all security configuration outlined in Create an Azure Synapse Link for Dataverse with your Azure Synapse Workspace (Microsoft docs) is correctly set up. The exception messages shown in the Azure Synapse Link configuration pages aren't always very helpful.

3. Azure Synapse Link for Dataverse creates a Lake Database

In the documentation from Microsoft (Understand lake database concepts) a lake database is defined as:

A lake database provides a relational metadata layer over one or more files in a data lake. You can create a lake database that includes definitions for tables, including column names and data types as well as relationships between primary and foreign key columns. The tables reference files in the data lake, enabling you to apply relational semantics to working with the data and querying it using SQL. However, the storage of the data files is decoupled from the database schema; enabling more flexibility than a relational database system typically offers.




The data is stored in ADLS Gen2 in accordance with the Common Data Model (CDM) - the folders used conform to well-defined and standardized metadata structures (mapped 1:1 with Dataverse tables/entities). At the root you will see a metadata file (called model.json) which contains semantic information about all of the entity/table records, attributes and relationships between the tables/entities.
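
If you want to inspect that semantic layer yourself, the sketch below downloads and parses model.json with the Az.Storage module - the storage account and container names are hypothetical (Synapse Link creates a container named after your Dataverse organization):

    # Minimal sketch - requires Az.Storage and read access on the container.
    $ctx = New-AzStorageContext -StorageAccountName "stdataverselake" -UseConnectedAccount

    # Download model.json from the root of the (hypothetical) container
    Get-AzDataLakeGen2ItemContent -Context $ctx -FileSystem "dataverse-contoso" `
        -Path "model.json" -Destination "model.json" -Force

    # List the entities/tables described in the metadata file
    (Get-Content "model.json" -Raw | ConvertFrom-Json).entities | Select-Object name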

The way the files are written depends on the Azure Synapse Link for Dataverse configuration - both the partitioning mode and in-place vs. append-only mode can be configured - see Advanced Configuration Options in Azure Synapse Link

4. Synapse Link for Dataverse uses passthrough authentication using ACLs in Azure Data Lake - no support for SQL authentication

Since all the data for the tables in Azure Synapse Link for Dataverse consists of CSV files stored in Azure Data Lake Storage, security needs to be set at the level of the files in Azure Data Lake Storage Gen2. There is no support for SQL authentication in the Lake DB created by Azure Synapse Link for Dataverse.
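
Because of this, granting a report user access means granting ACLs on the files rather than creating a SQL login. A minimal sketch with the Az.Storage module (account and container names are hypothetical; the entity ID would be the Azure AD object ID of the user or group):

    # Minimal sketch - requires Az.Storage; names and object ID are hypothetical.
    $ctx = New-AzStorageContext -StorageAccountName "stdataverselake" -UseConnectedAccount

    # Build a read+execute ACL entry for an Azure AD user...
    $acl = New-AzDataLakeGen2ItemAclObject -AccessControlType user `
        -EntityId "11111111-1111-1111-1111-111111111111" -Permission "r-x"

    # ...and apply it recursively to the container used by Synapse Link
    Update-AzDataLakeGen2AclRecursive -Context $ctx -FileSystem "dataverse-contoso" -Acl $acl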


References:

Wednesday, November 22, 2023

Near real-time and snapshots in Azure Synapse Link for Dataverse

The Azure Synapse Link for Dataverse documentation contains a section about Access near real-time data and read-only snapshot data but it does not really explain why you want to use one or the other. 

When you open an Azure Synapse SQL serverless Lake DB in SQL Server Management Studio, you see a clear distinction between the two versions of the table data, whereas in Azure Synapse Studio there is no obvious distinction besides the name: you will see the "account" table and the "account_partitioned" view:

  • Near real-time data: external tables on top of all the underlying CSV files exported by the Azure Synapse Link for Dataverse sync engine. There is a soft SLA for the data to be present in these tables within 15 minutes.
  • Snapshot data/partitioned views: views on top of the near real-time data which are updated on an hourly interval.



In most scenarios, it is best to run queries against these partitioned views, since you will avoid read conflicts and you are sure that a full transaction has been written to the CSV files in Azure Data Lake Storage.
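
As an illustration (workspace and database names are hypothetical - the serverless endpoint follows the [workspacename]-ondemand.sql.azuresynapse.net pattern), querying the snapshot view with Azure AD authentication instead of SQL authentication:

    # Minimal sketch - requires Az.Accounts and the SqlServer module.
    $token = (Get-AzAccessToken -ResourceUrl "https://database.windows.net").Token

    # Near real-time external table:  SELECT COUNT(*) FROM dbo.account
    # Hourly snapshot view (preferred for reporting):
    Invoke-Sqlcmd -ServerInstance "myworkspace-ondemand.sql.azuresynapse.net" `
        -Database "dataverse_contoso" `
        -AccessToken $token `
        -Query "SELECT COUNT(*) AS accounts FROM dbo.account_partitioned"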

A typical exception that you might receive when querying the "tables" directly is "https://[datalakestoragegen2name].dfs.core.windows.net/[lakedbname]/[tablename]/Snapshot/2023-05_1684234580/2023-05.csv does not exist or you don't have file access rights", but this also depends on your specific context. If you have a lot of creates, updates or deletes on Dataverse tables, this might happen more regularly. Even though the partitioned views are updated on an hourly basis, it might be that the Synapse Link engine is refreshing the views at the same moment you perform a query, which will give you a similar exception, but the chances of this occurring are much lower.

You can check the last sync timestamp and sync status in the Power Platform maker portal (see screenshot below).



For the moment, you will also have to manually check the monitoring page (which can be quite tedious if you have a lot of environments). There is an item in the Microsoft release planner, "Receive notifications about the state of Azure Synapse Link for Dataverse", which is apparently in public preview, but I haven't seen it in the environments I have access to (not in https://make.powerapps.com and also not in https://make.preview.powerapps.com/).



It is also not easy to see if something went wrong with the refresh of the partitioned views - up until now the easiest way to find out is running a SQL query against the Lake DB: select name, create_date from sys.views order by create_date desc.
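
A hedged sketch that wraps this check in PowerShell (hypothetical endpoint and database names; the two-hour threshold is arbitrary), so it can be scripted across environments:

    # Minimal sketch - requires Az.Accounts and the SqlServer module.
    $token = (Get-AzAccessToken -ResourceUrl "https://database.windows.net").Token
    $views = Invoke-Sqlcmd -ServerInstance "myworkspace-ondemand.sql.azuresynapse.net" `
        -Database "dataverse_contoso" -AccessToken $token `
        -Query "SELECT name, create_date FROM sys.views ORDER BY create_date DESC"

    $views | Select-Object -First 5

    # The partitioned views are recreated on refresh, so a stale create_date
    # is a hint that the hourly refresh did not run (threshold is arbitrary)
    $newest = ($views | Measure-Object -Property create_date -Maximum).Maximum
    if ($newest -lt (Get-Date).AddHours(-2)) {
        Write-Warning "Partitioned views have not been refreshed in the last 2 hours"
    }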