
Power Platform Solution ALM: Environment Dataflows Sync

April 24, 2024
Video
Power Platform • ALM
Michael demonstrates how you can utilize Dataflows as part of your ALM strategy within Power Platform to keep all environment data in sync, which can help with troubleshooting production issues.

Intro

Hey everybody, welcome back!

Today I wanted to quickly talk to you about dataflows and how they can be used alongside ALM within Power Platform. I've got a quick demo to show you how you can set up dataflows between your dev, test, and production environments, specifically around Dataverse.
So let's go check this out.

Demo Environment Setup

So, I have a solution within our Rockhop environment around an out of office app. I have two tables, two Dataverse tables, within this solution. One is really just a lookup table that I use for the main submission table. I want to demo how we can set up a dataflow to take the data that's in production and sync it back down to our test and dev environments, so that we get live, up-to-date data for testing and development purposes throughout the life cycle of our application.

Another area where this is super beneficial: you can have external data syncing into your production environment, and that data then syncs down to your dev and test environments. So you really only need to set up one dataflow from external systems to your production environment, and have that data trickle down to your lower environments.

So I have my production environment here, and I have one record.

I have my Hawaii vacation that I wish I was taking, and I have a couple of leave types entered into this table.

Now, let's go ahead and set this up. But first, just so we take a look at dev: we do not have any records within dev right now, and I would like to sync production data down to dev.

So let's go ahead and set up a dataflow.

Demo

So, I'm in my dev environment. That's where I'm going to create this dataflow, and I'm going to be reaching out to my production environment to sync the data.

  • I've started from a blank dataflow.
  • I've entered my production environment's URL.
  • And, under advanced options, I make sure to include relationship columns.
  • And, then I'm going to create a new connection.

Let's go ahead and send it in.
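
As an aside, if you want to sanity-check that production environment URL and your credentials before pointing a dataflow at it, a quick call to the Dataverse Web API works. This is only a minimal sketch, separate from the dataflow designer itself; the org URL, tenant, and app registration values below are placeholders you'd swap for your own.

```python
# Minimal sketch (not part of the dataflow): confirm the production Dataverse
# environment URL is reachable with your credentials before wiring it into a
# dataflow. All IDs and secrets below are placeholders for an Azure AD app
# registration that has been added as an application user in the environment.
import msal
import requests

ORG_URL = "https://yourprodorg.crm.dynamics.com"  # production environment URL
TENANT_ID = "<your-tenant-id>"
CLIENT_ID = "<your-app-registration-client-id>"
CLIENT_SECRET = "<your-client-secret>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=[f"{ORG_URL}/.default"])

# WhoAmI is a lightweight call that proves both the URL and the auth are good.
resp = requests.get(
    f"{ORG_URL}/api/data/v9.2/WhoAmI",
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
resp.raise_for_status()
print("Connected. UserId:", resp.json()["UserId"])
```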

All right, we'll click next, and search for our leave type and out of office submission tables. And, we'll go ahead and transform data.

Okay, so we've got both tables here. Now, I'm not going to do anything special with this data. I'm just going to take it as is; no transformations.

Click next.

We're going to load to an existing table, and we're going to look for our Rockhop leave type.

I'm even going to select "Delete rows that no longer exist in the query output," so it's going to truly keep those two tables in sync, and it will delete anything that's added in dev and does not match production.

And, I'm going to specify a key that I've created. Now, you can create a key, or you can use the unique ID that Dataverse provides.
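
To make those two settings concrete, here's a small, self-contained sketch of what loading to an existing table with a key plus "delete rows that no longer exist" amounts to conceptually. This is not how the dataflow engine is implemented; it's just the upsert-plus-delete logic spelled out in plain Python with made-up leave type records.

```python
# Conceptual sketch only: what "load to existing table" with a key and
# "delete rows that no longer exist in the query output" boils down to.
# Rows are plain dicts keyed by whatever key you map (an alternate key
# you created, or the Dataverse-provided unique ID).

def sync_table(source_rows, target_rows, key):
    """Upsert source rows into target by key, then delete target rows
    whose key no longer appears in the source."""
    target_by_key = {row[key]: row for row in target_rows}
    source_keys = {row[key] for row in source_rows}

    # Upsert: update rows that match on the key, insert the rest.
    for row in source_rows:
        target_by_key[row[key]] = dict(row)

    # Delete: anything in the target that the source no longer has.
    for k in list(target_by_key):
        if k not in source_keys:
            del target_by_key[k]

    return list(target_by_key.values())


production = [
    {"leavetypeid": "A1", "name": "Vacation"},
    {"leavetypeid": "A2", "name": "Sick Leave"},
]
dev = [
    {"leavetypeid": "A2", "name": "Sick"},           # stale name, gets updated
    {"leavetypeid": "Z9", "name": "Dev-only junk"},   # not in prod, gets deleted
]

print(sync_table(production, dev, key="leavetypeid"))
# -> [{'leavetypeid': 'A2', 'name': 'Sick Leave'}, {'leavetypeid': 'A1', 'name': 'Vacation'}]
```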

Then we'll do the out of office submissions. Load to an existing table. The search here is a bit finicky, so you have to type out "out of office submission" to find it. Again, I'm going to keep these in sync, and I'm just going to use the out-of-the-box primary key here, with the unique ID. I do have a key created on this table, since the names will all be unique (it's an autonumber column), but we're just going to use the one Dataverse creates for us.

So, we've mapped both tables. We can click next. And, we can just refresh this manually. We can also set a schedule for this, but for now I'm just going to do a manual refresh whenever I want data up to date with production.

This is going to publish, and then it's going to kick off the refresh after it publishes, so I'm going to go ahead and pause this.

Okay so, that refresh is finished. It took a few minutes, but if we go into our dev environment we can now see our leave types have been updated. And, we also have our out of office submission, and it carries over the unique ID that we have in production. Since we're using that as the key, it syncs that as well. So everything truly is completely in sync with our production environment.
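
If you'd rather spot-check the sync without clicking through both environments, a small script can pull the same column from each environment and diff it. Again, this is just a hedged sketch: the table logical name (new_rockhopleavetypes) and column (new_name) are made-up placeholders for whatever your solution actually uses, and the auth setup mirrors the earlier sketch.

```python
# Sketch: compare a column across production and dev to confirm the dataflow
# refresh left them in sync. Table/column logical names are placeholders.
import msal
import requests

TENANT_ID = "<your-tenant-id>"
CLIENT_ID = "<your-app-registration-client-id>"
CLIENT_SECRET = "<your-client-secret>"

def fetch_names(org_url, table="new_rockhopleavetypes", column="new_name"):
    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )
    token = app.acquire_token_for_client(scopes=[f"{org_url}/.default"])
    resp = requests.get(
        f"{org_url}/api/data/v9.2/{table}?$select={column}",
        headers={"Authorization": f"Bearer {token['access_token']}"},
    )
    resp.raise_for_status()
    return {row[column] for row in resp.json()["value"]}

prod = fetch_names("https://yourprodorg.crm.dynamics.com")
dev = fetch_names("https://yourdevorg.crm.dynamics.com")

print("Only in prod:", prod - dev)  # should be empty after a refresh
print("Only in dev:", dev - prod)   # should be empty after a refresh
```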

So pretty awesome!

That was my demo on dataflows and ALM within Power Platform.

Thank you for joining today, and happy power platforming!
