Unlocking Insights: An Introductory Guide to Integrating Jamf Pro and Microsoft Power BI for Powerful Reporting – Part 4

Part 4: Alternative Update Method

If you’ve been following along in this series, you know that I hit a pretty major roadblock in February 2024 that derailed the series. In my last post, I pointed out that roadblock: the inability to automatically refresh the Power BI data.

Back in March, a fellow Mac admin, Alexander Duffner (aduffner on Macadmins Slack), contacted me on LinkedIn to voice the same issue. Alexander took it further and dug into the problem, looking for a solution. In June, he reached out with what he had found: use Power Automate and SharePoint to pull the Jamf Pro data into a JSON file. Once the data is in that JSON file, we can use the built-in SharePoint connector for Power BI to pull it in and model it.

Buckle up because this is going to be a long post.

Throw it All Away

Before we dive in, I should let you know that you’re going to throw away 99% or more of what we already worked on. Realistically, the only piece of the earlier work you will keep is the API Client we set up in Jamf Pro. All of the work we did in Power BI can be thrown away.

Requirements

Unfortunately, the solution that Alexander has come up with does come with a cost. To use Power Automate, you will need a license. I don’t profess to be a Microsoft licensing expert, so you’ll want to talk to the person in your org who is responsible for Microsoft licensing to figure out what you need. You can visit this page for the types of Power Automate licenses. At a minimum, you need a license that allows you to run a Cloud Flow in Power Automate.

In addition to Power Automate, you will need access to a SharePoint site to place files. This can be a pre-existing SharePoint site; you need to know the URL of the site and have a folder where you can store your JSON files.

Jamf Pro Setup

For this workflow, we need two things in Jamf Pro: our API Client, which we created in the first post, and an Advanced Search to pull the data we require.

I will not go into great detail about how to create an Advanced Search in Jamf Pro. If you need help or training, you can head to the Jamf Training Catalog and take the training around searches.

In Jamf Pro, create a new Advanced Search using whatever criteria you want. If you want to capture all of your devices, do not set any criteria at all. On the Display tab, select all of the inventory fields that you would like to be in Power BI. After you have saved the Advanced Search, click into your browser’s URL field and copy down the search’s ID number. We’ll need that ID number later in this post.

Now that we have our search in place, we can head to Power Automate to begin creating the workflow.

Power Automate

Head over to the Power Automate site and sign in. Once you’re signed in, you should be taken to your home page in Power Automate. Click on the My Flows link on the left navigation bar.

This is where any flow that you have created will be stored. Think of “flow” as shorthand for “workflow.” Click the New Flow button in the toolbar and choose Automated Cloud Flow.

You should get a pop-up window asking for a Flow Name and other values. You can click on the Skip button at the bottom of the window.

You should now have a blank design page with an Add Trigger button in the middle of the page.

Click on the Add a Trigger button, and you’ll get a slide-out panel on the left side of the screen where you can search for the different triggers you can use. The trigger we want to use is the Schedule trigger. Click on Recurrence to add it to your flow.

Go ahead and set the interval that you want this flow to run on. For my example, I set it to 3 hours and chose the time zone it should run in.

The next thing we’ll want to do is add our first action. Click on the plus symbol icon under Recurrence and choose Add Action. This will open another slide-out panel on the left side with all of the actions/connectors that we can use.

Just as we did with the Schedule trigger, we’ll search for what we need; this time, search for “Variable” and add the “Initialize variable” action to our flow.

In the variables pane, you’ll want to set the name of the action by clicking on the name at the top, next to the icon, and editing it. Set the Name of the variable to “url”, set the Type to String, and set the Value to the URL of your Jamf Pro instance.

You can see that I named that action “baseURL” at the top. The name of the action will come into play as we move on to other actions. This name allows us to reference different parts of an action in other actions. You’ll see what I mean shortly.

Next, add another action. This time, search for HTTP and choose the plain HTTP action.

Go ahead and rename the action. I used “generateBearerToken” as my name, and then we’ll get into the parameters. For the URI parameter, click in the field, and you’ll see two blue icons appear at the right edge of the field: a lightning bolt and an “fx”. Click on the lightning bolt, and this will show you the various actions and variables that you can use. Choose the “url” variable that shows up under the “baseURL” action.

After adding the URL variable, append the following text: /api/oauth/token. Then choose POST as the method, and in the Headers section set the first key to “Content-Type” and its value to “application/x-www-form-urlencoded”:

Content-Type       application/x-www-form-urlencoded

In the Body field, you want to place the following text, substituting your API Client ID and Client Secret where I have “<YOUR_API_CLIENT_ID>” and “<YOUR_API_CLIENT_SECRET>”:

grant_type=client_credentials&client_id=<YOUR_API_CLIENT_ID>&client_secret=<YOUR_API_CLIENT_SECRET>

Your parameters should look a lot like this:

Next, we need another variable to capture the bearer token we get back from that HTTP action. Use the Add an Action button, search for “variable” again, and add the Initialize Variable action. Name this action (I used “accessToken”), name the variable “bearerToken”, and set the Type to String. For the Value, click in the field and choose the “fx” icon on the right side. In the window that slides out to the right, choose “Dynamic Content” at the top, then scroll to find the “Body” entry under the HTTP action we created previously (mine is named “generateBearerToken”). In the box above the list, add "['access_token']" and click Add:
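If it helps to see what these two actions are doing under the hood, here is a rough Python sketch of the same token request and the “access_token” extraction. It is not part of the flow itself; the requests library is assumed, and the Jamf Pro URL and credentials are placeholders.

import requests

# Placeholders standing in for the "url" variable and your API Client values.
JAMF_URL = "https://yourjamfpro.jamfcloud.com"
CLIENT_ID = "<YOUR_API_CLIENT_ID>"
CLIENT_SECRET = "<YOUR_API_CLIENT_SECRET>"

# Roughly what the "generateBearerToken" HTTP action does:
# POST <url>/api/oauth/token with a form-encoded client-credentials body.
token_response = requests.post(
    f"{JAMF_URL}/api/oauth/token",
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    },
)
token_response.raise_for_status()

# Roughly what the "accessToken" variable captures: body(...)['access_token']
bearer_token = token_response.json()["access_token"]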

Before we get too much further, make sure to save your flow. You can give your flow a name by clicking on the name to the left of the menu bar (it should be “Untitled” at the moment) and then clicking the Save button.

Now that we’ve saved our flow and added several actions, let’s make sure everything is working and test this flow. Click on the Test button in the upper right of the window. This will slide out a panel on the right side. Choose Manually and click Test at the bottom. On the next pane, click Run Flow at the bottom of the page, and then click Done at the bottom of the third pane.

You should have green checkmarks on all of the actions in your flow. Click on the Access Token action, which will open a pane on the left side. Under “Inputs,” check to see if you received an access token from the “generateBearerToken” action that we set up.

If you have a token in the “value” field of the Inputs section, then we are good, and all of the work we’ve done is correct. We can then move on to the next action we need, which is another HTTP action. Put your flow in Edit mode again by clicking the Edit button at the top right of the window, then click on Add an Action and search for HTTP again.

Again, name the action (mine is “getJSON”), and for the URI, choose the “url” variable from our baseURL action (lightning bolt -> “url” under baseURL). At the end of the URL, add “/JSSResource/advancedcomputersearches/id/<ID#>” and replace “<ID#>” with the ID number of the Advanced Search we created earlier.

Set the Method to GET. For Headers, set the first key to Accept with a value of “application/json”, then add a second header with Authorization as the key. For its value, type “Bearer ” (include the trailing space), click the lightning bolt, and choose “bearerToken” from the list of variables. Once you are done, the parameters should look like this:
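If you’d like to see the same call outside of Power Automate, it continues the Python sketch from earlier like so; the search ID of 42 is just a placeholder for your Advanced Search’s ID number.

# Roughly what the "getJSON" HTTP action does: pull the Advanced Search
# results from the Classic API as JSON, using the bearer token from above.
SEARCH_ID = 42  # placeholder; use your Advanced Search's ID number

search_response = requests.get(
    f"{JAMF_URL}/JSSResource/advancedcomputersearches/id/{SEARCH_ID}",
    headers={
        "Accept": "application/json",
        "Authorization": f"Bearer {bearer_token}",
    },
)
search_response.raise_for_status()
search_json = search_response.json()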

Next, we’ll parse the JSON from this “getJSON” action. For Power Automate to know what to do with the data, we need to give it the JSON schema, and to generate that schema we need a JSON sample. The easiest way to get that sample is to run our flow and copy the output from the “getJSON” action.

First, save your flow and then click on the Test button once again to test the flow. Once the test is done running, hopefully without any errors, click on the “getJSON” action to see the output. Locate the OUTPUTS section, then the body section, and click the “copy” icon towards the top of the body window. This will copy the JSON of our Advanced Search.
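For a sense of what you just copied, the response is shaped roughly like the trimmed, hypothetical sample below, shown as a Python literal for readability. The search name, ID, and per-computer keys are placeholders; the keys on each computer depend on the display fields you selected.

# A trimmed, hypothetical example of the Advanced Search JSON; the search
# name, ID, and per-computer keys here are placeholders, not real data.
sample_payload = {
    "advanced_computer_search": {
        "id": 42,
        "name": "Power BI Export",
        "computers": [
            {"id": 1, "name": "Mac-001", "Serial_Number": "C02XXXXXXXXX"},
            {"id": 2, "name": "Mac-002", "Serial_Number": "C02YYYYYYYYY"},
        ],
    }
}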

Now that we have that copied to the clipboard, put your flow in Edit mode again by clicking the Edit button at the top right of the window. We’ll want to add another action, so click Add an Action. This time, search for “json” and choose “Parse JSON” under the Data Operations actions.

Click in the Content field, then click the lightning bolt, and choose “Body” under the “getJSON” section of the list.

Next, click on the “Use sample payload to generate schema” link under the Schema box. This will open a new window where you will paste the JSON we copied above. Once you have pasted the JSON, click “Done.”

You should now have a schema in the Schema box. Click on Add an Action, and this time, add another HTTP action. Rename the action to “invalidateToken,” set the URI to be our “url” variable from “baseURL” (lightning bolt), and add “/api/v1/auth/invalidate-token” after the URL. Once again, our method is going to be POST, and our headers are the same as above: Accept and Authorization.
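Continuing the same Python sketch, this cleanup call would look roughly like the following; again, it is only an illustration of what the action does.

# Roughly what the "invalidateToken" HTTP action does: tell Jamf Pro we are
# finished with the bearer token so it can no longer be used.
requests.post(
    f"{JAMF_URL}/api/v1/auth/invalidate-token",
    headers={
        "Accept": "application/json",
        "Authorization": f"Bearer {bearer_token}",
    },
).raise_for_status()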

We have finally reached the last action, and probably the most important one: Create File in SharePoint. Go ahead and click on Add an Action and search for “create file.” Then, choose the Create File option under SharePoint.

In the parameters for this action, you’ll need to enter the SharePoint site URL, choose the folder to save to, and give the file a name. When adding the SharePoint site address, you will most likely get a pop-up under the address telling you “No values match” and asking if you want to use that address as a custom value. Click on the “Use https…. as a custom value” link.

In the Folder Path field, you can use the folder icon to the right to navigate to the folder you want to save to. I saved it to a folder within the SharePoint site’s Shared Documents folder.

For the File Name, I prefixed the name with the ID number of the Advanced Search, just so I would know which search to go back to if I ever needed to change anything. Make sure the file name ends with a .json extension.

In the File Content field, click the “fx” icon and choose “Dynamic Content”. Select the “Body” from under “parseJSON” and add "?['advanced_computer_search']['computers']" (don’t forget the question mark) in the dialog box.
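That expression is essentially a dictionary lookup. In the Python sketch, the content the Create File action writes out would be roughly the following; the file name is a placeholder that follows the naming advice above.

import json

# Roughly what body('parseJSON')?['advanced_computer_search']['computers']
# selects: just the computers array from the Advanced Search results.
computers = search_json["advanced_computer_search"]["computers"]

# The Create File action writes this content to SharePoint; locally, the same
# data could be written to a .json file like this.
with open("42_computers.json", "w") as f:
    json.dump(computers, f)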

Save your flow and go ahead and run a test. If we did everything right, you should have a successful run of the flow, and you should have a .json file in the folder on your SharePoint site.

Wrapping Up

Well, we accomplished a lot in this post. It was longer than I would have liked, but I felt the screenshots were necessary to ensure our success at the end. I am going to stop here, before getting into the Power BI portion, to give you all a chance to digest.

The next post, where we actually go into Power BI, is ready to go, and I will launch it in the next few days. In that post, I will show you how to use the SharePoint connector to bring our data in, how to transform the data, and how to make a couple of analytics in Power BI.
