Uploading Attachments from SharePoint Online to a DevOps WorkItem

An issue which I have recently been looking at involved taking a file, attached to a SharePoint list item, and attaching it to a DevOps work item using Microsoft Flow. For those familiar with Microsoft Flow, there are a number of actions available for interacting directly with DevOps, and in particular there are actions for working with work items. So you would be forgiven for assuming that this would include the ability to add an attachment.

Unfortunately this isn't the case, and it isn't a shortcoming of Microsoft Flow; rather, DevOps doesn't accept attachments in the way that you would expect it to.

The Issue

Through the DevOps user interface, you can open a work item and upload an attachment, which then appears under the Attachments tab. Behind the scenes, however, the file is first uploaded into DevOps and is then associated (or linked) with the work item. This is why the Work Item actions in Microsoft Flow don't allow us to add attachments: those actions interact directly with the work item entry itself.

When doing this within Flow, I need to make this call twice: once to upload the file into DevOps, and a second time to attach it to the work item.
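The two steps map onto the Azure DevOps REST API roughly as follows (endpoint shapes are from the Azure DevOps Services REST API reference; the api-version shown is an assumption and may differ for your organisation):

```powershell
# 1. Upload the file itself - the response contains an id and a url
#    POST https://dev.azure.com/{organisation}/{project}/_apis/wit/attachments?fileName={file}&api-version=5.1

# 2. Link the uploaded attachment to the work item using the url from step 1
#    PATCH https://dev.azure.com/{organisation}/{project}/_apis/wit/workitems/{workItemId}?api-version=5.1
#    Body (application/json-patch+json):
#    [ { "op": "add", "path": "/relations/-",
#        "value": { "rel": "AttachedFile", "url": "<url from step 1>" } } ]
```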


Now that worked well during my initial testing, as I was only using a text file. When I attached a text file, sending a base64 encoded string of my file succeeded every time, and I was able to push it to a work item with everything intact. The issue came when I tried to do the same with a Word document, which wouldn't convert in the same way; when the file was attached to DevOps, it would appear to be there but would never open. The file had become corrupted.

The Resolution (Workaround)

After investigating this for a while, I eventually found that when uploading a more complex file, e.g. a Word document, I actually need to convert the file to a byte array and then pass this to the API. After running a few tests with some local PowerShell I confirmed that this worked. So how do I solve this using Flow?
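A minimal local test along those lines might look like this (the organisation, project, file path and PAT values are placeholders, and api-version 5.1 is an assumption):

```powershell
# Read the file as raw bytes rather than a base64 string - this is what
# keeps binary formats such as .docx from being corrupted on upload
$bytes = [System.IO.File]::ReadAllBytes("C:\Temp\Document.docx")

# A Personal Access Token, passed as a Basic auth header
$pat     = "<personal-access-token>"
$token   = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
$headers = @{ Authorization = "Basic $token" }

# Push the byte array straight to the attachments endpoint
$uri = "https://dev.azure.com/<organisation>/<project>/_apis/wit/attachments?fileName=Document.docx&api-version=5.1"
$result = Invoke-RestMethod -Uri $uri -Method Post -Headers $headers `
                            -Body $bytes -ContentType "application/octet-stream"

# $result.id and $result.url identify the uploaded attachment
$result.url
```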

I should point out here that the actual solution isn't strictly Flow; I am still using Flow as the controlling engine. The main element of the solution is the PowerShell which I'd used for testing, packaged up into Azure Automation Runbooks.

Note: The code which I'll share in my own DevOps project is purely geared towards answering questions from people in the Flow Community and is not intended to be a production-ready solution.

Azure Automation

The first thing I did was create a Runbook in Azure to hold the code from my repo. Effectively, the Runbook receives some information passed to it from Flow; these are the parameters at the top of the script. So from my Flow, I pass the following to the Runbook:

  • ID of the list item I want to process
  • List name containing the list item
  • Organisation URL of DevOps (this could be moved to be stored within the Azure Automation Assets)
  • Project Name
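In the Runbook those inputs surface as script parameters, along the lines of the sketch below (the parameter names are my own illustration, not necessarily those used in the repo):

```powershell
param (
    # ID of the SharePoint list item to process
    [Parameter(Mandatory = $true)]
    [int]    $ItemId,

    # Name of the list containing the item
    [Parameter(Mandatory = $true)]
    [string] $ListName,

    # e.g. https://dev.azure.com/<organisation>
    [Parameter(Mandatory = $true)]
    [string] $OrganisationUrl,

    # The DevOps project to upload into
    [Parameter(Mandatory = $true)]
    [string] $ProjectName
)
```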

My Runbook is quite simple in that all it does is utilise PnP PowerShell to connect to my SharePoint site, grab the list item from the list and then grab the attachments.

What I am doing from here is effectively grabbing the file and temporarily storing it locally to the Runbook so that I can convert it to a byte array. Had I spent more time I would have investigated the possibility of doing this conversion in memory, but for speed I simply save a copy of the file.
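Sketched with PnP PowerShell, that part of the Runbook looks something like this ($ListName and $ItemId stand for the values passed in from Flow, and the credential asset name is a placeholder):

```powershell
# Connect to the SharePoint site using a credential stored as an
# Azure Automation asset (the asset name here is a placeholder)
$cred = Get-AutomationPSCredential -Name "SharePointCreds"
Connect-PnPOnline -Url "https://<tenant>.sharepoint.com/sites/<site>" -Credentials $cred

# Grab the list item, then explicitly load its attachment collection
$item        = Get-PnPListItem -List $ListName -Id $ItemId
$attachments = Get-PnPProperty -ClientObject $item -Property AttachmentFiles

foreach ($attachment in $attachments) {
    # Save a temporary local copy of the file inside the Runbook sandbox
    Get-PnPFile -Url $attachment.ServerRelativeUrl -Path $env:TEMP `
                -FileName $attachment.FileName -AsFile -Force

    # Convert the local copy to a byte array ready for the REST call
    $bytes = [System.IO.File]::ReadAllBytes((Join-Path $env:TEMP $attachment.FileName))
}
```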

Once that file has been converted to a byte array, I can push it to the DevOps REST API just as I had tried previously in Flow, and process the result.

The result returns two key pieces of information: an ID and a URL. The URL is what I then take and use in my second REST call to associate the file with the work item.
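That second call is a JSON Patch update against the work item; a sketch, assuming $result holds the response from the upload, $headers carries the PAT auth header built earlier, and $workItemId is the ID of the target work item (all names illustrative):

```powershell
# Build the JSON Patch document that links the uploaded file
$patch = @(
    @{
        op    = "add"
        path  = "/relations/-"
        value = @{
            rel        = "AttachedFile"
            url        = $result.url
            attributes = @{ comment = "Uploaded from SharePoint via Flow" }
        }
    }
)

# Work item updates must be sent as application/json-patch+json
$uri = "$OrganisationUrl/$ProjectName/_apis/wit/workitems/${workItemId}?api-version=5.1"
Invoke-RestMethod -Uri $uri -Method Patch -Headers $headers `
                  -Body (ConvertTo-Json -InputObject $patch -Depth 5) `
                  -ContentType "application/json-patch+json"
```

Using `-InputObject` (rather than piping into `ConvertTo-Json`) keeps the outer array in place, which the json-patch content type requires even for a single operation.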


I have created myself a Runbook which can be triggered from Microsoft Flow using the Azure Automation action "Create Job". This way I can still use all of the other goodness that I get with Flow, and I can capture anything which I return from my Runbook to process further.

My Flow is now much simpler than it was previously, as all of the logic to get the attachments and then call the REST APIs is part of the Runbook. All I need to do is use Flow as my controlling engine: create the Runbook job and pass it the basic information it needs.

The action that you are interested in is "Create Job" from the Azure Automation group; use your credentials to connect it to your Azure tenancy. The resources that you have access to will all be found in the relevant drop-down selections, such as the subscription, the resource group and the automation account. Once you have selected your Runbook, the additional parameters that are required will appear, allowing you to use dynamic content from Flow to pass into your PowerShell script.

Now run your Flow. You will see the Create Job action fire, and the job will be queued in Azure ready to be processed by your Runbook. Your PowerShell script will then grab the data that it needs from the SharePoint list, convert it to the correct format and upload it to DevOps. The script will then perform a second call to associate the file with the work item.

Hopefully this will save people some time trying to get this working with Flow. If you have any questions, please feel free to reach out on Twitter: @MattWeston365 or drop by the Flow Community (https://powerusers.microsoft.com/t5/Microsoft-Flow-Community) forums and one of the community experts will be ready to help.


Azure DevOps Services REST API Reference: 


MattWeston365 DevOps Script Repo