Fabian Williams SharePoint Blog

Solving problems with SharePoint day and night

Creating a SharePoint Timer Job using SharePoint Designer 2013


What do I really mean?

Now that I have your attention with that eye-catching title, let me expand on and qualify it. Here is exactly what I will be doing:

  1. Inspecting a List (a Task List, actually) for Tasks that are due and overdue.
  2. Checking them nightly (every 24 hours, to be exact).
  3. If tasks are overdue, taking ‘some’ action on them. In this instance I will be changing a KPI field, but the action can be as elaborate as your particular use case demands; see below for the one I have in mind.
  4. Pausing the Workflow for 24 hours and repeating.

By the way, this CAN also be used in SharePoint Online.

Use Case

In this use case I have been charged with monitoring Task lists (multiple Task lists from varying SharePoint Sites/Site Collections) and aggregating them. Once that is complete, we need to surface the overdue tasks in a KPI dashboard and email a link to the managers of the offending task owners.

For my POC demo here I will show how to query one Site Collection; to cover multiple, all you have to do is make a separate REST call to each particular List’s API, and the rest is easy.

Pre-Requisites and Technologies Used

In this example I am using an on-premises SharePoint 2013 farm with Workflows enabled, and I am using SharePoint Designer 2013 for tooling, i.e. NO CODE. I will be using the SharePoint 2013 REST API to read the SharePoint Task Lists as my entry point URI. The generic URL is http://<sitename>/_api/web/lists/getbytitle('<listNameHere>')/items, which fetches List Items from a named list using OData. By default the results are returned to you in XML, so in order to use them in SharePoint Designer, which requires JSON inside the Dictionary object (see my post here if you need a refresher on that), you will need to modify the headers to accept JSON using the Accept header. The rest is a matter of using Loops to iterate through the List Items returned, and pausing for the time allotted.
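
If you want to sanity-check that call outside of Designer first, here is a minimal Python sketch of the same request. The site and list names are placeholders, authentication is omitted, and SPD itself does all of this for you with the Call HTTP Web Service action:

```python
import requests

# Placeholder site and list names -- substitute your own.
SITE_URL = "http://yoursharepointsite"
LIST_NAME = "Tasks"

# The same call the workflow makes: GET the list items, asking for
# JSON instead of the default XML via the Accept header.
url = SITE_URL + "/_api/web/lists/getbytitle('" + LIST_NAME + "')/items"
headers = {"Accept": "application/json; odata=verbose"}

# On-premises you would normally also authenticate (NTLM/Kerberos);
# that part is left out here for brevity.
response = requests.get(url, headers=headers)
response.raise_for_status()

data = response.json()
print(len(data["d"]["results"]), "items returned")
```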

Approach

As usual, I put my URI inside a variable to obscure the name and any API keys. This time there was no need, but it makes for better programming, especially if there are a lot of them and I want to swap them out from time to time without affecting the core program.

[Screenshot: the workflow stage, with the URI stored in a string variable]

The full list of properties from the Call HTTP Web Service action is shown below, since it is truncated in the image above.

[Screenshot: the Call HTTP Web Service action properties]

The request header is “Accept: application/json; odata=verbose”. You can place that into a string variable as well, then pass it to the HTTP Web Service properties inside a Workflow Dictionary variable.

[Screenshot: the request headers Dictionary variable]

Once you have done that, then as seen in the first image and in my previous blog post on the same topic, you parse the JSON results to get the part of the dataset that you want. In this case I want what is under the “d/results” node; to get it I use Get an Item from a Dictionary and parse out what I need. Then I count the items for good measure, save the “Count” as a variable to be used as the upper bound of my Loop counter, and log it out.
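
For reference, the equivalent of those dictionary steps in the Python sketch above would be something like this; it is just an illustration of the “d/results” parse, not what SPD generates:

```python
# Drill down to the "d/results" node (Get an Item from a Dictionary),
# count the items (Count Items in a Dictionary), and log the count.
results = data["d"]["results"]
count = len(results)
print("Count:", count)  # the workflow logs this to the history list
```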

Do Business Logic

After setting the stage by getting the URI, returning the JSON data, parsing it down to the node you want, and getting a count, the next thing is the business logic.

What you will see in the image below is us using the Count variable as the upper bound of our Loop, and then pulling the JSON data into local variables to be manipulated in the workflow logic.

//MY SOAPBOX

So, I have been talking about this for a while; most people who have known me for a long time know me because of my work and efforts in SharePoint BCS. But I have actually shifted my thinking in some respects from what BCS allows (even in 2013) to what Workflows give you in terms of interacting with external data. My point is this: if you do not need to look at that data (i.e. an External List), or need to search on that data (i.e. an External Content Type as a Content Source), I THINK it is in your best interest, from a simplicity standpoint, a performance standpoint, and a best-of-breed standpoint, to do this as a Workflow calling external web services, manipulate the data how you want, and then get rid of it (i.e. the variables) when done. That way the data is only used for its intended purpose, then goes away.

//END SOAPBOX

Back to our story. What we do next is inspect the Due Date and use conditional statements to make changes to whatever fields you need. In this instance I am updating a KPI field based on how far along, or how overdue, a Due Date is on a Task List. The beauty of this use case is that I can apply the same logic to any number of lists, both on-premises and in Office 365; I have scale and scope down to a tee 🙂 A code sketch of the due-date check follows the screenshot below.

 

[Screenshot: the Loop iterating the task items, checking Due Dates, and updating the KPI field]
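
To make the conditional logic concrete, here is a rough Python sketch of the kind of due-date check the loop performs. The “DueDate” field, the KPI values, and the thresholds are all hypothetical, so match them to your own columns and rules:

```python
from datetime import datetime, timezone

def kpi_for(item):
    """Pick a KPI value based on how close or overdue the Due Date is."""
    # SharePoint returns dates like "2013-09-04T00:00:00Z".
    due = datetime.strptime(item["DueDate"], "%Y-%m-%dT%H:%M:%SZ")
    due = due.replace(tzinfo=timezone.utc)
    now = datetime.now(timezone.utc)

    if due < now:
        return "Overdue"         # past due: flag it
    elif (due - now).days <= 2:
        return "At Risk"         # due within two days: warn
    return "On Track"            # comfortably ahead of schedule
```

In the workflow the write-back is simply the Update List Item action; if you were doing it over REST yourself, the update is a POST with a form digest and the X-HTTP-Method: MERGE header, which is more ceremony than Designer ever shows you.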

 

The logic loops through each item, does a check, and then increments the loop counter until it hits the “Count” variable, at which point it exits. Since this is a Timer Job, the next thing we need to do is pause it and wait; see below.

[Screenshot: the Pause for Duration action set to wait 24 hours]
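
Stripped of the Designer surface, the whole “timer job” reduces to a skeleton like this. fetch_tasks and update_kpi are hypothetical stand-ins for the steps sketched above, and unlike the real workflow this sketch is not durable across restarts:

```python
import time

while True:
    # Do the day's work: read the list, check each item, update KPIs.
    for item in fetch_tasks():        # the GET sketched earlier
        item["KPI"] = kpi_for(item)   # the conditional logic above
        update_kpi(item)              # Update List Item, in SPD terms

    # Pause for Duration: wait 24 hours, then go again.
    time.sleep(24 * 60 * 60)
```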

Now, this workflow was designed as a Site Workflow so it can be run independently of any Library or List, and it can run indefinitely.

Summary

So the next time someone asks you to do a Timer Job for them, especially if they want it in Office 365 (SharePoint Online), and it involves a use case similar to mine, or at the very least involves making changes to a List/Library, consider a Site Workflow with conditional logic and a Pause Duration. It really is that simple.

If this were being done in SharePoint Online, you would also need to capture and pass along the FedAuth and rtFa authentication cookies in your dictionary object.
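
As a sketch of what that means outside the workflow, a raw REST call against SharePoint Online has to present those cookies. The tenant URL and cookie values below are placeholders, and how you acquire them (the SharePoint Online sign-in handshake) is outside the scope of this post:

```python
import requests

# Placeholder values -- real FedAuth/rtFa cookies come from the
# SharePoint Online sign-in flow, not from thin air.
cookies = {"FedAuth": "<FedAuth cookie value>", "rtFa": "<rtFa cookie value>"}
headers = {"Accept": "application/json; odata=verbose"}

response = requests.get(
    "https://yourtenant.sharepoint.com/_api/web/lists/getbytitle('Tasks')/items",
    headers=headers,
    cookies=cookies,
)
response.raise_for_status()
```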

Cheers and Happy SharePointing


September 4, 2013 - Posted by fabiangwilliams | JSON, Office 365, REST, SharePoint 2013, SharePoint Designer 2013, SharePoint How-To, SharePoint Online, Workflows

13 Comments

  1. Fabian,

    It is nice, but SharePoint Workflows aren’t the best performers. They consume CPU and memory, and a loop will keep consuming resources. What are your thoughts on that?

    André

    Comment by André | September 4, 2013 | Reply

    • Hey Andre, great hearing from you again mate, hope all is well, it’s been too long.
      So, to answer your questions and address your comments, let me make a few points, then I’ll summarize.
      1. Everything consumes some level of CPU and memory; it’s how it is used and deployed that is important.
      2. Loops are memory-intensive by nature; they have to keep track of items, counters, and iterations. Agreed.

      Notwithstanding that, I think your argument would be valid if we had our Workflow on our SharePoint farm, as in the case of SP 2010, but if you structure your Azure Workflow Service in a proper farm then it only consumes the resources you deem fit, and you can also direct which node in the Workflow Farm these items run on. NOW, if you put this on BCS, you don’t have that isolation. That, and the reasons I outlined above, is why I think WFs are better for this job. The real value is that this can be used in O365, and since MS isn’t allowing you to work in their farm to do Timer Jobs, it’s a good route to go.

      Comment by fabiangwilliams | September 4, 2013 | Reply

      • Hi Fabian,

        Thanks for your reply! Yes, my life is great, I’m having a good time… but with less time to program, I’m only doing IT Pro stuff at the moment. You’re doing great with all these new features you use.

        I agree with you that WF is a way to bypass O365 limitations. It is a cool feature which you can use. Of course, I always watch memory and CPU consumption with WF. And Azure Workflow is a lot better than SP2010 WF. I will keep in mind that there are multiple possibilities regarding Timer Jobs.

        What about reboots, patches, etc.? Will it resume after a one-time failure? Or does it stop?

        Cheerz mate!

        Comment by André | September 5, 2013

      • Reboots and patches won’t affect it; that is all handled by the persistence of the Workflow Manager now, it’s all subscription-based.

        Comment by fabiangwilliams | September 5, 2013

  2. You’re on a roll, Fabian!
    Have you had long-term success with this in production? I’m not sure I trust the reliability of the loops yet, since the feature is new.

    Comment by resing | September 4, 2013 | Reply

    • I’ve done this in production once for this very same thing. You have to realize that it only does a loop once per day, though for as many iterations as there are items in the list. It’s not very transactional in nature, so for me it’s not much of a hit.

      Comment by fabiangwilliams | September 4, 2013 | Reply

      • Good to know. I really like how it deploys through SPD both to on premises and to multi-tenant hosted environments.

        Comment by resing | September 4, 2013

  3. Yah I can see why Rackspace would be interested in that 🙂 one solution that can apply to all your customers. 🙂 you can send a check to ….. LOL

    Comment by fabiangwilliams | September 4, 2013 | Reply

    • Ah, you noticed how I wrote “multi-tenant hosted environments” instead of Office 365! 😉

      Comment by resing | September 4, 2013 | Reply

  4. […] I’ve blogged about this topic and how I go about doing REST via SPD and Fiddler here and here and here, the first and last are probably most appropriate for this line of question. Below are […]

    Pingback by HELP: Unable to Create List using SharePoint 2013 REST API in SPD2013 « Fabian Williams SharePoint Blog | September 6, 2013 | Reply

  5. My experience has been that I prefer to have the workflow run once to completion every night. Then I schedule the workflow to run every evening at the same time. The reasons I prefer this are:

    1. I do not like to have a workflow waiting for a long period. I have found issues with Workflows not waking up properly.
    2. If I have the workflow scheduled to run every night, then if I need to make a change to the workflow, the newer version of the workflow will run in the future, not the one I started a month ago.

    Comment by marcel meth | November 15, 2013 | Reply

    • Marcel, thanks for the comments and your point of view; for the most part (I guess 50%, if I had to quantify 🙂) I’d agree. I think your second point is very poignant and makes perfect sense in that scenario. However, as to your first, I think that was an issue when Workflow was coupled with SharePoint, i.e. pre-SP2013, but with the decoupling and the ability to build a Workflow Farm, or even dedicate a WF box, you can segment work that could be, hmmm, risky, or that needs more horsepower, away from other business processes. Regardless, I always welcome additional points of view; thanks for sharing. Cheers.

      Comment by fabiangwilliams | November 15, 2013 | Reply

  6. […] Q. So this approach should be used in all cases? A. No, as always in Solution Design 'it depends' is your mantra. Full trust solutions and timer jobs still exist (except for O365 of course), and if that's currently the best way to meet your requirements then do it. But where possible you might want to use the above to save your organization future headaches. I should also mention that recurrent functionality can be done via Workflows, but I suspect the WF approach will not be robust enough for most use cases where a Timer Job would have been considered. That said, do consider the post by Fabian Williams here. […]

    Pingback by SharePoint Nirvana | Alternatives to SharePoint Timer Jobs | December 4, 2013 | Reply

