Category Archives: K2 Workflows

Completing tasks with Exchange

We recently came across an interesting issue where a user would complete a task via Exchange integration (replying to the email with “Approve”) and, even though the action succeeded, they would always get a second email with:

“The K2 server could not find the worklist item 479_54. This item may have been actioned by another user.
The full error from the K2 server is ‘The worklist item 479_54 is no longer available or you do not have rights to open it.’.”

It took a while to get to the bottom of it, but when you realise what’s going on it’s actually simple to resolve.

What happens is that the K2 service account checks the mailbox for emails, and when it finds one it parses it and actions the task. In our case the problem was that the UAT K2 server was using the same K2 service account as the production server, which means the Exchange account was being checked by both the production and the UAT K2 services. The production server happily completed the task, but of course that task doesn’t exist in the UAT environment, so the second ‘fail’ email came from the UAT server, not the production server.

To resolve this issue make sure you aren’t using the same service account on more than one environment on the same domain. Of course it’s best practice to use different accounts anyway, but here’s one more reason why…


Posted by on September 18, 2015 in K2 Workflows


Some users complete tasks, but it ain’t working…

Had a very interesting issue with a client’s environment recently. There was one user (let’s call her Jane) who couldn’t complete tasks. They would appear in her task list and she could action them without any errors, but nothing seemed to happen and the task would just sit there until someone else completed it.

It turns out the error was in the database, but it’s tricky to see the problem. I ran the following query:

SELECT *
FROM [K2].[Server].[Actioner]
WHERE ActionerName LIKE '%Jane%'

The results looked fine, but exporting the results to CSV and opening them in Notepad revealed the problem. I expected this:

Jane

But what I got was this:

Jane____________________

Notice the ‘_’s? That’s actually a massive amount of whitespace after the user’s name (rendered here as underscores). That’s what caused the issue. I trimmed the username and suddenly the problem went away. Here’s the script to fix all the entries (thanks Mieke):

begin transaction
update [K2].[Server].[Actioner]
set [ActionerName] = RTRIM([ActionerName])
commit transaction
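
Before running the update, it’s worth finding the affected rows first. There’s a gotcha here, which is part of why the problem is so hard to spot: SQL Server pads strings with trailing spaces when comparing with = or <>, so a WHERE clause like “[ActionerName] <> RTRIM([ActionerName])” returns nothing, and LEN() also ignores trailing spaces. Comparing byte lengths with DATALENGTH does work:

SELECT *
FROM [K2].[Server].[Actioner]
WHERE DATALENGTH([ActionerName]) <> DATALENGTH(RTRIM([ActionerName]))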

So what caused it? Here’s a response from K2 support (who were very helpful as always):

Hi Trent

Values are written to the Actioners table based on the Destination Users set in a workflow. Every User/Group/Role or value ever used as a Destination will show up in this table. Effectively, any user that gets authenticated by the server…

The problem comes in at the point where Destinations are assigned. In cases where Destinations are assigned dynamically via DataFields whose values are derived from outside sources (drop-downs, user input and so on)… If a value was copy/pasted into an input field that will then be used as the Destination, then this value will end up in the Actioners table as is (UTF characters and all).

In most cases this is why these spaces are introduced… erroneous user input which eventually filters down to the DB level but is pretty much invisible from any UI elements due to SQL’s and most of the web interface’s auto-trimming.

In very rare cases I have found that the spaces come directly from AD, where the AD object was created using PowerShell and the inputs for the user details were copy/pasted from some sort of rich text document containing UTF characters… However, in your case the former appears more likely. Check where the Destination Users for your workflows are derived from and ensure that there are no occurrences where users can copy/paste values for Destinations… or if that is unavoidable, make sure to TRIM values before submitting to the K2 server.
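
Following that advice, here’s a minimal sketch of trimming a user-captured value before it’s submitted as a destination (the data field name, the input variable and the surrounding client API calls are illustrative, not from the original post):

// Trim erroneous whitespace from user input before it reaches the K2 server
// as a destination value.
string destinationUser = approverInput.Trim();
processInstance.DataFields["DestinationUser"].Value = destinationUser;
connection.StartProcessInstance(processInstance);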


Posted by on June 25, 2015 in K2 Workflows


Data Fields and Workflow Versions

Here’s our scenario… We have live workflows deployed and a fairly complex ASP.NET app as a front end. In our latest deployment we have added a few process-level data fields, and the application now expects to be able to read and write those data fields. However, existing live process instances won’t have those data fields, so we will get exceptions when we try to access them. So what to do?

I figured there would be two ways of dealing with this sort of versioning. The first option is to store the process version in a data field; we could read it, and its value would tell us whether or not to expect the new data fields to exist. This is fine in theory, but in practice I can see it causing problems further down the line, plus it means two reads instead of one.

The option I decided on is to create a WorkflowDataField class and return this instead of just a string. This WorkflowDataField would not only contain the value of the data field but it would also have a result type so the calling code could check if there were errors.

Here’s my WorkflowDataField class:

    public enum ResultType
    {
        Success,
        DataFieldDoesNotExist
    }

    public class WorkflowDataField
    {
        private string m_value;

        public ResultType QueryResultType { get; set; }

        public string Value
        {
            get { return m_value; }
            set { m_value = value; }
        }

        public int IntValue
        {
            get { return Convert.ToInt32(m_value); }
            set { m_value = value.ToString(); }
        }

        // etc...

        public override string ToString()
        {
            return Value;
        }
    }
Pretty simple, nothing fancy. Now when I query the data field I do something like this:

var workflowDataField = new WorkflowDataField();
try
{
    workflowDataField.Value = processInstance.DataFields[key].Value.ToString();
    workflowDataField.QueryResultType = ResultType.Success;
}
catch (Exception e)
{
    workflowDataField.QueryResultType = ResultType.DataFieldDoesNotExist;
    workflowDataField.Value = e.Message;
}
return workflowDataField;

Unfortunately the exception thrown is nothing more specific than System.Exception so if you want to be sure that the exception is being thrown because of a missing data field then you have to parse the exception message for ‘does not exist’.
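
If you do parse the message, here’s a hedged sketch of what the catch block could look like (“does not exist” is just the substring I’ve observed in the message, and ResultType.UnknownError is a hypothetical extra enum member for anything else):

catch (Exception e)
{
    // Fragile by design: we're matching on the exception message text.
    workflowDataField.QueryResultType = e.Message.Contains("does not exist")
        ? ResultType.DataFieldDoesNotExist
        : ResultType.UnknownError;
    workflowDataField.Value = e.Message;
}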

I am a firm believer in the philosophy of not using exceptions to control program flow, but in this case I made an exception (sorry for the pun). The reasons I’m allowing myself to do this are:
1. It’s not an unexpected system error; it’s expected.
2. Exceptions are slow and expensive, but we’d only be catching exceptions while the old instances are alive. As time goes on we’d get fewer and fewer, so in the long run it would be better performance than doing 2 reads each time.
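
To make the payoff concrete, the calling code ends up looking something like this (GetDataField is a hypothetical wrapper around the query code above, and the field name and UseInvoiceNumber method are made up for illustration):

var invoiceNumber = GetDataField(processInstance, "InvoiceNumber");
if (invoiceNumber.QueryResultType == ResultType.Success)
{
    // New-style instance: the data field exists.
    UseInvoiceNumber(invoiceNumber.Value);
}
else
{
    // Older live instance deployed before the field existed: fall back to a default.
    UseInvoiceNumber(string.Empty);
}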

Hope that helps someone…


Posted by on July 5, 2012 in K2 API, K2 Workflows


Moving K2 databases?

My first Making Flow Work post! Feels like I’m cheating on the Naked Programmer.

Sometimes you need to move K2’s databases from one SQL Server to another. Various reasons might force you to do it, for example when you realise you probably shouldn’t have put the K2 databases on the same SQL Server as SharePoint 2010.

It’s a straightforward process consisting of two parts:
1. Backup and restore all the K2 databases to a new SQL Server
2. Recreate the SCSSOKey symmetric keys and SCHostServerCert certificates

The following two articles explain the whole process step by step:

I did however discover that, even after running the K2 Setup Manager to reconfigure K2 to point to the new SQL server, the out-of-the-box K2 reports stopped working and users could no longer create custom worklist filters (I assume various other user-specific configuration features were broken too). Users trying to run an OOB report got this error:

“An error occurred loading the report. Service: K2Generic Settings Service Guid: ffffffff-ffff-ffff-ffff-fffffffffffff Severity: Informational Error Message: Cannot open database “K2HostServer”.”

Users trying to create worklist filters got: “Could not save user data. Check error logs for more information”.

It seems that there is a service broker instance called the Generic Settings Service that contains a link to the K2HostServer database, and this link is not updated by the K2 Setup Manager. After manually updating the link, everything worked fine.


Posted by on November 16, 2011 in K2 Workflows


Getting child process IDs after IPC Event

I like IPC events and use them all the time – if you design your processes well with IPC events then you can keep them small and manageable, and each time you hit an IPC event your process is bumped up to the latest version, so it’s easier to propagate bug fixes. However, I recently needed to keep track of the child process IDs and realised that you can’t do this using the IPC event wizard. I tried to map the child process ID to an activity-level data field in the parent process, but unfortunately I couldn’t add the process ID field from the child process as a source.

So how can you do it?

My implementation was to pass the parent process ID into the child process, and then write these IDs to a database in the first activity of the child process:

Child Process

Another way to do it is to include the parent process instance ID in the folio of the child process, then when you retrieve the worklist you can filter the worklist by folio. This is a simple way of doing it but isn’t the most efficient.
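
The first approach boils down to a server event in the child process’s first activity, something like this (the table, connection string and data field names are all illustrative, and I’m assuming the child’s own instance ID is available via K2.ProcessInstance.ID):

// First activity of the child process: record the parent/child link.
int parentId = Convert.ToInt32(K2.ProcessInstance.DataFields["ParentProcessID"].Value);
int childId = K2.ProcessInstance.ID;

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "INSERT INTO ProcessLinks (ParentProcessID, ChildProcessID) VALUES (@parent, @child)", conn))
{
    cmd.Parameters.AddWithValue("@parent", parentId);
    cmd.Parameters.AddWithValue("@child", childId);
    conn.Open();
    cmd.ExecuteNonQuery();
}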

Hope that helps.


Posted by on July 22, 2011 in K2 Workflows


K2.ActivityInstanceDestination.User is NULL?????

This one caught me by surprise! I needed to build up a CSV list of email addresses for all previous task owners for an activity, so the simplest way would be to just append the email address each time we get to the task, right? Apparently not.

Here’s my code:

K2.ProcessInstance.DataFields["AllApproversEmailAddresses"].Value = K2.ActivityInstanceDestination.User.Email;

Nothing too complicated. However, running it gave me the dreaded (and oh so informative) “Object reference not set to an instance of an object”. So what’s happening?

Well, it’s actually not that complicated. When you create an activity the default destination rule is “Plan Just Once” (in fact you don’t even have the option of choosing anything else unless you run the activity wizard in advanced mode). When this option is chosen, the destination instance isn’t actually created, which means that K2.ActivityInstanceDestination will always return null.

To get around this issue you will need to do the following:

  • Right-click the activity to open the activity properties.
  • Click on the Destination Rule Options tab, and then click the ‘Back’ button on the wizard.
  • Check the box which says “Run this wizard in advanced mode”, then click “Next”.
  • You’ll see that “Plan per slot (no destinations)” is selected. Choose “All at once” and then click through the rest of the wizard.
  • Deploy and test.

That’s it. Hope that helps someone!


Posted by on June 10, 2011 in K2 Workflows


28025 Failed to Start IPC

I came across this error again and it reminded me that I’ve come across it a number of times before, so I’ll list the possible issues here:

  1. You don’t have rights to start the child process. When you set up the IPC call the default authentication is set to ‘Integrated’ and I often forget to change this to ‘Impersonate’. When I get this 28025 error the first thing I check is whether or not I’ve set the authentication to impersonate and that the user has start rights on the child process.
  2. The data field you’re setting in the child process doesn’t exist. I came across this when I created my data field in the child process while I was going through the wizard to set up the IPC call. The problem was that when I finished the wizard, everything compiled and exported, but the data field I had created didn’t actually exist. One day if I have time I’ll raise a bug for this…
  3. You’re passing data with incompatible types. You can’t pass a string from the parent process to an int data field in the child process, for example.

Posted by on June 7, 2011 in K2 Workflows


Flexible rights for actions

I was recently looking for the “Allow any user to finish this client event” option and couldn’t find it. In 4.5 it has been replaced by a great piece of functionality which lets you configure this from the Workspace. Take a look at this screenshot from my unit test process:

Workspace Screenshot

My client event has 3 actions, namely Action1, Action2 and Action3 (original, I know). You can now select any action and assign rights to that action. This is great because it means you can tighten down security but still allow for a lot of flexibility. For example, you can give everyone in a group the rights to decline a review (even if they weren’t assigned the task!) but only managers the rights to approve the review (again, even if they weren’t assigned the task).


Posted by on April 8, 2011 in K2 Workflows, Security


Referencing DLLs from your K2 process

I’ve been having a lot of fun (not really) working with external DLL references from K2 processes. There’s light at the end of the tunnel, but it helps to know what’s going on behind the scenes when you reference a DLL from a process.

Here’s my situation: I needed to integrate with a very complex hierarchy of DLLs so that I could access application data from the process (and wrapping them in a service wasn’t an option for me). Obviously I didn’t want to add references to all the DLLs (there are many), so I came up with a great idea – I’d add a class library which references all these DLLs (a kind of proxy class) and then reference only this one DLL from my process. Great idea, yes. Successful, no.

To explain why it didn’t work, I’ll first explain what happens when you reference a DLL from a process. You’ll notice that ‘copy local’ is checked by default. This is because when you deploy the workflow, the referenced DLL is sucked into the process and deployed to the database along with the process definition. Then, when you start the first instance of this process, the process definition is unpacked, a temporary folder is created in the HostServer\bin folder, and your referenced DLL is copied into that temporary folder. (Side note: notice how nicely versioned this is. If you have different versions of the process running, each version will have its own temporary folder with the version of the DLL that the process was deployed with. There’s an excellent blog about this here if you can access it.)

So here’s the first important point: only the referenced DLL is deployed with the workflow. NONE of the DLLs that the referenced DLL relies on are deployed, even if those dependent DLLs have ‘copy local’ set to true. So in my case I had a proxy class deployed, but the actual DLLs I was trying to use were nowhere to be seen.
So what’s the best thing to do? You have three options:

  • Reference all the DLLs from the process (not my preferred option)
  • Copy the DLLs to K2’s bin folder
  • Deploy the DLLs to the GAC

Overall, deploying to the GAC is your best option if you can. Or of course you can wrap the DLLs up in a service – this would have been the cleanest option, but in my case it wasn’t available because we needed to squeeze every little bit of performance out of the system that we could.


    Posted by on March 10, 2011 in .Net, K2 Workflows