Enable a SharePoint REST API Post with RequestDigest Token

 SharePoint provides a very detailed set of RESTful APIs which allow us to interact with SharePoint data lists. When sending GET API calls, there is no need for additional security validation. When sending a PATCH (update) or POST (create), SharePoint requires that the request headers include an additional authorization token. In this article, we take a look at getting this authorization token and sending a request from an Angular application.

How to Get the Token

Before sending a POST request via the APIs, we need to first get a fresh (valid) token. Lucky for us, the SharePoint APIs also provide an endpoint which can be used to get it. Let’s take a look at the code below to see how this is done:

Note: This snippet runs under the SharePoint site context. The API URL is relative to the site location. For example, if your site URL is https://mysp.com/sites/mysite, the API URL should be https://mysp.com/sites/mysite/_api/

function token() {
    var url = "../_api/contextinfo";

    var request = $http({
        method: 'POST',
        url: url,
        headers: {
            'Content-Type': 'application/json;odata=verbose',
            'Accept': 'application/json;odata=verbose'
        }
    }).then(function success(resp) {
        var data = resp.data.d.GetContextWebInformation;
        authToken = {};
        authToken.name = 'X-RequestDigest';
        authToken.value = data['FormDigestValue'];
    }, function error(resp) {
        console.log('Failed to get the request digest token', resp);
    });

    return request;
}
In this function, we use the _api/contextinfo endpoint, which returns a base64 encoded token string. The token also has an expiration window, defined by FormDigestTimeoutSeconds (1,800 seconds in this case), which depends on the SharePoint configuration. Once the promise is resolved, we capture the FormDigestValue from the JSON response and store it in a variable, which enables us to use it when making other API calls.

The JSON from the API call should look like this:

    "d": {
        "GetContextWebInformation": {
            "__metadata": {
                "type": "SP.ContextWebInformation"
            "FormDigestTimeoutSeconds": 1800,
            "FormDigestValue": "",          
            "SiteFullUrl": "",           
            "WebFullUrl": ""

Once the authorization/digest token is available, we can send a POST API call with the token value in the request header. This is done in the following code snippet:

function addItem(item) {

    // example list endpoint; adjust the list title for your site
    var url = "../_api/web/lists/getbytitle('TodoItems')/items";

    var data = {
        "__metadata": {
            "type": "SP.Data.TodoItemsListItem"
        },
        "Title": item.title,
        "OData__Comments": item.comments
    };

    var request = $http({
        method: 'POST',
        url: url,
        headers: {
            'Content-Type': 'application/json;odata=verbose',
            'Accept': 'application/json;odata=verbose',
            'X-RequestDigest': authToken.value
        },
        data: JSON.stringify(data)
    });

    return request;
}

When creating or updating information on the data lists, we need to send the item values as well as the metadata information for the list. Without the metadata, the request will fail. We can identify the metadata information by first sending a GET request. The returned payload provides the data with the corresponding metadata.
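As a quick illustration, the entity type can be read from the first item of a GET response. This is a minimal sketch under our own assumptions: the response follows the odata=verbose shape shown earlier, and the helper name is hypothetical, not part of the SharePoint API:

```javascript
// Hypothetical helper: pull the list item entity type out of a GET response
// payload so a later POST can include the required __metadata block.
function getItemEntityType(resp) {
    var results = resp.data.d.results;
    // the __metadata.type value is what the POST payload must echo back
    return results.length ? results[0].__metadata.type : null;
}
```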

In the rest of the code, we set the title and comments properties of the JSON payload. We then use the $http service to send a POST request with the header information. Notice the X-RequestDigest header entry, which matches the token name we captured when we initially got the token in the previous snippet. In this header, we set the security token value and send the request.

By adding the digest token to the header, the PATCH and POST API calls should be successful. We do need to remember that these tokens have an expiration window, so we should check for this and refresh the token when it is about to expire.
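Since these tokens expire, a small sketch of the expiry check can help. This is our own hypothetical helper, not part of the SharePoint API; it assumes the FormDigestTimeoutSeconds value from the contextinfo response is passed in when the token is captured:

```javascript
// Minimal sketch (hypothetical helper): track when the digest token expires
// so that it can be refreshed before sending a POST or PATCH request.
function createTokenTracker(timeoutSeconds) {
    // compute the absolute expiration time from the server-provided window
    var expiresAt = Date.now() + timeoutSeconds * 1000;

    return {
        // treat the token as expired one minute early to avoid stale tokens
        isExpired: function (bufferMs) {
            return Date.now() >= expiresAt - (bufferMs || 60000);
        }
    };
}
```

Before each POST, the caller would check isExpired() and, when true, call the token() function again before sending the request.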

I hope this is able to help you resolve the authorization token requirements when creating and updating a SharePoint data list.

You can get a sample project here:  https://github.com/ozkary/sp-addin-todo

Originally published by ozkary.com


Managing SSIS Catalog Database Space

SQL Server Integration Services (SSIS) is a great tool for building ETL processes. On SQL Server, we can configure an integration services catalog to deploy and configure all the SSIS packages. When using this service, there is also a catalog database named SSISDB that needs to be maintained before we quickly run out of disk space. In this article, we look at the catalog retention policy to better manage the disk space requirements on the SSISDB database.

SSISDB Database

The SSISDB is a database that holds all the integration service catalog metadata as well as version and execution history. Depending on the frequency of execution of our packages, the database size can grow very quickly.

Fortunately for us, there is an SSIS Server Maintenance Job (a SQL Server Agent job) that runs every day to clean up and maintain the database. The problem with that job is that it depends on configuration settings that enable the cleanup and set the retention period, which can be 365 days. Depending on our package activity, that retention window can lead our database to grow into the hundreds of gigabytes.

Catalog Configuration

The SSISDB has a catalog schema which contains the objects that enable us to view and update the catalog configuration. We need to look at the following objects:

catalog.catalog_properties  (view)
 This is a view for the catalog configuration.

catalog.configure_catalog    (procedure)
 This stored procedure is used to update a configuration setting.

When selecting the information from the view, we can see the current values of the catalog properties.

When we query the view, we need to look at these two settings:

OPERATION_CLEANUP_ENABLED
 This should be set to TRUE to enable the cleanup of historical data.

RETENTION_WINDOW
 This is the number of days that data is retained. If this data is not critical, set it to a low number like 30 days or less.
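To inspect the current values of these two settings, we can select them from the view. A simple sketch in T-SQL:

```sql
-- show the cleanup and retention settings from the catalog properties view
SELECT property_name, property_value
FROM [SSISDB].[catalog].[catalog_properties]
WHERE property_name IN ('OPERATION_CLEANUP_ENABLED', 'RETENTION_WINDOW');
```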

Change the Settings

To enable this setting and set a low retention window, we can use a stored procedure within the catalog schema. Let’s take a look at how that can be done with T-SQL:

-- enable the cleanup of historical data
exec [SSISDB].[catalog].configure_catalog N'OPERATION_CLEANUP_ENABLED', N'TRUE'

-- set the retention window to 30 days
exec [SSISDB].[catalog].configure_catalog N'RETENTION_WINDOW', 30

-- run the cleanup on demand
exec [SSISDB].[internal].[cleanup_server_retention_window]

By setting those fields, we can run the stored procedure to clean up the data on demand, or we could also wait for the SQL job to run at its scheduled time and clean up the data for us.

Database Size is Hundreds of Gigs

In the event that the database is big, changing the retention window to a very low number (e.g., from 365 days to 30 days) in one step may cause the job to eventually fail. For these cases, we need to decrease the retention window in smaller steps. For example, we could write a script which decrements the retention window by one day and runs the cleanup procedure on each pass, as shown here:

-- reduce the retention window by one until we reach the target window of 30
declare @index int = 364, @max int = 30

while @index > @max
begin
    exec [SSISDB].[catalog].configure_catalog N'RETENTION_WINDOW', @index

    exec [SSISDB].[internal].[cleanup_server_retention_window]

    -- shrink the log file as well

    set @index = @index - 1
end

If the amount of data is very large, this script may take some time to run. Just let it run and monitor how the retention window decreases with every cycle.


SSISDB, like any other database, needs to be maintained. Depending on the activity of our SSIS packages, we need to be mindful of the maintenance plan for this database. We need to look at the catalog retention policy to make sure it is compliant with our disk space capacity.

Thanks for reading.

Originally published by ozkary.com