1/20/18

Angular filtering on multiple options using anonymous functions

When building web applications with grid or list views, we often need to filter the data based on a selected option. With AngularJS this is a trivial task, as we can use a filter to select the data, but it gets tricky when we need to provide the same functionality with multiple filter options. In this article, we take a look at implementing a multi-option filter using anonymous functions.



Defining the model


For our demo app, we build a simple vehicle inventory viewer which can filter results by selecting one or more makes. Only the vehicles associated with the checked selections should be displayed.


var list = [
    { "id": 1, "year": 2018, "make": "Nissan", "model": "Altima" },
    { "id": 2, "year": 2018, "make": "Nissan", "model": "XTerra" },
    { "id": 3, "year": 2018, "make": "Subaru", "model": "Outback" },
    { "id": 4, "year": 2018, "make": "Subaru", "model": "Crosstrek" },
    { "id": 5, "year": 2018, "make": "Toyota", "model": "4Runner" },
    { "id": 6, "year": 2018, "make": "Toyota", "model": "Corolla" }
];


Notice that our model has multiple items with the same make, so our first step is to get a list of unique makes. This can be done by creating a hash table with the make property as the key. We also add a checked property which can be bound to a checkbox to track which filters are selected.


ctrl.makes = {};

// build the list of unique makes using a hash keyed by the make name
list.forEach(function (value) {
    if (!ctrl.makes[value.make]) {
        ctrl.makes[value.make] = { make: value.make, checked: true };
    }
});

// ctrl.makes now looks like:
// { Nissan: { make: "Nissan", checked: true }, Subaru: { ... }, Toyota: { ... } }


Anonymous Function

The last step is to build the filter logic. This can be done with an anonymous function which can be piped as a filter on the ngRepeat directive. The function receives each item in the collection via the item parameter. The item.make property is used as a key to look up the corresponding object in the makes collection and check whether it is checked. A true value means that the item meets the filter condition, and it is displayed.


ctrl.optionFilter = function () {
    return function (item) {
        return ctrl.makes[item.make].checked;
    };
};
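
For completeness, here is a minimal sketch of how these pieces could be wired into an AngularJS controller. The module and controller names (demoApp, InventoryCtrl) are assumptions and are not part of the original code.


angular.module('demoApp', [])
    .controller('InventoryCtrl', function () {
        var ctrl = this;

        // the inventory data from the model section above
        // (normally this would be loaded from a service)
        ctrl.list = list;

        // build the list of unique makes keyed by make name
        ctrl.makes = {};
        ctrl.list.forEach(function (value) {
            if (!ctrl.makes[value.make]) {
                ctrl.makes[value.make] = { make: value.make, checked: true };
            }
        });

        // anonymous function returned as the ngRepeat filter predicate
        ctrl.optionFilter = function () {
            return function (item) {
                return ctrl.makes[item.make].checked;
            };
        };
    });


The view snippets in the next section assume the controller-as syntax, for example ng-controller="InventoryCtrl as ctrl" on a containing element.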


Building the Views

We can now move forward with building our views. We need the checkboxes for the filter options as well as a grid for the inventory list. This is done as follows:

Filter Options

We use our makes collection of objects to create the checkboxes using the ngRepeat directive. For the filter to work properly, we need to track the checked property by binding it with the ngModel directive.


<span class="center-block">
    <label data-ng-repeat="option in ctrl.makes">
        <input type="checkbox"
               ng-model="option.checked" /> {{option.make}} &nbsp;
    </label>
</span>


Table / Grid

To build the grid, we use the ngRepeat directive on our list collection. Notice that we pipe the items into our filter, which is called for each item to check whether the corresponding make is checked. We also alias the result so we can display the count of matching items, as shown after the table.


<table class="table table-bordered text-center">
    <tr ng-repeat="item in ctrl.list | filter: ctrl.optionFilter() as result">
        <td><i class="fa fa-car fa-2x text-primary">&nbsp;</i></td>
        <td>{{item.year}}</td>
        <td>{{item.make}}</td>
        <td>{{item.model}}</td>
    </tr>
</table>
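
Because the filtered items are aliased as result, a count can be displayed next to the table. A minimal sketch follows; the wording of the label is an assumption, and it relies on ngRepeat exposing the alias on the enclosing scope.


<p>Showing {{result.length}} of {{ctrl.list.length}} vehicles</p>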


Summary

We have seen how to filter a list with multiple options using our old friend, the AngularJS filter, together with an anonymous function. This, however, is not an acceptable solution with the newer generations of Angular, where pipes like filter are no longer provided because they were a major cause of performance issues when dealing with large datasets. Read more in the Angular documentation.


In our next article, we will see how to implement the same solution using Angular 5 with TypeScript and Observables (reactive programming), which should address those performance concerns.

Originally Published by ozkary.com


1/13/18

Mocking a REST API With Model First Development

When developing an application that integrates with JSON APIs, we usually test the API calls against a development API server. This is a great approach when such a server is available. However, when we are also building the API, or no development backend is available yet, we need a mock API server so that our frontend and integration development can move forward. In this article, we take a look at building a mock server with the Model First Development approach and json-server.

Model First Development

With Model First Development, we focus on developing the application models that are required for a successful integration. After we build the models, we can then focus on building the APIs for our basic CRUD operations. This is where json-server can help developers build mock servers using just the defined models.

What is json-server?

json-server is a fake REST API server built on Node.js that reads JSON models and creates basic CRUD operations on those models, thus enabling the rapid creation of REST APIs. It is often used by front-end and integration developers to test API calls against a mock server.
We should also note that json-server supports other operations like sorting, filtering, paging, and search, among a few other features. For more information, visit the home page at this location:


Installation

The setup for this application requires NPM as well as Node.js. Once those dependencies are installed, we can type the following command to install json-server.


npm install -g json-server


Building our Models

We are going to build a simple vehicle inventory with the year, make and model property names. In our models, we create two collections, makes and vehicles. We use these collections to illustrate how json-server creates different routes for each collection.


{
    "makes":["Nissan","Subaru","Toyota"],   
    "vehicles":[ 
        {"year":"2018","make":"Nissan","model":"Altima"},
        {"year":"2018","make":"Subaru","model":"Outback"},
        {"year":"2018","make":"Toyota","model":"4Runner"}   
    ]
}


The important thing to notice in our JSON sample is that we have included multiple collections in the same file. This is a requirement of json-server: one json-server session can only watch a single JSON file. If we need more than one file, we can start multiple sessions, each watching a different file on a different port.


json-server --watch ozkary-inventory.json --port 3005


Starting the Mock server

To start the mock server, we need to pass the --watch parameter with a target JSON file. In this example, we are running the command from the same directory where our JSON file is located.


json-server --watch ozkary-inventory.json


Once we execute this command, we should see the list of resources (APIs) that are available from our mock server as well as the endpoint, which defaults to port 3000. This is the endpoint that we want to use for our API calls.




Testing the APIs

To quickly test our APIs, we can use Postman or a similar tool to exercise GET and POST operations on our models. When we send a GET request, we can see the JSON data that comes back from our API. We can send a GET request with an id parameter to simulate a search for a specific record.

When we send a POST request, json-server simulates the creation of a new record and returns the id for that record. This is a good way to test create, update, and delete operations without having to write lots of fake operations.
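
The same requests can also be made from code. Below is a minimal sketch using the fetch API (browser or Node 18+) against the default endpoint; the port and the new vehicle payload are assumptions based on the sample model above.


var api = 'http://localhost:3000';

// GET all vehicles
fetch(api + '/vehicles')
    .then(function (response) { return response.json(); })
    .then(function (vehicles) { console.log('vehicles', vehicles); });

// GET a single record by id (simulates a search for a specific record)
fetch(api + '/vehicles/1')
    .then(function (response) { return response.json(); })
    .then(function (vehicle) { console.log('vehicle', vehicle); });

// POST a new vehicle; json-server assigns and returns an id for the new record
fetch(api + '/vehicles', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ year: '2018', make: 'Nissan', model: 'Rogue' })
})
    .then(function (response) { return response.json(); })
    .then(function (created) { console.log('created', created); });


Note that json-server identifies records by an id property, so item routes such as /vehicles/1 only resolve for records that have an id, either added to the model file or generated by json-server on POST.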



Summary


With the help of json-server, we are able to mock an API by first focusing on the application models. We then let json-server handle the CRUD operations, which are inferred from the model definition. This accelerates the front-end and integration efforts without having to implement a mock server ourselves.

Originally published by ozkary.com

11/11/17

DevOps Set Default Azure Subscription with Azure CLI

Scripting Azure configuration and deployment is a key component of DevOps and cloud operations management, as it removes the need to use the Azure portal user interface. When there are multiple subscriptions to manage, we need to make sure that we first select the correct subscription in Azure. Let's take a look at how this can be done using Azure CLI 2.0 (install from this site).



After opening the Bash command shell, follow these steps:

Login to Azure

We can log in to our Azure account using the login command. This requires some browser interaction to enter a code for the two-factor authentication.


az login


List Our Subscriptions

After a successful login, we can list all of our current subscriptions using the account command.


az account list --all --output table


We can see the returned list of subscriptions displayed in a nice table format. Each entry has a name and an isDefault property. Only one of them is our default subscription.

Set a Default Subscription

We can change our default subscription by running the account set command.


az account set --subscription "my subscription name"


Validate Default Subscription

We should be able to list all the subscriptions again and verify that the default subscription is correct by filtering the results with grep. (Note: this works when using Bash.)


az account list --all --output table | grep "True"


The result should be only the subscription that has IsDefault set to true.

At this point, we should be on the right subscription, and we can move forward with any additional configuration using Azure CLI.

Originally published by ozkary.com

11/4/17

Input Range Slider with Color Indicator


The HTML input type range lets us specify a numeric value which falls within a min and max value. For some use cases, we want to style the control so that it provides visual feedback related to the selected value. In our example, we create a slider control that maps to three different system statuses:

Status     Control Value    Background Color
Down       1                Red
Idle       2                Gray
Running    3                Green



We want to be able to change the background color of the thumb control to match the slider value. In order to do this, we first need to create the base CSS classes to style the control. We also create the thumb classes with the corresponding background colors.


    .slidecontainer {
        width: 100%;
    }

    .slider {
        -webkit-appearance: none;
        width: 100%;
        height: 30px;
        border-radius: 5px;
        background: #d3d3d3;
        outline: none;
        opacity: 0.7;
        -webkit-transition: .2s;
        transition: opacity .2s;
    }
        .slider:hover {
            opacity: 1;
        }
        .slider::-webkit-slider-thumb {
            -webkit-appearance: none;
            appearance: none;
            width: 50px;
            height: 50px;
            border-radius: 50%;
            background: gray;
            cursor: pointer;
        }

    .thumb1::-webkit-slider-thumb {
        background: red;
    }

    .thumb2::-webkit-slider-thumb {
        background: gray;
    }

    .thumb3::-webkit-slider-thumb {
        background: green;
    }
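
For reference, here is a minimal sketch of the markup these styles and the script below assume. The element ids (myRange, demo) come from the script; the min, max, and starting value are assumptions based on the status table above.


<div class="slidecontainer">
    <!-- the slider starts on the Idle status (value 2) -->
    <input type="range" min="1" max="3" value="2" class="slider thumb2" id="myRange" />
    <span id="demo"></span>
</div>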


Now that we have styled our control, we want to map the selected slider value to one of the CSS classes we defined. We can do this by using the range control's oninput event, which fires when the value changes.


    var slider = document.getElementById("myRange");
    var output = document.getElementById("demo");

    // map each possible slider value to the thumb class that sets its color
    var thumb = { "1": "thumb1", "2": "thumb2", "3": "thumb3" };
    output.innerHTML = thumb[slider.value];

    slider.oninput = function () {
        output.innerHTML = thumb[this.value];
        // keep the base slider class and swap in the thumb class for the current value
        slider.className = 'slider ' + thumb[this.value];
    };



In the code, we define a hash table with the possible slider values as keys. This enables us to quickly resolve the class name for the current value. When the value changes, we set the range control's class name to the slider base class plus the thumb class, which provides the thumb's background color.
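
If we prefer not to hard-code the initial thumb class in the markup, we could also sync the class once on load using the same hash; a small optional sketch:


    // apply the thumb class that matches the slider's starting value
    slider.className = 'slider ' + thumb[slider.value];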



With this article, we have shown how easy it is to style the range input control so that it provides better visual feedback to the end users.

Originally published by ozkary.com

10/14/17

SharePoint 2013 Move Documents to Another Library Using Content Organizer Web Service

A common task for a document management system is to move a document from one library to another based on a particular condition or content type. To enable this, SharePoint provides some core actions that let us copy the document to another site collection and then delete it from the source.



In our case, we want to move the document using a more complex action called Send Document to Repository, which moves a document by using the Content Organizer web service.

Note: Content types enable us to organize documents and associate them with content rules, which lets us route documents to specific document libraries.

Configure the Content Organizer Feature

This core action enables us to move a document to another library within the same site collection using a SharePoint web service named Official File. In order to enable this web service endpoint, we need to activate a site feature and configure some content rules.

We first activate this feature by visiting this page:
  • Click on Site Settings
  • Click on Site Features
  • Look for Content Organizer and activate the feature

Content Organizer Rules: "Create metadata based rules that move content submitted to this site to the correct library or folder."


This activates the web service that we can use to move the document. The endpoint is found at a location relative to the site collection, for example:

_vti_bin/OfficialFile.asmx  or  sites/demo/_vti_bin/OfficialFile.asmx

Note:  We use this address when we add the workflow action.

This web service address can also be found by visiting the following page: 
  • Site Settings
  • Under Site Administration select Content Organizer Settings

Create the Content Rule

Now that the service is available to the site collection, we need to configure a rule that indicates which content type we want to manage and to which library we need to send that content. This is done by visiting the following page:
  • Site Settings
  • Under Site Administration select Content Organizer Rules


We want a rule for the Document content type, and we want to send the document to the ArchiveLibrary. As we can see here, we could also route documents based on custom content types to other libraries.

Note: When using custom content types, we can have a more granular set of rules to send documents to other libraries that are based on that content type.

Workflow Send to Document Repository Action

We have now set up all the requirements to be able to send or move our document to another location. We can now add a workflow using SharePoint Designer. In our first step, we add the Send Document to Repository action.

Note: SharePoint 2010 Workflows provide this action.


This workflow action has three parameters:

Parameter                              Description
This action                            Allows us to indicate whether to copy or move the document.
This destination content organizer     Indicates where the web service is located. Use the address identified in the previous sections.
This explanation                       Adds a comment about the action.

In our example, we choose to move the document using our previously configured Content Organizer web service. In this example, there are no specific conditions to validate before the action, but this is where we could check a document's status and run the action accordingly.


Summary

We have shown how to use the Content Organizer web service with a workflow action. With this approach there are more configuration steps that need to be completed in order to support the solution. The additional configuration does bring added value, as it enables us to route documents to other document libraries based on their content type.

For cases where a simpler approach is desirable, we can refer to the article below, which essentially shows how to use the Copy and Delete core actions to implement a simpler Move action.




I hope this shows you a way to move documents across document libraries using a workflow action and the Content Organizer web service.

Originally published by ozkary.com