3/25/17

Presentation Securing Web Apps with Azure AD

This is the presentation for Securing Web Apps with Azure AD. In this presentation, we cover the following areas:

  • We take a Web application with an AngularJS SPA as the client and a NodeJS server with no security context, and we deploy it to Azure using a local Git repository.
  • We configure the Azure application with Azure AD and show the different behaviors when forcing users to log in versus allowing anonymous access.
  • We also discuss how to change a NodeJS server application to handle the security implementation in the server code.
  • We finally show how to make the implementation changes on the AngularJS SPA to allow the application to integrate directly with Azure AD.
Related Blog Entries:



For the code, visit:

Presentation



Originally published by ozkary.com

1/7/17

SQL Server BCP Import Skip and Map Columns Using Views

The BCP (bulk copy program) command line utility enables us to quickly import CSV files into a SQL Server database. The utility works very well when the import file matches the structure of the table, and the table does not have an identity column.

The moment the import file does not match the structure of the table, we can run into different errors during the import. This is mostly because the fields may not be mapped correctly to the columns on the table.

To facilitate the control of how the CSV fields should be mapped to the fields on the table, we can create a view that matches the CSV field order.  Let’s take a look at an example to illustrate the problem and a solution using SQL views.

The Table

This is a simple table that allows us to store blog entries. We should note the identity and default datetime columns, which would not exist in a CSV file since these are system-generated values.


CREATE TABLE dbo.ozkary_blog
(
       id       int identity,
       title    nvarchar(200) null,
       url      nvarchar(100) null,
       tags     nvarchar(100) null,
       created  datetime null,
       imported datetime default(getdate()) not null
)



The CSV File

Our CSV file contains four fields which are fewer fields than columns on the table.


Created,title,url,tags


We should note the following possible issues when trying to import this CSV file into our previously defined table.


  • The order of the columns on the table does not match the fields of the CSV file. The create date field is first in the file, but it is in position five on our table.

  • Our table has an identity column, which will not match the date field in position one of our CSV file.

  • Our table also has an import datetime column with a default date value.
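To make the mismatch concrete, a couple of hypothetical data rows in this CSV layout might look like the following (the values are made up for illustration):


Created,title,url,tags
2017-01-05,BCP Import with Views,https://example.com/bcp-views,sql
2017-01-06,Azure AD Web Apps,https://example.com/azure-ad,azure


The first field in each row is a date, which would collide with the integer identity column if we imported directly into the table in column order.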

Data Import Process

Let’s attempt to import the data directly into our table without any modifications. To import the file, we execute the following command:


bcp dataimport.dbo.ozkary_blog IN ".\\ozkary_blog.csv" -t, -c -F2 -S .\sqlexpress -T


When calling the BCP utility we use the following parameters:

Parameter                       Description
dataimport.dbo.ozkary_blog      Database and table name
".\\ozkary_blog.csv"            CSV file location and name
-t,                             Comma is the field delimiter
-c                              Use char as the storage type for all the fields
-F2                             Start at row 2 - skip the field header row
-S                              Server name
-T                              Use a trusted connection with the current session

In our first attempt to import the data, we are using the table name. This does not work because the fields map to the wrong columns, and, as expected, we get data type errors (add the -e bcp.log parameter to create a log file and see more detail on the errors).
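As a sketch, this is the same command with error logging enabled so we can inspect the failures (the bcp.log file name is just an example; bcp writes the rejected rows and error details to it):


bcp dataimport.dbo.ozkary_blog IN ".\\ozkary_blog.csv" -t, -c -F2 -S .\sqlexpress -T -e bcp.log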



Solution – Using a View to Map the Fields

To address this problem, we can create a view that only uses the fields that are available on the import file and match their corresponding order.


CREATE VIEW dbo.vw_import_ozkary_blog
AS
   SELECT created,title,url,tags
   FROM dbo.ozkary_blog



We should now modify the command and use the view name instead of the table name. This should provide better results.


bcp dataimport.dbo.vw_import_ozkary_blog IN ".\\ozkary_blog.csv" -t, -c -F2 -S .\sqlexpress -T





By looking at the command output above, three rows were imported. If we query our table, we should see the records with the identity and imported date values populated.
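To verify the import, a simple query like the one below should show the system-generated id and imported values alongside the data that came from the CSV file (a sketch, assuming the table defined earlier):


SELECT id, title, url, tags, created, imported
FROM dbo.ozkary_blog
ORDER BY id


The id column is filled by the identity, and imported is filled by the getdate() default, even though neither column appeared in the view or the CSV file.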


As shown, the approach of importing a CSV file using a database view provides a consistent and simple way to map and skip table columns during a BCP import.


Hope it helps.

Originally published by ozkary.com