9/17/17

SharePoint Create a host-named Site with PowerShell

In SharePoint, we usually create site collections using the path-based approach. This approach aligns very well with the SharePoint Online/Office 365 site structure recommendations. With SharePoint 2013, we can also create a host-named site collection, which allows us to assign a unique DNS name to each site collection. This enables us to deploy multiple sites with different DNS names in the same web application.


Path-named vs Host-named


Site Collection Structure    Example
Path-named                   ozkary.com/sites/demosite1
Host-named                   demosite1.ozkary.com

In order to deploy a host-named site, we need to create a DNS entry that points to the SharePoint host server/farm. We then need to write some PowerShell, as this feature is not available from Central Administration.
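
If the DNS zone is hosted on a Windows DNS server, the DNS entry can also be scripted with the DnsServer PowerShell module. This is a minimal sketch; the zone name, record name, and IP address are placeholders for your environment:


# sketch: create an A record for the host-named site (run where the DnsServer module is available)
# the zone, record name, and IP address below are placeholder values
Add-DnsServerResourceRecordA -ZoneName "ozkary.com" -Name "demosite1" -IPv4Address "192.168.1.50"

# verify that the name resolves before creating the site
Resolve-DnsName "demosite1.ozkary.com"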

To create the site collection, we run this custom function from PowerShell with elevated permissions.


#
# Name: createHostNameSite
#
# Description: Creates a host-named site collection
# Usage:
# createHostNameSite web-app-name dns-entry site-name site-description admin-account-name site-template content-database
#
createHostNameSite "ozkary.com" "https://demosite1.ozkary.com" "Demo Site 1" "Demo Site 1 Description" "admin-account" "STS#0" "WSS_Content_DB"


Parameter Information

Parameter            Description
Web-app-name         Main SharePoint web application where site collections are hosted
Dns-entry            The unique DNS entry (URL) for the site
Site-name            The name for the site collection
Site-description     The site description
Admin-account-name   The site collection owner, usually the admin
Site-template        A site template, like a team site (STS#0) or document center
Content-database     The content database for this site collection

createHostNameSite Function


Add-PSSnapin microsoft.sharepoint.powershell -ErrorAction SilentlyContinue


#
#  Name: createHostNameSite
#  Description: create a host-named site collection
#
#  Params:
#       $webAppName 
#       $siteUrl
#       $siteName
#       $description
#       $owner
#       $template
#       $contentDatabase 
#
function createHostNameSite($webAppName, $siteUrl, $siteName, $description, $owner, $template, $contentDatabase)
{   
    write-host "Creating site with parameters " $webAppName, $siteUrl, $siteName, $description, $owner,$template, $contentDatabase , $mapUrl

    $continue = Read-Host -Prompt "Do you want to continue (y/n)"
    if ($continue -eq "y")
    {
        write-host "Provisioning Site...."       
        New-SPSite $siteUrl  -Name $siteName -Description $description -OwnerAlias $owner -Template $template  -ContentDatabase  $contentDatabase  -HostHeaderWebApplication (Get-SPWebApplication $webAppName)               

    }else{
        write-host "Process cancelled"
    }

}


In the code, we first load the SharePoint snap-in, which enables us to use the New-SPSite cmdlet that handles all the hard work for us. Another area to notice is the HostHeaderWebApplication parameter, for which we pass the SharePoint web application reference obtained with the Get-SPWebApplication cmdlet. This is what makes the new site a host-named site collection.

After the site is created, we should be able to type the DNS name in the browser. If the DNS record has been created and propagated in the network, the request reaches SharePoint, which resolves the host header and loads the corresponding site.
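
To confirm the provisioning worked, we can also query the new site collection back from the SharePoint Management Shell. A quick sketch using the same URL as above; the HostHeaderIsSiteName property should return True for a host-named site:


# verify the host-named site collection
Get-SPSite "https://demosite1.ozkary.com" | Select-Object Url, HostHeaderIsSiteName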



I hope this is helpful. Enjoy your site creation automation (DevOps) with PowerShell.

Originally published by ozkary.com

9/10/17

Powershell Upload Multiple Files to SharePoint

A SharePoint document library enables us to manage documents. In some cases, we need to upload multiple files to that library. A common approach to automate this process is to integrate via the SharePoint REST APIs. In this article, we take a look at using PowerShell to create a script that can quickly enable this integration without needing to know all the details of the API.


Solution


For our use case, we have a drop location where our files are located. We need to read all the files and upload them to a SharePoint document library.

Note: We can run this snippet using Visual Studio Code. Just save the file with a .ps1 extension, and VS Code will guide you through installing the PowerShell extension.



# load the SharePoint snap-in
Add-PSSnapin microsoft.sharepoint.powershell -ErrorAction SilentlyContinue

# name:  uploadFiles
# purpose: upload multiple files from a shared location
#
function uploadFiles($path, $siteUrl)
{

try {

  # gets all the files (-File) in the directory
  $files = Get-ChildItem -Path $path -File

  #iterate through each file
  foreach($file in $files) 
  { 
      $url = $siteUrl + $file.Name
      $filePath = $file.FullName

      #upload the file to the server using default credentials from the
      #current session (note the backtick line continuation)
      $result = Invoke-WebRequest -Uri $url -InFile $filePath -Method PUT `
               -UseDefaultCredentials

      #if the request status code is successful (200 OK or 201 Created),
      #we delete the file, else write an error message
      if($result.StatusCode -eq 200 -or $result.StatusCode -eq 201){       
        Remove-Item $filePath -Force
      }else{
        write-host "Failed to upload" $filePath $result.statuscode
      }
  }
}
catch {
  write-host "Exception was raised: $PSItem" #psitem is the error object
}

}


#usage
#uploadFiles param0 param1
#param[0] shared-location 
#param[1] sharepoint document library url
#
uploadFiles "\\some-document-path" "https://ozkary.com/shared-docs/"


In this snippet, we first load the snap-in for SharePoint. Note that Invoke-WebRequest itself is a built-in web cmdlet that ships with PowerShell 3.0 and later; the snap-in is only required if the script is extended with SharePoint-specific cmdlets.

The uploadFiles function reads a folder location and lists only the files. We then iterate through each file and use Invoke-WebRequest to upload it to a particular SharePoint document library. If the response has status 200 (or 201 for a newly created file), we know that the upload was a success, and we remove the file. Otherwise, we write an error to the standard output using write-host.

Security


We should notice that when uploading the files, we use the -UseDefaultCredentials parameter. This enables our script to pass the current session credentials with the request. The credentials are set when the script is executed under the context of a service account that has access to the SharePoint document library.
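
When the script cannot run under such a service account, Invoke-WebRequest can also take an explicit credential. A minimal sketch, assuming an account with write access to the library (the account name is a placeholder):


# prompt for the credentials of an account with access to the library
$cred = Get-Credential "domain\sp-upload-user"

# pass the explicit credential instead of the session credentials
$result = Invoke-WebRequest -Uri $url -InFile $filePath -Method PUT -Credential $cred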

 I hope this provides a quick automated way to upload files to your document libraries.



Originally published by ozkary.com

8/19/17

SharePoint 2013 Service Cannot be Activated on host-named Site Collection

A host-named site collection in SharePoint has a URL in the format apps.ozkary.com. In comparison, a path-based site collection has a URL like ozkary.com/sites/apps.

When using a host-named site collection for document libraries, we may encounter behavior that is not reproducible on a path-based site. Let’s take a look at this service activation error.

Unable to activate service Error

Browser Error
This is the error shown in the browser console.


https://apps.ozkary.com/_vti_bin/DocControl/DocControlService.svc/MoveFile 500
(System.ServiceModel.ServiceActivationException)


Server Error
This is the error shown in the server logs.


WebHost failed to process a request.
System.ServiceModel.ServiceActivationException: The service '/_vti_bin/DocControl/DocControlService.svc' cannot be activated due to an exception during compilation.  The exception message is: The type 'DocControl.Controls.ISAPI.DocControl.DocControlService, DocControl.Controls, Version=1.0.0.0, Culture=neutral, PublicKeyToken=, provided as the Service attribute value in the ServiceHost directive, or provided in the configuration element system.serviceModel/serviceHostingEnvironment/serviceActivations could not be found


This error indicates that the service cannot be activated because it is not found. The root cause is that the relative path used for service activation cannot be resolved: our site uses a host-named URL, and there is no binding defined in IIS for that host name. A host-named URL is managed internally by SharePoint, which is why IIS may not have the URL defined in the site bindings. The solution is to add this URL to the site bindings in IIS.
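
The binding can be added from IIS Manager or scripted with the WebAdministration module. A quick sketch, assuming the default "SharePoint - 80" IIS site name (adjust the site name, protocol, and port for your farm):


Import-Module WebAdministration

# add a binding for the host-named site collection
New-WebBinding -Name "SharePoint - 80" -Protocol "https" -Port 443 -HostHeader "apps.ozkary.com"

# list the bindings to confirm
Get-WebBinding -Name "SharePoint - 80"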



After adding the binding, the operation that was causing the error should work properly.

I hope this helps.


Originally published by ozkary.com

8/12/17

Move an Azure Subscription to a Different AD Directory

When managing multiple Azure AD directories and subscriptions, we need to create a subscription in a particular Azure AD directory to grant access to the users in that directory. This allows us to prevent unauthorized access to other subscriptions. In some cases, we may want to transfer a subscription to a different AD directory.

As of the time of this article, the new Azure portal does not support this operation. In order to do it, we have to use the Azure AD portal and then load the Classic portal for the final transfer. Let’s see how that is done:

Transfer subscription to different directory

  • Log in to aad.portal.azure.com with the account that owns the subscription
  • Click on the Azure Active Directory menu option
  • Click on Classic Portal (top menu bar)
  • On the top-right, select Subscriptions (ref image 1)
  • Filter by the directory where the subscription is located
  • Click on Manage subscription directory
  • Select the subscription that we want to transfer
  • Click on Edit Directory (bottom button - ref image 2)
  • Select the target directory
  • Click continue (right arrow) and make sure that there are no problems with users that do not exist in the target directory. This means that the owner must exist in both directories.
  • Click OK (check mark)

image 1

image 2



The subscription should be moved to the new directory in a few minutes. After the move succeeds, we can log back in to the Azure portal. We need to make sure that we select the correct directory from the top-right user profile component, where we can find a list of all the directories the user is associated with.
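
We can also verify the move from PowerShell by listing the subscriptions visible in the target directory. A sketch, assuming the AzureRM module is installed; the tenant ID is a placeholder for the target directory ID:


# sign in with the account that owns the subscription
Login-AzureRmAccount

# list the subscriptions in the target directory (tenant)
Get-AzureRmSubscription -TenantId "00000000-0000-0000-0000-000000000000"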

We can now load this URL. There is no menu option to load this view, so we can create a shortcut on the dashboard with this link.


https://portal.azure.com/#blade/Microsoft_Azure_Billing/SubscriptionsBlade

If we only click on the subscriptions menu option that is available from the billing blade area, we see only the subscriptions with associated billing, and that filters out our newly transferred subscription.


This should help us move our subscriptions to another AD directory without too much pain.

Originally published by ozkary.com

7/15/17

Microsoft MVP Award - 2017

In July 2017, I was proud to receive my second Microsoft Most Valuable Professional (MVP) award. This year the award came earlier than expected, as the Microsoft MVP program decided to grant these awards once a year moving forward.



This is a rewarding recognition that makes all the ongoing work that I do for the different software communities around the world worth it.


I enjoy knowing that what I learn and share with others has some meaningful value.


What to Expect Next


My plan for the next year is to continue to learn and share with others by writing blog entries, delivering more presentations, and helping on different technical forums like Stack Overflow, ASP.NET, GitHub, Twitter, and CodeProject.

On the technology side, I plan to continue to grow in areas like Angular 2, Node.js, Azure technologies, SQL Server, NoSQL, ASP.NET, and Visual Studio, and to use Microsoft Core technologies to expand into the Linux world.

Thanks again to Microsoft and those working on the MVP program, as well as the community that enjoys and supports my contributions.

Oscar Garcia
@ozkary

2016-2017

Originally published by ozkary.com

7/8/17

SQL Server - Pivot Rows to Columns

On systems that are metadata driven, there may not be a concrete table schema declaration, and the metadata values can be stored as rows. To provide an example, let’s build a custom_fields table that can hold multiple custom values for a particular entity:

Pivoting rows to columns



Setup the table and sample data



create table dbo.custom_fields
(
   ref_id int, 
   field_id int,
   field_name varchar(55),
   field_value int
)
go

INSERT INTO [dbo].[custom_fields]
           ([ref_id]
           ,[field_id]
           ,[field_name]
           ,[field_value])          
    VALUES
 (1020, 1000,'width',10)
,(1020, 1001,'height',20)
,(1020, 1003,'thick',5)
,(2010, 1000,'width',15)
,(2010, 1001,'height',5)
,(2010, 1003,'thick',5)
,(2010, 1004,'units',5)
,(3000, 1000,'width',35)
,(3000, 1001,'height',55)
,(3000, 1003,'thick',20)

--returns all the records
select *
from custom_fields

After we create the table and insert the test data, we can run a query to select all the records. The results should appear as follows:

ref_id  field_id  field_name  field_value
1020    1000      width       10
1020    1001      height      20
1020    1003      thick       5
2010    1000      width       15
2010    1001      height      5
2010    1003      thick       5
2010    1004      units       5
3000    1000      width       35
3000    1001      height      55
3000    1003      thick       20

The problem with the way the data is stored is that those values are property values for an entity associated by the ref_id, and we would want to read those values as columns instead of rows, as shown below:

ref_id  width  height  thick  units
1020    10     20      5      NULL
2010    15     5       5      5
3000    35     55      20     NULL

Pivot Query


To convert those rows to columns, we need to use the PIVOT relational operator, which enables us to reshape the table results. Let’s write a SQL query that can do that for us.


SELECT ref_id,width,height,thick,units
FROM
(
  SELECT [ref_id], [field_name], [field_value]
  FROM custom_fields   
) fields
pivot
(
  MAX(field_value)
  FOR [field_name] in (width,height,thick,units)
) piv;



The syntax may not be clear at first, but the query pivots the data on the field_name values (the FOR expression), so that they become the column headings. Each output cell matches a field_name for a given ref_id, and we read the field value using the MAX aggregate function.

When a field_name does not exist for a particular row, the returned value is NULL. For example, both ref_ids (1020, 3000) do not have a units custom field.

This should enable us to make more complex pivot queries using SQL Server.
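
As a related note, when the set of field names grows over time, the column list can be built dynamically instead of being hard-coded. This is a sketch of the common dynamic-pivot pattern, not part of the original query above:


DECLARE @cols nvarchar(max), @sql nvarchar(max);

-- build the column list from the distinct field names
SELECT @cols = STUFF((
    SELECT ',' + QUOTENAME(field_name)
    FROM (SELECT DISTINCT field_name FROM custom_fields) f
    FOR XML PATH('')), 1, 1, '');

-- build and execute the pivot query with the dynamic column list
SET @sql = N'SELECT ref_id, ' + @cols + N'
FROM (SELECT ref_id, field_name, field_value FROM custom_fields) fields
PIVOT (MAX(field_value) FOR field_name IN (' + @cols + N')) piv;';

EXEC sp_executesql @sql;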

Thanks for reading.

Originally published by ozkary.com