Dailycode.info

Short solution for short problems

AngularJS and Windows authentication

We are starting up several AngularJS (in TypeScript) projects with a .NET Web API 2 back end. This is a real pleasure to work with; it's like playing with Lego when you're 10 years old: you keep building and inventing new things.

One of the things that all applications require is security. Inside our application we can use Windows authentication. Setting this up was really easy, that is, for simple GET requests. With POSTs it gets trickier. Ruben Biesemans from SPIKES explains this in detail here:

How to implement Windows Authentication in an AngularJS application with a stand-alone Web API
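
In short: before a cross-origin POST, the browser sends an anonymous OPTIONS preflight request, and Windows authentication rejects anonymous requests. Next to the IIS configuration covered in the article, part of the fix is enabling CORS with credentials support on the Web API. A minimal sketch, assuming a stand-alone Web API and the http://myserver.be origin used elsewhere on this blog:

// WebApiConfig.cs - a sketch only, not the complete setup from the article
using System.Web.Http;
using System.Web.Http.Cors;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Allow the AngularJS origin and let the browser send the Windows credentials
        var cors = new EnableCorsAttribute("http://myserver.be", "*", "*")
        {
            SupportsCredentials = true
        };
        config.EnableCors(cors);

        config.MapHttpAttributeRoutes();
    }
}

On the AngularJS side the requests then need withCredentials set to true, e.g. $http.post(url, data, { withCredentials: true }).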

Hope this helps to solve those nasty preflight errors.


Download a UTF-8 (with BOM) csv file from an API with JavaScript

I had a web service that generated a csv file. At first everything seemed OK, but when the users started testing, there were problems with special characters like ë and é when the csv was opened with Excel. In Notepad there were no problems.
A colleague found out that it had something to do with UTF-8 without BOM: if we changed the encoding to UTF-8 (with BOM), the file opened correctly in Excel.
I first started changing the API, trying all kinds of fixes, but in the end the problem turned out to be on the front end. The API side (.NET) looked like this:

#region Export
[Route("GetExport")]
[HttpPost]
[EnableCors(origins: "http://myserver.be", headers: "*", methods: "POST, OPTIONS", exposedHeaders: "Content-Disposition,MyFileName", SupportsCredentials =true)]
public HttpResponseMessage GetExport(BusinessObjects.ExportRequest request)
{
    this.LogMessage("GetExport IN API", JsonConvert.SerializeObject(request));
    HttpResponseMessage resultMessage;            

    if (request != null && request.IsValid())
    {

        var content = "Joëlle;Mark;Peter";
        var encoding = Encoding.UTF8;
        resultMessage = new HttpResponseMessage(HttpStatusCode.OK);
        resultMessage.Content = new StringContent(content, encoding, "text/csv");
        resultMessage.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
        {
            FileName = request.GetFileNameForRequest()
        };
        resultMessage.Content.Headers.Add("MyFileName", request.GetFileNameForRequest());
    }
    else
    {
        resultMessage = new HttpResponseMessage(HttpStatusCode.BadRequest)
        {
            Content = new StringContent("Invalid ExportRequest.")
        };
    }

    return resultMessage;
}

Up to here everything was working OK. I found out that the problem was on the client side, where I was fetching the file.

There were two implementations (the first for Internet Explorer, the second for real browsers):

if (window.navigator.msSaveOrOpenBlob) { //Source: http://stackoverflow.com/questions/17836273/export-javascript-data-to-csv-file-without-server-interaction
    var blob = new Blob([decodeURIComponent(encodeURI(rslt.data.toString()))], {
        type: "text/csv;charset=utf-8"
    });
    navigator.msSaveBlob(blob, rslt.fileName);
}
else {
    var element = angular.element("<a/>");
    element.attr({
        href: "data:attachment/csv;charset=utf-8" + encodeURI(rslt.data.toString()),
        target: "_blank",
        download: (rslt.fileName) 
    })[0].click();
}

In the end, the only thing that did the trick was adding a UTF-8 BOM at the start of the text. It was a little different for IE than for the rest of the world.

if (window.navigator.msSaveOrOpenBlob) { //Source: http://stackoverflow.com/questions/17836273/export-javascript-data-to-csv-file-without-server-interaction
    var blob = new Blob([decodeURIComponent(encodeURI('\ufeff'+rslt.data.toString()))], {
        type: "text/csv;charset=utf-8"
    });
    navigator.msSaveBlob(blob, rslt.fileName);
}
else {
    var element = angular.element("<a/>");
    element.attr({
        href: "data:attachment/csv;charset=utf-8,%EF%BB%BF" + encodeURI(rslt.data.toString()),
        target: "_blank",
        download: (rslt.fileName) 
    })[0].click();
}

Now when I open the downloaded file in Excel, the special characters show up fine.
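
For completeness: the BOM could also be prepended on the server side instead. A minimal sketch (an untested alternative, not what I ended up using), replacing the StringContent line in the API code above; Encoding.UTF8.GetBytes does not emit a BOM, so the preamble has to be added by hand:

// Prepend the UTF-8 BOM (EF BB BF) to the csv content
// (needs using System.Linq and System.Net.Http.Headers)
var encoding = Encoding.UTF8;
var payload = encoding.GetPreamble().Concat(encoding.GetBytes(content)).ToArray();
resultMessage.Content = new ByteArrayContent(payload);
resultMessage.Content.Headers.ContentType = new MediaTypeHeaderValue("text/csv") { CharSet = "utf-8" };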


ORACLE: How to find unused tables or indexes

There is a way to see how many reads and writes are done on a table. The query below gives an overview of all reads on tables and indexes. If you export the result and rerun the query every week, you can compare the snapshots and see which tables and indexes are actually used. No need to enable auditing. Note that v$segment_statistics is cumulative since the last instance startup, so the counters reset when the database is restarted.

SELECT  vss.owner,
        vss.object_name,
        vss.subobject_name,
        vss.object_type ,
        vss.tablespace_name ,
        SUM(CASE statistic_name WHEN 'logical reads' THEN value ELSE 0 END
            + CASE statistic_name WHEN 'physical reads' THEN value ELSE 0 END) AS reads ,
        SUM(CASE statistic_name WHEN 'logical reads' THEN value ELSE 0 END) AS logical_reads ,
        SUM(CASE statistic_name WHEN 'physical reads' THEN value ELSE 0 END) AS physical_reads ,
        SUM(CASE statistic_name WHEN 'segment scans' THEN value ELSE 0 END) AS segment_scans ,
        SUM(CASE statistic_name WHEN 'physical writes' THEN value ELSE 0 END) AS writes
FROM    v$segment_statistics vss
WHERE   vss.owner NOT IN ('SYS', 'SYSTEM') AND vss.tablespace_name = 'USERS'
GROUP BY vss.owner,
        vss.object_name ,
        vss.object_type ,
        vss.subobject_name ,
        vss.tablespace_name
ORDER BY object_type,  reads DESC;



EF: add new Migration when there are multiple migration configurations in your project

You could get this error:

More than one migrations configuration type 'Configuration' was found in the assembly 'BDO.DataLayer.MSSQL'. Specify the fully qualified name of the one to use.

And then search for a while for a solution. You'll find that you need to specify which configuration type to use, but it's not so easy to find out how to get this name.
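
The fully qualified name is just the namespace plus the class name of the migrations Configuration class you want to use. For reference, a typical EF6 configuration class (the context type is an assumption here) looks like this:

namespace DataLayer.MSSQL.Migrations
{
    using System.Data.Entity.Migrations;

    // Fully qualified name: "DataLayer.MSSQL.Migrations.Configuration"
    internal sealed class Configuration : DbMigrationsConfiguration<MyDataContext> // MyDataContext is assumed
    {
        public Configuration()
        {
            AutomaticMigrationsEnabled = false;
        }
    }
}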

Once you have the name, extend the Add-Migration command like this (the Verbose parameter shows the SQL that is being generated):

Add-Migration "VATRefundExtraProperties" -ConfigurationTypeName "DataLayer.MSSQL.Migrations.Configuration" -Force –Verbose
And after update the database:

update-database -ConfigurationTypeName "DataLayer.MSSQL.Migrations.Configuration" -Force

You can also add the Verbose flag here if you like.

More info on the parameters of the ?-migration commands can be found in the Entity Framework documentation.


Write HTML content to file

Not very hard. The only tricky part is when there are iframes on the page.

string html;
using (WebClient client = new WebClient())
{
    client.UseDefaultCredentials = true;
    html = client.DownloadString("http://dailycode.info/dailycardgames/#/home");
    using (FileStream fs = new FileStream(@"d:\temp\files\test" + DateTime.Now.Ticks.ToString() + ".htm", FileMode.OpenOrCreate))
    {
        using (StreamWriter w = new StreamWriter(fs, Encoding.UTF8))
        {
            w.WriteLine(html);
        }
    }
}

Then I noticed that my page (not the one from the example above) contained iframes, and they were not loaded.

So I had to download the iframes as well, save them to files and replace the URL of each iframe with the path of its file.

I also noticed that, for example, the umlaut ö was not shown correctly, so I had to set the UTF-8 encoding on the download as well as on the file writer.

In the end it looked like this:

string html;
using (WebClient client = new WebClient())
{
    client.UseDefaultCredentials = true;
    client.Encoding = Encoding.UTF8; // download as UTF-8 so characters like ö survive
    html = client.DownloadString("http://dailycode.info/dailycardgames/#/home");

    // Find all iframes and capture their src attribute
    MatchCollection iframes = Regex.Matches(html, "<iframe.+?src=[\"'](.+?)[\"'].*?>", RegexOptions.IgnoreCase);

    int i = 0;
    foreach (Match m in iframes)
    {
        string url = m.Groups[1].Value;
        // Download the iframe content and point its images to the offline location
        string iframe = client.DownloadString("http://dailycode.info/dailycardgames/" + url);
        iframe = iframe.Replace(@"../../images", @"D:\temp\images");
        var iframehtml = @"d:\temp\files\testiframe_" + i + "_" + DateTime.Now.Ticks.ToString() + ".htm";
        using (FileStream fs = new FileStream(iframehtml, FileMode.OpenOrCreate))
        {
            using (StreamWriter w = new StreamWriter(fs, Encoding.UTF8)) // write with UTF-8
            {
                w.WriteLine(iframe);
            }
        }

        // Make the main page point to the saved iframe file
        html = html.Replace(url, iframehtml);
        i++;
    }
}

html = html.Replace(@"../images", @"D:\temp\images");

using (FileStream fs = new FileStream(@"d:\temp\files\test" + DateTime.Now.Ticks.ToString() + ".htm", FileMode.OpenOrCreate))
{
    using (StreamWriter w = new StreamWriter(fs, Encoding.UTF8))
    {
        w.WriteLine(html);
    }
}

As you can see, I also replace the image locations with an offline location.



Logging: log4net type not resolved error!!!

I got this error when using log4net: Type is not resolved for member 'log4net.Util.PropertiesDictionary,log4net'.

It only occurred on the development machines, so we were looking for IIS Express problems. We could find a lot of unit test fixes and the like, but that was not our problem. It simply crashed whenever we called logging in our web site, even though the DLLs were properly referenced.

When we used local IIS, it worked fine.

In the end I found out that in IIS Express the DLL was not always loaded, because of multithreading. So the fix was to install the DLL into the GAC:

gacutil /i D:\dev\packages\log4net\log4net.dll



Web API call: JSON object is null

I have a WCF service that calls a Web API 2 service. I serialize the object to JSON and pass it to the API. But when deployed to the server, the incoming object was null. I had a very hard time figuring out what the problem was; the server only returned this message:

The remote server returned an error: (500) Internal Server Error

The code to call the web API was something like this:

using (var client = new WebClient())
{
    //var employeeJson = new JavaScriptSerializer().Serialize(empBDO);
    var empJson = JsonConvert.SerializeObject(empBDO);

    client.Headers[HttpRequestHeader.ContentType] = "application/json";
    empJson.LogMessage(CurrentProject, LoggingMode.Debug, "JSON to send");

    try
    {
        var response = client.UploadString(apiUrl, empJson);
    }
    catch (Exception apiex)
    {
        ("PostToAPI.CallAPI error = " + apiex.Message).LogMessage(CurrentProject, LoggingMode.Debug, apiex.StackTrace);
    }
}

If I called the API from an Angular app with the same JSON that was logged, there was no problem; everything worked. Then I started to add more and more logging and noticed that the incoming object in the API was null only when I called it from the server.

In the end I had to add this line of code to the WebClient that was calling the API:

    client.Encoding = UTF8Encoding.UTF8;

So the code looked like this:

using (var client = new WebClient())
{
    var empJson = JsonConvert.SerializeObject(empBDO);

    client.Headers[HttpRequestHeader.ContentType] = "application/json";
    client.Encoding = UTF8Encoding.UTF8;

    try
    {
        var response = client.UploadString(apiUrl, empJson);
    }
    catch (Exception apiex)
    {
        ("PostToAPI.CallAPI error = " + apiex.Message).LogMessage(CurrentProject, LoggingMode.Debug, apiex.StackTrace);
    }
}
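
My explanation for this (an assumption on my part, but it fits the symptoms): WebClient.Encoding defaults to the system ANSI code page, so UploadString sent mangled bytes for every special character while the Content-Type header claimed JSON, and deserialization on the API side then failed silently, leaving the incoming object null. A small sketch of the difference:

using System;
using System.Text;

class EncodingDemo
{
    static void Main()
    {
        var json = "{\"Name\":\"Joëlle\"}";

        // Encoding.Default is the system ANSI code page: 'ë' becomes a single byte
        Console.WriteLine(Encoding.Default.GetBytes(json).Length); // 17

        // UTF-8 encodes 'ë' as two bytes (C3 AB), which is what the API expects
        Console.WriteLine(Encoding.UTF8.GetBytes(json).Length);    // 18
    }
}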


AngularJS: trigger click on 'Enter' in input

Some time ago I wrote a directive to have the Enter key trigger a function on the controller. It relies on an injected keyCodes constant that maps key names like enter to their numeric codes. It looks like this:

 .directive('keyBind', ['keyCodes', function (keyCodes) {
        function map(obj) {
            var mapped = {};
            for (var key in obj) {
                var action = obj[key];
                if (keyCodes.hasOwnProperty(key)) {
                    mapped[keyCodes[key]] = action;
                }
            }
            return mapped;
        }
        return function (scope, element, attrs) {
            var bindings = map(scope.$eval(attrs.keyBind));
            element.bind("keypress", function (event) {
                if (bindings.hasOwnProperty(event.which)) {
                    scope.$apply(function ()
                    {
                        console.log(event);
                        scope.$eval(bindings[event.which]);
                    });
                }
            });
        };
    }]);

and it is used like this:

<input type="text" class="form-control" data-ng-model="playerText" name="player" placeholder="Speler naam" aria-describedby="basic-addon1" key-bind="{ enter: 'addPlayer()'}">

But this can be a lot easier using Angular's built-in ng-keyup directive:

<input ng-keyup="$event.keyCode == 13 && $ctrl.save()" ng-model="$ctrl.editName" placeholder="name" class="form-control" />

It's the same method that's called on this button:

<button type="button" ng-click="$ctrl.save()" class="btn btn-primary">Save</button>

So instead of writing my own directive, we can just use ng-keyup.


SharePoint: Create a scheduled task to run a SharePoint PowerShell script

I need to do a daily check on the SharePoint environment: if certain site properties do not exist, I have to add them for that site.

The script is very easy: it loops over the sites with "mark" in the URL and checks whether a certain property is present. If not, it adds the property to the site. I also log the start of the script execution and, whenever a property is added, log that too, so I can check which sites were altered. Here's the script:

$sites=get-spsite -limit ALL
$startText = "Start CheckAllPropertiesAndFixResult"
$a = Get-Date -Format g
$a + " " +  $startText | Out-File C:\Scripts\CheckAllPropertiesAndFixResult.txt -append -width 200
foreach($site in $sites)
{
	Write-Host $site.Url
	if($site.Url -like "*/mark/*")
	{
		$web=$site.OpenWeb()
		if ($web.AllProperties["isactive"] -eq $null)
		{
			$web.AllowUnsafeUpdates=$true
			$web.AllProperties["isactive"]=1
			$web.Update()
			$web.AllProperties
			$web.AllowUnsafeUpdates=$false
			Write-Host "Update" $web.Url 
			$web.url  | Out-File C:\Scripts\CheckAllPropertiesAndFixResult.txt -append -width 200
		}
	}
}

As you can see, I had to put the full path of the output file in the script; otherwise it didn't work.

Now, in order to run this SharePoint PowerShell script in a normal PowerShell.exe, I need to add the SharePoint snap-in:

Add-PSSnapin Microsoft.SharePoint.PowerShell -erroraction SilentlyContinue

Now the complete ps1 file looks like this:

Add-PSSnapin Microsoft.SharePoint.PowerShell -erroraction SilentlyContinue
$sites=get-spsite -limit ALL
$startText = "Start CheckAllPropertiesAndFixResult"
$a = Get-Date -Format g
$a + " " +  $startText | Out-File C:\Scripts\CheckAllPropertiesAndFixResult.txt -append -width 200
foreach($site in $sites)
{
	Write-Host $site.Url
	if($site.Url -like "*/mark/*")
	{
		$web=$site.OpenWeb()
		if ($web.AllProperties["isactive"] -eq $null)
		{
			$web.AllowUnsafeUpdates=$true
			$web.AllProperties["isactive"]=1
			$web.Update()
			$web.AllProperties
			$web.AllowUnsafeUpdates=$false
			Write-Host "Update" $web.Url 
			$web.url  | Out-File C:\Scripts\CheckAllPropertiesAndFixResult.txt -append -width 200
		}
	}
}

Now we create a scheduled task on our SharePoint server. Choose "Start a program" as the action, enter PowerShell in the "Program/script" textbox, and finally add the file path as an argument in the "Add arguments (optional)" textbox: &'C:\Scripts\CheckAllPropertiesAndFix.ps1'


Try running the task and check whether the result file has changed. Be sure to set the task to run whether the user is logged on or not.


.Net: Parse a 24-hour date time from UTC to Central European Time

While I was handling time format differences, I noticed a few things. For example, say the time or date comes in as 1/1/2016 1:1:1.

If you specify 'dd/MM/yyyy hh:mm:ss', parsing will not work: you will get an exception, because the two-digit specifiers don't match the single-digit values. Change the format to 'd/M/yyyy h:m:s' and it will work (note the capital M: a lowercase m stands for minutes, not months).

One problem remains: if you get a time after 12 o'clock, e.g. 1/1/2016 13:1:1, it will fail again with an unknown time format, because 'h' is the 12-hour format. Changing it to a capital 'H' handles the 24-hour format.
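
A quick sketch of the difference, using the format strings from above (sample values are my own):

using System;
using System.Globalization;

class ParseDemo
{
    static void Main()
    {
        // Single-letter specifiers accept one or two digits, and 'H' is the 24-hour format
        var ok = DateTime.ParseExact("1/1/2016 13:1:1", "d/M/yyyy H:m:s", CultureInfo.InvariantCulture);
        Console.WriteLine(ok); // parses fine

        // The strict two-digit, 12-hour format rejects the same input
        var strict = DateTime.TryParseExact("1/1/2016 13:1:1", "dd/MM/yyyy hh:mm:ss",
            CultureInfo.InvariantCulture, DateTimeStyles.None, out DateTime _);
        Console.WriteLine(strict); // False
    }
}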

So my code for the deserialization of a property looks like this:

[OnDeserialized]
void OnDeserializing(StreamingContext context) // runs after deserialization, despite the name
{
    if (this.SyncTimestampString == null)
        this.SyncTimestamp = null;
    else
    {
        try
        {
            // Incoming format, e.g. 2/03/2016 15:40:47
            var result = DateTime.ParseExact(this.SyncTimestampString, "d/M/yyyy H:m:s", CultureInfo.InvariantCulture);

            // The parsed value is UTC; convert it to Central European time
            TimeZoneInfo cstZone = TimeZoneInfo.FindSystemTimeZoneById("Central European Standard Time");
            this.SyncTimestamp = TimeZoneInfo.ConvertTimeFromUtc(result, cstZone);
        }
        catch (Exception er)
        {
            this.SyncTimestamp = null;
            LoggingComponent.LogError(this.ToString(), "SyncTimestampStringToSyncTime", er);
        }
    }
}