How to Install the Forms Authentication Feature in IIS 10 on Windows Server 2016



Surprise! I could not find Forms Authentication in IIS 10 on Windows Server 2016, and it took me a while to figure out how to install it.

  1. Choose ‘Add Roles and Features’
  2. Expand ‘Web Server (IIS)’
  3. Expand ‘Management Tools’
  4. Check ‘IIS Management Scripts and Tools’ and ‘Management Service’
  5. Restart IIS
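The same features can also be installed from the command line. The steps above can, as far as I know, be scripted roughly like this; the feature names are an assumption mapped from the GUI labels (check them with Get-WindowsFeature first):

```powershell
# Assumed PowerShell equivalent of steps 1-5 (run in an elevated session).
# Web-Scripting-Tools  = 'IIS Management Scripts and Tools'
# Web-Mgmt-Service     = 'Management Service'
Install-WindowsFeature -Name Web-Scripting-Tools, Web-Mgmt-Service
iisreset
```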

TFS: Multiple Versions of DLLs in One Solution

In TFS, if one build solution references multiple versions of the same third-party DLLs, we have a problem, because TFS dumps all output DLLs into one working folder by default. In the end, only one version of each DLL remains in that folder, and that single version is copied to all destinations.

We can solve this problem by passing a parameter to MSBuild.
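One MSBuild property that addresses this scenario is GenerateProjectSpecificOutputFolder, which makes each project build into its own output subfolder instead of the shared one. A sketch, with a hypothetical solution name:

```shell
# Each project gets its own output subfolder, so same-named DLLs of
# different versions no longer overwrite each other.
# "MySolution.sln" is a placeholder.
msbuild MySolution.sln /p:GenerateProjectSpecificOutputFolder=true
```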




A Review of Murach’s HTML5 and CSS3, 4th Edition

Today all modern browsers support the features of both HTML5 and CSS3 (to some extent), according to W3Schools, Quora, and many other online resources. HTML5 is the latest evolution of the standard that defines HTML, with a larger feature set that allows building more diverse and powerful web sites and applications. CSS3 is widely accepted as the right way to format web pages.

This book is just like any other Murach book: concepts progress from easy to professional level, and you can follow along step by step using its paired-page layout. The authors begin with the essential concepts and skills for designing web pages. HTML5 semantic elements such as header, main, aside, section, article, footer, and nav are introduced. The authors explain how to create 2-tier and 3-tier navigation menus without using third-party libraries such as Twitter Bootstrap.

Responsive web design and its three major parts (fluid layouts, media queries, and scalable images) are well presented. The authors also introduce the latest CSS3 layout modules, Flexible Box Layout and Grid Layout, which are the newest approaches to laying out modern web pages. Other elements such as images, icons, tables, forms, audio, video, and fonts are covered in this book as well.

To complete the course, there is a chapter on how to deploy a website.

I highly recommend this book. Reading it made me feel that the advanced features of HTML5 and CSS3 can replace much of my current toolkit, especially Twitter Bootstrap, in an easy and native way; I therefore have to upgrade my skill set as well. To get the best out of this book, please do:

  1. Download and install the free text editor, Brackets, which is tightly integrated with Chrome (click ‘Live Preview’ to preview in Chrome and use its Developer Tools).
  2. Browse to the HTML5 test website and find out your browser’s HTML5 score; 555 is a perfect score. I would recommend Chrome for the exercises.
  3. Download all the course materials, and do the exercises.


The review was first posted to Amazon.





More Threads Do More Work?

I did an experiment on multi-threaded processing. The number of items to be processed was 988, and I used this simple method of measurement:

  1. Round one: each thread processes 1 item (988 threads in total)
  2. Round two: each thread processes 2 items (494 threads)
  3. Round three: each thread processes 4 items
  4. Round N: each thread processes …

The raw results of the experiment:

| Max Number of Items in One Thread | Number of Threads | Seconds |
|-----------------------------------|-------------------|---------|
| 1                                 | 988               | 41      |
| 2                                 | 494               | 41      |
| 4                                 | 247               | 42      |
| 8                                 | 124               | 43      |
| 16                                | 62                | 41      |
| 32                                | 31                | 41      |
| 64                                | 16                | 43      |
| 128                               | 8                 | 44      |
| 256                               | 4                 | 42      |
| 512                               | 2                 | 74      |
| 1024                              | 1                 | 135     |




Observations:

  1. When one thread processed all 988 items, it took 135 seconds.
  2. When two threads processed all 988 items, it took 74 seconds.
  3. When four threads processed all 988 items, it took 42 seconds.
  4. When the number of threads was four or greater, the processing time stayed at around 42 seconds.


Conclusions:

  1. Multi-threaded processing can save a lot of time.
  2. In the early stage, there is an almost linear correlation between the number of threads added and the time saved.
  3. Depending on the context, a plateau is reached sooner or later; after that, no matter how many threads you add, the processing time stays constant.
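The experiment's structure can be sketched in C#. This is a minimal illustration: the per-item work below is a trivial stand-in for the real processing, and only the item count (988) comes from the post.

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading;

// Partition the items among a given number of worker threads.
// The real per-item work is replaced by a trivial counter increment.
int ProcessWithThreads(int itemCount, int threadCount)
{
    var items = new ConcurrentQueue<int>(Enumerable.Range(0, itemCount));
    int processed = 0;

    var threads = Enumerable.Range(0, threadCount)
        .Select(_ => new Thread(() =>
        {
            // Each worker pulls items until the shared queue is empty.
            while (items.TryDequeue(out _))
                Interlocked.Increment(ref processed); // stand-in for real work
        }))
        .ToArray();

    foreach (var t in threads) t.Start();
    foreach (var t in threads) t.Join();
    return processed;
}

Console.WriteLine(ProcessWithThreads(988, 4)); // prints 988
```

With a CPU-bound workload, adding threads beyond the number of cores is what produces the plateau seen in the table.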


SQL Profiler Trace Skipped Records

If you insert a large amount of data, for example more than 1 MB, into a SQL database in one single operation [1], chances are that SQL Profiler is going to skip this operation, leave all fields blank, and mark the row as “Trace Skipped Records”. Don’t worry: the data is still inserted into the database; only SQL Profiler skips the operation.

You might ask how to solve this problem. The easiest way to see the records is to decrease the size of the data inserted in one operation. There are other ways, of course [2, 3, 4].



  1. SQL Insertion of Thousands of Records Too Slow?



Parallel Programming and Command Design Pattern

The Command pattern allows requests to be encapsulated as objects, thereby allowing clients to be parameterized with different requests. It basically promotes “invocation of a method on an object” to full object status [2]. These independent request objects can be used to build solutions to more complex problems. For example, we can queue or log requests and support undoable operations [2] using different data structures such as stacks and queues.

Another application of the Command pattern is parallel processing, where the commands are written as tasks to a shared resource and executed by many threads in parallel (possibly on remote machines; this variant is often referred to as the Master/Worker pattern) [1].
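As a minimal C# illustration of the Master/Worker variant, a plain Action delegate stands in below for an ICommand interface with an Execute() method; the workload (squaring numbers) is made up:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Commands as objects: each Action encapsulates one request.
var results = new ConcurrentBag<int>();
var queue = new BlockingCollection<Action>();

// The "master" enqueues ten command objects.
for (int i = 1; i <= 10; i++)
{
    int n = i; // capture per-command state
    queue.Add(() => results.Add(n * n)); // each command squares its input
}
queue.CompleteAdding();

// Four "workers" drain the shared queue in parallel.
var workers = new Task[4];
for (int w = 0; w < workers.Length; w++)
    workers[w] = Task.Run(() =>
    {
        foreach (var command in queue.GetConsumingEnumerable())
            command(); // i.e. command.Execute()
    });
Task.WaitAll(workers);

Console.WriteLine(results.Count); // prints 10
```

Because each command carries its own state, the workers need no knowledge of how a request was constructed, only how to execute it.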



  1. Command Pattern at Wiki
  2. Command Pattern at SourceMaking

Does Single Responsibility Principle Apply to a Variable?

I pondered over a pile of code and wondered what was going on. Why were there so many smart decision-makers in the lower-level business objects, and why were they making decisions, or reversing decisions just made in a higher-level layer?

OK, they were making those decisions based on a parameter passed down from a higher-level business layer. This parameter sat so high in the business layer that it had a huge influence on many lower-level business objects.

It dawned on me that this violated the Single Responsibility Principle (SRP): the variable was overloaded with two or more meanings, which forced the lower-level business objects to figure out its actual meaning in each particular context through all kinds of assumptions and guesswork.

I usually think about SRP in the context of classes (or objects), and didn’t realize that it applies anywhere in our implementation, even to a single variable. So we can use SRP to solve the problem here.
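A hypothetical C# sketch of the problem and the fix: the first method overloads one parameter with two meanings, while the second gives each parameter a single responsibility. All names and costs are made up for illustration.

```csharp
using System;

// Before (hypothetical): one parameter carries two meanings, so the
// lower-level code has to guess which meaning applies in each context.
decimal ShippingCostOverloaded(string mode)
{
    if (mode == "express") return 15m;        // carrier choice...
    if (mode == "standard-rush") return 12m;  // ...or carrier AND urgency?
    return 5m;
}

// After: each parameter has a single, unambiguous responsibility.
decimal ShippingCost(string carrier, bool isRush)
{
    decimal baseCost = carrier == "express" ? 15m : 5m;
    return isRush ? baseCost + 7m : baseCost;
}

Console.WriteLine(ShippingCost("standard", true)); // prints 12
```

With the split parameters, no lower-level object needs to reverse-engineer what the caller meant.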

For reference, SRP is defined as follows: “The single responsibility principle is a computer programming principle that states that every module or class should have responsibility over a single part of the functionality provided by the software, and that responsibility should be entirely encapsulated by the class. All its services should be narrowly aligned with that responsibility. Robert C. Martin expresses the principle as, ‘A class should have only one reason to change.’” [1]


  1. Single Responsibility Principle



SQL Insertion of Thousands of Records Too Slow?

I worked on improving another performance issue. This time I found that there were too many SqlCommand.ExecuteScalar() calls, and some of the return values were never used.

So let’s do some math. If we have 1,000 records to insert, and each insert takes 10 ms for the round trip from the business layer down to the database and back, then 10 ms × 1,000 = 10,000 ms = 10 seconds. What if we have 2,000, 3,000, 5,000 records or more, or each round trip takes longer? Feel the pressure now?

In this case, the refactoring is to concatenate the SQL statements of those SqlCommand.ExecuteScalar() calls whose return values are not used downstream into one big CommandText, delimited by ‘;’, and execute them together with a single SqlCommand.ExecuteNonQuery(). I guarantee you the result will be impressive.
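A sketch of the batching idea, assuming a hypothetical Customers table; only the CommandText construction is shown here, since the ExecuteNonQuery() call itself needs a live connection:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

// Build one big CommandText from many INSERT statements, delimited by ';'.
// Table and column names are hypothetical. In production code, prefer
// parameterized values over inlined strings.
string BuildBatch(IEnumerable<string> names)
{
    var sb = new StringBuilder();
    foreach (var name in names)
        sb.Append($"INSERT INTO Customers (Name) VALUES ('{name.Replace("'", "''")}');");
    return sb.ToString();
}

string batch = BuildBatch(new[] { "Alice", "Bob" });
Console.WriteLine(batch);

// One round trip instead of many:
//   using var cmd = new SqlCommand(batch, connection);
//   cmd.ExecuteNonQuery();
```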

You might ask what if the string is too long, will it be truncated?

Answer: No. The CommandText can take a batch of 65,536 × the network packet size (4 KB by default), i.e. about 256 MB [1]. By the way, the maximum length of a C# string is 2,147,483,647 characters [5]. So you don’t need to worry about that either.



  3. SqlCommand.ExecuteScalar()
  4. SqlCommand.ExecuteNonQuery()

SQL Profiler Trace File Truncated?

I worked with a colleague to trace a performance issue using SQL Profiler. The trace lasted a few minutes; however, the trace file was quite large. I began to browse through the records and found there were no records after a certain point in time. Apparently, the end of the trace file was truncated.

I researched and found that the file was not truncated; the trace had simply stopped recording once it reached the maximum file size. So if you expect larger trace files, please set up the proper parameters.


  1. Set a Maximum File Size for a Trace File (SQL Server Profiler)
  2. Limit Trace File and Table Sizes

Performance Tuning on C# LINQ

There was a performance issue in some large transactions, and I investigated it. After spending a couple of hours setting up tests and metrics, I began to time the performance of each component.

It turned out one component was the troublemaker. Some queries in this component used deferred LINQ operators, and there was time-consuming processing logic inside those deferred operators. The queries were called again and again under the assumption that the result sets had already been materialized. These repeated calls to queries with deferred operators caused the performance issue.
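The difference can be demonstrated with a small C# example: a deferred query re-runs its (possibly expensive) Select on every enumeration, while ToList() materializes the results once. The counter below is a stand-in for the real processing cost.

```csharp
using System;
using System.Linq;

int evaluations = 0;

// Deferred: the Select delegate runs every time the query is enumerated.
var deferred = Enumerable.Range(1, 5).Select(n =>
{
    evaluations++; // stand-in for the time-consuming processing logic
    return n * n;
});

int sum1 = deferred.Sum();
int sum2 = deferred.Sum(); // re-enumerates: evaluations is now 10

// Immediate: ToList() materializes once; later reads reuse the results.
evaluations = 0;
var materialized = Enumerable.Range(1, 5).Select(n =>
{
    evaluations++;
    return n * n;
}).ToList();

int sum3 = materialized.Sum();
int sum4 = materialized.Sum(); // no re-evaluation: evaluations stays 5

Console.WriteLine(evaluations); // prints 5
```

In the problem component, adding a single materialization step in the right place was enough to stop the repeated evaluation.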

Here is the link to Query Execution. It explains immediate and deferred operations, and when and how to use each.


  1. Query Execution