Where are Docker Layers Saved?

I began to wonder where Docker layers are saved while experimenting with Docker multi-stage builds.

After searching the container host, I found them at

C:\ProgramData\docker\windowsfilter\

Drilling into one of the SHA-hash folders, I found at least these two folders: Files and Hives. Inside the Files folder were all the OS and web app contents. I stopped right there, but you can keep drilling.
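If you prefer to poke around from PowerShell instead of Explorer, here is a minimal sketch. It only assumes the default data root shown above; adjust the path if your Docker daemon uses a different data-root.

# List the layer folders under the default Windows Docker data root,
# newest first. Run this in an elevated PowerShell session on the host.
Get-ChildItem 'C:\ProgramData\docker\windowsfilter' -Directory |
    Sort-Object LastWriteTime -Descending |
    Select-Object Name, LastWriteTime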


DevOps Tip – Use PowerShell to Install Windows Feature

It used to take weeks, even months, to configure servers. That has to change. PowerShell is the tool to automate server configuration. Here is one tip for configuring a Windows feature.

Use Get-WindowsFeature to inspect the feature you would like to install:

Get-WindowsFeature

Get-WindowsFeature | Where-Object { $_.Installed }

Then use Install-WindowsFeature. For example, we can install IIS Management Scripts and Tools:

Install-WindowsFeature Web-Scripting-Tools

[X] IIS Management Scripts and Tools        Web-Scripting-Tools        Installed
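If you script this across many servers, it helps to make the install idempotent and to verify the result. Here is a minimal sketch using the same example feature:

# Install the feature only if it is not already present, then confirm it.
$feature = Get-WindowsFeature -Name Web-Scripting-Tools

if (-not $feature.Installed) {
    Install-WindowsFeature -Name Web-Scripting-Tools
}

Get-WindowsFeature -Name Web-Scripting-Tools | Select-Object Name, InstallState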

Correction – Speeds among Cloud Servers Are Very Fast

Last time, I mentioned that the speed between cloud servers was very slow when I issued a copy command on an on-premises machine to copy data from one cloud server to another.

After I talked to a network guru about this, he said it was because, in this particular use case, the traffic did NOT go directly from the source cloud servers to the destination cloud servers. Instead, it made a round trip from the source cloud servers to the on-premises servers, and then from the on-premises servers to the destination cloud servers. To confirm what he said, I logged on to a cloud server and issued a copy command from that server to another cloud server. He was right: the speed was very fast.
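If you want to repeat this kind of test yourself, one simple approach is to time the same copy from each location. This is only a sketch; the file and share paths below are placeholders, not real servers.

# Time a copy from the machine you are logged on to (on-premises or cloud)
# to a destination cloud server. Both paths are placeholders.
Measure-Command {
    Copy-Item -Path 'C:\temp\sample.zip' -Destination '\\dest-cloud-server\share\'
} | Select-Object TotalSeconds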

Quiz – Can We Call a Static Method of an Abstract C# Class?

Will this block of code compile and run successfully? What’s the output? To add some excitement, we add a type parameter T to the class signature. I came across this style while refactoring; you might wonder what the use cases are.

using System;
using System.IO;

class Program
{
    static void Main()
    {
        AbstractClass<string>.StaticMethod();
    }
}

public abstract class AbstractClass<T>
{
    public static void StaticMethod()
    {
        System.Console.WriteLine("It's a static method!");
    }
}

Changing App Settings in Web.config for Deployments to Different Environments Using PowerShell

Sometimes we need to change the app settings in web.config when deploying to different environments. Here is one option.

First, get the location of the web application. The parameter $appName should be the virtual path, such as “folder/app1”.

function Get-WebConfigLocation([string]$appName) {
    $app = Get-WebApplication -Name $appName -Site "Default Web Site" |
        Select-Object -First 1
    return (Join-Path $app.PhysicalPath "web.config")
}

Second, we read the web.config, find and replace strings using a regular expression, and save the changes back.

function Replace-AppSetting(
    [string]$webconfig,
    [string]$findString,
    [string]$replaceString) {
    (Get-Content $webconfig) |
        ForEach-Object { $_ -replace $findString, $replaceString } |
        Set-Content $webconfig
}

Here is the exercise: wrap these two functions into one script with a single parameter, $applicationName, so that we can reuse it. One possible shape for that script is sketched below.
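This is only a sketch, not a polished script. It assumes the two functions above are available (for example, dot-sourced), and the placeholder token and replacement value are hypothetical; substitute whatever your web.config actually contains.

# Set-AppSetting.ps1 - a sketch of the exercise above.
param(
    [Parameter(Mandatory = $true)]
    [string]$applicationName,

    [string]$findString = "__ENVIRONMENT__",   # hypothetical placeholder token
    [string]$replaceString = "Production"      # hypothetical replacement value
)

# Assumes Get-WebConfigLocation and Replace-AppSetting are already defined,
# for example by dot-sourcing the file that contains them.
$webConfig = Get-WebConfigLocation $applicationName
Replace-AppSetting $webConfig $findString $replaceString

For example: .\Set-AppSetting.ps1 -applicationName "folder/app1" -replaceString "Staging"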


Read Committed Isolation Level, Updates Block Reads

I worked on an issue with incomplete or inconsistent records. Wrapping those lines of code in a TransactionScope with the Read Committed isolation level did solve the issue.

However, I did find something interesting that refreshed my knowledge: under the default Read Committed isolation level, updates block reads from other transactions.

As quoted from [2], “Transaction isolation levels control the following:

  • Whether locks are taken when data is read, and what type of locks are requested.
  • How long the read locks are held.
  • Whether a read operation referencing rows modified by another transaction:
    • Block until the exclusive lock on the row is freed.
    • Retrieve the committed version of the row that existed at the time the statement or transaction started.
    • Read the uncommitted data modification.

Choosing a transaction isolation level doesn’t affect the locks that are acquired to protect data modifications. A transaction always gets an exclusive lock on any data it modifies and holds that lock until the transaction completes, regardless of the isolation level set for that transaction. For read operations, transaction isolation levels primarily define the level of protection from the effects of modifications made by other transactions.”

The part of the quote about exclusive locks explains what I observed. However, I also observed that the update still blocks reads even when the read targets a single, different row. Why?

References:

  1. https://docs.microsoft.com/en-us/sql/t-sql/statements/set-transaction-isolation-level-transact-sql?view=sql-server-2017
  2. https://docs.microsoft.com/en-us/sql/connect/jdbc/understanding-isolation-levels?view=sql-server-2017

What’s Really Going On When Copying Zip Packages?

Some time ago, I deployed web deployment zip packages between Azure cloud servers using the Windows File Copy task. I was expecting the speed between cloud servers to be fast, but it was really slow. So I watched the console and was surprised to see many thousands of files fly through it. I then took a look at the destination servers; however, only the zip packages were present there.

I used to think a zip package was copied as a whole, just like any single file. Apparently, that wasn’t the case here. Here is what I observed: the Windows File Copy task unpacked the zip packages, transferred the files one at a time, and assembled them back into zip packages on the destination.

Cloud Tip – Azure Data Box or AWS Snowball

When I read about Azure Data Box and AWS Snowball, I thought it was funny. Is it a joke or something? How can we still use a box [1] or a ball [2] to ship data physically? Don’t we just use our high-speed connections such as Azure ExpressRoute and AWS Direct Connect? As a matter of fact, we already have 1 Gbps internet connections at home, right?

It is really happening: transferring large amounts of data to the cloud over the network is really slow. Azure Data Box and AWS Snowball are not a joke; they are designed for a reason. This fact changes cloud strategy for sure, and we have to design our cloud solutions accordingly.

References:

  1. https://azure.microsoft.com/en-us/services/storage/databox/
  2. https://aws.amazon.com/snowball/

AWS Certified Solutions Architect – Associate, Passed

I took this certification exam on December 8th. After I finished my first pass, there were only 4 minutes left. There were many questions I felt uncertain about, but I had only flagged three. I reviewed those three questions and changed one answer. At that point, I had only 32 seconds left, so I just sat there and enjoyed watching the 32 seconds fly by until time ran out.

Each question was very difficult, and I was under constant time pressure. I kept glancing at the on-screen clock and telling myself to keep pushing forward. The experience was wonderful.

FYI, the exam is 130 minutes long and has 65 questions.

https://aws.amazon.com/certification/