Case Study: PowerShell File Transfer Slowness Issue

Moving a Large Number of Small Files under an SMB 3.0 Shared Drive

LAI TOCA
May 10, 2024

Background: We have a task scheduled every morning to move a large volume of log data from a server on the intranet to a server in the DMZ. The task typically takes around 6 hours to process roughly 10,000 files (totaling over 10 GB) per day. One day, however, the task exceeded 12 hours without completing. The delay heavily impacted other tasks that rely on uploading these logs to a third party's location as part of our daily procedure.

As seen in the picture, the transfer rate had become alarmingly slow, which posed a significant problem when moving a large number of small files.

Intranet -> FW -> DMZ

We asked the network team to investigate why the transfer rate had dropped so sharply. They reported that no firewall rules had been changed and suggested that the performance issue might lie in the Server Message Block (SMB) protocol itself.

Subsequently, our team spent additional working hours on SMB server tuning: enabling SMB Multichannel, adjusting SMB server configuration parameters, and so on.
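For reference, the tuning steps looked roughly like the following sketch, using the built-in SmbShare cmdlets on Windows Server. Treat it as illustrative: the exact parameters we touched varied, and this is not our final configuration.

```powershell
# Check whether SMB Multichannel is currently enabled on the server
Get-SmbServerConfiguration | Select-Object EnableMultiChannel

# Enable SMB Multichannel on the server side (no confirmation prompt)
Set-SmbServerConfiguration -EnableMultiChannel $true -Force

# On the client side, verify that multichannel connections are actually in use
Get-SmbMultichannelConnection
```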

We also decided to rewrite the move step, replacing Move-Item with robocopy.exe, to take advantage of its multithreaded processing capabilities.

# .....
# Original approach: move files one by one with Move-Item
# $files = Get-ChildItem -Path $ZipSource -Recurse -Filter "*.zip"
# foreach ($file in $files) {
#     $Logger += "`t Move file: $($file.FullName.Replace("\", "/")) `r`n"
#     Move-Item -Path $file.FullName -Destination $Destination -Force
# }

# Rewritten: hand the whole batch to robocopy
# (/J unbuffered I/O, /S recurse into subdirectories, /MOV move instead of copy)
robocopy.exe $ZipSource $Destination *.zip /J /S /MOV /LOG+:$LogPath /BYTES

# .....

Please note that robocopy.exe skips files that already exist at the destination; with /MOV, those skipped files are left behind in the source directory. If you want the source cleaned up after robocopy completes, check the source directory and handle the leftover files yourself.
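A minimal sketch of that cleanup, reusing the $ZipSource variable from the script above. Force-deleting the leftovers is an assumption here; depending on your retention rules you may prefer to log or archive them instead.

```powershell
# Hypothetical cleanup: remove zip files robocopy left behind,
# i.e. files that already existed at the destination and were skipped.
$leftovers = Get-ChildItem -Path $ZipSource -Recurse -Filter "*.zip"
foreach ($file in $leftovers) {
    # ASSUMPTION: a leftover means the destination already holds this file,
    # so deleting the source copy is safe.
    Remove-Item -Path $file.FullName -Force
}
```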

However, none of these solutions worked. A colleague suggested that we first prove our application and the SMB stack were capable of handling this volume of data. So, we ran experiments transferring the same amount of data over the route intranet -> FW -> intranet.

Intranet -> FW -> Intranet

The speed was not as fast as expected but was reasonable compared to the intranet -> FW -> DMZ route.
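To compare the two routes, we simply timed the same robocopy run against each target. A sketch of that comparison follows; the UNC paths are placeholders, not our real server names.

```powershell
# Time the same batch transfer against an intranet target vs. a DMZ target.
# \\intranet-host\logs and \\dmz-host\logs are hypothetical share paths.
foreach ($target in "\\intranet-host\logs", "\\dmz-host\logs") {
    $elapsed = Measure-Command {
        robocopy.exe $ZipSource $target *.zip /J /S /LOG+:$LogPath /BYTES
    }
    "{0}: {1:N1} minutes" -f $target, $elapsed.TotalMinutes
}
```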

Armed with this evidence, we approached the network team again to identify what was slowing down traffic routed to the DMZ server. Eventually, the network team told us they had updated the firewall firmware to protect against a zero-day attack, and they suspected this patch had degraded transmission speed. As a workaround, they set up a dedicated firewall policy to address the issue.

Following the performance tuning and firewall policy changes, transmitting the same volume of data (about 10 GB, more than 10,000 files) now takes only 12–15 minutes.

Intranet -> FW -> DMZ

This experience taught us a valuable lesson about the challenges of cross-functional coordination. Each team has its own priorities and missions (security vs. operations), and compromise is often necessary.

