Are you sure that it’s the hard links that are filling your disk? I tested this here, but couldn’t reproduce it.
Theoretically, CreateHardLink should be equivalent to the mklink /h command; but just in case, I created the following AutoIt script to make sure that I was using the same function call as you. (I was far too lazy to code something in VC++…)
#include <WinAPI.au3>
#include <WinAPIError.au3>

Local $kernel = DllOpen("kernel32.dll")

If $CmdLine[0] <> 2 Then
    ConsoleWriteError("usage: CreateHardLink Link Target" & @CRLF)
    Exit 1
EndIf

; CreateHardLink(lpFileName, lpExistingFileName, lpSecurityAttributes)
; The last parameter is reserved and must be NULL.
Local $result = DllCall($kernel, "BOOL", "CreateHardLink", "str", $CmdLine[1], "str", $CmdLine[2], "LONG_PTR", 0)

If $result[0] = 0 Then
    ConsoleWriteError("Windows error " & _WinAPI_GetLastError() & ": " & _WinAPI_GetLastErrorMessage() & @CRLF)
Else
    ConsoleWrite("Hardlink created for " & $CmdLine[1] & " <<===>> " & $CmdLine[2] & @CRLF)
EndIf

DllClose($kernel)
Then I created a separate 2.0GB disk in VMware and attached it, so that the tests wouldn’t be on the same disk as the pagefile, etc.
Test #1: Create a file with 1024 hard links (1023 + original file):
I put one file in the root directory and created an additional 1023 links to it (the maximum number supported) with the following batch file:
@echo off
dir | find "(s)"
for /l %%i in (1,1,1023) do C:\CreateHardLink.exe %%i %1
dir | find "(s)"
Disk usage before:
1 File(s) 3,212,078 bytes
0 Dir(s) 2,089,775,104 bytes free
Disk usage after:
1024 File(s) 3,289,167,872 bytes
0 Dir(s) 2,089,222,144 bytes free
And Explorer says there are 1.94GB free of 1.99.
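Just to sanity-check the Test #1 numbers, here's a quick scratch calculation (Python; the figures are copied straight from the dir output above):

```python
# Free-space figures from the Test #1 dir listings above.
free_before = 2_089_775_104
free_after = 2_089_222_144
links_added = 1023  # additional links beyond the original file

delta = free_before - free_after
print(delta)                       # 552960 bytes consumed in total
print(delta // 1024)               # = 540 KiB exactly
print(round(delta / links_added))  # ≈ 541 bytes of metadata per link
```

So each link costs on the order of half a kilobyte, presumably in directory-index/MFT overhead.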
Test #2: Many files, all linked to the same directory (your case):
I copied about 1.08GB of data (files of various sizes, spread across various directories) to the partition, then created one hard link for each file found, placing them all in a directory called HardLinks. That batch file:
@echo off
setlocal enabledelayedexpansion
dir /s | find "(s)"
set /a i=0
for /r %%a in (*) do (
    C:\CreateHardLink.exe "HardLinks\!i!_%%~nxa" "%%~a"
    set /a i+=1
)
dir /s | find "(s)"
dir /s | find "(s)"
Disk usage before:
2034 File(s) 1,109,324,978 bytes
1998 Dir(s) 975,511,552 bytes free
Disk usage after:
4246 File(s) 2,490,368,854 bytes
1998 Dir(s) 973,955,072 bytes free
Reporting 2.4GB worth of files would be physically impossible without hard links, since my disk is only 2.0GB.
The disk space did decrease by exactly 1520K, which works out to roughly 0.7K for each of the 2,212 new links. At that rate, you'd need on the order of 60 million hard links to consume 40GB in metadata alone. (I imagine that well before that point you'd be hitting some other limitation, like the number of file entries in a single directory. ;-)
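The same arithmetic, reproduced from the Test #2 dir listings above (a scratch calculation, nothing more):

```python
# Figures copied from the Test #2 dir output above.
free_before = 975_511_552
free_after = 973_955_072
new_links = 4246 - 2034  # file count after minus file count before

delta = free_before - free_after
print(delta // 1024)     # 1520 KiB consumed in total

per_link = delta / new_links
print(round(per_link))   # ≈ 704 bytes per link

# How many links would it take to burn 40GB in metadata alone?
print(round(40 * 1024**3 / per_link / 1e6))  # ≈ 61 (million links)
```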
Apologies that this isn’t a good “answer” per se; but hopefully it gives you some reassurance that, no, hard links aren’t supposed to fill your disk. If I were you, I’d create more hard links in smaller batches and measure the disk space usage before and afterwards. It might also be worthwhile to see if something else on the same disk is using more space than it should be.
I also can’t really think of an alternate solution for you; hard links seem perfect for this case, don’t they?
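If you do want to script the batch-and-measure approach, here's a rough sketch in Python. The function name and batch size are mine, not anything standard; os.link calls CreateHardLink on Windows and link() on POSIX, so the same script runs on both. Note that free-space readings can fluctuate if anything else is writing to the disk, so treat the result as approximate:

```python
import os
import shutil
import tempfile

def measure_link_batch(n_links: int) -> int:
    """Create n_links hard links to one file; return bytes of free space consumed."""
    with tempfile.TemporaryDirectory() as d:
        target = os.path.join(d, "target.bin")
        with open(target, "wb") as f:
            f.write(b"x" * 4096)

        free_before = shutil.disk_usage(d).free
        for i in range(n_links):
            os.link(target, os.path.join(d, f"link_{i}"))
        free_after = shutil.disk_usage(d).free

        # All links plus the original should share one inode/file record.
        assert os.stat(target).st_nlink == n_links + 1
        return free_before - free_after

print(measure_link_batch(100), "bytes consumed by 100 links")
```

Running it with increasing batch sizes should show per-link overhead in the hundreds of bytes, nowhere near enough to explain tens of gigabytes.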