I know this is (a bit) late, but for anyone else who stumbles across this question and has the same issue... My problem specifically is that I have hundreds of gigs of no-longer-needed files that exist only on the Dropbox servers, and I don't want to clear off an HDD just to be able to delete them with selective sync.
Deleting that many files still isn't possible from the web interface, but if you don't mind diving into the Dropbox API, this can at least be automated without using your own storage (I did this below with the Python SDK, but there are other language options). The file limit still applies, but you can count the files in each directory to work out how to delete them without running into it. Like so:
The following script takes your unique Dropbox API key and a list of Dropbox directories (deleteDirList) as inputs. It then loops through each subdirectory of each element of deleteDirList to determine whether there are few enough files to delete the directory without hitting the limit (I set the limit to a conservative(?) 10,000 files); if there are too many files, it deletes files individually until the count is below the limit. You'll need to install the Python package dropbox (I use Anaconda, so conda install dropbox).
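Once the package is installed, it's worth a quick sanity check that your access token actually works before pointing anything at your files; something along these lines should do it (the token string is just a placeholder):

import dropbox

dbx = dropbox.Dropbox(r'DROPBOX APIv2 TOKEN')
# Prints the display name of the account the token belongs to;
# an invalid token will raise an exception here instead
print(dbx.users_get_current_account().name.display_name)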
Bear in mind this is a brute-force approach; each subdirectory is deleted one by one, which could take a long time. A better method would be to count the files in each subdirectory and then determine the highest-level directory that can be deleted without hitting the limit, but unfortunately I don't have time to implement that at the moment (there is a rough sketch of the idea after the script).
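One more caveat: files_list_folder returns results in pages, so counting len(...) of a single call's entries (as the script below does) can undercount extremely full directories. If you want an exact count, you have to follow the cursor with files_list_folder_continue; a small helper along these lines would do it (count_entries is my own name for it, not part of the SDK):

def count_entries(dbx, path):
    # Count every entry directly inside a Dropbox folder, following pagination
    result = dbx.files_list_folder(path)
    count = len(result.entries)
    while result.has_more:
        result = dbx.files_list_folder_continue(result.cursor)
        count += len(result.entries)
    return count

Anyway, here's the script: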
import dropbox
##### USER INPUT #####
appToken = r'DROPBOX APIv2 TOKEN'
# List of Dropbox paths in UNIX format (where the Dropbox root is '/') whose contents
# will all be deleted. The script goes through these one at a time. They need to be
# separate directories; subdirectories will be deleted in the loop and will throw an
# exception if listed here.
deleteDirList = ['/directoryToDelete1/', '/directoryToDelete2/']
######################
dbx = dropbox.Dropbox(appToken)
modifyLimit = 10000
# Loop through each path in deleteDirList
for deleteStartDir in deleteDirList:
    deleteStartDir = deleteStartDir.lower()

    # Initialize pathList. This records every directory down the current path tree
    # so that each directory is traversed, its files counted, and then deleted
    pathList = [deleteStartDir]

    # Deletion loop
    try:
        while True:
            # Determine whether there is a subdirectory in the current directory.
            # If not, nextDir is False
            nextDir = next((x.path_lower for x in dbx.files_list_folder(pathList[-1]).entries
                            if isinstance(x, dropbox.files.FolderMetadata)), False)
            if nextDir:  # if there is a subdirectory, descend into it
                pathList.append(nextDir)
            else:  # otherwise, delete the current directory
                if len(pathList) <= 1:
                    # This is the start directory (specified in deleteDirList):
                    # delete all remaining files but keep the folder itself
                    fileList = [x.path_lower for x in dbx.files_list_folder(pathList[-1]).entries]
                    print('Cannot delete start directory; removing final', len(fileList), 'file(s)')
                    for filepath in fileList:
                        dbx.files_delete(filepath)
                    raise EOFError()  # deletion of this start directory is complete

                # Count the files (first page of results only). If fileCnt >= modifyLimit,
                # remove files until fileCnt < modifyLimit, then delete the rest of the directory
                fileCnt = len(dbx.files_list_folder(pathList[-1]).entries)
                if fileCnt < modifyLimit:
                    print('Deleting "{path}" and'.format(path=pathList[-1]), fileCnt, 'file(s) within\n')
                else:
                    # Delete just enough files to bring the count under the limit
                    numToDelete = fileCnt - (modifyLimit - 1)
                    print('Too many files to delete directory. Deleting', numToDelete,
                          'file(s) to reduce count, then removing', pathList[-1], '\n')
                    fileList = [x.path_lower for x in dbx.files_list_folder(pathList[-1]).entries]
                    for filepath in fileList[:numToDelete]:
                        dbx.files_delete(filepath)
                dbx.files_delete(pathList[-1])
                del pathList[-1]
    except EOFError:
        print('Deleted all relevant files and directories from "{}"'.format(deleteStartDir))
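For completeness, here's a rough, untested sketch of that "better method" mentioned above: count everything under a folder first (recursively, following pagination), delete the whole folder with a single files_delete call when the subtree is under the limit, and only recurse into subfolders when it isn't. The helper names (count_subtree, delete_tree) are mine, and note that, unlike the script above, this removes the top-level folder itself as well:

import dropbox

def count_subtree(dbx, path):
    # Count every file and folder underneath path, following pagination
    result = dbx.files_list_folder(path, recursive=True)
    count = len(result.entries)
    while result.has_more:
        result = dbx.files_list_folder_continue(result.cursor)
        count += len(result.entries)
    return count

def delete_tree(dbx, path, limit=10000):
    # Delete at the highest level possible without tripping the limit
    if count_subtree(dbx, path) < limit:
        dbx.files_delete(path)  # small enough to remove in one call
        return
    # Too big: deal with the immediate children individually, then the folder itself.
    # (Only the first page of children is handled here; loop on has_more for very full folders.)
    for entry in dbx.files_list_folder(path).entries:
        if isinstance(entry, dropbox.files.FolderMetadata):
            delete_tree(dbx, entry.path_lower, limit)
        else:
            dbx.files_delete(entry.path_lower)
    dbx.files_delete(path)

If the individual files_delete calls turn out to be the bottleneck, the SDK also offers files_delete_batch, which takes a list of dropbox.files.DeleteArg objects and runs the deletions server-side as an asynchronous job, so that might be worth a look as well.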