In the past I would restart the torrent client (disable/enable downloads) after cleaning the torrent folder. I would rather not do that, since I always keep some torrents seeding and those then need to be rechecked, which takes quite a lot of time.
So I did a little scripting. One script fetches all active downloads and produces a file with one download per line; in that file I delete all the lines I do _not_ want removed. A second script takes the remaining lines and stops them.
Code:
#!/bin/bash
# PHP session id, copied from an authenticated browser session
session=65faa6af7444f27c471a21f4140513e0
# fields left over after the tr step: name, url and uuid
FIELDS=7,17,25
BUBBA=bubba
# Fetch the download list, keep the hidden inputs around each name="url"
# attribute, join every three lines into one record, turn the HTML
# delimiters into ";" and keep only the interesting fields.
curl -o - --cookie "PHPSESSID=${session}" "${BUBBA}"/admin/downloads/dolist 2>/dev/null \
  | grep 'name="url"' -A1 -B11 | egrep 'style=""|input type="hidden"' \
  | awk '{printf("%s%s", $0, (NR%3 ? "," : "\n"))}' \
  | tr "<>\"" ";" | cut -d ";" -f ${FIELDS} | sort > downloads.csv
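The awk step is the non-obvious part: it glues every three consecutive lines of grep output into a single record. A minimal sketch on dummy input (the letters stand in for the real HTML lines):

```shell
# Join every three input lines into one comma-separated record;
# NR%3 is non-zero for lines 1 and 2 of each group, zero for line 3.
printf 'a\nb\nc\nd\ne\nf\n' | awk '{printf("%s%s", $0, (NR%3 ? "," : "\n"))}'
# prints:
# a,b,c
# d,e,f
```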
Code:
#!/bin/bash
set -e
# Same session id and host as in the first script
session=65faa6af7444f27c471a21f4140513e0
BUBBA=bubba

# Print field $2 of the semicolon-separated record $1
function getField {
    echo "${1}" | cut -d ";" -f "${2}"
}
# Edit downloads.csv first if you do not want to remove all downloads.
while read -r entry; do
    name=$(getField "${entry}" 1)
    url=$(getField "${entry}" 2)
    uuid=$(getField "${entry}" 3)
    echo "Stopping: ${name} with uuid ${uuid} at url ${url}"
    # POST the cancel request and show only the HTTP status line
    curl -v --cookie "PHPSESSID=${session}" --data "url=${url}&uuid=${uuid}&do=Cancel" "${BUBBA}"/admin/downloads/remove 2>&1 | grep HTTP | cut -d '<' -f 2
done < downloads.csv
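For reference, each line of downloads.csv ends up with three semicolon-separated fields, read back in the loop as name, url and uuid. A sketch of what getField does, with made-up values (the entry content is purely illustrative):

```shell
# getField prints the n-th semicolon-separated field of its first argument.
getField() {
    echo "${1}" | cut -d ";" -f "${2}"
}

# Hypothetical entry in the name;url;uuid order the loop assumes:
entry='Some.Torrent.Name;http://example.com/file.torrent;deadbeef'
getField "${entry}" 1   # → Some.Torrent.Name
getField "${entry}" 3   # → deadbeef
```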
In my first run I only removed torrent downloads, so there is no guarantee that other download types work.
Of course this is just quick-and-dirty frontend parsing, and parts of it will break with the next update that is more than minor. But then again, that would be a good opportunity to support mass removal in the web interface, wouldn't it?
PS: The forum does not allow me to post the protocol "http" in the curl location - add it yourselves (for example in the BUBBA constant).