Delete logs larger than a specified size on Linux at regular intervals

Background: The development environment is short on disk space, mostly because of debug logs, so unnecessary logs need to be deleted when they pile up.

Ideas: 1. Write a script to delete log files and run it as a timed task. But sometimes logs need to be kept for later queries, so this is not perfect.

2. Keep the deletion script, and have the timed task check whether disk usage has reached a critical value: delete only then, otherwise take no action.

Log deletion script:

```shell
#!/bin/sh
date "+%Y-%m-%d %H:%M:%S"
echo ==========before rm==========
df -h
echo
echo current dir size:
du -sh
echo
# Rotated logs are safe to remove outright
find /logs -name "log.out.*" -user $USER -exec rm -f {} \;
find /logs -name "log-201*.out" -user $USER -exec rm -f {} \;
find /logs -name "facade.out.*" -user $USER -exec rm -f {} \;
find /logs -name "monitorlog.out.*" -user $USER -exec rm -f {} \;
find /logs -name "monitordetaillog.out.*" -user $USER -exec rm -f {} \;
find /logs -name "catalina.out.*" -user $USER -exec rm -f {} \;
find /logs -name "catalina.2*" -user $USER -exec rm -f {} \;
# Generate a helper that truncates the file passed as $1
if [ ! -f .cleanFile.sh ]; then
    echo "cat /dev/null>\$1">.cleanFile.sh
    chmod +x .cleanFile.sh
fi
# Active logs are truncated, not removed, so their writers keep valid descriptors
find /logs -name "catalina.out" -user $USER -exec ./.cleanFile.sh {} \;
find /logs -name "log.out" -user $USER -exec ./.cleanFile.sh {} \;
echo ==========after rm===========
df -h
echo
echo current dir size:
du -sh
echo
```
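The `.cleanFile.sh` helper above truncates active log files instead of deleting them. A minimal sketch of why that matters (the filename here is hypothetical):

```shell
#!/bin/sh
# Truncating a log in place keeps any writer's file descriptor valid;
# `rm` on an open file only frees the space once the writer closes it.
log=/tmp/demo.out                 # hypothetical log file
printf 'old debug lines\n' > "$log"
: > "$log"                        # same effect as `cat /dev/null > $log`
wc -c < "$log"                    # byte count is now 0
rm -f "$log"
```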

Timed deletion task:

```shell
1 4 * * * /logs/rmlog.sh>>/logs/rmlog.sh.log
```
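The entry above only captures standard output. If the script's error output also matters, the same schedule can redirect stderr as well (a variant, assuming the same paths):

```shell
# Run daily at 04:01, capturing both stdout and stderr
1 4 * * * /logs/rmlog.sh >> /logs/rmlog.sh.log 2>&1
```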

Timed task that deletes based on used space:

```shell
#!/bin/sh
# Query how much space is used (assumption: line 3 of `df -h` is the
# target filesystem and field 3 is the used size, e.g. "34G")
mya=$(df -h | sed -n '3p' | awk '{print $3}' | sed 's/G//')
# Execute the log deletion script if space is insufficient
if [ "$mya" -gt 34 ]; then
    /logs/rmlog.sh >> /logs/rmlog.sh.log
fi
```

Timed task, every 2 minutes (the script path is an assumption):

```shell
*/2 * * * * /logs/checkspace.sh
```
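Parsing `df -h` by line number is fragile when mounts change. A sturdier sketch checks the used-percentage column of POSIX `df -P` for one filesystem (the path and threshold here are assumptions):

```shell
#!/bin/sh
# Check used space of one filesystem; /tmp is used here only so the
# sketch runs anywhere. Field 5 of `df -P` output is "Use%".
usage=$(df -P /tmp | awk 'NR==2 {gsub(/%/, "", $5); print $5}')
echo "used: ${usage}%"
if [ "$usage" -gt 90 ]; then      # threshold is an assumption
    echo "space low: run the cleanup script here"
fi
```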

Welcome to discuss!

3 December 2019, 05:15 | Views: 8330
