I have a nohup command that is executed on a Linux server.
nohup python /home/user/code.py 1>>/home/user/python_log 2>&1
The Python run log is written to /home/user/python_log.
The issue is that each run generates so much log output that
python_log is currently about 650 MB.
Is there a way to separate the logs based on the date the cron job runs (or even an incrementing number, to resolve the file size issue)? For example:
Using the date on the file name
This should be what you asked for.
nohup python /home/user/code.py 1>>"/home/user/python_log_$(date +"%d_%m_%Y")" 2>&1
Though I usually write dates the other way around:
nohup python /home/user/code.py 1>>"/home/user/python_log_$(date +"%Y_%m_%d")" 2>&1
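If the script is started from cron rather than an interactive shell, note that `%` is special inside a crontab entry and must be escaped. A hypothetical daily entry (schedule and paths are illustrative) might look like:

```shell
# hypothetical crontab line: run daily at 02:00, one log file per day
# (% must be written as \% inside a crontab entry)
0 2 * * * python /home/user/code.py >> "/home/user/python_log_$(date +\%Y_\%m_\%d)" 2>&1
```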
Logging in Python
The commands above just redirect stdout and stderr to a file. Python has its own logging module, which should be used for more complex scripts and applications. Read more in the Python Logging HOWTO.
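As a minimal sketch, the standard library's TimedRotatingFileHandler can do the daily rotation inside the script itself, removing the need for date-stamped redirection. The logger name and the relative log path below are illustrative assumptions (the original path was /home/user/python_log):

```python
# Sketch: daily log rotation with Python's logging module
import logging
from logging.handlers import TimedRotatingFileHandler

logger = logging.getLogger("code")
logger.setLevel(logging.INFO)

# Rotate at midnight and keep 14 old files:
# python_log, python_log.2024-01-01, python_log.2024-01-02, ...
handler = TimedRotatingFileHandler("python_log", when="midnight", backupCount=14)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logger.addHandler(handler)

logger.info("run started")
```

Old files beyond backupCount are deleted automatically, which also caps the total disk usage.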
Handling log files in Linux
Depending on your distribution, there are tools available to manage your log files and rotate them based on age and/or size.
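On many distributions that tool is logrotate. A hypothetical drop-in file, e.g. /etc/logrotate.d/python_log, could look like this; copytruncate matters here because the nohup redirection keeps the file open, so the log must be copied and truncated in place rather than renamed:

```
/home/user/python_log {
    daily
    rotate 14
    compress
    missingok
    notifempty
    copytruncate
}
```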
If you’re working on a server, the sysadmin should have a disk quota in place for your user, or you should use a separate partition for the log files, so that bloated log files cannot fill the disk and potentially crash the system.
Answered By – SvenTUM