From: "Abraham Schneider" <aschneider@(email surpressed)>
Subject: Automatically delete logs when dumping job?
   Date: Fri, 11 Jan 2008 06:45:08 -0500
Msg# 1659
Hi there!

I'm just wondering how I can automatically delete the logs of a
particular job when dumping this job?

I think that should be possible with the Job Dump Command, but I'm
wondering about this description in the online help:

'The log output of the command will be written to a file in the Log
Directory called "jobdumpcommand.log"'

Does this mean that I get a file which grows bigger and bigger? That
wouldn't be ideal.

Thanks for your help,

Abraham

--
Abraham Schneider
Senior VFX Compositor
ARRI Film & TV Services GmbH  Tuerkenstr. 89  D-80799 Muenchen / Germany
Phone (Tel# suppressed) Email aschneider@(email surpressed)
www.arricommercial.de
www.arri.de

ARRI Film & TV Services GmbH
Registered office: Munich  -  Court of registration: Amtsgericht München  -  Commercial register number: HRB 69396
Managing Director: Franz Kraus

   From: Greg Ercolano <erco@(email surpressed)>
Subject: Re: Automatically delete logs when dumping job?
   Date: Fri, 11 Jan 2008 14:10:09 -0500
Msg# 1660
Abraham Schneider wrote:
> [posted to rush.general]
> 
> Hi there!
> 
> I'm just wondering how I can automatically delete the logs of a
> particular job when dumping this job?
> 
> I think that should be possible with the Job Dump Command, but I'm
> wondering about this description in the online help:
> 
> 'The log output of the command will be written to a file in the Log
> Directory called "jobdumpcommand.log"'
> 
> Does this mean that I get a file which grows bigger and bigger? That
> wouldn't be ideal.

	You probably want to include the -nolog flag, eg:
	http://www.seriss.com/rush-current/rush/rush-submit-cmds.html#JobDumpCommand

	Relevant quotes from the above:

--- snip
The option -nolog can be specified to disable the creation of the
'jobdumpcommand.log' file. This is useful to prevent 'file in use'
errors on Windows if you want the command to remove the entire logdir
as part of a cleanup operation.
[..]
The stdout and stderr output from the command is written to a file
called 'jobdumpcommand.log' in the LogDir. This can be disabled if
LogDir is disabled, or if the jobdumpcommand's '-nolog' option is
specified, eg. jobdumpcommand -nolog <command..>.
--- snip

	So when you fill out the form, you can specify:

		Job Dump Command: -nolog perl /path/to/your/cleanup.pl

	..and that will prevent the log from being created for your
	cleanup commands. You may want to redirect the output of
	the cleanup script somewhere else, so that you can still
	see errors.
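
	For example, a minimal sketch of such a cleanup script might
	start like this (the /var/tmp path is just a placeholder for
	wherever you want that output to go):

            #!/usr/bin/perl -w
            use strict;

            # -nolog disables jobdumpcommand.log, so send this script's own
            # stdout/stderr to a file of our choosing, keeping errors visible
            open(STDOUT, ">>/var/tmp/rush-cleanup.log") or die("can't open log: $!\n");
            open(STDERR, ">&STDOUT");

            print scalar(localtime), ": job dump cleanup starting\n";
            # ..the actual cleanup work would go here..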

	I should add the above to the "?" docs in the submit forms
	for all the "Job XXX Command" prompts.

	I think it never came up, because folks doing cleanup scripts
	have usually been writing their own custom submit scripts, and
	referred to the above docs when creating the actual rush submit commands.

	BTW, *be careful* when you write cleanup scripts!
	The common thing is to do 'rm -rf' on paths, but watch out;
	a small typo in a pathname might cause your cleanup script
	to recursively remove an entire prod directory if you're
	not careful how you write it. For instance, if the user
	submitted a scene file with a space somewhere in the pathname,
	the job fails, and then they dump the job, that stray space
	might cause big trouble when your cleanup script runs the
	resulting "rm -rf" command, eg:

            rm -rf /yourserver/BIGPROJECT /SCENES/1A/myproject.junk
                                        ^^^
                                        Stray space typed by user,
                                        causes BIGPROJECT to be removed!

	..so be sure to prevent that kind of thing by carefully quoting
	your pathnames, or by calling functions inside perl (like unlink()
	or File::Path's rmtree()) directly, so that spaces are never
	interpreted by a shell as argument separators.
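
	For example, a sketch of the "inside perl" approach, using the
	path from the example above; rmtree() receives the path as a
	single string, so no shell ever splits it on the space:

            use File::Path;    # File::Path ships with perl; rmtree() removes a tree

            my $dir = "/yourserver/BIGPROJECT /SCENES/1A/myproject.junk";

            # The whole string (stray space and all) is one path to rmtree()
            rmtree($dir);

            # (Alternatively, system()'s list form also bypasses the shell,
            #  so no word splitting can happen: system("rm", "-rf", $dir);)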

[01/27/2008 EDIT: Fixed typo "-nlog" -> "-nolog"]

-- 
Greg Ercolano, erco@(email surpressed)
Seriss Corporation
Rush Render Queue, http://seriss.com/rush/
Tel: (Tel# suppressed)
Fax: (Tel# suppressed)
Cel: (Tel# suppressed)

   From: "Abraham Schneider" <aschneider@(email surpressed)>
Subject: Re: Automatically delete logs when dumping job?
   Date: Mon, 28 Jan 2008 10:49:15 -0500
Msg# 1667
Hi Greg!

Thanks for your answer. I haven't had time to look at this topic for some days. Now that I did, I realized that I just don't know how to reference the actual script I submitted from inside my dump command script. This may be a stupid question, but I'm a Shake artist and not a full-time programmer/scripter ;-)

Would be nice if you could give me a hint. Are there any variables that I can use in a shell script or how does this work?

Thanks, Abraham


Greg Ercolano wrote:
[posted to rush.general]

	You probably want to include the -nolog flag, eg:
	http://www.seriss.com/rush-current/rush/rush-submit-cmds.html#JobDumpCommand

	So when you fill out the form, you can specify:

		Job Dump Command: -nolog perl /path/to/your/cleanup.pl
[..]

--
Abraham Schneider
Senior VFX Compositor
ARRI Film & TV Services GmbH  Tuerkenstr. 89  D-80799 Muenchen / Germany
Phone (Tel# suppressed) Email aschneider@(email surpressed)
www.arricommercial.de
www.arri.de

ARRI Film & TV Services GmbH
Registered office: Munich  -  Court of registration: Amtsgericht München  -  Commercial register number: HRB 69396
Managing Director: Franz Kraus

   From: Greg Ercolano <erco@(email surpressed)>
Subject: Re: Automatically delete logs when dumping job?
   Date: Mon, 28 Jan 2008 11:00:22 -0500
Msg# 1668
Abraham Schneider wrote:
> Thanks for your answer. I haven't had time to look at this topic for some
> days. Now that I did, I realized that I just don't know how to
> reference the actual script I submitted from inside my dump command
> script.

	It's pretty much just as shown in the example towards the
	bottom of the last post, where you would specify at the
	"Job Dump Command" prompt something like:

Job Dump Command: -nolog perl /path/to/your/cleanup.pl

	..where you specify the interpreter (in this case, 'perl')
	and the path to your dump command script.

	If your script were a csh script, then you might use:

Job Dump Command: -nolog csh -f /path/to/your/cleanup.csh

	If you prefer writing it as a DOS batch script (ie. if
	you're windows only), then you should be able to do:

Job Dump Command: -nolog cmd /c /path/to/your/cleanup.bat

> Would be nice if you could give me a hint. Are there any variables that
> I can use in a shell script or how does this work?

	Hmm, not sure what variables would help you derive
	the path to your script.. you should really just specify
	it as an absolute path, the way you would a path to
	the shake files, or image directories.

	Hope that helps; let me know if I'm missing something
	about your question.

-- 
Greg Ercolano, erco@(email surpressed)
Seriss Corporation
Rush Render Queue, http://seriss.com/rush/
Tel: (Tel# suppressed)
Fax: (Tel# suppressed)
Cel: (Tel# suppressed)

   From: "Abraham Schneider" <aschneider@(email surpressed)>
Subject: Re: Automatically delete logs when dumping job?
   Date: Mon, 28 Jan 2008 11:40:18 -0500
Msg# 1669
Ok, sorry, I think it was my confusing English :)

The syntax to start the script is clear and fine. But if I dump a specific job, I'd normally like to do something with the script that was rendered by this job.

So if I'd like to delete the log folder of this job, I have to tell my cleanup script which log folder to delete. So how can I get the name of the actual log folder, script name, etc. inside my cleanup script, because there is no syntax like:

-nolog perl /path/to/your/cleanup.pl submitted_script logfile_dir

Does this make it clear?

Abraham


Greg Ercolano wrote:
[posted to rush.general]

Abraham Schneider wrote:
Thanks for your answer. I haven't had time to look at this topic for some
days. Now that I did, I realized that I just don't know how to
reference the actual script I submitted from inside my dump command
script.

	It's pretty much just as shown in the example towards the
	bottom of the last post, where you would specify at the
	"Job Dump Command" prompt something like:

Job Dump Command: -nolog perl /path/to/your/cleanup.pl

	..where you specify the interpreter (in this case, 'perl')
	and the path to your dump command script.
[..]

--
Abraham Schneider
Senior VFX Compositor
ARRI Film & TV Services GmbH  Tuerkenstr. 89  D-80799 Muenchen / Germany
Phone (Tel# suppressed) Email aschneider@(email surpressed)
www.arricommercial.de
www.arri.de

ARRI Film & TV Services GmbH
Registered office: Munich  -  Court of registration: Amtsgericht München  -  Commercial register number: HRB 69396
Managing Director: Franz Kraus

   From: Greg Ercolano <erco@(email surpressed)>
Subject: Re: Automatically delete logs when dumping job?
   Date: Mon, 28 Jan 2008 12:13:00 -0500
Msg# 1670
Abraham Schneider wrote:
> Ok, sorry, I think it was my confusing English :)
> 
> The syntax to start the script is clear and fine. But if I dump a
> specific job, I'd normally like to do something with the script that was
> rendered by this job.
> 
> So if I'd like to delete the log folder of this job, I have to tell my
> cleanup script which log folder to delete. So how can I get the name of
> the actual log folder, script name, etc. inside my cleanup script

	I see.

	Yes, the dump script will be passed several environment variables
	that are useful for this.

   RUSH_JOBID -- The jobid for the job. With this, you can invoke
                 'rush -ljf' to get all the info about the job.
                 So if you want the log directory, grep for the "LogDir:" line,
                 or if you want the submit command, grep for the "Command:" line.

   RUSH_LOGFILE -- The path to the logfile for the dump command itself, eg:
                   /some/path/logs/jobdumpcommand.log
                   If you chop off the filename part (jobdumpcommand.log),
                   you can derive the log directory, eg:

			my $logdir = $ENV{RUSH_LOGFILE};
			$logdir =~ s%/jobdumpcommand.log%%;
                        print "LOGDIR=$logdir\n";

	Use RUSH_LOGFILE to derive the log directory, as this will save
	you the trouble of having to invoke 'rush -ljf'.
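
	For example, a sketch of a dump script that removes the job's
	whole log directory that way (assuming RUSH_LOGFILE is still
	set when the -nolog option is used):

            #!/usr/bin/perl -w
            use strict;
            use File::Path;                     # provides rmtree()

            # Derive the log directory from RUSH_LOGFILE, as described above
            my $logdir = $ENV{RUSH_LOGFILE};
            $logdir =~ s%/jobdumpcommand\.log$%%;

            # Hand the path straight to rmtree(); no shell, so spaces are safe
            if ( -d $logdir )
                { rmtree($logdir); }
            else
                { warn("no such log directory: $logdir\n"); }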

	However, if you also want the path to the submitted script,
	then you can either parse that from the 'rush -ljf' output, eg:

            open(LJF, "rush -ljf|");
            my $submitpath = "";
            while ( <LJF> )
            {
                if ( /^\s+Command: perl (\S+)/ )    # get the submit script's path
                    { $submitpath = $1; }
            }
            close(LJF);
            print "SUBMITPATH=$submitpath\n";

	..or parse the jobinfo file that will be sitting in the
	log directory, eg:

            my $logdir = $ENV{RUSH_LOGFILE};
            $logdir =~ s%/jobdumpcommand.log%%;

            my $jobinfo_file = "$logdir/jobinfo";
            my $submitpath = "";
            unless ( open(JOBINFO, "<$jobinfo_file") )
                { die("$jobinfo_file: $!\n"); }
            while ( <JOBINFO> )
            {
                if ( /^\s+Command: perl (\S+)/ )    # get the submit script's path
                    { $submitpath = $1; }
            }
            close(JOBINFO);
            print "SUBMITPATH=$submitpath\n";

	The latter is a bit longer, but prevents hitting the job server
	with the 'rush -ljf' request, accessing the file server to get
	the info instead.

> because there is no syntax like:
> 
> -nolog perl /path/to/your/cleanup.pl submitted_script logfile_dir

	You can do that if you want.. you can pass arguments to your
	cleanup script just that way. In the case of the above, you
	could access submitted_script and logfile_dir as $ARGV[0] and $ARGV[1]
	respectively.
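
	In perl those arrive in @ARGV, eg:

            # cleanup.pl, invoked with the two extra arguments shown above
            my ($submitted_script, $logfile_dir) = @ARGV;
            print "script=$submitted_script  logdir=$logfile_dir\n";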

	But I could see where you'd want logfile_dir to be automatically
	generated by the script at runtime, so it'd be best to use
	the RUSH_LOGFILE variable for that.

-- 
Greg Ercolano, erco@(email surpressed)
Seriss Corporation
Rush Render Queue, http://seriss.com/rush/
Tel: (Tel# suppressed)
Fax: (Tel# suppressed)
Cel: (Tel# suppressed)

   From: "Abraham Schneider" <aschneider@(email surpressed)>
Subject: Re: Automatically delete logs when dumping job?
   Date: Mon, 28 Jan 2008 13:27:13 -0500
Msg# 1671
Thanks Greg, that's exactly what I wanted to know but failed completely on the first try :)

Abraham


--
Abraham Schneider
Senior VFX Compositor
ARRI Film & TV Services GmbH  Tuerkenstr. 89  D-80799 Muenchen / Germany
Phone (Tel# suppressed) Email aschneider@(email surpressed)
www.arricommercial.de
www.arri.de

ARRI Film & TV Services GmbH
Registered office: Munich  -  Court of registration: Amtsgericht München  -  Commercial register number: HRB 69396
Managing Director: Franz Kraus