Re: [Omaha.pm] [odynug] PERL Problem
- I haven't tried "hundreds of files," but this uses signals so you're not wasting any system resources (see the sketch after these notes):
http://search.cpan.org/perldoc?POE::Wheel::FollowTail
The modules Christopher mentioned probably use signals too.
- Everyone I know avoids threading. Polling "hundreds of files" more than once an hour sounds like a really bad option to me.
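A minimal FollowTail sketch, assuming the log paths live in a hypothetical @log_files array; the got_line / got_error event names are placeholders, not anything from this thread:

#!/usr/bin/perl
use strict;
use warnings;
use POE qw(Wheel::FollowTail);

# Hypothetical list of logs to watch -- not from the original mail.
my @log_files = glob('/var/log/app/*.log');

POE::Session->create(
    inline_states => {
        _start => sub {
            my $heap = $_[HEAP];
            # One wheel per file; all of them run in a single process.
            for my $file (@log_files) {
                my $wheel = POE::Wheel::FollowTail->new(
                    Filename   => $file,
                    InputEvent => 'got_line',
                    ErrorEvent => 'got_error',
                );
                $heap->{wheels}{ $wheel->ID } = $wheel;
                $heap->{files}{ $wheel->ID }  = $file;
            }
        },
        got_line => sub {
            my ($heap, $line, $wheel_id) = @_[HEAP, ARG0, ARG1];
            print "$heap->{files}{$wheel_id}: $line\n";
        },
        got_error => sub {
            my ($op, $errnum, $errstr, $wheel_id) = @_[ARG0 .. ARG3];
            warn "wheel $wheel_id: $op failed: $errnum $errstr\n";
        },
    },
);

POE::Kernel->run();

Each file gets its own wheel, but everything is multiplexed by POE's event loop in one process, so you avoid the fork-per-file memory cost of options 2 and 4.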
$0.02,
j
On Apr 7, 2011, at 10:15 AM, Nick Wertzberger wrote:
> Hey all,
>
> Opinions Please!
>
> I am currently writing a script that watches hundreds of files at the same time to determine what's happening in the logs. I would really like to do this as efficiently as possible, and as far as I can tell, I have 4 options.
>
> 1: Use non-blocking file streams, which basically turns everything into a polling operation (wasted CPU).
> 2: Use fork() with blocking file streams, which is going to use more memory than I'd like.
> 3: Use threads + blocking file streams, though I hear Perl threading is something to be avoided.
> 4: Use two programs, one that gets initiated per file I'm watching... this seems to have the problems of option 2.
>
> So what's up? What do you think is the way to go, and why?
>
> I'll be looking into #3 in the meantime.
>
> - Nick
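For contrast with option 1 above, a minimal sketch of the polling approach -- the @log_files list and the one-second sleep are placeholders, not anything from the thread. Every pass touches every file whether or not it changed, which is where the wasted CPU comes from:

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical list of logs to watch -- not from the original mail.
my @log_files = glob('/var/log/app/*.log');

my (%fh, %pos);
for my $file (@log_files) {
    open my $h, '<', $file or next;
    seek $h, 0, 2;                  # start at end of file, like tail -f
    $fh{$file}  = $h;
    $pos{$file} = tell $h;
}

while (1) {
    for my $file (@log_files) {
        my $h = $fh{$file};
        seek $h, $pos{$file}, 0;    # clear EOF and return to the last read position
        while (defined(my $line = <$h>)) {
            print "$file: $line";
        }
        $pos{$file} = tell $h;
    }
    sleep 1;    # even with a sleep, hundreds of seek/read cycles happen on every pass
}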