Questions, questions, questions... on how to improve


Post by Jacob Saab » Fri, 07 Dec 2007 19:44:54


Hey guys,

running my logscript has made me ponder a few things, and I have some questions
that perhaps you scripting pros especially could help me out with.

One of them being that maybe I should've done the append-to-file step after
each logfile instead of at the end, hehe.

But you know, that's why I wrote the script: to learn from it. So I've thought
about it for some days, and have some questions:

1. Is there a way to increase performance by reading blocks of data from
the logfiles, instead of line by line?

2. If I were to add a progress indicator, wouldn't it need to know how many
files in total it's looking through, to be able to scale the indicator
accordingly? I'm thinking about a progress indicator for:

- Progress within the current logfile
- Progress within the current folder
- Progress across the total number of files

3. I'm also considering displaying a counter for how many results it has found.
Is there a way to define a variable that's shown at a fixed screen position,
so that when the value is updated it's redrawn in place, not at a new screen
position? (I know Rexx had this in OS/2.)

4. I'm also considering displaying some metrics: how much data, how many files,
exactly how many lines, etc. Any obvious cmdlets or methods to use here, other
than getting file sizes, counting lines in a logfile, etc.?
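(For reference on question 4: Measure-Object covers most of these metrics out
of the box. A minimal sketch, where the paths are just placeholders:)

```powershell
# Total size and file count for a folder of logs (placeholder path):
$files = Get-ChildItem C:\logs -Filter *.log
$size  = ($files | Measure-Object -Property Length -Sum).Sum
"{0} files, {1:N0} bytes total" -f $files.Count, $size

# Exact line count for one logfile (placeholder filename):
(Get-Content C:\logs\app.log | Measure-Object -Line).Lines
```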

Hope you can help me out with those questions. It's funny how, once you've
written a script like that and are running it (especially on such amounts of
data), the obvious shortcomings tend to get in your face pretty quickly ;)

But it's all good. I'm hoping to learn from it, and I'm hoping you can help
me out a bit with the learning :)

Best Regards,
Jacob Saaby Nielsen
mailto: XXXX@XXXXX.COM

Questions, questions, questions... on how to improve

Post by Jacob Saab » Sat, 08 Dec 2007 16:32:01

Wow...

No one has any input on these thoughts? I would have thought everyone could
learn from the answers, but perhaps I was wrong.


Best Regards,
Jacob Saaby Nielsen
mailto: XXXX@XXXXX.COM


Questions, questions, questions... on how to improve

Post by Marco Shaw » Sat, 08 Dec 2007 23:54:23


Well, a function versus a filter can change how objects are read in, and
with that the performance.

If you're doing:

get-content file | some_function
or
get-content file | some_filter

A filter acts on each object (each line) as it is retrieved by get-content,
and passes it on. A function waits for get-content to read the entire file
first.

Beyond that it's a matter of testing, as functions and filters are basically
structured the same way.
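To make that concrete, a minimal sketch (the logfile name and the "ERROR"
pattern are just placeholders):

```powershell
# A filter runs its body once per pipeline object, as each line arrives:
filter Find-Error_Filter {
    if ($_ -match "ERROR") { $_ }
}

# A plain function with no process block runs only after the upstream
# cmdlet finishes; $input then holds all the collected lines:
function Find-Error_Function {
    $input | Where-Object { $_ -match "ERROR" }
}

# Both are used the same way on the pipeline:
# get-content app.log | Find-Error_Filter
# get-content app.log | Find-Error_Function
```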

Will reading blocks do better? I think a filter will do a better job on
bigger log files, and I'll try to demo that today or over the weekend.
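On the blocks question specifically: Get-Content has a -ReadCount parameter
that sends lines down the pipeline in batches instead of one at a time, which
often cuts per-object overhead on large files. A sketch, with the filename and
pattern as placeholders:

```powershell
# -ReadCount 1000 emits arrays of up to 1000 lines per pipeline object.
# -match applied to an array returns the matching elements, so this still
# outputs individual matching lines.
get-content app.log -ReadCount 1000 |
    ForEach-Object { $_ -match "ERROR" }
```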

Marco

--
Microsoft MVP - Windows PowerShell
http://www.yqcomputer.com/

PowerGadgets MVP
http://www.yqcomputer.com/

Blog:
http://www.yqcomputer.com/

Questions, questions, questions... on how to improve

Post by Marco Shaw » Sun, 09 Dec 2007 01:06:37


Write-Progress will help you here. You need to know what to pass to
Write-Progress beforehand, though: the total count you're scaling against.

See 'help Write-Progress -examples' for some ideas.
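For example, something along these lines (the path and the match condition are
placeholders). Since Write-Progress redraws its line in place, the -Status text
can also double as the running result counter from question 3:

```powershell
# Hypothetical sketch: per-file progress plus a running match counter.
$files = Get-ChildItem C:\logs -Filter *.log    # placeholder path
$found = 0
for ($i = 0; $i -lt $files.Count; $i++) {
    Write-Progress -Activity "Scanning logfiles" `
        -Status ("File {0} of {1} - {2} results so far" -f ($i + 1), $files.Count, $found) `
        -PercentComplete (($i + 1) / $files.Count * 100)
    # @() forces an array so .Count works even for zero or one match:
    $found += @(Get-Content $files[$i].FullName |
        Where-Object { $_ -match "ERROR" }).Count
}
```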

Marco

Questions, questions, questions... on how to improve

Post by Jacob Saab » Tue, 11 Dec 2007 16:36:47

Hey Marco,

thanks. I'll try to rewrite it with a filter instead :) My script is STILL
running (6 days after I started it), and I can see that it's because it
doesn't get a lot of CPU time by now.

However, there are no other processes using the CPU on that machine. Is there
some kind of built-in "if the script has run for this long, I'll throttle
the CPU usage down" behaviour?


Best Regards,
Jacob Saaby Nielsen
mailto: XXXX@XXXXX.COM