I just fired off a couple of find commands with the -exec rm option. I am pretty sure they are going to delete all those unwanted log files.
..... Those find commands have been running for a while now.
Bah! I am getting nervous for no reason.
I hope I will be able to see my other directories and files alright tomorrow.
*Shudder*
Wednesday, September 21, 2005
9 comments:
I hope so..
It's Freaky Friday tomorrow though ;)
Greetz!!
I remember a scripted rm like this which nearly wiped out my final-year project: thank god for cvs!
The above incident reminds me of this exploit I was executing on my machine, which I got from some underground site.
It compiled without giving any error... and I trusted this one coz the previous 2 exploits hadn't given me any problem of any sort.
When I executed it, my whole linux setup got screwed up. It carried shellcode full of "rm"s :(
hey shu,
i got this http://www.faqs.org/faqs/aix-faq/part1/section-54.html
but I couldn't find anyone who has tried this... how about trying it?
just a coupla find/rm combinations ..
find /path -name '*.txt' -exec rm {} ';'
that works, but it fork/execs one rm per matched file: n invocations for n files
but this works better
find /path -name '*.txt' | xargs rm -f
xargs batches the file names, so rm gets invoked only a handful of times .. a lot fewer fork/execs, works much faster when you got a loooot of small files
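A harmless way to compare the two is to run them on a scratch directory (the path and file names below are invented for the demo; -print0/-0 assume GNU or BSD find and xargs):

```shell
# Scratch directory so nothing real is at risk.
mkdir -p /tmp/findtest && cd /tmp/findtest
touch a.txt b.txt 'c d.txt' keep.log

# -exec ';' fork/execs one rm per matched file.
find . -name '*.txt' -exec rm {} ';'

# Recreate the files, then batch them through xargs instead.
touch a.txt b.txt 'c d.txt'
# -print0 / -0 keeps file names with spaces intact; a plain
# "find | xargs rm" would split 'c d.txt' into two bogus arguments.
find . -name '*.txt' -print0 | xargs -0 rm -f

ls    # only keep.log survives
```

Worth noting: POSIX find can also batch arguments without xargs via `find /path -name '*.txt' -exec rm {} +`, which behaves much like the xargs pipeline.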
the-almost-lost-your-final-year-proj-person - losing files isn't always to be blamed on 'rm'; you can lose them to unexpected things, like 'indent' .. yeah.. I lost my proj like that.. it wasn't in cvs, no backups.. I wanted to indent my source code files.. so I went on doing
$ indent -kr -i8 *
[ instead of *.c ]
and w00, my Makefile got indented too :D
--snip--
clean:
rm *.o *~
--snip--
that became
--snip--
clean:
rm *.o * ~
--snip--
and you know what I did next... make clean :-) .. and it wiped everything out, Makefile included :-)
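The accident above is easy to replay without deleting anything, by letting the shell expand both versions of the clean target with echo standing in for rm (the directory and file names are made up for the demo):

```shell
mkdir -p /tmp/globdemo && cd /tmp/globdemo
touch main.c util.c main.o Makefile

# The intended clean target: object files and editor backups only.
echo rm *.o *~

# After indent mangled '*~' into '* ~', the lone '*' matches
# every file in the directory, Makefile and sources included.
echo rm *.o * ~
```

Swap echo back for rm in that second line and you have exactly the `make clean` wipe described above.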
the-one-who-ran-exploits-person:
bad boy! served u right >:)
LOL Anand.
Luckily it worked alright for me. BTW, thanks for the commands.
Hey Anand,
You forgot to quote the '{}' in the -exec command. In some (all?) shells it will barf without it.
And just to nitpick, the xargs version might blow up if the list of matched files is too long >:)
Of course I was not blaming 'rm', bless you! I was blaming it on my incorrect usage :(
Like what they say vis-a-vis unix, "The intelligent user knows"... and if you make mistakes in your scripts (esp with rm, unlink, etc.), you will "know for sure"!
Neat "indent trick" btw! ;-)
{} works fine without quoting in bash; bash is kinda half-intelligent in what to expand and what not to, sometimes..
and xargs is never unhappy with the number of arguments it reads.. it just splits them into as many command invocations as will fit..
the usually seen 'Argument list too long' error isn't really bash being fussy: it's the kernel refusing any exec whose argument list exceeds the ARG_MAX limit (check it with getconf ARG_MAX). bash just reports the failure when a '*' expansion gets too big, and xargs sidesteps it by batching the list.
ps: I've experimented and confirmed it.
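For the record, the limit xargs is dodging is the kernel's per-exec ARG_MAX ceiling, not anything bash invents. A quick sketch: the -n 1000 below just forces small batches so the splitting is visible regardless of the actual system limit.

```shell
# The kernel's per-exec ceiling on argument-plus-environment size:
getconf ARG_MAX

# xargs reads any amount of input and splits it into as many
# invocations as needed; -n 1000 forces batches of 1000, so
# 5000 inputs become exactly 5 echo invocations (5 output lines):
seq 1 5000 | xargs -n 1000 echo | wc -l    # 5
```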