So, today I was trying to zip up a large folder (one with a large nested set of subfolders) on Linux. I did:
gzip -r bigfolder bigfolder.gz
and after it had been running for a couple of minutes I realised that this wouldn't make a single big .gz file, but would instead individually compress, in place, every file (but not directory) within bigfolder. So, I Ctrl-C'd it.
Then I realised that I'd also got the syntax wrong (one of those days): I thought the second argument would be where the output was saved, but it isn't; gzip just does all the compression in bigfolder in place.
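For the record, what I was actually after (one compressed archive of the whole tree) is, I believe, a job for tar rather than gzip on its own. Something like this, sketched on a throwaway folder since the names here are just stand-ins for mine:

```shell
# gzip alone only compresses individual files; tar bundles a whole
# directory tree into one file, and -z runs it through gzip.
mkdir -p bigfolder/sub                 # stand-in for my real tree
echo "hello" > bigfolder/sub/file.txt
tar -czf bigfolder.tar.gz bigfolder    # one archive; the originals are left untouched
```

i.e. with tar, the argument after -f really is the output file, unlike what I'd assumed about gzip.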
So, I did:
gunzip -r bigfolder
and that seems to have turned it all back to normal. However, I'm worried that, because I Ctrl-C'd it, there might be a broken file in there, one that was halfway through being compressed or something.
My understanding is that while Ctrl-Z will just KILL it instantly, Ctrl-C is a bit more "gentle" and more likely to let a small sub-process, like gzipping one of the individual files, finish before stopping. But, as you can probably guess, my understanding of these things isn't too precise.
I don't have a copy of bigfolder that I can diff against to see if it's broken. Is it likely to be broken, do you think?
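For what it's worth, the one check I could think of was to look for anything still carrying a .gz suffix after the recursive gunzip, on the reasoning that a leftover half-processed file would probably still be named that way. Reproduced here in miniature (the real folder is obviously much bigger):

```shell
# recreate the accident on a toy folder
mkdir -p bigfolder/sub
echo "data" > bigfolder/sub/a.txt
gzip -r bigfolder            # compress every file in place (what I did by mistake)
gunzip -r bigfolder          # undo it recursively (what I did next)
find bigfolder -name '*.gz'  # any output here would mean a file is still compressed
```

The find prints nothing on the toy version, but I don't know whether that tells me anything about the run I interrupted.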