Alternatives to dd for skipping empty space
I am attempting to archive a 240 GB internal SSD.
In the past I've experienced difficulties using dd with command-line compression: poor speed and I/O errors. Is anyone using an alternative Linux utility to back up an image of only the used partition space (currently < 40 GB)? |
Have you looked into Partclone or FSArchiver?
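For what it's worth, Partclone reads the filesystem's allocation map and copies only the used blocks, which is exactly the "skip free space" behaviour you're after. A minimal sketch, assuming an ext4 filesystem on /dev/sda1 (substitute your actual partition and filesystem variant):
Code:
partclone.ext4 -c -s /dev/sda1 -o sda1.img
The -c flag clones only the used blocks into an image; the restore direction would be partclone.ext4 -r -s sda1.img -o /dev/sda1.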
|
If you back up a full partition there is no used/free space; that information is not available at the block level.
If you back up files/dirs you can obviously skip "free" space. You can use, for example, rsync to do that.
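A minimal sketch of that approach, assuming the SSD's filesystem is mounted at /mnt/ssd and the backup target at /mnt/backup (both hypothetical paths):
Code:
rsync -aAX /mnt/ssd/ /mnt/backup/
The -a flag preserves permissions, ownership and timestamps; -A and -X carry ACLs and extended attributes along. Only the files themselves get copied, so free space never enters into it. |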
I will check out PartClone & FSArchiver. |
What do you mean by hidden/system file? What OS and filesystem is it?
|
I see Partclone is part of Clonezilla, and Wikipedia says FSArchiver has a Windows version, but otherwise no obvious standout. |
TAR
Normally you'd move the files to be contiguous, then zero out the free space. One can actually use the Windows backup feature on a hidden partition by unhiding it first. If you have a way to attach a WD drive, they offer a free Acronis version that works great; the attachment can be iSCSI too, I think. Clonezilla normally does a file-by-file clone if it can read the filesystem.
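Zero-filling is just writing zeroes into a throwaway file until the filesystem is full, then deleting it; compression (or sparse-file handling) then collapses the former free space to almost nothing. A minimal sketch, assuming the partition is mounted at /mnt/target (a hypothetical mount point):
Code:
dd if=/dev/zero of=/mnt/target/zero.fill bs=1M; sync; rm /mnt/target/zero.fill
The dd will stop on its own with a "No space left on device" error once the filesystem is full; that is expected. |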
I am just curious. What sort of error do you get from a command like this?
Code:
dd if=/dev/sda1 bs=1M skip=0 count=100 | gzip -c > test.bin.gz
|
What if we had room to clone the whole disk, then split the .img and compress each chunk afterward? I'm mulling the possibilities. I also need to learn why defragmenting now "trims" the disk instead of the old graphical read/write routine. If it works, the partition could be shrunk to used + required space.
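In case it helps the mulling: split can carve a raw image into fixed-size chunks straight off the pipe, and each chunk can then be compressed on its own. A minimal sketch, assuming the whole disk is /dev/sda and 4 GB chunks (both assumptions):
Code:
dd if=/dev/sda bs=1M | split -b 4G - disk.img. && gzip disk.img.*
Restoring is the reverse: zcat disk.img.*.gz | dd of=/dev/sda bs=1M. Compressing before splitting would give smaller chunks overall, but splitting first keeps each piece independently usable. |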
And the question again is: what have you tried, what went wrong, and what do you want to achieve?
If you have Windows/NTFS it is better to use Windows-based tools to make backups; why do you need Linux at all? Zeroing the free space has a "side effect": you can easily back up the whole partition without storing all of those zeroes (this is called a sparse file). But anyway, as far as I can see you want to back up files, and you have a solution already; it will automatically skip free space. Tar can make an archive like that, including splitting and compressing too. If you have I/O errors you need to fix them first, but it looks like that is not important here. Copying 240 GB may take some time, and there is no tool to overcome hardware limitations (if any).
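A minimal sketch of the tar route, assuming the data lives under /mnt/ssd and 4 GB chunks are wanted (both assumptions):
Code:
tar -czf - /mnt/ssd | split -b 4G - backup.tar.gz.
That produces backup.tar.gz.aa, backup.tar.gz.ab, and so on; to restore, stitch them back together with cat backup.tar.gz.* | tar -xzf -. Free space is skipped automatically because tar only ever sees files, never the raw partition. |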
The sample command I just offered you ...
Code:
dd if=/dev/sda1 bs=1M skip=0 count=100 | gzip -c > test.bin.gz
For example, the 100 MB "boot" partition which was standard on Windows 7 results in a backup file "test.bin.gz" of less than 10 MB. Of course, you would substitute the partition you want to back up in "if=/dev/", and put its size (in MB) in "count=". This has worked for me for backing up fairly large Windows partitions, and it can back up any partition that you can read from (even "hidden" partitions), no matter what filesystem/files it contains.
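If you don't know the partition's size in MB off-hand, one way to get it (assuming /dev/sda1 is the partition in question):
Code:
echo $(( $(blockdev --getsize64 /dev/sda1) / 1048576 ))
blockdev --getsize64 prints the size in bytes; dividing by 1048576 converts that to the MB figure for "count=" when bs=1M.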
|