LinuxQuestions.org (/questions/)
-   Linux - Hardware (https://www.linuxquestions.org/questions/linux-hardware-18/)
-   -   Alternatives to dd for skipping empty space (https://www.linuxquestions.org/questions/linux-hardware-18/alternatives-to-dd-for-skipping-empty-space-4175736356/)

JASlinux 04-23-2024 04:29 PM

Alternatives to dd for skipping empty space
 
I am attempting to archive a 240G internal SSD.

In the past I've experienced difficulties using dd with command-line compression, speed & I/O errors.

Anyone using an alternative Linux utility for backup images of only used partition space (currently < 40GB)?

wpeckham 04-23-2024 04:37 PM

Have you looked into PARTCLONE or FSARCHIVER?
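
For reference, a rough sketch of how each tool might be invoked; the device name /dev/sda3 and the output paths are only placeholders:

Code:

# partclone: save only the used blocks of a partition to an image file, then compress it
partclone.ntfs -c -s /dev/sda3 -o sda3.ntfs-ptcl-img
gzip sda3.ntfs-ptcl-img

# fsarchiver: save a filesystem into a compressed archive
fsarchiver savefs -z7 /backup/sda3.fsa /dev/sda3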

pan64 04-24-2024 02:17 AM

If you back up a full partition there is no used/free space; that information is not available at the block level.
If you back up files/dirs you can obviously skip "free" space. You can use, for example, rsync to do that.
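
A minimal rsync sketch; the source and destination paths here are only placeholders:

Code:

# archive mode, also preserving ACLs, extended attributes and hard links
rsync -aAXH --info=progress2 /mnt/source/ /mnt/backup/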

rknichols 04-24-2024 09:59 AM

Quote:

Originally Posted by pan64 (Post 6498019)
If you back up a full partition there is no used/free space; that information is not available at the block level.

Tools like partclone can produce a backup that does not include a supported filesystem's free space. Partclone's image format includes only the used space, and can optionally be compressed. If you play the game right with losetup, you can even produce a mountable sparse file that looks like the original filesystem but does not allocate any disk blocks for the free space.
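
Roughly, and with the size, device and file names only as placeholders, that losetup trick can look like this:

Code:

# create a sparse file the same size as the original partition
truncate -s 240G restored.img
# attach it to a free loop device; --show prints the chosen device, e.g. /dev/loop0
LOOP=$(losetup --find --show restored.img)
# restore the partclone image into the loop device; only used blocks are written,
# so the file stays sparse
partclone.ntfs -r -s sda3.ntfs-ptcl-img -o "$LOOP"
losetup -d "$LOOP"
# the result can then be mounted like the original filesystem (read-only is safest;
# this needs an NTFS-capable driver such as ntfs-3g)
mount -o ro,loop restored.img /mnt/restore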

pan64 04-24-2024 10:08 AM

Quote:

Originally Posted by rknichols (Post 6498068)
Tools like partclone can produce a backup that does not include a supported filesystem's free space. Partclone's image format includes only the used space, and can optionally be compressed. If you play the game right with losetup, you can even produce a mountable sparse file that looks like the original filesystem but does not allocate any disk blocks for the free space.

Yes, partclone works at the filesystem level, but dd has no idea about that (and the same applies to partclone.dd).

JASlinux 04-24-2024 09:59 PM

Quote:

Originally Posted by pan64 (Post 6498019)
If you back up files/dirs you can obviously skip "free" space. You can use, for example, rsync to do that.

Right, that works for Linux, but this is a laptop's Redmond partition with its hidden/system file nonsense. It needs to be a partition backup without the blank space.

I will check out PartClone & FSArchiver.

pan64 04-25-2024 01:39 AM

what do you mean by hidden/system file? What OS and filesystem is it?

JASlinux 04-25-2024 10:25 AM

Quote:

Originally Posted by wpeckham (Post 6497967)
Have you looked into PARTCLONE or FSARCHIVER?

Do you lean towards one or the other?

I see PartClone is part of Clonezilla, and Wikipedia says FSArchiver has a Windows version, but otherwise there's no obvious standout.

JASlinux 04-25-2024 10:26 AM

Quote:

Originally Posted by pan64 (Post 6498165)
what do you mean by hidden/system file? What OS and filesystem is it?

If you don't need Windows, consider yourself blessed.

jefro 04-25-2024 02:54 PM

TAR

Normally you'd move the files to be contiguous, then zero out the free space.
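
A rough sketch of that zero-fill approach; the device name, mount point and file names are only placeholders, and the partition is assumed to be NTFS mounted via ntfs-3g:

Code:

# fill the free space with a file of zeros, then delete it
mount -t ntfs-3g /dev/sda3 /mnt/win
dd if=/dev/zero of=/mnt/win/zero.fill bs=1M status=progress   # ends with "No space left on device", which is expected
rm /mnt/win/zero.fill
umount /mnt/win

# the zeroed free space now compresses to almost nothing
dd if=/dev/sda3 bs=1M status=progress | gzip -c > sda3.img.gz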

One can actually use the Windows backup feature on a hidden partition if you unhide it.

If you have a way to attach a WD drive, they offer a free Acronis version that works great. The attachment can be iSCSI too, I think.

Clonezilla normally does a file-by-file clone if it can read the filesystem.

mw.decavia 04-25-2024 07:21 PM

I am just curious. What sort of error do you get from a command like this?

Code:

dd if=/dev/sda1 bs=1M skip=0 count=100 | gzip -c > test.bin.gz
Quote:

Originally Posted by JASlinux (Post 6497965)
I am attempting to archive a 240G internal SSD.

In the past I've experienced difficulties using dd with command-line compression, speed & I/O errors.

Anyone using an alternative Linux utility for backup images of only used partition space (currently < 40GB)?


JASlinux 04-26-2024 12:14 AM

Quote:

Originally Posted by mw.decavia (Post 6498290)
I am just curious. What sort of error do you get from a command like this?

Code:

dd if=/dev/sda1 bs=1M skip=0 count=100 | gzip -c > test.bin.gz

If it were recent I could tell you. I remember a lot of frustration, vaguely recalling it could have been the file system (NTFS). It was also very slow (I use old computers).

What if we had room to clone the whole disk, then split the .img and compress each chunk afterward? I'm mulling the possibilities.
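
One possible pipeline for that idea, compressing on the fly and splitting into chunks; the device name and chunk size are only placeholders:

Code:

# image the whole disk, compress, and split into 4 GiB pieces
dd if=/dev/sda bs=1M status=progress | gzip -c | split -b 4G - disk.img.gz.part-

# restore: reassemble, decompress and write back
cat disk.img.gz.part-* | gunzip -c | dd of=/dev/sda bs=1M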

I also need to learn why defragmenting now means "trimming" the disk instead of the old graphical read/write routine. If it works, the partition could be shrunk to used + required space.

JASlinux 04-26-2024 12:28 AM

Quote:

Originally Posted by jefro (Post 6498265)
TAR

Normally you'd move the files to be contiguous, then zero out the free space.

I know what that means, but not why it is a suggestion. If you tar'd a partition, that could be an archive if accurately replicated, without having to worry about what else is on the source disk.

Quote:

One can actually use the Windows backup feature on a hidden partition if you unhide it.
Windows has a default system backup that I use, but I need a Linux version.

Quote:

If you have a way to attach a WD drive, they offer a free Acronis version that works great. The attachment can be iSCSI too, I think.
If you can download it as the owner of a drive, that sounds good.

Quote:

Clonezilla normally does a file-by-file clone if it can read the filesystem.
Again, the goal is to skip the unused space, split into manageable chunks, & ideally compress, but maybe not all in one step if done manually.

pan64 04-26-2024 12:41 AM

And the question is again: what have you tried, and what went wrong? What do you want to achieve?
If you have Windows/NTFS it is better to use Windows-based tools to make backups.
Why do you need Linux at all?
Zeroing the free space has a "side effect": you can easily back up the whole partition without storing all of those zeroes (as a sparse file).
But anyway, as far as I can see you want to back up files, and you already have a solution. It will automatically skip free space. Tar can make an archive like that, including splitting and compressing too (a sketch follows below).
If you have I/O errors you need to fix them first, but it looks like that is not the main concern here. Copying 240 GB may take some time, and no tool can overcome hardware limitations (if any).
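
A minimal sketch of the tar route; the mount point and chunk size are only placeholders:

Code:

# archive the mounted filesystem, compress, and split into 4 GiB pieces
tar -cpzf - -C /mnt/win . | split -b 4G - win-backup.tar.gz.part-

# restore: reassemble and unpack
cat win-backup.tar.gz.part-* | tar -xpzf - -C /mnt/win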

mw.decavia 04-26-2024 09:16 AM

The sample command I just offered you ...
Code:

dd if=/dev/sda1 bs=1M skip=0 count=100 | gzip -c > test.bin.gz
would back up all of a Windows "boot" partition, with free/empty space compressed to almost nothing (if it did not fail with an error).

For example, the 100 MB "boot" partition that was standard on Windows 7 results in a backup file "test.bin.gz" of less than 10 MB.

Of course, you would substitute the partition you want to back up in "if=/dev/", and put its size (in MiB, to match bs=1M) in "count=".

This has worked for me for backing up fairly large Windows partitions, but it can back up any partition that you can read from (even "hidden" partitions), no matter what filesystem or files it contains.
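
If it helps, the partition size in MiB (to match bs=1M) can be read with blockdev; the device name here is just an example:

Code:

echo $(( $(blockdev --getsize64 /dev/sda1) / 1024 / 1024 ))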

Quote:

Originally Posted by JASlinux (Post 6498322)
If it were recent I could tell you. I remember a lot of frustration, vaguely recalling it could have been the file system (NTFS). It was also very slow (I use old computers).

What if we had room to clone the whole disk, then split the .img and compress each chunk afterward? I'm mulling the possibilities.

I also need to learn why defragmenting now means "trimming" the disk instead of the old graphical read/write routine. If it works, the partition could be shrunk to used + required space.


