Hello! First timer here and thanks to a lot of digging I stumbled upon this wonderful website.
I've been following these instructions:
https://www.data-medics.com/forum/h...ive-with-bad-sectors-using-ddrescue-t133.html
The problem is that I don't have enough free space to create a 1 TB image file to scan with R-Studio, so my question is:
Is it possible to break the process into, for example, 200 GB images that eventually add up to the full 1 TB, or would that jeopardize the data I'm trying to rescue?
The reason I ask is that I have a laptop that's completely clean and ready for this job, except its drive is only 250 GB, and I want to be completely sure this option is off the table before I take drastic measures and build a 'recovery rig', so to speak.
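For what it's worth, here's roughly the kind of chunked imaging I was picturing. The device path and file names are placeholders, and the ddrescue commands are only a sketch (though I believe the `-i`/`--input-position` and `-s`/`--size` options are real); the tiny dummy-file demo at the end just convinces me that splitting and re-concatenating is byte-exact:

```shell
# Sketch (placeholder device /dev/sdX): image the drive in 200 GiB spans,
# each with its own image and map file, then stack them back up:
#
#   ddrescue -i 0      -s 200GiB /dev/sdX part1.img part1.map
#   ddrescue -i 200GiB -s 200GiB /dev/sdX part2.img part2.map
#   ...
#   cat part1.img part2.img ... > full.img
#
# Dummy-file demo that concatenated spans reproduce the original bytes:
head -c 1024 /dev/urandom > whole.bin     # stand-in for the source drive
head -c 512  whole.bin    > part1.bin     # first "span"
tail -c 512  whole.bin    > part2.bin     # second "span"
cat part1.bin part2.bin   > rejoined.bin  # stack the spans back up
cmp whole.bin rejoined.bin && echo "identical"
```

Of course, I suspect R-Studio would still need the full 1 TB concatenated somewhere before it can scan it, which is exactly the space I don't have.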
Thanks for reading