I need to migrate a lot of data to a new folder structure, and one of the things I will have to deal with is file paths that are too long.
To copy the data I have written a PowerShell script that uses robocopy.
Of course I want as little disruption in this process as possible. What can I do to prevent issues with long file paths?
Is there an easy way to modify my script to detect long-path issues up front?
What would be the best way to avoid a large number of copy errors during the robocopy run, and to fix possible problems before starting?
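To illustrate the kind of pre-check I have in mind: a minimal sketch (in Python for portability, but the same idea could be a PowerShell loop) that walks the source tree and reports every file whose path under the new destination root would meet or exceed the classic 260-character MAX_PATH limit. The function name and the 260 limit are my assumptions, not part of my existing script.

```python
import os

MAX_PATH = 260  # assumed limit: classic Windows MAX_PATH; NTFS itself allows longer

def find_long_target_paths(src_root, dest_root, limit=MAX_PATH):
    """Walk src_root and report files whose *destination* path under
    dest_root would meet or exceed `limit` characters.
    (Hypothetical helper name; limit=260 is an assumption.)"""
    offenders = []
    for dirpath, _dirnames, filenames in os.walk(src_root):
        rel = os.path.relpath(dirpath, src_root)
        for name in filenames:
            # Length the path would have after robocopy mirrors it to dest_root.
            target = os.path.join(dest_root, "" if rel == "." else rel, name)
            if len(target) >= limit:
                offenders.append((len(target), target))
    # Longest paths first, so the worst offenders are fixed first.
    return sorted(offenders, reverse=True)
```

The idea would be to run this before the robocopy pass, shorten or relocate the reported paths, and only then start the copy. Is that a reasonable approach, or is there a more standard way?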