Hi!
I currently have a website in my environment that is replicated across 4 servers for load balancing. The problem is that the "replication" is currently done manually, so every file I copy to the site has to be copied 4 times, once per server.
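For context, this is roughly what the manual step looks like if scripted as a batch file. It's only a sketch: the server names (WEB1–WEB4), the local path, and the share name are placeholders for my real environment, and it assumes the web root is shared on each server. robocopy ships with Windows Server 2008 R2.

```shell
@echo off
rem Push the local web root to each of the 4 load-balanced servers.
rem WEB1..WEB4, D:\Sites\MySite and wwwroot$ are placeholders.
set SOURCE=D:\Sites\MySite

for %%S in (WEB1 WEB2 WEB3 WEB4) do (
    rem /MIR mirrors the tree (copies new/changed files, removes extras);
    rem /R:2 /W:5 caps retries so an unreachable server doesn't stall the loop
    robocopy "%SOURCE%" "\\%%S\wwwroot$\MySite" /MIR /R:2 /W:5
)
```

Obviously this just automates the copy; it doesn't give me real replication, which is why I'm looking at DFS.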
I wanted to implement DFS Replication to get rid of this problem, but before I do, I'd like to ask someone more experienced than me whether this is the way to go, or whether any of you have a better solution. I'd also like to know whether there is any issue with configuring DFS Replication on folders that already exist and contain files (I ran into problems on Windows 2003 when I first configured replication on pre-populated folders), or whether I can configure it straightforwardly without hitting any known issues.
The 4 servers run Windows Server 2008 R2 Standard Edition, the website files are rarely modified, and there are no databases in the folders.
Let me know if I'm missing any information that would help you suggest a better solution :)
Thanks!
Regards