Channel: File Services and Storage forum

access copied files

Our company uses Active Directory roaming profiles for Windows logins. Lately we discovered that if we copy files from our servers to a local machine, we can only open them while connected to our intranet. Once we disable the network or disconnect from the intranet, we cannot open the files. This affects all Office file types and also PDFs, but we can still open .txt files in Notepad. What I want to ask is: how can this happen? I tried the inheritance method, but it is not working. What should I do? Should we create another group policy, or something else? I hope someone can help. Thank you.

Administrator cannot take ownership of files


I have a Windows 2008 R2 file server. Some files seem to have lost all permissions. I have tried several methods to take ownership of these files so that I can restore the proper permissions. Here is what I have tried while logged in as the local administrator and as a domain administrator (all result in "access denied"):

  • The standard take-ownership method via the GUI.
  • Running explorer.exe as Administrator, then performing the usual take-ownership method.
  • Using the TAKEOWN command (both on the individual file and on the parent directory with /R) in an administrative command prompt.
  • Using the TAKEOWN command (both on the individual file and on the parent directory with /R) in a command prompt running as the "SYSTEM" user ("psexec -i -s -d cmd" from an administrative command prompt).
  • Trying all of this with UAC off and on.
  • Trying to change permissions with icacls in an administrative command prompt, and again in a SYSTEM command prompt.
  • Trying to copy the files using various accounts.
  • Trying to take ownership via the network share as a domain administrator.
  • Double-checking the share permissions.
  • Taking ownership of the parent folder with administrative explorer.exe and checking "Replace owner on subcontainers and objects".

While I could probably restore the files from backup, the problem remains that I won't be able to remove or overwrite the junk files. Also, there are no previous versions of these files in Shadow Copy.

Any other ideas besides rescue disks, chkdsk, and booting to safe mode (I can't bring the server down during the day)?

EDIT: I have also ensured that nobody has the file opened by using the Share and Storage Management tool.
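For reference, a condensed sketch of the command-line sequence described above; the file path is a placeholder, not from the original post:

    # Take ownership as the Administrators group, then reset the ACL to inherited defaults
    takeown /F "D:\Data\brokenfile.docx" /A
    icacls "D:\Data\brokenfile.docx" /setowner "BUILTIN\Administrators"
    icacls "D:\Data\brokenfile.docx" /reset

If even these fail with "access denied" from a SYSTEM prompt, the security descriptor itself may be damaged, which is where the offline options already mentioned (rescue disks, chkdsk) come in.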

ADFS/WorkFolders keeps requiring all client machines to enter password


I configured ADFS with a WAP reverse proxy in front of a back-end Work Folders server. I configured the system with proper third-party certificates, and everything connects fine both internally and externally. However, after some amount of time, usually less than a couple of days, client machines ask for the current password. On domain-joined machines, if I open the Work Folders control panel manually and click the credential manager link, I am prompted to enter the password; the user name is already saved, and entering the password allows the machine to sync. On workplace-joined machines, I am presented with the ADFS login page, where I have to enter the username and password; sync works again once the password is entered. The problem is that neither the domain-joined internal machines nor the workplace-joined external machines will keep the password for more than a day. I even had one machine prompt for the password within about 20 minutes of re-entering the credentials.

All machines, including the ADFS server, WAP server, Work Folders server, and the clients, are fully updated. Client machines are Windows 7 Pro and Windows 10. I need the Work Folders configuration on the client machines to never prompt for a password; otherwise this utility is pointless.
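One avenue that may be worth checking, as a hedged sketch only (standard ADFS PowerShell module assumed; the 480-minute value is just an example, not a recommendation from the post), is the ADFS single-sign-on token lifetime, since an expiring SSO token can show up as renewed credential prompts:

    # Inspect the current SSO token lifetime (in minutes)
    Get-AdfsProperties | Select-Object SsoLifetime
    # Example only: lengthen the SSO lifetime to 8 hours
    Set-AdfsProperties -SsoLifetime 480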


DFS and Windows Explorer

When a folder is viewed through a drive letter mapped to the DFS share name, Windows Explorer will not display the "Owner"; it shows the SID instead. The owner displays fine if you browse directly to the underlying Windows share. (To reproduce this, first set Windows Explorer to display the "Owner" column.)

BSOD - Windows Server 2008 X86


I have just had 3 random restarts on a production system.

The Errors were as follows.

Dump file: Mini042516-01.dmp (C:\Windows\Minidump\Mini042516-01.dmp, 175,984 bytes, written 25/04/2016 09:55:02)
Crash time: 25/04/2016 09:51:59
Bug check: PAGE_FAULT_IN_NONPAGED_AREA (0x10000050)
Parameters: 0x99705048, 0x00000000, 0x81dc32bf, 0x00000000
Caused by: ntoskrnl.exe (NT Kernel & System, Microsoft® Windows® Operating System, Microsoft Corporation, 6.0.6002.19503 (vistasp2_gdr.150926-0606), 32-bit)
Crash address: ntoskrnl.exe+17c2bf
Stack: ntoskrnl.exe+17c2bf, ntoskrnl.exe+18281a, ntoskrnl.exe+182e48, ntoskrnl.exe+17d86e

I have had a look on the net but am not finding much information regarding 0x10000050.

There are lots of hits for 0x00000050 suggesting faulty RAM or a corrupted NTFS volume. Can I check anything else to help isolate the problem?

Best Regards

Al.
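A hedged sketch of one way to pull more detail out of the minidump, assuming the Debugging Tools for Windows are installed (the install path below is an assumption, and symbols need to resolve via _NT_SYMBOL_PATH for readable output):

    # Run an automated crash analysis against the minidump, then quit
    & "C:\Program Files\Debugging Tools for Windows (x86)\kd.exe" `
        -z "C:\Windows\Minidump\Mini042516-01.dmp" -c "!analyze -v; q"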

File Sharing


Hi,

I want to share a folder named "Share". Under this shared folder I have created subfolders, one per department. Users should be able to create files and folders inside any department folder, but not directly in the parent "Share" folder. The OS is Windows Storage Server 2008 R2 with AD.

I tried, but it is not working. Please help.


Shiju Chacko Stay Happy Stay Ahead :)
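One possible NTFS layout for this, as a hedged sketch only; the domain, group names, and paths below are placeholders, not from the original post:

    # Parent share: users may only list and traverse it (no create rights here;
    # with no inheritance flags, the entry applies to this folder only)
    icacls "D:\Share" /grant "DOMAIN\Domain Users:RX"
    # Each department folder: modify rights that inherit to files and subfolders
    icacls "D:\Share\Accounts" /grant "DOMAIN\Accounts-Users:(OI)(CI)M"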

Storage Spaces, physical and logical sector sizes in the primordial pool


Hello,

I'm trying to think of how to word this so it doesn't get overly complicated, but I don't know if I can. Let me start with this: what happens when you have disks with different physical sector sizes in the same pool? If I have a primordial pool initially comprised of disks with a physical sector size of 512 bytes, and I later add a disk with a 4K physical sector size, does that disk automatically get set to a logical sector size of 512 bytes?

Which leads to the follow-up: should I keep disks with different sector sizes in different pools, and then specify the virtual disks in the 4K pool to have 4K clusters? And in that situation, will Windows automatically set the logical sector size to 4K (or do I have to do something to make sure the logical sector size matches?), and keep the virtual disk aligned, as long as I set the cluster size?

Thank you!
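A read-only way to see what sector sizes the disks and the resulting virtual disks actually report, sketched with the standard Storage module cmdlets (nothing pool-specific is assumed beyond what the post describes):

    # Physical and logical sector size reported by each physical disk
    Get-PhysicalDisk | Select-Object FriendlyName, PhysicalSectorSize, LogicalSectorSize
    # And what the virtual disks created from the pool ended up with
    Get-VirtualDisk | Select-Object FriendlyName, PhysicalSectorSize, LogicalSectorSize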

Missing files on secondary DFSR server


Hi

We have had a problem since migrating all our servers to Windows 2012 R2.

We have 2 servers with DFS replication enabled.

We have a two way replication enabled between the two.

The issue is:

Some files are missing on the secondary server (the last among all targets). There is nothing in the event log that mentions these files. The files don't have the temporary attribute, they are not locked for writing, and they are not open by anyone.

What is weird is that if we open one of the files on the primary server and just close it without making any modification, it then replicates to the secondary server.

Those files aren't in the ConflictAndDeleted folder on any of the servers.

Help us please!
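A hedged sketch of two checks that can help narrow this down; the replication group, folder, server, and file names are placeholders:

    # Show whether the missing files are sitting in the replication backlog
    dfsrdiag backlog /rgname:"ReplGroup" /rfname:"Data" /smem:PRIMARY-FS /rmem:SECONDARY-FS
    # Compare the DFSR hash of one affected file on each server
    dfsrdiag filehash /path:"D:\Data\missingfile.docx"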


Mapped drive documentation

Is there any Microsoft documentation that mentions it is not recommended to install applications [heavy custom apps] onto a mapped network drive?

Ayan Mullick Principal Infrastructure Engineer. British Petroleum.

DFS Replication not working - The service has automatically initiated the journal loss recovery process


Hello,

We have 2 Servers with Windows 2012 R2 which are replicating files using DFS-R.
It has now happened twice that replication stopped working and had to be rebuilt. The problem started after event ID 2204:

The DFS Replication service has detected an NTFS change journal loss on volume D:. The service has automatically initiated the journal loss recovery process.

After that we had to reboot the server, and DFS-R started to recover from a dirty database shutdown.
Once the database was repaired, DFS-R had to work through a backlog of more than 1 million files. It took about 3-4 days until both DFS-R servers were in sync again.

I didn't find much information about event ID 2204, but this seems to be our root cause.

I hope someone can help.

Regards,

Philipp
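For reference, a hedged sketch of the recovery knobs that usually accompany the journal-loss and dirty-shutdown events (the volume GUID is a placeholder; the dirty-shutdown event text supplies the real one, and the registry change should be weighed carefully before use):

    # Resume replication after a dirty database shutdown on the affected volume
    Get-WmiObject -Namespace "root\microsoftdfs" -Class DfsrVolumeConfig |
        Where-Object { $_.VolumeGuid -eq "<GUID>" } |
        ForEach-Object { $_.ResumeReplication() }
    # Optionally let DFSR auto-recover from dirty shutdowns instead of pausing (0 = auto-recover)
    reg add "HKLM\System\CurrentControlSet\Services\DFSR\Parameters" /v StopReplicationOnAutoRecovery /t REG_DWORD /d 0 /f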

Grant user right to read file/folder NTFS permissions but not read the file itself


Hi there,

There's a somewhat complicated request from the IT Security team:

Shared Folder on Win 2012 R2 Fileserver has a set of NTFS permissions for various groups/users.

ITSec group needs access to the shared folder to audit NTFS permissions (view security tab in properties of a file/folder)

But at the same time, the ITSec group must not be able to read the actual file content (e.g. open a .docx, .pdf, or anything else).

Trying to get my head around this, but so far no luck.

Any help appreciated
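A hedged sketch of how this is sometimes carved up with specific icacls rights; the group name and path are placeholders, and it is worth validating on a test folder whether the Security tab behaves as expected with only these rights:

    # Folders: allow listing, traversal, and reading the ACL (this folder and subfolders only)
    icacls "D:\Share" /grant "DOMAIN\ITSec:(CI)(RD,X,RC)"
    # Files: allow reading attributes and the ACL only, not the contents (inherit-only onto files)
    icacls "D:\Share" /grant "DOMAIN\ITSec:(OI)(IO)(RC,RA)"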

Why am I having access denied running a scheduled task for Write-DFSRHealthReport?


Hi everyone,

I need your help understanding what permissions my service account needs in order to run a scheduled task that queries all of our DFSR member servers:

$groups = 'Group1','Group2'
foreach ($group in $groups) {
    $members = 'MainServer','Server1','Server2'
    $report_params = @{
        'GroupName'             = $group
        'ReferenceComputerName' = 'MainServer'
        'MemberComputerName'    = $members
        'Path'                  = 'c:\dfsreports\'
        'CountFiles'            = $true
    }
    Write-DfsrHealthReport @report_params -Verbose
}

This is part of the script. After that it emails me the results.

This is set up in a Scheduled Task, running as that specific Service Account I mentioned earlier.

When I receive the email, I get:

  • For MainServer: "Not a local administrator. Access denied" (but the account is a local administrator; I have verified that).
  • Also for MainServer: "Cannot retrieve version vectors from this member due to a Windows Management Instrumentation (WMI) error."
  • For Server1 and Server2: "Cannot connect to reporting DCOM Server. Access Denied".
  • Also for Server1 and Server2: "Due to the following error, the DFS Replication reporting mechanism cannot access the WMI (Windows Management Instrumentation) namespace to retrieve certain reporting information. Error ID: 0x80041002."

I've seen instructions somewhere to go to Component Services -> Computer -> Properties and grant permissions to the service account, without any luck. I've also ensured that the user is a local admin on the servers, and yet I still get these errors.

The funny thing is that, if I run the scheduled task with my domain admin account, I get the whole results without a problem.

If I open a command line as that service account and run the script, it gives me more output than it does as a scheduled task, but it still produces the last error listed above.

The account is allowed to log on as a batch job, and the scheduled task itself is set to run with the highest privileges.

Thanks in advance for any help.

Nuno Silva


Help with file services design - needs to be highly available yet simple - using server 2012 R2


Hello out there -

I'm looking for some input on one of my projects.  We have file shares scattered around storage systems and various aging servers.  We're looking to consolidate all of these shares and add features from server 2012 as well.

Some background: We're a VMware shop running off a Fibre Channel SAN, and we use Veeam for backup. There are about 750 clients, primarily Windows 7. The shares consist mostly of MS Office type files, some FoxPro apps, and a few Access databases. While the file shares are not exactly mission critical, I would like to maintain 99.9% uptime or greater and minimize maintenance downtime. Basically, if I am going to consolidate shares, I need this to be nearly maintenance free and of little impact to users when there is maintenance or troubleshooting to be done. Some of our departments are 24x7; that said, there are some maintenance windows to work with. I'd like to still back up normally with Veeam and retain most of the features I normally get in VMware, such as vMotioning servers around. I also want to enable deduplication and shadow copies with Server 2012.
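As a hedged sketch of the Server 2012 side of that last point (the drive letter and shadow-storage sizing are assumptions, not recommendations):

    # Data Deduplication (requires the Data Deduplication role service on the file server)
    Enable-DedupVolume -Volume "D:"
    # Shadow copies for shared folders: reserve shadow-copy storage on the same volume
    # (use "resize shadowstorage" instead if an association already exists)
    vssadmin add shadowstorage /for=D: /on=D: /maxsize=10%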

Some solutions come to mind:

  • Cluster services - certainly would reduce downtime for things like Windows updates, but has restrictions when virtualized. And in some scenarios Veeam won't be able to back it up because it can't take snapshots.
  • DFS - some people seem to love this and others hate it. It looks like people have file-lock issues, although these can be overcome with some priority setting changes? It seems I would be able to do Windows updates and the DFS replicas would continue to serve file shares. My worry here is that all clients need to be DFS aware.
  • Server Core with multiple standalone servers - aside from learning to work with Core, this would be the simplest way, and I could spread the impact of downtime by creating 2 or 3 "main" file servers. Running Core should reduce the need to reboot for Windows updates. In the event of an OS failure there would be an outage, but I could have a backup online quickly using Veeam.

I have not used a cluster for a file server before, nor have I configured DFS, and I only just started playing with 2012 Server Core.

Someone else out there must have had to make this decision too.  What did you decide on and why?  What are your experiences with DFS being deployed at this scale?  Is management of a file server a real pain with server Core?  Is there a better solution?

I welcome your comments and input.  Thanks!

Weird file transfer

We have been doing some large file transfers, and each one has been chugging along at 40-50 MB/s, then all of a sudden, about halfway through, it drops down to 1 MB/s. Has anyone experienced something like this before? I've been trying to do some research and it seems it may be associated with our Dell RAID card, the H700, but I wanted to get your opinions. Thanks.

Storage Spaces


Hey,

I currently have a ZFS-based file server (Illumos).  I have a disk pooling setup of ZFS' equivalent of RAID 6 (2 drive failures), and have 2 of them grouped for effectively RAID 60.  So 2x 6-drive RAID-Z2's wrapped in a RAID 0, which gives me 1 pool.

Can Storage Spaces (in Win2K12R2) do this?  I'd like to move my ZFS install to Storage Spaces but this is critical.  From a quick look, it looks like I'd end up with 2 distinct RAID groups in Storage Spaces.

What I'd like to end up with is:

2-way mirror with SSD write back
2-way mirror with SSD write back
2-way mirror with SSD write back
2-way mirror with SSD write back
2-way mirror with SSD write back

and wrap the lot in whatever Storage Spaces defines as RAID 0.  So one big pool made up of 5 2-way mirrors, each with its own SSD.  So that would be 15 disks - 10 spinning and 5 SSD.

Thanks
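A hedged sketch of how that layout might map onto Storage Spaces cmdlets; disk selection, names, and the cache size below are assumptions rather than a tested configuration:

    # Pool every available disk (10 HDD + 5 SSD in this scenario)
    $disks = Get-PhysicalDisk -CanPool $true
    New-StoragePool -FriendlyName "Pool1" -StorageSubSystemFriendlyName "*Storage Spaces*" -PhysicalDisks $disks
    # One mirrored space striped across 5 columns = 5 two-way mirror pairs,
    # with an SSD-backed write-back cache
    New-VirtualDisk -StoragePoolFriendlyName "Pool1" -FriendlyName "Data" `
        -ResiliencySettingName Mirror -NumberOfDataCopies 2 -NumberOfColumns 5 `
        -WriteCacheSize 8GB -UseMaximumSize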



Creating volumes on a disk located in a storage pool


What is the difference between New-Volume and New-Partition cmdlets?

I tried to create a volume on a disk that is online with the New-Volume cmdlet, but it says "No MSFT_StoragePool objects found", although I have typed the name of my storage pool correctly.

The New-Partition cmdlet works, but I assume it doesn't care about any storage pools.

In addition, I have to format the volume manually after creating the partition; I suppose New-Volume does that automatically.


Freddy
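For comparison, a hedged sketch of the two paths as I understand the cmdlets (pool name, size, disk number, and drive letter are placeholders): New-Volume works against a storage pool and creates, partitions, and formats a space in one step, while New-Partition only carves a partition out of an already-provisioned disk and leaves formatting to Format-Volume.

    # Pool path: create and format a 100 GB space straight from the storage pool
    New-Volume -StoragePoolFriendlyName "Pool1" -FriendlyName "Data" -Size 100GB -FileSystem NTFS
    # Disk path: partition an existing (or already-created virtual) disk, then format it
    New-Partition -DiskNumber 2 -UseMaximumSize -DriveLetter F |
        Format-Volume -FileSystem NTFS -NewFileSystemLabel "Data2"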

Server 2012R2 - Extend Spanned Volume: There is not enough space available on the disk(s)...


Hi all,

I have a problem with a spanned volume on a Server 2012R2:

We have a spanned volume over 7 disks (260,101.19 GB in total) and I want to expand this volume (with the rest of the 7th disk and 1 new disk). Each time I try to expand the volume I get the following error:

There is not enough space available on the disk(s) to complete this operation.

I tried using the GUI and diskpart, same error.

Some information:

- All disks are dynamic disks

- Volume is formatted with ReFS

Hope you can help me!

Greetings,

Thomas

Work Folders won't sync existing files

I have set up Work Folders using an existing user share, configured with the alias method. If we connect a client machine to the Work Folder and create a file on the client, it successfully syncs that file to the server and it shows up in the existing folder for that user. However, there are many existing files in the share that will not show up in the Work Folder. I have used the command Set-SyncServerSetting -MinimumChangeDetectionMins 1 to change the enumeration interval for files created outside of the Work Folders client, but it is not working. Is there a way to troubleshoot why existing files will not appear in the Work Folder? Thank you!
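A hedged, read-only starting point, assuming the standard SyncShare module on the server, is to confirm what the server thinks its settings and share mappings actually are:

    # Server-wide sync settings, including MinimumChangeDetectionMins
    Get-SyncServerSetting
    # Per-share configuration (path, user folder naming, staging area, and so on)
    Get-SyncShare | Format-List *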

DFSR Monitoring


Hi all ,

Does anyone know the best tools to monitor the DFSR link between a branch site and the data center, with complete details?
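Short of a third-party monitoring suite, a hedged sketch of the built-in options; the group, folder, and server names are placeholders:

    # Per-connection backlog between two members
    Get-DfsrBacklog -GroupName "RG-Branch" -FolderName "Data" `
        -SourceComputerName "BRANCH-FS" -DestinationComputerName "DC-FS" -Verbose
    # Scheduled HTML health report covering backlog and errors
    Write-DfsrHealthReport -GroupName "RG-Branch" -ReferenceComputerName "DC-FS" -Path "C:\dfsreports"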

Dedicate Space to Folder


Hi everybody

We have a shared folder on a file server that contains several subfolders, one per department. Unfortunately the drive is getting full and we must move some data to free up space. My question: is it possible to dedicate 500 GB to a specific shared folder on the drive, guaranteeing that even if the other folders fill up, this one would not run out of space? I would appreciate it if you could answer my question.

Thank you
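NTFS cannot reserve free space for a single folder, but one hedged workaround (assuming the File Server Resource Manager role service is installed; the paths and sizes below are placeholders) is to put hard FSRM quotas on the sibling department folders so that roughly 500 GB always remains available for the folder that matters:

    # Cap each of the other department folders with a hard quota
    New-FsrmQuota -Path "D:\Share\DeptA" -Size 2TB -Description "Cap to preserve headroom"
    New-FsrmQuota -Path "D:\Share\DeptB" -Size 2TB -Description "Cap to preserve headroom"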

