View Full Version : OT synctoy: archive not the problem

J Tiers
04-02-2011, 05:38 PM
Archive bits not set, file dates OK, SyncToy still marks for overwrite.....

Any other contestants in the "why's dat" competition?

04-02-2011, 08:45 PM
SyncToy has known problems in some environments with file timestamps and Daylight Saving Time, the symptom being unnecessary file rewrites:



If your needs are simple enough (i.e., you just need a smart backup of one tree, or a few trees, of folders and files, overall size unimportant), then the DOS XCOPY command may be sufficient. I'm not sure whether every flavor of Windows (Home, Pro, etc.) ships with it (both my XP and Vista systems have it), but I imagine it can be downloaded from Microsoft.

XCOPY /C /V /E /D source-tree destination-tree

will create an initial backup or update an existing one: /E includes subdirectories, /D copies only files newer than the destination copy, /V verifies the size of each copied file, and /C continues copying even if errors occur.

Edit: it is certainly possible that XCOPY might exhibit the same malfunction as SyncToy ...

J Tiers
04-02-2011, 09:46 PM
Mebbe.... but it's gonna take a while to overwrite 11,735 files, cowabunga, I might as well do a full backup.

04-02-2011, 10:12 PM
The DST/file timestamp problem in Windows NT is an old one, though it may have been fixed in Windows 7.

Long discussion of the problem here:


J Tiers
04-02-2011, 10:18 PM
I use XP, which may be related enough to be affected.

04-03-2011, 06:29 AM
Did you give Synkron a look?

It's a very capable, mature, cross-platform synchroniser with a far greater feature set than SyncToy.


J Tiers
04-03-2011, 10:10 AM
Did you give Synkron a look?

It's a very capable, mature, cross-platform synchroniser with a far greater feature set than SyncToy.


I did, but it does not seem to do what I want done.

I do NOT want to "synchronize" folders. I have already been there, and I did NOT LIKE IT. I LOST a bunch of data, I am not actually sure WHAT I lost, and I do NOT want any more of that nonsense.

I looked through the info and did not see an equivalent to the "contribute" option..... That is where each source contributes any NEW stuff, but deletions in one folder are NOT propagated to the other.

That's where I got in trouble before..... I had new data in B, AND in A, and "synching" B to A (or A to B) meant that the new data in one or the other had to be deleted, because it was "obviously old bad data".... it was not present in the "source".

No more of THAT BS.

04-03-2011, 10:50 AM
You have to use whatever backup scheme is appropriate for your environment - but personally, I found that migrating changes to a single master, and smart-updating clones of it (with XCOPY), was as simple and reliable as I could make things.

Previously I had gotten into trouble updating mutually contributory datasets (as you have experienced), and simply changed my usage/backup model in response. That is presumably not an option for you.

J Tiers
04-03-2011, 11:24 AM
and simply changed my usage/backup model in response.

Means what?

Did you stop having multiple sources?

I can't really change the fact that at least two sources contribute new files...... what I need to do is propagate new data from any one of them to the others, for which I currently use a 16 GB memory stick.

I am OK with handling data purging manually myself.... better that way than putting some process I imperfectly understand in charge of it.

Maybe the Synkron program does that; I know SyncToy does it easily, but I didn't find it in the online manual.

The alternative is a server and something like the old Visual SourceSafe, where incremental changes and history are saved and ANY version can be retrieved. That isn't justified, isn't entirely practical here, and would still require backups.

04-03-2011, 02:49 PM
>>>Did you stop having multiple sources?

Yes: I basically restricted myself to using a single PC generally (down from 2 or 3), backing up data on a separate spindle on the most-favored machine. After some time has elapsed (or sensitive data has been acquired), I migrate copies of the data to other spindles on other machines, trying to keep at least 3 copies of the data somewhere in my little universe of hardware.

If I were running a commercial enterprise, I would use better software, tighter procedures, a dedicated RAID server, etc - but I don't want or need that level of complexity.