Roger,
I have reported this before, but not in many years. One customer, one server with 8 users connecting via remote access.
Not ADS.
This is our message to them.
Your IT person needs to review the situation. When this occurs, from what we see and have seen in the past, a file, when updated on the server, saves to one block instead of its natural size. That one block on the hard drive is 4,294,967,582 bytes (may vary slightly). The file is then unreadable (it may have had 200,000 records but shows ***** as the number of records).
In this situation, if they report the error immediately, the file is OK, it just has millions of blank records. If they keep trying to update or add records, the file cannot be read, and they have to restore it from their latest backup, usually from the prior day. xdot, nothing else, will open it.
We have not seen this in years, but did in the past. The IT person would eventually replace something, never tell us what is done, and no further issues.
It appears to occur when multiple records are being added, like a batch posting. Today's issue: 3 files being updated are being corrupted, sometimes 1, sometimes all 3, to the same size. Then we restore (on this batch run a backup is done first, and a restore for the batch is on the menu).
Just wondering if anybody has run across this save-file issue and the one-block size. (Memory, operating system, bad hard drive?)
Happy Holidays
Fred
Omni
Updating file issues
Re: Updating file issues
Bob Volz used to get the problem of a million blank records.
We haven't seen this in years.
The eXpress train is coming - and it has more cars.
- sdenjupol148
- Posts: 151
- Joined: Thu Jan 28, 2010 10:27 am
- Location: NYC
Re: Updating file issues
Fred,
I had Bob switch to ADS and the problem went away.
If that's not possible, since this is part of a batch post, you could try putting in a Sleep() every few records.
This way you aren't pushing appends or changes too fast.
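Bobby's throttling idea could look something like this, a minimal Xbase++-style sketch. The table name, field, and the every-50-records / 1/10-second pacing are illustrative assumptions, not values from the thread:

```
// Sketch: throttle a batch post by pausing briefly every few appends
// so the server and network redirector can keep up with the writes.
PROCEDURE PostBatch( aAmounts )
   LOCAL i
   USE invoice SHARED NEW          // hypothetical table
   FOR i := 1 TO Len( aAmounts )
      APPEND BLANK
      REPLACE invoice->amount WITH aAmounts[ i ]
      IF i % 50 == 0               // assumed interval; tune per site
         Sleep( 10 )               // Xbase++ Sleep() waits in 1/100-sec units
      ENDIF
   NEXT
   DbCommit()
   CLOSE invoice
RETURN
```

The interval and pause length would need tuning at each site; the point is only to stop pushing appends back-to-back.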
At certain clients we support, we have found that it isn't always a server issue however.
It could be the ethernet switch, line or connection.
Bobby
- Eugene Lutsenko
- Posts: 1649
- Joined: Sat Feb 04, 2012 2:23 am
- Location: Russia, Southern federal district, city of Krasnodar
- Contact:
Re: Updating file issues
4,294,967,582 is, to high accuracy, 4 GB (2^32 = 4,294,967,296 bytes). Perhaps this is the maximum file size supported in CLIPPER. I think you need to do error handling when adding a record, and as soon as an error occurs, start filling a new file. Then keep a catalog of such files. Or really switch to ADS, where there is no such limit on file size.
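Eugene's rollover idea could be sketched as below, using the standard Directory() function to check the DBF's size before appending. The 4,000,000,000-byte guard value, the file names, and the RolloverAppend() helper are assumptions for illustration:

```
#include "directry.ch"     // defines F_SIZE for Directory() arrays

// Return the size of a file in bytes, or 0 if it does not exist.
FUNCTION FileBytes( cFile )
   LOCAL aDir := Directory( cFile )
RETURN iif( Empty( aDir ), 0, aDir[ 1 ][ F_SIZE ] )

// Sketch: before a batch append, roll over to a fresh DBF while
// still safely below the ~4 GB (2^32) limit, and record the new
// file in a catalog table so readers can find all the pieces.
PROCEDURE RolloverAppend( cBase, nSeq )
   LOCAL cDbf := cBase + LTrim( Str( nSeq ) ) + ".dbf"
   IF FileBytes( cDbf ) > 4000000000        // assumed guard value
      nSeq++
      cDbf := cBase + LTrim( Str( nSeq ) ) + ".dbf"
      // create the next file with the same structure and log it
      // in the catalog here (details depend on the application)
   ENDIF
   USE ( cDbf ) SHARED NEW
   APPEND BLANK
   // ... REPLACE fields, then:
   DbCommit()
   CLOSE
RETURN
```

This only avoids hitting the limit; it does not explain why a file would jump straight to ~4 GB, which still points at something between the workstation and the disk.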
- sdenjupol148
- Posts: 151
- Joined: Thu Jan 28, 2010 10:27 am
- Location: NYC
Re: Updating file issues
Hi Eugene,
You are correct that there is a file size limit.
But just for informational purposes, ADS does not have a limit.
We had at one time a 32gig file size with no appreciable performance issues.
Roger has since scaled it back to a manageable 12 gigs.
But without ADS I don't think it's possible.
Bobby