Object Serialization Corruption (File Bytes Set to NULL)
Question
-
I am developing a WinForms application that programs various hardware controls within a motherboard chipset. The goal of the application is to identify the electrical/timing margin available for each control. The controls are accessed via PCI configuration space.
The application will hard-lock or BSOD the system from time to time. I have a recovery mechanism in place that allows me to automatically reboot the system and recover. I am using object serialization to persist my recovery information to disk. See example below.
public void SaveRecoveryState(string path, ObjectType obj)
{
    FileStream fs = null;
    BinaryFormatter bf = new BinaryFormatter();
    try
    {
        fs = new FileStream(path, FileMode.Create);
        bf.Serialize(fs, obj);
    }
    catch
    {
        // Do Nothing
    }
    finally
    {
        if (fs != null)
            fs.Close();
    }
}
public ObjectType GetRecoveryState(string path)
{
    FileStream fs = null;
    BinaryFormatter bf = new BinaryFormatter();
    ObjectType obj = null;
    try
    {
        fs = new FileStream(path, FileMode.Open);
        obj = (ObjectType)bf.Deserialize(fs);
    }
    catch
    {
        // Do Nothing
    }
    finally
    {
        if (fs != null)
            fs.Close();
    }
    return obj;  // null if deserialization failed
}
I have some controls that do not cause the system to hard-lock. While testing these controls, the recovery object is serialized/deserialized correctly to disk. I can recover using one of these saved recovery files.
I am seeing that ~85% of the time a hard-lock occurs, my recovery files contain all null bytes after the reboot. The file size is identical between a good serialized file and a corrupted one. I have verified that the recovery files on disk are OK prior to programming the hard-locking control, but once the platform hard-locks the files contain all null values.
I have tried the following:
1. Serialized multiple files to disk, up to 12. Often more than 50% of the files are corrupted, and sometimes all of them are, so I lose all of my data.
2. Added a delay of several seconds between serializing the recovery file to disk and programming the hard-locking PCI control. The issue still occurs.
3. Copied the serialized files as backups using File.Copy(), in the hope that if the primary file is corrupt I can load the backup. In this case both files get corrupted.
4. Forced garbage collection using GC.Collect()/GC.WaitForPendingFinalizers().
I am at my wits' end. I have been debugging this for almost a week now and am making zero progress. If anyone has any idea why I am seeing this issue, I would greatly appreciate it.
Thanks
Answers
-
There are some other options to try.
1. Do not use the WriteThrough option.
2. Call FileStream.Flush immediately after writing all of your data.
3. After calling FileStream.Flush, use P/Invoke to call FlushFileBuffers:

using System;
using System.ComponentModel;
using System.IO;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

namespace ConsoleApplication
{
    class Program
    {
        static void Main(string[] args)
        {
            using (FileStream fs = new FileStream("test.txt", FileMode.CreateNew))
            {
                // Write to fs here.

                fs.Flush();

                if (!FlushFileBuffers(fs.SafeFileHandle))
                    throw new Win32Exception();
            }
        }

        [DllImport("kernel32.dll", SetLastError = true)]
        [return: MarshalAs(UnmanagedType.Bool)]
        private static extern bool FlushFileBuffers(SafeFileHandle hFile);
    }
}

- Marked as answer by jb_cpe Wednesday, May 6, 2009 3:14 AM
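For reference, folding that flush sequence into the SaveRecoveryState method from the question might look roughly like this. This is a sketch, not the poster's actual code; the exception swallowing from the original is dropped so write failures are visible:

using System.ComponentModel;
using System.IO;
using System.Runtime.InteropServices;
using System.Runtime.Serialization.Formatters.Binary;
using Microsoft.Win32.SafeHandles;

public void SaveRecoveryState(string path, ObjectType obj)
{
    BinaryFormatter bf = new BinaryFormatter();
    using (FileStream fs = new FileStream(path, FileMode.Create))
    {
        bf.Serialize(fs, obj);

        // Drain the FileStream's managed buffer to the OS...
        fs.Flush();

        // ...then force the OS to commit its cached pages to the
        // physical disk so the bytes survive a hard-lock or BSOD.
        if (!FlushFileBuffers(fs.SafeFileHandle))
            throw new Win32Exception();
    }
}

[DllImport("kernel32.dll", SetLastError = true)]
[return: MarshalAs(UnmanagedType.Bool)]
private static extern bool FlushFileBuffers(SafeFileHandle hFile);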
All replies
-
The data might not have been fully written to disk at the time the BSOD occurred.
When you construct the FileStream you use for writing, pass FileOptions.WriteThrough to bypass the write cache. Keep in mind that this will slow down the writing.
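For example, a minimal sketch; the 4096 buffer size is just the FileStream default, and path is whatever you already pass to SaveRecoveryState:

FileStream fs = new FileStream(
    path,
    FileMode.Create,
    FileAccess.Write,
    FileShare.None,
    4096,                       // default buffer size
    FileOptions.WriteThrough);  // bypass the OS write cache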
Even with this option, you could still be unlucky. You may need to consider keeping the last n recovery files and having the application work backwards to find a good one (if this is reasonable for your application).
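A rough sketch of that idea, reusing the SaveRecoveryState/GetRecoveryState methods from the question; the file count, folder handling, and naming pattern are assumptions, not part of the original application:

using System;
using System.IO;

public class RecoveryFileRotator
{
    private const int FileCount = 5;   // how many files to keep (assumed)
    private readonly string folder;
    private int nextSlot;

    public RecoveryFileRotator(string folder)
    {
        this.folder = folder;
    }

    public void Save(ObjectType obj)
    {
        // Overwrite the oldest slot in round-robin order.
        string path = Path.Combine(folder,
            string.Format("recovery{0}.bin", nextSlot));
        SaveRecoveryState(path, obj);   // method from the question
        nextSlot = (nextSlot + 1) % FileCount;
    }

    public ObjectType LoadNewestGood()
    {
        string[] files = Directory.GetFiles(folder, "recovery*.bin");

        // Sort newest first, so the most recent snapshot is tried
        // before falling back to older ones.
        Array.Sort(files, delegate(string a, string b)
        {
            return File.GetLastWriteTimeUtc(b)
                       .CompareTo(File.GetLastWriteTimeUtc(a));
        });

        foreach (string path in files)
        {
            ObjectType obj = GetRecoveryState(path);  // from the question
            if (obj != null)
                return obj;     // first file that deserializes cleanly
        }
        return null;            // nothing usable survived the crash
    }
}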
-
See this page.
http://msdn.microsoft.com/en-us/library/system.io.filestream.filestream.aspx
Choose any of the constructors that take FileOptions.
-
Thanks BinaryCoder!!!
This solution seems to be working. Initial QA looks promising. After 10 hard-locks, the system was able to recover each time successfully.
No file corruption has been observed thus far. I will continue QA, but this solution looks very promising.
The execution time is ~300ms per file serialization, which is good.
Thanks again.