Thursday, July 22, 2010

Fear and Loathing in Serialization

In an internal application, we use .NET Remoting to pass objects back and forth with a data-access tier. Without going too deeply into the reasons why we selected Remoting instead of web services (and no, there was no such thing as WCF back then…), I think it worthwhile to discuss an exception that occurred after migrating our serialized objects from .NET 3.5 to 4.0.

In this application, methods to retrieve and save typed datasets are called like this:

Dim tm As New TradeManager
'populate a typed dataset "TradeObj" with data from the remoted method GetTrade
Dim TradeObj As TradeObject = tm.GetTrade(TradeID)
'Make form-based changes to TradeObj
'...
'then pass it back to the remoted save method
tm.SaveTrade(TradeObj)
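
Since tm is created with New but the calls ultimately land on a server-activated object, it's worth noting how that typically gets wired up on the client. The sketch below is one plausible arrangement, not our actual configuration; the host name, port, and object URI are placeholders, and it assumes TradeManager itself is the remoted type:

Imports System.Runtime.Remoting
Imports System.Runtime.Remoting.Channels
Imports System.Runtime.Remoting.Channels.Tcp

'Client-side startup (sketch): registering TradeManager as a well-known client type
'makes "New TradeManager" return a transparent proxy to the server-activated object.
ChannelServices.RegisterChannel(New TcpClientChannel(), False)
RemotingConfiguration.RegisterWellKnownClientType( _
    GetType(TradeManager), "tcp://appserver:8085/TradeManager.rem")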

The process of changing the target framework to 4.0 was straightforward, of course. Indeed, initial testing—running the server objects locally (not remoted) from within Visual Studio—showed communication between client and server objects to be working normally. However, once the server side objects were placed in a Remoting configuration (i.e. server-activated single-call over a TcpChannel), we soon noticed a security exception being “thrown by the target of an invocation.” It seemed that the typed dataset was generating a permissions error when passed as a parameter to the SaveTrade() method.

InnerException:
System.Security.SecurityException
GrantedSet=""
Message=Request failed.
PermissionState=<PermissionSet class="System.Security.PermissionSet" version="1" Unrestricted="true"/>
RefusedSet=""
Source=mscorlib
Url=""

From the call stack of the inner exception, it was clear that the typed dataset being passed into the remote method was failing to deserialize on the server.

at System.Array.InternalCreate(Void* elementType, Int32 rank, Int32* pLengths, Int32* pLowerBounds)
at System.Array.CreateInstance(Type elementType, Int32 length)
at System.Data.DataTable.NewRowArray(Int32 size)
at System.Data.Index.GetRows(Range range)
at System.Data.DataColumn.IsNotAllowDBNullViolated()
at System.Data.DataSet.EnableConstraints()
at System.Data.DataSet.set_EnforceConstraints(Boolean value)
at System.Data.Merger.MergeDataSet(DataSet source)
at System.Data.DataSet.Merge(DataSet dataSet, Boolean preserveChanges, MissingSchemaAction missingSchemaAction)
at CustomObjects.TradeObject..ctor(SerializationInfo info, StreamingContext context)

In this case, the Merge() that's occurring is part of the server's process of reconstructing the object. You'll notice that the call stack originates in the typed dataset's serialization constructor. That's because, in order to deserialize a dataset, the framework first creates an empty instance of the object's type on the receiving side and then populates its rows with the serialized data.
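
For the curious, the serialization constructor that the dataset designer generates follows roughly the pattern below. This is a heavily simplified sketch (the real generated code also handles binary serialization and SchemaSerializationMode); it lives inside the generated TradeObject class, and its Merge() call is the one that appears in the stack trace:

'Simplified sketch of a typed dataset's generated serialization constructor
'(member of the generated TradeObject class; needs System.Data, System.Xml,
'System.IO, and System.Runtime.Serialization).
Protected Sub New(ByVal info As SerializationInfo, ByVal context As StreamingContext)
    MyBase.New(info, context, False)
    'Rebuild the schema that was captured when the dataset was serialized...
    Dim strSchema As String = CType(info.GetValue("XmlSchema", GetType(String)), String)
    Dim ds As New DataSet()
    ds.ReadXmlSchema(New XmlTextReader(New StringReader(strSchema)))
    '...merge it into this (still empty) typed instance...
    Me.Merge(ds, False, MissingSchemaAction.Add)
    '...and then load the serialized row data.
    Me.GetSerializationData(info, context)
End Sub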

Curiously, the server seemed perfectly able to construct a TradeObject, populate it with values from SQL Server, and serialize it to the client during the GetTrade() call. Only when the object (even unmodified) was passed back to SaveTrade() would the exception be raised.

After much experimentation (and a phone call to Microsoft!), it turned out that the object no longer met the criteria for the default ("Low") level of automatic deserialization (see http://msdn.microsoft.com/en-us/library/5dxse167(vs.71).aspx).

Although the fix was simple enough…

Original

Private channel As New TcpServerChannel(TCP_CHANNEL)
ChannelServices.RegisterChannel(channel, False)

Fixed

Private channel As TcpServerChannel
'Build the formatter sink explicitly so its TypeFilterLevel can be raised
'from the default (Low) to Full, allowing the typed dataset to deserialize.
Dim provider As New BinaryServerFormatterSinkProvider()
provider.TypeFilterLevel = Runtime.Serialization.Formatters.TypeFilterLevel.Full
Dim props As IDictionary = New Hashtable
props("port") = TCP_CHANNEL
channel = New TcpServerChannel(props, provider)
ChannelServices.RegisterChannel(channel, False)
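
If you register the channel from a .config file rather than in code, the same relaxation can be expressed declaratively with the formatter's typeFilterLevel attribute (the port below is just a placeholder):

<configuration>
  <system.runtime.remoting>
    <application>
      <channels>
        <channel ref="tcp" port="8085">
          <serverProviders>
            <!-- raise the deserialization level from the default (Low) to Full -->
            <formatter ref="binary" typeFilterLevel="Full"/>
          </serverProviders>
        </channel>
      </channels>
    </application>
  </system.runtime.remoting>
</configuration>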

…I was left to wonder: what had changed about the typed dataset to make it fail? Since .NET's default deserialization level supports "Custom types that implement ISerializable and make no other demands outside of serialization," even a weak-named typed dataset should qualify (at least, it used to!). I looked very carefully at the dataset's generated code for differences between the 2.0 and 4.0 versions of the object. Interestingly, I was unable to find any significant changes that would have altered the way the object serialized or described itself.

At this time, I can only conclude that the .NET Framework's criteria for deciding which objects meet the Low deserialization level have changed. I can't confirm that, however, because the article referenced above isn't specific to any version of the Framework. I'm glad to have a working fix for the problem, but I'd sure like to know what I could change about my object (other than giving it a strong name, which has other implications for me) so that deserialization stops treating it as a security threat!

Cheers,
Kerry
