Thursday, July 22, 2010

Fear and Loathing in Serialization

In an internal application, we use .NET Remoting to pass objects back and forth with a data-access tier. Without going too deeply into the reasons why we selected Remoting instead of web services (and no, there was no such thing as WCF back then…), I think it worthwhile to discuss an exception that occurred after migrating our serialized objects from .NET 3.5 to 4.0.

In this application, methods to retrieve and save typed datasets are called like this:

Dim tm As New TradeManager
'populate a typed dataset "TradeObj" with data from the remoted method
Dim TradeObj As TradeObject = tm.GetTrade(TradeID)
'Make form-based changes to TradeObj
'then pass it back to the remoted save method

The process of changing the target framework to 4.0 was straightforward, of course. Indeed, initial testing—running the server objects locally (not remoted) from within Visual Studio—showed communication between client and server objects to be working normally. However, once the server-side objects were placed in a Remoting configuration (i.e. server-activated single-call over a TcpChannel), we soon noticed a security exception being “thrown by the target of an invocation.” It seemed that the typed dataset was generating a permissions error when passed as a parameter to the SaveTrade() method:
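For context, a server-activated single-call host over a TcpChannel is registered along these lines. This is a hypothetical sketch, not our actual configuration; the port number and "TradeManager.rem" URI are placeholders:

```vbnet
' Hypothetical sketch of server-activated, single-call Remoting setup.
Imports System.Runtime.Remoting
Imports System.Runtime.Remoting.Channels
Imports System.Runtime.Remoting.Channels.Tcp

Module RemotingHost
    Sub Main()
        ' Listen on a TCP channel (port is a placeholder)...
        Dim channel As New TcpServerChannel(8085)
        ChannelServices.RegisterChannel(channel, False)

        ' ...and expose TradeManager as server-activated, single-call:
        ' a fresh server instance services each incoming call.
        RemotingConfiguration.RegisterWellKnownServiceType(
            GetType(TradeManager), "TradeManager.rem",
            WellKnownObjectMode.SingleCall)

        Console.ReadLine() ' keep the host process alive
    End Sub
End Module
```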

PermissionState=class="System.Security.PermissionSet" version="1"

From the call stack of the inner exception, it was clear the typed dataset being passed into the remote method was failing to serialize.

at System.Array.InternalCreate(Void* elementType, Int32 rank, Int32* pLengths, Int32* pLowerBounds)
at System.Array.CreateInstance(Type elementType, Int32 length)
at System.Data.DataTable.NewRowArray(Int32 size)
at System.Data.Index.GetRows(Range range)
at System.Data.DataColumn.IsNotAllowDBNullViolated()
at System.Data.DataSet.EnableConstraints()
at System.Data.DataSet.set_EnforceConstraints(Boolean value)
at System.Data.Merger.MergeDataSet(DataSet source)
at System.Data.DataSet.Merge(DataSet dataSet, Boolean preserveChanges, MissingSchemaAction missingSchemaAction)
at CustomObjects.TradeObject..ctor(SerializationInfo info, StreamingContext context)

In this case, the Merge() that’s occurring is part of the server’s process of reconstructing the object. You'll notice that the call stack originates in the typed dataset's constructor. That's because, in order to deserialize a dataset, the runtime first creates an empty instance of the object's type on the receiving side and then populates its ItemArray properties with the serialized data.
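In rough outline, the serialization constructor involved looks something like this. This is a simplified, hypothetical sketch rather than the actual generated code (the real constructor does considerably more schema work), but it shows where the Merge() at the top of the stack trace comes from:

```vbnet
' Hypothetical sketch of the ISerializable pattern a typed dataset follows.
Imports System.Data
Imports System.IO
Imports System.Runtime.Serialization
Imports System.Xml

<Serializable()>
Public Class TradeObject
    Inherits DataSet

    ' Called by the formatter during deserialization: the instance starts
    ' out empty and is filled in from the SerializationInfo payload.
    Protected Sub New(info As SerializationInfo, context As StreamingContext)
        MyBase.New()
        Dim schema As String = CStr(info.GetValue("XmlSchema", GetType(String)))
        Dim ds As New DataSet()
        ds.ReadXmlSchema(New XmlTextReader(New StringReader(schema)))
        ' The Merge() seen in the stack trace:
        Me.Merge(ds, False, MissingSchemaAction.Add)
    End Sub
End Class
```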

Curiously, the server seemed perfectly able to construct a TradeObject, populate it with values from SQL Server, and serialize it to the client during the GetTrade() call. Only when the object (even unmodified) was passed back to SaveTrade() would the exception be raised.

After much experimentation (and a phone call to Microsoft!), it turned out that the object no longer met the criteria for the default (“Low”) level of automatic deserialization (see the MSDN article on automatic deserialization in .NET Remoting).
Although the fix was simple enough…

'Before:
Private channel As New TcpServerChannel(TCP_CHANNEL)
ChannelServices.RegisterChannel(channel, False)

'After:
Private channel As TcpServerChannel
Dim provider As New BinaryServerFormatterSinkProvider()
provider.TypeFilterLevel = Runtime.Serialization.Formatters.TypeFilterLevel.Full
Dim props As IDictionary = New Hashtable
props("port") = TCP_CHANNEL
channel = New TcpServerChannel(props, provider)
ChannelServices.RegisterChannel(channel, False)

…I was left to wonder: what about the typed dataset changed that caused it to fail? Since .NET’s default deserialization level supports “Custom types that implement ISerializable and make no other demands outside of serialization,” even a weak-named typed dataset object should qualify (and it used to!). I looked very carefully for changes in the dataset’s generated code between the 2.0 and 4.0 versions of the object. Interestingly, I was unable to find any significant differences that would have altered the way the object serialized or described itself.

At this time, I can only conclude that the .NET Framework’s criteria for identifying objects that meet the Low level deserialization requirements changed. However, I can't confirm that because the article referenced is not specific to a version of the Framework. I’m glad to have a working fix for the problem, but I’d sure like to know what I can change about my object (other than giving it a strong name, which has other implications for me) to make sure it’s not seen as a security threat by deserialization!


Thursday, July 15, 2010

Ahhhh, simulation!

Well, it's been back to work again this week, but I still found time to do a nice little write-up on Monte Carlo simulation for The Code Project. Hmmmn, how come those guys keep getting all my "work" while the blog just keeps pointing over there? I suppose it's just too darn convenient!

Anyway, I worked up a simple example of investment performance vs. retirement withdrawals to show how this kind of simulation could be used to decide if one has enough money saved up for retirement. Sadly, my own numbers indicate I may be working well past my life expectancy! Sigh.
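That example boils down to a loop like the following. This is only an illustrative sketch; the dollar amounts, the normally distributed return assumption, and the Box-Muller sampling here are mine for demonstration, not necessarily what the article uses:

```vbnet
' Illustrative Monte Carlo sketch: does a nest egg survive 30 years of
' fixed withdrawals under normally distributed annual returns?
Module RetirementSim
    Sub Main()
        Dim rand As New Random()
        Dim trials As Integer = 10000
        Dim survived As Integer = 0
        For t As Integer = 1 To trials
            Dim balance As Double = 1000000.0
            For year As Integer = 1 To 30
                ' Box-Muller transform for a ~N(7%, 12%) annual return
                ' (1 - NextDouble() avoids Log(0))
                Dim u1 As Double = 1.0 - rand.NextDouble()
                Dim u2 As Double = rand.NextDouble()
                Dim z As Double = Math.Sqrt(-2.0 * Math.Log(u1)) * Math.Cos(2.0 * Math.PI * u2)
                balance = balance * (1.0 + 0.07 + 0.12 * z) - 50000.0
                If balance <= 0 Then Exit For
            Next
            If balance > 0 Then survived += 1
        Next
        Console.WriteLine("Chance the money lasts 30 years: {0:P1}", survived / trials)
    End Sub
End Module
```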

At least I can share this knowledge and thus keep myself entertained at the keyboard...


Friday, July 2, 2010

A practical introduction to queue theory

OK, I'm on a roll here before I go on vacation next week...well, later this afternoon actually. Two days and two articles published at The Code Project!

This time, I take a stab at implementing the equations that describe queueing activity, developed by Erlang and, later, Little. Read the full article and download the sample C# application at The Code Project.
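As a taste of the math involved, here is a small sketch of the Erlang C formula together with Little's law. It is illustrative only and not the article's code (the article's sample application is in C#; VB is used here to match the rest of this post):

```vbnet
' Illustrative sketch: Erlang C (probability an arriving caller waits)
' and Little's law relating queue length, arrival rate, and wait time.
Module QueueMath
    ' servers = N agents; load = arrival rate x mean service time
    ' (in erlangs); the queue is stable only when load < servers.
    Function ErlangC(servers As Integer, load As Double) As Double
        Dim sum As Double = 0.0
        Dim term As Double = 1.0   ' running A^k / k!
        For k As Integer = 0 To servers - 1
            If k > 0 Then term *= load / k
            sum += term
        Next
        Dim top As Double = term * (load / servers) / (1.0 - load / servers)
        Return top / (sum + top)
    End Function

    Sub Main()
        Dim arrivalRate As Double = 10.0   ' calls per minute
        Dim serviceTime As Double = 0.5    ' minutes per call
        Dim load As Double = arrivalRate * serviceTime   ' 5 erlangs
        Dim servers As Integer = 6
        Dim pWait As Double = ErlangC(servers, load)
        Dim avgWait As Double = pWait * serviceTime / (servers - load)
        ' Little's law: Lq = lambda * Wq (average number in queue)
        Console.WriteLine("P(wait)={0:F3}  Wq={1:F3} min  Lq={2:F3}",
                          pWait, avgWait, arrivalRate * avgWait)
    End Sub
End Module
```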

Thursday, July 1, 2010

Time-series forecasting

I've completed my first article for The Code Project. It's titled "A Time-series forecasting library in C#" and it details techniques for producing--you guessed it--forecasts from historical data. Not the snazziest title, but hey, at least you know what's inside before you open the box. The library includes the ability to reserve and test a holdout set and to forecast n periods into the future.
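For flavor, the core idea (fit on the history minus a reserved holdout, then check the projection against it) can be sketched like this. Single exponential smoothing is shown purely as an illustration and is not necessarily what the library implements:

```vbnet
' Illustrative sketch: fit a smoothed level on all but a holdout set,
' then compare the flat forecast against the reserved actuals.
Module ForecastSketch
    Sub Main()
        Dim history As Double() = {112, 118, 132, 129, 121, 135, 148, 148}
        Dim holdout As Integer = 2
        Dim alpha As Double = 0.4   ' smoothing constant

        Dim level As Double = history(0)
        For i As Integer = 1 To history.Length - holdout - 1
            level = alpha * history(i) + (1.0 - alpha) * level
        Next

        ' With no trend term, every future period gets the last level.
        For i As Integer = history.Length - holdout To history.Length - 1
            Console.WriteLine("actual={0}  forecast={1:F1}", history(i), level)
        Next
    End Sub
End Module
```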

You can find it published at The Code Project.