
ASP.Net 2.0 Session management suggestions?

Hi everyone --

I'm building an application in .NET 2.0 and storing values into Session objects ("In Process") so I can access the objects on other pages in my application.  The size and number of objects I'm storing as session objects has grown and now the application is experiencing performance problems (and it frequently loses the session objects).

Does anyone have experience using .NET 2.0 session objects?  Would you recommend using "In Process" session objects (versus "SQL Server" mode)?  Any recommendations or preferences about better ways to maintain state in a .NET application instead of using session objects?
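For reference, the session-state mode is switched in Web.config rather than in code. A minimal sketch (the server name and timeout here are placeholders; "SQLServer" mode also requires running aspnet_regsql.exe with the -ssadd switch first to create the state database):

```xml
<!-- Web.config: switch session state from the default InProc mode,
     which is lost whenever the worker process recycles, to SQL Server -->
<configuration>
  <system.web>
    <sessionState
      mode="SQLServer"
      sqlConnectionString="Data Source=MYSERVER;Integrated Security=SSPI"
      timeout="120" />
  </system.web>
</configuration>
```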

Mike Leach
Object persistence to a physical data store, such as a file system or database, is a scalable solution.
Thanks, Mike.

Part of the problem, however, is that some objects (like the SForceService) are not serializable and so I don't think they can be stored in an "out of process" state management solution, right?

My big problem is that users of my application are having their (.NET) session id change in the middle of a session, which, of course, makes the session objects inaccessible.

I can't believe I'm the only person who's having this problem out here... very strange.

I appreciate any other thoughts you may have on the topic.

The only things you need to re-create the SforceService instance are the sessionId and serverUrl; both are strings, and both are easily stored in session.

Yes, you need to make sure your .NET session length is at least as long as the salesforce session length.
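A sketch of what's described above, assuming the SforceService proxy generated from the enterprise WSDL (the Session key names are made up for illustration):

```csharp
// Keep only the two strings in Session and rebuild the proxy on each
// request, instead of trying to persist the non-serializable proxy.
private SforceService GetBinding()
{
    SforceService binding = new SforceService();
    binding.SessionHeaderValue = new SessionHeader();
    binding.SessionHeaderValue.sessionId = (string)Session["sfSessionId"];
    binding.Url = (string)Session["sfServerUrl"];
    return binding;
}

// After a successful login(), stash the two strings:
//   LoginResult lr = binding.login(user, pwd);
//   Session["sfSessionId"] = lr.sessionId;
//   Session["sfServerUrl"] = lr.serverUrl;
```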
Mike Leach

Sorry. I thought you were storing a plurality of SObjects in the Session. The singular SForceService should be stored in the global Application Cache and shared across sessions.

Other best practices include:
+ Lazy initialization of SForceService
+ Wrapping the SForceService in a class that caches calls to describeGlobal and describeSObject
+ SOAP Compression
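A rough sketch of two of those practices together (the wrapper class name is illustrative); EnableDecompression is a standard property on WSDL-generated .NET proxies and asks the server for gzip-compressed responses, while compressing outbound requests takes extra work not shown here:

```csharp
// Lazy initialization: the proxy is only created when first used.
public class SforceGateway
{
    private SforceService _binding;

    public SforceService Binding
    {
        get
        {
            if (_binding == null)
            {
                _binding = new SforceService();
                // Accept gzip/deflate responses from the server.
                _binding.EnableDecompression = true;
            }
            return _binding;
        }
    }
}
```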


The SforceService class has state, and therefore you need to be very careful if you're sharing a single instance across threads. (I wouldn't recommend this at all).
Mike Leach
If SForceService is stored in Session, won't the concurrent connection manager limit the max sessions to 2?
I understand your state concern. I think you're recommending to create a new instance of SForceService on each request, but set the respective SessionHeader and URL from a globally cached sessionID?
Gareth Davies
What we do is run a Windows service that maintains the connection to Salesforce.com. Each new session then requests a SessionID from this service; the SessionID is served via a TCP port on localhost and is encrypted. This means our sessions are quick to initialise, avoiding the login delay.

We only maintain a minimum of session objects in memory on the server (by careful design) - but if you are load-balancing across multiple servers, you can configure ASP.NET to keep session state in the out-of-process state service or in a SQL Server database. This is done in the Web.config sessionState element.
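The core of the idea described above could be sketched roughly like this (the port number is made up, and the encryption of the SessionID is omitted):

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;

// A background service logs in to Salesforce once, then hands the
// cached session id to local callers over a loopback-only TCP port.
public static void ServeSessionId(string cachedSessionId)
{
    TcpListener listener = new TcpListener(IPAddress.Loopback, 9123);
    listener.Start();
    while (true)
    {
        using (TcpClient client = listener.AcceptTcpClient())
        {
            byte[] payload = Encoding.UTF8.GetBytes(cachedSessionId);
            client.GetStream().Write(payload, 0, payload.Length);
        }
    }
}
```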

Wow, Gareth, that's incredibly cool!  I would love to know how you created a Windows service to do that.

Let me ask you this, then:  using a minimum of session objects in memory, if you query Salesforce and pull back, say, 1000 contacts, what do you do with that result set (as in, do you write it temporarily into SQL Server or create a DataTable, etc.) before processing?

I've made a horrible error this week:  I started by querying Salesforce, generating a DataTable from the QueryResult and then storing that DataTable in a Session object so I could access it from various pages in the application.  Needless to say, the DataTable is sometimes very large and it's taking a huge toll on my server's memory.  (In fact, the hosting company I use limits me to 100 MB of memory for the entire application, and it gets chewed up in just a few minutes.)

I originally thought the DataTable would be a good idea because I wanted to show the result set in a GridView.  But now, alas, I'm thinking that instead of creating the DataTable, I'm going to create an on-the-fly table in SQL Server and simply load the QueryResult into SQL Server.  That way, I can still point the GridView at the newly created table, still have access to the data from various pages, and not kill the memory on the server.

Does that seem like a reasonable approach?
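A sketch of that staging-table approach, using SqlBulkCopy from System.Data.SqlClient (available in .NET 2.0); the table name and connection string are placeholders:

```csharp
using System.Data;
using System.Data.SqlClient;

// Push the query results into a SQL Server staging table instead of
// holding the whole DataTable in Session.
public static void StageResults(DataTable results, string connStr)
{
    using (SqlConnection conn = new SqlConnection(connStr))
    {
        conn.Open();
        using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
        {
            bulk.DestinationTableName = "ContactStaging";
            bulk.WriteToServer(results);   // streams the rows to the server
        }
    }
}
```

Pages can then point a GridView at the staging table (for example via a SqlDataSource) rather than pulling a large DataTable out of Session.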

Message Edited by ColoradoMike on 02-16-2007 08:07 PM


A couple of .NET buddies of mine suggested today that using session objects stored in SQL Server is definitely the way to go.  They said it would let me sidestep the server memory issues I'm currently facing.

So, for example, I might do a query in Salesforce that yields 1000 contacts which I store in a DataTable.  I would then stuff the DataTable into a session object and move to another page in the application, which, in turn, grabs the session object and does something with it (and perhaps re-stores it in the session object --if necessary-- for access by another page).

Does this seem like a reasonable approach to you guys?
Mike Leach

If it'll serialize, give it a shot. I'd be curious to know if this works.


Hey Mike --

It actually works quite well!  Since the QueryResult I pull down for each user contains any number of fields, it made more sense to me to gather the results into a DataTable and then store the DataTable in a Session object (SQL-Server backed session objects, that is).

The data is readily at hand and I have a very small impact on server resources.  So far, this looks like it's going to work for me.

Thanks so much for your thoughts on this, everyone.