I have a table with 250,000 rows, all selected to be cached as a List. The issue is that this table takes about 5.6 GB of RAM when cached to AppFabric. Is this normal? Is there any way to reduce the size? What would be the best approach for such a challenge?
- If you are not committed to AppFabric, take a look at another cache client like RediCache or MemCache – BossRoss Dec 30 '13 at 12:25
- @BossRoss I have a cache factory, therefore I have no commitment to any particular cache, but AppFabric is a habit :) – MuhanadY Dec 30 '13 at 12:47
- [Look here for more](http://stackoverflow.com/questions/20114994/appfabric-cache-memory-very-intensive) – BossRoss Dec 30 '13 at 13:31
- @BossRoss I appreciate your research, but I believe we might somehow be doing something wrong, and dropping AppFabric is not the answer I'm after. Other solutions may well be better than AppFabric, but that doesn't satisfy my curiosity. Let me check how I can fix it and I'll share the solution soon. – MuhanadY Dec 30 '13 at 14:58
1 Answer
Objects are stored in the cache in a serialized form. So to understand the cache size, you simply have to calculate the serialized object size.
AppFabric uses the NetDataContractSerializer class for serialization before storing items in the cache. So, to determine the object size, add instrumentation/debug code to your application or unit tests that serializes your objects and records their serialized size.
The standard way to do this is:
// requires the following namespaces:
//
// using System.Xml;
// using System.IO;
// using System.Runtime.Serialization;
// using System.Runtime.Serialization.Formatters.Binary;
//
// Target object "obj"
//
long length = 0;
MemoryStream stream1 = new MemoryStream();
using (XmlDictionaryWriter writer =
    XmlDictionaryWriter.CreateBinaryWriter(stream1))
{
    NetDataContractSerializer serializer = new NetDataContractSerializer();
    serializer.WriteObject(writer, obj);
    writer.Flush(); // flush the binary writer so stream1.Length reflects the full payload
    length = stream1.Length;
}
// fall back to BinaryFormatter for types the data contract serializer can't handle
if (length == 0)
{
    MemoryStream stream2 = new MemoryStream();
    BinaryFormatter bf = new BinaryFormatter();
    bf.Serialize(stream2, obj);
    length = stream2.Length;
}
// do something with length
You said:

> i have a table with 250,000 rows all selected to be cached as List

Sometimes you have to use a different storage format: for example, you shouldn't use DataTable/DataSet because they are very inefficient to serialize. To reduce the cache size, optimize the serialized format.
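As a minimal sketch of one way to do that (the DataCache instance named cache, the CompressedCacheHelper class and the key/type names are assumptions for illustration, not from the question or answer), you could serialize the list yourself, GZip-compress the bytes, and cache the resulting byte[] instead of the List:

    // Sketch only: shrink the cached payload by compressing the serialized bytes.
    using System.IO;
    using System.IO.Compression;
    using System.Runtime.Serialization;
    using System.Xml;
    using Microsoft.ApplicationServer.Caching;

    public static class CompressedCacheHelper
    {
        // Serialize with NetDataContractSerializer (what AppFabric uses), gzip the
        // bytes and store the byte[] in the cache instead of the raw List.
        public static void PutCompressed<T>(DataCache cache, string key, T value)
        {
            var serializer = new NetDataContractSerializer();
            using (var buffer = new MemoryStream())
            {
                using (var gzip = new GZipStream(buffer, CompressionMode.Compress, leaveOpen: true))
                using (var writer = XmlDictionaryWriter.CreateBinaryWriter(gzip))
                {
                    serializer.WriteObject(writer, value);
                }
                cache.Put(key, buffer.ToArray());
            }
        }

        // Read the byte[] back, decompress and deserialize.
        public static T GetCompressed<T>(DataCache cache, string key)
        {
            var bytes = (byte[])cache.Get(key);
            if (bytes == null) return default(T);

            var serializer = new NetDataContractSerializer();
            using (var buffer = new MemoryStream(bytes))
            using (var gzip = new GZipStream(buffer, CompressionMode.Decompress))
            using (var reader = XmlDictionaryReader.CreateBinaryReader(gzip, XmlDictionaryReaderQuotas.Max))
            {
                return (T)serializer.ReadObject(reader);
            }
        }
    }

The trade-off is extra CPU time on every Put/Get, so this is only worth it when the serialized payload is large and the item is read relatively infrequently.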

Cybermaxs
- Thank you for your answer. The point here is that the format is neither a DataTable nor a DataSet, it's a typed List; you might be right about the size of the list, therefore I'll try to optimize it another way. – MuhanadY Jan 02 '14 at 07:08
- By the way, the length is 384376461, which is about 366 MB, but I am seeing about 5 GB of cache size!! – MuhanadY Jan 02 '14 at 07:26
- Did you enable specific AppFabric features? How do you get the cache size? Via Get-CacheStatistics? – Cybermaxs Jan 02 '14 at 10:36
- Here are the cache statistics: Size : 387427328, ItemCount : 1, RegionCount : 1, RequestCount : 11, ReadRequestCount : 7, WriteRequestCount : 1, MissCount : 6, IncomingBandwidth : 384379418, OutgoingBandwidth : 768753344 – MuhanadY Jan 02 '14 at 13:32