
Sitefinity performance tuning, better API choices?

3 posts, 0 answered
  1. clayman
    38 posts
    Registered: 23 Sep 2009
    Posted: 12 Apr 2010
    I have an API "wrapper" that retrieves Sitefinity content for my web pages. It works fine, but it can be a bit slow for large pages with many pieces of content. Since I can't alter the site layout/design, I'm trying to address performance through caching and improving load times.

    Caching
    I've implemented ASP.NET output caching, which works really well, but it is unpredictable in how long it actually keeps pages cached, even though I have specified a hard duration on all the pages and in the config. I suspect the cache is hitting some memory limit (when crawling all the pages, the ASP.NET worker process climbs to about 750-800 MB and then drops back down to around 600 MB) and evicting entries; I read somewhere that ASP.NET manages that size dynamically based on available system resources. There's always DB caching, but we use authentication on every page, so the cache substitution controls are required, which means the other cache implementations are not an option. I love the caching, but it seems our site is just too big to rely on it alone.
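    For reference, the substitution piece boils down to ASP.NET's post-cache substitution (the same mechanism the <asp:Substitution> control uses). This is just a sketch, not our real code; ProductDocsPage and GetUserName are placeholder names, and the page is assumed to carry an OutputCache directive:

    using System;
    using System.Web;
    using System.Web.UI;

    // Sketch only: the page output is cached (e.g. <%@ OutputCache Duration="3600"
    // VaryByParam="None" %> in the .aspx), but the substitution callback below is
    // re-evaluated on every request, so the logged-in user's name stays current.
    public partial class ProductDocsPage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Writes the callback result into the response and tells ASP.NET to
            // re-run the callback even when the page is served from the output cache.
            Response.WriteSubstitution(GetUserName);
        }

        // Must be static/stateless: on cached hits the page never executes,
        // only this callback does.
        private static string GetUserName(HttpContext context)
        {
            return (context.User != null && context.User.Identity.IsAuthenticated)
                ? HttpUtility.HtmlEncode(context.User.Identity.Name)
                : "guest";
        }
    }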

    CMS API
    Our content is organized by category, with tags used to distinguish product-specific documents. Based on the developer's guide, I came up with the following code to retrieve content for a given product:
    public IEnumerable<CmsContentBase> FindContentGivenLibraryAndCategoriesAndTagNames(
        LibraryType type, IEnumerable<string> categoryNames, IEnumerable<string> tags)
    {
        var list = new List<CmsContentBase>();
        var contentManager = GetContentManager(type);

        foreach (var categoryName in categoryNames)
        {
            // One GetContent call per category, filtered on the "Category" meta field.
            var filter = new IMetaSearchInfo[]
            {
                new MetaSearchInfo(MetaValueTypes.ShortText, "Category", categoryName)
            };

            var listOfItems = contentManager.GetContent(filter);

            foreach (CmsContentBase document in listOfItems)
            {
                if (IsNotFromCorrectLibrary(document, type)) continue;

                // This is the expensive part: a separate tag lookup per document.
                var foundDocument = FindDocumentMatchingTags(contentManager, document, tags);
                if (foundDocument != null)
                    list.Add(document);
            }
        }

        return list;
    }

    private static CmsContentBase FindDocumentMatchingTags(
        ContentManager contentManager, CmsContentBase document, IEnumerable<string> matchingTags)
    {
        // Hits the database once per document just to read its tags.
        var allTags = contentManager.GetTags(document.ID);

        // Keep only the requested tags that this document actually carries.
        var found = allTags.ToArray<ITag>().Join(
            matchingTags,
            allTag => allTag.TagName,
            matchingTag => matchingTag,
            (allTag, matchingTag) => matchingTag);

        // The document matches only if every requested tag was found.
        if (found.Count() == matchingTags.Count())
            return document;

        return null;
    }

    The obvious performance issue here is that I have to go back to the DB for each document to retrieve its tags before I have enough information to perform the match. Considering that some categories have 200 or even 300+ documents/content items, and some of these pages pull in up to a dozen of them, it's no wonder some pages can take 10-15 seconds to load.
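    In the meantime, the only stopgap I can see on my side is to memoize the per-document tag lookups, so a document that appears under several categories (or on several pages) only costs one trip to the database. Just a sketch; tagCache and GetTagNamesCached are names I made up, and nothing here ever invalidates the cache, so it trades freshness for speed:

    // Requires System.Collections.Generic and System.Linq (already used above).
    private static readonly Dictionary<Guid, HashSet<string>> tagCache =
        new Dictionary<Guid, HashSet<string>>();

    private static HashSet<string> GetTagNamesCached(ContentManager contentManager, CmsContentBase document)
    {
        lock (tagCache)
        {
            HashSet<string> cached;
            if (tagCache.TryGetValue(document.ID, out cached))
                return cached;
        }

        // Cache miss: the only place that still goes to the database.
        var tagNames = new HashSet<string>(
            contentManager.GetTags(document.ID).ToArray<ITag>().Select(t => t.TagName));

        lock (tagCache)
        {
            tagCache[document.ID] = tagNames;
        }

        return tagNames;
    }

    // FindDocumentMatchingTags then becomes an in-memory subset check:
    //     var names = GetTagNamesCached(contentManager, document);
    //     return matchingTags.All(names.Contains) ? document : null;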

    Is there a better way to pull in tag information as part of the "MetaSearchInfo" filter? Or is there a (supported) way for me to query the DB directly and bypass the API? I think I read somewhere that the Nolics ORM provider is being phased out, and I've avoided using it so far. I'm reluctant to go against the DB except as a last resort.

    We can also upgrade to 3.7 SP3 (we are currently on 3.7 SP1), and I understand there are some performance improvements for large databases... but the way I'm going after this content is just not efficient, and I fear it'll still be slow even in 4.0 unless I change it.

    I'd appreciate your advice; I'm sure there's got to be a better way...

    Thanks



  2. clayman
    38 posts
    Registered: 23 Sep 2009
    Posted: 14 Apr 2010
    Hello?
  3. Ivan Dimitrov
    16072 posts
    Registered: 12 Sep 2017
    Posted: 14 Apr 2010
    Hello clayman,

    You could use a generic Dictionary and its indexer to improve the performance a bit, instead of making multiple nested foreach loops.
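
    For example, a rough sketch along those lines, reusing the helpers and API calls from your snippet (the method name and local variable names below are just placeholders):

    private IEnumerable<CmsContentBase> FindContentUsingDictionary(
        LibraryType type, IEnumerable<string> categoryNames, IEnumerable<string> tags)
    {
        var contentManager = GetContentManager(type);

        // Results keyed by document ID: the Dictionary indexer deduplicates documents
        // that appear under more than one category, so their tags are checked only once.
        var results = new Dictionary<Guid, CmsContentBase>();

        // Requested tags in a HashSet for O(1) membership tests instead of a Join.
        var requestedTags = new HashSet<string>(tags);

        foreach (var categoryName in categoryNames)
        {
            var filter = new IMetaSearchInfo[]
            {
                new MetaSearchInfo(MetaValueTypes.ShortText, "Category", categoryName)
            };

            foreach (CmsContentBase document in contentManager.GetContent(filter))
            {
                if (IsNotFromCorrectLibrary(document, type)) continue;
                if (results.ContainsKey(document.ID)) continue;   // already matched

                var documentTags = new HashSet<string>(
                    contentManager.GetTags(document.ID).ToArray<ITag>().Select(t => t.TagName));

                // Keep the document only if it carries every requested tag.
                if (requestedTags.All(documentTags.Contains))
                    results[document.ID] = document;   // indexer assignment, no duplicates
            }
        }

        return results.Values;
    }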

    Kind regards,
    Ivan Dimitrov
    the Telerik team
