Setting the AppFabric Cache MaxBufferSize

AppFabric Cache has a maximum buffer size of 8MB. If you’re using the SQL provider, you need to export the XML configuration file, modify it, and re-import it to change this. Here’s a PowerShell script to do it for you, building on the blog post from Javi.

Save this to UpdateAppFabricCacheBufferSize.ps1


param([int]$maxBufferSize)

Import-Module DistributedCacheAdministration, DistributedCacheConfiguration

Function UpdateBufferSizeInConfig ([string]$configFilename, [int]$maxBufferSize) {
	$xml = New-Object XML
	$xml.Load($configFilename)
	$transportProperties = $xml.configuration.dataCache.advancedProperties.transportProperties
	if ($transportProperties -eq $NULL) {
		# The exported config may not have a transportProperties element yet
		$transportProperties = $xml.CreateElement("transportProperties")
		$xml.configuration.dataCache.advancedProperties.AppendChild($transportProperties) | Out-Null
	}
	$transportProperties.SetAttribute("maxBufferSize", "$maxBufferSize")
	$xml.Save($configFilename)
}

$tempConfigLocation = "c:\config.xml"

# Connect to the cache cluster configured on this host
Use-CacheCluster
Export-CacheClusterConfig -File $tempConfigLocation

UpdateBufferSizeInConfig $tempConfigLocation $maxBufferSize

Import-CacheClusterConfig -File $tempConfigLocation -Force


To change your buffer size to 15MB:

powershell .\UpdateAppFabricCacheBufferSize.ps1 15000000

Tips for using Disqus comments

Two tips I gave recently to a colleague just setting out with Disqus.

Q. How do I make Disqus comments visible to Google?
A. Use the JavaScript code snippet Disqus provides, as it fetches the comments asynchronously. On your server, implement a background task that fetches and caches recent comments from Disqus via the Disqus API (you could fetch them during page render, but then your page load speed would be directly coupled to Disqus’s response time). When the page is rendered, embed the cached comments between <noscript> tags. This lets you use HTML page-caching services like Akamai/Varnish whilst still having moderately fresh comments in the page for Google (and non-JS users). It’s best to include only a few comments, to keep page size down, and then provide pagination links for the search engines.
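As a sketch of that fetch-and-cache approach (Node-style JavaScript; fetchRecentComments is a hypothetical helper that would wrap the Disqus API, not a real Disqus function):

```javascript
// Hypothetical sketch: background cache of recent comments for <noscript> embedding.
var cachedComments = [];

function refreshComments(fetchRecentComments) {
  // Run this from a background task (e.g. on a timer), not during page render
  try {
    cachedComments = fetchRecentComments(5); // only a few comments, to keep page size down
  } catch (e) {
    // Disqus unavailable: keep serving the previously cached comments
  }
}

function renderNoscriptComments() {
  // Embed this in the rendered page so Google and non-JS users see the comments
  // (real code should HTML-escape c.text before embedding it)
  return '<noscript>' +
    cachedComments.map(function (c) { return '<p>' + c.text + '</p>'; }).join('') +
    '</noscript>';
}
```

Because the cached markup changes only when the background task runs, the rendered page stays fully cacheable by Akamai/Varnish.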

If you are using an ESI caching solution you might be tempted to implement an ESI include to fetch the comments, since they are dynamic content. I’d recommend against this: you’d be fetching the comments (from your cache, or from Disqus) on every page load, which is very unnecessary just for the occasional visit by Google.

Q. What should I use for the disqus_identifier?
A. I recommend using an internal identifier for the piece of content to which the comments are attached, prefixed with an environment indicator, e.g. disqus_identifier = ‘live_21EC2020-3AEA-1069-A2DD-08002B30309D’. I’d strongly suggest that you do not leave it blank and do not use the page URL. If you leave it blank, Disqus automatically falls back to the URL, which may not be permanent: when the article title changes (and titles are regularly included in URLs), the comments will be lost. Prefixing the environment to the identifier mitigates any clash with comments made in your testing environment when you move your CMS data around.
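On the page, that looks something like this (the shortname is a placeholder, and the GUID stands in for whatever internal id your CMS assigns the article):

```javascript
// Standard Disqus on-page configuration variables.
var disqus_shortname = 'example-site'; // placeholder: your site's Disqus shortname
// Environment prefix + permanent internal content id -- never the page URL:
var disqus_identifier = 'live_21EC2020-3AEA-1069-A2DD-08002B30309D';
```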

Be careful making Instant, “Instant”

UPDATED: 7/Feb/2011 with comments from Simon Smith.

I’ve seen a few examples of people trying to mimic the Google Instant search with their own solution. Most of these have just made them “instant” searches by moving the search from the form’s submit handler onto every keypress, along the lines of changing

onsubmit = function() {
	...perform actual search...
};

to

onkeyup = function() {
	...perform actual search...
};

My gripe is that Instant doesn’t need to be, and in fact shouldn’t always be, Instant. There are two reasons for this:

1) A lot of users type looking at their keyboard, so Instant just needs to mean “ready by the time the user looks up from their keyboard to see the result”.
2) Browser and network performance can be significantly harmed if you’re issuing complex JavaScript/Ajax/network requests on every single keypress.

The solution? Well, my solution is very simple.

var _timerId = 0;

// searchBox is your search input element
searchBox.onkeyup = function() {
	// cancel the previous timer so only the LAST keypress triggers the search
	window.clearTimeout(_timerId);
	_timerId = window.setTimeout(function() {
		...perform actual search...
	}, 170);
};

In this example I set a timer which fires 170ms after the LAST keypress (each keypress cancels the previous timer). It’s pretty much imperceptible that there’s a delay at all, but it dramatically improves the CPU/bandwidth performance of these “Instant” searches, and it still appears to be pretty Instant.
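The same idea can be wrapped in a small reusable helper. This is my own generalisation rather than anything from the post; the name debounce is the common term for this pattern:

```javascript
// Generic debounce: returns a wrapped function that only runs once
// calls have stopped arriving for delayMs milliseconds.
function debounce(fn, delayMs) {
  var timerId = 0;
  return function () {
    var self = this, args = arguments;
    clearTimeout(timerId);                  // cancel the pending run
    timerId = setTimeout(function () {
      fn.apply(self, args);                 // run with the latest arguments
    }, delayMs);
  };
}

// Usage (performSearch is your own search function):
// searchBox.onkeyup = debounce(performSearch, 170);
```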