After a mammoth effort by Jarrod Dixon the team’s production profiler is now ready for an open source release.

Let me start with a bold claim. Our open-source profiler is perhaps the best and most comprehensive production web page profiler out there for any web platform.

There I said it, so let me back up that statement.

The stone-age of web site profiling

Ever since the inception of the web, people have been interested in render times for web pages. Performance is a feature we all want to have.

To help keep sites fast, many developers include the time it takes to render a page in the footer or in an HTML comment. This is trivial on almost any platform and has been done for years. However, the granularity of this method sucks. You can tell a page is slow, but what you really want to know is: why?
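As a sketch of that footer-style timing (the class and event names are standard ASP.NET; the comment format and item key are made up):

```csharp
// Minimal footer-style timing in ASP.NET (Global.asax.cs).
// The whole page gets one number - no hint as to *where* the time went.
using System;
using System.Diagnostics;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        // Start a stopwatch as soon as the request arrives.
        HttpContext.Current.Items["page-timer"] = Stopwatch.StartNew();
    }

    protected void Application_EndRequest(object sender, EventArgs e)
    {
        var timer = (Stopwatch)HttpContext.Current.Items["page-timer"];
        timer.Stop();
        // Stamp the total into an HTML comment at the end of the response.
        HttpContext.Current.Response.Write(
            "<!-- rendered in " + timer.ElapsedMilliseconds + " ms -->");
    }
}
```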

To overcome this issue, frameworks often use logs, like the famous Rails development.log. The trouble is that log files are often very hard to interpret, and the information is tucked away in a place you rarely look.

Some people have innovated and taken this to the next level: Rack Bug is a good example for Rails, and L2S Prof is a good example for .Net. Additionally, some products like NewRelic take a holistic view of performance and give you a dashboard in the cloud with the ability to investigate particular perf issues down to the SQL.

The trouble has always been that the trivial profilers don’t help much, while the nice ones are often not designed to work in production and often involve external tools, installs and dependencies. One clear exception is NewRelic, an excellent product. However, when dealing with a single web page, I think our profiler has an edge.

A ubiquitous production profiler

Our “dream” with our profiler was to have a way to get live, instant feedback on the performance characteristics of the pages we are visiting - in production. We wanted this information to be visible only to developers and to have no noticeable impact on the site’s performance. A tricky wish to fulfill, seeing that our network serves millions of page views a day.

The ubiquity of the profiler is key: developers become aware of slowness, and the reasons for it, in everyday usage of the site. Analyzing the source of performance problems is trivial since you are able to drill down on and share profile sessions. We have become much more aware of performance issues and mindful of slow ajax calls (since they are also displayed on the page as they happen).

Production and development performance can vary by a huge margin

Ever since we started using our profiler in production we noticed some “strange” things. Often a page would be fast and snappy in dev, but in production the same page had very uneven performance. Often we traced this back to internal locking in LINQ-2-SQL and ported the queries to dapper. This does, however, bring up a very important fact.

Development performance may be wildly different to production performance.

Page profiling vs. the holistic view

Internally we use 2 levels of production profiling. We log every request with its total render time in a database (via the haproxy log); this gives us a bird’s-eye view of performance. When we need to dig in on actual issues we use our profiler.

Both approaches are complementary and, in my view, necessary for a high-performance, high-scale website. Efforts are much better spent optimizing a page that is hit 100k times a day vs. an equally slow page that is only accessed a handful of times.

This kind of functionality should be baked into web frameworks

I find it strange that web frameworks often omit basic functionality. Some do not include basic loggers, most do not offer an elegant log viewer, and none seem to provide a comprehensive approach to page profiling out of the box. It’s a shame; if we had all been looking at this kind of information from day one, we could have avoided many pitfalls.

Play with our profiler … today

Our profiler is open source, and so is Data Explorer. All logged-in users can experience the profiler first-hand by browsing the web site.

Ease of deployment

The profiler is neatly packaged up in a single DLL. No need to copy any CSS, JS or other files to get it all working. Internally we use the excellent Razor view engine to code our profiling page; this is compiled on the fly from an embedded resource using this handy trick. Our CSS is all in awesome LESS, which we translate to CSS on the fly in JavaScript. All the resources are embedded into the DLL.
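Serving those embedded includes boils down to reading manifest resources out of the assembly. A minimal sketch (the helper class and the resource name in the usage comment are made up; `GetManifestResourceStream` is the real reflection API):

```csharp
// Illustrative: reading an embedded resource (e.g. a profiler JS include)
// straight out of the DLL, so no loose files need to be deployed.
using System.IO;
using System.Reflection;

public static class EmbeddedResources
{
    public static string Read(string resourceName)
    {
        var assembly = Assembly.GetExecutingAssembly();
        // Returns null if the resource name does not exist in this assembly.
        using (Stream stream = assembly.GetManifestResourceStream(resourceName))
        using (var reader = new StreamReader(stream))
        {
            return reader.ReadToEnd();
        }
    }
}

// usage (hypothetical resource name):
// var js = EmbeddedResources.Read("MvcMiniProfiler.UI.includes.js");
```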

Profiling SQL is achieved by introducing a bespoke DbConnection that intercepts all DB calls. This interception only happens when a profiling session is in progress.
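Hooking that up amounts to wrapping whatever connection you already create. A sketch (the factory class and method names are illustrative; `ProfiledDbConnection` and `MiniProfiler.Current` come from the MiniProfiler package, and namespaces vary between package versions):

```csharp
// Illustrative connection factory: every command issued through the returned
// connection is timed when a profiling session is in progress.
using System.Data.Common;
using System.Data.SqlClient;
using StackExchange.Profiling;
using StackExchange.Profiling.Data;

public static class Db
{
    public static DbConnection GetOpenConnection(string connectionString)
    {
        DbConnection cnn = new SqlConnection(connectionString);
        // When no profiling session is active, MiniProfiler.Current is null
        // and the wrapper is effectively a pass-through.
        var profiled = new ProfiledDbConnection(cnn, MiniProfiler.Current);
        profiled.Open();
        return profiled;
    }
}
```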

Profiling blocks are ludicrously cheap since we use a fancy trick involving extension methods: you may call extension methods on null objects.

public static IDisposable Step(this MiniProfiler profiler, string name, ProfileLevel level = ProfileLevel.Info)
{
    return profiler == null ? null : profiler.StepImpl(name, level);
}

If there is no MiniProfiler in play the cost is a simple null check.
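In practice you wrap interesting blocks in using statements, nesting them to build up the timing tree. A sketch (the controller and step names are made up):

```csharp
// Illustrative MVC action: profiler may be null here, so each Step call
// costs only a null check when no session is active.
using System.Web.Mvc;
using StackExchange.Profiling;

public class HomeController : Controller
{
    public ActionResult Index()
    {
        var profiler = MiniProfiler.Current; // null unless a session is active
        using (profiler.Step("Load home page"))
        {
            using (profiler.Step("Fetch recent questions"))
            {
                // ... hit the database ...
            }
            using (profiler.Step("Render sidebar"))
            {
                // ... more work ...
            }
        }
        return View();
    }
}
```

Disposing each `IDisposable` closes that timing, so nested steps show up as children in the UI.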

Hope you enjoy the MiniProfiler; be sure to tweet a thank you to @marcgravell and @jarrod_dixon if it helps you.


Lomaxx about 13 years ago

Very cool… one question tho, does the MiniProfiler work with Spark view engine?

Sam Saffron about 13 years ago
Sam Saffron

sure, it works with any view engine as long as you have .Net 4 installed

Petr about 13 years ago

Is it possible to use it for profiling webforms apps?

Sam Saffron about 13 years ago
Sam Saffron

sure … as long as you have .Net 4

Usman_Masood about 13 years ago

lovely… why don't you guys provide this as a NuGet package?

Niklas about 13 years ago

Excellent. If you just could give the dll in the NuGet package a strong name it would be perfect…

Chad_Moran about 13 years ago

Great stuff Sam, thanks!

Jeff_Holt about 13 years ago

“Efforts are much better spent optimizing a page that is hit 100k times a day vs. an equally slow page that is only accessed a handful of times.”

No caveats?

Sam Saffron about 13 years ago
Sam Saffron

I dunno, perhaps if a background process is doing the 100k hits it may make more sense to look at the pages real people are looking at :slight_smile:

Nathan about 13 years ago

You got me excited with “for any web platform” and then disappointed with “packaged up a single DLL”. A windows .net profiler is hardly “any web platform”.

Sam Saffron about 13 years ago
Sam Saffron

The concepts are open source, there is a demo you can play with, and you can also see the very cool ajax profiling.

I really hope the concept of a ubiquitous production profiler is ported to other web platforms using the same kind of UI we created.

John_Rusk about 13 years ago

Often we traced this back to internal locking in LINQ-2-SQL

Just to clarify, for any other LINQ to SQL users who may be reading this, I believe that the internal locking only happens with UN-compiled queries. (I haven't checked with Reflector, but that's certainly the impression I've got from my own testing of compiled and uncompiled queries.)

Sam Saffron about 13 years ago
Sam Saffron

well, possibly, but dapper out-performs compiled queries. Often we had to port compiled queries to dapper for perf reasons.

Compiled queries everywhere is an incredibly cumbersome pattern, and certain queries cannot even be compiled (like Contains ones)
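To illustrate the cumbersome part (the data context and entity below are hypothetical; `CompiledQuery.Compile` is the real System.Data.Linq API), every query shape needs its own static, pre-built delegate:

```csharp
// Hypothetical compiled LINQ to SQL query: MyDataContext and User are made up.
using System;
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

[Table(Name = "Users")]
public class User
{
    [Column(IsPrimaryKey = true)] public int Id;
    [Column] public string Name;
}

public class MyDataContext : DataContext
{
    public MyDataContext(string cs) : base(cs) { }
    public Table<User> Users { get { return GetTable<User>(); } }
}

static class Queries
{
    // The delegate is built once; every call reuses the translated SQL,
    // skipping the (locking) expression-tree translation on each request.
    public static readonly Func<MyDataContext, int, User> UserById =
        CompiledQuery.Compile((MyDataContext db, int id) =>
            db.Users.Single(u => u.Id == id));
}

// usage: var user = Queries.UserById(db, 42);
```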

Cmholm about 13 years ago

The Mono Project supports ADO.NET to some degree, so it may be possible to port the MiniProfiler DLL to Linux and OS X.

Sam Saffron about 13 years ago
Sam Saffron

I am pretty sure this could be ported to mono quite easily.

Max_Ivak about 13 years ago

Thanks! it helps a lot. I am using it now to log requests from Ajax.

Ryan_Meyer about 13 years ago

I'm guessing the answer is no, but is there a way for profiling behavior to bubble up to a web project from a WCF service? If I'm making database calls in the service, there is no way to draw those on the screen, correct?

In general it seems to be very useful though and I love the UI implementation of reporting results.

Sam Saffron about 13 years ago
Sam Saffron

perhaps the way we do ajax profiling could be ported to wcf over http?

Alan about 13 years ago

I'm confused, the Step() method looks like it's doing tracing – not profiling. Why not just use trace.axd for this type of 'profiling'?

Sam Saffron about 13 years ago
Sam Saffron

Step returns an IDisposable that finishes timing the operation when you dispose of it.

Jed about 13 years ago

How much of this is dependent on .NET 4, how much wouldn't be possible without .NET 4, and based on those answers, how much work would it be to support those of us stuck with .NET 3.5 or even .NET 2?

Sam Saffron about 13 years ago
Sam Saffron

not too much … this could be backported

Tushar about 13 years ago

This is quite helpful. I am definitely going to give it a try.

Johhny_Utah about 13 years ago

ASP.Net Mvc only out of the box?

Sam Saffron about 13 years ago
Sam Saffron

out of the box it works with WebForms as well; we have put in lots of work lately to cut down dependencies

Kowsik about 13 years ago

If you need to kick the profiling code to understand the bottlenecks, check out It makes load and performance testing a fun sport. More importantly, the ruby gem enables continuous integration so you can validate your application scalability right after you push your code up to production.

Starter about 13 years ago

Sorry, I'm quite new to Google Project Hosting: How could I download current version?

Sam Saffron about 13 years ago
Sam Saffron

Just grab the code from here: or try the nuget version

Steve about 13 years ago


Can't wait to try it

Sriram about 13 years ago

Nice. I integrated with a webform app using NHibernate. The profiler runs fine on top of Cassini/IIS Express. However, when I publish the app on IIS 7, I get a 404 request for the miniprofiler js files.

Sriram about 13 years ago

Ah! Looks like there is a bug in MvcProfiler when dealing with virtual directories.

Adamjford almost 13 years ago

Woah, it's weird coming across someone linking to one of my own SO questions. :) Glad it helped!

Sergi almost 13 years ago

Great stuff!! Got it working and integrated in my MVCScaffolding templates in half an hour more or less!!

I've managed to profile the frontend methods of the MVC web site project I'm working on, but I'm wondering if I could use it to profile my SQL queries, which are located in a separate DAL assembly, which uses Data Access Application Block to deal with the DB. Any samples about how to do it, if it is possible?

Thanks, Sergi

Sergi almost 13 years ago

I answer myself: Yes I can!!

Just added MiniProfiler as a nuget reference to the DAL assembly and wrapped the DbCommand with ProfiledDbCommand, using MiniProfiler.Current, and worked like a charm!!

This is an example code:

   using (MiniProfiler.Current.Step("DAL: ExecuteReader"))
   {
       IDataReader dataReader = DBInstance.ExecuteReader(
           new ProfiledDbCommand(dbCommand, DBInstance.CreateConnection(), MiniProfiler.Current));
       while (dataReader.Read())
       {
           // do something with the data
       }
   }

where DBInstance is of SqlDatabase type, provided by Data Access Application Block. It also happens that the common code to access the DB is in a separate assembly from the actual DAL, to encapsulate all the DAAB stuff, so I just needed to add the profiler there, and ALL my DB calls got profiled! And it just took me an hour or so…!

And all these calls are shown as children of the method in the frontend, so it's very readable as well…

This is awesome!

Great job, and probably a must-have for every medium-sized project or bigger…

Sam Saffron almost 13 years ago
Sam Saffron

awesome! glad to hear this

Obliojoe almost 13 years ago

Seriously – thanks so much to you and the SO team. Dapper and MiniProfiler have made me a happier developer. Inspiring!

Sergi almost 13 years ago

Hi Sam,

currently using intensively miniprofiler at my projects, and really happy with it. We could solve a few performance issues with it.

But I miss some kind of changelog for the NuGet packages that tells me which changes have been made, so that I can update and improve my code with every new version that comes out. Looking at your commit logs in the source code doesn't seem to be the best way… :-)

Thanks, Sergi

Steve almost 13 years ago

@sergi easy one first – update-package (or update-package miniprofiler) will get any updates if there are any – or are you thinking of 'push' notifications?

tougher one – like you I'm using ent.lib for a bunch of data access (moving to EF, but not 100% there yet). Did you / anyone else come across a way of wrapping / extending a DAAB database object with MiniProfiler rather than using dbInstance.CreateConnection()? Unfortunately we have this as a general pattern where we use the db instance's internal connection – no call to .CreateConnection()

using (db = DatabaseFactory.CreateDatabase()) { db.GetSqlStringCommand("..sql…"); // do some stuff, usually filling objects }

I can refactor so DatabaseFactory.CreateDatabase() goes through a common wrapper – but is there anyway of wrapping the underlying connection it will use when creating a command – to refactor all our Get..Command calls or GetReaders would be a nightmare!

Hoping someone knows a nifty way before I start looking into unity / IoC to somehow do this..
