Alexander Manekovskiy

Automating Automation

How I’ve Fixed My Dell Inspiron Overheating Issues

Last summer I started experiencing issues when working on CPU-bound tasks on my laptop. At first I thought the main cause was the summer heat - it was 30°C (86°F) when I first noticed my laptop shutting down automatically because of overheating. But when the temperature went down and the occasional shutdowns didn't stop, I realized I had a real problem.

Dell Inspiron N5110

I own a Dell Inspiron N5110, which has an Intel Core i7-2670QM CPU and an NVidia GeForce GT 525M dedicated GPU. Browsing the Internet showed that I'm not the only one with this issue, but there was no consistent, believable explanation of why the laptop started overheating or guide on how to fix it. One part of the community was blaming Dell's greediness and/or a cooling system that was not designed for a CPU as powerful as the i7, while another part suggested replacing the thermal grease and decreasing the maximum CPU speed through the power management controls. I already knew how to disassemble my laptop (I previously had to replace the stock HDD, which is neither a fast nor an easy operation when you own a Dell laptop), so I decided to replace the thermal grease first and then try to understand and maybe even fix the engineering blunders in the cooling system.

Running ahead of the story, I'll say that I successfully accomplished both tasks and reduced the overall temperature of my CPU by 20°C (36°F), resulting in a stable 50°C (122°F) when idle and 85-90°C (185-194°F) under continuous 100% load.

Step 1: Clean the dust and replace the thermal grease

The things you’ll need:

  • The thermal grease. For those who are interested, I used Zalman ZM-STG2.
  • The Dell Inspiron N5110 Service Manual. This is a must if you have never seen the "innards" of your laptop. You'll have to follow the steps from the "Removing the Thermal-Cooling Assembly" section (see page 75). Friendly tip: print the pages with the necessary steps, as it is hard to remember everything when disassembling a laptop for the first time.
    I'm sure there are also plenty of video guides showing how to do this, but being a bit old school I prefer reading over watching, so I cannot recommend any particular video guide.

It turned out that the stock thermal grease had become rock solid and was no longer able to do its work. I used 70% isopropyl alcohol to remove it.

Rock solid thermal grease on CPU and GPU

The fan was also full of dirt. The sad fact is that you cannot open the fan case without removing the whole cooling system. This means that every time you want to clean out the dirt and dust, you'll have to replace the thermal grease.

Dirt inside cooling fan

So after I replaced the thermal grease and cleaned the fan, the CPU temperature decreased by around 15°C (27°F). That was a big win.

Step 2: Fix the airflow inside the cooling system

After two weeks I decided to try to make the airflow inside the laptop more streamlined. The first thing I did was close the hole in the motherboard with a piece of thick paper. The idea was to minimize the amount of hot air going under the keyboard, which sometimes made it too hot to work with normally.

Dell Inspiron N5110

Second, I decided to fix the air intake. From my point of view it had two issues:

  1. For some reason a piece of plastic was covering about 25% of the air intake grid, so I just cut it away with a utility knife.

Plastic cover over air intake

Plastic cover over air intake removed

  2. There was a gap of 7mm (~0.25") between the motherboard and the grid, so I made a gasket from a little piece of linoleum. I'm sure something thick enough, like a piece of foam rubber, would also work, since the idea is to streamline the air intake and not let the hot exhaust air from the laptop be sucked in again.

A piece of linoleum

I just glued the pieces of linoleum to the laptop case, making something that looks like a well.

Linoleum gasket applied

This gave me a small improvement of around 4-5°C (7-9°F). Not much, but still better than nothing.

Conclusions

Replacing the thermal grease and cleaning the dust out of the fan is a must if you want to fix overheating issues. Attempts to improve the air intake can also help to lower the temperature, but not by much.

Anyway, you will not lose anything by trying to make things better.

Good luck!

How to Configure a ConEmu Task for GitHub for Windows Portable Git

2/19/2015 Update: I decided that it would be good to propose the change described in this post to the msysgit project. And today it was accepted and merged. It took me only 7 months to come up with the idea that the change described below could be included in the official release of software that I use on a daily basis :)

Maybe a year or so ago I switched from Console2 to ConEmu. One of the reasons behind this switch was the Task concept that ConEmu offers.

There was only one problem with my task setup - I wanted to launch Portable Git, which is a part of the GitHub for Windows installation, inside ConEmu. But launching git-cmd.bat from ConEmu would create a new window.

As you may know, Portable Git binaries are located in %LOCALAPPDATA%\GitHub\PortableGit_054f2e797ebafd44a30203088cd3d58663c627ef\. Note that the last part of the directory name is a version string, so it may change in the future.

The problem lies in the last line of the git-cmd.bat file:

git-cmd.bat
@rem Do not use "echo off" to not affect any child calls.
@setlocal

@rem Get the absolute path to the current directory, which is assumed to be the
@rem Git installation root.
@for /F "delims=" %%I in ("%~dp0") do @set git_install_root=%%~fI
@set PATH=%git_install_root%\bin;%git_install_root%\mingw\bin;%git_install_root%\cmd;%PATH%

@if not exist "%HOME%" @set HOME=%HOMEDRIVE%%HOMEPATH%
@if not exist "%HOME%" @set HOME=%USERPROFILE%

@set PLINK_PROTOCOL=ssh
@if not defined TERM set TERM=msys

@cd %HOME%
@start %COMSPEC%

To fix the issue, replace the last line @start %COMSPEC% with @call %COMSPEC%.

This change will not break the existing "Open in Git Shell" context action in the GitHub application GUI.
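In other words, only the tail of git-cmd.bat changes:

@rem before: starts cmd.exe in a separate window
@start %COMSPEC%

@rem after: runs cmd.exe inside the current console, so the shell stays in the ConEmu tab
@call %COMSPEC%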

The difference between the start and call commands is that call runs the batch script inside the same shell instance, while start creates a new one. Here is a little fragment from the start and call help:

C:\>call /?
Calls one batch program from another.

C:\>start /?
Starts a separate window to run a specified program or command.

That's it! Now the following ConEmu task will work as expected:

*cmd /k Title Git & "%LOCALAPPDATA%\GitHub\PortableGit_054f2e797ebafd44a30203088cd3d58663c627ef\git-cmd.bat"

Automate Your Dev Environment Setup

Every time I need to install and configure a developer environment on a fresh OS (either on a real or a virtual machine), I feel irritated by the fact that I need to spend almost a whole day just clicking through various installation dialogs, confirming destination folders, accepting user agreements (that I bet no one has ever read in full) and performing other repetitive and almost pointless tasks.

I'm a developer, I'm creating things (or at least trying to), so why would I waste my time doing dull and pointless work?! Ah, and why should I keep in mind (or in a notepad, an "installs" folder, etc.) a list of my tools and installation packages?

Honestly, I cannot give a single reason why this shouldn't be automated. Said it - did it. And here are my adventures.

Let’s get Chocolatey?

Maybe you've heard about Chocolatey. In short, this tool is like apt-get but for Windows, and it is built on top of NuGet.

For those who are not familiar with NuGet and the variety of tools around it, take a look at the An Overview of the NuGet Ecosystem article by Xavier Decoster.

For a quick Chocolatey overview I can recommend Scott Hanselman’s post Is the Windows user ready for apt-get?

At the time of writing, Chocolatey had 1,244 unique packages, which is pretty cool - it is really hard to find a package that does not exist there.

A little searching showed that I could even install Visual Studio with Chocolatey. Okay, cool, let's do this.

No Battle Plan Survives Contact With the Enemy

I tried to install my first package on a fresh Windows 8 virtual machine and failed at the very first step. Jumping ahead of the story, that was partly my own fault, but let's roll on.

I wanted no more, no less than to install Visual Studio 2013 Ultimate Preview and see its new shining features for web devs. As described on the site, I installed Chocolatey and ran the cinst VisualStudio2013Ultimate command. The package downloaded, and the .NET 4.5.1 installation started. Boom! I got my first error:

[ERROR] Exception calling "Start" with "1" argument(s): "The operation was canceled by the user"

Chocolatey .NET 4.5.1 installation error

After some research it turned out that by default Windows 8 processes are not launched with administrator privileges (even if the current user is a member of the Administrators group), and because of the silent installation mode (read: "non-UI mode") the UAC prompt was not shown and the attempt to elevate rights was cancelled by default. To fix this issue I had to disable UAC notifications. I had spent quite some time searching for the cause of the issue, so I decided to set VS 2013 aside for now and proceed with the installation of Visual Studio 2012 instead.

To install the 90-day trial of Visual Studio 2012 Ultimate I ran the cinst VisualStudio2012Ultimate command, and after a little pause and some blinking of the standard installation dialog another crazy error appeared:

blah-blah-blah. Exit code was '-2147185721'

Chocolatey VS 2012 installation error

Thankfully, I had experience with silent installations of Visual Studio and had a link to the Visual Studio Administrator Guide in my bookmarks, which contains a list of exit codes for the installation package. Code -2147185721 is "Incomplete - Reboot Required". That sounded logical: because of the /NoRestart switch in the VS Chocolatey install script, setup did not reboot automatically and returned a non-zero value, which was treated as an error. Okay, rebooted the machine.

But this was not my last error :). After the reboot I resumed the Visual Studio installation process using the -force parameter and got my next error (extracted from the installation log vs.log file):

[0824:0820][2013-09-14T12:56:04]: Applied execute package: vcRuntimeDebug_x86, result: 0x0, restart: None
[082C:09C4][2013-09-14T12:56:04]: Registering dependency: {ae17ae9b-af38-40d2-a194-6102c56ed502} on package provider: Microsoft.VS.VC_RuntimeDebug_x86,v11, package: vcRuntimeDebug_x86
[082C:0850][2013-09-14T12:56:12]: Error 0x80070490: Failed to find expected public key in certificate chain.

The last words from the "chocolatey gods" were Exit code was '1603'.

This time nothing came to my mind except trying to install Windows updates first (the words "certificate chain" led me to this idea). As it turned out, that was the case, and it was my great mistake not to install updates first.

Moral: never try to install something serious unless you have all updates for your OS installed.

After all these errors I decided to roll my virtual machine back to its initial state and start from scratch. This time I installed all Windows updates first, and after that all Chocolatey packages were installed without any errors.
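To give an idea of what such an installation script can look like, here is a minimal sketch (the package list is an example of my own choosing; any ids from the Chocolatey gallery will do):

@echo off
:: minimal dev-environment bootstrap sketch; assumes Chocolatey itself is already installed
cinst git
cinst notepadplusplus
cinst GoogleChrome
cinst VisualStudio2012Ultimate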

Share all the scripts!

After I finished my journey I decided that it would be great to keep my scripts in one place and be able to share them. I could not find any better service for this than GitHub. Now I can share my scripts, update them, have a history of changes, and make tags and special branches for specific setups. Isn't this great, and isn't it how it should be?

Go fork my repository and start making your life easier!

Conclusions

These were only the first steps on the road to the bright future of automated environment setup. And while we can use Chocolatey to save time on installations, we still need to configure the stuff we installed. Of course, if you are using default settings this is not a problem, but unfortunately that is not my case ;)

I think in my next post I will share my experience with automated configuration transferring.

Configuring Web Forms Routing With Custom Attributes

1/13/2013 Update: Now the PhysicalFile property is filled and updated automatically using a T4 template. Say good-bye to issues caused by typos and copy-pasting.

Recently I had to add routing to an existing ASP.NET Web Forms application. I was (and I suppose I still am) new to this, so I started from Walkthrough: Using ASP.NET Routing in a Web Forms Application, and it seemed fine until I started coding.

The site was nothing special - approximately 50 pages. But when I started configuring all these pages it felt wrong - I was lost in all the route names, defaults and constraints. If it felt wrong, I thought, why not try something else. I googled around and found a pretty good thing - ASP.NET FriendlyUrls. Scott Hanselman wrote about this in his Introducing ASP.NET FriendlyUrls - cleaner URLs, easier Routing, and Mobile Views for ASP.NET Web Forms post. At first glance it looked far easier and better, but I wanted to use RouteParameters for the datasource controls on my pages. ASP.NET FriendlyUrls provides only the "URL segment" concept - a string that can be extracted from the URL (the string between '/' characters). URL segments cannot be constrained and thus automatically validated. Also, segments cannot have names, so my idea to use RouteParameter would be killed if I went with ASP.NET FriendlyUrls.

At the end of this little investigation I decided it would be easier to tie the route configuration to the page class via a custom attribute and conventionally named properties for defaults and constraints. So every page class gets its routing configuration as follows:

namespace RoutingWithAttributes.Foo
{
 [MapToRoute(RouteUrl = "Foo/Edit/{id}")]
  public partial class Edit : Page
  {
      public static RouteValueDictionary Defaults
      {
          get
          {
              return new RouteValueDictionary { { "id", "" } };
          }
      }

      public static RouteValueDictionary Constraints
      {
          get
          {
              return new RouteValueDictionary { { "id", "^[0-9]*$" } };
          }
      }
  }
}

The code above states that the Edit page in folder Foo of my RoutingWithAttributes web application will be accessible through the http://<application-url>/Foo/Edit hyperlink with an optional id parameter. The default value for the id parameter is an empty string, but it must be an integer number if provided.

This works better for me - it is self-describing and I'm not forced to go to some App_Start\RoutingConfig.cs file and search through it. Now, how does it work under the hood? Nothing new or special - just a bit of reflection on the Application_Start event. And routes are still registered with the RouteCollection.MapPageRoute method.

protected void Application_Start(object sender, EventArgs e)
{
  RouteConfig.RegisterRoutes(RouteTable.Routes);
}

public class RouteConfig
{
  public static void RegisterRoutes(RouteCollection routes)
  {
      var mappedPages = Assembly.GetAssembly(typeof (RouteConfig))
              .GetTypes()
              .AsEnumerable()
              .Where(type => type.GetCustomAttributes(typeof (MapToRouteAttribute), false).Length == 1);

      foreach (var pageType in mappedPages)
      {
          var defaultsProperty = pageType.GetProperty("Defaults");
          var defaults = defaultsProperty != null ? (RouteValueDictionary)defaultsProperty.GetValue(null, null) : null;

          var constraintsProperty = pageType.GetProperty("Constraints");
          var constraints = constraintsProperty != null ? (RouteValueDictionary)constraintsProperty.GetValue(null, null) : null;

          var dataTokensProperty = pageType.GetProperty("DataTokens");
          var dataTokens = dataTokensProperty != null ? (RouteValueDictionary)dataTokensProperty.GetValue(null, null) : null;

          var routeAttribute = (MapToRouteAttribute)pageType.GetCustomAttributes(typeof(MapToRouteAttribute), false)[0];

          if(string.IsNullOrEmpty(routeAttribute.RouteUrl))
              throw new NullReferenceException("RouteUrl property cannot be null");

          if (string.IsNullOrEmpty(routeAttribute.PhysicalFile))
              throw new NullReferenceException("PhysicalFile property cannot be null");

          if(!VirtualPathUtility.IsAppRelative(routeAttribute.PhysicalFile))
              throw new ArgumentException("Property should be application relative URL", "PhysicalFile");

          routes.MapPageRoute(pageType.FullName, routeAttribute.RouteUrl, routeAttribute.PhysicalFile, true, defaults, constraints, dataTokens);
      }
  }
}

The route name is equal to the FullName property of the page type. Since Type.FullName includes both the namespace and the class name, it guarantees route name uniqueness across the application.
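The attribute itself is trivial - here is a minimal sketch of MapToRouteAttribute, inferred from how RouteUrl and PhysicalFile are used in RegisterRoutes (the actual class in the repository may differ):

using System;

// Sketch of the attribute used above; its shape is inferred from RegisterRoutes.
[AttributeUsage(AttributeTargets.Class, AllowMultiple = false, Inherited = false)]
public sealed class MapToRouteAttribute : Attribute
{
    // Route URL pattern, e.g. "Foo/Edit/{id}"
    public string RouteUrl { get; set; }

    // Application-relative path to the physical page, e.g. "~/Foo/Edit.aspx"
    // (since the 1/13/2013 update this property is filled by a T4 template)
    public string PhysicalFile { get; set; }
}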

To utilize route link generation I had to create two extension methods for the Page class. These methods are just wrappers for the Page.GetRouteUrl method.

public static class PageExtensions
{
  public static string GetMappedRouteUrl(this Page thisPage, Type targetPageType, object routeParameters)
  {
      return thisPage.GetRouteUrl(targetPageType.FullName, routeParameters);
  }

  public static string GetMappedRouteUrl(this Page thisPage, Type targetPageType, RouteValueDictionary routeParameters)
  {
      return thisPage.GetRouteUrl(targetPageType.FullName, routeParameters);
  }
}

So now I can generate a link to the Foo.Edit page as follows:

    <a href='<%= Page.GetMappedRouteUrl(typeof(RoutingWithAttributes.Foo.Edit), new { id = 1 }) %>'>Foo.Edit</a>

And it will produce the http://<application-url>/Foo/Edit/1 link.

The described approach helped me accomplish the task without frustration, and I'm satisfied with the results.

The code for this article is hosted on GitHub - feel free to use it if you like the idea.

Improve Your Reading Experience With Instapaper, Calibre and Command Line

After I read Scott Hanselman's post "Instapaper delivered to your Kindle changes how you consume web content - Plus IFTTT, blogs and more" I remembered that I had wanted to create an automated Instapaper-to-e-book-reader "content delivery system". Now that I've finished, here is my little story.

LBook V5

Almost a year ago, when I started using Instapaper, I realized that it would be great to grab all the articles collected through the week, convert them to EPUB format and send the electronic book to my e-book reader. The only problem was my device - an Lbook V5. Yes, it is totally outdated and old compared to Kindle devices. It supports EPUB but has no access to the Internet, so Instapaper's "download" feature doesn't work for me.

A few months ago I found Calibre - a free and open source e-book library management application. It helped me organize and manage my whole electronic library, and I'm totally happy with it. Calibre has everything that could possibly be needed - scheduler support, custom news sources with interactive setup and converters to various e-book formats. But most interesting and important, Calibre has a command line ebook-convert.exe utility which can be driven by recipe files. Recipes in Calibre are just Python scripts (with bits of custom logic if needed to parse some specific news source).

Below is a simple Calibre recipe:

class AdvancedUserRecipe1352822143(BasicNewsRecipe):
  title          = u'Custom News Source'
  oldest_article = 7
  max_articles_per_feed = 100
  auto_cleanup = True

  feeds = [(u'The title of the feed', u'http://somesite.com/feed')]

This defines an RSS feed source at http://somesite.com/feed and declares that there should be no more than 100 articles, none older than 7 days. If we use it with the ebook-convert utility, it will automatically fetch news from the specified feed and generate an e-book file. The command line to generate a book is the following:

ebook-convert.exe input_file output_file [options]
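For example, a recipe-driven conversion with credentials could look like this (the file names here are made up for illustration):

ebook-convert.exe "my_feed.recipe" "my_articles.epub" --username="user" --password="secret"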

When the input_file parameter is a recipe, ebook-convert runs it and then produces an e-book in the format specified by the output_file parameter. The recipe should populate the feeds dictionary so ebook-convert knows which XML feeds should be processed. Options can accept two parameters - username and password (correct me if I'm wrong, but I didn't find any information about the possibility of using other/custom parameters). That was a brief introduction to Calibre recipe files. Now here is the problem.

Calibre has a built-in Instapaper recipe. This recipe was created by Stanislav Khromov with Jim Ramsay. The recipe has two versions - a stable one (part of the current Calibre release) and a development version; both can be found on BitBucket.

The development version of the Instapaper recipe does almost what I want, but I needed to extend its functionality:

  • Grab articles from all pages inside one directory (yes, sometimes it happens that I don't read Instapaper articles for a few weeks).
  • Merge articles from certain directories into one book.
  • Archive all items in directories. This is actually implemented in the development version, but instead of using the "Archive All…" form the recipe emulates clicking on the "Move to Archive" button, which takes a lot of time to process all items.

At first I decided to extend the development version of the recipe mentioned above, but after I wasted an hour trying to beat Python I realized that I could write a command line utility in .NET (where I feel very comfortable) which would do whatever I want, and I would save a ton of time (I'm definitely not going to learn Python just to change/fix one Calibre recipe :)). So here is InstaFeed - a little command line utility that can enumerate the names of Instapaper directories, generate a single RSS feed for a specified list of directories and archive them all at once. It uses two awesome open-source projects - Html Agility Pack and Command Line Parser Library.

Note: While this utility parses Instapaper HTML and produces RSS, you can probably bypass the "RSS limits" of Instapaper non-subscription accounts. But I encourage you to support this service. Cheating is not good at all; please respect Marco Arment's work and the efforts he put into this awesome service.

Having a command line utility that produces locally stored RSS feeds, the only thing that remains is to create a simple Calibre recipe for the ebook-convert utility. The recipe should be parameterized with the path to the RSS feed generated by InstaFeed. Here is the code:

class LocalRssFeed(BasicNewsRecipe):
  title        = u'local_rss_feed'
  oldest_article    = 365
  max_articles_per_feed    = 100
  auto_cleanup    = True
  feeds = None

  def get_feeds(self):
      # little hack that allows passing path to local RSS feed as a parameter via command line
      self.feeds = [(u'Instapaper Unread', 'file:///' + self.username)]
      return self.feeds

All custom recipes should be stored in the Calibre Settings\custom_recipes folder.

Note: Everything in this post applies to the Portable 0.8.65.0 version of Calibre for Microsoft Windows. I have no idea whether it will work with other versions or installation variants.

Below is the source of a batch file that produces an RSS feed from the Read Later Instapaper directory and then generates an e-book in EPUB format in C:\Temp. I run this batch weekly via Windows Task Scheduler.

@echo off
setlocal EnableDelayedExpansion
setlocal EnableExtensions

:: change path to your calibre and instafeed executables
set _instafeeddir=F:\util\instafeed\
set _calibredir=F:\util\Calibre Portable\

:: set output directory and naming convention here
set filename=C:\Temp\[%date:/=%]_instapaper_unread_articles
set rssfile=%filename%.xml
set ebookfile=%filename%.epub

%_instafeeddir%instafeed.exe -c rss -u <instapaper_username> -p <instapaper_password> -d "Read Later" -o "%rssfile%"
%_calibredir%\Calibre\ebook-convert.exe "%_calibredir%\Calibre Settings\custom_recipes\local_rss_feed.recipe" "%ebookfile%" --username="%rssfile%"

endlocal

I had fun writing InstaFeed and digging into Calibre recipes, and I hope that someone will benefit from my experience. What else can be said? Read with convenience and have fun!

Adding Client-Side Validation Support for PhoneAttribute or Fighting the Lookbehind in JavaScript

Today I was working on the JavaScript implementation of the validation routine for PhoneAttribute in the context of my hobby project DAValidation. Examining the sources of .NET 4.5 showed that the validation is done via a regular expression:

Unsupported lookbehind part of phone validation regexp pattern

And here is the problem - the pattern uses the lookbehind feature, which is not supported in JavaScript. A quote from regular-expressions.info:

Finally, flavors like JavaScript, Ruby and Tcl do not support lookbehind at all, even though they do support lookahead.

This lookbehind is used to match the "+" sign at the beginning of the string, i.e. to check the existence of the prefix. To make this work in JavaScript, the pattern should be reversed and the lookbehind assertion replaced with a lookahead (turning the prefix check into a suffix check); the input string is then reversed before testing. And that's it! The resulting pattern is:

^(\d+\s?(x|\.txe?)\s?)?((\)(\d+[\s\-\.]?)?\d+\(|\d+)[\s\-\.]?)*(\)([\s\-\.]?\d+)?\d+\+?\((?!\+.*)|\d+)(\s?\+)?$

As a proof, here is a test html page:

<html>
    <head>
        <title>Phone Number RegExp Test Page</title>
    </head>
    <body>
        <script>
            function validateInput() {
                var phoneRegex = new RegExp("^(\\d+\\s?(x|\\.txe?)\\s?)?((\\)(\\d+[\\s\\-\\.]?)?\\d+\\(|\\d+)[\\s\\-\\.]?)*(\\)([\\s\\-\\.]?\\d+)?\\d+\\+?\\((?!\\+.*)|\\d+)(\\s?\\+)?$", "i");

                var input = document.getElementById("tbPhone");
                var value = input.value.split("").reverse().join("");
                alert(phoneRegex.test(value));
            }
        </script>

        <input type="text" id="tbPhone" />
        <button onclick="javascript:validateInput()">Validate</button>
    </body>
</html>

While working on reversing the pattern I was using my favorite regular expression building and testing tool, Expresso. Also, Steven Levithan's great article Mimicking Lookbehind in JavaScript helped me look deeper and actually find the right solution to the problem.

PS. Now, as I have finally finished adding support for the new .NET 4.5 validation attributes, a new version of DAValidation will be published soon. Stay tuned ;)

How to Implement Configurable Dynamic Data Filters in ASP.NET 4.5

Whenever we speak about data driven web applications, there is the task of providing a data filtering feature, or configurable filters with the ability to save search criteria individually for each user. The most convenient filtering experience I have ever encountered was in bug tracking systems. Fast and simple. To get the idea of what I'm talking about, just look at the Redmine Issues page. Can we implement something similar with pure ASP.NET, particularly with ASP.NET Dynamic Data? Why Dynamic Data? Because of its focus on metadata, which is set by attributes from the DataAnnotations namespace, and its convention over configuration approach to building data driven applications. It's simple and convenient, and does not take much effort to extend.

For filtering, Dynamic Data offers us Filter Templates with the FilterRepeater control. To get an idea of how Dynamic Data Filter Templates work, I highly recommend reading Oleg Sych's great post "Understanding ASP.NET Dynamic Data: Filter Templates".

Until .NET 4.5 there were no extension points where we could take control over filter template creation. And surprisingly, I found that the IFilterExpressionProvider interface became public in .NET 4.5. So now we can extend the Dynamic Data filtering mechanism.

ASP.NET Dynamic Data QueryableFilterRepeater

As a jump start, let's recall what the List PageTemplate in Dynamic Data looks like:

<asp:QueryableFilterRepeater runat="server" ID="FilterRepeater">
  <ItemTemplate>
      <asp:Label runat="server" Text='<%# Eval("DisplayName") %>' OnPreRender="Label_PreRender" />
      <asp:DynamicFilter runat="server" ID="DynamicFilter" OnFilterChanged="DynamicFilter_FilterChanged" /><br />
  </ItemTemplate>
</asp:QueryableFilterRepeater>

<asp:GridView ID="GridView1" runat="server" DataSourceID="GridDataSource" >
<%-- Contents and styling omitted for brevity --%>
</asp:GridView>

<asp:EntityDataSource ID="GridDataSource" runat="server" EnableDelete="true" />

<asp:QueryExtender TargetControlID="GridDataSource" ID="GridQueryExtender" runat="server">
  <asp:DynamicFilterExpression ControlID="FilterRepeater" />
</asp:QueryExtender>

The purpose of QueryableFilterRepeater is to generate a set of filters for a set of columns. It should contain a DynamicFilter control, which is the actual placeholder for a FilterTemplate control. QueryableFilterRepeater implements the IFilterExpressionProvider interface, which is consumed by the QueryExtender via the DynamicFilterExpression control.

public interface IFilterExpressionProvider
{
  IQueryable GetQueryable(IQueryable source);
  void Initialize(IQueryableDataSource dataSource);
}

The complete call sequence is represented in the diagram below.

Sequence diagram showing QueryExtender interaction with Dynamic Data controls

Building Configurable Alternative to QueryableFilterRepeater

As QueryableFilterRepeater creates filters automatically, the only thing we can do is hide DynamicFilter controls on the client or server side. To my mind this is not a good idea, so a custom implementation of IFilterExpressionProvider is needed. It should support the same item template model as QueryableFilterRepeater, but with the ability to add/remove filter controls between postbacks.

[ParseChildren(true)]
[PersistChildren(false)]
public class DynamicFilterRepeater : Control, IFilterExpressionProvider
{
  private readonly List<IFilterExpressionProvider> filters = new List<IFilterExpressionProvider>();
  private IQueryableDataSource dataSource;

  IQueryable IFilterExpressionProvider.GetQueryable(IQueryable source)
  {
      return filters.Aggregate(source, (current, filter) => filter.GetQueryable(current));
  }

  void IFilterExpressionProvider.Initialize(IQueryableDataSource queryableDataSource)
  {
      Contract.Assert(queryableDataSource is IDynamicDataSource);
      Contract.Assert(queryableDataSource != null);

      if (ItemTemplate == null)
          return;
      dataSource = queryableDataSource;

      Page.InitComplete += InitComplete;
      Page.LoadComplete += LoadCompleted;
  }
}

The only disappointing thing is that the content generation of DynamicFilter is done on the Page.InitComplete event.

Oleg Sych tried to change the situation, but his suggestion is closed now and it seems nothing will change. I have reposted his suggestion on visualstudio.uservoice.com in the hope that this time we will succeed.

To make things work, the DynamicFilter control should initialize itself via the EnsureInit method, which is, generally speaking, responsible for FilterTemplate lookup and loading. In other words, to force the DynamicFilter to generate its content this method should be called. The only way to do it is to use reflection, since EnsureInit is private.

private static readonly MethodInfo DynamicFilterEnsureInit;

static DynamicFilterRepeater()
{
  DynamicFilterEnsureInit = typeof (DynamicFilter).GetMethod("EnsureInit", BindingFlags.NonPublic | BindingFlags.Instance);
}

private void AddFilterControls(IEnumerable<string> columnNames)
{
  foreach (MetaColumn column in GetFilteredMetaColumns(columnNames))
  {
      DynamicFilterRepeaterItem item = new DynamicFilterRepeaterItem { DataItemIndex = itemIndex, DisplayIndex = itemIndex };
      itemIndex++;
      ItemTemplate.InstantiateIn(item);
      Controls.Add(item);

      DynamicFilter filter = item.FindControl(DynamicFilterContainerId) as DynamicFilter;
      if (filter == null)
      {
          throw new InvalidOperationException(String.Format(CultureInfo.CurrentCulture,
              "FilterRepeater '{0}' does not contain a control of type '{1}' with ID '{2}' in its item templates",
              ID,
              typeof(QueryableFilterUserControl).FullName,
              DynamicFilterContainerId));
      }
      filter.DataField = column.Name;

      item.DataItem = column;
      item.DataBind();
      item.DataItem = null;

      filters.Add(filter);
  }

  filters.ForEach(f => DynamicFilterEnsureInit.Invoke(f, new object[] { dataSource }));
}

private IEnumerable<MetaColumn> GetFilteredMetaColumns(IEnumerable<string> filterColumns)
{
  return MetaTable.GetFilteredColumns()
      .Where(column => filterColumns.Contains(column.Name))
      .OrderBy(column => column.Name);
}

private class DynamicFilterRepeaterItem : Control, IDataItemContainer
{
  public object DataItem { get; internal set; }
  public int DataItemIndex { get; internal set; }
  public int DisplayIndex { get; internal set; }
}

Another problem that should be solved is filter control instantiation. As pointed out before, everything in Dynamic Data connected with filtering is initialized at the Page.InitComplete event. So if you want your dynamic filters to work, they should be instantiated before or at the InitComplete event. So far I see only one way to solve this: the AddFilterControls method should be called twice, the first time to instantiate filter controls that were already present on the page (InitComplete event), and the second time for newly added columns that are to be filtered (LoadComplete event).

private void InitComplete(object sender, EventArgs e)
{
  if (initCompleted)
      return;

  addedOnInitCompleteFilters.AddRange(FilterColumns);
  AddFilterControls(addedOnInitCompleteFilters);

  initCompleted = true;
}

private void LoadCompleted(object sender, EventArgs eventArgs)
{
  if (loadCompleted)
      return;

  AddFilterControls(FilterColumns.Except(addedOnInitCompleteFilters));

  loadCompleted = true;
}

Encapsulating DynamicFilterRepeater

DynamicFilterRepeater is only a part of a more general component, though. All it does is render filter controls and provide a filter expression. But to start working, DynamicFilterRepeater needs two things - an IQueryableDataSource and a list of columns to be filtered. Since filtering across the website should be consistent and unified, it would be good to encapsulate DynamicFilterRepeater in a UserControl which will serve as HTML layout and glue between the page (with IQueryableDataSource, QueryExtender and a data-bound control) and DynamicFilterRepeater. In my example I chose GridView.

<asp:Label runat="server" Text="Add filter" AssociatedControlID="ddlFilterableColumns" />
<asp:DropDownList runat="server" ID="ddlFilterableColumns" CssClass="ui-widget"
  AutoPostBack="True"
  ItemType="<%$ Code: typeof(KeyValuePair<string, string>) %>"
  DataValueField="Key"
  DataTextField="Value"
  SelectMethod="GetFilterableColumns"
  OnSelectedIndexChanged="ddlFilterableColumns_SelectedIndexChanged">
</asp:DropDownList>

<input type="hidden" runat="server" ID="FilterColumns" />
<dd:DynamicFilterRepeater runat="server" ID="FilterRepeater">
  <ItemTemplate>
      <div>
          <asp:Label ID="lblDisplayName" runat="server"
              Text='<%# Eval("DisplayName") %>'
              OnPreRender="lblDisplayName_PreRender" />
          <asp:DynamicFilter runat="server" ID="DynamicFilter" />
      </div>
  </ItemTemplate>
</dd:DynamicFilterRepeater>

Remember I mentioned the two-stage filter control instantiation and a storage for the list of filtered columns? Yes, this user control is the place where the list of filtered columns is stored. To get the list of filtered columns before the Page.InitComplete event I'm using a little trick - a hidden input field serves as storage for the filtered columns list. Forcing the hidden input to have its ID generated on the server makes it possible to retrieve the value directly from the Page.Form collection at any stage of the page lifecycle.

public partial class DynamicFilterForm : UserControl
{
  public DynamicFilterRepeater FilterRepeater;
  public Type FilterType { get; set; }

 [IDReferenceProperty(typeof(GridView))]
  public string GridViewID { get; set; }

 [IDReferenceProperty(typeof(QueryExtender))]
  public string QueryExtenderID { get; set; }

  private MetaTable MetaTable { get; set; }
  private GridView GridView { get; set; }
  protected QueryExtender GridQueryExtender { get; set; }

  protected override void OnInit(EventArgs e)
  {
      base.OnInit(e);
      MetaTable = MetaTable.CreateTable(FilterType);

      GridQueryExtender = this.FindChildControl<QueryExtender>(QueryExtenderID);
      GridView = this.FindChildControl<GridView>(GridViewID);
      GridView.SetMetaTable(MetaTable);

      // Tricky thing to retrieve list of filter columns directly from hidden field
      if (!string.IsNullOrEmpty(Request.Form[FilterColumns.UniqueID]))
          FilterRepeater.FilterColumns.AddRange(Request.Form[FilterColumns.UniqueID].Split(','));

      ((IFilterExpressionProvider)FilterRepeater).Initialize(GridQueryExtender.DataSource);
  }

  protected override void OnPreRender(EventArgs e)
  {
      FilterColumns.Value = string.Join(",", FilterRepeater.FilterColumns);
      base.OnPreRender(e);
  }
  // event handlers omitted
}

Conclusions

While this solution works, I'm a bit concerned about it. The existing infrastructure was in my way the whole time I experimented with IFilterExpressionProvider, and I had to look deep inside the mechanisms of Dynamic Data to understand and find ways around its restrictions. And this leads me to only one conclusion - Dynamic Data was not designed to provide configurable filtering. So my answer to the question about the possibility of implementing a configurable filtering experience with Dynamic Data is yes, but be careful what you wish for, since it was not designed for this kind of scenario.

I did not mention here how to save filters, but it is pretty simple: all we need is to save an associative array of "column-value" pairs for a specific page somewhere. The complete source code is available on GitHub; you will need Visual Studio 11 Beta with localdb set up to run the sample project.
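As an illustration of the saving idea, here is a sketch (not part of the published sample; the member names are hypothetical) that keeps the selected "column-value" map in session state, keyed by page:

using System.Collections.Generic;
using System.Web.UI;

// Hypothetical sketch: persist the "column-value" filter map per page in session state.
public partial class DynamicFilterForm : UserControl
{
    private string FiltersKey
    {
        get { return "Filters:" + Request.Path; }
    }

    private void SaveFilters(Dictionary<string, string> columnValues)
    {
        Session[FiltersKey] = columnValues;
    }

    private Dictionary<string, string> LoadFilters()
    {
        return Session[FiltersKey] as Dictionary<string, string>
            ?? new Dictionary<string, string>();
    }
}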

I would gladly accept criticism, ideas or just thoughts on this particular scenario. Share, code and have fun!

What Types Were Added, Moved or Became Public in .NET 4.5 Beta

There is a whitepaper about the new features of ASP.NET 4.5, dozens of blog posts, videos from conferences, some tutorials and an MSDN topic describing the overall changes. But why are there no reports about what types were actually added in .NET 4.5 Beta? Where are the lists of "new types added to assembly N" or "types that became public in assembly N"? I understand that these lists are not very interesting and that it is more convenient to read descriptions of the new features. Nobody cares about details until they start working against you. And this is normal, right and okay, but now I have some free time and I want to share info about new types that were added or became public in .NET 4.5 compared to .NET 4.0.

Why did I become so interested in this? Well, I accidentally found that the System.Web.DynamicData.IFilterExpressionProvider interface became public. Briefly, this allows working with QueryExtender and finally (I have really wanted this since .NET 3.5) supporting its DynamicFilterExpression. I've already posted about my experiments with IFilterExpressionProvider - take a look if you are interested - and now let's return to the main topic.

It is pretty simple to diff compiled assemblies if you have NDepend or a similar tool, but what if you (like me) have no license? I started thinking about enumerating public types via reflection, but soon recalled that the .NET 4.5 Beta replaces the assemblies of .NET 4.0 during installation, so Assembly.LoadFrom will not work. To overcome this I decided to parse and compare the XML documentation that comes along with all .NET assemblies. Simple as that: every public type is documented, and the difference between the old and new versions of the documentation will give me at least the names of the types.

Ok, where do we get the XML documentation files for .NET 4.0? Binaries with XML docs for .NET 4.0 and 4.5 are located in C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework (drop the "(x86)" part on 32-bit systems).

Reference assemblies folder location

What I wanted was to get some statistics. There are 969 new public types in .NET 4.5. That does not mean they are all completely new things; it means that out of the box .NET 4.5 Beta has 969 more public types than .NET 4.0, for a total of 14971 public and documented types in .NET 4.5. Almost 15K public types alone - that's an incredibly huge number.

Round Diagram showing types count of .NET 4.0 and .NET 4.5 Beta

Most of the new types are located in the System.IdentityModel, System.Web and System.Windows.Controls.Ribbon assemblies. Taking into account that System.IdentityModel provides authentication and authorization features and System.Windows.Controls.Ribbon is a UI library allowing use of the Microsoft Ribbon for WPF, we can conclude that a vast amount of the new changes is connected with the web.

Histogram of new types count by assembly

But the most interesting thing was to examine the minor changes and see that something new and really useful has been added. I encourage you to look over the list of new classes - I bet you will find something interesting.

The LINQPad script with which I did the documentation comparison is listed below. An Excel report with some diagrams is also available online.

void Main()
{
  // in case of non 64 bit system change "Program Files (x86)" to "Program Files"
  string net40Dir = @"C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.0\";
  string net45Dir = @"C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.5\";

  // 1. Get all types from all xml doc files in both directories that are containing .NET assemblies and group them by assemblies
  var net40Grouped = GetPublicTypesByAssembly(net40Dir);
  var net45Grouped = GetPublicTypesByAssembly(net45Dir);

  // 2. Get list of newly added assemblies
  var newAssemblies = net45Grouped.Where(kvp => !net40Grouped.ContainsKey(kvp.Key)).ToList();
  Console.WriteLine("New assemblies in .NET 4.5 Beta: (total count - {0})", newAssemblies.Count);
  newAssemblies.ForEach(kvp => Console.WriteLine(kvp.Key));

  Console.WriteLine();

  // 3. Get all assemblies that are not present in .NET 4.5 beta
  var nonExistentAssemblies = net40Grouped.Where(kvp => !net45Grouped.ContainsKey(kvp.Key)).ToList();

  Console.WriteLine("Assemblies that are not present in .NET 4.5 Beta folder comparing to .NET 4.0: (total count - {0})", nonExistentAssemblies.Count);
  nonExistentAssemblies.ForEach(kvp => Console.WriteLine(kvp.Key));

  Console.WriteLine();

  // 4. Get all new types in .NET 4.0 and .NET 4.5 Beta assemblies
  var net40 = net40Grouped.SelectMany(kvp => kvp.Value).ToList();
  var net45 = net45Grouped.SelectMany(kvp => kvp.Value).ToList();

  var newTypes = net45.Except(net40).ToList();
  Console.WriteLine("Types count in .NET 4.0:|{0}", net40.Count);
  Console.WriteLine("Types count in .NET 4.5 Beta:|{0}", net45.Count);
  Console.WriteLine("New types count in .NET 4.5 Beta comparing to .NET 4.0:|{0}", newTypes.Count);

  // 5. Get assemblies that are containing new types
  var assembliesWithChanges = net45Grouped.Where(kvp => newTypes.Any(type => kvp.Value.ContainsValue(type.Value)));

  // 6. Remove existent in .NET 4.0 types from assembliesWithChanges to get clear lists of new types grouped by assemblies
  var newTypesGrouped = assembliesWithChanges
      .ToDictionary(typesGroup => typesGroup.Key, typesGroup => typesGroup.Value.Except(net40)
      .Select(kvp => kvp.Value).ToList());

  Console.WriteLine("New types by assembly:");
  foreach (var assemblyWithNewTypes in newTypesGrouped)
  {
      Console.WriteLine("{0}|{1}", assemblyWithNewTypes.Key, assemblyWithNewTypes.Value.Count);
      foreach (var typeName in assemblyWithNewTypes.Value)
      {
          Console.WriteLine(typeName);
      }
      Console.WriteLine();
  }
}

Dictionary<string, Dictionary<int, string>> GetPublicTypesByAssembly(string xmlDocsDirectory)
{
  string[] xmlDocFiles = Directory.GetFiles(xmlDocsDirectory, "*.xml");
  var result = new Dictionary<string, Dictionary<int, string>>();

  foreach (var xmlDoc in xmlDocFiles)
  {
      var root = XDocument.Load(xmlDoc).Root;
      if (root == null) continue;

      var members = root.Element("members");
      if (members == null) continue;

      var typesByAssembly = members.Elements("member")
          .Where(e => e.Attribute("name").Value.StartsWith("T:"))
          .ToDictionary(e => e.Attribute("name").Value.GetHashCode(), e => e.Attribute("name").Value.Substring(2) /* T: */);

      result.Add(Path.GetFileNameWithoutExtension(xmlDoc) + ".dll", typesByAssembly);
  }

  return result;
}

And at the end here are links that will help a bit to embrace the changes of .NET 4.5 Beta:

Happy digging in the new .NET and don't hesitate to share!

How Internet Archive Saved My Day

UPDATE: The igoogle_themes.zip archive is no longer available through the Wayback Machine, since it points to mattberseth2.com, which is not working. However, the archive can be found on my SkyDrive.

Internet Wayback Machine Logo

Today, when I had to find a theme for an ASP.NET GridView, the first resource that came to mind was Matt Berseth's blog (Google also found something for me, but I'm convinced that the "favorites list" in my memory is a much better and more reliable source). Matt had great examples of AJAX control extenders and some other things connected with the styling of ASP.NET controls on his site. But while the domain still belongs to Matt Berseth, the site is currently down and unavailable.

Well, I quickly found some references to the article 5 GridView Themes Based on Google's Personalized Homepage (igoogle) and tried to get to it with the help of The Internet Archive (aka The Wayback Machine). From the "About the Wayback Machine" section:

Browse through over 150 billion web pages archived from 1996 to a few months ago.

In fact it had never helped me before, but back then I was looking for rarely visited pages or big downloads, so that's nothing to complain about. This time I was interested in recovering a popular resource, and to my relief the page had been crawled 20 times since the 3rd of November, 2007. And the sample project download was also available! So it took me about 20 minutes to get what I wanted, which is nothing compared to the effort needed to create my own CSS for a GridView.

Thank you Internet Archive, you saved my day!

Building Data Annotations Validator Control With Client-Side Validation

When I worked on an ASP.NET MVC project I really liked how input is validated with Data Annotations attributes. And when I had to return to Web Forms and write a simple form with some validation, I wondered how I had lived before with the standard validator controls. For me it was never convenient to write an enormous amount of server tags just to state "this is a required field which accepts only numbers in a specified range…". Yes, there is nothing terrible in declaring two or three validation controls instead of one. But if I had a choice, I would like to write only one validator per field and keep all input validation logic as far away from the UI markup as I can. And being a developer, the code-only approach is the most natural for me.

The System.ComponentModel.DataAnnotations namespace was introduced in .NET 3.5, and now its classes are supported by a wide range of technologies like WPF, Silverlight, RIA Services, ASP.NET MVC and ASP.NET Dynamic Data - but not Web Forms. I thought that someone had already implemented a ready-to-use validator control with client-side validation, but after searching the Web and the most popular open source hosting services I found nothing. Ok, not nothing, but the implementations I found lacked client-side validation and had some other issues. So I decided to write my own Data Annotations validator that would also support client-side validation.

Creating Data Annotations Validator Control

Server-Side

As I wanted to achieve compatibility with the existing validation controls (the new validator is not a replacement for the old ones, just an addition to them), I decided to inherit from BaseValidator. This class does all the necessary initialization on both client and server sides and exposes all the necessary methods for overriding.

public class DataAnnotationsValidator : BaseValidator
{ }

First of all, the EvaluateIsValid method of BaseValidator should be overridden.

protected override bool EvaluateIsValid()
{
  object value = GetControlValidationValue(ControlToValidate);
  foreach (ValidationAttribute validationAttribute in ValidationAttributes)
  {
      // Here, we will try to convert value to the type specified on RangeAttribute.
      // RangeAttribute.OperandType should be either IConvertible or one of the built-in primitive types
      var rangeAttribute = validationAttribute as RangeAttribute;
      if (rangeAttribute != null)
      {
          value = Convert.ChangeType(value, rangeAttribute.OperandType);
      }

      if (validationAttribute.IsValid(value)) continue;

      ErrorMessage = validationAttribute.FormatErrorMessage(DisplayName);
      return false;
  }

  return true;
}

The only interesting aspect of this method is the error message handling: I'm using the FormatErrorMessage method of ValidationAttribute to get all the goodness like resource support and proper default error message formatting. So now there is no need to invent anything for error messages.

The next thing to deal with is where to get the ValidationAttributes collection. There is the System.Web.DynamicData.MetaTable class that can be used to retrieve attributes. It was introduced in the first versions of ASP.NET Dynamic Data, and in version 4.0 of Dynamic Data MetaTable has a static method CreateTable which accepts a Type as an input parameter. Why use MetaTable, why not retrieve the attributes of a Type directly from PropertyInfo for the specified property name? Because MetaTable also supports retrieving custom attributes that are applied to a property through MetadataTypeAttribute and merges attributes applied to the property in both the metadata class and the entity class. And again, why invent something new when everything that is needed is right here?

MetaTable.CreateTable(ObjectType)
  .GetColumn(PropertyName).Attributes
  .OfType<ValidationAttribute>()

Now let's look a bit into the future - if we place an ObjectType property on DataAnnotationsValidator, it means we would have to specify this property for every validator control on a page. This is redundant and leads to copy-pasting, which is not acceptable. Let's step aside and create a MetadataSource control that will act as a metadata provider for the validators on a page.

public class MetadataSource : Control
{
  public Type ObjectType { get; set; }

  private MetaTable metaTable;
  private MetaTable MetaTable
  {
      get { return metaTable ?? (metaTable = MetaTable.CreateTable(ObjectType)); }
  }

  public IEnumerable<ValidationAttribute> GetValidationAttributes(string property)
  {
      return MetaTable.GetColumn(property).Attributes.OfType<ValidationAttribute>();
  }

  public string GetDisplayName(string objectProperty)
  {
      var displayAttribute = MetaTable.GetColumn(objectProperty).Attributes
          .OfType<DisplayAttribute>()
          .FirstOrDefault();

      return displayAttribute == null ? objectProperty : displayAttribute.GetName();
  }
}

Here I also thought about DisplayAttribute, which is used to format the default error message. Now, how should the ObjectType of MetadataSource be specified? Well, we can do it programmatically on Page_Load, or do it… programmatically with CodeExpressionBuilder, to keep all control setup in one place.

<dav:MetadataSource runat="server"
  ID="msFoo"
  ObjectType="<%$ Code: typeof(Foo) %>" />

Now, with the MetadataSource in place, all fields of DataAnnotationsValidator are initialized in the OnLoad method.

protected override void OnLoad(EventArgs e)
{
  base.OnLoad(e);

  if (!ControlPropertiesValid())
      return;

  MetadataSource = this.FindChildControl<MetadataSource>(MetadataSourceID);

  ValidationAttributes = MetadataSource.GetValidationAttributes(ObjectProperty);
  DisplayName = MetadataSource.GetDisplayName(ObjectProperty);
}

This is all that was needed to provide server-side validation.

Client-Side

First of all, let's check which standard validator controls can be replaced with Data Annotations attributes.

Data Annotations Attribute   | Standard Validator Control
RequiredAttribute            | RequiredFieldValidator
StringLengthAttribute        | -
RegularExpressionAttribute   | RegularExpressionValidator
-                            | CompareValidator
RangeAttribute               | RangeValidator

I have no idea how to replace CompareValidator, and I don't think it is critical enough to dwell on. Time to look at how the standard validator controls work on the client side.

Every validator that works on the client side should override the AddAttributesToRender method of the BaseValidator class. This method adds fields to the resulting javascript object. For example, RequiredFieldValidator adds the evaluationfunction and initialvalue fields.

protected override void AddAttributesToRender(HtmlTextWriter writer) {
    base.AddAttributesToRender(writer);
    if (RenderUplevel) {
        string id = ClientID;
        HtmlTextWriter expandoAttributeWriter = (EnableLegacyRendering) ? writer : null;
        AddExpandoAttribute(expandoAttributeWriter, id, "evaluationfunction", "RequiredFieldValidatorEvaluateIsValid", false);
        AddExpandoAttribute(expandoAttributeWriter, id, "initialvalue", InitialValue);
    }
}

And the resulting javascript block for RequiredFieldValidator will look like this:

<script type="text/javascript">
//<![CDATA[
var rfvSampleTextBox = document.all ? document.all["rfvSampleTextBox"] : document.getElementById("rfvSampleTextBox");
rfvSampleTextBox.controltovalidate = "tbSampleTextBox";
rfvSampleTextBox.evaluationfunction = "RequiredFieldValidatorEvaluateIsValid";
rfvSampleTextBox.initialvalue = "";
//]]>
</script>

After examining the source code of the standard validator controls, I found that every control sets the evaluationfunction field, which names the javascript function that actually performs validation on the client side. The RequiredFieldValidator evaluation function is shown below.

function RequiredFieldValidatorEvaluateIsValid(val) {
    return (ValidatorTrim(ValidatorGetValue(val.controltovalidate)) != ValidatorTrim(val.initialvalue))
}

The val parameter is the validator object that was initialized with all the fields set in the AddAttributesToRender method. Plain and simple: if you need to supply your validator on the client side with some information, override AddAttributesToRender and add what you want. To replace the standard validators, DataAnnotationsValidator does a little trick - it adds all the standard evaluationfunction names, error messages and all the necessary fields used by the standard validation functions. The evaluation function of DataAnnotationsValidator:

<script type="text/javascript">
//<![CDATA[
function DataAnnotationsValidatorIsValid(val) {
    var functionsToEvaluate = val.validatorFunctions.split(';;');
  var errorMessages = val.errorMessages.split(';;');

    for (var funcIndex in functionsToEvaluate) {
        var result = eval(functionsToEvaluate[funcIndex] + "(val)");
        if(result === false) {
          val.errormessage = errorMessages[funcIndex];
          val.innerText = errorMessages[funcIndex];
            return false;
        }
    }
    return true;
}
//]]>
</script>

This function is registered in the OnPreRender stage of the control lifecycle.

protected override void OnPreRender(EventArgs e)
{
  base.OnPreRender(e);

  if (RenderUplevel)
  {
        var scriptManager = ScriptManager.GetCurrent(Page);
        if (scriptManager != null && scriptManager.IsInAsyncPostBack)
            ScriptManager.RegisterClientScriptResource(this, GetType(), DAValidationScriptFileName);
        else
          Page.ClientScript.RegisterClientScriptResource(GetType(), DAValidationScriptFileName);
  }
}

The only thing that remains is to get the list of fields and values needed by the validation functions. Every Data Annotations validation attribute has an Adapter class that stores an array of ClientValidationRule objects. ClientValidationRule is just a container for storing javascript object field names and the evaluationfunction.

public class ClientValidationRule
{
    public string EvaluationFunction { get; set; }
    public Dictionary<string, object> Parameters { get; private set; }

    public string ErrorMessage { get; set; }

    public ClientValidationRule()
    {
        Parameters = new Dictionary<string, object>();
        EvaluationFunction = string.Empty;
    }
}

And ValidationAttributeAdapter acts as a bridge between an existing ValidationAttribute and its ClientValidationRules.

internal class ValidationAttributeAdapter
{
    protected ValidationAttribute Attribute { get; set; }
    protected string DisplayName { get; set; }
    protected string ErrorMessage { get; set; }

    public ValidationAttributeAdapter(ValidationAttribute attribute, string displayName)
    {
        Attribute = attribute;
        DisplayName = displayName;
        ErrorMessage = Attribute.FormatErrorMessage(DisplayName);
    }

    public virtual IEnumerable<ClientValidationRule> GetClientValidationRules()
    {
        return Enumerable.Empty<ClientValidationRule>();
    }
}

All ValidationAttributeAdapter classes are registered within ValidationAttributeAdapterFactory in a Dictionary.

private static readonly Dictionary<Type, Func<ValidationAttribute, string, ValidationAttributeAdapter>> PredefinedCreators
    = new Dictionary<Type, Func<ValidationAttribute, string, ValidationAttributeAdapter>>
    {
        {
            typeof(RangeAttribute),
            (attribute, displayName) => new RangeAttributeAdapter(attribute as RangeAttribute, displayName)
        }, {
            typeof(RegularExpressionAttribute),
            (attribute, displayName) => new RegularExpressionAttributeAdapter(attribute as RegularExpressionAttribute, displayName)
        }, {
            typeof(RequiredAttribute),
            (attribute, displayName) => new RequiredAttributeAdapter(attribute as RequiredAttribute, displayName)
        }, {
            typeof (StringLengthAttribute),
            (attribute, displayName) => new StringLengthAttributeAdapter(attribute as StringLengthAttribute, displayName)
        }
    };

public static ValidationAttributeAdapter Create(ValidationAttribute attribute, string displayName)
{
    Debug.Assert(attribute != null, "attribute parameter must not be null");
    Debug.Assert(!string.IsNullOrWhiteSpace(displayName), "displayName parameter must not be null, empty or whitespace string");

    // Added support for ValidationAttribute subclassing. See http://davalidation.codeplex.com/workitem/695
    var baseType = attribute.GetType();
    Func<ValidationAttribute, string, ValidationAttributeAdapter> predefinedCreator;
    do
    {
        if (!PredefinedCreators.TryGetValue(baseType, out predefinedCreator))
            baseType = baseType.BaseType;
    }
    while (predefinedCreator == null && baseType != null && baseType != typeof(Attribute));

    return predefinedCreator != null
        ? predefinedCreator(attribute, displayName)
        : new ValidationAttributeAdapter(attribute, displayName);
}

As I said previously, the idea was borrowed directly from ASP.NET MVC, so if you are familiar with its validation mechanisms you don't need to learn how it works here. The approach is almost identical to ASP.NET MVC's, which was described well by Brad Wilson. As in ASP.NET MVC, the DAValidation.IClientValidatable interface exposes an extension point: we can now create a custom validation attribute, implement the IClientValidatable interface, write a validation function or mix existing ones, and get both server- and client-side validation. There is a great set of validation attributes - Data Annotations Extensions created by Scott Kirkland - so only the client function has to be changed in order to use them with DataAnnotationsValidator.

public interface IClientValidatable
{
    IEnumerable<ClientValidationRule> GetClientValidationRules();
}
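To illustrate how an adapter plugs the pieces together, here is a minimal sketch of what a RequiredAttributeAdapter could look like - an assumed implementation based on the descriptions above (the standard RequiredFieldValidator client function expects the initialvalue field):

using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;

// Sketch: reuse the standard client function of RequiredFieldValidator,
// which compares the control value against the "initialvalue" field.
internal class RequiredAttributeAdapter : ValidationAttributeAdapter
{
    public RequiredAttributeAdapter(RequiredAttribute attribute, string displayName)
        : base(attribute, displayName)
    { }

    public override IEnumerable<ClientValidationRule> GetClientValidationRules()
    {
        var rule = new ClientValidationRule
        {
            EvaluationFunction = "RequiredFieldValidatorEvaluateIsValid",
            ErrorMessage = ErrorMessage
        };
        rule.Parameters["initialvalue"] = string.Empty;
        yield return rule;
    }
}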

And that's all - now we have a fully functional control that makes validation with Data Annotations possible in the ASP.NET Web Forms universe.

The complete source code can be found on CodePlex. There you can download the latest version of the control and an example project.