Make the ASP.NET DataPager SEO friendly?

Currently I’m working on a brand new .NET Framework v4 Project which uses all the new stuff of the new framework version, so I decided to try out the ASP.NET DataPager once more. It was already introduced in previous .NET versions; however, at the beginning it only produced JavaScript/postback URLs which were not crawlable by search engines. I mentioned that on Scott Gu’s Blog back in the day, maybe that helped the process of including a “QueryString” option. You can define the DataPager using these options:

 <asp:DataPager runat="server" ID="DataPager1" PageSize="10" PagedControlID="listViewProducts" QueryStringField="pageNumber" />

By using the “QueryStringField” option it at least won’t create uncrawlable JavaScript/postback links. However, URLs like www.domain.com/?pageNumber=2 still won’t satisfy my needs. The best solution would be to integrate the DataPager with URL Routing. What we need for SEO is static URLs like www.domain.com/page/2/ and nothing less! I saw a post on CodeProject which explains how to extend the DataPager with that kind of search engine friendly URLs, but it has some annoying limitations:

  • It does not support postback for paging.
  • Only the Next/Previous etc., pager style is supported, not a list of page numbers with direct jump capability.
  • There is no declarative way to display paging information (e.g., “showing items 1-10 from 100, page 1 of 10”).

Due to those limitations I unfortunately still need to use my own custom solution. I would love to see a DataPager / URL Routing integration in the future!
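Just to illustrate what such an integration could start from, here is a minimal sketch using the URL Routing API that ships with ASP.NET 4.0. The route name, URL pattern and target page are my own assumptions for this example:

```csharp
using System;
using System.Web.Routing;

// Global.asax.cs: register a route so that www.domain.com/page/2/ serves the listing page.
public class Global : System.Web.HttpApplication
{
    void Application_Start(object sender, EventArgs e)
    {
        // Products.aspx can then read the value via
        // Page.RouteData.Values["pageNumber"] and hand it to the pager.
        RouteTable.Routes.MapPageRoute(
            "ProductPaging", "page/{pageNumber}", "~/Products.aspx");
    }
}
```

The page itself still has to translate the route value into a start row index for the pager, which is exactly the glue code I would like to see built in.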

Unrecognized attribute ‘targetFramework’

I’m currently playing around with ASP.NET 4.0 Beta 2. There are lots of bugs to fight with, but I’m looking forward to RC1, which should be available soon and which comes with a Go-Live License. When you put your ASP.NET v4.0 bits on IIS7, don’t forget to set your Application Pool to .NET Framework 4.x. Otherwise you will encounter the error mentioned in the title:

Unrecognized attribute ‘targetFramework’. Note that attribute names are case-sensitive.

Looks like the Website Project Template is not working in Beta 2, at least for me. Or did they remove it completely? So far I also wasn’t able to do a pre-compilation of my website; I can’t find that feature in Visual Studio 2010.
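In case it helps anyone: switching the pool to the v4.0 runtime can also be done from an elevated command prompt. The pool name “MyAppPool” below is just a placeholder for your own:

```shell
:: Point the application pool at the .NET 4 runtime ("MyAppPool" is a placeholder)
%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /managedRuntimeVersion:v4.0
```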

Still sticking to the DataReader most of the time

Of course this is a big topic. We got LINQ, SubSonic and all the great ORM Mappers now, but when it comes to performance it’s still a critical decision. I never really cared much about that when developing Desktop Applications as you usually don’t have to deal with more than one user at the same time.

We run several Web Applications based on ASP.NET and many of them get more than 10 million pageviews per day, so it is really critical to have a good database backend. I just want to talk about direct data access here, so I leave out all the caching stuff; that is a topic of its own. I tested the DataReader, the DataSet and some of the popular ORM tools using a standard SELECT query over 10,000 database rows. Here’s what I found out regarding performance:

  • Obviously the fastest way of accessing data is still the DataReader, at about 20-30ms.
  • The DataSet raises the bar to about 120ms already.
  • After that we got LINQ with about 210ms.
  • Then there’s ADO.NET with a whopping 400ms.
  • And the slowest ORM mapper is unfortunately NHibernate 2.1 with around 580ms.

So if you have to develop the Data Access for big Web Applications you should still stick as much as possible to the DataReader, no matter how (un)comfortable it is, at least for the really critical sections.
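To show the forward-only reading pattern I mean, here is a self-contained sketch. The table and column names are made up, and in a real Web Application the reader would of course come from SqlCommand.ExecuteReader() against SQL Server rather than from an in-memory DataTable:

```csharp
using System;
using System.Data;

static class ReaderDemo
{
    // Read every row through a forward-only IDataReader and return the row
    // count. This is the same access pattern SqlDataReader gives you,
    // without the overhead of filling a DataSet first.
    public static int CountRows(DataTable table)
    {
        int count = 0;
        using (IDataReader reader = table.CreateDataReader())
        {
            while (reader.Read())
            {
                // Column access by ordinal, exactly as with SqlDataReader.
                int id = reader.GetInt32(0);
                string name = reader.GetString(1);
                count++;
            }
        }
        return count;
    }

    static void Main()
    {
        // In-memory stand-in for a SELECT over a products table.
        var table = new DataTable("Products");
        table.Columns.Add("ID", typeof(int));
        table.Columns.Add("Name", typeof(string));
        for (int i = 1; i <= 3; i++)
            table.Rows.Add(i, "Product " + i);

        Console.WriteLine(CountRows(table));
    }
}
```

The key point is that the reader never materializes the whole result in memory, which is where the DataSet loses its 100ms.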

By the way: really big sites like Plenty Of Fish or MySpace, which are based on ASP.NET, don’t use caching at all. We’re talking about more than 1.2 billion page views/month here. It is not used because as soon as the data is put in the cache, it has already expired. Furthermore they use no components and just very simple structures consisting of if, then and for loops. Obviously a load balancer makes very much sense here. You usually also set aside a separate database for searching to take load off the main database.

ORM Mappers are sweet, but it’s still a critical decision to use them if your website might go through the roof sometime.

Windows Server 2008 FTP Service

Since the IIS7 Team published the FTP Server in version 7.5, life just became easier. It now also supports SSL and the setup is really comfortable. Get the executable from the URL mentioned underneath this post, install it, add Basic Authentication and a suitable FTP user, and execute these two lines from the command line:

  1. To open port 21: netsh advfirewall firewall add rule name="FTP (no SSL)" action=allow protocol=TCP dir=in localport=21
  2. To enable passive transfers: netsh advfirewall set global StatefulFtp enable

That’s pretty much it. Enjoy a thin and powerful FTP Server from Microsoft for free.

Check this post for all the updates.

Check For Row Existence Only

It’s something you need to take care of in almost every ASP.NET Application: what if you would just like to know whether a specific row exists or not in SQL Server 2005 or SQL Server 2008? For example: did this user already vote on that topic? There are different possibilities for that kind of situation. However, I wanted the fastest solution. Let’s say it’s about a query which looks like this:

SELECT somefield FROM sometable WHERE somefield = 'somevalue'

In this case ‘somevalue’ is a literal key value, and there is an index on ‘somefield’.

So if this query will always return either 0 or 1 rows, then – from an I/O point of view – using SELECT COUNT(*) or using EXISTS will be equally fast.

Why? Unfortunately SQL Server does not shortcut an index seek if the value is discovered in an intermediate index level. It also doesn’t shortcut it if the value is out of range. So we always get a logical read for all index levels.

For the COUNT(*) versus EXISTS discussion, it does not matter whether the index on ‘somefield’ is clustered or not. However, the definition of the clustered index (if present) does affect performance. That’s where testing comes in: you need to check whether the clustered one is faster or not. It mainly depends on the key size of the clustered index, the row size and the size of ‘somefield’. In theory, the fastest situation would be a clustered “and” a nonclustered index on ‘somefield’. This makes the nonclustered index on ‘somefield’ the most shallow index possible, so the index seek on this index will use the least amount of I/O.

Finally, handing over a result set will be more costly than returning a return value.
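A sketch of how that scalar-instead-of-resultset idea looks from C#. The table, column and connection string are the placeholder names from above; CASE WHEN EXISTS yields a single 0/1 value, so no result set has to be handed over:

```csharp
using System.Data.SqlClient;

static class RowCheck
{
    // Returns true if at least one matching row exists. The parameterized
    // query also keeps the plan cacheable and avoids SQL injection.
    public static bool RowExists(string connectionString, string somevalue)
    {
        const string sql =
            "SELECT CASE WHEN EXISTS " +
            "(SELECT * FROM sometable WHERE somefield = @value) " +
            "THEN 1 ELSE 0 END";
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@value", somevalue);
            conn.Open();
            return (int)cmd.ExecuteScalar() == 1;
        }
    }
}
```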

404 Error while using ASP.NET URL Routing

I’m heavily using the new ASP.NET URL Routing feature introduced in .NET Framework 3.5 SP1 in my new ASP.NET Applications instead of traditional URL Rewriting. I actually wanted to use IIS7 URL Rewriting, but there’s no way (yet) to test your rewrite rules if you are using the built-in VS2008 Cassini webserver for development.

So I’ve set up some sweet URL Routing rules with Webforms and I have to say it works pretty well so far. However, sooner or later I stumbled upon the first problem. I used an UpdatePanel on a webform which went through the URL Routing mechanism. As soon as I hit the postback button I got the following error:

Sys.WebForms.PageRequestManagerServerErrorException: An unknown error occurred while processing the request on the server. The status code returned from the server was: 404.

Obviously this happens because the form action attribute doesn’t contain the full URL. You can fix it by adding this to your Page_Load:

form1.Action = HttpContext.Current.Request.RawUrl;

I just added it into the MasterPage Page_Load as I’m using URL Routing all over the Page. That’ll fix it.

I’ve also seen this:

if (!String.IsNullOrEmpty(Request.ServerVariables["HTTP_X_ORIGINAL_URL"]))
{
    form1.Action = Request.ServerVariables["HTTP_X_ORIGINAL_URL"];
}

However, that snippet doesn’t work for me and I actually don’t know why. If that’s the case for you too, just use the RawUrl snippet above (without an if block).

Best Online Storage Service

Welcome to the broadband Internet. Nowadays I even get a 3.6MBit connection when I’m outside and connected through UMTS, so it’s not a problem anymore to move bigger files through the internet. What we want is to be able to carry and access our digital lives wherever we go and from whatever device we may have in hand: photos, videos, documents, contacts, spreadsheets and everything else we want to have access to. Can all this be stored comfortably in just one place, accessible from everywhere?

When you use these online apps, it’s because you’re saving a lot of time and putting lots of different people on the same page. It will be a natural process for companies to become more webcentric as they notice what the benefits are.

Edit documents online, manage multiple users and much more: I highly recommend trying this provider 14 days for free. My experience is that this is the easiest way to share and manage your files online, any time and anywhere.

New .NET Logo

I just stumbled upon the new .NET Logo which will be replacing the old and rusty, 8-year-old .NET Logo. The new one looks way more professional.

Old .NET Framework Logo:
Old .NET Framework Logo

New .NET Framework Logo:
New .NET Framework Logo

Side note: there is a pretty interesting discussion going on about the Amazon EC2 W2K3 topic. If you didn’t check out the thread at the AWS forum yet you should have a look at it; check my previous blog post here.

Amazon launches Windows.. BUT

Yesterday Amazon launched a new EC2 Version including support for Windows and SQL Server.

Unfortunately it only runs Windows Server 2003 and SQL Server 2005, just now that we are all happy about Windows Server 2008, the new IIS7, the new URL Rewriting features and so on.

This really is a pity. I’d really like to use Amazon EC2, but I don’t want to take two steps back in development. Once you worked with IIS7 you don’t want to switch back to the old and clumsy IIS6. Especially URL Rewriting is so nice with IIS7.

There is a thread running on the EC2 Forum, but until now no one from Amazon has replied. If you’re registered there you might want to drop a reply, too: http://developer.amazonwebservices.com/connect/thread.jspa?threadID=25660.

UPDATE:

Thanks for all of your feedback. Our intention is to support the widest variety of options for our customers that we can. We are already working to support Windows 2008 in EC2, and anticipate being able to offer it publicly in the early part of next year.

C# CSV Import

An old topic, I agree. Unfortunately we still can’t forget about CSV files, and I see so many questions popping up on forums and especially on my blog related to this topic. Not every company offers APIs or XML files yet, so it’s always nice to be comfortable with CSV readers or libraries.

First of all: don’t write your own CSV parser! No matter if it’s a C# CSV import or VB.NET, there are so many libraries, frameworks, etc. that you really don’t have to reinvent the wheel once again. It’s not just about using String.Split(","); you need to care about much more nasty stuff. Fortunately other people already went through the headaches for you!

So what to do? Use the FileHelpers Library 2.0! Instead of accessing array items by writing myarray[3] you can now use e.g. mycart.ID. Isn’t that beautiful? Here’s a little code example:

// A record class describing the CSV layout (the field names are examples)
[DelimitedRecord(",")]
public class AdRow
{
    public int ID;
    public string CartItem;
}

FileHelperEngine engine = new FileHelperEngine(typeof(AdRow));
AdRow[] res = engine.ReadFile(filename) as AdRow[];
foreach (AdRow ar in res)
{
    Response.Write(ar.CartItem + "<br />");
}

It won’t get easier than that. I use it in several ASP.NET Projects and best of all: It’s free! Check it out and thanks to Marcos for creating it: FileHelpers 2.0. You will find tons of examples for reading and writing CSV Files.

URL Rewriting with IIS7

Finally Microsoft released the long anticipated IIS7 URL Rewrite Module. It really was about time! The key features look promising:

  • Rules-based URL rewriting engine
  • Regular expression pattern matching
  • Wildcard pattern matching
  • Global and distributed rewrite rules
  • Access to server variables and http headers
  • Various rule actions
  • Rewrite maps
  • UI for managing rewrite rules
  • GUI tool for importing of mod_rewrite rules

Quite comfortable that they even integrated a GUI tool for importing Apache2 mod_rewrite rules. I’m not sure yet how strong URL Routing is, which first came up with the MVC CTPs and is now available for ASP.NET, too; I didn’t test it yet. For now I will give the new IIS7 Module a first shot, as my current projects already use URL Rewriting and I think it’ll be easier to port my existing third-party URL Rewrite engine to the new IIS7 one. I plan on not touching URL Routing until it reaches final status.
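To give an idea of what a rule looks like, here is a small web.config sketch for the new module. The rule name, pattern and target page are my own example values:

```xml
<!-- web.config sketch: rewrites /page/2/ to /default.aspx?pageNumber=2 -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="PageNumber" stopProcessing="true">
        <match url="^page/([0-9]+)/?$" />
        <action type="Rewrite" url="default.aspx?pageNumber={R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

The `{R:1}` back-reference carries the first captured group of the regular expression into the rewritten URL.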

Get it here: IIS7 – using url rewrite module.

Windows 2008 Server Mailserver

Last week I installed Windows Server 2008 Standard on one of our dedicated servers. So far I really like it: IIS7 seems to be pretty nice and the administration features evolved a lot. I also like the improved Remote Desktop; it logs in way quicker and feels smoother than before. The installation itself was a piece of cake, and even the beta drivers of our 3Ware RAID-1 controller are working flawlessly. We plugged in 8GB of RAM but can only use 4GB, as we have to use the 32-bit edition of Windows Server 2008. Some payment gateways do not provide 64-bit software yet, and that’s why we had to stick with the rusty 32-bit bits, but it’s OK.

After setting up a couple of ASP.NET webs we had to take care of the e-mail delivery. These days you should choose your Mailserver carefully, as the rules of e-mail delivery have been tightened considerably. So besides avoiding spam filters with your ASP.NET Application, you have to be even more critical about your Mailserver.

Windows Server 2008 comes with a built-in SMTP Service which still runs on old IIS6 bits; it’s one of a couple of services depending on IIS6, which is why IIS6 is still installed. Comparing the IIS6 and IIS7 administration consoles is like comparing two different worlds. After playing around with it a bit I wasn’t able to set it up as a perfect mail delivery service. I think it only works great if you can bump your mails to an external, dedicated SMTP service. Another drawback is that Windows Server 2008 no longer ships with a POP3 service, so after all you need a standalone Mailserver anyway if you want to use POP3.

Not all of us are blessed with a dedicated Microsoft Exchange Server which takes care of all the mailing stuff. Installing it on our current Windows Server 2008 system would steal too many resources which are desperately needed for our high-traffic webs.

We can’t use Google Apps as they limit the outgoing mails to 500 per day (per account). Of course we could work around that by using several accounts and running a counter on the sent mails, but we wanted something bulletproof using just one e-mail address like noreply@domain.com.

Mercury Mailserver is a free standalone Mailserver by the creator of Pegasus Mail. It’s pretty light but offers SMTP, POP3 and even IMAP. However, we ran into pretty heavy DNS problems while using it and it ran somewhat unstable on Windows Server 2008. Another drawback is that it doesn’t come with a Windows Service; there are some addons for that, but it’s just too unstable.

There are a couple of other free Mailservers like hMailServer and Surge, but none of them ran stable on Windows Server 2008; it looks like they’re not yet compatible with the Vista-like architecture.

So we ended up with Kerio Mailserver. It’s already Vista-compatible and runs great on Windows Server 2008. It has a lot of features, in fact more than we needed, but it’s still smaller than Microsoft Exchange. It’s also not that cheap, but at least our proper mail delivery is guaranteed, which is very important for stuff like activation mails. The alternative would have been SmarterMail, which is based on ASP.NET, but we went for Kerio this time as we have had good experiences with it in the past.

If any of you guys have additional information or tips on this topic, feel free to leave a comment! As of now there’s no free Mailserver for Windows Server 2008 available; at least the ones I found didn’t run stable yet.

SkyDrive without Germany?

I just read about the bigger, better and faster SkyDrive Release on the SkyDrive Team Blog.

SkyDrive is also available now in 38 countries/regions. In addition to Great Britain, India, and the U.S., we’re live in Argentina, Australia, Austria, Belgium, Bolivia, Brazil, Canada, Chile, Colombia, Denmark, the Dominican Republic, Ecuador, El Salvador, Finland, France, Guatemala, Honduras, Italy, Japan, Mexico, the Netherlands, New Zealand, Nicaragua, Norway, Panama, Paraguay, Peru, Puerto Rico, Portugal, South Korea, Spain, Sweden, Switzerland, Taiwan, and Turkey.

Now seriously guys, Microsoft is pretty big in Germany, so why are we always missing the cool stuff? OK, we can use GMail as free online storage, but maybe we’d like to stick to Microsoft stuff, too. We can’t use AdCenter and we can’t use SkyDrive in Germany, and there are probably lots of other Microsoft services we can’t use here. Does anyone have an explanation for this? I’m clueless.

Visiting VSone in Munich tomorrow

I’m visiting the VSone in Munich, Germany tomorrow. The topics are Visual Studio, .NET and SQL with a focus on the new .NET 3.5 Framework. It will take place in the IMAX Cinema on very big screens. I’ll arrive around 9 AM and I will also visit the party afterwards, which is taking place in the famous Planetarium.

If you happen to be there, feel free to drop me a line via e-mail or add me on Xing!

SubSonic 2.1 Beta released

Just in case you’ve never heard about SubSonic yet: it’s a kickass DAL and I use it for a lot of my projects. Currently it supports SQL Server 2000 and 2005, MySQL and Oracle (with SQLite and SQL CE as well). They also offer a nice Starter Kit which comes pre-wired with SubSonic, Membership, AJAX, useful utilities and the FCK Editor. It’s based on the .NET 2.0 Framework but is automatically converted by Visual Studio 2008 in case you already use v3.5 (which I strongly recommend).

With v2.1 they shipped a new Query Tool which is now fully capable of creating more complex queries, for example:

  Northwind.CustomerCollection customersByCategory = new Select()
        .From(Northwind.Customer.Schema)
        .InnerJoin(Northwind.Order.CustomerIDColumn, Northwind.Customer.CustomerIDColumn)
        .InnerJoin(Northwind.OrderDetail.OrderIDColumn, Northwind.Order.OrderIDColumn)
        .InnerJoin(Northwind.Product.ProductIDColumn, Northwind.OrderDetail.ProductIDColumn)
        .Where("CategoryID").IsEqualTo(5)
        .ExecuteAsCollection<Northwind.CustomerCollection>();

Read more about the new version here. Check it out, I don’t want to miss it anymore ;).