
Friday, February 29, 2008

Microsoft's ADO.NET Team readies Entity Framework and Tools 1.0 for release as a VS 2008 add-in with enterprise-level features that LINQ to SQL doesn't offer -- domain object modeling, flexible inheritance techniques, support for multiple database vendors, and do-it-yourself n-tier deployment.

 

The ADO.NET Entity Framework (EF) will be Microsoft's first production-quality object/relational mapping (O/RM) platform, and it promises to be fully competitive with entrenched open source and commercial O/RM tools and object-persistence code generators for .NET. The goal of these tools is to minimize the time and effort required to transfer business objects to and from storage in relational databases. O/RM tools reduce the programming disconnect between relational and object models and, in many cases, generate the class code for the business objects. O/RM platforms -- often combined with automated, template-based Web-site generation frameworks (scaffold generators), such as Ruby on Rails and SubSonic -- have become critical components of professional developers' toolkits. .NET O/RM tools and code generators are a growth industry: In early 2008, the Sharp Toolbox's .NET Object-Relational Mapping category listed 48 products, not including EF and LINQ to SQL, most of them commercial offerings. NHibernate and NPersist are widely used open source O/RM tools for .NET; LLBLGen Pro and WilsonORMapper are popular commercial offerings.

 

The Entity Data Model (EDM), Entity SQL (eSQL) query language, and LINQ to Entities are the components that distinguish EF from other .NET O/RMs. Microsoft released EF beta 3 and the EF Tools community technical preview (CTP) 2 in early December 2007; you can expect the release to manufacturing (RTM) versions in the first half of 2008. I'll give you a quick EF refresher, describe what's new in EF Beta 3 and EF Tools CTP 2, show you how to quickly create an EDM from the Northwind sample database, demonstrate simple eSQL and LINQ to Entities queries, preview the ADO.NET Team's EntityBag components for n-tier EF deployment, and compare EF with LINQ to SQL for production object persistence use.

EDM is an entity-relationship (ER) data model that's based on the pioneering work of Dr. Peter Chen, who introduced the concept as a "Unified View of Data" in a 1976 Association for Computing Machinery paper (see Additional Resources). EDM defines entities as instances of EntityTypes (Order, for example) and EntitySets as keyed collections of entities (Orders). An EntityKey (usually--but not necessarily--a primary key, such as OrderID) implements object identity with an EntityReference, which uniquely identifies an entity instance for creating, retrieving, updating, or deleting (CRUD operations) and prevents creating duplicate instances in memory. The EntityKey lets entities participate in relationships, which are logical connections between entities called associations. EntitySets implement 1:many associations (Customer.Orders) and EntityReferences (Order.Employee) implement many:1 Associations. NavigationProperty instances identify entities at either End of an Association (see Figure 1). The Multiplicity of an association's End is indicated by 0..1 for zero or one, 1 for exactly one, and * for many. Unfortunately, EF also introduces a terminology disconnect for relationally oriented .NET developers.
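Under the covers, the conceptual schema expresses these constructs in XML. This hypothetical CSDL fragment (element names follow the CSDL schema; the entity and role names are illustrative, not copied from the Northwind model) sketches a many:1 Association with its two Ends, their Multiplicity values, and the NavigationProperty that exposes one End:

```xml
<Association Name="FK_Orders_Customers">
  <!-- Zero or one Customer... -->
  <End Role="Customer" Type="nwModel.Customer" Multiplicity="0..1" />
  <!-- ...relates to many Orders -->
  <End Role="Order" Type="nwModel.Order" Multiplicity="*" />
</Association>

<EntityType Name="Order">
  <Key>
    <PropertyRef Name="OrderID" />
  </Key>
  <Property Name="OrderID" Type="Int32" Nullable="false" />
  <!-- The NavigationProperty identifies the Customer End of the Association -->
  <NavigationProperty Name="Customer"
      Relationship="nwModel.FK_Orders_Customers"
      FromRole="Order" ToRole="Customer" />
</EntityType>
```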

Layer Logical and Conceptual Schemas
Three layers of XML mapping files implement the EDM, which minimizes dependency of domain object design on the underlying database schema. An EF-enabled ADO.NET data store provider connects the database instance to the logical data store layer, which a ModelName.ssdl schema file matches to the physical database schema. A ModelName.msl mapping schema file defines the relationship between the logical layer and the conceptual layer's ModelName.csdl schema that defines the EDM (see Figure 2). The mapping schema isolates the EDM at the conceptual layer from subsequent changes to the database schema or vice versa and enables support for the three common domain object inheritance models: table per hierarchy (TPH), table per type (TPT), and table per concrete type (TPCT). It's possible to design your domain model in EF and then implement the database and mapping schemas by hand. However, it's much more practical to use EF v1's graphical mapping tools and code generator to create the mapping schemas and classes.

The ADO.NET Team describes EntityClient as an ADO.NET data provider that's "a gateway for entity-level queries." EntityClient executes queries against the EDM's conceptual layer by using its own provider-agnostic query language, eSQL, with the familiar Connection, Command, and Parameter objects whose names carry an Entity prefix. EntityCommands that you run against EntityClient can execute eSQL text and parameterized queries as well as stored procedures. The EntityDataReader returns tabular or hierarchical DbDataReader objects, depending on the query and associations of the affected entities. eSQL is a SQL dialect that extends ANSI SQL with object-oriented keywords that support EntitySets, EntityTypes, and composability, but it doesn't include join-related commands. Joins are implemented by the eSQL NAVIGATE operator, which operates on associations.

Here's a simple EntityCommand against the nwEntities EntityContainer that's shown in Figure 1's Model Browser pane:

EntityCommand eCmd = nwEntities.CreateCommand();
eCmd.CommandText = @"SELECT VALUE o
    FROM nwEntities.Orders AS o
    WHERE o.ShipCountry = 'USA'
    ORDER BY o.OrderDate DESC";

eSQL requires aliases, such as o in the preceding example, and its SELECT VALUE clause tells the query processor to return a collection (Order entities) instead of implicitly wrapped rows. SELECT VALUE queries are limited to returning a sequence of a single EntityType and don't support projections, such as a column list. eSQL doesn't support T-SQL's * wildcard to represent all columns of a table; Count(*) becomes Count(0) in eSQL.

In its query pipeline, EntityClient parses the eSQL query, validates it against the conceptual model, and then sends the query to the database-specific data provider in the form of a Canonical Query Tree (CQT). The provider translates the CQT to the database's SQL dialect.

Object Services Does the Heavy Lifting
Object Services is the EF's top layer; it orchestrates O/RM operations and data transfers. Object Services autogenerates the code for CLR partial classes that define strongly typed EntityTypes and EntitySets<EntityType> collections. Object Services implements querying with LINQ or eSQL, object-identity management based on EntityKeys, event-based change tracking for object state management, eager and lazy loading of related entities, value-based optimistic concurrency management, and routine CRUD operations. eSQL v1.0 doesn't include SQL-style data manipulation language (DML) commands, so you must manipulate domain objects directly to emulate traditional SQL CRUD commands; alternatively, both EntityClient and Object Services support stored procedures for CRUD operations. Object Services dramatically reduces the planning, design, and programming effort required to move objects to and from a relational persistence store.
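Because eSQL v1.0 lacks DML, you emulate INSERT, UPDATE, and DELETE by manipulating entities and then calling SaveChanges(). A minimal sketch against the nwEntities context used in this article (the property values and filter conditions are illustrative):

```csharp
// Insert: construct an entity and add it to its EntitySet
Order newOrder = new Order { ShipCountry = "USA", OrderDate = DateTime.Today };
nwEntities.AddToOrders(newOrder);       // autogenerated AddTo<EntitySet> method

// Update: change a property on a tracked entity; change tracking notices it
Order pending = nwEntities.Orders.First(o => o.ShipCountry == "UK");
pending.ShipCountry = "Canada";

// Delete: mark another entity for removal
Order stale = nwEntities.Orders.First(o => o.OrderDate < new DateTime(1996, 1, 1));
nwEntities.DeleteObject(stale);

// One call persists all three operations in a single local transaction
nwEntities.SaveChanges();
```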

ObjectContext is Object Services' top-level object and contains an EntityConnection to the data model. In fact, when you name the connection string in the Entity Data Model Wizard, you name the derived EntityContainer type for the model; the default is DatabaseNameEntities (NorthwindEntities, for example). The connection string contains a metadata section that specifies the names and locations of the .csdl, .ssdl, and .msl files separated by the pipe symbol (|) and the connection string or fully qualified name of the EF-enabled store-specific data provider. The remainder of the string corresponds to a conventional ADO.NET connection string plus the EntityClient providerName; this example utilizes the Northwind database running on SQL Server 2005+:

connectionString="metadata=.\Northwind.csdl|.\Northwind.ssdl|.\Northwind.msl;
    provider=System.Data.SqlClient;
    provider connection string=&quot;Data Source=localhost\SQLEXPRESS;
    Initial Catalog=Northwind;Integrated Security=True;
    MultipleActiveResultSets=True&quot;"
providerName="System.Data.EntityClient"

Entity Framework Tools CTP 2 and later add MultipleActiveResultSets=True to enable MARS automatically for SQL Server 2005 and later. By default, the EDM Wizard places connection strings in the App.config or Web.config file.

ObjectContext is represented by the autogenerated EntityContainer derived from it; it's also the EF's programming target. ObjectContext encapsulates the metadata defined by the conceptual schema (ModelName.csdl) in a MetadataWorkspace object and the ObjectStateManager that's responsible for identifying and managing entity instances in the memory cache. The EntityContainer holds the autogenerated EntitySets and AssociationSets, which act as the data source for an ObjectQuery that adds entity instances to the cache. Persisting updates to entity instances made by modifying their properties or adding them to or deleting them from EntitySets requires executing the ObjectContext.SaveChanges() method. ObjectContext supports the Unit of Work pattern with an implicit local DbTransaction or by explicitly enlisting in a distributed System.Transactions transaction.
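A sketch of the explicit enlistment case, assuming the nwEntities context (outside a TransactionScope, SaveChanges() simply uses an implicit local transaction):

```csharp
using System.Transactions;

// Explicitly enlist SaveChanges() in an ambient (potentially distributed) transaction
using (TransactionScope scope = new TransactionScope())
{
    // ... add, modify, or delete entities on the nwEntities context here ...

    nwEntities.SaveChanges();   // enlists in the ambient transaction
    scope.Complete();           // commit; disposing without Complete() rolls back
}
```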

ObjectQuery implements IQueryable<T> and IEnumerable<T> and supports two eSQL query expression formats: query-string and query-builder methods. This is the eSQL query-string version of the earlier EntityCommand example for nwEntities that returns an ObjectQuery<Order> for iteration in a foreach loop:

string query = @"SELECT VALUE o
    FROM nwEntities.Orders AS o
    WHERE o.ShipCountry = 'USA'
    ORDER BY o.OrderDate DESC";
ObjectQuery<Order> orderQuery =
    new ObjectQuery<Order>(query, nwEntities, MergeOption.NoTracking);

ObjectQueries share LINQ queries' lazy execution feature. The preceding query won't execute until it's iterated in a foreach loop, assigned to a List<T> collection, or run explicitly by invoking the Execute() method. The MergeOption enumeration offers AppendOnly (default), NoTracking, OverwriteChanges, and PreserveChanges options for concurrency management. NoTracking makes no changes to the ObjectStateManager when the query executes.
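For example, given the orderQuery just defined, no SQL reaches the database until one of these statements runs (a sketch; the three options are alternatives, not a sequence):

```csharp
// Iterating the query triggers execution
foreach (Order order in orderQuery)
    Console.WriteLine("{0:d} {1}", order.OrderDate, order.ShipCountry);

// Materializing the results also executes the query
List<Order> orders = orderQuery.ToList();

// So does calling Execute() explicitly with a MergeOption
ObjectResult<Order> results = orderQuery.Execute(MergeOption.NoTracking);
```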

The query-builder method emulates LINQ's method call syntax to enable composable queries:

ObjectQuery<Order> orderQuery = nwEntities.Orders
    .Where("it.ShipCountry = 'USA'")
    .OrderBy("it.OrderDate DESC");

The "Query Builder Method (Entity Framework)" topic of the Entity Framework API online help file has a list of query-builder methods and their eSQL command counterparts. (The EF Tools setup program installs the help file as an ADO.NET Entity Framework Tools Preview menu choice.) Each query-builder method returns an ObjectQuery<T> that can serve as the data source for another query, which enables query chaining. You can mix and match LINQ to Entities' Standard Query Operators (SQOs) with query-builder methods, but adding an SQO returns IQueryable<T>, not ObjectQuery<T>.
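The type distinction is easy to see in a sketch that mixes the two styles (again using the nwEntities container):

```csharp
// Query-builder methods return ObjectQuery<Order>, so chaining continues...
ObjectQuery<Order> builderQuery = nwEntities.Orders
    .Where("it.ShipCountry = 'USA'")
    .OrderBy("it.OrderDate DESC");

// ...but appending a LINQ Standard Query Operator yields only IQueryable<Order>
IQueryable<Order> mixedQuery = builderQuery.Take(10);
```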

LINQ to Entities is the EDM's most important query technique, primarily because you don't need to master eSQL to return the entities or scalar values you want. Like EntityClient, LINQ to Entities uses a command tree to communicate queries to the store-specific ADO.NET data provider, which translates the command tree to the database's SQL dialect. This snippet illustrates the LINQ to Entities version of the preceding query example:

var usOrders =
    from o in nwEntities.Orders
    where o.ShipCountry == "USA"
    orderby o.OrderDate descending
    select o;

The Entity Framework beta 3 Samples download from CodePlex (EFBeta3Samples.zip, 28MB) includes an EF version of the LINQ Query Explorer included with Visual Studio (VS) 2008's LINQ samples (see Additional Resources). The Entity Framework Sample Query Explorer demonstrates query-string, query-builder method, and LINQ to Entities queries with tree-view Output, Text Output, and Generated SQL output tabs. The samples include some other simple examples of EF running in WinForms and WebForms.

Test Drive the New EDM Designer
The ADO.NET EF team has delivered a surfeit of new EF features and enhancements since the EF August 2006 CTP that I wrote about in "Objectify Data with ADO.NET vNext" in the October 2006 issue (see Additional Resources). EF beta 1, which arrived in Orcas beta 1 without a visual designer, was far from ready to compete with the then-current stable of .NET O/RM tools. Creating an EDM required running the EdmGen.exe command-line tool and then manually editing the three schema files. However, August 2007's EF beta 2/EF Tools CTP 1 restored and improved the graphic EDM Designer, and EF beta 3/EF Tools CTP 2 of Dec. 6, 2007, delivered numerous improvements to both EF and the Designer, including a substantial performance boost with compiled queries (see Table 1). If you haven't downloaded the latest EF bits, now's the time (see Additional Resources). According to a post in the MSDN ADO.NET (Pre-Release) forum, one more beta/CTP is scheduled prior to EF's RTM.

Creating a nwModel EDM and nwEntities EntityContainer is almost as quick as generating a new DataContext with LINQ to SQL. The EntityDataSource component for ASP.NET -- EF's answer to LINQ to SQL's LinqDataSource -- isn't ready yet, so start a new Visual Basic or C# WinForm project and add a new ADO.NET Entity Data Model template with the file name changed to Northwind.edmx to start the EDM Wizard. Click Next to accept the default Generate from Database option in the Choose Model Contents dialog, then click Next to select a Northwind database connection and rename the connection string to nwEntities in the Choose Your Data Connection dialog. Finally, select the tables and stored procedures to add to your model, change the model's namespace to nwModel in the Choose Your Database Objects dialog, and click Finish to let the EDM Wizard populate the Northwind.edmx EDM Designer file and autogenerate the nwEntities class files. When autogeneration completes, save the diagram, and right-click an entity or association line to open the Mapping Details pane. Optionally, singularize the entity type names by double-clicking them to open an edit box.

Databinding to EF entities isn't as simple as binding to LINQ to SQL entities; for example, Object Data Sources created from EF EntitySets don't expose associations as relationships to serve as the DataMember for child grids in parent/child forms. Instead, you must navigate the association with a query that returns an IEnumerable<T> from the association's EntityCollection. This requires that you write a query in the event handler for the parent's BindingSource_CurrentChanged event to load the correct set of orders:

var query = ctxNwind.Customers
    .Where(cust => cust.CustomerID == customerIDTextBox.Text)
    .Select(c => c.Orders.Select(o => o))
    .FirstOrDefault()
    .OrderByDescending(o => o.OrderID);
orderBindingSource.DataSource = query.ToList();

The preceding query is based on the design of the Entity Framework Sample Query Explorer's LINQ to Entities | Relationship Navigation | Relationship Collection 1 (LinqToEntities70) query. The first Select() operator returns the Customer entity, and the second Select() returns an EntitySet of Order entities; FirstOrDefault() returns the IEnumerable<Order> sequence that supplies the BindingSource's DataSource property value. Download this article's sample code and run the NwindEdmCS.sln project, a simple WinForm databinding example with BindingSources and DataGridViews.

The ADO.NET Team wanted to avoid LINQ to SQL's "no out-of-the box n-tier story" stigma but, like LINQ to SQL's DataContext, the ObjectContext isn't serializable with Windows Communication Foundation (WCF)'s DataContractSerializer. So Daniel Simmons, a development lead on Microsoft's ADO.NET EF team, crafted a top-level EntityBag object that creates a ContextSnapshot, which is a DataContractSerializable Data Transfer Object (DTO) containing the ObjectContext's contents, including original and modified values. Messages with a ContextSnapshot as a SOAP payload pass between the WCF service's ObjectContext and a service client ObjectContext that's identical to the service's, but doesn't have a database connection. To learn more about the EntityBag and its related classes and extension methods, see the sidebar, "Retrieve and Update Entities Across Tiers with WCF and the EntityBag." The sample code includes the EntityBagCS.sln project that retrieves and updates Northwind entities with SOAP messages and a basicHttpBinding.

Numerous posts on MSDN's LINQ Project General and ADO.NET (Pre-Release) forums indicate that developers find it difficult to decide whether to adopt LINQ to SQL or EF and LINQ to Entities for upcoming data-intensive .NET 3.5 projects. LINQ to SQL is part of the VS 2008 RTM bits, and EF is scheduled to release in the "first half of 2008." But the promised EntityDataSource, which will substitute for ASP.NET's LinqDataSource for LINQ to SQL, hadn't appeared as a beta implementation when I wrote this article. In any case, LINQ to SQL is riveted to 1:1 table:entity mapping, only supports LINQ queries, and is connected at the hip to SQL Server 2000+. (LINQ to SQL offers partial support for SQL Server Compact Edition 3.5.) EF offers extremely flexible mapping, is database-agnostic, and enables three distinctly different query mechanisms (see Table 2 for a detailed, feature-by-feature comparison).

As Data Programmability architect Mike Pizzo noted in his "Data Access API of the Day: Programming to the Conceptual Model" blog post of Jan. 23, 2007, "[T]he bet on the Entity Framework, and the Entity Data Model in particular, is big. You can expect to see more and more services within SQL Server, as well as technologies throughout the company, embrace and leverage the Entity Data Model as the natural way to describe data in terms of real-world concepts." Microsoft is devoting extraordinary resources to ensuring that EF will RTM on time as an enterprise-class O/RM tool. ADO.NET Data Services (formerly codenamed "Project Astoria") has adopted EF as its default data source for relational data, and it's likely that other new data-related Microsoft products will deploy with EF. I'm betting that EF will eclipse LINQ to SQL as the preferred Microsoft O/RM tool in the next year or two.


Thursday, February 28, 2008

Event Validation is a new feature in ASP.NET 2.0 that provides an additional level of checking on postback actions. It verifies that a postback from a client-side control really originated from that control, and not from a malicious person trying to break your application.

Even if you forget to add security checks of your own, ASP.NET provides this functionality because the feature is enabled by default. Sometimes it is safe to turn this off, but Microsoft wants developers to turn it off only when they know what they are doing.

Unfortunately, I came across Event Validation the hard way… A user control on a master page convinced ASP.NET that a postback within that same user control was unsafe, resulting in the following error:

"Invalid postback or callback argument. Event validation is enabled using <pages enableEventValidation="true"/> in configuration or <%@ Page EnableEventValidation="true" %> in a page. For security purposes, this feature verifies that arguments to postback or callback events originate from the server control that originally rendered them. If the data is valid and expected, use the ClientScriptManager.RegisterForEventValidation method in order to register the postback or callback data for validation."

There are some options to overcome this… One is to add EnableEventValidation="false" to your @Page directive; another is to globally disable it in your Web.config (don’t!). The best solution, however, is telling ASP.NET to allow events from your user control’s inner controls by adding the following snippet of code to the user control:

protected override void Render(HtmlTextWriter writer)
{
    // Register all child controls for event validation
    foreach (Control c in this.Controls)
    {
        this.Page.ClientScript.RegisterForEventValidation(c.UniqueID);
    }
    base.Render(writer);
}

The Visual Studio 2008 and .NET Framework 3.5 Training Kit includes presentations, hands-on labs, and demos. This content is designed to help you learn how to utilize the Visual Studio 2008 features and a variety of framework technologies including: LINQ, C# 3.0, Visual Basic 9, WCF, WF, WPF, ASP.NET AJAX, VSTO, CardSpace, Silverlight, Mobile, and Application Lifecycle Management.

Oh, and it's free from Microsoft.


It seems like the number of posts on ASP.NET's Session State keeps growing. Here's the list:

Yesterday's post on Session State Partitioning used a round-robin method for partitioning session state over different state server machines. The solution I presented actually works, but can still lead to performance bottlenecks.

Let's say you have a web farm running multiple applications, all using the same pool of state server machines. With multiple sessions in each application, one state server could end up handling many more sessions than another. For that reason, ASP.NET supports real load balancing of all session state servers.

Download example

Want an instant example? Download it here.
Want to know what's behind all this? Please, continue reading.

What we want to achieve...

Here's a scenario: We have different applications running on a web farm. These applications all share the same pool of session state servers. Whenever a session is started, we want to store it on the least-busy state server.


We all know unit testing. These small tests are always based on some values, which are passed through a routine you want to test and then validated against a known result. But what if you want to run that same test several times, with different data and different expected values each time?

Data Driven Testing comes in handy. Visual Studio 2008 offers the possibility to use a database with parameter values and expected values as the data source for a unit test. That way, you can run a unit test, for example, for all customers in a database and make sure each customer passes the unit test.
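A minimal sketch of such a data-driven MSTest unit test: the DataSource attribute points at a table of inputs and expected values, and MSTest invokes the test method once per row. The connection string, column names, and the DiscountCalculator class are all illustrative assumptions, not part of any real sample:

```csharp
[TestClass]
public class CustomerTests
{
    // MSTest injects the current data row through this property
    public TestContext TestContext { get; set; }

    [TestMethod]
    [DataSource("System.Data.SqlClient",
        @"Data Source=localhost\SQLEXPRESS;Initial Catalog=TestData;Integrated Security=True",
        "Customers", DataAccessMethod.Sequential)]
    public void Discount_IsComputedPerCustomer()
    {
        // One invocation per row: read the inputs and the expected value
        string customerId = (string)TestContext.DataRow["CustomerID"];
        decimal expected = (decimal)TestContext.DataRow["ExpectedDiscount"];

        Assert.AreEqual(expected, DiscountCalculator.For(customerId));
    }
}
```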

Data Driven Testing in Visual Studio 2008 - Part 1 - Unit testing

Data Driven Testing in Visual Studio 2008 - Part 2 - Web testing

 

Visual Studio developer, did you know you have a great performance analysis (profiling) tool at your fingertips? In Visual Studio 2008, this profiling tool has been given its own menu item to increase visibility and usage. Allow me to show you what this tool can do in this walkthrough.

The profiling tool is hidden under the Analyze menu in Visual Studio. After launching the Performance Wizard, you will see two options are available: sampling and instrumentation. In a “real-life” situation, you’ll first want to sample the entire application, searching for performance spikes. Afterwards, you can investigate these spikes using instrumentation. Since we only have one simple application, let’s use instrumentation immediately.

Upon completing the wizard, the first thing we’ll do is change some settings. Right-click the root node and select Properties. Check “Collect .NET object allocation information” and “Also collect .NET object lifetime information” to make our profiling session as complete as possible:


Wednesday, February 27, 2008

The spell checker works in Source view; it extracts text from markup elements and uses the Office 2003 spelling engine to check the text. It handles entities to some extent (they are treated as whitespace for now) and can also spell check the values of attributes that typically contain human-readable text. You can customize the spell checker's behavior by editing an XML file. Since the add-on uses the Office spelling engine, you must have Office 2003, or at least Word 2003, installed. The spell checker uses the active Office dictionary; therefore, if a page is in Japanese but the primary Office dictionary is English, spell checking will be done in English. The current version does not merge words split by tags, such as <b>S</b>ymbol; I am planning to add this functionality in a future version. Right-clicking on a misspelled word does not bring up suggestions, but double-clicking does. This is because right-click in the VS 2005 HTML editor is hardcoded for the context menu and cannot be overridden.

More details here

Visual Studio 2005 Add-in download

Visual Studio 2008 Add-in download

Spammers have broken the CAPTCHA for Windows Live (Hotmail), Yahoo, and Gmail. The evidence was circumstantial in that I was seeing a lot more spam from these services.

Over the past couple of weeks I have read a few articles confirming my suspicions.  While spammers cannot solve 100% of these Human Interactive Proofs (HIPs), they can still automate the process using a bot, which in effect breaks these security devices.  In other words, solving even 10% of the HIPs means that, from a security standpoint, they are completely broken.

So where do we go from here?  Knowing that the anti-bot device is broken, what do we do?  Here are some options that I can think of:

  1. Make the HIP more difficult to solve.  This is probably the most obvious one, but keep in mind that the more difficult a HIP is for a bot to solve, the tougher it becomes for humans as well.  In addition, it takes time to properly research a HIP to make sure that it actually is more difficult to solve.  According to the Wikipedia entry, these things have only been around for about a decade.  In other words, in my estimation, CAPTCHAs are not fully understood yet.
  2. Block known bots.  This is similar to IP blacklists; get a list of known bots that are signing up for email accounts and prevent them from doing so.  The downside of this is the potential false-positive issue: it is possible legitimate users are on the same IP as the bots.  This could be alleviated with some finesse; perhaps the IP could be limited to 5 sign-ups per day, for example.  There is still the false-positive issue, but at least it is somewhat mitigated.
  3. Use a double HIP.  After the spammer/user breaks/verifies the first HIP, get them to solve a second one.  This one should be a different type of HIP that uses a different technology or pattern, so the bot involved cannot revert back to the same algorithm as before.  If they have a 10% chance of breaking the first one, and a 10% chance of breaking the second one, this means they have a 1% chance of getting an account.  That's still broken, but at least it slows them down.  It also gives the service a chance to detect the bot.
  4. Think outside the box and use a different HIP.  I blogged about this a few months ago.  Microsoft Research has a different type of HIP: given a list of images of cats and dogs, the user is required to click the pictures of all of the cats or all of the dogs.  This requires image recognition.  The downside to this is that it is conceivable that not everyone will recognize the pictures of the cats or the dogs; animals can be culturally specific.  The upside is that bots will be in a dilly of a pickle, because animal recognition is a very different animal (pun intended) than text recognition.
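Option 2's per-IP sign-up cap is easy to prototype. Here's a minimal in-memory sketch (purely hypothetical; a real service would persist the counts and expire them daily):

```csharp
using System.Collections.Generic;

// Tracks sign-ups per IP address and rejects any IP over a daily cap.
public class SignupLimiter
{
    private readonly int _maxPerDay;
    private readonly Dictionary<string, int> _counts = new Dictionary<string, int>();

    public SignupLimiter(int maxPerDay)
    {
        _maxPerDay = maxPerDay;
    }

    // Returns true if the sign-up is allowed, false once the IP hits its cap.
    public bool TryRegister(string ipAddress)
    {
        int count;
        _counts.TryGetValue(ipAddress, out count);
        if (count >= _maxPerDay)
            return false;
        _counts[ipAddress] = count + 1;
        return true;
    }
}
```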

Those are the ones I can think of.  I'm not involved in HIPs or CAPTCHAs at all, but I would think that some of the above theories would be a place to start.


Tuesday, February 26, 2008

Managed Fusion URL Rewriter is a powerful URL manipulation engine based on the Apache mod_rewrite extension. It is designed from the ground up to bring all the features of Apache mod_rewrite to IIS 6.0 and IIS 7.0. Managed Fusion URL Rewriter works with ASP.NET on Microsoft's Internet Information Server (IIS) 6.0 and Mono XSP Server and is fully supported, for all languages, in IIS 7.0, including ASP.NET and PHP. Managed Fusion URL Rewriter gives you the freedom to go beyond the standard URL schemes and develop your own scheme.

URL Rewriter gives websites the ability to replace query strings with short, memorable, and meaningful links. One of the great advantages of using this tool is that your web site becomes totally search engine friendly, which means higher page ranking from the search engines.

URL rewriting is the process of intercepting an incoming Web request and redirecting or rewriting the request to a different resource. When performing URL rewriting, the requested URL is validated and the request is redirected to a different URL. A proper URL rewriter component such as Managed Fusion URL Rewriter will completely mask the original URL, so the clean URL will always be exposed to the client.

Advantages

  • Developed by my company, but unlimited FREE use for everybody.
  • Full .NET 2.0, 3.0, and 3.5 support.
  • Full support for IIS 6.0 and IIS 7.0 (including integrated pipelines).
  • Fully functional Proxy and Reverse Proxy integrated in at no extra cost.
  • Full support for Mono XSP and the integrated Visual Studio Web Development Server, two things that ISAPI_Rewrite and Ionic Rewriter cannot claim.
  • Create short URLs that are easy for your users to remember.
  • Structure your site for search engine readability.
  • Hide the implementation of your URL from the URL that is displayed to the user.
  • Provides easy implementation for standardizing your website to help in SEO efforts.
  • Block hot linking of your site's content; very useful against sites that steal your images and consume your bandwidth for their gain.
  • Proxy content of one site into directory on another site.
  • Create a gateway server that brings together all your company's proprietary web applications under one standardized scheme through the proxy feature.
  • Create dynamic host-header based sites using a single physical site.
  • Change your ASP.NET file extensions to other extensions (e.g., .html). This also helps in migrating old CGI technology to ASP.NET without changing your URL structure.
  • Return browser-dependent content even for static files.


How does it work?

URL Rewriter works as an HttpModule for ASP.NET Web applications. Redirections and rewrites are managed by rules created in a plain-text configuration file using Apache mod_rewrite syntax, which is based on regular expressions.
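For example, a rule file might map a friendly product URL onto the real ASP.NET page. The rules below follow standard mod_rewrite syntax; the paths and parameter names are illustrative, not taken from the product's documentation:

```
# Rewrite /products/123 to the real page, keeping the friendly URL in the browser
RewriteRule ^/products/([0-9]+)$ /product.aspx?id=$1 [L]

# Send the old .html address to the new scheme with a permanent redirect
RewriteRule ^/catalog\.html$ /products/ [R=301,L]
```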

When the request is made for the new friendly URL, URL Rewriter tries to match the virtual URL against the rules and, if successful, redirects or rewrites the request to the original URL. Rewrites are made server-side, so users will never again see your complex and hackable URLs.

Unlike some other URL rewriting processes, Managed Fusion URL Rewriter entirely masks the old URL, so ASP.NET form postbacks are fully supported by this component, and it uses the common Apache mod_rewrite syntax that is so popular with PHP, ColdFusion, and Ruby.

 

Read More http://managedfusion.com/products/url-rewriter/

In ASP.NET solutions using forms-based security, there is a problem if the forms authentication ticket times out while the user is filling out a large web form. Here is a simple solution to that problem using a custom HttpModule and a single .aspx page:

A typical scenario is where:

  • The solution is using forms authentication (using cookies, and with a timeout set to 30 minutes in web.config).
  • The user starts filling out a large web form.
  • The user takes a long phone call or goes to lunch.
  • The user returns, resumes filling out the form and submits.
  • Bang - the user is redirected to the login page because the authentication ticket timed out.
  • After logging in again the form will be empty - all work filling out the form is lost.


The easiest solution would be to make the forms authentication ticket live very long (ex. 24 hours). But in my experience many customers require that the login times out after typically 30 minutes for security reasons.

I had no luck googling a solution, so here is what I came up with:

Solving this requires a simple HttpModule and a transit page. The HttpModule will capture the posted form and save it to application state, just before the forms authentication redirects the user to the login page. Application state is used because Session state is not accessible at the time of interception. A few tricks are used here, see the code for details.

After login, the same HttpModule will redirect the request to a transit page. This transit page will restore the form contents as hidden fields and do a submit (http post triggered on load) to the original page. There you go, form restored.

This way everything from the original form post will be restored, including ViewState. The original page will happily receive the postback and do normal page processing. The restoring of the form is transparent to the end user. Usually the transit page is so fast that the user doesn't see it.

This will also work with Ajax to some extent. Form fields inside an UpdatePanel will be re-populated by the form saver, but any callback from controls inside the UpdatePanel will be "replaced" by a full page postback from the transit page.
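A minimal sketch of the capture half of such a module might look like this (the class name, the application-state key, and the login-page check are illustrative assumptions; the real implementation needs the additional tricks mentioned above):

```csharp
using System;
using System.Web;

// Sketch: capture a posted form just before the forms-auth redirect to
// the login page, and stash it in application state (Session is not
// available at this point in the pipeline).
public class FormSaverModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.EndRequest += OnEndRequest;
    }

    private void OnEndRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;

        // A POST that is being 302-redirected to the login page means the
        // auth ticket timed out mid-form: save the form for later replay.
        if (app.Request.HttpMethod == "POST" &&
            app.Response.StatusCode == 302 &&
            app.Response.RedirectLocation != null &&
            app.Response.RedirectLocation.IndexOf("login.aspx", StringComparison.OrdinalIgnoreCase) >= 0)
        {
            string key = "SavedForm_" + app.Request.RawUrl;
            app.Application[key] = app.Request.Form;
        }
    }

    public void Dispose() { }
}
```

After a successful login, the same module would redirect to the transit page, which emits the saved name/value pairs as hidden fields and auto-posts them back to the original URL.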

Read More

Thursday, February 21, 2008

I received some great ideas for new features and thought I'd incorporate them in a new version. Here's what's new:

  • Tab/linebreak escape choices for C#
  • Verbatim Literals (optional line spanning) for C#
  • Paste As StringBuilder (with optional AppendFormat usage)
  • Auto Formatting After Paste (optional)
  • Ability to hide Paste As options on the context menu
  • Add-In Commands for adding to custom menus and keyboard shortcuts
Download:

VisualStudio.NET 2003 Installer - http://www.papadimoulis.com/alex/SmartPaster1.1.zip
VisualStudio.NET 2005 Installer - http://thedailywtf.com/forums/76015/PostAttachment.aspx (Rename to SmartPaster.vsi after download)
VisualStudio.NET 2005 Source - http://thedailywtf.com/forums/54966/PostAttachment.aspx
VisualStudio.NET 2008 Source - http://www.papadimoulis.com/alex/SmartPaster2008sln.zip


Read more
GhostDoc is a free add-in for Visual Studio that automatically generates XML documentation comments for C#. It works either by using existing documentation inherited from base classes or implemented interfaces, or by deducing comments from the name and type of members such as methods, properties, and parameters.

Read More

If you've designed, developed or worked on websites recently, you have probably come across the topic of localization. In this article, I'll show you how to implement localization in your Visual Studio .NET projects.

Most of us blanch at the idea of tailoring our pages to localize in multiple languages. The hard part of localization is, of course, the localizing. However, translating static labels and site content can be as easy as asking your user base to help. The good news, though: once you've got that covered, the implementing is the easy part!

In this article, I'm going to focus on two ways of implementing localization in your .NET projects (and how to let your users choose). The first way is the "easy" way but might prove daunting in the long term for large-scale websites. The second way groups and scales down the number of files to tend to. Neither way is universally best, and I'll leave it up to you to decide which is better for your project.

Read more (intrepidstudios.com, Code Project)

This tutorial examines the new Visual Studio 2008 Server Control and Server Control Extender. A compendium of tips, tricks and gotchas, it is a comprehensive tutorial that will provide readers with the skills necessary to start building advanced AJAX-enabled custom controls with Visual Studio.

When you open up Visual Studio 2008 to create a project, you will notice that it has two new web templates designed specifically for building AJAX controls: ASP.NET AJAX Server Control and ASP.NET AJAX Server Control Extender. You'll also find an old friend, the ASP.NET Server Control project template.

What are the differences between the Server Control, the ASP.NET AJAX Server Control and the ASP.NET AJAX Extender, and when should each be used?

At first glance, it would seem that the ASP.NET Server Control differs from the other two controls in that it doesn't support AJAX. This isn't completely true, however, and in the first part of this tutorial I will demonstrate just how far you can go in developing an AJAX-enabled control based on the Server Control alone. While the ASP.NET Server Control does not provide direct access to AJAX scripts, it can implement AJAX scripts encapsulated in other controls, such as the UpdatePanel or the AJAX Extensions Timer Control, to provide AJAX functionality. For control developers who are not all that keen on delving into the intricacies and pitfalls of JavaScript, the Server Control offers an excellent and clean development path.

The AJAX Server Control and the AJAX Server Control Extender differ from the regular ASP.NET Server Control by coupling themselves with JavaScript files and allowing mapping between properties of a control class and properties of a JavaScript class. When you need functionality not provided by other AJAX server controls, or simply want to customize your control using client-side script in order to avoid the ASP.NET control life cycle, this is the best option.

Finally, while the AJAX Server Control Extender is primarily used to add behavior (that is, JavaScript) to other controls on your ASP.NET page, the AJAX Server Control is a self-contained control in which any client-side script you write will apply, for the most part, only to the control itself, or to its children. In other words, an AJAX Extender will be aware of other controls on your page, while an AJAX Server Control will not.

Of some interest is the fact that the ASP.NET AJAX Server Control template, like the ASP.NET Server Control template, implements a ScriptControl class that derives from System.Web.UI.WebControls.WebControl, while the ASP.NET AJAX Server Control Extender template implements an ExtenderControl class that derives directly from System.Web.UI.Control. This means that using the first two kinds of templates, your control will include some built-in properties like Enabled, Height and Width, while this is not true of the Extender Control. For all practical purposes, however, this is not a significant difference. For a somewhat fuller treatment of the distinction between the WebControl and Control classes, please see Dino Esposito's article on the topic at msdn2.microsoft.com.

Read more (Code Project)

Wednesday, February 20, 2008

Yesterday Google, through the Google Toolbar, started hijacking 404 status codes from unknowing Web sites to show pages with ads. TechCrunch even has a poll asking whether you think the Google hijacking is a good thing. While Google publicly states they do not like Made For AdSense sites (MFAs), they have gone out of their way to force their own onto sites whose unknowing owners have missed a basic task in designing their site. The stated purpose, helping users have a better online experience, sounds good. The reality is more like a Democrat enacting legislation: it says it is there to help you, but in reality it is there to control your life and ensure you work for and depend on them for basic needs.

According to Matt Cutts' blog entry about their custom 404 pages, Google will not display their ads if the error page you serve is larger than 512 bytes, which means this should be a fringe occurrence for now. (Just as an aside, if you are not a part of the search engine world, Matt Cutts is the equivalent of Scott Guthrie.) But like all liberal legislators, I am sure this policy will eventually creep to cover more of our lives.

This type of practice has been the territory of predatory hosting companies for years. In fact, many used to inject their own promotions into your site's content, but that practice has long since passed away as useless. So I do not fault Google for bringing back an old practice. Regardless, we should be responsible Web property owners and provide a good user experience on our own.

Read more (professionalaspnet.com)

Cross-Site Scripting attacks are very common on the Internet these days. The sad thing is these attacks can easily be thwarted with a little extra analysis and architecting before programming begins.

Far too many Web sites do not make the effort to stop common attacks and thus are wide open to them. This can allow hackers to do just about anything, from altering the content of your site to extracting private information from a database. Worse yet, they can also wipe an entire database off the face of the earth in a few minutes flat. Yes, it can be as bad as the Hollywood movies dramatize it.

Fortunately most of these attacks can easily be stopped with a little extra effort in the application design and programming phases. ASP.NET comes with some built-in protection, which is turned on by default. But attacks change, and prevention algorithms need to be constantly updated to remain effective. This is why Microsoft created the Anti-Cross Site Scripting Library.

This library provides seven static (C#) or Shared (VB.NET) methods that can be used to stop many attacks dead in their tracks:

  • HtmlEncode - Encodes strings for use in HTML
  • HtmlAttributeEncode - Encodes strings for use in HTML attributes
  • JavaScriptEncode - Encodes strings used in JavaScript
  • URLEncode - Encodes strings used in a URL
  • VisualBasicScriptEncode - Encodes strings used in Visual Basic Script
  • XMLEncode - Encodes strings used in XML
  • XMLAttributeEncode - Encodes strings used in XML attributes

Basically, any time you gather text from a user and echo it back on a page, it needs to be encoded. While there are utility methods built into the ASP.NET framework, this library goes above and beyond them.
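For comparison, here is what context-sensitive output encoding looks like with the framework's built-in utility methods; the Anti-XSS library exposes analogous static methods (per the list above) that take a stricter white-list approach to which characters are allowed through:

```csharp
using System;
using System.Web; // HttpUtility

class EncodingDemo
{
    static void Main()
    {
        // Simulated user input that must never be echoed back raw.
        string userInput = "<script>alert(1)</script>";

        // Encode for the context the value will be used in:
        string safeHtml = HttpUtility.HtmlEncode(userInput); // HTML body
        string safeUrl  = HttpUtility.UrlEncode(userInput);  // query string

        Console.WriteLine(safeHtml);
        Console.WriteLine(safeUrl);
    }
}
```

The key discipline is matching the encoder to the output context: a value that is safe inside an HTML body is not necessarily safe inside a URL, an attribute, or a script block.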

I will be demonstrating the use of each one over the next few weeks. But I encourage you to download the library and start examining how to integrate it into your framework and architecture.

 

Tuesday, February 19, 2008

Some time ago, I demonstrated how to develop a row-clickable DataGrid control. Now, as we have a few holiday days here, I adapted the same technique to the GridView in ASP.NET 2.0. The basic idea is the same: clicking the row (i.e. the entire row, not a specific column) in the control causes a postback and raises a RowClicked event. This can be used to customize the selections a user can make with the control.

 

Read more (aspadvice.com/blogs/joteke)

Monday, February 18, 2008

After upgrading to Visual Studio 2008 RTM, you will have trouble updating LINQ to SQL classes that are read from one DataContext and then updated through another DataContext. You will get this exception during the update:

System.NotSupportedException: An attempt has been made to Attach or Add an entity that is not new, perhaps having been loaded from another DataContext.  This is not supported.

Here's a typical example taken from a Forum post:

public static void UpdateEmployee(Employee employee)
{
    using (HRDataContext dataContext = new HRDataContext())
    {
        // Get original employee
        Employee originalEmployee = dataContext.Employees.Single(e => e.EmployeeId == employee.EmployeeId);

        // Attach to the DataContext
        dataContext.Employees.Attach(employee, originalEmployee);

        // Save changes
        dataContext.SubmitChanges();
    }
}

When you call the Attach function, you will get the exception mentioned above.

Here's a way to do this. First, create a partial class that adds a Detach method to the Employee class. This method will detach the object from its DataContext and detach associated objects.

public partial class Employee
{
    public void Detach()
    {
        this.PropertyChanged = null;
        this.PropertyChanging = null;

        // Assuming there's a foreign key from Employee to Boss
        this.Boss = default(EntityRef<Boss>);

        // Similarly set child collections to default as well
        this.Subordinates = default(EntitySet<Subordinate>);
    }
}
Now during update, call Detach before attaching the object to a DataContext.

public static void UpdateEmployee(Employee employee)
{
    using (HRDataContext dataContext = new HRDataContext())
    {
        // Detach from the old DataContext, then attach to the new one
        employee.Detach();
        dataContext.Employees.Attach(employee);

        // Save changes
        dataContext.SubmitChanges();
    }
}

This'll work. It assumes the employee object already has its primary key populated.

During the update, you might see that it generates a bloated UPDATE statement where each and every property appears in the WHERE clause. In that case, set UpdateCheck to Never for all properties of the Employee class in the Object Relational Designer.
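In the generated mapping code, that designer setting corresponds to the UpdateCheck value on each column attribute, roughly like the following fragment (the property, storage field, and column type here are illustrative, not taken from the actual generated file):

```csharp
using System.Data.Linq.Mapping;

public partial class Employee
{
    private string _name;

    // UpdateCheck.Never keeps this column out of the optimistic-concurrency
    // WHERE clause, so the UPDATE statement only checks the primary key.
    [Column(Storage = "_Name", DbType = "NVarChar(50)", UpdateCheck = UpdateCheck.Never)]
    public string Name
    {
        get { return _name; }
        set { _name = value; }
    }
}
```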

 

1. Generate new encryption keys

When moving an application to production for the first time it is a good idea to generate new encryption keys. This includes the machine validation key and decryption key, as well as any other custom keys your application may be using. There is an article on CodeProject that talks specifically about generating machineKeys and should be helpful with this.
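A small utility along these lines can generate fresh key material itself (a sketch; the 64/32-byte lengths shown are common choices for a SHA1 validationKey and an AES decryptionKey, but verify the lengths your configuration requires):

```csharp
using System;
using System.Security.Cryptography;

// Generates cryptographically random key material as a hex string,
// suitable for pasting into the <machineKey> element.
public static class KeyGenerator
{
    public static string CreateKey(int byteLength)
    {
        byte[] bytes = new byte[byteLength];
        RNGCryptoServiceProvider rng = new RNGCryptoServiceProvider();
        rng.GetBytes(bytes); // fill with cryptographically strong random bytes
        return BitConverter.ToString(bytes).Replace("-", "");
    }

    public static void Main()
    {
        Console.WriteLine("validationKey: " + CreateKey(64));
        Console.WriteLine("decryptionKey: " + CreateKey(32));
    }
}
```

Each environment (and certainly production) should get its own keys rather than reusing whatever was checked into source control during development.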

2. Encrypt sensitive sections of your web.config

This includes both the connection string and machine key sections. See Scott Guthrie's post for some good references. Note that if your application runs in a clustered environment you will need to share a custom key using the RSA provider as described in an MSDN article.
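Besides the aspnet_regiis.exe command-line tool, the same encryption can be applied from code running inside the site, along these lines (a sketch only; it must run in the ASP.NET application whose config is being protected, and the application identity needs write access to web.config):

```csharp
using System.Configuration;
using System.Web.Configuration;

// Encrypts the connectionStrings section of a web.config in place,
// using the RSA provider (whose key container can be exported and
// shared across machines in a web farm).
public static class ConfigProtector
{
    public static void EncryptConnectionStrings(string appVirtualPath)
    {
        Configuration config = WebConfigurationManager.OpenWebConfiguration(appVirtualPath);
        ConfigurationSection section = config.GetSection("connectionStrings");

        if (section != null && !section.SectionInformation.IsProtected)
        {
            section.SectionInformation.ProtectSection("RsaProtectedConfigurationProvider");
            config.Save();
        }
    }
}
```

ASP.NET decrypts protected sections transparently at runtime, so no application code that reads the connection strings needs to change.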

3. Use trusted SQL connections

Both Barry Dorrans and Alex Chang have articles which discuss this in detail.

4. Set retail="true" in your machine.config

<configuration>
<system.web>
<deployment retail="true"/>
</system.web>
</configuration>

This will kill three birds with one stone. It will force the 'debug' flag in the web.config to be false, it will disable page output tracing, and it will force the custom error page to be shown to remote users rather than the actual exception or error message. For more information you can read Scott Guthrie's post or the MSDN reference.

5. Create a new application pool for your site

When setting up your new site for the first time, do not share an existing application pool. Create a new application pool which will be used only by the new web application.

6. Set the memory limit for your application pool

When creating the application pool, specifically set the memory limit rather than the time limit, which is set by default. ASP.NET has a good whitepaper which explains the value of this:

By default IIS 6.0 does not set a limit on the amount of memory that IIS is allowed to use. ASP.NET’s Cache feature relies on a limitation of memory so the Cache can proactively remove unused items from memory.

It is recommended that you configure the memory recycling feature of IIS 6.0.

7. Create and appropriately use an app_Offline.htm file

There are many benefits to using this file. It provides an easy way to take your application offline in a somewhat user friendly way (you can at least have a pretty explanation) while fixing critical issues or pushing a major update. It also forces an application restart in case you forget to do this for a deployment. Once again, ScottGu is the best source for more information on this.

8. Develop a repeatable deployment process and automate it

It is way too easy to make mistakes when deploying any type of software. This is especially the case with software that uses configuration files that may be different between the development, staging, or production environments. I would argue that the process you come up with is not nearly as important as it being easily repeatable and automated. You can fine tune the process as needed, but you don't want a simple typo to bring a site down.

9. Build and reference release versions of all assemblies

In addition to making sure ASP.NET is not configured in debug mode, also make sure that your assemblies are not debug assemblies. There are of course exceptions if you are trying to solve a unique issue in your production environment ... but in most cases you should always deploy with release builds for all assemblies.

10. Load test

This goes without saying. Inevitably, good load testing will uncover threading and memory issues not otherwise considered.

CodePlex - 7zSharp

"7zSharp is a .NET 2.0 LGPL wrapper around the 7z LZMA SDK and executable written in C#, providing a library (DLL) wrapper and simplified API to encode and decode using the 7z library.

Ability to encode: 7z (.7z), ZIP (.zip), GZIP (.gz), BZIP2 (.bz2) and TAR (.tar)

Ability to decode: 7z (.7z), ZIP (.zip), GZIP (.gz), BZIP2 (.bz2) and TAR (.tar), RAR (.rar), CAB (.cab), ISO (.iso), ARJ (.arj), LZH (.lzh), CHM (.chm), Z (.Z), CPIO (.cpio), RPM (.rpm), DEB (.deb), NSIS (.nsis)

Example:
// encode:
CompressionEngine.Current.Encoder.EncodeFromDirectory(@"C:\someDirectory", @"C:\out\someDirectory.7z");
// decode:
CompressionEngine.Current.Decoder.DecodeIntoDirectory(@"C:\out\someDirectory.zip", @"C:\someDirectory");

" [Wiki page quoted in full]

The XML Tools team has released the XSLT Profiler Addin for VS 2008, a quick and reliable performance-analysis profiler tool that assists in the development and debugging of XSLT documents. The XSLT Profiler Addin for VS 2008 allows developers to measure, evaluate, and target performance-related problems in XSLT code by creating detailed XSLT performance reports. The XSLT Profiler includes a wealth of useful hints for XSL and XSLT style sheet optimizations, which are essential for XSLT-based applications that demand maximum performance.

For more information and to download, check out the Microsoft Downloads site.

This release updates the Power Tools for the Database Edition to work with Visual Studio 2008 and provides several new features. The new features include two custom unit test conditions, a new Data Generation Wizard, and new MSBuild tasks to support running TSQL Static Code Analysis from the command line. The updated features include five refactoring types, a dependency viewer, additional data generators and editors, two MSBuild tasks for Schema and Data Compare, and the TSQL Static Code Analysis feature.

New Test Conditions for Database Unit Tests
• ChecksumCondition – You can use this test condition to verify that the checksum of the data set returned by a database unit test matches the checksum of an expected data set.
• ExpectedSchemaTestCondition – You can use this test condition to verify that the column names and data types of the returned data set match expected values.

Data Generator Improvements
• New Data Generator Wizard – This new wizard creates a data generation plan that is configured to copy data from a source database. You can use this wizard when you need to copy most of your data from a live source, but need to make small changes to ensure privacy.

MSBuild Task Improvements
• SqlAnalysis Task – You can use this build task to run T-SQL Static Code Analysis from MSBuild.
TSQL Static Code Analysis
• Static Code Analysis - A precursor to the functionality that will be in future versions of VSTS that will allow you to perform Static Code Analysis on T-SQL code.

Refactoring
• “Move Schema” Refactoring - Allows a user to right click on an object and move it to a different but existing schema
• SP Rename Generation - Generate a new script that will contain sp_renames for all rename refactored objects that the user can then execute.
• Wildcard Expansion - Automatically expand the wildcard in a select to the appropriate columns.
• Fully-Qualified Name Support - Automatically inject fully-qualified names when absent in a script
• Refactoring extended to Dataset - Refactor into strongly typed dataset definitions

MSBuild Tasks
• Data / Schema Compare Build Tasks - MSBuild tasks that can generate scripts as if the user had run the Data / Schema compare UI

Schema View
• API Access to Schema View - Insert / Update / Delete to schema View and list schema objects and their associated files

Dependency Tool Window
• Dependency Tree - Shows the dependencies (incoming/outgoing) for selected schema objects in a new tool window

Miscellaneous Tools
• Script Preprocessor - Expand SQLCMD variables and include files and command line version (sqlspp.exe) & an MSBuild version

Download

Sunday, February 17, 2008

Over the last few weeks I've been writing a series of blog posts that cover LINQ to SQL. LINQ to SQL is a built-in O/RM (object-relational mapper) that ships in the .NET Framework 3.5 release and enables you to model relational databases using .NET classes. You can use LINQ expressions to query the database with these classes, as well as update, insert, and delete data.

Below are the first eight parts in this series:

Why does URL mapping and rewriting matter?

The most common scenarios where developers want greater flexibility with URLs are:

1) Handling cases where you want to restructure the pages within your web application, and you want to ensure that bookmarks to old URLs don't break when you move pages around. URL rewriting enables you to transparently forward requests to the new page location without breaking browsers.

2) Improving the search relevancy of pages on your site with search engines like Google, Yahoo and Live. Specifically, URL rewriting can often make it easier to embed common keywords into the URLs of the pages on your sites, which can increase the chance of someone clicking your link. Moving from querystring arguments to fully qualified URLs can also in some cases increase your priority in search engine results. Using techniques that force referring links to use the same case and URL entry point (for example: weblogs.asp.net/scottgu instead of weblogs.asp.net/scottgu/default.aspx) can also avoid diluting your PageRank across multiple URLs, and so increase your search results.

In a world where search engines increasingly drive traffic to sites, extracting any little improvement in your page ranking can yield very good ROI to your business. Increasingly this is driving developers to use URL-Rewriting and other SEO (search engine optimization) techniques to optimize sites (note that SEO is a fast moving space, and the recommendations for increasing your search relevancy evolve monthly). For a list of some good search engine optimization suggestions, I'd recommend reading the SSW Rules to Better Google Rankings, as well as MarketPosition's article on how URLs can affect top search engine ranking.

Read more (scottgu)

In this article, Masoud Tabatabaei discusses the concept of Cross-Application Authentication using the ASP.NET authentication model, consisting of membership providers, web.config configuration, and the encryption and decryption of configuration files. At the end of the article he also examines the application of the concept using ASP.NET login controls.

Read more...(aspalliance.com)