

Tuesday, July 29, 2008

If you don't know about CTRL+I in Visual Studio, go try it, I'll be waiting...

So? Ain't that cool?

(CTRL+I does incremental search, so it will progressively select the first occurrence of whatever you type. It's a lot less disruptive than CTRL+F as a means to search your files)

 

Tuesday, July 15, 2008

Let me say right at the start of this post that I love ReSharper. It is by far the best refactoring support that can be found for VB.NET. I haven't used it for C# yet, but I'm told by an esteemed colleague that it rocks.


But... (there is always a but, isn't there?) it messes up the Intellisense in my Visual Studio. The same colleague (kudos to Jocke) tipped me off on how to solve it, and here it is:

Open the ReSharper options and choose Intellisense->General->Use Visual Studio. This will not give you as much "Smart Completion" support, but I'll take that over missing Intellisense every day of the week, and twice on Sundays.


Next - open the Visual Studio options and recheck that you have Intellisense enabled for all your languages.



Finally, restart Visual Studio - just to be sure the setting is saved properly.

Again - the refactoring support in ReSharper is great compared to everything else out there for VB.NET. This issue is not so good, but at least now you know how to solve it.


In Visual Studio 2008, many new features were implemented that eliminate the top issues found in the VS 2005 Web Test recorder, as covered in the white paper Web Test Authoring and Debugging Techniques.

While many areas have been addressed, there will still be times when record/playback fails or does not give the desired result. I’ll be writing a series of blog articles on how the recorder works, “knobs” in the registry that will let you control what does and does not get recorded, and problems you may encounter. Finally, I’ll introduce new debugging techniques that will enable you to find and fix your tests.

New Recorder Technologies in VS 2008

The VS 2008 recorder introduced two key new technologies that eliminate the majority of the problems encountered with the VS 2005 recorder:

1) The new recorder picks up all requests, whereas the 2005 recorder did not record AJAX requests or certain types of popup requests.

2) The new recorder can detect dynamic parameters and automatically add the extraction rules and bindings needed to correlate them properly. A common class of these is dynamic query string parameters, such as a session ID.

In addition, if a page has a redirect, the Expected Response URL now records the page that was redirected to. In VS 2005, a redirect to an error page was not detected as an error; in VS 2008 the recorder adds the Response URL validation rule to catch this and flag it as an error.

Filtering Requests

Even though the recorder now captures all requests, some requests are still filtered when the web test is generated.

First of all, "dependent" requests are filtered. Web Test has a feature to "Parse Dependent Links", in which resources on a page such as images, JavaScript sources, and CSS references are not recorded; instead, at run time the web test engine parses the response, finds all references to dependents, and then fetches them. This works the same way in VS 2008. When the parser runs, it looks for all IMG, SCRIPT, and LINK tags to find the dependent resources and fetch them.

However, a web page can also download content using JavaScript. There was a "hole" in the VS 2005 recorder, in that it could not pick up these requests. A couple of examples of this are mouse-over images fetched via JavaScript, or images fetched in a mapping program via AJAX calls.

By default, the recorder is configured to filter "static" content such as images and CSS files. You can override what gets recorded using the registry settings below. These values are the defaults (they are set in code; the registry entries aren't present after install):

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools\WebLoadTest]
"WebTestRecorderMode"="exclude"
"ExcludeMimeTypes"="image;application/x-javascript;application/x-ns-proxy-autoconfig;text/css"
"ExcludeExtensions"=".js;.vbscript;.gif;.jpg;.jpeg;.jpe;.png;.css;.rss"

You can see that these settings will filter out images, JavaScript source files, and CSS files. If you want to record everything, you can simply set "ExcludeMimeTypes" and "ExcludeExtensions" to the empty string.

The recorder will also work in an “include” mode, where you specify a list of mime types and extensions to include. For example, the default include list is:

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools\WebLoadTest]
"WebTestRecorderMode"="include"
"IncludeMimeTypes"="text/html;text/xml;text/xhtml;application/xml;application/xhtml+xml;application/soap+xml;application/json"
"IncludeExtensions"=""

Note that you have to include the XML, SOAP, etc. MIME types in order to pick up AJAX calls.

Folding in Additional Requests

One thing the recorder is not smart about: requests picked up by the low-level recorder are always treated as top-level requests. If you do set the recorder to include images and such, it does not try to figure out which request a dependent came from and store it under the appropriate top-level request. Instead, any additional requests are recorded as top-level requests.

Also, asynchronous requests are recorded at the top level of the web test and will be played back synchronously. We hope to add an “Async” property to top-level requests in our next release that will enable you to more accurately simulate the request pattern generated by the browser for AJAX requests.

Note that dependents are fetched in parallel over two connections in the same way the browser fetches them.

Filtering HTTP Headers

HTTP header filtering works in a similar way to request filtering.

Normally in a web test, most HTTP headers are set from the browser template file. Here are the contents of the IE7 browser template file:

<Browser Name="Internet Explorer 7.0">
  <Headers>
    <Header Name="User-Agent" Value="Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)" />
    <Header Name="Accept" Value="*/*" />
    <Header Name="Accept-Language" Value="{{$IEAcceptLanguage}}" />
    <Header Name="Accept-Encoding" Value="GZIP" />
  </Headers>
</Browser>

 

Notice the MSIE 7.0 in the User-Agent header, which identifies it as IE7.

By default, the recorder only records these additional HTTP headers:

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools\WebLoadTest]
"RequestHeadersToRecord"="SOAPAction;Pragma;x-microsoftajax"

 

Your application may send additional custom headers. In that case, you can change the recorder settings to add the headers your app sends.

The recorder is set to never record these headers, which are automatically handled in the HTTP engine:

"Authorization", "Proxy-Connection", "Connection", "Host", "Expect", "Content-Length"

Recorder Settings Summary

You can see the new web test recorder is powerful, but the default settings may not be right for your application.

1. The default filtering of static content may mask performance problems in your application. Consider recording additional requests.

2. HTTP headers that your application depends on may be filtered out. If your application uses custom headers, consider changing your recorder settings to pick them up.

Detecting Dynamic Parameters

A major new feature in VS 2008 is the ability to detect dynamic parameters. A dynamic parameter is a parameter whose value is generated each time a user runs the application, and therefore playback of recorded values won’t work.

The best example of this is a dynamically generated session ID. For apps that support login, each time a user logs in, the server generates a unique session ID to track the user. This session ID may then be passed back to the server via a cookie, form field, or query string parameter. In VS 2005, web tests handled cookies and hidden fields. Note there were some bugs in hidden field binding in VS 2005, some of which were fixed in SP1 and some of which have been fixed in VS 2008.

VS 2008 adds support for two more types of dynamic parameters: query string parameters and form fields (other than hidden fields).

The way it works is that at record time, the value of each form post and query string parameter is recorded, and an appropriate extraction rule is computed. Immediately after recording, the web test is played back by the correlation tool. During playback, the tool applies the extraction rule and compares the extracted value with the recorded value. If they differ, it flags the parameter as a dynamic parameter.

Dynamic Detection Playback Fails

A subtle problem you may encounter is Dynamic Detection playback failing. As mentioned above, in order to detect dynamic parameters, the web test is played back under the covers. If this playback fails you may or may not get a complete list of parameters to promote. You can see whether or not it failed by looking in the result window, where the result of the playback is displayed. If it does fail, fix your test per the guidance below and then re-run dynamic parameter detection from the web test toolbar.

When Not to Promote a Parameter to a Dynamic Parameter

There may be times when playback thinks a parameter is dynamic when in fact it is not.

One example of this is when cookies in IE are used to populate form fields. For example, our Web test forums provide an option to save your user name for login. If you record a Web test with this setting turned on, a cookie is used to set the value of the user name so it is automatically filled in. When dynamic parameter detection runs, it looks at the user name value and sees that it is different than the recorded value. Aha! A dynamic parameter! If you accept this as a dynamic parameter, the web test engine will scrape the value out of the response rather than playing back what you typed in, clearly not the desired behavior.

To avoid problems like this, clear your browser cookie cache prior to recording.

Playback Doesn’t Work, Now What?

In general, the problem is that the HTTP requests sent by the web test engine are somehow different from the requests sent by IE. The first challenge is figuring out what IE is sending over HTTP. There are different tools for doing this, including Fiddler (http://www.fiddler2.com/) and NetMon. There is also a new feature in the web test recorder for generating a recording log from a recording. Turning this on is probably the easiest way to see what IE is sending.

Common Problems to Look For

Common problems we’ve seen customers encounter:

1) Missing custom HTTP headers. See the comments on HTTP headers in the recorder section above.

2) Cookies saved on your local machine. A good example of this is cookies stored on your machine to automatically fill out the user name. See the section on Detecting Dynamic Parameters above.

Using the Recorder Log

To turn on the recorder log, set these registry entries:

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools\WebLoadTest]
"CreateLog"=dword:00000001
"RecorderLogFolder"="c:\\RecorderLogs\\"

 

This will result in a log file for each recording session. Open the log file and find the failing request, then carefully compare all parts of the request in the request log to the request in web test playback:

• URI

• Query string parameters and values

• HTTP headers, including custom headers and cookies

• Post body

Once you identify the difference, you can go back to the web test to fix the problem. Some areas to look out for:

• Missing custom HTTP headers in the web test

• Incorrectly handled dynamic parameters, including:

  • Parameters marked as dynamic that aren't

  • Incorrect extraction rules

Using Fiddler

Because of the underlying technology, there may be times when either the web test recorder or web test playback viewer does not accurately reflect what is actually getting sent over the wire in subtle ways. There may even be times when the web test recorder interferes with the requests sent by IE. To get a true picture of what is getting sent, you can use a tool like Fiddler or NetMon.

One example of this: in VS 2005, web test playback always showed cookies being sent in a single header, when in fact they were sent in multiple HTTP headers.

You can also run Fiddler while the web test runs to capture the HTTP traffic the web test sends. To do this, you need to create a web test plugin and add this code to the constructor:

            this.Proxy = "http://localhost:8888";
            WebProxy webProxy = (WebProxy)this.WebProxy;
            webProxy.BypassProxyOnLocal = false;
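
For context, here is what those three lines can look like packaged as a complete class. This is a rough sketch (it assumes a WebTestPlugin with the PreWebTest override and the same Proxy/WebProxy properties on the WebTest object), so treat it as a starting point rather than the exact code referenced above:

using System.Net;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class FiddlerProxyPlugin : WebTestPlugin
{
    public override void PreWebTest(object sender, PreWebTestEventArgs e)
    {
        // Route the web test's traffic through Fiddler's default proxy port.
        e.WebTest.Proxy = "http://localhost:8888";

        // Local traffic normally bypasses the proxy; turn that off so
        // Fiddler also sees requests to localhost.
        WebProxy webProxy = e.WebTest.WebProxy as WebProxy;
        if (webProxy != null)
        {
            webProxy.BypassProxyOnLocal = false;
        }
    }
}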

Web Test Logging

Web test playback is a great tool for seeing what is going on in the web test engine. However, there may be times when you just want to dump the entire session to a text file. One limitation in Web test playback is there is no way to search across all the requests and responses in the session. We have developed a web test logging sample plugin that will do just that and plan to release it to CodePlex soon at http://www.codeplex.com/TeamTestPlugins.
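
Until that sample is published, here is a rough sketch of the idea (an approximation, not the CodePlex sample): a request plugin that appends each request URL, status code, and response body to a text file. The log path is hard-coded for illustration, and the folder must already exist.

using System.IO;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class TextFileLoggingPlugin : WebTestRequestPlugin
{
    // Hard-coded for illustration; a real plugin would make this configurable.
    private const string LogPath = @"c:\WebTestLogs\session.log";

    public override void PostRequest(object sender, PostRequestEventArgs e)
    {
        using (StreamWriter writer = File.AppendText(LogPath))
        {
            writer.WriteLine("REQUEST:  " + e.Request.Url);
            if (e.Response != null)
            {
                writer.WriteLine("STATUS:   " + e.Response.StatusCode);
                writer.WriteLine("RESPONSE: " + e.Response.BodyString);
            }
            writer.WriteLine(new string('-', 60));
        }
    }
}

With the whole session in one plain text file, searching across all requests and responses becomes a simple text search.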

Conclusion

Record/playback in VS 2008 has been vastly improved over VS 2005. The recorder now has the capability to pick up all requests sent from IE, the Dynamic Parameter Detection feature catches the most common cases of dynamic parameter correlation, the Response URL validation rule catches redirects to unexpected pages, and a number of bugs have been fixed to make playback more reliable. Even with these new features, there may be times when the recorder by default isn't capturing the meaningful requests for your application, and you will want to tweak the recorder settings to record additional requests.

There may also be times where record/playback will fail, and you will need to debug the web test to figure out what is going wrong. The addition of the recorder logging feature will make this easier.


Thursday, June 26, 2008

One of the most useful registry hacks I use on a regular basis is one Robert McLaws wrote, the “ASP.NET 2.0 Web Server Here” Shell Extension. This shell extension adds a right click menu on any folder that will start WebDev.WebServer.exe (aka Cassini) pointing to that directory.

I recently had to repave my work machine and I couldn’t find the .reg file I created that would recreate this shell extension. When I brought up Robert’s page, I noticed that the settings he has are out of date for Visual Studio 2008.

Here are the updated registry settings for VS2008 (note: edit the registry at your own risk, and this only has the "works on my machine" seal of approval).

32 bit (x86)

Windows Registry Editor Version 5.00
 
[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\Directory\shell\VS2008 WebServer]
@="ASP.NET Web Server Here"
 
[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\Directory\shell\VS2008 WebServer\command]
@="C:\\Program Files\\Common Files\\microsoft shared\\DevServer
\\9.0\\Webdev.WebServer.exe /port:8080 /path:\"%1\""

64 bit (x64)

Windows Registry Editor Version 5.00
 
[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\Directory\shell\VS2008 WebServer]
@="ASP.NET Web Server Here"
 
[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\Directory\shell\VS2008 WebServer\command]
@="C:\\Program Files (x86)\\Common Files\\microsoft shared\\DevServer
\\9.0\\Webdev.WebServer.exe /port:8080 /path:\"%1\""

For convenience, here is a zip file containing the reg files. The x64 one I tested on my machine. The x86 one I did not. If you installed Visual Studio into a non-standard directory, you might have to change the path within the registry file.

 


 

Monday, May 26, 2008

Microsoft has released a cool new tool called Microsoft Source Analysis for C#. It's a formerly internal tool that does somewhat the same thing FxCop does. While FxCop analyzes the IL, Source Analysis analyzes the source code itself. It comes with about 200 built-in rules, which are, however, not customizable. These rules cover:

  • Layout of elements, statements, expressions, and query clauses
  • Placement of curly brackets, parentheses, square brackets, etc.
  • Spacing around keywords and operator symbols
  • Line spacing
  • Placement of method parameters within method declarations or method calls
  • Standard ordering of elements within a class
  • Formatting of documentation within element headers and file headers
  • Naming of elements, fields and variables
  • Use of the built-in types
  • Use of access modifiers
  • Allowed contents of files
  • Debugging text

After installation, the tool is integrated into the VS 2005 or 2008 IDE. 

You can read about Source Analysis for C# here:
http://blogs.msdn.com/sourceanalysis/

You can download it here:
http://code.msdn.microsoft.com/sourceanalysis/Release/ProjectReleases.aspx?ReleaseId=1047

 

A long time ago I was watching Joe Stagner's ASP.NET AJAX videos and I saw him purposefully indenting the attributes in his ASP.NET markup so that they lined up neatly underneath each other. I really took to this concept because it's so much easier to read a laundry list of attributes than it is to scroll across your page trying to hunt down your markup. I emailed Joe, asking him what the Visual Studio hotkey was to perform the lineup, and he replied that he did it all by hand. I've done the same thing since watching those videos and while it's time consuming, I really prefer the readability.

 

At DevTeach, Rob Burke gave an excellent talk on building line-of-business applications using WPF and Silverlight. He is another practitioner of listed attributes, and watching him painstakingly line them up by hand during the presentation was the last straw for me; I wasn't going to waste another keystroke on presenting these attributes the way we want them!

 


Thursday, May 22, 2008

If you just want the annotated jQuery file download it here: jQuery with Intellisense comments. For more about it, read on!

I really like jQuery.  I might even say that I love jQuery.  It is a simple and elegant JavaScript framework.  The genius is that it realizes that what most people want to do with JavaScript is manipulate the DOM, and it makes doing that fun.  Look at this example from the jQuery home page:

 
$("p.surprise").addClass("ohmy").show("slow");

That will get all the p elements with the class surprise, add the class "ohmy", and then change the display to "block" with a slow animation. Go run it on the jQuery homepage.  Nice, eh?  The dollar function is magical.  You just give it a CSS selector, and it returns jQuery objects that you can do all kinds of cool things with.  Also, almost all the functions in the framework return this same jQuery object, so they are chainable like you see above.

Now, I also really like ASP.NET, and with the new ASP.NET MVC Framework coming out, I might even say that I love ASP.NET. I would really like to use jQuery in my ASP.NET projects, but the two have never worked together too well: the JavaScript intellisense in VS 2008 wouldn't work right with jQuery. When Visual Studio 2008 was released, it came with better support for JavaScript, but unfortunately that support did not work with jQuery. After only a few months, though, Microsoft released a "Hot-Fix Roll-Up" patch that makes the intellisense work with jQuery and fixes some other things.

In order for you to get all the great hints in intellisense, you need to have XML comments in your JavaScript.  So I spent yesterday afternoon commenting jQuery, and it is awesome!  Download it yourself.

All you have to do to use it is include it in a script reference in your page, and add a reference comment in your .js file.  So in your HTML you'll have this under the head element:

 
  <script src="jquery-1.2.3-intellisense.js" type="text/javascript"></script>
  <script src="your.js" type="text/javascript"></script>

Then in your.js you'll need to tell VS about the jquery-1.2.3-intellisense.js file like so:

 
/// <reference path="jquery-1.2.3-intellisense.js" />

Now you'll have all the intellisense goodness for jQuery available in your.js.  You won't want to deploy the jquery-1.2.3-intellisense to production since it is quite a bit larger than the minified version, but for development it should work the same as jquery-1.2.3.  I didn't change any of the code, although in some cases it would have made for better intellisense.  Which brings me to my next point.

I couldn't get everything to work, due to the way the library is written and the way the JavaScript comments work.  For instance, some functions like append() in jQuery don't have parameters specified; the functions still expect arguments, they just use the "arguments" array instead of formal parameters.  Another problem is that several API functions share a single implementation.  For example, the event helpers such as blur(), click(), focus(), etc. that fire events are all the same function in jQuery.  For these, I've just written a generic XML comment.

I'll be speaking at Boise Code Camp on March 8th about using jQuery with ASP.NET.  It will be mostly an introduction, with a walk-through on changing a plain old contact form, into a fancy-schmancy AJAX one using jQuery.  So if you're around stop by and say hi.  This was my first blog post, and Code Camp will be my first speaking engagement so some things may still be a little rough around the edges.  The nice thing about Code Camp is that you don't have to be invited to speak :)

Update 5/20/2008

jQuery 1.2.4 was released yesterday.  I just posted the updated jquery-1.2.4-intellisense file.

Update 5/21/2008

jQuery 1.2.5 was released today.  jQuery 1.2.4 was a bad build.  I just posted the updated jquery-1.2.5-intellisense file.

 


Friday, May 9, 2008

 

A simple addin which adds toggle buttons for various find options (Match Case, Match Whole Word) in Visual Studio 2005 & 2008.

This is behavior that existed in prior versions of Visual Studio, but was removed in 2005.

 


Thursday, May 8, 2008

Introduction

CopySourceAsHtml is an add-in for Microsoft Visual Studio 2005 that allows you to copy source code, including syntax highlighting and line numbers, as HTML. CSAH uses Visual Studio's syntax highlighting and font and color settings automatically. If Visual Studio can highlight it, CSAH can copy it, and your source should look the same in your browser as it does in your editor.

Latest Release

CopySourceAsHtml 2.0.0

CopySourceAsHtml 2.0.0 Installer (163 KB) (built and tested on Microsoft Visual Studio 2005)
CopySourceAsHtml 2.0.0 Source (91 KB) (now includes build and installer scripts)

CSAH 2.0.0 is the first official release of CopySourceAsHtml for Microsoft Visual Studio 2005. This release has a leaner, meaner, refactored codebase that fixes a few minor defects and takes advantage of new features in Visual Studio 2005 and .NET 2.0. Derick Bailey and Jason Haley, thank you both for releasing your port in a much more timely fashion.


Monday, April 28, 2008

In this walkthrough you will learn how to create Web-Sites, Web Applications, Virtual Directories and Application Pools.

Introduction

The IIS PowerShell namespace consists of items like Web-Sites, Apps, Virtual Directories and Application Pools. Creating new namespace items and managing them is very easy using the built-in PowerShell cmdlets.

Creating Web-Sites

If you are familiar with PowerShell, you know that the New-Item cmdlet is used to create new items in the various PowerShell namespaces. For example, the command "New-Item c:\TestDirectory" creates a new file system directory (most people use the "MD" or "MKDIR" alias for New-Item, however). New-Item is also used to create new Web-Sites within the IIS 7.0 PowerShell namespace.

Parameters

The name of the directory is the only argument needed when you create a new file system directory. Unfortunately, that is not enough when you create a Web-Site: additional parameters like the file system path and network bindings are needed. Here is the command to create a new Web-Site, followed by a dir command:

PS IIS:\Sites> New-Item iis:\Sites\TestSite -bindings @{protocol="http";bindingInformation=":80:TestSite"} -physicalPath c:\test

PS IIS:\Sites> dir

Name             ID   State      Physical Path                  Bindings
----             --   -----      -------------                  --------
Default Web Site 1    Started    f:\inetpub\wwwroot             http *:80:
TestSite         2    Started    c:\test                        http :80:TestSite 

Using the -physicalPath argument is pretty straightforward. But you might ask yourself why the -bindings argument looks so complex.

The construct used is a hashtable (go here to learn more about PowerShell hash tables). Within the hash table key=value pairs indicate the settings that reflect the attributes within the IIS site bindings section:

<bindings>
        <binding protocol="http" bindingInformation=":80:TestSite" />
</bindings>

Now here is the reason why we use a hash table: IIS configuration is completely extensible (see here for more details) with additional sections and attributes. You can imagine somebody extending the <binding> element with additional attributes. Key=value pairs within a hash table provide the flexibility to incorporate these new attributes without having to completely rewrite the IIS PowerShell Provider.

Granted, the syntax is a bit complex. We are thinking about wrapping some typical tasks like creating sites with additional functions or scripts in a later Tech Preview.

Deleting Sites

Here is how you delete the site you just created.

PS IIS:\ >Remove-Item IIS:\Sites\TestSite

Creating Web Applications

Creating Web Applications is easier than creating sites. Here we go: 

PS IIS:\> New-Item 'IIS:\Sites\Default Web Site\DemoApp' -physicalPath c:\test -type Application

 

Name                     ApplicationPool          EnabledProtocols         PhysicalPath
----                     ---------------          ----------------         ------------
DemoApp                  DefaultAppPool           http                     c:\test

The only parameter you have to specify is the type (-type), because underneath a Web-Site you might want to create either an Application or a Virtual Directory. By specifying the -type parameter you tell the IIS Provider to create an Application.

To delete the application you can also use Remove-Item.  

Creating Virtual Directories

To create a Virtual Directory you also use the New-Item cmdlet. Let's create one Virtual Directory underneath the 'Default Web Site' and a second one underneath the Web Application we created in the previous step.

PS IIS:\> New-Item 'IIS:\Sites\Default Web Site\DemoVirtualDir1' -type VirtualDirectory -physicalPath c:\test\virtualDirectory1

Name                                              PhysicalPath
----                                              ------------
DemoVirtualDir1                                   c:\test\virtualDirectory1


PS IIS:\> New-Item 'IIS:\Sites\Default Web Site\DemoApp\DemoVirtualDir2' -type VirtualDirectory -physicalPath c:\test\virtualDirectory2

Name                                              PhysicalPath
----                                              ------------
DemoVirtualDir2                                   c:\test\virtualDirectory2

Creating Application Pools

But it gets even simpler. Creating a new AppPool only requires the name to be specified.

PS IIS:\> new-item AppPools\DemoAppPool

Name                     State
----                     -----
DemoAppPool              {}

Simple, wasn't it? Now let's put this all together into an end-to-end scenario.

Putting it all Together

In the following end-to-end scenario we will execute the following steps:

  1. Create a set of new file system directories for the sites, web applications and virtual directories we will create a little later.
  2. Copy some very simple web content into the newly created directories.
  3. Create a new Application Pool.
  4. Create a new site, a new application and two new virtual directories, and assign them to the newly created Application Pool.
  5. Request the web content via the web browser.

Step 1: Create New Directories

 We use the New-Item cmdlet to create four new file system directories. Execute the following commands (use 'md' instead of New-Item if you don't want to specify the -type parameter):

New-Item C:\DemoSite -type Directory

New-Item C:\DemoSite\DemoApp -type Directory

New-Item C:\DemoSite\DemoVirtualDir1 -type Directory

New-Item C:\DemoSite\DemoVirtualDir2 -type Directory

Step 2: Copy Content

Now let's write some simple html content to these directories:

Set-Content C:\DemoSite\Default.htm "DemoSite Default Page"

Set-Content C:\DemoSite\DemoApp\Default.htm "DemoSite\DemoApp Default Page"

Set-Content C:\DemoSite\DemoVirtualDir1\Default.htm "DemoSite\DemoVirtualDir1 Default Page"

Set-Content C:\DemoSite\DemoVirtualDir2\Default.htm "DemoSite\DemoApp\DemoVirtualDir2 Default Page"

Step 3: Create New Application Pool

Create the new Application Pool 'DemoAppPool' for the new site if you deleted the one we created in the previous sample.  

New-Item IIS:\AppPools\DemoAppPool

Step 4: Create New Sites, Web Applications and Virtual Directories and Assign to Application Pool

Here comes the beef. We create DemoSite, DemoApp and two Virtual Directories - DemoVirtualDir1 directly underneath DemoSite and DemoVirtualDir2 underneath DemoApp. We assign DemoSite and DemoApp to the DemoAppPool created in the previous step. DemoSite is bound to port 8080 so that it does not conflict with the 'Default Web Site'.

New-Item IIS:\Sites\DemoSite -physicalPath C:\DemoSite -bindings @{protocol="http";bindingInformation=":8080:"}

Set-ItemProperty IIS:\Sites\DemoSite -name applicationPool -value DemoAppPool

New-Item IIS:\Sites\DemoSite\DemoApp -physicalPath C:\DemoSite\DemoApp -type Application

Set-ItemProperty IIS:\sites\DemoSite\DemoApp -name applicationPool -value DemoAppPool

New-Item IIS:\Sites\DemoSite\DemoVirtualDir1 -physicalPath C:\DemoSite\DemoVirtualDir1 -type VirtualDirectory

New-Item IIS:\Sites\DemoSite\DemoApp\DemoVirtualDir2 -physicalPath C:\DemoSite\DemoVirtualDir2 -type VirtualDirectory

Voila. All that's left is to request the web content.

Step 5: Request the Web Content

You can of course open the browser and enter http://localhost:8080/ and all the other URLs. But this is a PowerShell walkthrough, so we'll use PowerShell to do it, via the .NET WebClient class:

$webclient = New-Object Net.WebClient

$webclient.DownloadString("http://localhost:8080/");

$webclient.DownloadString("http://localhost:8080/DemoApp");

$webclient.DownloadString("http://localhost:8080/DemoVirtualDir1");

$webclient.DownloadString("http://localhost:8080/DemoApp/DemoVirtualDir2");

If you're feeling adventurous, you can also use the Internet Explorer object itself:

$ie = new-object -com InternetExplorer.Application

$ie.Visible = $true

$ie.Navigate("http://localhost:8080/");

Summary

In this walkthrough you learned how to create Web-Sites, Web Applications, Virtual Directories and Application Pools with PowerShell. Additional PowerShell features were used to build a functional end-to-end scenario.


Thursday, April 24, 2008

Did you know there's T4 (Text Template Transformation Toolkit) support inside VS2008 now? Add a file with the .tt or .t4 extension to your project and you've got a T4 template, which VS2008 will automatically run when you save it. If you're into this and want to try it out, go to http://www.t4editor.net/ and download the T4-editor from Clarius. It gives you proper editing support from within Visual Studio. They have a version for VS2003 as well.

This may not be as attractive as it used to be, now that we have designer tools for Linq to Sql, the Entity Framework and so on to help us generate boilerplate code for database access, but I'm sure I can come up with some good use for this. The ASP3-like syntax is very easy to grasp, so basically there is no excuse for not using this if you know you're going to produce lots of code that looks or behaves the same. As long as you have some metadata to work from, be it a database, an XML file or a text file, I'm sure this can be of help to you. Ah yes, you can write the templates in either C# or VB.
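
To give you an idea of what that syntax looks like, here is a tiny sketch of a template (the namespace, class name and hard-coded value list are made up purely for illustration). Save it as a .tt file in a VS2008 project and a .cs file is generated next to it:

<#@ template language="C#" #>
<#@ output extension=".cs" #>
// This file is generated - change the .tt template, not the output.
namespace Demo
{
    public static class KnownColors
    {
<#
    // Plain C# between <# #> drives the generation; <#= #> emits values.
    string[] names = new string[] { "Red", "Green", "Blue" };
    foreach (string name in names)
    {
#>
        public const string <#= name #> = "<#= name #>";
<#
    }
#>
    }
}

In a real template you would of course pull the names from your metadata source (a database, an XML file, and so on) instead of a hard-coded array.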

Some T4 resources for you maybe:

Hilton Giesenow blogs about T4 and you can download a couple of VS T4 templates from him!

Oleg Sych has a large number of very, very good blog posts about T4, Oleg is THE T4 blogger out there.

The T4-editor website by Clarius has a couple of videos.

The Text Template documentation for vs2008 on MSDN.

On MSDN there's also a good intro video on the topic that you might want to look at if this is of interest to you.

 

Friday, April 11, 2008

After almost one year of work and organization, I am very happy to share this project with you:

http://code.msdn.microsoft.com/vlinq

The Visual Linq query builder is a Visual Studio 2008 addin. It's a designer that helps you create Linq to Sql queries in your application. Both C# and VB projects are supported.

As you will read in this post, this project, developed by interns, is a prototype for a new kind of designer.

Please give us your feedback!

Project history

It is an academic project developed during a Microsoft France internship in collaboration with Microsoft Corporation.
I was the 'local' manager and technical lead of the project. I had wanted to create a VS designer using WPF for a long time, and I had the idea of a query builder for Linq to Sql. Then came the opportunity to organize a six-month internship in collaboration with Microsoft Corp.

I recruited two French students whom I want to thank again today for their excellent work.

- Simon Ferquel, from SupInfo, who is now working at a French company (Winwise). You may know him, from his student days, as the author of a fun tool for Vista: myExposé.

- Johanna Piou, from ISEN Toulon, who is still a student this year and is well known for her brilliant Imagine Cup participation in the Interface Design category.

You can find the French description of the project here: http://msdn.microsoft.com/fr-fr/vcsharp/default.aspx (coming soon).

The project goal

Linq to Sql, and Linq more generally speaking, is a new technology mainly based on language evolutions. As with any new syntax, you have to take some time to get familiar with it.

The VLinq project, like any designer, helps you build Linq to Sql queries graphically, but we wanted to keep it visually very close to the code. The goal is not to hide the generated code but to make it visible in the designer. It's a kind of mix between a classical designer and graphical intellisense.

VLinq also helps you group all queries in the same place, allowing easy management (edit, add, remove), previewing and testing.

Last goal: releasing the whole solution, including source code, to share with you our experience of using WPF with VS2008 extensibility.

What do we release?

The whole project has been developed using Visual Studio 2008 (betas then RTM) and Expression Blend. We provide the whole solution (binaries + source code). The solution contains a Setup project for a quick installation (msi file).

You can get all the stuff here: http://code.msdn.microsoft.com/vlinq/ under the 'Releases' tab. (msi, quick reference guide, user documentation, webcast).

 


Tuesday, April 8, 2008

Overview:

Like mathematicians, developers talk about their code in aesthetic and sometimes hygienic terms. The code is “elegant,” or it “looks clean,” and sometimes it “smells bad”. I have even heard developers refer to their code as “sexy.” Talking about your code as sexy is surely a sign that you need to get out more! Achieving elegant code is not easy and so as I deepen my experience with .Net 2.0, I am always pleased to discover when the framework offers a way to do something that I could have done in 1.1, but can now do much more elegantly. Predicates and the related Actions and Converters are just such additions to the framework. They will not revolutionize how you code, but used properly they will reduce the amount of code needed, encourage reuse, and just look sexier.

This article will examine the following questions:

  • What are Predicates?
  • How are they used?
  • How does their performance stack up against similar foreach routines?
  • What are Actions and Converters?

So What are Predicates?

A Predicate is a new delegate type introduced by the .Net 2.0 framework. It has the following signature: public delegate bool Predicate<T>(T obj), and it is used by collections such as List<T> and Array for methods like RemoveAll, Find, FindAll, Exists, etc. Its signature reveals that a Predicate is a Delegate. What that means is that so long as the method signatures are the same -- i.e. the methods have the same return type and accept the same arguments -- then any method that conforms to that signature can be passed in its place. C# Delegates have been compared to C or C++ function pointers, callback functions, etc. They enable you to specify what the method that you want to call looks like without having to specify, at compile time, which actual method will be called.

Also revealed by its signature is that Predicate is generic. There are a lot of good introductions to Generics, so I will not traverse that ground here. Suffice it to say that in this context, generic refers to the fact that the same Predicate can be used with collections of different types.

One slight twist with Predicates worth mentioning is that since the Predicate class is baked into the .Net framework you do not need to specify the type argument of the generic method or to create the delegate explicitly. Both of these are determined by the compiler from the method arguments you supply. You will see from the examples what this means in practice.

The chief benefit of using Predicates, besides the “coolness factor,” is that they are more expressive, require less code, promote reuse, and surprisingly, are faster than other alternatives. Using a Predicate turns out to have better performance than a similar foreach operation. This is demonstrated below.

How do I Use Them?

The best way to illustrate the use of Predicates is to compare them to foreach routines. For example, say you have a List<string> collection like the one below. You want to find every member of that collection that starts with "a". One reasonable approach is to use foreach and return a new List<string> which contains only those members of the original list that start with "a".

The List<string> collection below will be used to illustrate the key concepts in this article.

Sample List Collection

List<string> items = new List<string>();
items.Add("Abel");
items.Add("Adam");
items.Add("Anna");
items.Add("Eve");
items.Add("James");
items.Add("Mark");
items.Add("Saul");

To find all members of this collection that begin with "a" using foreach, the method (or routine) would look something like this:

Finding All items starting with "a"

public List<string> FindWithA(List<string> collection)
{
   List<string> foundItems = new List<string>() ;
   foreach(string s in collection)
   {
      if ( s.StartsWith("a", 
         StringComparison.InvariantCultureIgnoreCase) )
      {
         foundItems.Add(s) ;
      }
   }
   return foundItems ;
}

In this case the List<string> returned would contain the following items: Abel, Adam, and Anna.

Now what if instead of retrieving every member of the collection that starts with “a” you just wanted to confirm that at least one member of the collection started with an "a"? In this case you’d likely create a new method like the one below. You might even be clever and factor out the common code to evaluate if a particular member of the collection started with “a” and use that for both the Filter and Exists methods.

Checking if an item starting With "a" exists

public bool CheckAExists(List<string> collection)
{
    foreach (string s in collection)
    {
       if (StartsWith(s)) //calls the refactored method
          return true;
    }
    return false;
}
 
// the test to see if the string starts with "a" is now factored 
// out to its own method so that both the existence check and the
// find all method could use it.
public bool StartsWith(string s)
{
   return s.StartsWith("a", 
      StringComparison.InvariantCultureIgnoreCase) ;
}

There is nothing particularly stinky about this approach and prior to Predicates, this was the best way to achieve the desired result. But let’s see how we could achieve the same ends using Predicates.

Using a Predicate to find all and check existence

public void SomeMethod()
{
   // Uses StartsWithA method to check for existence.
   if ( items.Exists(StartsWithA)) 
   {
      // Also uses the StartsWithA method, but now 
      // to find all values
      List<string> foundItems = items.FindAll(StartsWithA); 
   }
}
 
// Method conforms to the Predicate signature. 
// It returns a bool and takes a string. 
// (Note: Even though a Predicate is a generic,
// you do not need to supply the type.  The compiler 
// handles it for you.)
public bool StartsWithA(string s)
{
   return s.StartsWith("a", 
      StringComparison.InvariantCultureIgnoreCase);
}

Using Predicates just smells better --at least to my refined olfactory sensibility. The Predicate code could have also been written using an Anonymous Method. This is a good approach if the logic you are applying to the collection is not going to be used again in the application. If that logic might be reused, then my bias is to put it in a method so you don’t run the risk of code duplication. I have also found that Anonymous Methods decrease clarity as many beginning programmers do not understand the syntax. So if you do use them, make sure everyone on the team understands their use.

Using a Predicate as an Anonymous Method to find all items starting with "a"

public void SomeMethod()
{
   // FindAll now uses an Anonymous Method. 
   List<string> foundItems = items.FindAll(delegate(string s) 
      { 
         return s.StartsWith("a", 
            StringComparison.InvariantCultureIgnoreCase); 
      } ) ;
}

Sweet fancy Moses that is some elegant code! Sure the results are not different than what was achieved using a foreach loop, but using Predicates smells like it was just bathed in rosewater, swaddled, and then pampered with frankincense and myrrh. (If you are like me and wondered just what the f*#! are frankincense and myrrh, here's a link.)

Scent aside, there is a limitation to the standard implementation of Predicates. It is often the case that you will want to pass in a parameter to the method which the Predicate points to. It would be a pain to have to define a method for StartsWithA and another method for StartsWithB, etc. Because Predicates are delegates, however, you cannot change the signature to pass in additional arguments. Fortunately, it is easy enough to wrap the predicate in another class so you have access to additional parameters in your predicate method. There is a good article by Alex Perepletov entitled “Passing parameters to predicates” demonstrating this technique.
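
The basic idea is simple enough to sketch here: hold the extra parameter in a small class and expose an instance method that matches the Predicate<string> signature. (The class and member names below are purely for illustration; see Alex's article for a fuller treatment.)

using System;

// Wraps the prefix so one predicate method can serve "a", "b", or anything else.
public class StartsWithPredicate
{
    private readonly string _prefix;

    public StartsWithPredicate(string prefix)
    {
        _prefix = prefix;
    }

    // Matches Predicate<string>: takes a string, returns a bool.
    public bool Matches(string s)
    {
        return s.StartsWith(_prefix,
            StringComparison.InvariantCultureIgnoreCase);
    }
}

With that in place, items.FindAll(new StartsWithPredicate("a").Matches) and items.FindAll(new StartsWithPredicate("b").Matches) reuse the same code with different parameters.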

Of course, I've only demonstrated a few of the methods in the .Net framework that utilize predicates. I encourage you to review the API for Array and List to view the other methods of those classes that use Predicates. (I am not sure if there are any other classes that have methods which use Predicates, so if there are any outside of Array and List, please let me know so that I can post them too.)

Performance

I would argue that even if the Predicate class were a little slower than similar looping structures like foreach or for, they would still be preferable, as Predicates have other more important virtues. I know there are performance Nazis out there who agonize over nanosecond differences, but if a nanosecond difference is that important to your application then it is likely you should not be using C# at all. Still, having an understanding of the performance impact of your programming decisions is a good thing, and I was curious about how Predicates stacked up, so I put together a simple head-to-head comparison of Predicate vs. foreach. The test compared a List<string> of 100,000 string values to see which ones started with "1". For each test iteration, the search was performed 100 times. The test harness and results are below.
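
The comparison was set up roughly along these lines (a reconstruction using Stopwatch, not the exact harness used to produce the chart, but with the same list size and iteration count):

using System;
using System.Collections.Generic;
using System.Diagnostics;

public static class PredicateBenchmark
{
    public static void Run()
    {
        // Build a list of 100,000 string values; about a tenth start with "1".
        List<string> values = new List<string>();
        for (int i = 0; i < 100000; i++)
        {
            values.Add(i.ToString());
        }

        // Time 100 searches using FindAll with an anonymous-method Predicate.
        Stopwatch watch = Stopwatch.StartNew();
        for (int run = 0; run < 100; run++)
        {
            List<string> found = values.FindAll(delegate(string s)
            {
                return s.StartsWith("1");
            });
        }
        watch.Stop();
        Console.WriteLine("FindAll (Predicate): {0} ms", watch.ElapsedMilliseconds);

        // Time 100 searches using an equivalent foreach loop.
        watch = Stopwatch.StartNew();
        for (int run = 0; run < 100; run++)
        {
            List<string> found = new List<string>();
            foreach (string s in values)
            {
                if (s.StartsWith("1"))
                {
                    found.Add(s);
                }
            }
        }
        watch.Stop();
        Console.WriteLine("foreach:             {0} ms", watch.ElapsedMilliseconds);
    }
}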

[Chart: C# Predicate vs. foreach performance results]

As you can see, Predicates are the winner. We are, however, talking milliseconds and, in most situations, nanoseconds so I would not take one approach or the other on performance considerations alone.

Also, a few days after I completed these tests, I found another article that conducts a more thorough comparison between Predicates and foreach. You can find that comparison at Jon Skeet's Coding Blog.

As Long as We're Here: Actions and Converters

In addition to the Predicate class, .Net 2.0 also introduces the Action and Converter classes. Like Predicates, these are generic delegates. The Action class provides a simple way to walk all items of a collection and call a method on each member. Both Action and Converter work in the same way as the Predicate class -- including the ability to either define a named method or use an Anonymous Method. For example, if you wanted to display all the members of a List<string> collection, you could use the following:

Using an Action to display all members of a collection

public void SomeMethod()
{
   // I am using an Anonymous Method. As with Predicates,
   // I could have also defined a method and used it.
   items.ForEach(delegate(string s)
   {
      Console.WriteLine("value: {0}", s);
   });
}

The Converter class is used to change all members of a collection from one type to another type. The signature of the Converter class is a little different from the Predicate or Action classes: public List<TOutput> ConvertAll<TOutput>(Converter<T, TOutput> converter). It is, however, used in basically the same way. For example, if I wanted to turn a List<string> into a List<Name> (assuming for the moment that I had created a Name class), I could do the following.

Using a Converter to convert a List of strings.

public void SomeMethod()
{
   // The Converter must specify the type to be converted, string, 
   // the type it is being converted to, Name, and the method
   // doing the conversion, StringToNameConverter.
   List<Name> names = 
      items.ConvertAll(new Converter<string, Name>( StringToNameConverter));
}
 
// The method used by the ConvertAll method to do the actual
// conversion of strings to Names.
private Name StringToNameConverter(string s)
{
   return new Name(s);
}
 
// Our Name class might be defined as follows:
public class Name
{
   private string m_value;
   public string Value
   {
      get { return m_value; }
      set { m_value = value; }
   }
 
   public Name(string name)
   {
      Value = name;
   }
}

I did not benchmark the Action or Converter class, but my hunch is that they too would offer slight performance benefits over similar routines using foreach. Like the Predicate class, the chief benefit they offer is that they make it easier to write the same elegant, sweet smelling code and will likely encourage reuse as well. Finally, if you need to pass parameters to either class, you can wrap them in another class as shown in the Predicate example cited earlier.

Conclusion:

So we have arrived at the end of the article. Likely at this point you are so seduced by the sexiness of the code that the Predicate, Action, and Converter classes make possible that you are contemplating leaving your significant other and moving to Massachusetts to marry your application --which is, I believe, legal here in Boston. I wish you both well in that endeavor!

As always, do not hesitate to email me if you have any questions (my email is in the "About the Author" tab above.) If you extend what I've done or have additional information I missed, I would also greatly appreciate your letting me know. (Leaving a comment, positive or incredibly positive, is also always appreciated.)