My Development Blog

My Development experiences

Kentico – Inline Data Binding

Posted by ipwright83 on November 12, 2013

My first Kentico page resulted in a bit of a mess. It involved lots of WebPartZones, and each zone contained a WebPart that was then bound to CMSContext.CurrentDocument.SomeField. This allowed me to bind the name, description, image and so on.

Unfortunately this quickly looks a mess in CMSDesk, is quite time consuming, and it's just a bit of a pain working in CMSDesk for all of this content. I asked around and my team pointed out that I could do inline data binding like so:

<div class="imageInfo">
   <h1><%= CMSContext.CurrentDocument.GetStringValue("Title", "{Title}") %></h1>
   <p><%= CMSContext.CurrentDocument.GetStringValue("Description", String.Empty) %></p>
</div>

The problem is that at this point things like the Designer view in CMSDesk will frequently complain with the error:

The Controls collection cannot be modified because the control contains code blocks

This is rather annoying, and consequently developers have tended either to add lots of WebParts (the norm) or sometimes to assign values in the code behind. What I find easier, however, is to change the syntax slightly to use data-binding expressions:

<div class="imageInfo">
   <h1><%# CMSContext.CurrentDocument.GetStringValue("Title", "{Title}") %></h1>
   <p><%# CMSContext.CurrentDocument.GetStringValue("Description", String.Empty) %></p>
</div>

Now this won't work directly either, because your page probably hasn't been data bound. The easy way to fix this is to ensure you call DataBind(). In my case I wanted to put this in the master page, but the master page from Kentico didn't feature a code-behind file that I could easily modify, so I simply changed the master page's Inherits attribute like so:

<%@ Control Language="C#" Inherits="MasterPageLayout" %>

Now in page load I can call DataBind:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using CMS.PortalControls;

namespace NetC.BurtonRoofing
{
    /// <summary>
    /// Code behind for the Master Page
    /// <remarks>Allows the use of data-binding markup such as &lt;%# %&gt; in the ascx</remarks>
    /// </summary>
    public class MasterPageLayout : CMSAbstractLayout
    {
        /// <summary>
        /// Raises the <see cref="E:Load" /> event.
        /// </summary>
        /// <remarks>Calls the DataBind method for the page</remarks>
        /// <param name="e">The <see cref="EventArgs"/> instance containing the event data.</param>
        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);

            Page.DataBind();
        }
    }
}

This makes data binding nice and simple, keeping your binding in your designer code without needing to mess around with WebPartZones.

Posted in Uncategorized | 2 Comments »

WCF Services – Dispose can cause you problems

Posted by ipwright83 on November 7, 2013

Normally when you're working with something that implements IDisposable you should always call Dispose() on it. Unfortunately there are a whole bunch of classes that implement IDisposable explicitly, which means that you don't see Dispose in IntelliSense and makes it quite difficult to spot. The general rule is that if it deals with streams, connections, GDI or databases there will probably be a Dispose in there somewhere. The simplest way to handle this is to wrap the code in a using statement to ensure that Dispose is called even if there is an error.

One such example of this is the WCF client. You would normally expect to write something like the following:

using(PaymentPortTypeClient client = new PaymentPortTypeClient())
{
    // Do Stuff
}

Unfortunately for WCF the base class System.ServiceModel.ClientBase<TChannel> can throw an exception from Dispose if the channel is faulted, for example because the connection failed to open. This is a bit of a pain, so instead we need to dispose conditionally depending on the connection state.
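In its simplest form that means closing on success and aborting on failure; a minimal inline sketch using the same PaymentPortTypeClient as above:

PaymentPortTypeClient client = new PaymentPortTypeClient();
try
{
    // Do Stuff with the client...
    client.Close();
}
catch (Exception)
{
    // Close() can also throw if the channel is faulted, so fall back to Abort()
    client.Abort();
    throw;
}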

A little helper for achieving this:

/// <summary>
/// Use the service safely such that it will correctly dispose itself upon success or failure.
/// </summary>
/// <typeparam name="T">The service contract type</typeparam>
/// <param name="codeBlock">The delegate to execute safely</param>
public static void Use<T>(UseServiceDelegate<T> codeBlock)
{
    // Ensure that the factory has been initialized
    // (GetChannelFactory is not from the original post; see the sketch below)
    IClientChannel proxy = (IClientChannel)GetChannelFactory<T>().CreateChannel();
    bool success = false;
    try
    {
        codeBlock((T)proxy);
        proxy.Close();
        success = true;
    }
    finally
    {
        if (!success)
        {
            proxy.Abort();
        }
    }
}
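The helper above also needs a UseServiceDelegate type and somewhere to get the channel factory from; those aren't shown in the original, so here is one possible shape for them. The Service class layout, the GetChannelFactory name and the "*" endpoint name are my assumptions rather than the original code:

using System;
using System.Collections.Concurrent;
using System.ServiceModel;

/// <summary>Delegate executed against the typed service proxy.</summary>
public delegate void UseServiceDelegate<T>(T proxy);

public static class Service
{
    // One cached ChannelFactory per contract type; "*" picks up the single
    // endpoint configured for that contract in app.config/web.config.
    private static readonly ConcurrentDictionary<Type, object> factories =
        new ConcurrentDictionary<Type, object>();

    private static ChannelFactory<T> GetChannelFactory<T>()
    {
        return (ChannelFactory<T>)factories.GetOrAdd(
            typeof(T), t => new ChannelFactory<T>("*"));
    }

    // ... the Use<T> method shown above lives here as well.
}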

At this point, assuming you've placed the method in a class called Service, you can use the following code to safely achieve what Dispose would have done on your connection:

Service.Use<PaymentPortType>(proxy =>
{
    // Do Stuff
});

Posted in C# | Tagged: , | 2 Comments »

Kentico – Pages losing content on refresh?

Posted by ipwright83 on October 30, 2013

Being fairly new to Kentico I hadn't used the built-in workflow system before. I picked up a solution that had it enabled and quickly got the hang of it. There's a little toolbox available to you when you've got a document selected. To publish a change you simply need to submit the item for approval:

[Image: publish]

Then you can publish it:

[Image: submit]

Nice and straightforward. However, when you're not used to it you can sometimes forget, which can lead to some behaviours that seem really strange at first. What I'd find is that on the first load my item had some content (name, description, image, for example). I'd make a change to my JavaScript or one of the ASPX files, refresh, and suddenly all my content had gone. This was quite baffling.

[Image: content]

The reason this happens is that on the first load you are using Kentico's Preview mode. This allows you to see the current content in Kentico that hasn't yet been published. If, however, you refresh your page then Kentico switches to the live mode, which doesn't show unpublished information. Therefore if you forget to publish your changes to a document (images and the like) you'll see a different view between the initial load and a refresh.

[Image: NoContent]

Quite bizarre at first. To avoid it, simply remember to publish your changes or turn off workflow while you're developing. To do this:

1) Open Kentico’s SiteManager by browsing to http://hostname/CMSSiteManager
2) Select the Workflows Option in the tree on the left
3) Select the Workflow you want to modify and click the edit pencil
4) Click the Scope option
5) Modify the condition in the advanced section to something that won’t match for example ‘0=1’

Posted in Kentico | Leave a Comment »

Visual Studio – File Navigation

Posted by ipwright83 on July 25, 2013

tl;dr;

VSFileNav – a new, improved, themed version for Visual Studio 2012 is available for download here.

 

Finding Files in Visual Studio

Once upon a time I used Resharper Express to jump between files in a Visual Studio solution. However, in Visual Studio 2010 this feature was removed from the Express version 😦

[Image: navigation_gotofile]

 

At this point I decided to get involved in Visual Studio extension writing and wrote my first extension, VSFileNav. This allows you to quickly find files in your solution using 'contains' matching or camel-case matching. I use it all the time and find it incredibly useful (as do a few others who've rated it). It did, however, have a few bugs in it and didn't install straight into VS2012. A little while ago I finished off an update, which ships as a separate installer. There is a download link on the extension page but here it is again: VSFileNav VS2012

The behavior is the same as before, with some extra additions. After finally getting my head around the crazy event model in Visual Studio (blasted classes get garbage collected even when you're hooked up to their events, unless you keep an explicit reference to them; see the sketch after the list below), it should now:

  • Re-cache all the files when you change solution.
  • Add additional files when they’re added to a project.
  • Add additional files in projects, when a project is added to the solution.
  • Remove files when they’re removed from a project…. you get where this is going?
  • Remove files when a project is removed from the solution.
  • Handle renames of both files/projects appropriately.
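That garbage collection gotcha is worth spelling out: the EnvDTE event source objects are COM wrappers, and if you only subscribe to them inline the wrapper can be collected and your handlers silently stop firing. A rough sketch of the pattern that avoids it (the SolutionWatcher class here is illustrative, not VSFileNav's actual code):

using EnvDTE;
using EnvDTE80;

public class SolutionWatcher
{
    // Keep explicit references to the event source objects; if these were only
    // local variables the COM wrappers could be garbage collected and the
    // events would silently stop firing.
    private readonly SolutionEvents solutionEvents;
    private readonly ProjectItemsEvents projectItemsEvents;

    public SolutionWatcher(DTE2 dte)
    {
        Events2 events = (Events2)dte.Events;

        solutionEvents = events.SolutionEvents;
        solutionEvents.Opened += OnSolutionOpened;
        solutionEvents.ProjectAdded += OnProjectAdded;
        solutionEvents.ProjectRemoved += OnProjectRemoved;

        projectItemsEvents = events.ProjectItemsEvents;
        projectItemsEvents.ItemAdded += OnItemAdded;
        projectItemsEvents.ItemRemoved += OnItemRemoved;
        projectItemsEvents.ItemRenamed += OnItemRenamed;
    }

    private void OnSolutionOpened() { /* re-cache all files */ }
    private void OnProjectAdded(Project project) { /* add the project's files */ }
    private void OnProjectRemoved(Project project) { /* remove the project's files */ }
    private void OnItemAdded(ProjectItem item) { /* add the file */ }
    private void OnItemRemoved(ProjectItem item) { /* remove the file */ }
    private void OnItemRenamed(ProjectItem item, string oldName) { /* update the file */ }
}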

Additionally, and this bit took about as long as all the other functionality put together, it has been updated to use the VS2012 theme. This includes both the icons/glyphs and colours. Unfortunately due to a bug in Visual Studio this isn’t done automatically but Tools->Options->VSNav Options allows you to switch between the standard light, dark and blue themes. Here are some screenshots:

[Screenshot: Dark theme]

[Screenshot: Light theme]

[Screenshot: Blue theme]

I hope you find it as useful as I do, and that the new colour schemes are easier on the eye alongside the VS themes. Once again the download link is here.

Posted in Tools | 1 Comment »

MVC Part1 – Routing

Posted by ipwright83 on June 14, 2013

I’ve been spending some time watching videos to learn ASP.NET and decided to blog the useful nuggets of information that I’ve learnt to try and help myself remember them, and share them with anyone interested.

Routing

So what's routing? Routing is what MVC uses to determine where an HTTP request should go. When you make a request such as http://localhost/home/about, the URL gets split into segments which determine which controller and action method should handle it.

Routing is set up in the Global.asax file with the following line:

RouteConfig.RegisterRoutes(RouteTable.Routes);

So let's take a look at the RouteConfig class:

public class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

        routes.MapRoute(
            name: "Default",
            url: "{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
        );
    }
}

The first line to note is routes.IgnoreRoute. This prevents requests for .axd resources from being passed through the routing system, as these are handled by their own ASP.NET handlers. Following that, a single route is set up which matches the pattern {controller}/{action}/{id}, and defaults are provided. This means when we visit /Home/About it will use the Home controller and the About action (aka method).
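To make the mapping concrete, a hypothetical controller hit by the default route (ProductsController isn't part of the original example) would look like this:

public class ProductsController : Controller
{
    //
    // GET: /Products/Details/5
    // {controller} = "Products", {action} = "Details", {id} = "5";
    // the id segment is bound to the int parameter automatically.
    public ActionResult Details(int id)
    {
        return Content("Product " + id);
    }
}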

We could create our own route, for example to define a system for controlling the language, so that whenever /language/ is used it will direct to our controller. It should be noted that the default route matches {controller} first, which is essentially a placeholder that matches almost anything, making it very greedy. Therefore we need to place our custom route before it:

routes.MapRoute("Language", "language/{name}",
    new { controller = "Language", action = "Search", name = "" });

At this point we'll get a 404 'Page not Found' error if we try to navigate to /language/english. The reason is that we don't yet have an appropriate controller. If we create a new LanguageController and rename the 'Index' method to 'Search' then we should be able to test the system.

public class LanguageController : Controller
{
    //
    // GET: /Language/
    public ActionResult Search(string name)
    {
        string message = Server.HtmlEncode(name);
        return Content(message);
    }
}

Notice here that the name value has been passed in as a parameter. This is because the controller will automatically search through the routing data, query strings and so on, and fill it in for us. Also remember to protect your output against potential attacks: the Razor engine introduced later will encode for us, but for custom actions that return raw content we need to do this manually using helper methods such as Server.HtmlEncode().

What is interesting is that this information can be interrogated later on within your controller, for example:

public ActionResult Index()
{
    var controller = RouteData.Values["controller"];
    var action = RouteData.Values["action"];
    var id = RouteData.Values["id"];

    string message = controller + "::" + action + " " + id;
    ViewBag.Message = message;

    return View();
}

You can then test this by running your application and going to /Home/Index/49833345.

There are some other useful things to note. When dealing with URLs you should generally use the tilde ~, which means 'from the application root'; this saves you from figuring out complex relative locations by hand. For example, Server.MapPath("~/Content/site.css") resolves that virtual path to the real location of the file on the server, while Url.Content("~/Content/site.css") resolves it to a URL the browser can use.
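A quick illustration of the difference inside a controller action (the paths in the comments are examples only):

public ActionResult Paths()
{
    // Physical location on disk, e.g. C:\inetpub\wwwroot\MySite\Content\site.css
    string physicalPath = Server.MapPath("~/Content/site.css");

    // Application-relative URL for the browser, e.g. /MySite/Content/site.css
    string clientUrl = Url.Content("~/Content/site.css");

    return Content(physicalPath + Environment.NewLine + clientUrl);
}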

When you add a method to a controller, by default its name will be used as the action name. However, using the ActionName attribute it is possible to change this behaviour and use a different name, in this example Modify:

[ActionName("Modify")] 
public ActionResult Index()
{}

You can also specify verbs, which ensures the action is only called for specific types of HTTP request (e.g. GET, POST, PUT). This also allows two action methods to have the same name. You can do this using attributes again:

[HttpPost]
public ActionResult Index()
{}

Finally, it is also possible to apply action filters to actions, for example to ensure only people with certain permissions have access to an action.

[Authorize(Roles = "Admin")]
public ActionResult Search() {}	
public ActionResult Edit(string departmentName) {}

Posted in Uncategorized | Leave a Comment »

Windows based Git Client

Posted by ipwright83 on March 14, 2013

I find myself increasingly having to grab code from GitHub or similar sources for the project I'm working on. It's relatively straightforward to hit the 'ZIP' button in GitHub:

[Image: zip]

 

The problem is that it's awkward and clunky to keep doing this to get changes, extracting over old locations. Also, what if you want to contribute to a project? I've tried setting up Git on a Windows machine, creating all the various SSH keys, and found it hard work… I want my source control to be simple.

GitHub have released a new Windows client, which I've not actually tried yet; the reason being I believe you can only set it up to point to one location (e.g. GitHub) and then choose repositories to fork. As an alternative, Atlassian have started a beta programme for their conversion of SourceTree, their popular Mac source control software. You can sign up to the private beta here, which currently allows connecting to GitHub, BitBucket and Stash.

I've already received one update to the software and they're receptive to bugs and feature requests. Best of all it's free (and I believe it will remain free, just like the Mac version). I find it's currently a little sluggish due to the live 'Diff' pane that refreshes for each file you've got selected, but it's generally a nice, clear system to use. They've gone for a simple, clean UI, which is a great improvement over trying to use something like TortoiseGit and finding the right options in the right menus:

[Image: Clear]

Posted in Uncategorized | Leave a Comment »

WPF – RateBar

Posted by ipwright83 on June 5, 2012

As per my previous post, I have been working on a WPF-based progress bar. This is supposed to be similar to the Windows 8 file transfer progress bar; I've named it a 'RateBar' as that seems appropriate. Well, I've finished the initial version of this project, and based on the popularity of the question on StackOverflow I imagine there are some other people who would benefit from the control.

I've not got any automated tests for it (I'll try to add some at some point) and I don't know if it works with WPF styles (mainly because I don't know how they work), but it all seems to work so far and matches my initial intention quite nicely. To get a true Windows 8 style it'll probably need combining with an 'expansion' arrow which switches between a progress bar and a rate bar.

Some basic instructions to get you started. You’ll need to do the following (I hope most of which are fairly obvious):

  • Add a reference to RateBar.dll within your WPF project.
  • Add the RateBar namespace within your WPF window/control that you wish to include it on:
            xmlns:my="clr-namespace:RateBar;assembly=RateBar"
  • Add an instance of the RateGraph, recommended size is 380×88:
            <my:RateGraph x:Name="rateGraph1" Height="88" Width="380" />
  • Update Value (the progress), Rate (the current rate) and Caption (the rate info above the black line) on a regular basis.
Note: the Progress needs to change before the Rate is updated, otherwise you'll end up with a vertical line on your graph, as the rate gets added to the graph at the current progress location. A minimal usage sketch follows below.
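For example, driving the control from a DispatcherTimer might look something like the following. This is only a sketch based on the property names above (Value, Rate, Caption); the transfer figures are invented and the rate units are up to you:

using System;
using System.Windows;
using System.Windows.Threading;

public partial class MainWindow : Window
{
    private readonly DispatcherTimer timer = new DispatcherTimer();
    private readonly Random random = new Random();
    private double bytesTransferred;
    private const double TotalBytes = 100 * 1024 * 1024; // pretend 100 MB transfer

    public MainWindow()
    {
        InitializeComponent();

        timer.Interval = TimeSpan.FromMilliseconds(500);
        timer.Tick += OnTick;
        timer.Start();
    }

    private void OnTick(object sender, EventArgs e)
    {
        // Pretend some data arrived during this tick.
        double bytesThisTick = random.Next(512 * 1024, 2 * 1024 * 1024);
        bytesTransferred += bytesThisTick;

        double bytesPerSecond = bytesThisTick / timer.Interval.TotalSeconds;

        // Update the progress first, then the rate (see the note above).
        rateGraph1.Value = Math.Min(100, bytesTransferred / TotalBytes * 100);
        rateGraph1.Rate = bytesPerSecond;
        rateGraph1.Caption = (bytesPerSecond / (1024 * 1024)).ToString("0.0") + " MB/s";

        if (rateGraph1.Value >= 100)
        {
            timer.Stop();
        }
    }
}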

I've included the source and release binaries in a zip file accessible from here. If you like this and would like to contribute in some way then please feel free to sign up for Dropbox, if you're not already on it (free multi-location file storage with cloud synchronization), using my referral link. This will give me more space for hosting source code, binaries and useful tools for the community.

The license for the code is that you may use it in any project and modify it as much as you need. Please reference me somewhere if you have a readme, and if you'd like to use it commercially feel free to do so, as long as you leave a comment so I know how many people are using it (and whether I should invest more time improving it!).

Posted in Uncategorized | 4 Comments »

Windows 8 Style Progress Bar

Posted by ipwright83 on June 1, 2012

I recently saw a preview of Windows 8. I didn't see very much of it, but one of the things I did see really caught my eye: the new file transfer progress bar. For those of you who haven't seen it, a screenshot is included below. Not only does it report progress, but it shows you the current rate of transfer and how that has varied over time.

I wondered whether or not this was something that Microsoft would be releasing; a question on StackOverflow led to 'maybe', but chances are it would only be available for Windows 8 anyway. I could see some real potential for this sort of feedback in the application I contribute to at my workplace, and many of its customers won't be using Windows 8 for some time. I love writing custom tools to make my development life easier, so I thought: why not write a new progress bar mimicking this behaviour?

I could have chosen to do this in WinForms and GDI. That would have been well within my comfort zone, although probably tedious when ensuring things were pixel perfect. I have been trying to learn some WPF in my spare time, and one of the big problems is coming up with small, isolated projects that are interesting and achievable. I felt this fit the bill, so I got started during my lunch breaks and evenings.

I originally thought that I'd just inherit from a ProgressBar and modify some visuals (I actually did do this in the end), but when I discovered I had to use templates it gave me some shivers; I still struggle with bindings in XAML and didn't feel I could easily get answers if I went down the wrong implementation. So I started off writing a purely code-based solution: ProgressGraph.cs.

Although it was fairly easy, and did seem to work, it felt like it was breaking the WPF paradigm: I hadn't touched a piece of XAML and I'm sure it had bugs in it. I loaded up Reflector to see how ProgressBar had been implemented, and extended that implementation to include the relevant rate-based information that I needed: RateBase.cs. This is a reusable class that you can build your own templates for if you so wish. I learnt a few things on the way: a little about DependencyProperties, I was reminded that co-ordinates are based from the top-left, and I had initially forgotten to use the Dispatcher.
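For reference, registering rate information as a dependency property looks roughly like this. This is only a sketch of the idea; the actual RateBase code may be organised quite differently:

using System.Windows;
using System.Windows.Controls;

public class RateBase : ProgressBar
{
    // Sketch only: register an additional Rate dependency property on top of
    // the Value/Minimum/Maximum that ProgressBar already provides.
    public static readonly DependencyProperty RateProperty =
        DependencyProperty.Register(
            "Rate",
            typeof(double),
            typeof(RateBase),
            new FrameworkPropertyMetadata(0.0, FrameworkPropertyMetadataOptions.AffectsRender));

    /// <summary>The current rate being reported (for example, bytes per second).</summary>
    public double Rate
    {
        get { return (double)GetValue(RateProperty); }
        set { SetValue(RateProperty, value); }
    }
}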

I then started writing my template, which was actually relatively easy once I knew what the bindings should be (note to self… controls don't appear if you forget to apply the template!). I wanted to do as much as possible in the XAML, using converters where possible, and I managed to achieve much of this when I stumbled across a blog about a JavaScript-based multi-converter. This allows me to do much of the calculation within the XAML itself. The only aspect I'm less pleased with is how I've handled updating the graph, as this had to be done in the code behind. I don't want to pollute the RateBase with a property of historical values, but I do need to store these somewhere to produce a graph. It works at least, and I managed to achieve most of it in XAML, which was my aim.

Keep an eye out for the next post where I’ll be sharing a link to the source and compiled version…

Posted in C#, WPF | Tagged: , , | Leave a Comment »

Visual Studio – File Navigation Extension

Posted by ipwright83 on May 30, 2012

One of the most frustrating things I found when I upgraded to Visual Studio 2010 was the navigation. I work with a fairly large solution with hundreds of files, many of which I can't locate via the Solution Explorer as I just can't remember where they are. To make this harder, sometimes the file names don't match the class names, and other times the name is a compound of several words whose combination I can't quite remember.

Within Visual Studio 2008 I used to use Resharper, and I almost exclusively used it for the file searching capabilities. It was quick and supported camel-case searching. I'd previously used other solutions but found them to be slower and not as lightweight. Once I upgraded to VS2010, however, I found that the search had been disabled in the Express version.

Not wanting to sacrifice the brilliant searching, I decided to write my own Visual Studio add-in (I found this much harder than I anticipated, hence the extension doesn't get many updates). It does exactly what I wanted, in a performant way, and is available to anyone who wants it. It allows searching by name or camel case and supports wildcards, with filtering as you type and highlighting of the matching criteria, bumping the best matches to the top of the list.

[Image]

You can find the shortcut under the ‘Edit->Find and Replace->VS File Nav’ option and can re-bind the shortcut using Edit.VSFileNav.

Feel free to have a play and leave some feedback.

Visual Studio Gallery – VSFileNav

Posted in Uncategorized | Tagged: , , | Leave a Comment »

Memory Leak within XmlSerializer

Posted by ipwright83 on May 3, 2012

Within the product I work on we use XML-based serialization for some of our classes. The original aim was to provide a way of storing settings that was human readable, to make debugging and upgrades easier in the future, while avoiding the deprecated SOAP serialization. Unfortunately XML serialization brings its own problems, the most recent of which we discovered was leaking memory.

I discovered the memory ramping up via Process Explorer; if you don't have this tool I recommend downloading it for free as part of the SysInternals suite.

This led to our XML serialization helper class. By commenting out parts of the code, the problem was eventually isolated to a section along the following lines:

private XmlSerializer GetSerializer(Type type, XmlRootAttribute root)
{
    return new XmlSerializer(type, root);
}

When you construct a new XmlSerializer it will generate a temporary assembly on disk (not, however, if you generate the XML serialization assemblies at build time via SGen). Normally these assemblies are loaded into the AppDomain and then cached, but not in the case of this specific constructor overload. It never caches the generated assembly; instead a new one is created every time, and therefore it leaks memory.

The solution is fairly simple: create your own cache. In our case we keyed the cache on the type name and the element name of the XmlRootAttribute:

private readonly Dictionary<String, XmlSerializer> cache = new Dictionary<String, XmlSerializer>();

private XmlSerializer GetSerializer(Type type, XmlRootAttribute root)
{
    String key = type.AssemblyQualifiedName + "|" + root.ElementName;
    XmlSerializer result;

    if (cache.TryGetValue(key, out result))
        return result;

    result = new XmlSerializer(type, root);
    cache.Add(key, result);
    return result;
}

Note that the code above isn't thread-safe; you'd need some form of locking if you were to call it from multiple threads (a sketch of that is below). This however solved our problem and gave us a nice flat memory profile during deserialization.
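If multiple threads do need to share the cache, a minimal thread-safe variant (my sketch, not the code we shipped) simply guards the dictionary with a lock:

private readonly Dictionary<String, XmlSerializer> cache = new Dictionary<String, XmlSerializer>();
private readonly Object cacheLock = new Object();

private XmlSerializer GetSerializer(Type type, XmlRootAttribute root)
{
    String key = type.AssemblyQualifiedName + "|" + root.ElementName;

    lock (cacheLock)
    {
        XmlSerializer result;
        if (cache.TryGetValue(key, out result))
            return result;

        // Only one thread builds the serializer for a given key, so the
        // expensive temporary assembly is only generated once per key.
        result = new XmlSerializer(type, root);
        cache.Add(key, result);
        return result;
    }
}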

Some useful links I discovered on my way:

Microsoft's Tess Ferrandez – .NET Memory Leak: XmlSerializing your way to a Memory Leak
Microsoft Support – Memory usage is high when you create several XmlSerializer objects in ASP.NET
Microsoft Connect – XmlSerializer memory leak

A little later I’ll be discussing Binary serialization, why I’d like to move to it instead of XML serialization and how we can work around the issue of human readability.

Posted in Uncategorized | Tagged: , , , | Leave a Comment »

 