
Visual Studio 2008 Seminar in Jackson, MS and Geek Dinner in Hattiesburg on Feb 4th

Posted by Keith Elder | Posted in Presentations, Smart Clients, Speaking | Posted on 25-01-2008


Doug Turnure, Chad Brooks, and I are presenting a half-day technical overview covering the new features in Visual Studio 2008 in Jackson, MS from 9:00 AM to 12:00 PM on February 4th, 2008.  Then later that night we are having a Geek Dinner with special guest Sara Ford in Hattiesburg, MS.  Here are the details and how to register for the seminar (yes, you HAVE to register for the seminar).  You do not have to register for the Geek Dinner, but please RSVP as described.

Seminar Agenda

9:00 – 10:00: Visual Studio 2008 Overview for Architects (Chad)

If your job as a Software Architect is to research and exploit technology to help achieve business results, then you need to take a look at what has just been released in Visual Studio 2008 and the .NET Framework 3.5.  In this talk we will discuss several of the many new features you need to understand in order to fully leverage these new capabilities in your environment, including multi-targeting support, new features for SOA, and web enhancements. Finally, we will discuss the road ahead and preview some of the local architect, web design, and developer events that you can use to further your understanding of the Microsoft platform and development technologies.

10:00 – 11:00 Visual Studio 2008 and the .NET Framework 3.5 (Doug)

Visual Studio 2008 has shipped, and with its arrival, there are several new features in the framework, compilers, and the IDE itself. In this technical overview, we will discuss some of the new developer productivity enhancements in Microsoft Visual Studio 2008. We’ll explore new designers that assist developers in creating applications more quickly and easily. We’ll also take a look at LINQ, and the benefits it provides.

11:00 – 12:00 Visual Studio 2008: New Developer Productivity Enhancements (Keith)

Learn what is new in Visual Studio 2008 when it comes to building Smart Clients.  We'll take a look at several new enhancements and improvements developers can take advantage of in Visual Studio 2008.  In this session we'll look at things such as Client Application Services, new and improved ClickOnce features, offline and online data syncing, and more.


New Horizons Computer Learning Center
1855 Lakeland Drive
Suite R101
Jackson, MS 39216

Phone: 601-914-4500


To register by phone, call 877-673-8368 and reference Event ID: 1032367675.

Geek Dinner Agenda

7:00 – 10:00 Geek Dinner in Hattiesburg with Sara Ford


Chesterfields’ Restaurant
Hattiesburg, MS

For more information about the dinner, including directions and how to RSVP, please visit the previous Geek Dinner announcement on this site.

My Codemash Podcast Is Up!

Posted by Keith Elder | Posted in .Net, Smart Clients, Speaking | Posted on 15-01-2008


Chris Woodruff caught me at Codemash and we did a podcast together.  It was a very free-flowing conversation.  We covered a lot of topics, including my open source background, how I got started in .Net, Smart Clients vs. Web Applications, WPF, music, the value of learning multiple languages, and why Codemash is so unique.

You can grab the podcast here:



By the way, there are also podcasts from Dustin Campbell, Sara Ford, and Michael Rozlog on the Codemash web site.

Datasets vs Business Entities

Posted by Keith Elder | Posted in .Net, Smart Clients, Web Services | Posted on 26-10-2007


If you are an experienced .Net developer, more than likely you've come to a crossroads of sorts over which object model to go with.  Do you use strong-typed Datasets, write your own business entities from scratch, or generate business entities using an ORM?  A reader emailed me asking for my opinion the other day, and the very same question was raised on Twitter by Joel Ross yesterday as well.  Here are my words of wisdom on the subject to hopefully help you arrive at a conclusion about which road you want to travel.

Things to Think About

What are you going to do with the data?  This is the most important question to ask yourself.  The reason is that Datasets and business entities solve two different problems, and where one makes life easy in one scenario, it overly complicates the heck out of another.  Let's take a simple scenario that any Smart Client developer may run into.  For example, let's say you are instructed to build a search screen within your application and bind the results to a DataGridView.  The user interface should allow an end-user to search and return joined row data from the data store.  After the data arrives at the client, the end-user needs the ability to filter, group, and sort the data within the client.  What do you do?  Here is a rough draft of what we need.


In this case my default answer is a plain Dataset.  There are a few things to think about in this scenario.  The first is how often the screen data might change.  The example above returns CustomerId, CompanyName, ContactName, and ContactTitle to the end-user.  The question is: how do we handle it if the business requirement changes a month from now and we need to add a new Email column to the result set?  Let's look at the three options we could go with to tackle this scenario.  It helps to visualize it on paper.

                                                   Non-Typed   Typed     Business
                                                   Dataset     Dataset   Entity
Fastest Development Time                                       X
Requires Custom Programming for Filter and Sort                          X
Requires Re-deploying Client To Add New Column                 X         X
Requires Re-deploying Service To Add New Column    X           X         X
Heaviest Payload                                   X

Looking at the table, we see the non-typed Dataset has the fewest checks (one less).  While it isn't the fastest to develop, because we do not get all of the automatic pre-built bindings, it is still pretty fast, and much faster than the custom business entity.  Even then we still don't have to write our own sorting and filtering routines, nor do we have to redeploy our client.  Having to redeploy is the biggest cost to a business and should be taken very seriously when developing Smart Clients for the enterprise.  Downtime is not cool in any business, even if it only takes a minute for a ClickOnce app to be redeployed.  In this scenario all we'd have to do is change the way we fill our Dataset within our middle tier (services layer) and then send it down the wire.  This change could be made pretty much whenever we want, without having to interrupt the business.  Notice we get the flexibility of being able to change our business requirements on the fly, so to speak, but we are using the heaviest payload to send our data over the wire to our client.  If you aren't familiar with why strong-typed Datasets can have smaller payloads over the wire via web services, then read this tip on strong-typed Datasets and web services.

Is the Data Really an Entity?

The above example didn't fare very well for using an entity.  Why?  I think it has to do with the fact that the problem we were trying to solve didn't map well to an entity.  I even question whether the results of a search in this scenario are an entity.  I argue that they aren't; they are a result set based on an action, not a true business entity.  If we think of the entity above as a Customer entity, we would have a lot more properties within it: addresses, contact information, maybe orders, and so on.  In our scenario we didn't need any of that data to be filled.  This is where a lot of the ORM mappers that help developers build entities fall short: only a few properties need to be loaded, yet we have to pay what I call the entity tax, loading all the data within the entity just to get to a few fields.

What if we created a brand new entity with just the four fields we needed to display?  While we could create a plain old collection of C# objects that only have the fields we need, we are still back to the problems of filtering, sorting, grouping, and deployment.
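To make the trade-off concrete, here is a minimal sketch of what that slimmed-down search object might look like; the class and property names are illustrative, not from an actual Northwind entity model.  The comment shows the filtering and sorting we get for free from a Dataset's DataView, which we would have to hand-roll for a plain collection.

```csharp
using System.Data;

// Hypothetical slimmed-down object holding only the four search columns.
public class CustomerSearchResult
{
    public string CustomerId { get; set; }
    public string CompanyName { get; set; }
    public string ContactName { get; set; }
    public string ContactTitle { get; set; }
}

public static class SearchBindingExample
{
    // With a Dataset, the grid can bind to a DataView and filtering and
    // sorting come along for free:
    public static DataView FilterAndSort(DataTable results)
    {
        DataView view = new DataView(results);
        view.RowFilter = "ContactTitle = 'Owner'";
        view.Sort = "CompanyName ASC";
        return view;
    }
    // With a List<CustomerSearchResult> we would have to write that
    // plumbing (plus grouping and change tracking) ourselves.
}
```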

In this scenario:  Dataset 1  Entity 0

Another Scenario

To take our example a little further, what would we do if the end-user was able to double-click one of the rows that was returned from the search?  The end-user would then be presented with a new screen so they could edit the customer record and see other data like address information, contact information and so on.  What would we do in this case?

In this scenario we are truly dealing with an entity.  We are dealing with a Customer entity, and it makes perfect sense to handle all of our bindings directly against a business entity.  Sure, we have to bake in all of the OnChange events, validation, and so on, but the end result is a very flexible way of dealing with our Customer.  Maintaining customer information in just a Dataset is slow, carries lots of overhead, and isn't nearly as clean as a plain old C# object (or entity, however you think of it).  We can wrap our Customer entity with policy injection and validation much more cleanly than we can while trying to represent a Customer in a DataSet, no matter how we look at it.
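As a rough idea of the plumbing involved, here is a minimal sketch of the change notification a hand-rolled Customer entity needs for two-way WinForms binding.  The single property shown is illustrative; a real Customer entity would have many more, plus validation.

```csharp
using System.ComponentModel;

// Minimal Customer entity with change notification so bound controls
// refresh when the underlying data changes.
public class Customer : INotifyPropertyChanged
{
    private string _companyName;

    public string CompanyName
    {
        get { return _companyName; }
        set
        {
            if (_companyName != value)
            {
                _companyName = value;
                OnPropertyChanged("CompanyName");
            }
        }
    }

    public event PropertyChangedEventHandler PropertyChanged;

    protected void OnPropertyChanged(string propertyName)
    {
        if (PropertyChanged != null)
            PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
    }
}
```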

In this scenario:  Dataset 1  Entity 1

Deadlines and Size of Application

When it comes to making a decision in your applications, use common sense about what you are developing.  Honestly, if I'm building a one- or two-page web form for internal use within our company, I'm cutting corners to get the thing done.  I want it to work and make the client happy, but then move on.  Datasets, here I come!  On larger applications, though, that isn't the case.  For example, if you are building a commercial web site or a large-scale enterprise app, the code base will be taken more seriously.  There will be plenty of time to identify entities and put in the proper plumbing such as filtering, sorting, grouping, change tracking, and more.  You may also take the time to explore one of the many entity frameworks available for .Net to give yourself a jump start.


No matter how many times we argue the benefits and merits of one versus the other, I think the best approach a developer can take in the Datasets vs. entities argument is a holistic look at the problem he or she is trying to solve.  Don't forget to take into account future changes and how much maintenance may be required.  Use common sense and match the best solution to the problem.  I hope this helps and is now officially clear as mud.



Sync Services for SQL Server Compact Edition 3.5 in Visual Studio 2008

Posted by Keith Elder | Posted in Smart Clients, SQL Server Compact Edition | Posted on 23-09-2007


One of the new features in Visual Studio 2008 is the ability to easily sync data from the server to the client, and the client to the server, using Sync Services for ADO.Net.  For developers supporting Winform and Smart Clients this is a much-welcomed feature.  Developers will be able to take advantage of Sync Services in a variety of ways, including a local store for offline data as well as caching data locally for speed.

When you deal with syncing data as a developer, the first thing that pops into mind is conflict resolution.  For those getting started with Sync Services, we'll look at how to set up sync services in a new project and then at some of the things you'll need to know in order to handle conflict resolution within your applications.  All of the source code for this is available for download at the end of the article.

One Way Data Sync with Sync Services

In Visual Studio 2008 we have several new templates to get us started.  The one used for Sync Services is called "Local Database Cache".  To get started with Sync Services, open Visual Studio 2008 and create a new Windows application.  Once you have your Windows application created, add a new item to your project and select the following template in the data category.


Essentially this file provides all the sync logic, and we'll use it later on to extend it and insert our own logic for handling conflict resolution.  Other options can be extended here as well, but more on that later.  Once the Local Database Cache template is added to the solution, a new screen to configure our data synchronization will be displayed.


From this screen we are going to select our database connection (or create one).  In this walkthrough we are going to leverage the Northwind database.  You can download this database from Microsoft if you do not already have it.  Once the Northwind database is set up, create a connection to it.


The next step is to identify the tables we want to sync from the server to the client.  In the lower left corner of the configure data synchronization screen we are going to select the “add” button and choose our tables.


In the above example three tables were selected from the Northwind database: Customers, Employees, and Shippers.  Note: if your tables do not use the built-in naming convention of "LastEditDate" or "CreationDate" for comparing updates and inserts, click the "New" button and specify the name.  For example, as a standard we typically use LastModifiedDate.  In this screen note that DateTime and TimeStamp are both supported.


Once the tables are configured, the only options left are under the Advanced button on the main screen.  This is where the project location is specified (which is how you move sync services into a separate project to support WCF services), along with a few other options that should be self-explanatory.


Once everything is configured, press OK and the selected server database tables will be mirrored onto the client database you selected.


Once the data is synced into the local SQL Server Compact database, the next screen allows us to build the data objects for the project.  Strong-typed datasets are the default option, since most if not all of the other controls, such as DataGridView, BindingSource, and BindingNavigator, work natively with this type.  That doesn't mean another option like LINQ to Entities couldn't be used, though.  In this walkthrough we'll stick with the dataset.


If you are following along the project will now look like this:


We started this whole process by using the "Local Database Cache" template.  We now have a local SQL Server Compact Edition database in our project called Northwind.sdf that will hold the local cache of our data.  We also have the NorthwindCache.sync file that provides the syncing ability, and lastly we have a strong-typed dataset containing our Customers, Employees, and Shippers tables.

So we can see the sync in action, drag the Customers table from the Data Sources window onto the default form provided, and add a button to the BindingNavigator that will trigger the sync.  The end result should look something like this:


To get the code we need for the click event, double-click the NorthwindCache.sync file and press the link in the bottom right corner of the form called "Show Code Example...".  This form will display:


Press the "Copy Code to the Clipboard" button, then close this form and the Configure Data Synchronization form.  Paste the code into the click event of the sync button on the form.  The code is fairly simple: it creates an instance of our sync agent and then calls Synchronize.  You'll notice there is a TODO comment in the pasted code.  We need to add a line of code that merges the changes into our instance of the NorthwindDataSet.  Add this line to the sync click event.
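For reference, the pasted code plus the merge line looks roughly like the sketch below.  The exact member names (syncButton, northwindDataSet, customersTableAdapter) depend on how your form was generated, so treat them as assumptions.

```csharp
private void syncButton_Click(object sender, System.EventArgs e)
{
    // Designer-generated portion: create the sync agent and run the sync.
    NorthwindCacheSyncAgent syncAgent = new NorthwindCacheSyncAgent();
    Microsoft.Synchronization.Data.SyncStatistics syncStats = syncAgent.Synchronize();

    // The line we add for the TODO: pull the freshly synced rows from the
    // local database back into the dataset the grid is bound to.
    this.northwindDataSet.Customers.Merge(this.customersTableAdapter.GetData());
}
```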


If you are following along, we can now run our application.  When it launches, pressing the "Sync" button at the top of the form gives us a quick look at the options provided by the SyncStatistics object.  Pulling the syncStats object into a quick watch window should look like this:


Remember, there is nothing to sync at this point since a sync was already performed when we added the NorthwindCache.sync file to our project.  It is also important to note that at this point we only have one-way sync capability.  In other words, if we change the data locally in the database, it will never make it back to the server.  If you are looking for a way to get data locally and cache it in a client/server model, this is as far as you need to go with Sync Services.  At this point local data will get synced with the latest changes from the server.  We can easily add a timer to our application and have it sync every hour or based on network connectivity.

Bi-directional Sync With Sync Services

Although bi-directional syncing isn't an option we can turn on in the sync designer yet, we can enable it with one line of code.  In your project, right-click the NorthwindCache.sync file and click View Code.  This creates a new file containing a partial class that we can use to extend the sync agent with our own logic.  In the OnInitialized method we are going to add the following line to enable bi-directional syncing on the Customers table.

namespace SqlCESyncServices {
    public partial class NorthwindCacheSyncAgent {
        partial void OnInitialized() {
            // Send local changes to the server and pull server changes down
            this.Customers.SyncDirection = Microsoft.Synchronization.Data.SyncDirection.Bidirectional;
        }
    }
}

With this one line, any changes we make to the local data will now be sent back up to the server, and any changes from the server will be sent back down to the client.  It is at this point, when changes are allowed to be made both locally and on the server, that we have to handle conflicts.

Handling Conflicts with Sync Services

To handle our conflicts we are going to continue to flesh out the file above.  Remember, this code isn't affected by the designer, so we can customize it to suit the business logic needs of our application.  The first thing we are going to do is add a new partial class to the code-behind file for our sync agent.  For clarification, we are going to be adding this code into this file.


If this file is missing, simply right-click NorthwindCache.sync and click "View Code".  The file will be generated.

We are going to take advantage of two features of C#.  One is partial classes, which we use to handle our conflicts; the other is a newer feature called partial methods.  The reason the partial method is needed is that our NorthwindCache.Designer.cs file already has a parameterless constructor in the object we need to extend.  To work around this, the OnInitialized() method is marked as partial in the class, which allows us to wire up our own events.

In the NorthwindCache.cs code-behind file, add a new partial class called NorthwindCacheServerSyncProvider.  In this class we implement the OnInitialized() partial method so we can wire up our events to handle conflicts (and various other things).  The main event we care about for conflict resolution is ApplyChangeFailed.  Our class stub will look something like this starting out.

    public partial class NorthwindCacheServerSyncProvider {
        partial void OnInitialized() {
            // Subscribe to conflict notifications raised during a sync
            this.ApplyChangeFailed += new System.EventHandler<Microsoft.Synchronization.Data.ApplyChangeFailedEventArgs>(NorthwindCacheServerSyncProvider_ApplyChangeFailed);
        }

        void NorthwindCacheServerSyncProvider_ApplyChangeFailed(object sender, Microsoft.Synchronization.Data.ApplyChangeFailedEventArgs e) {
            // handle conflicts here
        }
    }

In the ApplyChangeFailed event we have a lot of options.  We have both the client changes and the server changes available to us, so we can do some really fancy merging of data, along with different actions we can take based on the conflict.  We can choose to continue, retry, or retry and force write, as seen in the below screen shot.


One example of how this might be handled is to create a screen that displays both sets of data so the end-user can choose which one they prefer, or merge the data from both records.  Obviously this will require a lot of thought and time to create, but here is a sample that shows the Region on the client and the server are different and caused a conflict.


Here is the code that was added into the ApplyChangeFailed event. 

    public partial class NorthwindCacheServerSyncProvider {
        partial void OnInitialized() {
            this.ApplyChangeFailed += new System.EventHandler<Microsoft.Synchronization.Data.ApplyChangeFailedEventArgs>(NorthwindCacheServerSyncProvider_ApplyChangeFailed);
        }

        void NorthwindCacheServerSyncProvider_ApplyChangeFailed(object sender, Microsoft.Synchronization.Data.ApplyChangeFailedEventArgs e) {
            // Handle conflicts here: grab both versions of the row
            DataTable clientChanges = e.Conflict.ClientChange;
            DataTable serverChanges = e.Conflict.ServerChange;

            // Show both versions so the end-user can decide which wins
            CustomerConflictResolution frmConflict = new CustomerConflictResolution();
            frmConflict.clientBindingSource.DataSource = clientChanges;
            frmConflict.serverBindingSource.DataSource = serverChanges;
            frmConflict.ShowDialog();
        }
    }


One would need to keep going in this direction and resolve conflicts based on the data being resolved and the business logic surrounding it.  It may be that the last record edited wins, or you may need to merge data from one record into another.
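If a full conflict-resolution screen is more than the application needs, a simpler policy can be coded directly in the event handler.  The sketch below assumes a "client wins" rule for update/update conflicts; ApplyAction and ConflictType come from Microsoft.Synchronization.Data.

```csharp
void NorthwindCacheServerSyncProvider_ApplyChangeFailed(object sender,
    Microsoft.Synchronization.Data.ApplyChangeFailedEventArgs e)
{
    if (e.Conflict.ConflictType ==
        Microsoft.Synchronization.Data.ConflictType.ClientUpdateServerUpdate)
    {
        // Both sides edited the same row; force the client's version through
        // ("last writer wins" from the client's point of view).
        e.Action = Microsoft.Synchronization.Data.ApplyAction.RetryWithForceWrite;
    }
    else
    {
        // Skip this change and keep syncing; it can be surfaced to the
        // user or logged for manual resolution later.
        e.Action = Microsoft.Synchronization.Data.ApplyAction.Continue;
    }
}
```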

As we can see in this walkthrough, Sync Services with SQL Server Compact Edition 3.5 adds a lot of new and much-needed functionality to Visual Studio 2008.  I suspect that, due to the ease with which data can be cached locally and synced, a lot of developers will embrace this ability first.  If you are writing Smart Clients, do investigate the ability of Sync Services to run under WCF.  It is supported and can be accomplished by moving the sync agent to another project (which is why I mentioned WCF in the Advanced screen above).  Happy syncing!


What Acropolis Is and Isn’t

Posted by Keith Elder | Posted in .Net, Programming, Smart Clients | Posted on 08-06-2007


Ayende wrote a post calling Acropolis another executable XML language.  Then a few other people chimed in with comments about Acropolis being another example of Microsoft providing tools to turn bad developers into mediocre developers.  I think the point of Acropolis has been totally lost in this conversation, so please allow me to weigh in.

To start with, WPF is already expressed in XML.  This has been known for a while, and we've all seen amazing results from expressing the UI declaratively.  Look at all the eye candy WPF and Silverlight have dazzled us with over the past several months.  Acropolis is simply leveraging the new WPF stack, so to call it an executable XML language is a little far-fetched.  It is true that the XAML generated to display WPF applications is in XML format, but it isn't a language; it is merely parsed.  Calling Acropolis an executable XML language is like calling a component or control that ships out of the box with the framework a language, because that is what Acropolis is: additional controls that will ship to enhance WPF, which in return will help us composite our client applications better.

Acropolis isn't a language but merely an extension of controls and patterns for WPF, similar to the Smart Client Software Factory and CAB, built leveraging the richness of Windows Presentation Foundation.  That is the Forrest Gump definition of how I would explain it.

We've already seen and tried to solve a lot of the problems developers face in building rich client applications with SCSF and CAB.  Acropolis is no different in what it is trying to solve, just in how it is put together.  Meaning, Acropolis is built on the WPF stack rather than on object-oriented design patterns.  Under the hood there are design patterns going on, I am sure, but they are abstracted into controls.

To give you an analogy, here is how I think about it.  To me it is no different than Asp.Net 2.0 shipping the Login controls.  That is an example of a common problem web developers face and an abstract way of dealing with it.  Acropolis to me is no different, in that there are inherent things we as client developers have to do each and every time we start a client application.  Acropolis will hopefully help us solve these problems, but it isn't a new XML language.  It also has nothing to do with making bad developers mediocre developers, as one commenter suggested, just as the Login control bundled with Asp.Net 2.0 didn't make bad developers mediocre developers.

The point of Acropolis is to take things that are "common", things client developers have to do every time, and abstract the repetitiveness of building composite applications into something that can be reused in the framework.  As Brad Abrams pointed out in his comment, there is still separation of code and business logic.

I saw a lengthy talk on Acropolis at TechEd given by Kathy Kam, and mostly what I saw was a set of new controls that will assist client developers in building out the plumbing of smart client applications faster.  It is still new, but the direction it is going will, in my opinion, solve what it is trying to solve if done correctly.
