
Datasets vs Business Entities

Posted by Keith Elder | Posted in .Net, Smart Clients, Web Services | Posted on 26-10-2007


If you are an experienced .Net developer, more than likely you've come to a crossroads of sorts over which object model to go with.  Do you use strong-typed Datasets, write your own business entities from scratch, or generate business entities using an ORM?  A reader emailed me the other day asking my opinion, and Joel Ross raised the very same question on Twitter yesterday.  Here are my words of wisdom on the subject to hopefully help you arrive at a conclusion about which road you want to travel.

Things to Think About

What are you going to do with the data?  This is the most important question to ask yourself.  The reason is that Datasets and business entities solve two different problems, and where one makes life easy in one scenario, it overly complicates the heck out of another.  Let's take a simple scenario any Smart Client developer may run into.  Say you are instructed to build a search screen within your application and bind the results to a DataGridView.  The user interface should allow an end-user to search and return joined row data from the data store.  After the data arrives at the client, the end-user needs the ability to filter, group, and sort the data within the client.  What do you do?  Here is a rough draft of what we need.

[Mock-up of the customer search screen with a results grid]

In this case my default answer is a plain Dataset.  There are a few things to think about in this scenario.  The first is how often the screen data might change.  In the example above it is returning CustomerId, CompanyName, ContactName, and ContactTitle to the end-user.  The question is how we handle a simple change if the business requirement changes a month from now and we need to add a new Email column to the result set.  Let's look at the three options we could go with to tackle this scenario.  It helps to visualize it on paper.

|                                                  | Non-Typed Dataset | Typed Dataset | Business Entity |
|--------------------------------------------------|:-----------------:|:-------------:|:---------------:|
| Fastest Development Time                         |                   | X             |                 |
| Requires Custom Programming for Filter and Sort  |                   |               | X               |
| Requires Re-deploying Client to Add New Column   |                   | X             | X               |
| Requires Re-deploying Service to Add New Column  | X                 | X             | X               |
| Heaviest Payload                                 | X                 |               |                 |

Looking at the table we see the non-typed Dataset has the fewest checks (one less than the other two).  While it isn't the fastest to develop, because we do not get all of the automatic pre-built bindings, it is still pretty fast, and much faster than the custom business entity.  Even then we still don't have to write our own sorting and filtering routines, nor do we have to redeploy our client.  Having to redeploy is the biggest cost to a business and should be taken very seriously when developing Smart Clients for the enterprise.  Downtime is not cool in any business, even if it only takes a minute for a ClickOnce app to be redeployed.  In this scenario all we'd have to do is change the way we fill our Dataset within our middle tier (services layer) and then send it down the wire.  This change could be made pretty much whenever we want without interrupting the business.  Notice we get the flexibility of being able to change our business requirements on the fly, so to speak, but we are using the heaviest payload to send our data over the wire to our client.  If you aren't familiar with why strong-typed Datasets can have smaller payloads over the wire via web services, read this tip on strong-typed Datasets and web services.
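To make that concrete, here is a rough sketch of the pattern (the table, query, and connection details are illustrative, not from a real project): the middle tier fills and returns a plain Dataset, and the client binds it through a DataView to get filtering and sorting for free.  Adding an Email column later only means touching the SELECT in the service.

using System.Data;
using System.Data.SqlClient;
using System.Windows.Forms;

// Middle tier (services layer): fill and return a plain DataSet.
// Adding the Email column later only means touching this SELECT.
public class CustomerService
{
    public DataSet SearchCustomers(string companyName)
    {
        const string sql =
            @"SELECT CustomerId, CompanyName, ContactName, ContactTitle
              FROM Customers
              WHERE CompanyName LIKE @name + '%'";

        using (SqlConnection conn = new SqlConnection("<connection string>"))
        using (SqlDataAdapter adapter = new SqlDataAdapter(sql, conn))
        {
            adapter.SelectCommand.Parameters.AddWithValue("@name", companyName);
            DataSet results = new DataSet();
            adapter.Fill(results, "SearchResults");
            return results;
        }
    }
}

// Client: bind through a DataView to get filtering and sorting for free.
public static class SearchScreen
{
    public static void BindResults(DataGridView grid, DataSet results)
    {
        DataView view = results.Tables["SearchResults"].DefaultView;
        view.Sort = "CompanyName ASC";             // sorting without custom code
        view.RowFilter = "ContactTitle = 'Owner'"; // filtering without custom code
        grid.DataSource = view;                    // new columns show up automatically
    }
}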

Is the Data Really an Entity?

The above example didn't fare very well for using an entity.  Why?  I think it has to do with the fact that the problem we were trying to solve didn't map well to an entity.  I even question whether the result of a search in this scenario is an entity.  I argue that it isn't; it is a result set based on an action, not a true business entity.  If we think of the data above as a Customer entity, we would have a lot more properties within the Customer entity: addresses, contact information, maybe orders, and so on.  In our scenario we didn't need any of that data to be filled.  This is where a lot of the ORM mappers which help developers build entities fall short: only a few properties need to be loaded, yet we have to pay the "entity tax", as I call it, loading all the data within the entity just to get to a few fields.

What if we created a brand new entity with just the four fields we needed to display?  While we could create a plain old collection of C# objects that have only the fields we need, we are still back to the problems of filtering, sorting, grouping, and deployment.

In this scenario:  Dataset 1  Entity 0

Another Scenario

To take our example a little further, what would we do if the end-user were able to double-click one of the rows returned from the search?  The end-user would then be presented with a new screen where they could edit the customer record and see other data like address information, contact information, and so on.  What would we do in this case?

In this scenario we are truly dealing with an entity.  We are dealing with a Customer entity, and it makes perfect sense to handle all of our bindings directly against a business entity.  Sure, we have to bake in all of the OnChange events, validation, and so on, but the end result is a very flexible way of dealing with our Customer.  Maintaining customer information in a Dataset in this scenario is slow, carries lots of overhead, and isn't nearly as clean as a plain old C# object (or entity, however you think of it).  We can wrap our Customer entity with policy injection and validation much more cleanly than we can trying to represent a Customer in a DataSet, no matter how we look at it.
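For illustration, here is a minimal sketch of what "baking in the OnChange events" might look like (the property names and the validation rule are made up): a Customer entity implementing INotifyPropertyChanged so WinForms bindings pick up edits.

using System;
using System.ComponentModel;

// A minimal Customer entity wired for two-way data binding.
public class Customer : INotifyPropertyChanged
{
    private string _companyName;
    private string _contactName;

    public event PropertyChangedEventHandler PropertyChanged;

    public string CompanyName
    {
        get { return _companyName; }
        set
        {
            // Simple validation hook; real rules would be richer.
            if (string.IsNullOrEmpty(value))
                throw new ArgumentException("Company name is required.");
            _companyName = value;
            OnPropertyChanged("CompanyName");
        }
    }

    public string ContactName
    {
        get { return _contactName; }
        set { _contactName = value; OnPropertyChanged("ContactName"); }
    }

    protected virtual void OnPropertyChanged(string propertyName)
    {
        PropertyChangedEventHandler handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(propertyName));
    }
}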

In this scenario:  Dataset 1  Entity 1

Deadlines and Size of Application

When it comes to making a decision in your applications, use common sense about what you are developing.  Honestly, if I'm building a one or two page web form for internal use within our company, I'm cutting corners to get the thing done.  I want it to work and make the client happy, but then move on.  Datasets here I come!  On larger applications, though, that isn't the case.  If you are building a commercial web site or a large-scale enterprise app, the code base will be taken more seriously.  There will be plenty of time to identify entities and put in the proper plumbing such as filtering, sorting, grouping, change tracking, and more.  You may also take the time to explore one of the many entity frameworks available for .Net to give yourself a jump start.

Conclusion

No matter how many times we argue the benefits and merits of one versus the other, I think the best approach a developer can take in the Datasets vs. entities argument is a holistic look at the problem he or she is trying to solve.  Don't forget to take into account future changes and how much maintenance may be required.  Use common sense and match the best solution to the problem.  I hope this helps and is now officially clear as mud.

 


Good Architecture Design Creates More Code and Takes More Time

Posted by Keith Elder | Posted in Programming, Web Services | Posted on 08-08-2007


I came to a realization this evening while working on a new release of our internal CRM application.  We are in the middle of redesigning our CRM from the ground up with new data structures and tons of new features.  That may sound like a drastic measure, but the functionality and scope have changed over the years, which is why we are redesigning it.  As part of the effort we are building our middle tier with Windows Communication Foundation, leveraging the Web Services Software Factory for WCF.

WCF brings a lot to the table for us, such as duplex messaging, binary messages over TCP, and so on.  I'm a big fan of multi-tier design and good architecture: the typical UI -> Middle Tier -> Database approach.  Anyone that has built an application using a three-tiered design knows the majority of the work is in the middle tier; about 70% of your effort is focused in this area.

For our redesign, in the middle tier we are following good architectural guidance by creating entities, separating business logic, separating data access, and so on (WSSF really helps with this).   At times, though, I wish I could just drag and drop a database table onto a WinForm, create a strong-typed dataset, bind it to a DataGridView control, and totally forget the middle tier.  It sure is faster to code everything in the UI and hit the database directly, no doubt about it.  In the end, though, there are tons of drawbacks, such as deployment, centralizing business rules, etc.  This is the first case in point that good architecture creates more code and ultimately takes more time.

This evening I was playing around with a test WinForm app for a prototype screen and realized that since we are returning entities from the services layer, we can't just bind the returned collection of entities to a DataGridView control and get all the sorting and filtering goodness we get with a Dataset.  In the end I had to write more code to achieve this; a sketch of the idea follows.  Sure, it only turned out to be about 500-750 lines of code that will get re-used over and over again, but the fact remains that I had to write more code because I followed a good design practice.  This is case in point number two.
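For the curious, most of that code amounts to a BindingList<T> subclass that implements the sorting hooks the DataGridView looks for.  Here is a stripped-down sketch of the idea (nowhere near a production version, which would also handle filtering, searching, and non-comparable types):

using System.Collections.Generic;
using System.ComponentModel;

// A bare-bones sortable BindingList<T> for entity collections bound to a
// DataGridView. Works with the default constructor, which backs the
// collection with a List<T>.
public class SortableBindingList<T> : BindingList<T>
{
    private bool _isSorted;
    private PropertyDescriptor _sortProperty;
    private ListSortDirection _sortDirection;

    protected override bool SupportsSortingCore { get { return true; } }
    protected override bool IsSortedCore { get { return _isSorted; } }
    protected override PropertyDescriptor SortPropertyCore { get { return _sortProperty; } }
    protected override ListSortDirection SortDirectionCore { get { return _sortDirection; } }

    protected override void ApplySortCore(PropertyDescriptor prop, ListSortDirection direction)
    {
        // The default BindingList<T> constructor uses a List<T> internally.
        List<T> items = (List<T>)Items;

        items.Sort(delegate(T x, T y)
        {
            // Assumes the property values implement IComparable.
            object a = prop.GetValue(x);
            object b = prop.GetValue(y);
            int result = Comparer<object>.Default.Compare(a, b);
            return direction == ListSortDirection.Ascending ? result : -result;
        });

        _sortProperty = prop;
        _sortDirection = direction;
        _isSorted = true;
        OnListChanged(new ListChangedEventArgs(ListChangedType.Reset, -1));
    }

    protected override void RemoveSortCore()
    {
        _isSorted = false;
    }
}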

For those that leverage consultants, this is the difference in the quotes you receive from various firms.  Or, for those that ask a developer how long something will take, it is why he or she says two months when you were hoping to hear two weeks.  Sure, you can find someone to get it done faster, but did they get it done correctly?  Will the system be easy to maintain?  Scale?  Be easy to extend in the future?  These are the things developers come to realize over time when building systems, and they are hard to justify sometimes to the business or to clients (if you are a consultant).  It may take a developer longer to follow a good three-tiered approach, but the business gets a lot of benefit down the road by going with a better architecture.

Cannot Invoke Workflows From Workflows Hosted in Asp.Net with Manual Scheduler

Posted by Keith Elder | Posted in Asp.Net, Web Services, Workflow Foundation | Posted on 30-07-2007


As you can see, I've included my famous rusty washer on this post to denote a problem.  I was having a conversation with a team member about Workflow Foundation who was having a problem: "Hey Keith, have you ever had workflows in Workflow Foundation invoke other workflows?"  Answer: I have not, but let's talk about it.  What's the problem?

My team member started explaining that they have a need for workflows to call other workflows.  Sounds simple enough, since there is a pre-packaged activity in the WF toolbox (InvokeWorkflow) to do just this.

He further elaborated that when they invoke the workflows, the workflow runtime dies.  At first I was perplexed, but then, knowing they were hosting it in an Asp.Net web service, I asked which scheduler they were running.  He replied ManualWorkflowSchedulerService.  Ah ha!

The problem is a catch-22, and honestly I don't have a solution for it.  When hosting a workflow with the ManualWorkflowSchedulerService, it cannot invoke other workflows.  I guess that makes sense since it cannot spawn a new thread; after all, it is trying to process a new instance of a workflow on the same thread (at least that is my theory).  Is there a workaround?  I don't know.  Is it a problem?  Yeah!

To compound matters, the only way to reliably host Workflow Foundation in an Asp.Net service is with the ManualWorkflowSchedulerService.  Well, ok, it isn't the only way, but there are problems using the default scheduler in Asp.Net.  Here is Paul Andrew's post about a fix coming in WF where he explains the differences:

The DefaultWorkflowSchedulerService is the out-of-box Windows Workflow Foundation scheduler used by the runtime to execute workflow instances on new CPU threads.  These threads are leveraged from the .NET thread pool associated with the host application. This works well unless the workflow runtime is hosted in ASP.NET.  Because of the limited thread resource in the IIS server, we would be allocating an additional unnecessary thread to execute the workflow instance for every HTTP request that required workflow execution. The ManualWorkflowSchedulerService was developed to address this issue by allowing the thread that is processing the HTTP request to execute a workflow instance. Unfortunately in WF Beta 2.2 and prior, it didn't handle the correct processing of timer events (i.e. delay activities).

When doing Asp.Net Web Services, the only reliable way to return output parameters, catch exceptions, log, handle faults, etc. is to run the workflow synchronously on the same thread using the ManualWorkflowSchedulerService.  I've tried every which way I knew to get the default scheduler to work but couldn't.  If you think about it, it makes sense, since web services receive a message and then typically return a message.  With the default scheduler it is hard to bubble up exceptions properly so they can be logged, or caught and handled gracefully, in a service.
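To illustrate the request/response fit (the workflow type, its parameter names, and the service itself are hypothetical, not from our code base), a web method can run a workflow to completion on the request thread and hand back output parameters, or surface the exception:

using System;
using System.Collections.Generic;
using System.Web.Services;
using System.Workflow.Runtime;
using System.Workflow.Runtime.Hosting;

public class OrderService : WebService
{
    // One runtime per AppDomain with the manual scheduler registered.
    private static readonly WorkflowRuntime Runtime = CreateRuntime();

    private static WorkflowRuntime CreateRuntime()
    {
        WorkflowRuntime runtime = new WorkflowRuntime();
        runtime.AddService(new ManualWorkflowSchedulerService(false));
        runtime.StartRuntime();
        return runtime;
    }

    [WebMethod]
    public string ProcessOrder(string orderId)
    {
        string result = null;
        Exception failure = null;

        // OrderWorkflow, its OrderId in parameter, and its Result out
        // parameter are hypothetical.
        Dictionary<string, object> inputs = new Dictionary<string, object>();
        inputs.Add("OrderId", orderId);
        WorkflowInstance instance = Runtime.CreateWorkflow(typeof(OrderWorkflow), inputs);
        Guid id = instance.InstanceId;

        EventHandler<WorkflowCompletedEventArgs> completed = delegate(object s, WorkflowCompletedEventArgs e)
        {
            if (e.WorkflowInstance.InstanceId == id)
                result = (string)e.OutputParameters["Result"];
        };
        EventHandler<WorkflowTerminatedEventArgs> terminated = delegate(object s, WorkflowTerminatedEventArgs e)
        {
            if (e.WorkflowInstance.InstanceId == id)
                failure = e.Exception;
        };

        Runtime.WorkflowCompleted += completed;
        Runtime.WorkflowTerminated += terminated;
        try
        {
            instance.Start();
            // Executes the workflow synchronously on this request thread;
            // the completed/terminated handlers fire before this returns.
            Runtime.GetService<ManualWorkflowSchedulerService>().RunWorkflow(id);
        }
        finally
        {
            Runtime.WorkflowCompleted -= completed;
            Runtime.WorkflowTerminated -= terminated;
        }

        if (failure != null)
            throw failure; // bubble up so it can be logged or mapped to a fault

        return result;
    }
}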

To test this theory I created a console workflow app with two workflows, one calling the other, like this.

[Workflow designer view: Workflow1 invoking Workflow2 via an InvokeWorkflow activity]

I then wired up the ManualWorkflowSchedulerService in Program.cs like this and ran it.  It doesn't run.  If I comment out the manual scheduler and use the default one, it works fine.

static void Main(string[] args)
{
    try
    {
        using (WorkflowRuntime workflowRuntime = new WorkflowRuntime())
        {
            // Register the manual scheduler so workflows run on the caller's thread
            ManualWorkflowSchedulerService service = new ManualWorkflowSchedulerService(false);
            workflowRuntime.AddService(service);

            AutoResetEvent waitHandle = new AutoResetEvent(false);
            workflowRuntime.WorkflowCompleted += delegate(object sender, WorkflowCompletedEventArgs e) { waitHandle.Set(); };
            workflowRuntime.WorkflowTerminated += delegate(object sender, WorkflowTerminatedEventArgs e)
            {
                Console.WriteLine(e.Exception.Message);
                waitHandle.Set();
            };

            WorkflowInstance instance = workflowRuntime.CreateWorkflow(typeof(WorkflowConsoleApplication1.Workflow1));
            ManualWorkflowSchedulerService scheduler = workflowRuntime.GetService<ManualWorkflowSchedulerService>();
            instance.Start();

            // Run the instance synchronously on this thread
            scheduler.RunWorkflow(instance.InstanceId);
            waitHandle.WaitOne();
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
        Console.ReadLine();
    }
}

 

So the catch-22 is that I need to use the manual scheduler with Asp.Net Web Services, but I can't invoke other workflows if I do.  If I use the default scheduler, I can't scale my service.  The "solution" we are going with right now as a workaround is to host the service in WCF as a Windows Service outside of IIS.  The bad part of this is that you can't really scale that either.  The nice thing about using IIS is that it really is an app server; clustering and those sorts of things are widely available.  If you build your own Windows Service, you lose all of that baked-in functionality.  I think this is a HUGE rusty washer.  Anyone faced this challenge?  How did you solve it?  Am I off my rocker on this one?  Feedback would be appreciated.

UPDATE (8.28.2007)

One of the developers at work (Joe Brach) solved this problem when he ran into the same issue.  Since several people have been emailing me asking if I found a fix, I thought I'd post his example.  The fix is to create your own class that extends WorkflowRuntimeService.  Here is Joe's fix:

public class StartWorkFlowService : WorkflowRuntimeService
{
    public Guid StartWorkflow(Type workflowType, Dictionary<string, object> inparms)
    {
        // Get the runtime this service is registered with
        WorkflowRuntime wr = this.Runtime;

        // Create the workflow instance with its input parameters
        WorkflowInstance wi = wr.CreateWorkflow(workflowType, inparms);

        // Start the workflow
        wi.Start();

        // If the manual scheduler is registered, run the instance on this thread
        ManualWorkflowSchedulerService ss = wr.GetService<ManualWorkflowSchedulerService>();
        if (ss != null)
        {
            ss.RunWorkflow(wi.InstanceId);
        }

        return wi.InstanceId;
    }
}

TechEd Day 2

Posted by Keith Elder | Posted in .Net, Asp.Net, Smart Clients, Web Services | Posted on 07-06-2007


It is day two of TechEd and my feet and legs are killing me.  My feet didn't get a chance to rest after the four-day whirlwind tour of Disney World.  I told my wife that I was going to have to go to TechEd just so I could rest 🙂  Here are day two's activities.

Biztalk WCF Adapter

The first session I attended this morning was an overview of the WCF adapter for Biztalk, which is in the R2 release.  Don't get your hopes up just yet, because it isn't available; I was told it wouldn't be out until the 3rd quarter.  The good thing is the WCF adapters bring a lot of cool things to the table.  I can't remember all the reasons they gave, but the one a lot of people will use, I am sure, is the ability to send TCP binary messages to Biztalk.  There are actually seven WCF adapters, one for each binding type.  The demo they showed was a message from a smart client being sent to Biztalk, then Biztalk hitting SQL Server and then another service.  The orchestration had to route the message and handle the transaction, so if the write to SQL Server failed or the other service failed, it rolled back the data in SQL Server.  Definitely a real-world example, and it shows the ability of WCF to handle transactions.

Wandering Around the Expo

After the Biztalk session I was walking from the north building back to the south building and ran into Bruce Thomas at the MVP booth.  I also saw Joe Healy, Jeff Palermo, DonXML, Miguel Castro, and Joe Fuentes.  I then walked down and caught some of the "Speaker Idol" contest that Carl Franklin and Richard Campbell were hosting.  This was pretty cool, and I wish I had heard about it sooner so I could have entered.  The way it works is a speaker gets up on stage and does a five-minute presentation, then gets judged by a panel of Regional Directors.  It is a really great way to get feedback on your presentation skills, no doubt.  The big challenge is doing just a five-minute talk!

Smart Client Applications in Visual Studio 2008

After lunch I went to a presentation on the new smart client features in Visual Studio.  Some of these are nothing new in terms of information (LINQ and WPF), but they are technically "new" in 2008, and some are new, exciting features.

Working With Data

LINQ is of course the new way to work with data in VS2008.  The demos they showed were just connecting to a database locally, which isn't a true Smart Client architecture in my opinion since it isn't services based.  Nevertheless, LINQ is new in VS2008 and we'll all love him, hug him, and call him George.

Taking Data Offline

This is something that I hadn't seen yet, and it really piqued my interest since we have a lot of uses for it.  SQL Server Compact Edition 3.5, which is in beta right now, will be available to us.  You can read about the new features here.  The nice thing about 3.5 is it can be deployed with your application through ClickOnce and it doesn't run as a service.  It also supports about 2-4 GB of data, which should be PLENTY for any application that needs to run offline.  The other piece is the Sync Agent, which is tasked with the joy of keeping data in sync.  As soon as they said this I immediately thought "Smart Clients shouldn't connect to the database directly", and as soon as I thought that, the presenter said it supports sync via services.  Hotness!

User Profiles

Something else that is new is Client Application Services.  Today web applications can store user profile information, which is used for themes, preferences, and so on, but this is tough for Smart Clients that are service-enabled.  Client Application Services lets us reuse the existing profile and role-based mechanisms Asp.Net offers today to centrally store profile information.  This may not sound like a big deal, but today we have to create a lot of plumbing in Smart Clients to store application preferences and profile information away from the client's desktop machine.  Storing preferences on the desktop machine doesn't let users move from machine to machine and have the application set up the same way.  Client Application Services fixes this by leveraging existing functionality, so this will be good.  There will be a new Services tab when you right-click on Properties on your project.  In the Services tab you point to a web server which will hold the profile settings, and any settings you create in your app are stored and retrieved from there.

User Experience

Another new feature in VS2008 is WPF.  They call this the "user experience", but a lot of us already know about WPF.  This will change how we build Smart Clients, no doubt, but we've been hearing about WPF since 2005, so nothing to see here.  Moving on.

Deployment

In VS2005 we got a new deployment technology called ClickOnce.  In VS2008 ClickOnce doesn't go away, it just gets enhanced.  There are six new enhancements, but the one I like best is the ability to change the deployment URL without having to rebuild the entire set of manifests.  I didn't write them all down, but there was also something mentioned about ISV branding in ClickOnce in VS2008.

Reusability – Acropolis

This was the new bombshell that was dropped: "Acropolis".  Acropolis is a framework that simplifies building composite clients and will replace the Smart Client Software Factory moving forward.  Bryan Adams just blogged about this on June 4th, so read his initial post if this is your first time hearing about it (just remember you heard it here first 🙂 ).  After you read that, read this one with more questions and this one with additional information including video and live docs.

Certification Study Hall

After the what's-new-in-Visual-Studio-2008 session I went to the certification center and started practicing certification exams.  At TechEd you can take certification exams onsite for $50, as well as go through tons of test questions.  I studied until I just about fell asleep and then rolled to the hotel.  Day two is now in the books.

Web Service Software Factory and XML Data Columns

Posted by Keith Elder | Posted in Web Services | Posted on 30-05-2007


I found a rusty washer today while using the Web Service Software Factory to generate stored procedures for the data access tier of a new service.  Apparently the software factory cannot handle XML data columns in a database.  Definitely disappointing.  Here is what it does in the guidance.

Basically it flags the table with "The table cannot be selected because it contains column data types that are not supported."  My fix was to change the column to varchar(max), generate what I needed, and then change the generated code back.  But why?  We have a data type of XmlDocument!  It seems pretty easy to just set the property of the generated code to XmlDocument by default.  I don't get it.
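For what it's worth, the hand-edit to the generated code boiled down to something like this (the column and method names are made up for illustration): read the column back as the string the factory generated, then surface it as an XmlDocument.

using System.Data.SqlClient;
using System.Xml;

public static class SettingsColumnReader
{
    // Hypothetical hand-edit to generated data access code: the column was
    // generated as a string (varchar(max)), but callers want an XmlDocument.
    public static XmlDocument ReadSettingsColumn(SqlDataReader reader)
    {
        XmlDocument doc = new XmlDocument();
        doc.LoadXml(reader.GetString(reader.GetOrdinal("Settings")));
        return doc;
    }
}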