February 2005 Archives

As it is, there is only one way to signal "success"--to return rows affected = 1. This can only be done via a single-row action command. Having another mechanism would permit more sophisticated update logic while still providing a way to signal success.
It would also give your code the ability to indicate that some other condition caused the update to fail. There are lots of reasons other than concurrency that can make an update fail, and it would be nice to be able to differentiate between these issues--and to pass back an exception code so the client can deal intelligently with the problem.
Maybe I didn't catch the meaning of your post, but --

"how to trick ADO.NET into thinking the update/insert/delete succeeded"

Why would you wanna do that anyway?

Concurrency and ADO.NET

| 4 Comments | No TrackBacks

I was answering a newsgroup question this evening and it reminded me of a revelation I had a while ago but forgot to mention.

One of the problems developers have to worry about when using stored procedures to perform complex updates (where several tables may (or may not) be updated) is figuring out how to trick ADO.NET into thinking the update/insert/delete succeeded.  What if developers had the ability to change what ADO.NET expects an action command to return? As it is, ADO.NET expects rows affected = 1 and nothing else. What if the developer could choose a Return value of True (1), an OUTPUT parameter or some other mechanism to signal that the action succeeded? That way their own stored procedure logic could determine whether the action triggered a collision or some other problem that should not be treated as a concurrency failure.

Since ADO.NET is not getting any better at dealing with concurrency, it’s going to force more developers to use their own home-grown stored procedures to carry out these changes. It would be nice if they had more control over the interface.
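For what it's worth, the closest thing we have today is the DataAdapter's RowUpdated event, which lets you override the concurrency verdict after the fact. Here's a rough sketch of the idea--the connection string, table, stored procedure and its @Succeeded OUTPUT parameter are all hypothetical stand-ins, not a recipe:

using System;
using System.Data;
using System.Data.SqlClient;

class RowUpdatedSketch
{
    static void Main()
    {
        // Hypothetical connection string, table and stored procedure names.
        SqlConnection cn = new SqlConnection(
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI");
        SqlDataAdapter da = new SqlDataAdapter(
            "SELECT CustomerID, Name FROM Customers", cn);

        // The UpdateCommand calls a stored procedure that performs the
        // complex (possibly multi-table) update and reports success
        // through an OUTPUT parameter instead of a rows-affected count.
        SqlCommand upd = new SqlCommand("UpdateCustomer", cn);
        upd.CommandType = CommandType.StoredProcedure;
        upd.Parameters.Add("@CustomerID", SqlDbType.Int, 0, "CustomerID");
        upd.Parameters.Add("@Name", SqlDbType.VarChar, 40, "Name");
        upd.Parameters.Add("@Succeeded", SqlDbType.Bit).Direction =
            ParameterDirection.Output;
        da.UpdateCommand = upd;
        da.RowUpdated += new SqlRowUpdatedEventHandler(OnRowUpdated);

        DataTable tbl = new DataTable();
        da.Fill(tbl);
        tbl.Rows[0]["Name"] = "A new name";   // make a change to push back
        da.Update(tbl);
    }

    static void OnRowUpdated(object sender, SqlRowUpdatedEventArgs e)
    {
        // ADO.NET treats rows affected = 0 as a concurrency violation.
        // If our own stored procedure logic says the action succeeded,
        // override that verdict so Update does not throw.
        if (e.RecordsAffected == 0 &&
            true.Equals(e.Command.Parameters["@Succeeded"].Value))
            e.Status = UpdateStatus.Continue;  // no DBConcurrencyException
    }
}

It works, but it's a per-adapter workaround wired up in client code--hardly the first-class "tell ADO.NET what success looks like" mechanism I'm asking for.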

 

Now back to your regularly scheduled program…

 

Bill

 

If I found a guy on my team passing a DataReader to and from the presentation layer, I would take him out to the woodshed for some intensive counseling. IMHO there is nothing better than a properly structured data access object or layer. But given the history of 6M VB programmers moving to .NET, I'm not surprised there are a few with problems. By the way, your own ADO.NET book suggested a solution with DataReaders that was very nicely reasoned out.
I have to say that in many cases in the ASP.NET world, a DataReader really is the right tool for the job, though I will admit that if it isn't used (and more importantly, closed) correctly, it can be a problem.

I do not index DataReaders by field number, any more than I would index a DataSet by field numbers, so generally, adding to or rearranging the column list in a SELECT or stored procedure will have no impact on my code.

I have been very rigorous about ensuring that all code closes the DataReader, and across many sites, many pages and many millions of hits, I have not seen a problem tied to an unclosed DataReader. I have certainly seen other people's code that has that problem, but it has not been my experience. All of my methods returning DataReaders use CommandBehavior.CloseConnection, and I always do the close in a finally block...
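For example, the pattern looks something like this (the connection string, query and names are hypothetical):

using System;
using System.Data;
using System.Data.SqlClient;

public class CustomerData
{
    // Hypothetical connection string and query.
    const string ConnectString =
        "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI";

    // CommandBehavior.CloseConnection ties the connection's lifetime to
    // the reader, so closing the reader releases the connection too.
    public static SqlDataReader GetCustomers()
    {
        SqlConnection cn = new SqlConnection(ConnectString);
        SqlCommand cmd = new SqlCommand(
            "SELECT CustomerID, Name FROM Customers", cn);
        cn.Open();
        return cmd.ExecuteReader(CommandBehavior.CloseConnection);
    }

    static void Main()
    {
        SqlDataReader rdr = GetCustomers();
        try
        {
            while (rdr.Read())
                Console.WriteLine(rdr["Name"]);  // index by name, not ordinal
        }
        finally
        {
            rdr.Close();  // returns the connection to the pool as well
        }
    }
}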

DataReader

| No Comments | No TrackBacks
Amen !!

Two notable things in this letter -

1. A lot of people are still very CPU-performance centric, and even then they forget to consider highly concurrent environments, which are completely different.
2. Bill Vaughn's favorite phrase "you're pooched".

This is a “letter to the editor” I sent to Mike Otey at SQL Server Magazine in reference to his ADO.NET tips from the March issue.


Mike, I thought your ADO.NET suggestions were right on—all good suggestions except for one. I’ve talked with a number of development shops and individual developers that avoid the DataReader because they think it’s too expensive to build and support. Sure, it returns a fast connected data stream, but because you have to handle connections manually (and carefully) and walk through each row and column one at a time, the amount of code needed (and schema-aware code at that) is too expensive to write, test and maintain. If the schema changes, the query changes or the stored procedure you execute changes, you’re pooched. Many of the problems associated with the connection pool can be traced back to connection management issues—often caused by improper handling of DataReaders passed from layer to layer. In an ASP.NET application it’s easy to bind a DataReader to a grid (which closes the DataReader, but does not close the connection if you don’t ask for CommandBehavior.CloseConnection).

We feel that the DataAdapter.Fill method is a far better solution. It’s fast (within a few percent of the DataReader) for reasonably sized rowsets. Yes, it creates a persistent DataTable, so it has a memory footprint, but the added utility of having a sortable, filterable, selectable and updatable DataTable can be a big plus. Using or binding to a DataView (as you suggest) is a great idea, but it can only be done if you use Fill.
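To make the point concrete, here's a minimal sketch of the Fill-and-bind pattern (the connection string, query and column names are hypothetical):

using System;
using System.Data;
using System.Data.SqlClient;

class FillSketch
{
    static void Main()
    {
        // Hypothetical connection string, query and column names.
        SqlDataAdapter da = new SqlDataAdapter(
            "SELECT CustomerID, Name, Region FROM Customers",
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI");

        // Fill opens and closes the connection itself--no manual
        // connection management as with the DataReader.
        DataTable tbl = new DataTable("Customers");
        da.Fill(tbl);

        // The persistent DataTable is sortable, filterable and selectable.
        DataView view = new DataView(tbl);
        view.RowFilter = "Region = 'WA'";
        view.Sort = "Name ASC";

        foreach (DataRowView row in view)
            Console.WriteLine(row["Name"]);
        // In ASP.NET you would bind instead: myGrid.DataSource = view;
    }
}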

Performance has two components: code performance and developer performance. Like any tool, the DataReader is great for specific cases, but it should not be a developer’s first choice.

 

I live in Europe. What is MSDE? ;)
Yes, the postal code rule is a pretty common example, as it’s easy to explain and relate to. But a business rule can be applied to more complex columns such as a part number or other designator that’s encoded to permit grouping of like parts. Since the part number is used in many places, having a single place to update the rule is very handy and far safer. If we missed changing a constraint, we’re pooched. We’ve also seen rules applied to choice columns that don’t justify the overhead of a PK/FK. For example, a marriage status can be “Single”, “Divorced”, “Separated”, “Depressed”, “Civil Union” etc. A rule could easily deal with this. Sure, a constraint would work here, but there are situations where a global rule is safer and easier to work with.
Bill,
The classic example of rules that I've seen is something like POSTALCODERULE --- but the reality is: just how many different tables in the database should have postal code columns? I would suggest just one, and if that's the case, then using a constraint would likely be better than a rule.

We've also adopted the goal of trying to put all business rules in the middle tier--no stored procs or business triggers. So far so good.
OK, thanks, I now know that we're not doing wicked things... Personally I feel you should restrict as much as possible in the database, for the simple reason that then your data cannot be messed up no matter how you access it (except of course if you disable your rules).

Ralph, to be clear, we decided to use that technique "from then on". We did not adapt our ongoing projects, so there were no transitions... (We even have old applications under maintenance contracts--from the early ASP.NET days--that have data access right in the code-behind :S)
Implementing business rules in a separate “business tier” has been a popular tactic. How effective this is depends on a number of factors. I think some folks do this because they don’t understand stored procedures or find T-SQL too limiting--they would rather implement the rules in Visual Basic or C# in another tier. Sure, Yukon will permit you to write stored procedures in a CLR language, but for a wealth of reasons this might not be such a good idea. For one thing, the performance penalty imposed by switching to the CLR can be significant--the round trip can cost more than simply performing the operation in T-SQL. However, if T-SQL is bogged down by complex math operations or logic it’s not suited for, it might make abundant sense to switch to a CLR-based function or stored procedure. They’re mostly a replacement for extended stored procedures.
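To give a feel for what that looks like, here's a minimal sketch of a Yukon CLR stored procedure based on the recent CTP bits--the routine and its math are hypothetical, and the namespaces have been shifting between CTPs, so take it as a sketch only:

using System;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public class YukonRoutines
{
    // Hypothetical example: iterative-style math that would be clumsy in
    // T-SQL. Deployed with CREATE ASSEMBLY and
    // CREATE PROCEDURE ... EXTERNAL NAME on the server.
    [SqlProcedure]
    public static void MonthlyPayment(
        SqlDouble principal, SqlDouble annualRate, SqlInt32 months,
        out SqlDouble payment)
    {
        double r = annualRate.Value / 12.0;  // periodic interest rate
        payment = new SqlDouble(
            principal.Value * r / (1.0 - Math.Pow(1.0 + r, -months.Value)));
    }
}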
I’m not a big fan of creating layers where you don’t need layers. Layers help move logic into specific groups that make team development easier and provide a single place to address business rules, data access interfaces and other related logic. In many cases, rules, triggers and defaults have worked fine. I’ve been told that Rules are passé—first implemented in the “old” Sybase version. While that’s true, I disagree that the usefulness of Rules has passed. I think Rules are easy to understand, and the fact that they can be defined globally and attached to user-defined types is a great feature. They say that check constraints are designed to replace them—I’ll believe it when I see it. I think that if Microsoft pulled Rules from SQL Server, a lot of folks would march on Redmond—and I can point them to the right building.
There are lots of other strategies to implement business rules. For example, I often suggest that developers use auto-morphing business rules in the client—those that can be easily changed by altering a database element (like an Extended Property). I think you need to implement the technology with which you’re most comfortable.
If you’re a plumber, you solve problems with pipe.

Bill
Koen,

Interested in how that's going. I understand the desire to put ALL business rules in one spot, which would lead to good separation of the tiers. But I'd like to know, from your experience there, how that transition is going or how it went.
This is not really to the point, but it has some connection. In my company they decided not to implement any kind of business rules in the database tier anymore, except for data types, NOT NULL constraints and UNIQUE constraints. This is because they want to centralize all business logic in one place (the middle/business tier, that is). That means no more rules, triggers, check constraints or stored procedures/functions. The only things we use are tables and views.

What's your view on this?

In the course of doing the research for my new book (co-authored with Peter Blackburn), I came across a troubling anomaly. In Chapter 2 I’m teaching new developers how to setup new tables and manage business rules. In earlier versions we used Rules, Defaults and Triggers to manage business rules. We could create a user-defined type (UDT) and assign a rule and default to this UDT. This way if the rule changes, we can easily change the rule anywhere it was used in a single operation—either explicitly bound to a column or indirectly bound via a UDT.

However, I noticed that the SQL Server 2005 help topic for rules now has an admonishment that boils down to “Don’t use Rules. They’re going to be dropped in a future version of SQL Server.” They suggest that developers use a CHECK CONSTRAINT instead. Okay, but a CHECK CONSTRAINT has to be declared on each table, either as the table is being created or with an ALTER TABLE. To change the check logic you have to execute an ALTER TABLE for every column bound to a copy of the constraint. One way around this would be to create a function that performs the check logic. That solves part of the problem, but it does not permit developers to assign a common CHECK CONSTRAINT function to a UDT. This means you can define the column in terms of the UDT, but you also have to remember to add the CHECK CONSTRAINT to the columns. If you miss one by accident (or out of ignorance), you don’t get the range or other checking performed by the constraint.
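To make the difference concrete, here's a sketch (all database and object names are hypothetical) that runs the DDL from ADO.NET--one rule bound once to a UDT, versus a constraint that has to be repeated (and remembered) table by table:

using System.Data.SqlClient;

class RuleVersusConstraint
{
    static void Main()
    {
        // Hypothetical database and object names throughout.
        SqlConnection cn = new SqlConnection(
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI");
        cn.Open();

        // The old way: define the rule once, bind it once to a UDT.
        // Every column declared as PartNumber picks up the check.
        Exec(cn, "EXEC sp_addtype 'PartNumber', 'char(10)'");
        Exec(cn, "CREATE RULE PartNumberRule AS @value LIKE " +
                 "'[A-Z][A-Z]-[0-9][0-9][0-9][0-9][0-9][0-9][0-9]'");
        Exec(cn, "EXEC sp_bindrule 'PartNumberRule', 'PartNumber'");
        Exec(cn, "CREATE TABLE Parts (PartNo PartNumber, Descn varchar(50))");

        // The suggested replacement: a CHECK CONSTRAINT is declared (and
        // later altered) table by table--miss a column and it goes unchecked.
        Exec(cn, "ALTER TABLE Parts ADD CONSTRAINT CK_PartNo CHECK " +
                 "(PartNo LIKE '[A-Z][A-Z]-[0-9][0-9][0-9][0-9][0-9][0-9][0-9]')");
        cn.Close();
    }

    static void Exec(SqlConnection cn, string sql)
    {
        new SqlCommand(sql, cn).ExecuteNonQuery();
    }
}

If the encoding scheme ever changes, the rule is one CREATE RULE away from fixed everywhere; the constraint means hunting down every table that carries a copy.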

Sure, you could implement the UDT as a CLR routine. Does anyone really think developers are going to change a 4-byte integer column to a (Lord knows how many bytes) CLR UDT to manage a simple business rule? I think that remains to be seen.

When creating a new UDT developers are still prompted (at least in the October and December Yukon CTPs) for a rule. Are these going to be dropped? Is it really necessary to drop rules? Why does Microsoft seem to think that this is going to somehow help customers? Aren’t transitions hard enough without dropping functionality that has worked so well so long?

 

Bill

Visual Studio Update

| No Comments | No TrackBacks

February 18, 2005 • Vol.27 Issue 7

I've been working more closely with the new Visual Studio 2005 (aka Whidbey) as I built my sessions for the VSLive talks in San Francisco (Feb. 6-10) and the Connections conference in Orlando (March 19-23). (See www.betav.com for details.) As usual, my focus has been on data access and how it integrates with SQL Server 2005 (Yukon).

re: Visual Basic Survey

| No Comments | No TrackBacks
Honestly Bill, there's another problem in migrating from VB6 to .NET.

In spite of the fancy upgrade wizard, there is really no easy upgrade path. A lot of struggling companies that spent years creating a product are faced with the (in my opinion) poor choice of using interop--which really doesn't present any benefit if the only intention is to "get on the .NET bandwagon".

Newer development in an organization could be done in .NET, but the hurdles there are
a) Current application architecture.
b) Management resistance.
c) Lack of developer skillset.
d) Unwillingness of experienced .netters to work in a VB6 environment.

From a consulting point of view, I think there's plenty of moolah to be made if your knowledge base spans COM/VC++ ---> VB ---> .NET 1.1 ---> .NET 2.0.

I mention 1.1 and 2.0 as two separate entries there because such problems will plague users upgrading from 1.1 to 2.0 as well - albeit at a much lesser magnitude.

In some more time we'll have Longhorn - truly we'll have a mishmash of languages/OSes/platforms - that's gotta get interesting.

re: Welcome to my blog

| No Comments | No TrackBacks
Bill - I have high expectations. I hope to be able to get my daily fix of Bill V wisecracks every day of the week :-)

re: Welcome to my blog

| No Comments | No TrackBacks
Thrilled to see you enter the blog world; hope to hear more of your rants and raves about ADO/ADO.NET.

re: Welcome to my blog

| No Comments | No TrackBacks
Ah, which story did I tell? There are lots of them.

re: Welcome to my blog

| No Comments | No TrackBacks
I attended a session back in October and he tells the Fred-and-George story everywhere, I guess. And if he doesn't, Peter does...

re: Welcome to my blog

| No Comments | No TrackBacks
For the record: The name "Visual Fred" was my dad's concoction. Tell anyone who takes credit for it to come talk to me! And, yes, he does call me Fred (my sister's George).

Welcome to my blog

| 7 Comments | No TrackBacks

Peter and I were able (with very little difficulty) to get this .TEXT blog site working. We did have some trouble creating the actual blog with the dottexthelper applet. The entries it made in the blog_config table had embedded commas (probably caused by the example in dottexthelper), and it was not at all clear what we were to provide for the user name and application name. After debugging via the SQL Profiler and digging into the SPs, we were able to get it working.

We also set up a role and used it to grant access to (just) the stored procedures. This seems to be working just fine and makes Peter (who is a security nut) happier.
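For the curious, the setup amounts to something like this--the role, user and procedure names below are hypothetical stand-ins, not the actual .TEXT objects:

using System.Data.SqlClient;

class LockDownSketch
{
    static void Main()
    {
        // Hypothetical database, role, user and procedure names.
        SqlConnection cn = new SqlConnection(
            "Data Source=.;Initial Catalog=BlogDb;Integrated Security=SSPI");
        cn.Open();

        // Create a role, grant it EXECUTE on the stored procedures only
        // (no table access), then add the web application's user to it.
        Exec(cn, "EXEC sp_addrole 'BlogUsers'");
        Exec(cn, "GRANT EXECUTE ON blog_GetEntries TO BlogUsers");
        Exec(cn, "GRANT EXECUTE ON blog_InsertEntry TO BlogUsers");
        Exec(cn, "EXEC sp_addrolemember 'BlogUsers', 'BlogAppUser'");
        cn.Close();
    }

    static void Exec(SqlConnection cn, string sql)
    {
        new SqlCommand(sql, cn).ExecuteNonQuery();
    }
}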

Bill

re: Welcome to my blog

| No Comments | No TrackBacks
Good luck with the blog Bill (and Peter)! I'll be watching this one...

re: Welcome to my blog

| No Comments | No TrackBacks
ABOUT TIME U STARTED BLOGGING !! :)

The Life and Death of VB6

| 1 Comment | No TrackBacks

Over the last decade we've seen Visual Basic evolve from a clever (but limited) way to write simple interpreted BASIC Windows programs into a full-fledged compiled object-oriented development platform. It was not until VB3 that the language really took hold. Many are of the opinion that the developer community won't buy into any new technology until it reaches its third version. I don't know if that's true, but it wasn't until VB3 that the development platform really gained general acceptance--just as Windows did not take hold until version 3.0. I suspect that some developers do wait for a more stable platform, but I also think that many larger companies are too conservative to adopt new technology before their own staff can validate its usefulness, applicability and stability--and this takes time. I've seen this (understandable) attitude at several large companies, including Boeing, EDS and others. When I visited to train their staff on VB6, their developers were still working in VB4 and VB5. In one case I described the benefits of MSDE but was told that it was not accepted--they were stuck with JET even though the issues they complained about were all cured by MSDE.

When a new version of any software (or hardware) arrives, these conservative companies (even those in blue states) spend considerable time fully evaluating the RTM version. They're less likely to invest the time to fully evaluate beta versions, as betas don't really reflect what they can depend on when the product ships. When tough times come, all companies, large and small, tend to delay conversions--they dig in and wait for the storm to pass. This makes it even harder to endorse conversion to a new technology. Expecting these companies to jump on any new version is unrealistic--especially in hard times.

Most of us understand these issues. What surprises me is how Microsoft has chosen to follow through with its plans to drop support for VB6 even though the adoption rates are so low. This is kinda like the government closing a road because they built a parallel light-rail line. Sure, the new technology is better, faster and saves gas, but the general public was encouraged to buy and use cars over the last fifty years. They even paid higher taxes to make the roads wider (which never really does anything but encourage more people to drive).

I think Microsoft should continue support for VB6 technology and perhaps even upgrade it and re-release it. “VB7” could provide an easier migration path to the .NET technology, so when conversions were needed, developers would have an easier time converting. IMHO it would be fairly easy to create COM-based versions of the ADO.NET data access interfaces--they're far simpler in most respects than their COM-based ADO cousins.

Sure, I'm blowing in the wind. Microsoft does not have the bandwidth to support yet another VB technology, so it's unlikely that they will do anything like this. I just feel sorry for the VB6 folks who feel that Microsoft has left them sitting in a traffic jam watching the new train go by--with lots of empty seats.

Bill

January 7, 2005 • Vol.27 Issue 1
Page(s) 22 in print issue

There is nothing that makes me feel older than a few days with my grandkids. My daughter and her children (Mary, 4, and Katie, 18 months) came for the holidays. They just left, and the house has returned to the deafening quiet that I seem to enjoy. Now I don't have to carefully navigate through the 4,000 plastic toys like a soldier feeling his way through a minefield at night in his bare feet.

February 4, 2005 • Vol.27 Issue 5
Page(s) 21 in print issue

This week has been spent getting my Christmas present (to me) working. You would think that installing a faster Intel processor on an Intel motherboard would be fairly easy. It wasn't.

About this Archive

This page is an archive of entries from February 2005 listed from newest to oldest.

December 2004 is the previous archive.

March 2005 is the next archive.
