I often code a multi-step "procedural" Stored Proc instead of a complex SQL statement. My reasoning is: when the SQL statement produces erroneous results (especially after it is in production!), the only way I know to debug it is to break the SQL into components and test one part at a time (and risk introducing another bug when reassembling the parts). On the other hand, it is fairly easy to add debug statements to a Stored Proc and/or see intermediate results in the debugger.
I understand the tradeoffs to be:
- the complex SQL statement would no doubt be more efficient.
- but when under pressure to solve a production problem, the Stored Proc and debugger are quick and easy.
I am just curious if others have an opinion or other tradeoffs to note?
Hm, it's difficult to comment on a rather vague design... - as you state you use a SP instead of (one) complex SQL statement, am I right to assume that you use a temporary table to store intermediate results?
I do certainly agree that "divide and conquer" is a very helpful strategy to handle complexity (and I'm sure Breck will second, too:)).
IMHO it's often possible to use that within one complex SQL statement itself, too, say by using the WITH clause to build local views or by using derived tables to build intermediate results and join them finally...
I know that's a vague description as well, but personally, I have learnt to build complex statements that way, and when doing so, I usually keep a SQL script with the "building blocks" in case I have to re-check the full statement. So that's another method of "debugging the single steps"...
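As a sketch of that "building blocks" style (all table and column names here are hypothetical, just for illustration): each WITH block acts like a local view, so you can run and verify any block on its own before the final SELECT combines them.

```sql
-- Hypothetical schema: customers(customer_id, name) and
-- orders(customer_id, amount, order_date).
-- Block 1: can be run alone to verify the aggregation.
WITH recent_totals AS (
    SELECT customer_id,
           SUM( amount ) AS total_amount
      FROM orders
     WHERE order_date >= DATEADD( month, -3, CURRENT DATE )
     GROUP BY customer_id ),
-- Block 2: can be run alone (on top of block 1) to verify the filter.
big_spenders AS (
    SELECT customer_id, total_amount
      FROM recent_totals
     WHERE total_amount > 10000 )
-- Final step: join the verified blocks back to the base table.
SELECT c.customer_id,
       c.name,
       b.total_amount
  FROM customers c
       JOIN big_spenders b ON b.customer_id = c.customer_id;
```

Keeping the intermediate SELECTs in a script means that when the full statement misbehaves, you can re-run each block against production data without pulling the statement apart by hand.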
To get an idea, here's a sample from the forum to build an UPDATE statement (whether you call that complex or not is up to you, apparently):
> multi-step "procedural" Stored Proc instead of a complex SQL statement
I am sure that does not mean "record-oriented fetch loop instead of set-oriented SQL statement", but in case it does, please reconsider.
If necessary (to preserve your sanity) try separate set-oriented SQL operations to divide-and-conquer a single complex operation. Sometimes that means using intermediate temporary tables, but not always... views are often a great help; e.g., FROM derived tables, local WITH SELECT view clauses, CREATE VIEW statements.
Sometimes people use fetch loops so they can COMMIT a long UPDATE or DELETE at intermediate points along the way. Before doing that, consider an ordinary loop containing a repeated set-oriented UPDATE TOP START AT plus COMMIT. Fetch loops are fraught with inter-connection conflict issues that don't affect atomic UPDATE statements; if you need proof, just read the rules for DECLARE CURSOR NO SCROLL - DYNAMIC SCROLL - SCROLL - INSENSITIVE - SENSITIVE (tip: anything other than INSENSITIVE is asking for trouble).
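A minimal sketch of that loop-plus-COMMIT pattern (table and column names are invented for illustration). Here the WHERE clause excludes rows already updated, so plain UPDATE TOP suffices and the loop terminates on its own; START AT would only be needed if the WHERE clause didn't shrink the candidate set each pass.

```sql
-- Hypothetical: archive stale rows in big_table in chunks of 1000,
-- committing after each chunk to keep the rollback log small.
BEGIN
    DECLARE row_count INTEGER;
    SET row_count = 1;
    WHILE row_count > 0 LOOP
        UPDATE TOP 1000 big_table
           SET status = 'archived'
         WHERE status = 'stale'   -- updated rows drop out of this set
         ORDER BY pk;             -- TOP requires an ORDER BY
        SET row_count = @@ROWCOUNT;
        COMMIT;                   -- release locks between chunks
    END LOOP;
END;
```

Each UPDATE is still atomic and set-oriented, so you avoid the cursor-sensitivity issues entirely while keeping the intermediate COMMITs.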
answered 01 Jan '16, 09:37