SQLTeam.com | Weblogs | Forums

Proposing new Data Access Standard: Load/Store/Delete

All data access must go through Load, Store, or Delete. Push SQL back into the database where it belongs.



Stored procedures map parameters/resultsets to the object, where property names can be specified.


Without some sort of explication or rationale... Nope...

The vast majority of data access comes in the form of simple SELECT statements, which require none of what you mentioned.
Loading & storing are the same thing, and I've seen many databases that never do any physical deletes of data.

You've got to map the SQL result set to something (a recordset or data reader or table, etc.) so why not map it to your own custom object? It's easier and has fewer bugs. No SQL in the business/application layer at all.

Dim c As New Customer
c.ID = 1

That can be secured and monitored.

If your app doesn't require a database on the back end or runs faster/more efficiently by caching data on the client... Sure... You just described a cookie.

But... this is a SQL forum, so the context of using a RDBMS database is automatically implied, simply by being here.

Jason, I think you're misunderstanding something.

That code was written for an RDBMS. It replaces all SQL and data access layers (DALs).

The code I've described above has been in operation for over 15 years. Originally written in Java, then VB6, and now C#. It works with ALL RDBMS and handles ANY data structure.

  1. set a connection string in Web/App.config
  2. add an attribute to the class (Persistable<storedProcSet, IdentityKeyName>)
  3. Database.Store(object)

That's it.
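The thread never shows what happens under the hood of step 3. As a rough illustration only: since the original version was reportedly written in Java, here is a minimal Java sketch of how an attribute/annotation like `Persistable` could drive a stored-proc call via reflection. The annotation, the `Database.buildStoreCall` method, and the `CustomerStore` proc-naming convention are all my assumptions, not the actual library:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Field;
import java.util.StringJoiner;

// Hypothetical stand-in for the Persistable<storedProcSet, IdentityKeyName> attribute.
@Retention(RetentionPolicy.RUNTIME)
@interface Persistable {
    String storedProcSet();   // prefix shared by the Load/Store/Delete procs
    String identityKeyName(); // name of the identity/key column
}

@Persistable(storedProcSet = "Customer", identityKeyName = "CustomerID")
class Customer {
    // PascalCase fields mirror the thread's column names.
    public int CustomerID = 123;
    public String FirstName = "Jason";
}

class Database {
    // Builds the stored-proc call text from the class description via reflection.
    static String buildStoreCall(Object o) {
        Persistable p = o.getClass().getAnnotation(Persistable.class);
        StringJoiner args = new StringJoiner(", ");
        try {
            for (Field f : o.getClass().getFields()) {
                args.add("@" + f.getName() + " = " + quote(f.get(o)));
            }
        } catch (IllegalAccessException e) {
            throw new RuntimeException(e);
        }
        return "EXEC " + p.storedProcSet() + "Store " + args;
    }

    private static String quote(Object v) {
        return (v instanceof String) ? "'" + v + "'" : String.valueOf(v);
    }

    public static void main(String[] args) {
        // e.g. EXEC CustomerStore @CustomerID = 123, @FirstName = 'Jason'
        // (reflection does not guarantee field order)
        System.out.println(buildStoreCall(new Customer()));
    }
}
```

In a real implementation this string would be a parameterized command rather than concatenated text, but the shape of the idea is the same: one generic `Store` call, driven entirely by metadata on the class.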

If you have a link to an article or blog post, please share. Otherwise, I don't have a clue what you're talking about.
Relational databases don't have classes and they are so much more than "data persistence mechanisms" for an application layer/tier.

When you say, "It works with ALL RDBMS and handles ANY data structure", I only see a big red flag... Sorry...

I've been using this approach for over 15 years. Works for everything:

It maps resultset/parameters to an application object.

Thank you. I'll give it a read.

Adopting a new standard will force a re-write of all Federal software. Push SQL back into the database where it belongs.

This I agree with. The only SQL that should be executed from the app layer should be calls to stored procedures.

All software is built to textbook standards.

Ehhh... I'm all for having a set of standard guidelines but at the end of the day, "the right tool for the job" should take precedence. I'll take "Faster & More Efficient" over "Textbook" all day, every day.

Load/Store/Delete follows proper architectural n-tier design

???... I don't see that being defined anywhere, so no comment.

Most systems can be re-written side-by-side with no downtime.

Given unlimited manpower, hardware & $... Sure, anything is possible. I don't see a compelling argument that I could use to persuade those that hold the purse strings.

Supports surveillance and law enforcement while providing accountability.

Orwell's 1984 should be read as a cautionary tale, not an instruction manual.
As DBAs & Developers, our duty is to protect our customers' data, not support surveillance or law enforcement.

Sufficient savings can follow through on Congress' 'borrow and pay from savings' plan.

My libertarian must be showing by now... Seriously, just shoot me in the face now.

Allows control into the, otherwise opaque, 'cloud'.

? Okay... I'm done...

Seriously, there are already ANSI & ISO standards that each vendor follows to some degree and varies from to some degree... Which is how things should be in a free market system.

All databases already have exposed interfaces that allow them to communicate with external applications. They'd be useless without them.

I don't see anything here that illustrates an "enormous efficiency and accelerates the development cycle" advantage over simply calling a stored procedure.

I agree with most of what you wrote, but simplifying things down to:

Database.Store(object)

is as simple as it can be. It can be controlled and monitored, and it's also faster and more efficient. There are no downsides and no reason to do anything else.

I'm suggesting: Load, Store, & Delete. It's up to the vendor/implementer to decide return type and parameters.

Anyone can understand Load/Store/Delete. It requires no technical expertise.

Assigning values is automatic. It replaces the DataReader. Compile-time type checking between SQL Server and .NET.

My lack of imagination is clearly preventing me from grasping something crucial. That and the fact that I'm a SQL Dev only. I don't do any application programming. So, I fully acknowledge that there may be some intricacies on the app side (that I'm simply not aware of) that this concept may very well solve.

I swear, I'm not trying to bag on the idea. I'm just having a tough time visualizing an actual need.

From where I stand, there are two main ways data finds its way into a database. 1) Users using an application, inputting small amounts of data per transaction but having a number of such transactions. 2) Bulk data imports from files that are generated in external systems and delivered at preset intervals, typically 1 or 2 times per day.

So, for #1, I simply can't picture needing anything simpler than EXEC dbo.SomeStoredProc @P1 = 'Jason', @P2 = 'Long';

#2, on the other hand, has lots of room for improvement... csv, fixed width & xml (and anything else) all have their own set of drawbacks. So, if your idea centered around data exchange formats, eliminating the delimiter hassles of csv, the column width limitations of fixed width, or the file bloat of xml... now I'm interested!

My idea on that front would be a system that automatically "normalized" & compressed all redundant data, splitting a file into multiple sub-files and then zipping it all up into a single, consumable file... Similar to the new(ish) MS Office "X" file types which are, in reality, zip folders containing multiple sub-folders & files... but... to the end user, function as a single file.
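That container idea is easy to sketch with nothing but the standard library. Below is a small Java example (file names and sample rows are made up) that packs two logical sub-files into one zip, the way an .xlsx file does, with the redundant customer names factored out into their own sub-file and referenced by id from the order rows:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class PackedExport {
    // Packs several "normalized" sub-files into one zip container.
    static byte[] pack() {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ZipOutputStream zip = new ZipOutputStream(bytes)) {
            put(zip, "customers.csv", "1,Jason\n2,Pat\n");   // names stored once...
            put(zip, "orders.csv", "101,1\n102,1\n103,2\n"); // ...rows reference them by id
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return bytes.toByteArray();
    }

    private static void put(ZipOutputStream zip, String name, String body) throws IOException {
        zip.putNextEntry(new ZipEntry(name));
        zip.write(body.getBytes(StandardCharsets.UTF_8));
        zip.closeEntry();
    }

    // Extracts one named sub-file from the container, or null if absent.
    static String read(byte[] packed, String name) {
        try (ZipInputStream zip = new ZipInputStream(new ByteArrayInputStream(packed))) {
            for (ZipEntry e; (e = zip.getNextEntry()) != null; ) {
                if (e.getName().equals(name)) {
                    // readAllBytes stops at the end of the current entry
                    return new String(zip.readAllBytes(), StandardCharsets.UTF_8);
                }
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return null;
    }
}
```

To the consuming system the result is a single file; internally it's as many sub-files as the normalization produces, with DEFLATE compression for free.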

As far as it being a no downside "panacea" that allows my schema to talk to your schema without either of us having to know about each other's schemas... Well, that sounds a lot like the sales pitch we all got with xml & more recently json (not to be confused with me... Jason).

I agree. I think you'll like this, because it places the DBA into full control.

Yes, it builds that call to the Stored Proc from the Class description. Here's an example:

[Persistable("Customer", "CustomerID")]
class Customer {
    int CustomerID;
    string FirstName;
}

Then, in the web page or application:

Customer customer = new Customer();
customer.CustomerID = 123;
Database.Load(customer);

The Load statement builds that stored proc call, then maps the result set back to the customer object.
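For anyone trying to picture that last mapping step: here is a minimal Java sketch of copying one result-set row back onto same-named fields of the object. The `CustomerRow` and `RowMapper` names are mine, and a plain `Map` stands in for the row a real data reader would supply:

```java
import java.lang.reflect.Field;
import java.util.Map;

// Hypothetical target object; PascalCase fields mirror the thread's column names.
class CustomerRow {
    public int CustomerID;
    public String FirstName;
}

class RowMapper {
    // Copies result-set columns onto same-named public fields of the target —
    // the step Load performs after executing the stored procedure.
    static void mapRow(Map<String, Object> row, Object target) {
        for (Field f : target.getClass().getFields()) {
            Object value = row.get(f.getName());
            if (value != null) {
                try {
                    f.set(target, value);
                } catch (IllegalAccessException e) {
                    throw new RuntimeException(e);
                }
            }
        }
    }
}
```

A production version would also convert types (e.g. SQL DECIMAL to a .NET/Java numeric type) and report columns that match no field, which is where the compile-time-checking claim earlier in the thread would come in.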