Stored Procedures: Good or Bad


Author: Saikalyan Prasad Rao

Date of Submission: 07/06/2004

User Level: Intermediate


I am sure this issue has been taken up and discussed in lots of articles on the net. This article aims to look at both sides of the coin. First we will dwell on the advantages of stored procedures.

Stored procedures provide performance benefits such as being local to the database, pre-compiling and caching; a programming framework through input/output parameters and reuse of procedures; and security features such as encryption and privilege limits for users. Apart from that, they offer modularization of code, and changes take effect immediately, unlike business components, which need to be recompiled and deployed. Not forgetting that with the advent of .Net, deployment issues have been reduced quite a lot. But nevertheless, changes made to any component do require a rebuild.

Another benefit is the saving of round trips between the client application and the database, which speeds up network response. But on the flip side, stored procedures come with their own share of problems. Debugging and maintenance have always been a known issue, and it becomes all the more difficult for developers like me who have gotten used to the VS.Net debugger. On a side note, I do think Microsoft has always built a very good debugger into VS/VS.Net.
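The round-trip saving can be made concrete with a small sketch. Here Python's built-in sqlite3 merely stands in for SQL Server, and the `CountingConnection` wrapper is my own illustrative device: it counts each `execute` call as one "round trip", so a chatty row-by-row loop can be compared against a single batched call, which is the shape of work a stored procedure pushes to the server.

```python
import sqlite3

class CountingConnection:
    """Wraps a DB-API connection and counts execute() calls as 'round trips'."""
    def __init__(self, conn):
        self.conn = conn
        self.trips = 0

    def execute(self, sql, params=()):
        self.trips += 1
        return self.conn.execute(sql, params)

    def executemany(self, sql, seq):
        self.trips += 1  # the whole batch travels in one call
        return self.conn.executemany(sql, seq)

rows = [(i, f"name{i}") for i in range(100)]

db = CountingConnection(sqlite3.connect(":memory:"))
db.execute("CREATE TABLE users (id INTEGER, name TEXT)")

# Chatty client-side loop: one trip per row.
for r in rows:
    db.execute("INSERT INTO users VALUES (?, ?)", r)
chatty = db.trips  # 1 (CREATE) + 100 inserts

# Server-side style: the same 100 rows go over in a single call.
db.executemany("INSERT INTO users VALUES (?, ?)", rows)
batched = db.trips - chatty
```

The ratio only grows with the row count, which is why keeping the loop next to the data pays off on a real network.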

Managing changes in stored procedures and applying service pack releases can be a bit of a teething problem at times. Apart from this, there are issues pertaining to migration. What if your application, which was built on SQL Server, needs to be ported to Oracle or some other database? It would be a nightmare converting all those stored procedures and T-SQL-specific code into compatible ANSI SQL for that database. Personally, I would go in for stored procedures and leverage most of the database's capabilities if I knew my project was going to use a specific database and would not change. I am sure many must be thinking along the same lines. After all, one of the cool features I liked about SQL Server was its support for XML. You should try doing bulk updates through XML; it works like a charm, and with less code at that. In fact, in .Net, DataSets can output an XML representation of their data, which saves you the effort of writing code to build the XML. Pumping business logic into stored procedures has been done and makes a lot of sense for small projects. But if you want to scale up your application, it poses a problem, since your business logic gets tied to your database tier. For small projects this would not matter much, but for a large-scale, enterprise-level solution it would at some point pose a huge problem.
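The idea behind XML bulk updates — shipping many changes as one document instead of many statements — can be sketched with the standard library. The element names below (`updates`, `row`) are my own and deliberately simple; they mirror the shape of a DataSet's XML output, not SQL Server's actual OPENXML schema.

```python
import sqlite3
import xml.etree.ElementTree as ET

def rows_to_xml(rows):
    """Serialise (id, price) pairs into one XML document, DataSet-style."""
    root = ET.Element("updates")
    for id_, price in rows:
        ET.SubElement(root, "row", id=str(id_), price=str(price))
    return ET.tostring(root, encoding="unicode")

def apply_xml_updates(conn, doc):
    """Parse the document once and apply every change as a single batch."""
    updates = [(r.get("price"), int(r.get("id"))) for r in ET.fromstring(doc)]
    conn.executemany("UPDATE products SET price = ? WHERE id = ?", updates)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, price TEXT)")
conn.executemany("INSERT INTO products VALUES (?, ?)", [(1, "10"), (2, "20")])

doc = rows_to_xml([(1, "12"), (2, "25")])   # the whole change set, one string
apply_xml_updates(conn, doc)
prices = [p for (p,) in conn.execute("SELECT price FROM products ORDER BY id")]
```

One document in, one batch applied — the caller never writes per-row SQL.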

I am sure there will always be two schools of thought on whether or not to use stored procedures. All said and done, it does raise an interesting question: if we were not to use stored procedures, what would be the alternative? Different solutions come to mind, such as a generic DB-layer component containing only ANSI SQL statements, which would allow one to connect to various databases, or the ad-hoc SQL approach. But both approaches come with their share of hurdles and pitfalls. We all know how brittle ad-hoc scripts are, since any small change to the database can have severe impacts on your system.
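A minimal sketch of the generic DB-layer idea: one component holding only portable, parameterized SQL, taking the DB-API connection as a dependency, so swapping databases means swapping the connection rather than the queries. The class and method names here are invented for illustration, and sqlite3 again stands in for a real server.

```python
import sqlite3

class GenericDb:
    """Thin data-access layer: portable SQL only, parameterized, driver-agnostic."""
    def __init__(self, conn, paramstyle="qmark"):
        # DB-API drivers differ in placeholder style ('?' vs '%s'),
        # so the layer adapts the marker rather than the queries.
        self.conn = conn
        self.mark = "?" if paramstyle == "qmark" else "%s"

    def select_by_id(self, table, key, value):
        # Identifiers cannot be bound as parameters, so they are interpolated;
        # values always travel as bound parameters (the non-brittle part).
        sql = f"SELECT * FROM {table} WHERE {key} = {self.mark}"
        return self.conn.execute(sql, (value,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'sai')")

db = GenericDb(conn)  # the same class would wrap an Oracle or SQL Server driver
result = db.select_by_id("users", "id", 1)
```

The catch the article goes on to name is visible even here: the layer is only as stable as the schema it queries.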
Building a generic DB component requires a properly designed database, and the component would be affected every time your database changes. With both sides having their own share of advantages and disadvantages, I feel the best approach is to make the best of both worlds: do all inserts, updates, selects and so on in stored procedures, which lets me leverage some of the cool features of SQL Server like XML updates, and put the business logic into components, which allows me to easily debug and scale them.

The upcoming releases of SQL Server "Yukon" and ASP.Net "Whidbey" aim to address these issues. Yukon is coming up with built-in support for the CLR. That means we can now code stored procedures in any of the .Net languages, which is easier than writing T-SQL, and at the same time leverage the powerful debugging features of VS.Net. In ASP.Net "Whidbey" there are plans to introduce a new extensibility point called Providers. This new Provider model would support many new features like Membership, Personalization, Role Manager, Site Navigation, Build Providers, Health Monitoring and so on. The Provider model in ASP.Net Whidbey enables developers to completely unplug the logic/behavior/data interaction of a particular ASP.Net feature and replace it with their own logic/data layer. In short, the Provider model provides both data and business-logic abstraction.
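The kind of abstraction the Provider model offers can be sketched in a language-neutral way: a feature codes against an abstract provider contract, and the concrete logic/data layer is plugged in from outside. All class and method names below are my own inventions; this mirrors the pattern, not ASP.Net's actual API.

```python
from abc import ABC, abstractmethod

class MembershipProvider(ABC):
    """Extensibility point: the feature depends only on this contract."""
    @abstractmethod
    def validate_user(self, name, password):
        ...

class SqlMembershipProvider(MembershipProvider):
    """Default data layer (a dict stands in for a database table here)."""
    def __init__(self, users):
        self.users = users

    def validate_user(self, name, password):
        return self.users.get(name) == password

class LdapMembershipProvider(MembershipProvider):
    """One's own logic/data layer, swapped in without touching the feature."""
    def validate_user(self, name, password):
        return name == "admin" and password == "secret"  # stub directory

def login(provider, name, password):
    # The feature never knows which concrete provider is configured.
    return "ok" if provider.validate_user(name, password) else "denied"

sql_result = login(SqlMembershipProvider({"sai": "pw"}), "sai", "pw")
ldap_result = login(LdapMembershipProvider(), "admin", "secret")
```

`login` is unchanged across both runs — only the plugged-in provider differs, which is exactly the unplug-and-replace promise described above.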

Both upcoming releases try to bridge these gaps. I for one am eagerly waiting for their respective releases to happen. Whether successful or not, only time will tell.

