http://www.microsoft.com/china/msdn/library/archives/dnast.asp
Stress Testing the Data Access Components of Windows DNA Applications
Mike Schelstrate
Summary: This article discusses the importance of stress testing the data access components of Microsoft Windows DNA applications, and shows how to make the testing process easier to carry out.
Introduction
Why Stress Test Your Application?
Preparing for the Stress Test
Performing the Stress Test
Evaluating the Stress Test Results
Tips for Successful Stress Testing
Summary
Introduction
Stress testing is a step that is often overlooked in the development and deployment of Microsoft® Windows® DNA applications. Its purpose is to ensure that the application still performs acceptably when the maximum number of authorized users access it at the same time. This article focuses on the importance of stress testing applications programmed with Microsoft Data Access Components (MDAC), and offers some techniques that make the testing process easier to complete. The article is meant to help experienced developers and IT professionals design a complete stress test plan, evaluate the results, and correct any deficiencies found. Readers should be familiar with Microsoft Windows NT® Server, Microsoft SQL Server™, Microsoft Internet Information Server (IIS), Active Server Pages (ASP), Microsoft ActiveX® Data Objects (ADO), and the Microsoft Component Services environment (or Microsoft Transaction Server [MTS] if you are using Windows NT). It must be emphasized that the business logic and data access routines, including the ADO code, belong in COM or DCOM components; for performance and reliability, these routines should not reside in Active Server Pages. If you care how your application behaves under heavy use, it is assumed that you have already programmed with such components and taken advantage of the Component Services environment. This article does not discuss stress problems caused by client browsers or bandwidth limitations; it concentrates on the server-side data access components and their interaction with Internet Information Server. Problems arising from the use of Remote Data Services (RDS) are likewise not discussed.

Why Stress Test Your Application?
Stress testing is usually required before an application is formally rolled out into the production environment. Stress testing a Web application serves the following basic purposes:
Determining how the total user load on the system affects an individual user's real experience.
Determining the maximum load the hardware running the application can sustain, and thus whether the hardware must be upgraded before the application goes into production.
Determining, from the average page response time, the user-load threshold of the application, and ensuring that response times remain acceptable when the system reaches its expected load.
Although for most Web applications the user's experience is the most important factor in deciding whether the application succeeds, there are still plenty of good reasons to stress test your application, including the following:
An application that performs well during the development phase may perform very poorly in a high-stress environment. For example, Internet Information Server or SQL Server may be used by several applications at once; if the new application is not designed to behave well in this situation, deploying it can degrade, or even interrupt, the applications that are already running.
The first users of the application will form the impression that matters most. If that impression is poor because of stress problems, it will be hard to change their view even after you solve those problems. Conversely, if the application is thoroughly stress tested before it is rolled out and the excellent, fast program you developed runs as expected, you establish a good image as a developer among your users.
The IT group responsible for rolling out and maintaining the application will appreciate your stress testing as much as your customers do. They are on the front line, and you will hear their opinions and usage results first. If you anticipate the application's scalability problems, you will make a strong impression on the IT group, and when you need their expertise its members will be glad to help. The user-satisfaction matrix should also include potential business partners; if they decide to incorporate your application into a larger package, the application can become an even greater success.
Stress testing also helps determine the optimal server configuration. First test the configuration of the baseline system in a controlled environment, then test in a simulated production environment to determine what effect the production configuration has on the baseline system.

Preparing for the Stress Test
While preparing the stress test during application development, you should pay attention to the following aspects:
Hardware and software configuration
Server configuration
Security settings
User load settings
Selecting a suitable stress test tool

Hardware and Software Configuration
The baseline system must reflect the production system as closely as possible. The hardware configuration, CPU, RAM, and network bandwidth, is the most important aspect of a stress test, but the software configuration must be reproduced as well: the versions of Microsoft Windows, the Service Pack, and MDAC itself, the Internet Information Server configuration, and any other applications that will run on the same machine in production. Install and register the middle-tier COM components that contain the business rules and data access code, and configure them as the application requires. The last setting to decide is whether the Web site under test runs in-process or out-of-process. This choice determines whether your Web application runs in the same address space as IIS or in a separate one, and it has an important impact on the stress test to be performed. The figures below show the relevant settings on the properties page in Microsoft Windows NT 4.0 and Microsoft Windows 2000. In Windows NT 4.0, select the Run in separate memory space (isolated process) check box to make the Web application run out-of-process. In Windows 2000, set the Application Protection option to Medium or High to make the Web application run out-of-process.
Figure 1. The properties page in Windows NT 4.0
Figure 2. The properties page in Windows 2000

Server Configuration
Configure Internet Information Server to simulate the production server. The Internet Services Manager property pages expose every adjustable IIS option. The most important decisions are whether to enable the logging option (enabling it noticeably reduces the speed of the running system), and choosing the expected number of hits per day on the Performance tab.
Most stress-related problems appear on the data server. For queries to execute efficiently the database must be properly designed, so the database used for stress testing must be the same one the application will actually use, and its tables should be loaded with the maximum amount of data you expect them to hold. Also make sure the configuration options of the test data server match those of the production server, especially the locking and isolation levels and any optimization techniques in use, such as table indexes.
The security scheme has a considerable impact on an application under heavy load, especially if the system uses encryption technology such as the Microsoft Cryptography API. You must therefore give the system under test the same security scheme, although not necessarily the same certificates. The most commonly used security protocol for reaching back-end data stores on an intranet is the Microsoft Windows NT LAN Manager (NTLM) authentication system. If possible, consider using Component Services (MTS on Windows NT) to simplify and streamline the security authentication process and to improve the performance and stability of the system.

User Load Settings
First determine the maximum number of users you expect to access the application, then double it; a successful application is very likely to attract far more users than expected. Next, work out when most of those users will be connected and determine the network load during that period; that is the period you will test. This strategy lets you test the combined impact of user load and hardware configuration, and ensures that the application still responds as expected at the network peak.
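The sizing guideline above reduces to simple arithmetic. The short Python sketch below works through it with hypothetical numbers; expected_peak_users and pages_per_user_per_minute are placeholders you would replace with your own estimates.

```python
# Back-of-the-envelope sizing for the user load settings described above.
# All numbers are hypothetical; substitute your own estimates.

expected_peak_users = 500                  # users you expect at the busiest hour
target_users = expected_peak_users * 2     # double it, per the guideline above

pages_per_user_per_minute = 6              # how often an active user requests a page
target_requests_per_sec = target_users * pages_per_user_per_minute / 60.0

print(target_users)                        # 1000 simulated concurrent users
print(target_requests_per_sec)             # 100.0 requests/sec the test must sustain
```

The doubled figure is what the stress tool should simulate; the requests-per-second figure is the throughput target to compare against the Performance Monitor readings later in the test.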
Selecting a Suitable Stress Test Tool
In a real data center environment, the Web server sustains a very high connection level as large numbers of users connect to the Web application through the company's intranet or over the Internet.
A Web application stress tool must therefore be able to run enough threads to maximize the number of parallel connections, while also controlling the size of the packets sent to the Web server, so that it can simulate this highly parallel environment. Fortunately, many tools can now simulate such an environment accurately. One of them is Microsoft's Web Application Stress Tool, available free of charge from Microsoft's http://webtool.rte.microsoft.com/ site. It provides all the necessary features, plus some excellent extras such as a variety of recording features.

Web Application Stress Tool
The Web Application Stress Tool realistically simulates a large number of browsers issuing page requests to the Web application, creating the test environment, and it can also record the pages a browser visits into a script. The script can be saved and then run on any Windows NT or Windows 2000 client that can reach the application. Because the Web Application Stress Tool can simulate a large number of users on a single workstation, you do not need as many client machines as simulated users.
Note: When performing a stress test, be careful not to push the stress level on any one client machine too high, because an overloaded client can itself become a bottleneck and distort the results. When testing a Web-based application, use several clients and make sure the threads are distributed across them, which reduces the resource demands on each client.

Performance Monitor
To test data access components effectively and analyze the results correctly, the most important requirement is a way of monitoring and recording what happens while the test runs. The Performance Monitor included with Microsoft Windows NT and Microsoft Windows 2000 is such a tool, and it is suitable for both Internet Information Server and the data server. In addition to running Performance Monitor on the Internet Information Server machine, you should run a separate instance on the data server.
Many high-performance data server products, such as Microsoft SQL Server, Oracle, and Microsoft Exchange Server, also come with their own performance monitors, which you can use to measure the stability of the application and of the hardware it runs on.

Performing the Stress Test
Once the test plan has been carefully developed, actually running the test is very simple. The first step of the performance test is to apply load with a tool such as the Microsoft Web Application Stress Tool and measure the maximum number of requests per second the Web server can process; this is a quantitative measurement. The second step is to determine which resource, the CPU, memory, or a back-end system, prevents a higher number of requests per second; this is less a measurement technique than an art.
First choose the ASP pages you plan to stress (find and use the slowest pages on the site), selecting the pages that access the database most frequently and most heavily. This choice is important: it is easy to leave out pages that belong in the test and thereby miss critical code paths. In addition, if possible, test the application by requesting the pages in the order real users would, passing the appropriate cookies and query strings to each page.
Note: Depending on the actual application, you may have to prepare some ASP pages specifically for testing. Some pages need parameters that are normally generated by the application itself and cannot be generated by the Web stress tool.
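The load-application step described above, many parallel clients issuing requests and recording response times, can be sketched in miniature. The following Python sketch only illustrates the technique; it is not a substitute for the Web Application Stress Tool, and the URL, thread count, and request count are hypothetical placeholders.

```python
# Minimal sketch of what a Web stress client does: several threads issue
# HTTP requests in parallel and record each response time and status.
import threading
import time
from urllib.request import urlopen

URL = "http://localhost/app/default.asp"   # hypothetical page under test
THREADS = 4                                # simulated concurrent browsers
REQUESTS_PER_THREAD = 5

results = []                               # (elapsed_seconds, status) pairs
lock = threading.Lock()

def worker():
    for _ in range(REQUESTS_PER_THREAD):
        start = time.time()
        try:
            status = urlopen(URL, timeout=2).status
        except Exception:
            status = None                  # count timeouts and errors too
        with lock:
            results.append((time.time() - start, status))

threads = [threading.Thread(target=worker) for _ in range(THREADS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

ok = [r for r in results if r[1] == 200]
print(len(results), "requests issued,", len(ok), "succeeded")
```

Scaling THREADS up while watching the server-side counters described in the next section is exactly the measurement loop the article recommends.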
When the application runs on Internet Information Server, the following counters should be monitored (with Performance Monitor):
Active Server Pages: the number of requests per second, the queue length, and the current number of sessions
Inetinfo process: private bytes, virtual bytes, and the number of open handles
Processor: user time and total processor time percentages
If the application runs in separate memory space on Windows NT 4.0 (or with the Application Protection option set to High [Isolated] on Windows 2000), monitor the Mtx.exe process (on Windows 2000, the DllHost process) rather than the Inetinfo process.
As the figure below shows, the Active Server Pages requests counter in Performance Monitor displays the actual throughput of the application (here, 1.000 requests/sec). This data lets you analyze the performance of Internet Information Server under heavy load and then pinpoint the exact location of potential bottlenecks. It also lets you determine the maximum number of users the application can sustain with an acceptable response time.
Figure 3. Performance Monitor
A Web server using ASP technology assigns each page request a thread from a pool of threads created in advance; if all the threads are in use, subsequent page requests are queued. By monitoring the queue length in Performance Monitor, you can determine how many clients are waiting for a response from the server.
The two most common database stress problems unrelated to hardware are deadlocks and lock contention. When using the performance monitor that comes with the data store on the data server, monitor at least the following counters:
Lock requests per second
Number of deadlocks per second
Table lock escalations
A Web application should also take advantage of OLE DB resource pooling, which the middle-tier OLE DB Provider for Microsoft SQL Server 7.0 manages automatically. Create a connection in each page's data access object, then release it immediately. With this approach you can serve thousands of parallel users through only a handful of open database connections, which both protects the database and improves its stability. To verify this behavior, use Performance Monitor to track the number of user connections on the data server: as the number of visitors rises, the number of user connections should remain stable, because resource pooling controls the number of connections actually created on the data server.
Tuning the database-driven application is critical to reaching the expected performance, so it must be an important part of the development cycle. This process includes optimizing the amount of memory allocated, the placement of the application's files across disk drives and controllers, and the machine placement of the ActiveX components. Wherever possible, avoid marshaling data between processes, because it is a very expensive operation.
The test should run for at least half as long as the application is expected to run continuously in the production environment. Many problems, especially memory problems, show up only after the application has been running for a long time.
The readings you see in Performance Monitor are determined by the configuration of the application and the hardware, so while testing, watch for counters that deviate from their averages.
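The open-per-page, release-immediately pattern recommended above is what makes pooling effective: the pool hands the same physical connection to request after request, so the count of real connections stays flat as load grows. The toy Python pool below only illustrates that behavior, which OLE DB session pooling implements for you automatically; the class and its dictionary "connections" are illustrative stand-ins, not any ADO API.

```python
# Toy sketch of the resource pooling behavior described above: each page
# "opens" a connection and releases it immediately after use, and the pool
# keeps the number of real connections low and stable.
import queue

class ConnectionPool:
    def __init__(self):
        self._idle = queue.Queue()
        self._created = 0                   # how many real connections exist

    def acquire(self):
        try:
            return self._idle.get_nowait()  # reuse an idle connection
        except queue.Empty:
            self._created += 1              # only create when none is idle
            return {"id": self._created}    # stand-in for a real connection

    def release(self, conn):
        self._idle.put(conn)                # return to the pool, do not close

pool = ConnectionPool()

# Simulate 1,000 sequential page requests, each acquiring and releasing.
for _ in range(1000):
    conn = pool.acquire()
    # ... the page's queries would run here ...
    pool.release(conn)

print(pool._created)   # only 1 real connection was ever created
```

This is the effect to look for on the data server's user-connections counter: requests climb into the thousands while connections stay nearly constant.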
Monitoring Internet Information Server
When looking for system bottlenecks, the most important things to monitor on Internet Information Server are the following:
CPU utilization
Memory usage
Throughput
During the test, depending on the environment the application is designed to run in, you may want to track other performance counters as well; those are optional. If any of the following conditions occurs in the application, there may be a problem that needs correcting before final release.
CPU utilization. Falling CPU utilization accompanied by falling application performance may indicate a thread contention problem. When monitoring the user portion of CPU time, remember the rule that user time should account for 80-90 percent of total CPU time; kernel time above 20 percent therefore means there is contention in kernel-level API calls. On the other hand, to make efficient use of your investment in the machine, CPU utilization should exceed 50 percent at peak load; if it stays below that value, the bottleneck is somewhere else in the system.
Memory usage. Rising memory usage after the server application has run for a long time is a common problem, and it is exactly the kind of resource leak the test phase is meant to expose.
Throughput. Monitoring the number of Active Server Pages requests per second can tell you whether, and when, performance problems begin. In the production environment this value fluctuates normally, but carefully chosen thread and parallel-connection settings (as in the Web Application Stress Tool) simulate a steady request rate, so during the test a sudden drop in this value also indicates a problem.
Optional counters. The following are other things worth monitoring during the test.
Queue length. The ASP queue length shown in Performance Monitor will typically fluctuate up and down. If CPU utilization is very low and the queue length never grows, the site is stable and can handle far more than this load. If, however, the queue length fluctuates while CPU utilization stays below 50 percent, some requests are blocking somewhere and the application needs further tuning.
Browser response time. Periodically request an Active Server Page from a browser and note the response time, to make sure the test is running normally and that the Web site can still serve ASP pages. Doing this at least twice a day throughout the stress test is recommended.
Timeout errors. During these browser checks, watch for timeout errors returned by Internet Information Server; these errors may indicate that too many users are accessing the application at the same time.

Monitoring the data server
The MDAC services inside the data server, together with the formatting of data for display, usually consume more server resources than anything else in a Web application. When stress testing the application, therefore, pay special attention to the performance of these components, because they are tied to the data access and manipulation parts of the program. Database user connections, lock contention, and deadlocks are the main candidates for monitoring on the data server. Regularly check the process information in the database console (for example, Current Activity in SQL Enterprise Manager) and look for blocked server process IDs; blocking is a common reason for a data query to stop responding. It is a contention problem that usually can be resolved only by substantial changes to the database design or the program logic. Deadlock problems can be detected in several ways.
The most common method is to watch the value of the Number of Deadlocks/sec counter in Performance Monitor. The application must be checked for deadlocks and must handle them properly, because leaving the data server to pick the deadlock victim (that is, the user or session it terminates to resolve the deadlock) can cause the application serious trouble. When a deadlock occurs, the application should detect it and respond, generally by waiting a few milliseconds and then retrying; a deadlock is usually a timing-sensitive error, and retrying eliminates the problem.
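The wait-and-retry countermeasure just described can be sketched as follows. This Python sketch is illustrative only: DeadlockError stands in for whatever exception your database library raises when the server kills a session as a deadlock victim, and the retry count and delay are hypothetical values to tune for your application.

```python
# Sketch of the detect-and-retry strategy for deadlocks described above.
import time

class DeadlockError(Exception):
    """Stand-in for a driver-specific 'chosen as deadlock victim' error."""

def run_with_deadlock_retry(operation, retries=3, wait_seconds=0.005):
    """Run operation(), retrying if the data server picked it as a victim."""
    for attempt in range(retries + 1):
        try:
            return operation()
        except DeadlockError:
            if attempt == retries:
                raise                      # give up after the last retry
            time.sleep(wait_seconds)       # wait a few milliseconds, then retry

# Demonstration: an operation that deadlocks twice, then succeeds.
calls = {"n": 0}
def flaky_update():
    calls["n"] += 1
    if calls["n"] < 3:
        raise DeadlockError("chosen as deadlock victim")
    return "committed"

print(run_with_deadlock_retry(flaky_update))   # committed, on the third try
```

Because deadlocks are timing-sensitive, the retried attempt almost always runs against a different interleaving of transactions and succeeds, which is why this simple strategy works.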
Evaluating the Stress Test Results
After the stress test, compare the data gathered during the test with your target values; some corrections will probably be needed to meet users' requirements. The following are the areas to check and evaluate when making those corrections:
Hardware
Database design
ActiveX components
Client cursors
ASP execution
IIS load

Hardware
A hardware upgrade is probably the simplest and cheapest way to increase the capacity of the application; it is often more economical than hiring a group of developers to rewrite code. For example, simply adding memory can sometimes double the throughput of the application. If, however, the test results show that CPU usage is the system's bottleneck, the upgrade can be quite expensive, because increasing the number or speed of the CPUs usually means replacing nearly the whole machine. Other hardware upgrades include faster disks and controllers and adding one or more network cards.

Database Design
If the evaluation points to the design of the database, look for hot spots. Analyze the deadlock data and confirm that the application has been optimized as far as possible to avoid deadlocks. Consider changing the data access logic if necessary to resolve contention. Test different indexing methods. Check the data server's query execution plans, confirm that the queries use the correct indexes, and so on.

ActiveX Component Optimization
ActiveX components that use the ActiveX Data Objects type library must be analyzed carefully. Do not rely on ADO's default property values; always specify properties such as CursorType and CursorLocation explicitly, to prevent surprises.

Client Cursors
If the Web application takes up a great deal of memory, the problem may be an incorrect cursor location. When using a client cursor (Recordset.CursorLocation = adUseClient), first understand that the client is actually Internet Information Server, not the browser. (The special case of Remote Data Services, which is not discussed here, is the exception.) A common developer mistake is to assume that a client cursor resides in the browser rather than on IIS.
Therefore, keeping in mind that the recordset is actually stored on the machine running IIS will make you consciously attend to the sensible use of resources. For example, suppose the application needs access to a table of valid state or country codes, and this information is stored on the data server. Generating a recordset with a client cursor, so that it resides on IIS, and then reading the codes locally is faster than fetching them each time they are needed, because it avoids an extra round trip to the data server whenever the application accesses a code.

ASP Execution
If an ASP page containing data access code takes a long time to run, move that data access out of the script and into an ActiveX component, preferably one placed in a Microsoft Transaction Server package (Windows NT) or a Component Services application (Windows 2000), depending on the system you use. Such compiled code runs much faster than the interpreted script code included in Active Server Pages.

Internet Information Server Load
Monitor the number and types of applications running on Internet Information Server. You may need to add more servers, move the application to another server, or run the Windows Load Balancing Service.
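The state/country code example above is a cache-on-the-Web-tier pattern: fetch the reference data from the data server once, keep it on the machine running IIS, and answer later lookups locally. The Python sketch below illustrates only that idea; fetch_codes_from_database and its contents are hypothetical stand-ins for the real query a client-cursor recordset would satisfy.

```python
# Sketch of the lookup-table idea above: one round trip to the data server,
# then every later lookup is served locally on the Web tier.

_code_cache = None          # lives on the Web server for the app's lifetime

def fetch_codes_from_database():
    # Stand-in for a query like "SELECT code, name FROM CountryCodes"
    # executed once against the data server.
    return {"US": "United States", "CA": "Canada", "MX": "Mexico"}

def country_name(code):
    global _code_cache
    if _code_cache is None:            # first use: the one round trip
        _code_cache = fetch_codes_from_database()
    return _code_cache.get(code)       # all later lookups are local

print(country_name("CA"))   # Canada, fetched once then served from cache
```

Every lookup after the first avoids a network round trip to the data server, which is exactly the saving the client-cursor recordset on IIS provides.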
Tips for Successful Stress Testing
Do:
Place all business logic in ActiveX components and call them from the ASP pages; encapsulate reusable code in libraries.
Use MTS (Windows NT users). It is a powerful way to provide extra threading and resource pooling for server-side COM objects, and it makes these objects easy to manage. MTS also provides event handling capabilities.
Use resource/connection pooling. This MDAC feature is on by default, but it should be monitored to make sure it is operating normally.
Tune the data store. Place stored procedure calls in suitable locations, and minimize the amount of data input, output, and conversion.
Whenever you have the chance, make sure the stored procedures and ActiveX components doing the actual work are not running under a debugger; a debugger forces threads to serialize.
In an intranet environment, move some of the workload to the client browser where possible.
Upgrade to Windows 2000. The performance and scalability enhancements of the newly released Windows 2000 are noticeable when an application is stress tested. If you cannot upgrade to Windows 2000, upgrading at least to Microsoft Visual Basic® Scripting Edition (VBScript) 5.0 is also worthwhile; the new version's performance and functionality are significantly improved as well.
Consider using Microsoft Message Queue Server (MSMQ). Sensible use of asynchronous messaging can significantly increase the number of user requests the application can handle.
Don't:
Place data access code in Active Server Pages; keep it with the business logic inside the ActiveX components.
Use HTTPS/Secure Sockets Layer unless it is absolutely necessary. This protocol is very expensive.
Use Application or Session objects unnecessarily.

Summary
The Internet gives your application far more potential users than a traditional client-server application would have.
More and more organizations realize that the Web is an important part of their business strategy, and they choose their technology to cope with the enormous demand. What these organizations need is infrastructure that is not only easy to use but also able to carry their user load. Thorough stress testing is therefore a very important step, especially when your application contains MDAC. Note that following best practices during development is a basic requirement for a system to run successfully under heavy stress; that is, capacity testing under load and tuning to meet the performance requirements must take place during the development process. The benefits of stress testing and of repeated tuning under load are obvious:
You gain the assurance you need that the application has sufficient throughput.
You understand the scalability characteristics of the application precisely, so that you can tune it to the desired performance requirements.
Design problems that reduce throughput can be found, and corrected, before the application is actually rolled out.
Because the application performs well, you earn a good reputation with users and business partners.