SQL Server 2000 Stress Test



- Large record counts and large data volumes

This is one of two test reports I wrote shortly after joining the company. I am publishing it here in the hope that it will be useful to others.

With the release of Microsoft's new-generation database system SQL Server 2000 (hereinafter SQL2K), the industry has focused intently on its performance, and in particular on how it compares with Oracle 8. In a series of tests, SQL2K did not let Microsoft down: it achieved quite good results and has taken on the mission of going head to head with Oracle for the high-end database market.

In relational database applications, the performance of many databases drops sharply once the number of records or the volume of data becomes very large. In demanding applications, Oracle is therefore usually the user's first choice, but its high price is prohibitive. With the arrival of SQL2K, we hoped that Microsoft's new product would perform well with both large record counts and large data volumes, and here we test the performance of SQL2K in these two respects.

Test Plan:

The entire test is divided into two parts. The first part examines how the database performs with a large data volume, and the second part examines how it performs with a large number of records. For convenience, we wrote a program that performs the various database operations and records their execution times.

In both parts of the test we first perform the basic database operations against an empty database, recording the time each operation needs; we then insert a large volume of data (or a large number of records) and perform the same operations again, once more recording the times. Finally we compare the times before and after. Network transmission may introduce some error, but it does not have much impact on the test.

Test preparation:

★ Test environment:

OS: Windows 2000 Server

Database: Microsoft SQL Server 2000

Database Server: ADV2000

★ Create a database:

Using Enterprise Manager, create a database named TEST on the database server and set its size to 10 GB, so as to avoid the performance loss that would be caused by the server dynamically allocating additional disk space as the database grows.
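For reference, a minimal T-SQL sketch of this kind of pre-allocation is shown below (the database in the test was created through Enterprise Manager; the logical names and file paths here are assumptions, for illustration only):

    -- Sketch only: the test database was created through Enterprise Manager.
    -- Logical names and file paths are assumed for illustration.
    CREATE DATABASE TEST
    ON PRIMARY
    ( NAME = 'TEST_Data',
      FILENAME = 'D:\MSSQL\Data\TEST_Data.mdf',
      SIZE = 10240MB )   -- pre-allocate 10 GB so autogrow does not distort the timings
    LOG ON
    ( NAME = 'TEST_Log',
      FILENAME = 'D:\MSSQL\Data\TEST_Log.ldf',
      SIZE = 512MB )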

Then create a table named TAB in the TEST database with the following fields:

Field Name | Data Type | Length (bytes) | Nullable
ID | int | 4 | No (primary key, auto-increment)
Name | char | 10 | -
Caped | int | 4 | No
Photo | image | 16 | Yes

(Table 1)
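A minimal T-SQL sketch of this table follows; the name of the third column is reproduced as printed in the source, and the nullability of the Name column is an assumption:

    -- Sketch of the test table described in Table 1.  The column name Caped is
    -- taken verbatim from the source; the nullability of Name is assumed.
    CREATE TABLE TAB
    (
        ID    int IDENTITY(1,1) NOT NULL PRIMARY KEY,  -- auto-incrementing primary key
        Name  char(10) NULL,                           -- nullability assumed
        Caped int      NOT NULL,
        Photo image    NULL                            -- holds the image data used in the large volume test
    )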

★ Test program: we wrote a test program in Delphi 6 that connects to the database server ADV2000 through the ADO interface. The program performs the following functions (a T-SQL sketch of these operations follows the list):

1. Insert 2000 records (INSERT)
2. Select 1000 records (SELECT)
3. Update 1000 records (UPDATE)
4. Delete 1000 records (DELETE)
5. Insert 100,000 records with image data (for the large data volume test) / insert 1,000,000 records without image data (for the large record count test)
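The exact SQL statements issued by the Delphi program are not given in the article; the sketch below shows plausible forms of the four timed operations, with the literal values and ID ranges chosen only for illustration:

    -- 1. INSERT: executed 2000 times, one row per statement, as an ADO client would
    INSERT INTO TAB (Name, Caped) VALUES ('test', 1)

    -- 2. SELECT 1000 records
    SELECT * FROM TAB WHERE ID <= 1000

    -- 3. UPDATE 1000 records
    UPDATE TAB SET Name = 'updated' WHERE ID <= 1000

    -- 4. DELETE 1000 records
    DELETE FROM TAB WHERE ID <= 1000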

Testing process:

The test process is divided into a large data volume test and a large record count test:

Large data volume test:

In the large data volume test we expand the size of the database by inserting images, so all database operations, such as inserting data, work with rows that carry image data. We selected an image of 41,958 bytes, and the large-volume measurements are taken after 100,000 such records have been inserted, so the size of the data table can be approximated as (41958 * 100000) / (1024 * 1024) = 4001.43 MB. First, the test program runs the operations in order against the empty table: "Insert 2000 records -> Select 1000 records -> Update 1000 records -> Delete 1000 records", and records the time required for each operation. The results are shown in the figure and table below:

(Figure 1)

INSERT 2000 records | SELECT 1000 records | UPDATE 1000 records | DELETE 1000 records
132.781 s | 41.94 s | 0.841 s | 1.552 s

(Table 2)

The above measures the basic database operations against an empty table and records the time each requires. We then insert 100,000 records containing images, bringing the data volume of the table to about 4001.43 MB; the next step is to repeat the same database operations in this large-volume environment.
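One way to confirm the resulting table size is the system procedure sp_spaceused; a minimal check might look like this:

    -- Report the space consumed by TAB after the 100,000 image rows are inserted;
    -- the data size should be in the region of 4 GB.
    USE TEST
    EXEC sp_spaceused 'TAB'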

(Figure 2)

We then test again following the same steps: "Insert 2000 records -> Select 1000 records -> Update 1000 records -> Delete 1000 records", and record the time of each operation, as follows:

(Figure 3)

INSERT 2000 records | SELECT 1000 records | UPDATE 1000 records | DELETE 1000 records
139.05 s | 42.36 s | 0.971 s | 2.264 s

(Table 3)

Comparing the results, we found the following:

A large data volume has only a limited impact on SQL2K, and the loss of performance is relatively small, as the following chart shows:

(Figure 4)

The histogram shows that, for the SQL operations that take a long time, SQL2K essentially holds its level in the large data volume environment, with no great drop in performance. The last two operations, however, complete so quickly that the difference cannot be seen in the figure.

So let us look at the numbers instead. The table below shows, for each operation, the percentage increase in time in the large data volume environment relative to the initial empty database, i.e. (time with data - time when empty) / time when empty:

INSERT 2000 records | SELECT 1000 records | UPDATE 1000 records | DELETE 1000 records
4.72% | 1.001% | 15.46% | 45.88%

(Table 4)

The table suggests that the UPDATE and DELETE operations suffer serious performance loss in the large data volume environment. Consider, however, that the database server and the client are located on different machines. Even over a 100M internal network connection, the network introduces some delay. When an operation takes a relatively long time this delay has little effect on the ratio, but when an operation takes only a very short time the delay becomes very prominent.

As for the 45.88% performance loss in the DELETE operation, there is another reason: the database rebuilds the index when records are deleted, and when the amount of data is very large, rebuilding the index is time consuming.

Large record count test:

For the large record count test, in order to keep the large data volume test from affecting the results, we first drop the table TAB used in that test and recreate an identical new table TAB for this one. The steps are basically the same as before, and we again measure the time required for each basic database operation against the empty table. In this test, however, no images are inserted; the Photo field is left empty. The sequence is again: "Insert 2000 records -> Select 1000 records -> Update 1000 records -> Delete 1000 records", as shown below:

(Figure 5)

INSERT 2000 records | SELECT 1000 records | UPDATE 1000 records | DELETE 1000 records
16.274 s | 0.07 s | 0.04 s | 0.04 s

(Table 5)

In the large data volume test we inserted 100,000 records, which brought the data volume to about 4001 MB. For the large record count test, 100,000 records are no longer enough; we want a larger number of records so that any difference in operating efficiency before and after the data is added shows up more clearly. We therefore chose to insert 1,000,000 records.
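The rows in the test were inserted by the Delphi/ADO client; purely as an illustration, an equivalent data set of 1,000,000 rows with the Photo column left empty could be generated server-side with a simple loop:

    -- Illustrative only: generates 1,000,000 rows without image data,
    -- leaving Photo as NULL, comparable to what the test program inserted.
    SET NOCOUNT ON
    DECLARE @i int
    SET @i = 0
    WHILE @i < 1000000
    BEGIN
        INSERT INTO TAB (Name, Caped) VALUES ('row', @i)
        SET @i = @i + 1
    END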

(Figure 6)

After the large batch of records has been inserted, we run the next round of tests: "Insert 2000 records -> Select 1000 records -> Update 1000 records -> Delete 1000 records", recording the time of each operation, as shown below:

(Figure 7)

INSERT 2000 records | SELECT 1000 records | UPDATE 1000 records | DELETE 1000 records
16.574 s | 0.05 s | 0.05 s | 0.051 s

(Table 6)

Comparing the test results under the two conditions:

(Figure 8)

For the first operation, INSERT, SQL2K again shows little loss in the large record count environment, but the other operations still cannot be distinguished in the figure, so once more we compare the percentage change in time:

INSERT 2000 records | SELECT 1000 records | UPDATE 1000 records | DELETE 1000 records
1.84% | -28.6% | 25% | 27.5%

(Table 7)

Judging from the percentage change in time, these operations appear to lose some performance, but because they complete so quickly we cannot tell whether the differences are real or simply errors introduced by the network.

Test results:

Summary: although many of the SQL operations in the test take so little time that the results are affected by network transmission, we can still draw conclusions from the SQL operations that take longer:

Whether with a large data volume (several GB) or a large record count (millions of records), SQL Server 2000 maintains a high level of performance. In most cases the performance loss is under 5%, which fully meets the needs of our usual practical applications.

However, because of hardware and other limitations, we were not able to test even larger data volumes (tens or hundreds of GB, or even TB) or larger record counts (tens of millions of records or more).

Overall, SQL Server 2000 is already able to meet the requirements of ordinary enterprise applications.

