Today I posted a "5M-row (500W) data paging performance test" thread on the 9CBS forum, and to my surprise it drew a lot of attention, so I decided to write up my experience. I actually tested with 10M rows, but my machine has only 512 MB of RAM; with 10M rows the paging performance was still very good, but the machine could do nothing else, so I dropped to 5M. The performance is almost the same either way: 10M is just a bit slower and uses more memory, roughly 90 MB for 5M rows and 180 MB for 10M.
Post Address: http://community.9cbs.net/expert/topic/3132/3132779.xml? Temp = 9.943789E-02
First, with a large data volume, let's analyze where the bottleneck is. Take the classic, popular stored-procedure paging as an example. What does that stored procedure do? It builds a temporary table holding the IDs so that it can locate the data at a given position; but when the data volume is very large, say 5M or 10M rows, even reading just the IDs is hard work. The key to the problem is avoiding that repeated ID read. The database can't do this well, so instead we read the ID list into the program once and then fetch rows according to that in-memory index. Isn't that fast? Well, the first load is slow; but after reading the ID list once and organizing it into an ArrayList, we can locate the real ID of any required row from the elements of that list, and fetching a row by its ID is easy.
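The idea above can be sketched as follows. This is an illustrative sketch in Python, not the author's original .NET code; names such as `fetch_all_ids` and `fetch_rows_by_ids` are hypothetical placeholders for real database calls.

```python
class IdCachePager:
    """Cache the full, pre-sorted ID list once, then page by slicing it."""

    def __init__(self, fetch_all_ids, fetch_rows_by_ids, page_size):
        # One expensive pass: read every ID (cheap per row, heavy in total).
        self.ids = list(fetch_all_ids())
        self.fetch_rows_by_ids = fetch_rows_by_ids
        self.page_size = page_size

    def page(self, page_number):
        # Slice the in-memory ID list, then hit the database only for the
        # PageSize rows on this page (fast primary-key lookups).
        start = (page_number - 1) * self.page_size
        wanted = self.ids[start:start + self.page_size]
        return self.fetch_rows_by_ids(wanted)
```

A dictionary stands in for the database here; only the slow first load touches every row, and every later page costs one slice plus a handful of keyed lookups.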
Next, a new problem appears: what if the data is updated frequently? Rebuild the ArrayList every time? No, don't be rigid about it; why not update the ArrayList at the same moment the data changes? Then it never needs to be rebuilt. That is the core idea.
What follows is a somewhat fiddly process; it isn't hard, just tedious: every time the database is updated, the cached ArrayList must be updated as well. To achieve this I wrapped all of these operations into an entity-control class that uses DataRow and DataTable as the entity; the modification methods are implemented internally and keep the ArrayList in sync.
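The "update the cache in step with the data" idea can be sketched like this (again a hypothetical Python illustration, not the author's class; it keeps the cached IDs in the fixed, pre-agreed sort order without ever rebuilding the whole list):

```python
import bisect

class SyncedIdCache:
    """Cached ID list that is patched incrementally on every write."""

    def __init__(self, ids):
        self.ids = sorted(ids)  # the fixed sort order agreed in advance

    def on_insert(self, new_id):
        # Keep the cache sorted without a full rebuild: O(log n) search,
        # one in-place insertion.
        bisect.insort(self.ids, new_id)

    def on_delete(self, old_id):
        # Remove the ID if present; a miss is silently ignored.
        i = bisect.bisect_left(self.ids, old_id)
        if i < len(self.ids) and self.ids[i] == old_id:
            del self.ids[i]
```

In the author's design these hooks would live inside the entity-control class, so that every insert, update, or delete touches both the database and the cache in one call.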
That is roughly the principle. It is actually very simple; it's just a different way of thinking about the problem.
Now for the deficiencies of this method: the rules, such as sorting and filtering, must be specified in advance and cannot be changed on the fly.
As for further improvement, I can think of a way to raise performance even more, but I haven't implemented it yet and am still thinking it over, so I won't say more for now; if it works out I'll share it so nobody wastes time.
Finally, an introduction to my entity-control class ^_^
1. Implements the cache-acceleration approach described above as its core, and also supports conventional stored-procedure and ID-filtering paging, all integrated.
2. Needs no XML; it builds the mapping objects automatically from the database structure.
3. Resolves concurrent data-update conflicts (still in progress).
4. Can create a data table from an object.
5. Uses DataTable and DataRow as the entity, which I find quite universal and convenient.
6. Single-row queries, set queries, and "delete"/"modify" for single rows and collections all fall out naturally.
If you are interested, you can contact me:
CNLAMARHOTMAIL.COM
Published on June 30, 2004 3:31 PM
Comments
Reply: A little experience with paging large data volumes
2004-06-30 3:33 PM
cnlamar
The further improvements are index paging, a temporary data cache, and incremental cache updates. It's a bit of a hassle; the technical difficulty is low, it's mainly tedious, and a little dizzying.
Reply: A little experience with paging large data volumes
2004-06-30 3:36 PM
Lyshe
Everyone is talking about .NET. I haven't worked with it and don't have the time, so I'll have to sit this one out.
Reply: A little experience with paging large data volumes
2004-06-30 3:46 PM
cnlamar
It doesn't have to be .NET.
Reply: A little experience with paging big data (5M or 10M rows)
2004-06-30 4:01 PM
Pass
You said the IDs are read into the program — does that mean all the IDs are fetched the first time?
Reply: A little experience with paging big data (5M or 10M rows)
2004-06-30 4:40 PM
cnlamar
Yes, it is.
Reply: A little experience with paging big data (5M or 10M rows)
2004-06-30 6:30 PM
Pass
Your idea is really good; I tried it too, and the speed is indeed very fast.
But I have to say it isn't very practical. Why? A typical list page needs queries and sorting.
The reason it's fast now is that each fetch pulls only PageSize rows by primary key, which is of course quick. Add a LIKE query and I expect it will choke. There are other problems too, but I'll leave it there.
Reply: A little experience with paging big data (5M or 10M rows)
2004-06-30 6:34 PM
cnlamar
The defect is that the sorting and retrieval conditions must be fixed in advance
and can't change midway; I think I said as much above. As for practicality, opinions will differ; it won't suit every occasion, but I believe it has its range of application, and at least it works for me :D
Reply: A little experience with paging big data (5M or 10M rows)
2004-06-30 6:35 PM
cnlamar
For data retrieval, I will provide a reasonably general approach to solve this problem.
Reply: A little experience with paging big data (5M or 10M rows)
2004-06-30 8:00 PM
Binary
Brother, I tried what you posted and the speed is really good. But is there any way to handle LIKE retrieval? Perhaps storing the results in a temporary table would be a good method, but I don't know whether ADO.NET's update methods can still be used with a temporary table!!!
Reply: A little experience with paging big data (5M or 10M rows)
2004-06-30 8:28 PM
cnlamar
I intend to handle that the normal way: use my method where the rules fit, and fall back to a stored procedure or TOP filtering for searching. That's the current intention.
As long as the rules are fixed, I think this method even supports multi-table queries, since no particular problem seems to be involved :D
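The "TOP filtering" fallback mentioned above can be sketched as a query builder. This is a hedged illustration of the classic SQL Server 2000-era trick (that version lacks ROW_NUMBER); the table and column names are made up, and in real code the values must be parameterized, not interpolated:

```python
def top_paging_sql(table, key, page, page_size):
    """Build a SQL Server 2000-style TOP-based paging query.

    Skips the first (page-1)*page_size keys with an inner TOP subquery,
    then takes the next page_size rows ordered by the key column.
    """
    skip = (page - 1) * page_size
    if skip == 0:
        # First page needs no subquery at all.
        return f"SELECT TOP {page_size} * FROM {table} ORDER BY {key}"
    return (
        f"SELECT TOP {page_size} * FROM {table} "
        f"WHERE {key} > (SELECT MAX({key}) FROM "
        f"(SELECT TOP {skip} {key} FROM {table} ORDER BY {key}) AS t) "
        f"ORDER BY {key}"
    )
```

This stays fast while the WHERE clause can use the primary-key index, which is exactly the commenters' point: add an unindexed LIKE filter and the advantage disappears.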