PHP + MySQL: handling large-volume data storage

xiaoxiao2021-03-06  58


The database has about 100,000 records and is roughly 1 GB. How do I search it?

I'd like to ask the experts: I am debugging a forum and have written more than 100,000 posts into the database. The main content is concentrated in one table, which is now nearly 1 GB in size.

Searching this forum is very slow. Even a query like

SELECT * FROM `CDB_POSTS` WHERE Message = 'This broken old brochure is taken out from the arms to my hand.'

is very slow, and on my desktop machine it barely runs at all. But I've heard that Sun's forum (with nearly 100 sub-forums) takes no more than 20 seconds per search, so it should be doable.
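As a side note (my own suggestion, not from the thread): comparing the entire Message column with = or LIKE forces a full table scan. On MyISAM tables, MySQL's built-in full-text index lets the same kind of search use an index instead. A sketch, assuming the table is MyISAM and guessing at the other column names (pid, subject):

```sql
-- One-time setup: build a full-text index on the big text column.
ALTER TABLE CDB_POSTS ADD FULLTEXT INDEX ft_message (Message);

-- The search then becomes an index lookup instead of a table scan.
SELECT pid, subject
FROM CDB_POSTS
WHERE MATCH (Message) AGAINST ('broken old brochure');
```

Building the index on a 1 GB table takes a while, but it is a one-time cost paid at write time rather than on every search.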

When I run the search, CPU and memory usage are both very low; only the hard drive light keeps flashing. I keep thinking the program itself is not efficient enough. Could I rewrite it as "multi-threaded" to improve efficiency?

Moderators and experts, please advise!

Thank you.

--------------------------------------------------------------------------------

Some people suggest switching to Oracle.

I think everyone should note one thing: if you make Oracle execute a select * from xxxx where xxx = 'xxxxxxxxxxxxx';

on a 1 GB table, it will certainly not be much faster than MySQL.

For this kind of operation, none of the popular database systems can work miracles.

The key is your database design.

I built a search engine for business information and picked up some experience in the process, as follows:

1. Create a summary table.
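A summary table might look like this (the table and column names are my own illustration, not from the post):

```sql
-- Hypothetical summary table: one short row per post, so listings and
-- most searches never have to touch the huge Message column at all.
CREATE TABLE post_summary (
    pid     INT UNSIGNED NOT NULL PRIMARY KEY,  -- same key as CDB_POSTS
    subject VARCHAR(80)  NOT NULL,
    excerpt VARCHAR(200) NOT NULL,              -- first ~200 chars of Message
    replies INT UNSIGNED NOT NULL DEFAULT 0,
    INDEX idx_subject (subject)
);
```

The summary rows are tiny, so even a scan over 100,000 of them fits in a few megabytes instead of a gigabyte.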

2. Build indexes on numeric columns and on columns used in frequent queries. I remember seeing a post somewhere that said "don't create junk indexes", which depressed me for a long time. That post is still around... heh.

Recommendation: index the small columns that appear in WHERE clauses. Move the large columns, the ones searched with LIKE, into a separate table of their own.
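A sketch of what indexing the small filter columns could look like (the column names fid, dateline, and author are illustrative guesses, not from the post):

```sql
-- Hypothetical: index the small columns the forum actually filters on.
ALTER TABLE CDB_POSTS
    ADD INDEX idx_fid_date (fid, dateline),  -- browse one forum by date
    ADD INDEX idx_author   (author);         -- "posts by this user"
```

Each index speeds up its matching WHERE clause but slows down every INSERT a little, which is exactly why the "no junk indexes" advice matters.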

3. Do not SELECT * ... directly. First SELECT id ..., collect the IDs for the current page into an array, then fetch the full rows with SELECT * FROM xxx WHERE id IN (implode(',', $ids)).

If the fetched rows need to be JOINed with other data, do the JOIN in this second SELECT, because then the database only has to join the small set of selected rows.
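A minimal PHP sketch of this two-step fetch (the table name CDB_POSTS comes from the post above; the first query and the ID values are placeholders of my own):

```php
<?php
// Step 1 (assumed): an index-only query such as
//   SELECT id FROM CDB_POSTS WHERE fid = 7 ORDER BY dateline DESC LIMIT 20
// has already filled $ids with the IDs for the current page.
$ids = [3, 17, 42];

// Step 2: fetch full rows only for those IDs. intval() keeps the
// interpolated list safe against SQL injection.
$sql = "SELECT * FROM CDB_POSTS WHERE id IN ("
     . implode(',', array_map('intval', $ids))
     . ")";

echo $sql;
```

The point is that the heavy SELECT * only ever touches one page worth of rows, no matter how large the table is.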

4. Separate the data that is searched in full text from the data that is displayed. In database-design terms this is second normal form: put the full-text content in its own table, linked to the main table by the primary key.
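One way this split could look (table and column names are my own illustration):

```sql
-- Hypothetical split: small, frequently-scanned columns stay in CDB_POSTS;
-- the big Message text moves to its own table, joined by the primary key.
CREATE TABLE post_body (
    pid     INT UNSIGNED NOT NULL PRIMARY KEY,  -- = CDB_POSTS primary key
    message MEDIUMTEXT   NOT NULL
);
```

Queries that only list or filter posts now read short rows; the wide text rows are fetched only when a post is actually opened.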

5. Add an integer primary key (INT, SMALLINT, or TINYINT, depending on how many rows you expect).

6. Once your system is working, run EXPLAIN SELECT ... on its statements to see how MySQL actually handles each one.
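For example (the WHERE clause here is my own placeholder):

```sql
-- Hypothetical check: does this search actually use an index?
EXPLAIN SELECT id FROM CDB_POSTS WHERE author = 'xiaoxiao';
-- In the output, type = ref/range with a non-NULL key means an index is used;
-- type = ALL means a full table scan, i.e. the index is missing or unusable.
```

Running EXPLAIN on every query the application issues is the quickest way to find the one that causes the constant disk activity described above.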

7. Split the data across multiple tables (which can even live on different database servers).
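One common way to split (my own illustration, not from the post) is one table per time period, all with identical structure:

```sql
-- Hypothetical month-by-month split; the application picks the table
-- (or UNIONs a few of them) based on the date range being searched.
CREATE TABLE posts_2004_01 LIKE CDB_POSTS;
CREATE TABLE posts_2004_02 LIKE CDB_POSTS;
```

Each table stays small enough to search quickly, at the cost of the application having to know which table(s) to query.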

I have recently seen other people studying how to handle large databases too; their write-ups are on CSDN, and some good methods also turned up in a QQ group:

Http://blog.9cbs.net/ericfine/archi.../10/100130.aspx ASP.NET Processes 6 million records

http://community.9cbs.net/expert/to...temp =.2488367

I haven't found a good approach for the arrays I've been using recently! Sessions eat a lot of memory! Does anyone have suggestions?

Please credit the original source when reposting: https://www.9cbs.com/read-117895.html
