Google's New PageRank Ranking Algorithm (http://www.colorbird.com)


1-1-1. Keyword Stemming

Another change at Google is that it has begun applying word stemming to queries. In other words, if you type "dietary" into the search box, you will also get some results for "diet". Google's own explanation confirms this: "Google now uses stemming technology. That is, it no longer searches only for pages matching the exact words you type, but also for words that are similar to some or all of your query terms. For example, for the query 'pet lemur dietary needs', Google also displays results for 'pet lemur diet needs' and other morphological variations. The variants it finds are highlighted in bold on the results page."

In fact, stemming had been showing up in results for some time, but we could not tell whether it was a trial or an experiment. Now Google has officially built stemming into its search technology.
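To make the idea concrete, here is a minimal sketch of stemming using the classic Porter stemmer from NLTK. This is only an illustration of the general technique, not Google's unpublished system; notably, a crude suffix-stripper does not map "dietary" to "diet", which suggests Google's matching is richer than plain stemming.

```python
# A minimal stemming illustration using NLTK's Porter stemmer.
# This is NOT Google's system -- just the textbook technique the
# article describes: reducing word forms to a shared stem so that
# "diets" and "dieting" match "diet".
from nltk.stem import PorterStemmer

stemmer = PorterStemmer()

for word in ["diet", "diets", "dieting", "needs", "lemurs", "dietary"]:
    print(f"{word:10s} -> {stemmer.stem(word)}")

# diet/diets/dieting all reduce to "diet"; note that a simple
# suffix-stripper leaves "dietary" as "dietari", so matching
# "dietary" with "diet" requires something more sophisticated.
```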

1-1-2. Informational Pages Come First

In the "new" Google search results, many of us have noticed that even for purely commercial queries, the top positions are often occupied not by relevant commercial sites but by large numbers of "informational" and "resource" pages. I do not think this necessarily means that Google is prejudiced against commercial sites.

I think the change in Google's current search results exposes the "true face" of some previously top-ranked pages: although they earned high PageRank, it was in name only, because they did not provide any substantial content. In the past it was not difficult to rank a website in the top ten: optimize the pages, get enough external links (regardless of whether those links shared the site's topic or were related at all), and the ranking followed. If a website has little content of its own, few topically related external links, and external links acquired entirely through link exchanges, its ranking is likely to suffer significantly in this Google update. Google's new algorithm appears to strongly favor topical sites that contain a great deal of practical content, and users like such sites.


1-1-3. Ongoing Adjustment of the Ranking Algorithm

Since November 15 last year, Google has been adjusting its ranking algorithm continually. Many websites whose rankings dropped sharply have not only recovered but now rank noticeably higher than before. We can expect Google to keep adjusting and changing the algorithm to improve the quality of its search results.

1-2. Google's Development Strategy

If a search engine could understand what the user really means, providing high-quality results would not be difficult. But when a user types "DVD player", do you know what he wants? To buy a DVD player, or to learn how to connect one to a TV? To read reviews of DVD playback software, or to find software that plays DVDs on his computer?

Google's development strategy differs from that of other search portals: its goal is to provide different types of search services to different types of users. Interestingly, the "new" Google has begun folding its other search tools into its primary search results.

Danny Sullivan of Search Engine Watch coined the term "invisible tabs" to describe how a search engine might try to deliver results closer to the user's intent. He believes that search engines such as Google are already able to draw on multiple resources (Google offers web search, directories, online forums, news, online shopping, book search, and so on) and feed information from those resources into their main search results.

1-2-1. Google's Shopping Search Engine: Froogle

In December 2002, Google launched a beta version of its shopping search engine, Froogle. At froogle.google.com, users can find websites to shop from online and compare the prices of similar products from around the world.

In addition, when a user performs a normal query whose terms are commercial in nature, Google retrieves relevant product information from Froogle and presents it at the top of the normal search results.

For online shopping sites, Froogle is a genuinely good choice: not only is listing your website free, the steps are also quite simple.

1-2-2. Google's Directory Search Service

The Google Directory is based on DMOZ, the largest human-edited web directory, combined with Google's PageRank technology, so that pages are listed in order of importance; the length of the green bar next to each entry indicates that importance.

In Google's normal search results, if a page is included in DMOZ, Google lists its DMOZ description and the corresponding category information. In addition, at the top of the normal results page you may also see directory categories that match the query.

Google's directory search service serves users who want to browse a particular topic. If your website has not yet been included in DMOZ, now is the time to act.

If you still doubt the importance of a directory listing, I suggest you read the last paragraph of Google's search tips: "When you are unsure how to narrow a query, we recommend using the Google Web Directory. It can effectively limit the search scope and avoid showing similar but irrelevant pages. For example, searching for 'Saturn' in the 'Astronomy' category returns information about the planet Saturn, while searching for 'Saturn' in the automobile category returns information about the Saturn car company. When the search scope is too broad, the directory service can effectively narrow it."

Google wants to know what kind of information users look for within each category. When we come to "topic-sensitive PageRank" below, you will understand how Google can provide search results that match a category's topic.

1-2-3. Google News Search Service

Google News retains the more valuable reports and headlines from the past 30 days, and users can use its news search system to find current events. News reports are ranked by publication date, the number of related reports, and the popularity of the news source.
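As a rough illustration of how those three signals might be combined, here is a hypothetical scoring sketch. The formula, weights, and numbers are all invented for illustration; Google News' real ranking function is unpublished.

```python
# A hypothetical scoring sketch for the three signals named above:
# publication date, number of related reports, and source popularity.
# The formula and weights are illustrative assumptions, NOT Google
# News' actual (unpublished) ranking function.
import math
from datetime import datetime, timezone

def news_score(published, related_reports, source_popularity, now=None):
    """Combine freshness, cluster size, and source weight into one score."""
    now = now or datetime.now(timezone.utc)
    age_hours = max((now - published).total_seconds() / 3600.0, 1.0)
    freshness = 1.0 / math.log2(1.0 + age_hours)  # decays with age
    cluster = math.log1p(related_reports)         # diminishing returns
    return freshness * 0.5 + cluster * 0.3 + source_popularity * 0.2

score = news_score(
    published=datetime(2003, 12, 1, 12, 0, tzinfo=timezone.utc),
    related_reports=42,        # other outlets covering the same event
    source_popularity=0.8,     # normalized 0..1 weight for the outlet
    now=datetime(2003, 12, 2, 12, 0, tzinfo=timezone.utc),
)
print(f"score = {score:.3f}")
```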

Even in a general search, some news results may appear at the top of the results page, so users see relevant news content when performing an ordinary query.

The beta Google News also differs slightly from other news services in that its sources are not limited to large media outlets like the New York Times: at present, Google News draws on some 4,500 sources. Although other large portal sites also provide links to important news sites, they often cannot match Google in breadth of coverage and freshness.

1-2-4. Google's Book Search Service

Following Amazon, Google has launched a beta book search service, Google Print. Users can find brief excerpts from books, reviews, author information, and so on, and can even preview pages from a book. The results also provide links to places where the book can be bought, alongside Google's related ads. Google has always explored how to raise the level of its search services, and this book search beta is another step in that continuous improvement. At present the number of books the service covers is still very limited, but Google has rarely let us down; let us wait and see.

1-2-5. Will Google Launch More Search Services?

Google keeps introducing new search services, and its purpose is to give users a better search experience. This is good news not only for users, who can find the information they need more conveniently, but also for websites: before long, a website will need to address only its target audience, instead of shouting itself hoarse amid ever more enormous search results.

Google's major change began on November 16, 2003. Online forums immediately began buzzing about it and produced all kinds of speculation. The update was bad news for some and good news for others (though most of the guessing came from those who saw it as bad news). Google itself said nothing, and of course we should not expect it to. This article therefore represents my personal speculation; I hope it offers readers some "more reliable speculation".

In addition to marked changes in the ranking of search results, Google has also made some fundamental changes to the format of the results. In my opinion, these changes clearly reveal Google's overall direction of development.

I will first review Google's recent changes, then offer my analysis of Google's new strategy and dispel some groundless rumors. Finally, I will give my personal suggestions for doing well in the "new" Google, for your reference.

1-1. Google's New Changes

Google is undoubtedly making changes that are new and completely different from the past, affecting so many queries that their search results look nothing like they did before. Before these "tremendous changes" there were also a number of minor ones; to the user, those small changes add up to many new features.

2-1. Google's Topical Bias

2-1-1. PageRank and the Problems with Google's Old Algorithm

The idea behind the PageRank system is to determine which sites are most important by simulating a "random walk" across the Internet. The system models a random surfer who clicks a random link on each page and presses the "Back" button when he reaches a dead end. The higher a page's PageRank, the greater the chance that this random surfer will discover it. The idea is actually quite sound: the more external links a page has, the more chances any surfer has to find it, and the more important it is judged to be. At the same time, under PageRank the more popular a page already is, the more it benefits from its links, because any surfer is more likely to discover them. For research papers in a specific field, PageRank is almost impeccable. For example, if a user searches for papers (or pages) on elementary particle research, the PageRank algorithm can quickly tell you which paper is the most relevant and most important, because those papers are cited most often by other papers. If all the resources on the Internet shared one topic, this would work perfectly. But as we know, the Internet covers millions of topics or more, and in real life users usually want information on one specific topic. So although PageRank considers every link, it ignores the topic of the linking page. Google tried to overcome this limitation by adding the anchor text of links to its ranking algorithm, but savvy search engine marketers deceived the algorithm by building networks of links with carefully chosen keywords. A whole cottage industry grew up around PageRank: the exchange and sale of links from high-PageRank pages. If a site could lift its ranking simply by buying or trading links from unrelated sites, PageRank could no longer deliver high-quality results for the vast majority of queries. We had every reason to believe that once the world's top search engine saw the quality of its results begin to deteriorate, it would not stand idly by.

2-1-2. A New Technology Debuts: Topic-Sensitive PageRank

In 2002, Taher H. Haveliwala, a Ph.D. student at Stanford University, published a very interesting paper called "Topic-Sensitive PageRank". More interesting still, Haveliwala joined Google a year later. Topic-sensitive PageRank addresses the basic system's problem by adding a "bias" to the random surfer's movement: the new random surfer has a definite set of interests and is more inclined to follow links on pages related to a particular topic. This relatively novel idea solves several key problems of search result quality. There is no doubt that Haveliwala has become a pivotal figure in the search engine industry; he has also done substantive work in other areas of search technology, including some very interesting research on computing PageRank more efficiently. In his original paper, Haveliwala describes how he used Stanford's web database to compute "topic-sensitive" PageRank scores for the 16 top-level categories of the ODP (Open Directory Project).
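Here is a minimal sketch, in Python, of the biased random walk just described: the only change from basic PageRank is that the surfer "teleports" back to pages of a chosen topic rather than to any page uniformly. The four-page graph and the topic sets are invented for illustration; Haveliwala's paper computes this at web scale for the 16 ODP categories.

```python
# A minimal sketch of topic-sensitive PageRank (after Haveliwala, 2002).
# Basic PageRank teleports the random surfer to ANY page uniformly;
# the topic-sensitive variant teleports only to pages of a given topic.
# The four-page graph below is invented purely for illustration.

def pagerank(links, teleport_set, damping=0.85, iters=50):
    """Power iteration with a biased teleport (personalization) vector."""
    pages = list(links)
    # Teleport probability mass is spread only over the topic's pages.
    v = {p: (1.0 / len(teleport_set) if p in teleport_set else 0.0)
         for p in pages}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {}
        for p in pages:
            inflow = sum(rank[q] / len(links[q])
                         for q in pages if p in links[q])
            new[p] = (1 - damping) * v[p] + damping * inflow
        rank = new
    return rank

links = {                      # page -> set of pages it links to
    "astro-blog": {"saturn-facts"},
    "saturn-facts": {"astro-blog"},
    "car-dealer": {"saturn-cars"},
    "saturn-cars": {"car-dealer", "saturn-facts"},
}
# Bias the walk toward the "astronomy" topic vs. the "autos" topic:
# the same graph yields different rankings for different topics.
print(pagerank(links, teleport_set={"astro-blog", "saturn-facts"}))
print(pagerank(links, teleport_set={"car-dealer", "saturn-cars"}))
```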
Although the topics and the data set of the study (about 80 million pages) were very limited, it was clear that the new system could improve search results and had the capacity to understand which topics interest users. When I went back and reread the paper last year, I noticed that the system Haveliwala describes still posed two problems for a search engine. As we will see below, both can now be properly solved. The first problem is scaling up the number of topics.

To improve search results, 16 topics are naturally not enough. However, because Google's PageRank computation is already expensive, Google was unlikely to adopt the new system unless it could be made practical. Given how far this field has advanced over the past year, I believe the number of topics is no longer a big problem. The second problem is determining which topic a user's query corresponds to: given "bicycle", does the user want to buy a bicycle or ride one? Below I will briefly describe how Google may match a given query to the most appropriate topics, and why some queries are more affected than others.

2-1-3. Applied Semantics and Its Patented CIRCA Technology

Applied Semantics, a specialist in Internet advertising software, was acquired by Google in April 2003. Google's aim was to strengthen its search and advertising capabilities, and the company's technology has had a profound influence on Google. For example, the content-targeted advertising offered to PPC advertisers in Google's AdWords system is Applied Semantics' AdSense technology. In fact, Google acquired more than AdSense: the technology behind AdSense is Applied Semantics' patented CIRCA technology. CIRCA is language-independent and highly scalable, encoding the conceptual relationships among millions of words and phrases in natural language. This ontology, backed by sophisticated search technology, is the basis for understanding the many meanings a word can carry; it lets computers manage and retrieve information more effectively and gives search users a way to explore knowledge. The power of CIRCA is that it can determine the concepts behind specific words or phrases. The technology is currently used to match advertisers' ads to relevant content, and it can also be applied to Google's keyword stemming system. It is especially worth noting that CIRCA can compute how close "phrase A" is to "concept B". For example, if the user queries "Colorado bicycle trips", CIRCA can conceptually link it to topics such as "Colorado", "cycling", and "travel". This means Google can compute the "distance" between the user's query and the different concepts in its database. That is very important.

2-1-4. Combining the Two: A Topical Search Engine

Now that we have some understanding of topic-sensitive PageRank and CIRCA, the next question is: how can the two be organically connected? In other words, how might Google combine these technologies to build a better search engine? First, imagine that Google has a large number of topics or concepts (up to several thousand) and has worked out how to compute a topic-sensitive PageRank for each. In the PageRank system Google uses today, the precision of the computed scores matters a great deal, but as topical algorithms develop, fast approximate scores may be all that is needed, and the paper above suggests this is readily achievable. Now, when a user issues a query, the terms in the query are matched against a number of topics in the CIRCA database.
Google is fully capable of blending the topic-sensitive PageRank scores according to the "distance" between the user's query and the topics in its database, thereby providing better search results. The closer the query is to a topic, the more effective that topic's PageRank score.
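Here is a hypothetical sketch of that blending: each topic's PageRank score for a page is weighted by the query's similarity to the topic (the kind of conceptual "distance" CIRCA computes), and the weighted scores are combined. All topic names, similarity values, and scores below are invented; the point is only the weighting scheme.

```python
# A hypothetical sketch of blending per-topic PageRank scores by
# query-topic similarity, in the spirit of Haveliwala's paper. All
# numbers are invented; Google's real topic set, similarity measure,
# and weights are unknown.

def blended_score(page, query_topic_sim, topic_pagerank):
    """Weight each topic's PageRank for `page` by the query's
    similarity to that topic, then normalize."""
    total_sim = sum(query_topic_sim.values())
    return sum(sim * topic_pagerank[topic].get(page, 0.0)
               for topic, sim in query_topic_sim.items()) / total_sim

# Similarity of the query "colorado bicycle trips" to each topic,
# as a CIRCA-like system might estimate it (invented values).
query_topic_sim = {"recreation": 0.6, "travel": 0.3, "business": 0.1}

# Per-topic PageRank scores for two candidate pages (invented values).
topic_pagerank = {
    "recreation": {"bike-tours.example": 0.09, "bike-shop.example": 0.02},
    "travel":     {"bike-tours.example": 0.07, "bike-shop.example": 0.01},
    "business":   {"bike-tours.example": 0.01, "bike-shop.example": 0.08},
}

# The recreational tour site outranks the shop for this query, even
# though the shop scores higher under the "business" topic.
for page in ("bike-tours.example", "bike-shop.example"):
    print(page, round(blended_score(page, query_topic_sim, topic_pagerank), 4))
```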

Since a given query is likely to match several topics in the database, any small errors in an individual topic's PageRank computation will be smoothed out across the multiple topic scores affecting the query, so approximate topic-sensitive scores are enough to deliver high-quality results. When no topic in the database matches the query, Google can fall back on the original PageRank system. If too many topics match, the new system can still compute topic-sensitive scores, although the results may be similar to those of the original algorithm; and if the matching topics are only weakly related to the query, the benefit is also greatly reduced.

2-1-5. Gathering and Interpreting the Data

Some queries may return drastically changed results, while for other queries only a few of the original top-100 pages have moved. One big problem in obtaining useful data is that most reports of changed results are self-reported. From this "self-reported" data it looks as though many of Google's results have changed completely; but the reason we see that picture is precisely that most of the self-reported data comes from the sites whose rankings fell. Instead of relying on these self-reports, we took another approach: we recorded search results from several available online sources and then observed how they changed. We randomly (without any preconception) studied several hundred real queries, measured the total amount of change for each, and found that the degree of change stayed quite consistent. In real life, the drastic change is the exception; it is a mistake to treat it as the rule.
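The before/after comparison just described is simple to sketch: record a query's top-100 results before and after the update, then count how many of the original pages fell out. The result lists below are invented stand-ins for real recorded results.

```python
# A sketch of the churn measurement described above: given a query's
# top-100 result lists before and after an update, count how many of
# the original top-100 pages dropped out. The two lists here are tiny
# invented stand-ins for real recorded results.

def top_n_churn(before, after, n=100):
    """Return how many of the first n 'before' results are missing
    from the first n 'after' results."""
    before_set, after_set = set(before[:n]), set(after[:n])
    return len(before_set - after_set)

before = [f"site{i}.example" for i in range(100)]
# Simulate an update that pushes 77 of the old top-100 out.
after = before[:23] + [f"newsite{i}.example" for i in range(77)]

print(top_n_churn(before, after))  # -> 77, as with "real estate" below
```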
2-1-6. A Topic Is Not a Keyword... and the System Is Not Perfect

Do not confuse "topics" with "keywords". A topic represents a broad subject, such as "computing" or "online marketing"; a specific query (keyword) such as "laptop rental" or "email marketing" is linked to those broader topics. However, some of the results Google currently returns show that the topic of a query can be matched incorrectly. For example, users searching for "laptop rental" usually want to rent a laptop computer, but in Google's results the top entries concern renting rooms at universities (www.google.com/search?sourceid=navclient&q=laptop+rental). What is going on? Look at the links pointing to those pages and you will find they mostly concern topics such as computing and housing (students renting on campus). Try other queries and analyze the external links of the top-ranked pages, and it becomes easier to understand why "laptop rentals" returns such results. Google may still return less-than-ideal results, and it may even be gamed again, but that chance keeps shrinking; meanwhile, we can expect Google to spend more time fixing these problems.

2-1-7. Why Did Some Results Change So Fundamentally?

We need not treat Google's new algorithm as an inscrutable mystery. Just look at the real data and you can see why some queries are more vulnerable than others. Take "real estate": following the methodology of Scroogle.org, 77 of the top 100 pages fell out of the top 100. For the more specific query "Colorado real estate", 24 of the top 100 pages were affected. Among the pages that were swept away, I saw one titled "Southern California Real Estate".

Interestingly, if you run the more specific query "Southern California real estate", you will find that page in second place. In other words, these pages were not penalized by Google; their rankings fell only because they did not match the topic of the query. There is also a small number of competitive queries whose result rankings were not affected at all. This phenomenon has been cited as evidence for all sorts of guesses about Google's ranking algorithm, but I think the explanation is quite simple. Take "search engine optimization": in its results, the list of the top 30 pages has barely changed. Analyze those top pages and you will find that their external links are highly relevant to the topic; such pages would do well even under the basic PageRank system. Perhaps Google's new algorithm makes deeper adjustments, but I have not found a better guess so far. In any case, it hardly matters; all we need to know is this: however Google changes its algorithm, the secret of success remains quite simple. The winning sites are those with plenty of content and a large number of related links (both inbound and outbound). Sites that rely on doorway pages and link exchanges will not be so lucky.

3-1. Keyword Research: Expanding Your Coverage

The central idea of any search engine strategy is to choose the right keywords to reach the target audience. Many webmasters stumble at this very step, fixating on a handful of popular keywords without realizing that this strategy is flawed. The main goal of keyword research should be to identify all the keywords your target audience might use, including "modifiers" such as brand names, geographic locations, and qualifiers. Although a page can only be optimized around two or three keywords, organically combining keywords with modifiers can greatly expand your coverage of the search results, as the sketch after this paragraph illustrates. Many site administrators who had implemented an effective keyword strategy barely noticed Google's November 15 update. The reason is simple: they targeted all possible related keywords, giving them broad coverage, so although rankings for individual generic keywords fell, the site's overall traffic was unaffected. If you target only a few keywords and they rank well, the strategy may look fine; but once the search engine changes its ranking algorithm, as Google just did, that lack of flexibility runs you into a wall. The right approach is to target every plausible related keyword, broaden your coverage of the search results, and lay a solid foundation for your search engine strategy.
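A minimal sketch of this keyword-times-modifier expansion; the core terms and modifiers below are invented examples.

```python
# A minimal sketch of the "broad keyword" idea: expand a few core
# keywords with modifiers (location, qualifier, brand, ...) to cover
# many specific queries instead of a handful of generic ones. The
# example terms are invented.
from itertools import product

core = ["real estate", "homes for sale"]
locations = ["", "colorado", "southern california"]
qualifiers = ["", "luxury", "cheap"]

keywords = sorted({
    " ".join(part for part in (q, loc, kw) if part)
    for kw, loc, q in product(core, locations, qualifiers)
})
for k in keywords:
    print(k)
# 2 core terms x 3 locations x 3 qualifiers -> up to 18 target phrases
```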
3-2. An Effective Website Structure

To make the "broad keyword" strategy work, the website must be organized so that search engines can smoothly crawl every page on it. To understand this better, let's look at how a search engine spider traverses a website. On its first visit, the spider fetches a file called robots.txt to determine what it is allowed to retrieve. When the spider discovers a link to your site from another site, it saves the linked page, provided robots.txt does not forbid it. That page is not necessarily your home page; it may be any other page on your site. The spider retrieves the page, extracts information about its content and all the links on it, and stores them in the search engine's database. If it judges your site important, it will come back later to crawl those links. If you place a set of links in the main part of every page (global site navigation links), the linked pages are very likely to be crawled. And if every page on the site links to the site's main content, it is easy for a search engine to traverse the whole site.
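Here is a small sketch of that first step, using Python's standard-library robots.txt parser; "www.example.com" and the user-agent name are placeholders.

```python
# A small sketch of the spider's first step described above: fetch
# robots.txt and check whether a page may be crawled. Uses only the
# Python standard library; "www.example.com" and "MySpider" are
# placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()                          # fetch and parse robots.txt

url = "https://www.example.com/products/widget.html"
if rp.can_fetch("MySpider", url):  # check the rules for our user agent
    print("allowed to crawl:", url)
else:
    print("robots.txt forbids:", url)
```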

Thus, the most effective website structure is a top-down, pyramid-shaped design. Larger websites (more than ten pages or so) should set up a site map page listing all of the site's internal links. Jakob Nielsen, the web usability authority formerly of Sun Microsystems, considers a site map an important mark of good site design. Many people reject plain text links for site navigation (presumably because they are not as pretty) and are enthusiastic about Flash or dynamic (DHTML) menus. Anyone using that kind of navigation should be careful: search engines currently do not handle it well. But don't worry; just put text navigation links at the bottom of each page. That way you keep the site style you love while giving the search engines something they can digest, and everyone is happy. Clear text links not only make your site easy for spiders to crawl, they also help your real visitors. One thing to watch when building text links: try to make all of the site's content reachable within two or three clicks, and create a site map to gather these links in one place.

3-3. Creating a Large Amount of Optimized Content

A site structure is like a skeleton: we already know how to build a search-friendly one, so now we fill it in. Remember all the related keywords and modifiers you found in step one (3-1)? Place the relevant primary keywords in the following locations:

1. The page title
2. The meta keywords and meta description tags in the HTML source
3. The main heading and subheadings on the page (h1 through h6)
4. The body text, in paragraphs and lists

Do not worry too much about notions like "keyword density". Just use the keywords naturally, with different word variations, supplemented appropriately. It is enough for a few keywords to appear in the visible text of the page; there is no need to "stuff" them into invisible corners. (A small checker for these placements appears at the end of this section.) Suppose you have planned a great deal of content for the site you want to build; creating a content-rich site can feel like a daunting task. Some people look for shortcuts, such as machine-generated "doorway pages". Do not believe in these speculative methods: they only increase the likelihood of a search engine penalty rather than the rankings you want. Besides, developing site content is not that hard. If you do the work yourself, break it into steps and it becomes much easier. For example, if you plan ten sections with ten pages each, you can write one or two pages a day and finish each section before starting the next. In two or three months you can build a genuinely useful site. And don't forget to keep adding fresh content: even if you add only a little each week, users will notice the quality, and the more content the site has, the better the chance that search engine users will find it.
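And the promised checker: a small sketch that parses a page with Python's standard-library HTML parser and reports where a keyword appears among the placements listed above. The sample page and keyword are invented.

```python
# A small sketch that reports where a keyword appears on a page:
# title, meta description, headings (h1-h6), and body text -- the
# placements listed above. Standard library only.
from html.parser import HTMLParser

VOID_TAGS = {"meta", "link", "br", "img", "hr", "input"}  # no end tag

class KeywordAudit(HTMLParser):
    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self.stack = []      # currently open tags
        self.hits = set()    # placements where the keyword appears

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if (a.get("name") == "description"
                    and self.keyword in a.get("content", "").lower()):
                self.hits.add("meta description")
        if tag not in VOID_TAGS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        if self.stack and self.keyword in data.lower():
            tag = self.stack[-1]
            if tag == "title":
                self.hits.add("title")
            elif tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
                self.hits.add("heading")
            elif tag in {"p", "li"}:
                self.hits.add("body text")

page = """<html><head><title>Colorado Bicycle Trips</title>
<meta name="description" content="Guided bicycle trips in Colorado.">
</head><body><h1>Bicycle Trips</h1><p>Plan your bicycle trip.</p></body></html>"""

audit = KeywordAudit("bicycle")
audit.feed(page)
print(sorted(audit.hits))  # ['body text', 'heading', 'meta description', 'title']
```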
3-4. Link Strategy

Now your website is not only attractive, practical, and rich in content, but its keywords cover every query your users might make. Every page is a model of optimization, and the pages are thoroughly interlinked... Can you relax now? Not yet; the revolution is not over. If you stop here, you will be disappointed. Search engines will not favor a website without any external links, because external links signal that other websites in your field recognize yours.