Microsoft has always been committed to search and has invested heavily in it, yet Bing has remained an also-ran in the search market. Yahoo! went even further: after acquiring just about every search engine other than Google and Bing, it crippled itself, abandoning all the search technology it had built up and bought over the years. Yahoo! search now runs on Bing's technology, which can't help but feel awkward: are you sure Yahoo! isn't working undercover for Google, conveniently taking one of its most important rivals off the board?
That said, Bing has always been friendly to SEOs, and Bing engineers have offered the industry plenty of constructive advice. On link building, for example, the most memorable line I have heard came from a Bing engineer; the gist was that the best external links are the ones you don't even know exist.
Bing greatly increases the URL submission limit
At the end of January, a post on Bing's blog announced that the URL submission tool in Bing Webmaster Tools had been upgraded, dramatically raising the number of URLs a webmaster can submit. The old limit was 10 per day and at most 50 per month; the new limit is 10,000 per day with no monthly cap, a thousandfold increase. The only restriction is that the site must have been verified in Bing Webmaster Tools for more than 180 days:
The post also included a very interesting passage. If search engines develop in this direction, SEOs may no longer need to worry about getting pages crawled at all:
We believe that enabling this change will trigger a fundamental shift in the way that search engines, such as Bing, retrieve and are notified of new and updated content across the web. Instead of Bing monitoring often RSS and similar feeds or frequently crawling websites to check for new pages, discover content changes and/or new outbound links, websites will notify the Bing directly about relevant URLs changing on their website. This means that ultimately search engines can reduce crawling frequency of sites to detect changes and refresh the indexed content.
In short: websites will notify Bing of URL changes directly, rather than Bing discovering new pages by monitoring RSS feeds or crawling frequently, so search engines can reduce how often they crawl a site while still finding new and updated content and keeping the index fresh.
Content indexing no longer depends on crawling?
I have always thought SEO has three big challenges: content, internal link structure, and link building. Internal link structure is about making the site crawlable: getting every page crawled, getting it crawled quickly, keeping crawlers away from useless content, and distributing link equity sensibly. The bigger the site, the harder this is to solve perfectly. Anyone who has done SEO on a large site knows this well: no matter how the site structure is adjusted, you never get 100% of pages indexed.
Bing's post tells SEOs that in the future they may well not have to worry about crawling at all. When a page is new, or an old page has been updated, you can submit the URL directly (via the webmaster tool or the API) without relying on Bing's spiders to discover and crawl it. Bing's spiders will crawl far less, but indexing of new content will not be delayed.
A subsequent SearchEngineLand interview with Bing search engineers confirmed that reducing crawling is indeed Bing's goal: within the next few years, they hope Bing will no longer rely on crawling to discover new content. Bing considers crawling websites inefficient and resource-intensive compared with webmasters submitting content directly, and says a submitted URL is a "very strong" signal for fetching the page. To get content indexed quickly, use the submission tool.
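For readers who want to try the API route, here is a minimal sketch in Python. It assumes the batch-submission JSON endpoint exposed by Bing Webmaster Tools (`SubmitUrlBatch` with an `apikey` query parameter and `siteUrl`/`urlList` fields, as I understand the published interface); the API key and URLs are placeholders you would replace with your own:

```python
# Sketch of a batch URL submission to Bing Webmaster Tools.
# Assumption: the JSON SubmitUrlBatch endpoint with apikey/siteUrl/urlList,
# per Bing's published API docs; the key and URLs below are placeholders.
import json
import urllib.request

BING_ENDPOINT = "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlBatch"

def build_submission(api_key, site_url, urls):
    """Build the request URL and JSON body for one batch submission."""
    request_url = f"{BING_ENDPOINT}?apikey={api_key}"
    body = json.dumps({"siteUrl": site_url, "urlList": urls}).encode("utf-8")
    return request_url, body

def submit(api_key, site_url, urls):
    """POST the batch to Bing and return the HTTP status code."""
    request_url, body = build_submission(api_key, site_url, urls)
    req = urllib.request.Request(
        request_url,
        data=body,
        headers={"Content-Type": "application/json; charset=utf-8"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    url, body = build_submission(
        "your-api-key",  # placeholder: generated inside Bing Webmaster Tools
        "https://www.example.com",
        [
            "https://www.example.com/new-page",
            "https://www.example.com/updated-page",
        ],
    )
    print(url)
    print(body.decode("utf-8"))
```

Submitting a new or updated URL this way replaces waiting for the spider to rediscover the page on its own.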
Indeed, on a large website a page may wait weeks or even longer to be crawled, so a new page, or an update to an old one, can sit undiscovered for weeks. Submitting the URL is simply faster.
Baidu has in fact been doing something similar for a while. Submitting content through the Baidu resource platform is a good way to get new pages crawled quickly: submitted pages can be indexed within hours, sometimes within minutes. The Baidu submission tool, however, does not seem to support old pages that have merely been updated.
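Baidu's link-push interface works in a similar spirit but with a plainer format. A minimal sketch, assuming the `data.zz.baidu.com/urls` endpoint that takes `site` and `token` query parameters and a plain-text body of one URL per line (the site and token values are placeholders from the resource platform):

```python
# Sketch of Baidu resource-platform link push.
# Assumption: the data.zz.baidu.com/urls endpoint with site/token parameters
# and a newline-separated plain-text URL list; values below are placeholders.
import urllib.request

BAIDU_ENDPOINT = "http://data.zz.baidu.com/urls"

def build_push(site, token, urls):
    """Build the request URL and plain-text body (one URL per line)."""
    request_url = f"{BAIDU_ENDPOINT}?site={site}&token={token}"
    body = "\n".join(urls).encode("utf-8")
    return request_url, body

def push(site, token, urls):
    """POST the URL list to Baidu and return the raw response text."""
    request_url, body = build_push(site, token, urls)
    req = urllib.request.Request(
        request_url,
        data=body,
        headers={"Content-Type": "text/plain"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

if __name__ == "__main__":
    url, body = build_push(
        "www.example.com",  # placeholder: site verified on the platform
        "your-token",       # placeholder: token issued by the platform
        ["https://www.example.com/new-page"],
    )
    print(url)
    print(body.decode("utf-8"))
```

The contrast with Bing's JSON batch interface is mostly cosmetic; both amount to pushing URLs to the engine instead of waiting to be crawled.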
Potential impact on the SEO industry
If Google follows suit in the future, the impact on the SEO industry could be considerable. The way search engines discover, crawl, and index new content would become simpler and more direct, and both the need to fine-tune site structure and the difficulty of doing so would shrink dramatically.
There are potential problems, though. Black-hat SEOs will obviously not pass up the opportunity. Spam pages have found it increasingly hard to get crawled, because following links from page to page acts as a kind of vetting and filtering; once spam pages can be actively submitted in bulk, what is the search engine to do?
Also, if search engines crawl less they save resources, but old, unchanged pages will be re-crawled far less often. Will that make link-graph calculations inaccurate or stale? Incidentally, I don't know why people keep saying links no longer matter for rankings; they are still very important.
The basic methods of SEO have not changed much over the past decade, but SEO remains a constantly changing industry.