Human or neural network: who writes better for SEO?

Team TeachWiki

Generative neural networks are taking work away from copywriters. Enterprising webmasters have calculated that texts written by robots are ten times cheaper and take noticeably less time to produce. Nothing but advantages.

The question is how search engines treat such content, and whether AI texts can reach the top of the search results.

Google has not officially called generated content a stop factor. The algorithms are not against artificial intelligence, as long as it is used to create high-quality content in line with E-E-A-T principles, with no spam or attempts to manipulate search results.

How does this work in practice? The Contentim studio has prepared a review of interesting experiments on the topic. Who wins the fight for the top of the search results: human authors or robots?

The Verge case

Let's start small. The Verge published a fun article about the best printer of 2023, written by ChatGPT. It was short, only about 300 words. Surprisingly, it made it to the first page of search results for a highly competitive query, outperforming CNET, TechRadar, The New York Times, and Forbes.

It would seem the robots are winning: if a resource is authoritative and has many high-quality links, generated content can outperform strong competitors. However, the page's position gradually began to decline, most likely due to negative behavioral factors, though the article is still on the first page.

Rahul Bhatia's experiment

SEO strategist Rahul Bhatia began receiving questions from clients about cutting copywriting costs with AI: would a neural network plus an editor be enough to fill and promote a website? He decided to run an experiment and find out how generated texts would affect rankings.

To do this, he bought a vacant domain in a tech niche with a solid reputation: DA 36 and 478 referring domains. Rahul's team planned a content strategy for the site, analyzing competitors and collecting keywords. Technical optimization followed the same approach as on their other projects, with one difference: all of the site's content was generated by neural networks.

From December 2021 to March 2022, more than 400 such articles of over 1,000 words each were published. Most covered general topics in list format and were optimized for low-competition keywords and Google's "Related Questions" section.
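The article doesn't say which model or tooling Rahul's team used, only that the content was generated by neural networks. Purely as an illustration, here is a minimal sketch of how such list-format articles could be batch-generated today with the OpenAI Python client; the model name, prompt, and keywords are all assumptions, not details from the experiment:

```python
# Hypothetical sketch: batch-generating list-format articles for
# low-competition keywords. Model, prompt, and keywords are illustrative
# assumptions; the experiment predates this client and these models.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

keywords = ["best usb hubs for laptops", "how to reset a smart plug"]

for kw in keywords:
    prompt = (
        f"Write a 1,000-word listicle targeting the keyword '{kw}'. "
        "Use the keyword in the title and in two subheadings, and answer "
        "common related questions in short, direct paragraphs."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption; any chat model would do
        messages=[{"role": "user", "content": prompt}],
    )
    article = response.choices[0].message.content
    print(article[:200])  # in the experiment, articles went to the site's CMS
```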

Rahul notes that AI handles the list format better than long reads: the neural network makes fewer mistakes in it. Two months into the experiment, organic traffic began to grow noticeably, up 327%. Interestingly, average engagement was 1.13 minutes, meaning visitors actually read the articles. Over the following two months, traffic rose by another 129%.

In May 2022, Google rolled out a large-scale update designed to demote low-quality content, pages with duplicate or weak meta tags, and keyword-stuffed texts.

Results

Immediately after the update rolled out, the site's traffic dropped by 23%. In June it fell by another 44.2%.

At that point, Rahul ended the experiment. It was clear that Google had learned to identify generated content and had begun to demote it in search results.

For comparison, he applied the same content strategy with all texts written by human authors. In six months, organic search traffic increased by 250%.

Neil Patel's experiment

Neil Patel's team came to similar conclusions. They had 100 AI-generated sites at their disposal; 681, actually, but most of those did not receive statistically significant traffic. Only 100 attracted 3,000 visitors per month from organic search.

Of those 100 projects, 53 featured purely AI-generated content, including tags and titles. The pages had no links, not even internal ones, because the neural networks could not add them on their own.

The remaining 47 sites were filled with content written by neural networks but edited by people, who added internal and external links, improved meta tags, uploaded images and videos, and generally made the content more useful to readers.

Here's what happened to these resources after Google's algorithm update.

Results

Organic traffic for the first group fell by 17.29%; for the second, by 6.38%. Positions dropped by 7.9 and 3.3 points, respectively. The conclusion: AI-generated texts perform better in SEO when a person edits them.

The Reboot experiment

The Reboot team ran the most controlled experiment of the three. In the previous cases, rankings could have been influenced by factors unrelated to the texts; here, the team tried to exclude them.

In addition, content for the test sites was created using the then-new GPT-4 model, which is more advanced than its predecessors.

For the experiment, Reboot invented non-existent keywords for which Google returned no results; a neural network helped generate them. The team then registered ten domains based on made-up words, with no links and no history:

  • five .co.uk domains for AI-generated texts

  • five .co.uk domains for author-written texts

The team built very similar sites from HTML templates and placed them on hosting with identical technical characteristics. Only the colors, fonts, CSS classes, and IDs differed.

The texts were prepared so that they were equally optimized for keywords.

The team was guided by the following principles (a rough verification sketch follows the list):

  1. The texts contain the same number of keyword mentions.

  2. Keywords appear in the same places in the text.

  3. The messaging is the same across all sites.

  4. The lengths of the texts are approximately equal.

  5. On every site, keywords are placed in positions that favor ranking, unless this contradicts rule 2.

  6. There are no external or internal links in the content.

  7. The domains have no additional content, pages, or authority signals.
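Reboot hasn't published the tooling it used to enforce these rules, but rules 1, 2, and 4 lend themselves to an automated check. A rough, hypothetical sketch for a pair of texts (all function and variable names here are invented for illustration):

```python
# Hypothetical sketch: verifying that two texts are equally optimized:
# same keyword count (rule 1), same keyword positions (rule 2), and
# similar length (rule 4). Not Reboot's actual tooling.
import re

def keyword_positions(text: str, keyword: str) -> list[int]:
    """Word-level positions at which the keyword phrase starts."""
    words = re.findall(r"\w+", text.lower())
    kw = keyword.lower().split()
    return [i for i in range(len(words) - len(kw) + 1)
            if words[i:i + len(kw)] == kw]

def check_parity(text_a: str, text_b: str, keyword: str,
                 length_tolerance: float = 0.1) -> None:
    pos_a = keyword_positions(text_a, keyword)
    pos_b = keyword_positions(text_b, keyword)
    assert len(pos_a) == len(pos_b), "rule 1: same number of keyword mentions"
    assert pos_a == pos_b, "rule 2: keywords in the same places"
    words_a, words_b = len(text_a.split()), len(text_b.split())
    assert abs(words_a - words_b) <= length_tolerance * max(words_a, words_b), \
        "rule 4: lengths approximately equal"
```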

The finished texts from people and from neural networks were run through AI detectors. At first, the tools did not notice a difference, but on retesting only one service, Crossplag, failed to distinguish machine-generated from human-written content.

All the articles were published within a single hour to rule out historical factors. Load-speed and uptime monitoring was set up, with data refreshed hourly. The sites were not mentioned anywhere, and during the experiment no one searched Google for the made-up keywords. Together, these measures minimized the influence of external factors.
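Reboot doesn't name its monitoring stack; something as simple as the following hourly availability-and-speed check would fit the description (the URLs here are placeholders for the made-up domains):

```python
# Hypothetical sketch: hourly load-speed and availability check for the
# test sites. The URLs are placeholders, not the experiment's domains.
import time
import requests

SITES = [
    "https://example-ai-site.co.uk",
    "https://example-author-site.co.uk",
]

while True:
    for url in SITES:
        try:
            response = requests.get(url, timeout=10)
            print(f"{url}: {response.status_code}, "
                  f"{response.elapsed.total_seconds():.2f}s")
        except requests.RequestException as exc:
            print(f"{url}: DOWN ({exc})")
    time.sleep(3600)  # re-check every hour, matching the experiment's cadence
```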

Results

Ranking data was collected once a day, and the first results came in once all the sites had appeared in the search index. The average position for generated texts was 6.6; for human-written texts, 4.4. In 21 of the 25 site pairs, the authors ranked ahead of the robots.
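With five sites per group, the 25 pairs presumably come from matching every AI site against every author site (5 × 5 = 25). A toy aggregation over invented rank data (the numbers below are placeholders, not Reboot's raw figures) shows how the averages and pairwise wins are computed:

```python
# Hypothetical sketch: average position per group and pairwise wins.
# Rank values are invented placeholders, not data from the experiment.
from itertools import product
from statistics import mean

ai_ranks = [7, 6, 8, 5, 7]        # placeholder final positions (AI sites)
author_ranks = [4, 5, 3, 6, 4]    # placeholder final positions (author sites)

print(f"AI average position: {mean(ai_ranks):.1f}")
print(f"Author average position: {mean(author_ranks):.1f}")

# A lower position number means a better ranking, so the author side
# "wins" a pair when its rank is strictly smaller.
wins = sum(a < b for b, a in product(ai_ranks, author_ranks))
print(f"Authors ahead in {wins} of {len(ai_ranks) * len(author_ranks)} pairs")
```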

Conclusions

The experiments show that in most cases, texts generated by neural networks lose out in SEO to texts written by human authors. When AI texts are edited by people, the results improve somewhat. Exceptions do occur, usually tied to external factors.

For now, artificial intelligence cannot match human authors in the quality and usefulness of texts, even for search engine optimization purposes. Don't risk your rankings: order texts from professionals. Write to us and let's discuss your project. We'll fill your site with high-quality expert content, and Google will appreciate it.
