AI’s role in directing web traffic is evolving. AI search engines and tools like ChatGPT and Perplexity are shifting from directing users to external sources to providing information directly. This results in reduced traffic to news websites and blogs, potentially diminishing the need for users to click through to original sources.
AI search engines now focus on delivering information directly on the results page. This shift in user behavior could lead to a decrease in click-through rates (CTRs) and overall traffic to news sites and blogs. Reduced traffic translates to fewer ad impressions and potential revenue losses for news sites and blogs, which rely on advertising as a primary revenue source.
It could also impact subscription models, because fewer users may be willing to pay for content if they can access the same information directly from AI search results. This article discusses how AI search engines are driving less traffic to news sites and blogs, so keep reading to learn more.
Why are AI search engines sending less traffic to news sites and blogs?
Recent studies show that news sites and blogs receive 96% less referral traffic from AI engines than from traditional Google searches. This is because AI-powered search engines increasingly provide answers directly within the search interface rather than directing users to external websites. That said, AI’s current impact on overall traffic is still limited, with AI-driven traffic constituting only a small percentage of global online traffic.
However, AI is becoming more of an ‘answer engine’ than a search engine. AI search engines not only summarize existing content but also generate new content directly within the search results, which could change how users consume information. As a result, more people are likely to rely on AI-generated answers rather than visiting the original source. This shift poses a challenge for news publishers, who have historically relied on search engines for a large share of their traffic.
Some argue that AI interfaces are becoming the ‘gatekeepers’ of information, reshaping the attention market and the ability of news publishers to reach their audience. As AI companies increasingly train their models on data from the web, publishers are concerned about the lack of transparency and the potential for AI to compete with the very content it is trained on.
Website owners and publishers may be unaware of which AI companies access their content, as the bots are often not properly identified or disclosed. This is a serious issue, and news sites and blogs need to adapt to the evolving search landscape and find new ways to attract and engage users. It is advisable to focus on creating high-quality, unique content that AI cannot easily replicate, target long-tail keywords, and take advantage of social media and other channels to drive traffic.
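For publishers who want some control over AI crawling, one practical first step is declaring crawl preferences in robots.txt. The sketch below uses crawler tokens that their operators have publicly documented (treat the list as illustrative and subject to change); note that compliance with robots.txt is voluntary, and bots that do not disclose themselves ignore these rules entirely:

```
# robots.txt — illustrative sketch: opt out of some self-identifying AI crawlers
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

Google-Extended is a control token rather than a crawler: disallowing it opts content out of certain AI training uses without affecting normal Google Search indexing.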
Challenges faced by news sites and blogs due to AI search engines
News sites and blogs are among the hardest hit by AI search engines, facing drastic reductions in traffic, threats to their business models, and a lack of transparency in AI web crawler activity. AI-generated summaries and answers reduce the need for users to click through to the original content, with referral traffic from AI engines running 96% below that of traditional search.
This decline in traffic translates to lower page views, reduced ad revenue, and a diminished ability to sustain operations. News sites and blogs also face rising server costs from increased bot traffic and difficulty identifying and managing AI scraping bots.
AI bots used by search engines scrape and analyze content, driving up server load and hosting costs for publishers. Publishers are thus struggling to manage the rising costs of hosting and maintaining websites that are heavily accessed by AI bots. Identifying and managing these scraping bots is itself challenging, as many operators do not disclose identifiable user agents.
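Where a crawler does identify itself, publishers can at least measure its footprint in server logs. The sketch below flags requests whose User-Agent string names a known AI crawler; the token list is an assumption based on publicly documented crawler names and may go stale, and bots that spoof a browser user agent will not be caught this way:

```python
# Sketch: flag requests from self-identifying AI crawlers by User-Agent string.
# The token list is illustrative and may become outdated; spoofed UAs slip through.

KNOWN_AI_BOT_TOKENS = (
    "GPTBot",           # OpenAI
    "CCBot",            # Common Crawl
    "ClaudeBot",        # Anthropic
    "PerplexityBot",    # Perplexity
    "Google-Extended",  # Google AI training control token
    "Bytespider",       # ByteDance
)

def is_ai_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent string contains a known AI crawler token."""
    ua = (user_agent or "").lower()
    return any(token.lower() in ua for token in KNOWN_AI_BOT_TOKENS)

if __name__ == "__main__":
    # A log audit could use this to count AI-bot hits per day.
    samples = [
        "Mozilla/5.0 (compatible; GPTBot/1.1; +https://openai.com/gptbot)",
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Firefox/124.0",
    ]
    for ua in samples:
        print(is_ai_crawler(ua), "->", ua)
```

Running a classifier like this over access logs gives a rough lower bound on AI crawler traffic; it cannot reveal undisclosed bots, which is precisely the transparency gap publishers complain about.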
In response, publishers are taking legal action against AI companies for alleged intellectual property violations. Analysts warn that, if left unchecked, AI-generated content and the decline of traditional search traffic could undermine high-quality journalism, forcing trusted content creators out of business while the internet fills with lower-quality information.
Conclusion
News sites and blogs face significant challenges from AI search engines, including a drastic reduction in referral traffic and rising server costs due to increased bot traffic. Studies show news sites and blogs receive 96% less referral traffic from AI engines than from traditional search, a trend that threatens high-quality journalism. Search engines and tools like ChatGPT, Perplexity, and Google’s AI Overviews are shifting from directing users to external sources to providing information directly on the results page.
They are becoming answer engines rather than search engines. This rise in AI content could dilute search engines’ ability to identify and rank new, valuable content, filling the web with low-quality information. AI can create vast amounts of content quickly, but it can also flood search engines with repetitive material, making it difficult for news sites and blogs to compete.