Lucid visibility: How a publisher broke into Google Discover in less than 30 days from launch
Google Discover is one of the most sought-after traffic sources by publishers. It’s also one of the most confusing from a visibility standpoint.
For many, it’s an enigma. Some publishers receive millions of clicks per month, while others receive absolutely none. And a ton of traffic can turn into no traffic in a flash, just like when a broad core update rolls out.
Google has explained that it’s looking for sites that “contain many pages that demonstrate expertise, authoritativeness, and trustworthiness (E-A-T)” when describing which content ranks in Discover. Strong E-A-T can take a long time to build up, or so you would think. For example, when a new site launches with no history, no links, etc., it can often take a long time to cross the threshold where it can appear in Discover (and appear there consistently).
That’s why the following case study is extremely interesting. I’ll cover a new site, run by a well-known person in the tech news space, that broke into Discover a blistering four weeks after launch. It is by far the quickest I have seen a new site start ranking in Discover.
And if you’re in the SEO industry, I’m sure you’ll get a kick out of who runs the site. It’s none other than Barry Schwartz, the driving force behind much of the news we consume in the SEO industry. But his new site has nothing to do with SEO, Search Engine Roundtable, or Search Engine Land.
Or does it?
Let’s jump in.
Unrelenting writing and publishing
As many of you know, Barry’s work ethic is insanely strong. He’s written more than 30,000 articles about search. When he decides to do something, look out.
So as you can guess, Barry took his Search Engine Roundtable blogging process and applied it to Lucid Insider, his blog dedicated to news about Lucid Motors, the car manufacturer behind the luxury electric sedan Lucid Air. He blogs every day, with multiple posts covering what’s going on in the Lucid world.
Time To Discover Visibility (TTDV)
Haven’t heard of TTDV yet? That’s because I just made it up.
Lucid Insider started surfacing in Discover just four weeks from the time the site launched. For the most part, that’s before any (consistent) strong signals could be built from an E-A-T standpoint, before earning many links, before publishing a ton of content on the topic, etc.
Articles and Web Stories ranking in Discover
After Barry launched Lucid Insider, I pinged him and said he should build some Web Stories, especially as Discover showed signs of life for Lucid Insider. I have seen first-hand what Web Stories can do visibility-wise in search and Discover.
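If you want to verify whether Discover is actually sending traffic to a new site, the Search Console API exposes the same performance data as the Discover report in the UI. Below is a minimal sketch using the google-api-python-client package; the property name, credentials file and date range are placeholders, and the type value "discover" is what separates Discover data from regular web search data (note that Discover reporting has no query dimension).

```python
# Minimal sketch: pull Discover clicks/impressions per page from the
# Search Console API. Property name, credentials file and dates are placeholders.
from googleapiclient.discovery import build
from google.oauth2.credentials import Credentials

creds = Credentials.from_authorized_user_file(
    "credentials.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="sc-domain:example.com",   # hypothetical verified property
    body={
        "startDate": "2022-03-01",
        "endDate": "2022-03-31",
        "type": "discover",            # Discover data, as opposed to "web" search
        "dimensions": ["page"],        # Discover has no query dimension
        "rowLimit": 25,
    },
).execute()

for row in response.get("rows", []):
    page = row["keys"][0]
    print(f"{page}: {row['clicks']} clicks, {row['impressions']} impressions")
```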
Source: Lucid visibility: How a publisher broke into Google Discover in less than 30 days from launch
We’ve crawled the web for 32 years: What’s changed?
It was 20 years ago this year that I authored a book called “Search Engine Marketing: The Essential Best Practice Guide.” It is generally regarded as the first comprehensive guide to SEO and the underlying science of information retrieval (IR).
I thought it would be useful to look at what I wrote back in 2002 to see how it stacks up today. We’ll start with the fundamental aspects of what’s involved with crawling the web.
It’s important to understand the history and background of the internet and search to understand where we are today and what’s next. And let me tell you, there is a lot of ground to cover.
Our industry is now hurtling into another new iteration of the internet. We’ll start by reviewing the groundwork I covered in 2002. Then we’ll explore the present, with an eye toward the future of SEO, looking at a few important examples (structured data, cloud computing, IoT, edge computing, 5G).
All of this is a mega leap from where the internet all began.
Join me, won’t you, as we meander down the search engine optimization memory lane.
Clearly, the original diagram didn’t earn me any graphic design awards. But it was an accurate indication of how the various components of a web search engine came together in 2002. It certainly helped the emerging SEO industry gain a better insight into why the industry, and its practices, were so necessary.
Although the technologies search engines use have advanced greatly (think: artificial intelligence/machine learning), the principal drivers, processes, and underlying science remain the same.
An important history lesson
We use the terms World Wide Web and internet interchangeably. However, they are not the same thing.
You’d be surprised how many don’t understand the difference.
The first iteration of the internet was invented in 1966. A further iteration that brought it closer to what we know now was invented in 1973 by scientist Vint Cerf (currently chief internet evangelist for Google).
The World Wide Web was invented by British scientist Tim Berners-Lee (now Sir Tim Berners-Lee) in the late 1980s.
Why you need to know all of this
The web was never meant to do what we’ve now come to expect from it (and those expectations are constantly becoming greater).
Berners-Lee originally conceived and developed the web to meet the demand for automated information-sharing between scientists in universities and institutes around the world.
Source: We’ve crawled the web for 32 years: What’s changed?
GA4 isn’t all it’s cracked up to be. What would it look like to switch?
Google Analytics is the top player when it comes to tracking website visitors. Its value is reflected in its popularity: the platform is the market leader, boasting an 86% share. But with great value comes great responsibility, and Google Analytics falls short in that department.
Designed to maximize data collection, often at the expense of data privacy, Google Analytics and its parent company, Google LLC, have been on the radar of European privacy activists for some time now. Reports of questionable privacy practices by Google have led to legal action based on the General Data Protection Regulation (GDPR) that might result in a complete ban on Google Analytics in Europe.
On top of that, Google recently announced it will end support for Universal Analytics in July 2023, forcing users to switch to Google Analytics 4 (GA4). So, if the switch must be made, why not seek out a new analytics provider? Great free and paid solutions allow organizations to balance valuable data collection with privacy and compliance. With a GDPR-compliant analytics solution in place, your data collection becomes what it should be: predictable and sustainable.
Source: GA4 isn’t all it’s cracked up to be. What would it look like to switch?
What to look for in a technical SEO audit
According to TechRadar, more than 547,200 new websites are created every day. Google has to crawl and store all of these sites in its database, occupying physical space on its servers.
The sheer volume of content available now allows Google to prioritize well-designed, fast sites that provide helpful, relevant information for their visitors.
The bar has been raised, and if your site is slow or cluttered with unnecessary code, Google is unlikely to reward it with strong rankings.
If you want to jump ahead of your competitors, you have a huge opportunity to do so by optimizing your site’s code, speed and user experience. These are some of the most important ranking signals and will continue to be as the internet becomes more and more inundated with content.
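Site speed is easy to measure before you start optimizing. As a rough sketch, the public PageSpeed Insights v5 API returns the same Lighthouse data as the web tool; the example below assumes the Python requests package, the URL is a placeholder, and an API key is only needed at higher request volumes.

```python
# Rough sketch: fetch lab performance data for one URL from the public
# PageSpeed Insights v5 API and print a few headline metrics.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def performance_snapshot(url: str, strategy: str = "mobile") -> dict:
    """Return the Lighthouse performance score plus LCP, CLS and TBT."""
    resp = requests.get(
        PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60
    )
    resp.raise_for_status()
    lighthouse = resp.json().get("lighthouseResult", {})
    audits = lighthouse.get("audits", {})
    return {
        "performance_score": lighthouse.get("categories", {})
                                        .get("performance", {})
                                        .get("score"),
        "lcp": audits.get("largest-contentful-paint", {}).get("displayValue"),
        "cls": audits.get("cumulative-layout-shift", {}).get("displayValue"),
        "tbt": audits.get("total-blocking-time", {}).get("displayValue"),
    }

if __name__ == "__main__":
    print(performance_snapshot("https://www.example.com/"))  # placeholder URL
```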
Auditing your website’s technical SEO can be a dense undertaking with many moving pieces. If you are not a developer, some of these elements may be difficult to comprehend.
Ideally, you should have a working knowledge of how to run an audit to oversee the implementation of technical SEO fixes. Some of these may require developers, designers, writers or editors.
Fortunately, various tools will run the audits for you and give you all the comprehensive data you need to improve your website’s technical performance.
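If you want a feel for what those tools are checking under the hood, here is a deliberately simplified sketch of a few page-level checks (status code, response time, title tag, canonical and meta robots). It assumes the requests and beautifulsoup4 packages; the URLs are placeholders, and a real audit covers far more (redirects, sitemaps, structured data, internal linking and so on).

```python
# Simplified sketch of page-level audit checks; not a substitute for a full crawler.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://www.example.com/",        # placeholder URLs
    "https://www.example.com/blog/",
]

def audit_page(url: str) -> dict:
    resp = requests.get(url, timeout=30, headers={"User-Agent": "mini-audit/0.1"})
    soup = BeautifulSoup(resp.text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else None
    canonical = soup.find("link", rel="canonical")
    robots = soup.find("meta", attrs={"name": "robots"})
    return {
        "url": url,
        "status": resp.status_code,
        "response_time_s": round(resp.elapsed.total_seconds(), 2),
        "title": title,
        "canonical": canonical.get("href") if canonical else None,
        "meta_robots": robots.get("content") if robots else None,
    }

for u in URLS:
    print(audit_page(u))
```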
Not-so-seamless migration
GA4 introduces a different reporting and measurement technology that is neither well understood nor widely accepted by the marketing community. There is no data or tag migration between the platforms, meaning you’d have to start from scratch. The challenge grows with the organization’s size—you can have hundreds of tags or properties to move.
Limits on custom dimensions
A custom dimension is an attribute you configure in your analytics tool to dive deeper into your data. You can then pivot or segment this data to isolate a specific audience or traffic for deeper analysis. While GA4 allows you to use custom dimensions to segment your reports, there’s a strict limit—you can only use up to 50.
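For context on how a custom dimension gets populated in the first place, here is a hedged sketch using GA4’s Measurement Protocol: an event is sent with a custom parameter, which only becomes a reportable custom dimension once you register that parameter name in the GA4 admin interface. The measurement ID, API secret and the "article_category" parameter are placeholders, not values from the source article.

```python
# Hedged sketch: send a GA4 event with a custom parameter via the Measurement
# Protocol. The parameter becomes a custom dimension only after it is
# registered in the GA4 admin UI (subject to the per-property limits).
import requests

MEASUREMENT_ID = "G-XXXXXXXXXX"   # placeholder GA4 measurement ID
API_SECRET = "your_api_secret"    # placeholder Measurement Protocol secret

payload = {
    "client_id": "555.1234567890",  # identifies the browser/device instance
    "events": [
        {
            "name": "article_read",
            "params": {
                "article_category": "technical_seo",  # hypothetical custom dimension
                "engagement_time_msec": 1200,
            },
        }
    ],
}

resp = requests.post(
    "https://www.google-analytics.com/mp/collect",
    params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
    json=payload,
    timeout=10,
)
print(resp.status_code)  # a 2xx status means the payload was accepted
```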
Lack of custom channel grouping
Channel groupings are rule-based groupings of marketing channels and, when customized, allow marketers to check the performance of said channels efficiently. Unlike Universal Analytics, GA4 does not allow you to create custom channel groupings in the new interface, only default channel groupings.
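To make the idea concrete, a channel grouping is essentially a set of if/then rules applied to each session’s source and medium. The sketch below is illustrative only; the rules are simplified examples and are not GA4’s actual default channel definitions.

```python
# Illustrative sketch of rule-based channel grouping: map a session's
# source/medium pair to a named marketing channel.
def classify_channel(source: str, medium: str) -> str:
    source, medium = source.lower(), medium.lower()
    if medium in ("cpc", "ppc", "paidsearch"):
        return "Paid Search"
    if medium in ("email", "e-mail"):
        return "Email"
    if source in ("facebook", "twitter", "linkedin", "tiktok") or medium == "social":
        return "Organic Social"
    if medium == "organic":
        return "Organic Search"
    if medium == "referral":
        return "Referral"
    if source == "(direct)" and medium in ("(none)", "(not set)"):
        return "Direct"
    return "Unassigned"

# Example usage
print(classify_channel("google", "organic"))    # Organic Search
print(classify_channel("newsletter", "email"))  # Email
```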
Source: What to look for in a technical SEO audit
More news:
New TikTok Tool Surfaces Useful Insights For Marketers
Microsoft Bing drops anonymous sitemap submission due to spam issues