E-commerce SEO guide: New documentation from Google
With COVID forcing many retailers online, there are more e-commerce options than ever. Google Search Central recently released new guidelines for developers to help improve search visibility for e-commerce sites. “When you share your e-commerce data and site structure with Google, Google can more easily find and parse your content, which allows your content to show up in Google Search and other Google surfaces. This can help shoppers find your site and products,” Google said in the guide.
Where content can appear. The guide says that e-commerce content can actually appear in more results than just traditional search. These include Google Search, Images, Lens, the Shopping tab, Google My Business, and Maps. “Product data is the most obvious type of e-commerce related content, but other types of information can also be useful to shoppers at different stages of their shopping journey,” according to the guide. Google recommends promoting content like product reviews, offers, customer service touchpoints, and even live streams.
Adding product data. Structured data can also help your e-commerce products show in Google search properties. The guide recommends the following ways to show Google what your products are:
- Include structured data in your site’s product pages.
- Tell Google directly which products you want to show on Google by uploading a feed to Google Merchant Center.
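To make the first option concrete, product structured data is typically embedded as JSON-LD in a `<script type="application/ld+json">` tag on the product page. The sketch below builds a minimal schema.org Product object in Python and serializes it for embedding; the product name, SKU, URLs, and price are invented placeholders, not values from Google's guide.

```python
import json

# Illustrative schema.org Product markup. All names, URLs, and prices
# below are made-up placeholders for demonstration purposes.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Running Shoe",
    "sku": "TRS-001",
    "image": "https://www.example.com/images/trs-001.jpg",
    "description": "Lightweight trail running shoe.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "89.99",
        "availability": "https://schema.org/InStock",
        "url": "https://www.example.com/products/trs-001",
    },
}

# Wrap the JSON-LD in the script tag that goes in the page's HTML.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(product_jsonld, indent=2)
    + "\n</script>"
)
print(snippet)
```

Google's documentation lists which Product and Offer properties are required versus recommended for rich results, so the exact fields worth including depend on the result types you are targeting.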
Read more: E-commerce SEO guide: New documentation from Google
The future of attribution is data-driven
As the industry continues to evolve, last-click attribution will increasingly fall short of advertisers’ needs. The most successful marketers will switch to a data-driven approach. While Google Ads offers data-driven attribution, some advertisers haven’t been able to use it due to minimum data requirements or unsupported conversion types. To help all advertisers take advantage of better attribution and improve their performance, we’re removing the data requirements and adding support for additional types of conversions. With these improvements, we’re also making data-driven attribution the default attribution model for all new conversion actions in Google Ads.
Better performance with better attribution
Unlike other models, data-driven attribution gives you more accurate results by analyzing all of the relevant data about the marketing moments that led up to a conversion. Data-driven attribution in Google Ads takes multiple signals into account, including the ad format and the time between an ad interaction and the conversion. We also use results from holdback experiments to make our models more accurate and calibrate them to better reflect the true incremental value of your ads. And like all of our measurement solutions, we respect people’s decisions about how their data should be used and have strict policies against covert techniques, like fingerprinting, that can compromise user privacy.
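The idea of distributing conversion credit across all the touchpoints on a path is often illustrated with Shapley-value-style credit assignment from cooperative game theory. The sketch below is purely illustrative: the channel names and conversion rates are invented, and Google's production model is proprietary and uses far more signals (ad format, time to conversion, holdback experiments) than this toy example.

```python
from itertools import permutations

# Toy conversion rates observed for each combination of channels.
# These numbers are invented for illustration only.
conversion_rate = {
    frozenset(): 0.00,
    frozenset({"search"}): 0.04,
    frozenset({"display"}): 0.01,
    frozenset({"search", "display"}): 0.06,
}

def shapley_credit(channels):
    """Average each channel's marginal conversion lift over all
    possible orderings in which the channels could be added."""
    credit = {c: 0.0 for c in channels}
    orderings = list(permutations(channels))
    for order in orderings:
        seen = set()
        for channel in order:
            before = conversion_rate[frozenset(seen)]
            seen.add(channel)
            after = conversion_rate[frozenset(seen)]
            credit[channel] += (after - before) / len(orderings)
    return credit

credit = shapley_credit(["search", "display"])
print(credit)  # search gets most of the credit; the total equals 0.06
```

Note how the channels' credits sum to the combined conversion rate, unlike last-click attribution, which would hand all credit to whichever channel happened to be last.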
More campaigns, more advertisers
Today, data-driven attribution supports Search, Shopping, Display, and YouTube ads. But we know that data-driven attribution can improve advertiser performance, regardless of campaign or conversion type. That’s why we’re adding support for more conversion types, including in-app and offline conversions. We’re also removing the data requirements for campaigns so that you can use data-driven attribution for every conversion action.
Read more: The future of attribution is data-driven
How Dennis Publishing made first-party data core to its business transformation
In a recent session at the MarTech Fall Conference, Dennis Publishing’s chief product & data officer, Pete Wootton, joined Jackie Rousseau-Anderson of customer data platform BlueConic to explain how the company is scaling its first-party data strategy, including the launch of ‘Autovia,’ a business unit that combines the power of content with e-commerce to establish a highly engaged auto buying audience.
The intersection of product and data
“Customer data has become an instrumental part of our business strategy,” said Wootton, adding that the company uses the consented data it collects to understand its audiences, inform engagement, and drive growth in all areas of the business, including advertising, demand generation, subscriptions, and e-commerce. “All of these efforts are predicated on having high-quality first-party data.”
Expanding CDP use cases beyond marketing
At the start of its CDP journey, Dennis Publishing had a very specific vision for how a CDP could empower its growth-focused teams. But as their knowledge of the technology grew, so did the possibilities.
“We started off with an idea of what our initial use cases would look like, but that evolved and changed over time, and we saw opportunities in certain areas that we didn’t when we began,” said Wootton.
Read more: How Dennis Publishing made first-party data core to its business transformation
MUM brings multimodal search to Lens, deeper understanding of videos and new SERP features
What is MUM?
Google first previewed its Multitask Unified Model (MUM) at its I/O event in May. Similar to BERT, it’s built on a transformer architecture but is reportedly 1,000 times more powerful and capable of multitasking to connect information for users in new ways.
MUM enhancements to Google Lens
Google demoed a new way to search that combines MUM technology with Google Lens, enabling users to take a photo and add a query.
Related topics in videos
Google is also applying MUM to show related topics that aren’t explicitly mentioned in a video. In the example Google demonstrated, the video does not explicitly say “macaroni penguin’s life story,” but Google’s systems are able to understand that the topics are related and suggest the query to the user. This functionality will be launching in English in the coming weeks, and the company will add more visual enhancements over the coming months. It will first be available for YouTube videos, but Google is also exploring ways to make this feature available for other videos.
More announcements from Search On
In addition to the MUM-related announcements above, Google also previewed a more “visually browsable” interface for certain search results pages, enhancements to its About this result box, a more “shoppable” experience for apparel-related queries, in-stock filters for local product searches, as well as the ability to make all images on a page searchable via Google Lens. You can learn more about those features in our concurrent coverage, “Google search gets larger images, enhances ‘About this result,’ gets more ‘shoppable’ and more.”
Read more: MUM brings multimodal search to Lens, deeper understanding of videos and new SERP features