Why will the Helpful Content Update prioritize experts?
Google's latest update, along with MUM, BERT, and several others, prioritizes content. Since Hummingbird, Google – and its competitors and admirers – have been trying to make search as semantic as possible. And in the absence of adoption – or in the misuse – of Schema.org, it is in the content itself that these semantic signals must be found.
In this context, the Helpful Content Update (HCU) aims to identify quality content produced by humans, for humans. It's a bold proposal. It puts Google face-to-face with AI-generated content on one side, and bad content producers on the other. To do this, it will need to seek support from a group that gains more relevance for Google every year: the experts.
Here, we will explain the thesis that experts are Google's preferred group, and how this impacts content production. The reasoning follows:
Purpose of the Helpful Content Update
This update, like others that came before it, aims to defend Google's greatest asset: reliability. The value of search engines comes from their ability to meet a demand. To fulfill an agreement. And that agreement is to deliver the answer to the user.
Unlike McDonald's, where we go in expecting a specific burger flavor, on Google we expect the answer to a question. Like fast food, we want it quick, hot, and tasty.
As a for-profit company, Google needs to ensure the relevance of its assets. The most important of these is its algorithm. By valuing it – and this is where HCU comes in – the company gains financially and also on another front that we cannot forget: politically.
HCU and financial valuation
Looking at Google's numbers, the company continues to bet on and invest in new businesses. The "Other Bets" segment of Google's portfolio has been consistently losing money for years. This is because they gamble: they believe that one of the dozens of startups they acquire or business models they create will yield monstrous returns.
HCU goes against the grain: it's a bet on the company's main asset. Its results will impact the company's main source of capital. And for that reason, its relevance is not insignificant.
HCU and political valuation
A constant risk arises when companies begin to engage with politics. In Brazil, business owners have had their tax secrecy breached, a move that will impact their companies. Their image is already tarnished.
On the other hand, scandals like those involving Eike Batista's companies show the risks of dealing with politics. Search engines are among the main drivers of decision-making today. The Band debate, the first presidential debate of the year, placed YouTube metrics alongside those of career journalists. Search engines are political.
HCU enters the game – also – as a strategy for distancing Google politically. It's no coincidence that the tool claims to "average the opinions of experts." Expertise that developed before and flourished during the pandemic will also be put to use in the more specific contexts of niche markets.
How does Helpful Content Update work?
The impacts and details of HCU are still to come. The update, which hasn't yet completed its full rollout, will impact websites. SEO gurus will be looking for case studies to attest to its quality and to sell the "how-to" of the process. And until then, maybe John Mueller will say something.
What we can verify is Google's history and what has already been done, in addition to the objectives listed above. HCU emerges as a follow-up to MUM. The Multitask Unified Model uses AI to build consensus among experts, and then delivers the results.
Both are part of the same movement on Google's part: one that is both algorithmic and political.
One aspect we know Google analyzes is page metadata. A good semantic structure will optimize the page's indexing. What we can see is that the content – the body of the page or article – will also come into play, with authority terms and reference terms being calibrated to inform the algorithm of the author's level of expertise.
And it is precisely at the interface between metadata and content that specialists gain an advantage over generalists.
Author metadata for HCU
The more authoritative the portal, the more detailed the metadata. In the finance field, for example, few are ahead of Investopedia in English. And its semantic structure of author, reviewer, and fact-checker in the metadata is impeccable.


The CBOT vs. the CME: What's the Difference? (investopedia.com)
Here we see the level of detail Investopedia goes into when defining its authors. The degree of specificity doesn't change for the reviewer and fact-checker. Appeals to authority and other metadata (links to social networks and websites, for example) are used in abundance. The goal? To build credibility with the search engine.
At this point we enter the realm of conjecture, but considering that this data tends to repeat itself across pages, it would be a great waste not to use it to evaluate rankings and cross-reference statistics. If I know that the "Investopedia Team" publishes dozens of articles on the subject monthly, and that it is read, accessed, and meets its readers' informational needs, then the Investopedia entity must have some value. Right?
From this point on, the author's metadata gains semantic meaning. The Investopedia Team writes in this way (article 1). And also in this other way (article 2). After 40 or 60 articles, it gains an identity. However, if I have 60 laypeople write these articles, what do I get? Inconsistency and incompetence.
How to optimize author metadata
HCU author metadata can be optimized. It's important to remember that this data carries semantic value and can refer to other entities. The closer these entities are to the author's semantic universe, the more relevant their mentions are in the author metadata.
This short checklist can help you put together your metatag:author biography.
- Educational institutions – acronym, course, and university (if applicable);
- Years of experience;
- Places where you have worked;
- Positions previously held;
- Published articles (scientific articles aside);
- Published books.
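As a sketch of how such a biography could be expressed in structured data, here is a hypothetical Schema.org `Person` object built in Python. All names, institutions, and URLs are invented for illustration; the property names (`alumniOf`, `worksFor`, `sameAs`, etc.) are standard Schema.org vocabulary:

```python
import json

# Hypothetical author metadata covering the checklist above.
# Every value here is invented; only the Schema.org property names are real.
author_metadata = {
    "@context": "https://schema.org",
    "@type": "Person",
    "name": "Jane Doe",
    "alumniOf": {  # educational institution: acronym, course, university
        "@type": "CollegeOrUniversity",
        "name": "University of Example (UEX), B.A. in Economics",
    },
    "description": "10 years of experience covering personal finance.",
    "worksFor": {"@type": "Organization", "name": "Example Finance Portal"},
    "jobTitle": "Senior Finance Editor",
    "sameAs": [  # links to published articles, books, and profiles
        "https://example.com/articles/jane-doe",
        "https://example.com/books/jane-doe",
    ],
}

# Serialized as JSON-LD, this would sit inside a
# <script type="application/ld+json"> tag in the page's <head>.
json_ld = json.dumps(author_metadata, indent=2)
print(json_ld)
```

The point is not the exact fields but the pattern: each checklist item becomes a machine-readable entity the algorithm can cross-reference across articles.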
Quality of original content
Now the quality of the original content comes under analysis. When I write my own texts, I guarantee uniformity. Certain expressions, figures of speech, and anecdotes tend to repeat. The subjects in which I am an expert also appear more frequently. They become clearer. And I make the search engine's job easier.
Consequently, if I am an authority, the author metadata associated with me tends to receive that score. It's tied to a valuable product. Relevant information. And the next quality text reinforces that impression. As does the next. And the next. And so on.
The author – both the metadata entity and the flesh-and-blood individual – will need to combine content expertise (a term I first saw in a post by Gustavo Rodrigues) with content quality. In the finance market, this is very common. Terms like Capex, net profit, amortization, and technical analysis are for the "insiders," and they mark competence. On the other hand, they alienate the majority of readers.
How to optimize your content experience
Unlike before, the updates require that you no longer invest in mechanically repeating keywords. Don't even think about keywords. The goal is to provide context to the information in a pleasant way.
Here, it's important to understand the dichotomy between easy and difficult versus simple and complex. What is complex can be easy, and what is simple can be difficult. The goal of content experience is to optimize complex content so that it becomes increasingly easier. And to understand that simplicity has no excuse for being difficult.
We've considered several strategies to help improve the information consumption experience:
- Be clear about the search intent: "What time is it?" doesn't require an answer longer than 5 characters (including the colon). However, "How to do semantic SEO on my blog" might require more than 3,000 words. This intent will guide your content.
- Seek out dialogue: Engaging content is dialogic. It's no coincidence that we want to talk to someone when we have a problem with the bank, or when we feel blocked in asynchronous, remote classes. Dialogic communication shows up in short sentences, in a middle ground between formal and casual, and in avoiding complex words.
- Write and revise: The original content, the first version, will rarely be the best. Even with years of practice, there will be a need to revise. If possible, let a few hours pass between finishing and revising. Ideally, a whole day.
- Grammar matters: Incorrect text is painful. Like a bumpy road: the view may be wonderful, but the ride hinders understanding. It definitely doesn't serve the user.
- Stay within your scope: Unlike what social media preaches, we don't seek information from those who know everything about everything. At least, Google doesn't want that. So focus on your expertise. Value your studies.
- Signal your limits: Since we don't know everything, make that explicit. "We won't go into that because we're not (insert profession here)." This gesture enhances your credibility. It shows that you know your limits.
How can we make them both work together?
Author metadata and content quality need to work together to tell Google that you deserve to be read. There are ways to make them cooperate, and one becomes clear in Investopedia's case: review.
The "reviewedby" metadata argument allows the injection of a second author, permitting two possible combinations:
| Author | Reviewer | Profile |
| --- | --- | --- |
| SEO Copywriter | Specialist | Electioneer |
| Specialist | SEO Copywriter | Qualifier |
In the "Investigator" profile, the Specialist will work on SEO writing. They need to have some understanding of the "how" and "why" of an argument, but will bring precision to the use of terms. Artistic freedom is somewhat restricted, but factual accuracy brings credibility.
In the "Qualifier" profile, the informational product will have a Specialist writing something more technical, and the SEO Writer will optimize it. This will lead to anecdotes or storytelling appearing in the middle of the text, supported by the technical explanations. It also favors something that Google already invests in: fact-checking.
Since the launch of Google's fact-checking search and the contracts between the company and Poynter, fact-checking has gained prominent attention from the search giant. This leads to a scenario where third-party validation is encouraged. And that's what a reviewer does: brings consensus to a page's metadata.
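In structured data, the author/reviewer pairing can be sketched with the Schema.org `reviewedBy` property. The example below, in Python for illustration, follows the "Qualifier" profile; all names, titles, and the headline are invented:

```python
import json

# Hypothetical JSON-LD for an article pairing a Specialist author with an
# SEO Copywriter reviewer (the "Qualifier" profile). Values are invented;
# "author" and "reviewedBy" are real Schema.org properties.
article_metadata = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "What Is Amortization?",
    "author": {
        "@type": "Person",
        "name": "Jane Doe",  # the Specialist who writes the technical draft
        "jobTitle": "Finance Specialist",
    },
    "reviewedBy": {
        "@type": "Person",
        "name": "John Roe",  # the SEO Copywriter who optimizes and reviews
        "jobTitle": "SEO Copywriter",
    },
}

# Serialized, this would go in a <script type="application/ld+json"> tag.
json_ld = json.dumps(article_metadata, indent=2)
print(json_ld)
```

Swapping the two `jobTitle` values (copywriter as author, specialist as reviewer) would express the "Electioneer" profile instead.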
This article was written by Victor Gabry, UX Researcher and SEO Copywriter at Americanas.com. Victor seeks to specialize in the areas of technology (SaaS) and finance to make technical subjects easy to understand.