Episodes

  • The New Semantic SEO Stops AI Invisibility
    Dec 29 2025

    In this final episode of the first season, we focus on the Semantic Workflow created by Alexander Rodrigues Silva, who draws on 20 years of experience optimizing digital projects and connects Semantic SEO to Library Science and Information Science.


    In it, we explain why the Semantic Workflow was created and summarize its main points. We then look at Domain Analysis in more detail as a replacement for keyword research, and at how creating two further artifacts, a controlled vocabulary and a taxonomy, complements this new SEO strategy (see the sketch below).
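    To make those artifacts concrete, here is a minimal, illustrative Python sketch (my own example, not taken from the book): a small controlled vocabulary with preferred and alternative terms, and a toy taxonomy arranging the same concepts hierarchically.

    # Illustrative only: a tiny controlled vocabulary and taxonomy for a Semantic SEO project.
    controlled_vocabulary = {
        "semantic-seo": {
            "preferred_term": "Semantic SEO",
            "alternative_terms": ["semantic search optimization"],
            "definition": "Optimization focused on meaning, entities, and relationships.",
        },
        "domain-analysis": {
            "preferred_term": "Domain Analysis",
            "alternative_terms": ["knowledge domain analysis"],
            "definition": "Mapping the core concepts of a knowledge domain.",
        },
    }

    taxonomy = {
        "Semantic SEO": {
            "Domain Analysis": {},
            "Artifacts": {"Controlled Vocabulary": {}, "Taxonomy": {}},
        },
    }

    def print_taxonomy(node: dict, depth: int = 0) -> None:
        """Print the hierarchy with indentation to show broader/narrower relations."""
        for term, children in node.items():
            print("  " * depth + term)
            print_taxonomy(children, depth + 1)

    print_taxonomy(taxonomy)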


    This episode was based on the book "Semantic SEO - Semantic Workflow," available on Amazon.

    20 mins
  • Google is an information organizer
    Dec 22 2025

    The episode is based on an article by Birger Hjørland entitled "Fundamentals of Knowledge Organization," which examines in depth the field of Knowledge Organization (KO), especially within Library and Information Science (LIS).


    The author argues that KO is a vast interdisciplinary field, historically fragmented and excessively influenced by technological development, and outlines five technological stages, including manual indexing and citation-based retrieval.


    Most importantly, we explore the connection between the organization of information and the development of the most widely used search engine in the world: Google! Find out how Google is intimately connected to the organization of information.

    41 mins
  • The Death of Sources: Why AI Answers Are Triggering a Digital Earthquake and How to Survive the Attribution Crisis
    Dec 15 2025

    In this episode, we discuss the profound shift in the search paradigm brought about by the rise of Generative Artificial Intelligence (GAI) and Large Language Models (LLMs).


    It is based on the article I published yesterday, which I started after watching a video by Professor Jenna Hartel offering a detailed analysis of a conference paper by Olof Sundin. The paper theorizes how GAI is forcing a re-evaluation of the concepts of search, sources, and information evaluation.


    To write the blog post, I used Sundin's work as a starting point to argue that GAI is causing the "death of sources," since systems now provide direct answers instead of directing users to source documents.


    We then agree that this shift undermines the traditional evaluation of information and conclude that the solution for SEO (Search Engine Optimization) professionals is to focus on semantic SEO, making content structured and semantically rich so that it becomes a reliable source of facts that feeds AI algorithms.

    23 mins
  • The Semantic Backbone of Modern AI
    Dec 2 2025

    This episode is based on a post from the Semantic SEO blog and offers a comprehensive overview of the fundamentals, development, and advanced applications of ontologies in the technology landscape between 2022 and 2025.


    We define ontologies as the backbone of semantic understanding in Artificial Intelligence (AI) systems, detailing their essential components (classes, properties, and axioms) and distinguishing them from Knowledge Graphs (KGs).
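    As an illustration of these components (not code from the episode), the following Python sketch uses the rdflib library to declare two classes, a property constrained by a simple domain/range axiom, and a couple of knowledge-graph style instance facts; the namespace and names are hypothetical.

    # Minimal ontology sketch with rdflib: classes, a property, an axiom, and KG facts.
    from rdflib import Graph, Namespace
    from rdflib.namespace import RDF, RDFS, OWL

    EX = Namespace("https://example.org/onto#")  # hypothetical namespace
    g = Graph()
    g.bind("ex", EX)

    # Classes (schema level)
    g.add((EX.Author, RDF.type, OWL.Class))
    g.add((EX.Book, RDF.type, OWL.Class))

    # Property plus a simple axiom constraining its use (domain and range)
    g.add((EX.wrote, RDF.type, OWL.ObjectProperty))
    g.add((EX.wrote, RDFS.domain, EX.Author))
    g.add((EX.wrote, RDFS.range, EX.Book))

    # Knowledge-graph style facts (instance level) that the ontology gives meaning to
    g.add((EX.alexander_silva, RDF.type, EX.Author))
    g.add((EX.semantic_workflow, RDF.type, EX.Book))
    g.add((EX.alexander_silva, EX.wrote, EX.semantic_workflow))

    print(g.serialize(format="turtle"))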


    You will hear about Ontology Engineering, including development methodologies and the growing synergy with Large Language Models (LLMs), which are used both to accelerate ontology creation and to improve their accuracy.


    We conclude the episode by illustrating the versatility of ontologies across use cases, including Explainable AI (XAI), semantic information retrieval, and the integration of heterogeneous data in sectors such as healthcare and construction.

    59 mins
  • LLMs, Knowledge Graphs, and Semantic SEO: How the Fusion of Artificial Intelligence and Connected Data Redefines Search
    Nov 25 2025

    Hello! I recently had one of those conversations that makes us pause and reflect on the speed at which the world of search is evolving.


    In this episode, we dive into the heart of this change, exploring the synergy between three forces that are transforming the way we access and create information on the web: large language models (LLMs), such as GPT or Gemini, knowledge graphs (KGs), and the search engines we use every day.


    On the one hand, we have the fluidity of LLMs' language and, on the other, the precision and factuality of KGs, which map entities and their relationships. What happens when they come together? Search is no longer just a list of links; it becomes an intelligent system, capable of understanding the semantic intent of your query, resolving ambiguities, and delivering direct and accurate answers, rather than just blue links.


    We discuss the techniques that make this possible, such as Retrieval-Augmented Generation (RAG), which combats the obsolescence of LLMs and reduces the dreaded "hallucinations" by querying facts directly from KGs; a minimal sketch follows below.
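    The sketch shows the idea in miniature, under my own assumptions: a toy dictionary stands in for the knowledge graph, the retriever and prompt format are purely illustrative, and the grounded prompt would then be passed to whatever LLM you use.

    # Toy RAG sketch: retrieve facts from a tiny "knowledge graph" and ground the prompt.
    KG = {
        "semantic seo": [
            ("Semantic SEO", "focuses on", "meaning and entities"),
            ("Semantic SEO", "uses", "structured data (Schema.org)"),
        ],
    }

    def retrieve(query: str) -> list[tuple[str, str, str]]:
        """Return the triples whose key appears in the query (toy retriever)."""
        return [t for key, triples in KG.items() if key in query.lower() for t in triples]

    def build_grounded_prompt(query: str) -> str:
        """Prepend retrieved facts so the LLM answers from fresh, verifiable data."""
        facts = "\n".join(f"- {s} {p} {o}." for s, p, o in retrieve(query))
        return f"Answer using only these facts:\n{facts}\n\nQuestion: {query}"

    # The resulting prompt is then sent to the LLM of your choice.
    print(build_grounded_prompt("What is Semantic SEO?"))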


    And, of course, as an International SEO Specialist and consultant focused on Semantic SEO, the big question that drives me is: how does this redefine website optimization? The focus now shifts from the keyword to the meaning, structure, and semantic quality of content. It's a call for valuable content that demonstrates E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness) and for the correct implementation of structured data (Schema.org), ensuring that your brand is a well-defined entity in the search engine's knowledge map (see the example below).
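    As a concrete and deliberately generic illustration of that last point, the snippet below emits a Schema.org Organization entity as JSON-LD; every value is a placeholder to replace with your brand's real data.

    # Placeholder Schema.org Organization entity, serialized as JSON-LD.
    import json

    organization = {
        "@context": "https://schema.org",
        "@type": "Organization",
        "name": "Example Brand",                       # placeholder brand name
        "url": "https://www.example.com",              # placeholder site URL
        "sameAs": [                                    # profiles that disambiguate the entity
            "https://www.wikidata.org/wiki/Q0000000",  # placeholder Wikidata item
            "https://www.linkedin.com/company/example",
        ],
    }

    # Paste the output inside a <script type="application/ld+json"> tag on the page.
    print(json.dumps(organization, indent=2))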


    Finally, we leave you with a thought: will the convenience of having ready-made answers from AI make us lose the serendipitous discovery and critical thinking that comes from exploring the web?


    Join us to understand this complex yet fascinating interconnection and discover the new priorities for your SEO.


    You can use my Notebook to ask questions and learn more about this topic: https://notebooklm.google.com/notebook/a98aa6db-9931-473f-836c-6a0b0f4bb3a6

    39 mins
  • What is Semantic SEO?
    Nov 18 2025

    This episode offers a detailed definition of Semantic SEO, contrasting it with traditional SEO practices based on search volume and keywords. Semantic SEO is presented as a method that focuses on constructing meaning and aligning concepts to help search engines understand content, increasing its perceived quality and relevance.


    We emphasize that Semantic SEO does not use keywords and differs from terms like Entity SEO and Topic Cluster, although it does use entities to resolve ambiguities. The essence of this approach lies in analyzing domain knowledge and structuring the project from the inside out, rather than focusing externally on competitors or Search trends.

    29 mins
  • From Saga to ODKE+: The Automated Evolution of Trustworthy Knowledge Graphs and LLM Grounding
    Nov 11 2025

    Today, we have a different kind of episode. We analyze ODKE+, a pipeline that promises to solve a significant problem with the use of knowledge graphs in Semantic SEO projects: their freshness and reliability.


    The article comprehensively introduces ODKE+, a production-grade system developed by Apple for the automatic extraction of open-domain knowledge from web sources, using Large Language Models (LLMs) and ontologies. The system is designed to maintain up-to-date and complete knowledge graphs (KGs), addressing the challenges of volume, variety, veracity, and velocity associated with online information.


    ODKE+ employs a modular pipeline that includes an Extraction Launcher to detect obsolete facts, a hybrid Knowledge Extractor (based on patterns and ontology-driven LLMs), and a Corroborator that uses a lightweight LLM for grounding verification, ensuring high factual accuracy of 98.8% across millions of ingested facts. The system supports both batch and streaming modes, significantly improving data coverage and freshness compared to traditional methods. A schematic sketch of these stages follows below.
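    The sketch is only my own schematic reading of those stages (launcher, extractor, corroborator); the function names and logic are invented for illustration and are not Apple's implementation.

    # Schematic illustration of the pipeline stages described above (not Apple's code).
    from dataclasses import dataclass

    @dataclass
    class Fact:
        subject: str
        predicate: str
        obj: str
        evidence: str  # snippet of the web source the fact was extracted from

    def launch_extraction(kg_snapshot: dict) -> list[str]:
        """Extraction Launcher: pick entities whose facts look stale or missing."""
        return [entity for entity, facts in kg_snapshot.items() if not facts]

    def extract_facts(entity: str, page_text: str) -> list[Fact]:
        """Knowledge Extractor: pattern- or LLM-based extraction (stubbed here)."""
        return [Fact(entity, "describedBy", page_text[:40], evidence=page_text[:80])]

    def corroborate(fact: Fact) -> bool:
        """Corroborator: lightweight check that the fact is grounded in its evidence."""
        return fact.obj in fact.evidence

    snapshot = {"ODKE+": []}  # toy KG snapshot with one stale (empty) entity
    for entity in launch_extraction(snapshot):
        for fact in extract_facts(entity, "ODKE+ extracts open-domain knowledge from the web."):
            if corroborate(fact):
                snapshot[entity].append(fact)

    print(snapshot)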

    28 mins
  • Six Shocking Secrets: Unpacking the Transformer, Attention, and the Geometry of LLM Intelligence
    Nov 4 2025

    This episode is based on an article written by Alexander Rodrigues Silva of the Semantic SEO blog, which presents an in-depth analysis of the inner workings of Large Language Models (LLMs), particularly the Transformer engine and its central Attention component. The author shares six surprising discoveries that emerged from a series of interactions with his AI agent, offered as a service called Agent+Semantic.


    The explanations focus on how words acquire contextual meaning through initial vectors and internal Query, Key, and Value dialogues, showing that meaning is encoded as geometric directions in a multidimensional space (a minimal numerical sketch follows below). Finally, the text demystifies the concept of machine "learning," comparing it to a mathematical optimization process, like a ball rolling downhill on the cost function.
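    To make the Query, Key, and Value dialogue tangible, here is a small numpy sketch of scaled dot-product attention with tiny random vectors; the dimensions and weights are arbitrary placeholders, not values from the article.

    # Scaled dot-product attention in miniature: Q asks, K answers, V is blended.
    import numpy as np

    rng = np.random.default_rng(0)
    seq_len, d_model = 4, 8                  # 4 tokens, 8-dimensional embeddings
    X = rng.normal(size=(seq_len, d_model))  # initial, context-free word vectors

    W_q = rng.normal(size=(d_model, d_model))
    W_k = rng.normal(size=(d_model, d_model))
    W_v = rng.normal(size=(d_model, d_model))

    Q, K, V = X @ W_q, X @ W_k, X @ W_v

    scores = Q @ K.T / np.sqrt(d_model)             # relevance of every token to every other
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    contextual = weights @ V                        # context-aware vectors

    print(contextual.shape)  # (4, 8): each token now carries information from its neighbours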

    30 mins