Clients are asking web agencies new, uncomfortable questions: Will this site perform in Traditional & AI Search? How do LLMs interpret our content? Are current build choices helping or hurting our visibility?
For this webinar, Cara Corbett will be joined by our Head of Technical SEO, Filip Ruprich, to discuss the modern SEO and GEO standards that underpin leading web design in 2026. Together, they’ll cover:
- How AI consumes, chunks, and summarises website content
- Why CMS structure, hierarchy, and semantic HTML matter more than ever
- Protecting SEO performance during website migrations
- Common build pitfalls even leading web design agencies are making
- What dev teams should own, and where specialist support makes sense
Scroll down to watch the recording.
Click here to access the presentation slides.
Webinar Transcript
00:00:00 –> 00:00:37
Thanks everyone for tuning in today, welcome to the webinar. This one’s called ‘Building Websites for Humans, Search Engines and Now AI’. I’m Cara Corbett, the partner growth manager at SUSO, and today I’m joined by our head of SEO, Filip Ruprich — SUSO webinar first-timer, so go easy on him. We’re here to talk about something very prevalent right now for pretty much anyone that handles or builds websites. That’s how do we structure and build websites properly so that they can be found and used by all the different types of visitors in 2026 — whether that’s humans, search engine bots, or now even agents.
00:00:38 –> 00:01:08
So here’s a quick look at what we’re going to be covering today. First, we’re going to walk through how these search engines find and extract website content, expanding from Google to AI tools like ChatGPT. And this is really going to give us some good perspective and insights into the technical foundations that we need to be considering when we’re building websites, which we’re then going to review in the second section together. And then after that, we’ll go through some of the most common mistakes that we see in website builds that can impact your online visibility in 2026.
00:01:09 –> 00:01:17
And of course, we’ll have time for some questions at the end, as we always do. As I said, you can pop those into the chat panel on the right side throughout the webinar.
00:01:17 –> 00:01:51
So let’s get into it and we’ll start off with a bit of a reality check. So in 2026, people are no longer only using Google as their main method of search, right? We’ve seen a huge increase in AI tools like ChatGPT and Claude, especially in the last few years. And because of this new user journey, the websites we build aren’t necessarily the final destination anymore. In fact, research from Bain & Company shows that roughly sixty percent of searches on these tools end without a single click because users are getting all that information that they need within the AI response.
00:01:51 –> 00:01:59
So there’s not really any need for them to click into a third party site to find what they’re looking for. So we’re calling this the zero click search reality.
00:02:01 –> 00:02:29
Additionally, we’re now even seeing the early stages of an agentic web where a user might instruct an AI agent to buy a new pair of running shoes for them or maybe make a reservation at a restaurant or a hotel for a specific time and date. And even though the implementation and the adoption of agentic checkout features remain limited at the moment, we just can’t ignore that the entire funnel from initial discovery to conversion can potentially be handled by an agent in a chat without the user visiting the website a single time.
00:02:30 –> 00:03:00
Now, these realities don’t make websites irrelevant, but it does change their role entirely. After all, AI tools have to get their verified and trusted information about a business or an organization or company from somewhere, right? And that’s where we have a really important role to play as website builders and owners, because we’re no longer just building something beautiful for humans to engage with. We’re actively building an information layer to train and fuel AI responses and action.
00:03:01 –> 00:03:14
Agents and bots are increasingly becoming the middleman between your website and the end user. And we need to make it as easy as possible for the middleman to find, parse, and extract your website’s information.
00:03:15 –> 00:03:20
But in order to do that, we need to understand how these search engines work at the foundational level first.
00:03:20 –> 00:03:23
So Filip, why don’t you tell us a bit more about this process?
00:03:26 –> 00:03:36
Sure. So on the left side, this is the model we all know. Search engine bots crawl your site, index it, and then rank pages based on the keywords, backlinks, and technical factors.
00:03:37 –> 00:04:14
So the goal was simple: get the click. On the right side, this is how AI engines work today. They don’t rank pages in the same way. An engine takes a user’s prompt and breaks it down into multiple sub-queries in the background. That’s called query fan-out. Then it retrieves specific pieces of content from the web to address all those sub-queries, often through existing indexes like Google or Bing. And then it combines those pieces, or chunks, and synthesizes them into a single answer. So instead of ranking pages, AI is assembling answers.
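To make the fan-out idea concrete, here is a minimal Python sketch of the pipeline Filip describes: one prompt fans out into sub-queries, chunks are retrieved for each, and the pieces are synthesized into a single answer. Everything here is hypothetical: the toy index, the hardcoded sub-queries, and the string-joining "synthesis" stand in for web-scale retrieval and an LLM.

```python
# Illustrative sketch of the "query fan-out" pipeline. All names and data
# here are hypothetical; real engines use learned query expansion, live
# web retrieval, and an LLM for synthesis.

# A toy "index": maps sub-queries to content chunks pulled from the web.
TOY_INDEX = {
    "best running shoes 2026": ["Brand A's Model X tops most 2026 reviews."],
    "running shoe price range": ["Mid-range trainers cost $100-$150."],
}

def fan_out(prompt: str) -> list[str]:
    """Break one user prompt into several background sub-queries."""
    # Real engines generate these with a model; hardcoded for illustration.
    return ["best running shoes 2026", "running shoe price range"]

def retrieve(sub_queries: list[str]) -> list[str]:
    """Pull the specific chunks that address each sub-query."""
    chunks = []
    for q in sub_queries:
        chunks.extend(TOY_INDEX.get(q, []))
    return chunks

def synthesize(chunks: list[str]) -> str:
    """Combine retrieved chunks into a single answer."""
    return " ".join(chunks)

answer = synthesize(retrieve(fan_out("What running shoes should I buy?")))
print(answer)
```

The point of the sketch is the shape of the process: the page is never "ranked" as a whole, only its chunks are pulled in to answer sub-queries.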
00:04:15 –> 00:04:26
And this is the fundamental shift in technical priority. We are no longer optimizing pages to be clicked. We are optimizing content to be extracted.
00:04:26 –> 00:05:02
So for years, we built pages. We structured content to be read from start to finish. But AI doesn’t think in pages. It thinks in chunks. And chunks are smaller self-contained pieces of information that can be pulled out, combined, and reused to answer a specific question. So instead of reading your whole page, AI is extracting the exact part it needs. And that means each section of your content now has to stand on its own, clear, structured, and complete.
00:05:03 –> 00:05:42
The shift is simple: we’re now optimizing pieces of knowledge, not pages. But before AI can use those chunks, it first has to find them. There are two ways your brand can show up in AI answers. First, through training data — that’s what the model already knows. If your brand is there, you have a head start. But most of the time, AI relies on the retrieval. It pulls content live from the web to answer a specific question. And this is the key point. Even if you are not in the training data, you can still win because what matters is simple.
00:05:43 –> 00:05:53
Can your content be found? If it’s not retrieved, it doesn’t exist. If you can’t be found, you won’t be used.
00:05:53 –> 00:06:11
So if your content can be found, the next question is, Can it be used? Because in AI search, it’s not enough to exist or even rank. Your content needs to be easy to extract, understand and reuse. And this is exactly where web development plays a critical role.
00:06:12 –> 00:06:18
A great build won’t guarantee visibility, but a poor one can completely block it.
00:06:18 –> 00:06:25
So if your content is hard to parse, poorly structured or hidden, AI simply won’t use it.
00:06:26 –> 00:07:03
So what matters? Let’s start with semantic HTML. This is how AI understands structure. Headings, lists, and tables are all signals that help AI split content into usable chunks. Here on the left side, you can see a wall of text, not just without much CSS, but also without the HTML tags. And on the right side, we have another text, structured with headings and subheadings, bullet points and tables, which is not just easier on the human eye, but also better chunked for LLM perception.
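As a rough illustration of why those tags matter, here is a simplified Python sketch of how a retrieval pipeline might split a page into heading-led chunks. Real chunkers are far more sophisticated; this only shows that headings give the splitter natural boundaries, while a wall of text gives it nothing to cut on.

```python
from html.parser import HTMLParser

class ChunkSplitter(HTMLParser):
    """Split an HTML document into (heading, text) chunks.
    A simplified sketch of AI chunking, not a production chunker."""

    def __init__(self):
        super().__init__()
        self.chunks = []            # list of (heading, text) pairs
        self.current_heading = None
        self.buffer = []
        self.in_heading = False

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            # A new heading closes the previous chunk.
            if self.current_heading is not None:
                self.chunks.append(
                    (self.current_heading, " ".join(self.buffer).strip()))
            self.buffer = []
            self.in_heading = True

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3"):
            self.in_heading = False

    def handle_data(self, data):
        text = data.strip()
        if not text:
            return
        if self.in_heading:
            self.current_heading = text
        else:
            self.buffer.append(text)

    def close(self):
        super().close()
        # Flush the final chunk.
        if self.current_heading is not None:
            self.chunks.append(
                (self.current_heading, " ".join(self.buffer).strip()))

page = """
<h2>Opening hours</h2><p>We are open 9-5, Monday to Friday.</p>
<h2>Pricing</h2><p>Audits start at $500.</p>
"""
splitter = ChunkSplitter()
splitter.feed(page)
splitter.close()
print(splitter.chunks)
```

With headings present, each section comes out as a self-contained chunk; strip the `<h2>` tags and the same splitter returns nothing usable.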
00:07:06 –> 00:07:45
The next technical factor important for AI visibility is schema markup. Marking up content with JSON-LD helps search engines and LLMs understand meaning and context. For example, here on the screen you can see two snippets from a schema validator test: the first one is the WebPage schema from susodigital.com, the other one is a blog article, also from SUSO Digital. Moving on, AI builds context through internal links. They show relationships between topics, not just navigation paths. So it’s important to link together related content.
00:07:46 –> 00:07:57
For example, a service page and a supporting blog article, a service and its sub-services, or blog articles from the same category.
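As a concrete illustration of the schema markup Filip mentions, here is a minimal JSON-LD Article block built in Python. The headline, author, and URLs are placeholders, not SUSO’s actual markup; the structure follows the public schema.org Article vocabulary.

```python
import json

# A minimal JSON-LD Article schema. All values below are illustrative
# placeholders; swap in your own page's details.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Building Websites for Humans, Search Engines and AI",
    "author": {"@type": "Organization", "name": "Example Agency"},
    "datePublished": "2026-01-15",
    "mainEntityOfPage": "https://example.com/blog/ai-ready-websites",
}

# This string is what you would embed in the page inside a
# <script type="application/ld+json"> tag.
snippet = json.dumps(article_schema, indent=2)
print(snippet)
```

Without a block like this, a model has to infer the author, date, and topic from free text; with it, those facts are machine-readable.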
00:07:57 –> 00:08:23
And next we have URL structure. URLs help explain what the page is about. Clear, descriptive URLs make content easier to understand, and folders show how everything is organized. On the screenshots, you can see an example of a nicely structured part of the web. Then, content needs to be visible. If it’s hidden, of course, it doesn’t exist.
00:08:24 –> 00:08:43
So, for example, AI may skip tabs and accordions that depend on JavaScript-heavy rendering. Always remember server-side rendering. So, overall, it’s pretty simple: the easier your content is to extract, understand, and connect, the more likely it is to be used.
00:08:47 –> 00:08:58
So here’s the thing. The decisions you make during a website build or migration can either set your online visibility up for success, or quietly destroy years of potential search equity.
00:08:58 –> 00:09:33
So let’s talk about why SEO has to be a part of the conversation from day one. I’m sure we’ve all been there. A project is nearing the finish line, the client is obsessed with the new UI, and there’s a massive push to hit the go-live date. In that environment, it’s pretty tempting to treat SEO as a phase-two task after the site is launched. But when we look at the data from hundreds of migrations, that decision creates immediate technical debt. If SEO isn’t in the room during the discovery, wireframe, and build phases, you’re not just launching a website, you’re launching a recovery project.
00:09:33 –> 00:09:51
So on the left, we can see the search afterthought model. You go through the discovery, the build, and the QA, focusing purely on the functional and the visual, and then you launch. Unfortunately, the result is things like broken redirects, drops in rankings and traffic, and inevitably a bad first impression with AI.
00:09:52 –> 00:10:11
Now, if the LLM crawlers hit a site that is structurally fragmented or explicitly blocks AI crawlers, then they stop citing it. And according to our data, a botched migration can lead to a visibility drop that takes over five hundred days to recover. That’s a year and a half of budget bleed and emergency reworks just to get back to where you started.
00:10:12 –> 00:10:16
Now on the right, this is the integrated workflow that we use at SUSO.
00:10:17 –> 00:10:38
So Discovery includes an SEO audit where we benchmark your current SEO and AI search performance before a single line of code is ever changed. And we’re also happy to review wireframes and make suggestions according to best practices. We ensure that high performing pages and content on the old site have a role to play on the new website and that those high value URLs get mapped to their closest equivalents.
00:10:39 –> 00:10:54
And then our QA includes a full site crawl, where we check the new URL structure, internal links, content, meta tags, schema markup, crawl issues, 404 pages, and the sitemaps, so that we’re able to catch any issues before they hit the live server.
00:10:54 –> 00:11:05
When you launch this way, traffic and rankings are preserved, the AI retrieval is green, and the brand maintains and grows both its SEO performance and its citations in AI answers.
00:11:05 –> 00:11:28
So the takeaway is pretty simple here. A beautiful site that can’t be read by an AI is a failed build. So to avoid this, we follow a rigorous four-stage migration checklist from pre-launch benchmarks to post-launch monitoring. And that ensures that the foundation is as polished as the interface. And we’ve included a QR code at the end of this deck so that you can download our full checklist and use it for your own projects as well.
00:11:29 –> 00:11:46
So it’s easy to talk about technical debt in the abstract, but Filip and our technical team see the same patterns emerging across almost every technical SEO audit that we perform. And so we’ve categorized those into three main fail points that may prevent a site from ever reaching an LLM’s response window.
00:11:46 –> 00:11:48
So Filip, why don’t you take us through what those look like?
00:11:50 –> 00:11:57
Sure. So the first type of issue is access. If AI can’t access your content, nothing else matters.
00:11:58 –> 00:12:33
So let’s start with JavaScript rendering. Modern frameworks are great for UX, but they can be a real risk for AI visibility. If your key content is loaded only on the client side, AI bots may never see it. The second issue is blocking AI bots. I’ve seen that a lot of websites unintentionally block crawlers like GPTBot or PerplexityBot. And if these bots can’t access your site, you won’t show up in AI answers. And the third one is performance, especially time to first byte. AI systems need fast access to content.
00:12:34 –> 00:12:47
So if your server is slow, they may skip it and use a faster source instead. So before anything else, fix access because you are not competing on content yet. You are competing on whether AI can even see it.
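The bot-blocking check Filip describes can be sketched with Python’s standard-library robots.txt parser. The bot names below are real AI crawler user-agents; the robots.txt rules and URL are illustrative, and a real audit would fetch the live file from the site’s own /robots.txt.

```python
from urllib import robotparser

# Sample robots.txt of the kind that unintentionally blocks AI crawlers:
# GPTBot is disallowed everywhere, everyone else only from /admin/.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /admin/
"""

AI_BOTS = ["GPTBot", "PerplexityBot", "ClaudeBot", "Google-Extended"]

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# True means the bot may fetch the page; False means it is blocked.
results = {bot: rp.can_fetch(bot, "https://example.com/services/")
           for bot in AI_BOTS}
for bot, allowed in results.items():
    print(f"{bot}: {'allowed' if allowed else 'BLOCKED'}")
```

Run against your own robots.txt, a check like this surfaces exactly the kind of accidental `Disallow: /` rule the webinar warns about.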
00:12:50 –> 00:12:58
And once AI can access your site, the next step is understanding it. And this is where a lot of websites lose out.
00:12:58 –> 00:13:11
So let’s start with structure. If everything sits flat, all pages on the same level, AI can’t tell what’s important and what’s not. What’s the core service? What’s the supporting content?
00:13:12 –> 00:13:37
Now, on top of that, no schema. So without structured data, AI has to guess. What’s the product? What’s the location? Who’s the author? And if it has to guess, it’s less likely to trust it. Then, internal linking. This is how AI builds context. If your pages aren’t connected in a logical way, AI can’t understand relationships between the pages. And without that, you can’t build topical authority.
00:13:39 –> 00:13:47
And the last common mistake here: hidden content. If important content is hidden behind heavy interaction, it may never be properly extracted.
00:13:49 –> 00:14:04
So the pattern here is the following: beyond site access, AI needs clear structure, meaning, and connection. Okay, we’ve covered a bit of technical ground, and it’s easy for a development team to maybe feel like the goalposts have moved again.
00:14:04 –> 00:14:33
But it’s important to remember that even a technically flawless build is the foundation, not necessarily a guarantee of success. In our experience working on joint projects with web dev teams, we’ve seen that non-technical factors also heavily impact how LLMs cite a website. So being conscious of these helps you stay confident when clients ask why they aren’t showing up yet. And you can clearly demonstrate that the technical foundation is solid and then help the client look at the other two pillars, which are content and authority.
00:14:33 –> 00:14:46
So for content to be cited, it has to be more than just live. It needs to be original, value dense, and fresh. LLMs look for grounding claims backed by original research or verified data.
00:14:46 –> 00:14:53
So if a client’s content is thin or it’s outdated, even the most precise semantic structure won’t be enough to earn a citation.
00:14:54 –> 00:15:21
Then there’s offsite validation. AI models prioritize brands that have third-party mentions and backlinks from high-authority, industry-relevant sources. In 2026, context and sentiment are key. The models are looking at social signals like case studies, Reddit discussions, and reviews on platforms like Google, G2, or Yelp, depending on the model, to verify that the brand is trustworthy.
00:15:22 –> 00:15:48
Now, what we’ve learned from our successful agency partnerships is that the most profitable projects have a very clear division of labor. The dev teams we work with do what they do best: building high-performing sites with logical structures and clean semantics under our SEO guidance. They ensure that the site is fast enough for both humans and bots, and that’s the part of the project success that they own. And in those collaborations, our role as SEO partners is to handle the AI-specific strategy that sits alongside the build.
00:15:48 –> 00:16:03
We provide the intent-mapped content requirements that fill the templates. We manage the digital PR to build the external trust signals that LLMs require. And besides that, we provide an in-depth AI visibility report and measure progress over time with our bespoke reporting dashboard.
00:16:03 –> 00:16:16
So the question we always have to answer at the start of a project is, how do we prove to the client which part of this system needs attention? How do we know if it’s a technical crawl error or maybe a lack of brand authority?
00:16:16 –> 00:16:26
And to solve that, we had to build a way to measure the invisible, which leads us to our AI visibility checker, which is a really cool tool that we have built in-house.
00:16:26 –> 00:16:36
Filip, maybe you want to show us how this works. I’ll just change my screen over to you. Sure, thank you.
00:16:37 –> 00:17:12
So the problem we had was that standard SEO tools don’t show how LLMs see your website. They track clicks and keywords, of course, but not how AI interacts with your content or your server. So that’s why we built the AI Checker, to make AI visibility measurable. On our website, you can find this AI search visibility tool under SEO Tools in the menu. And here we have a short form that you need to complete with the website, your email, and the main services or products you want AI to find you for.
00:17:13 –> 00:17:55
Click ‘Generate Report’ and you’ll receive the report in your inbox within a couple of minutes. I generated an example before the webinar to show you what’s inside. Starting from the top, we have the AI visibility score — for this tested domain it shows limited visibility. It’s calculated from over 100 data points across six components of AI visibility across ChatGPT, Perplexity, Gemini, and Grok. One of those components is answer frequency: how visible your brand is for the prompts we generated.
00:17:55 –> 00:18:37
During this process we create 20 prompts and check your brand against each one. If you’re visible, that contributes to your share of visibility — how you compare to competitors in your market. We also check online reputation and brand authority: not just whether your brand appears, but the sentiment behind it. The last two sections I want to highlight — since we already covered them in the webinar — are training data and technical accessibility.
00:18:38 –> 00:19:13
So starting from the training data, I can click on this section and I have all the information that I need. This is, of course, the brand’s presence within the datasets and sources used to train AI models. Right now, for this tested domain, it shows a weak presence. We can see that there was no information available in the training data for the GPT models checked. The good news is that there was some information in the Gemini 3 Flash Preview model, which means the brand influenced the training data for that model.
00:19:14 –> 00:19:54
But when we scroll down a bit, we can see that most of the LLMs and their models didn’t have any information about this brand. Which means two possible action points. I would definitely go and check the technical accessibility, whether these models have access to the website and can be influenced by it. But there are also offsite activities: starting from Google Business Profile, reviews, listings, citations, mentions, or digital PR, we can influence the knowledge about the brand.
00:19:56 –> 00:20:36
And moving on to accessibility, it’s also there. This is how easily AI tools are able to access, read, and use your website content. We are checking two things. The first is the AI bot accessibility analysis. In the table on the left, you can find all of the bots that we check, their names, and the purpose of each bot, and whether they are allowed to access your website or not, with details like HTTP status, robots.txt, and latency. And for this specific example, we can see that GPT Bot, the first one, was blocked.
00:20:36 –> 00:21:13
It doesn’t have access to the website, right? So that should be a really critical point to fix as soon as possible. And the second thing that we check is the AI data readability check. Here we check the time to first byte, and ideally we would like to have it below 0.2 seconds. But for this tested domain, we have almost two seconds, which means that if an LLM wanted to retrieve some information from the website, it would take a while.
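For illustration, here is a rough Python sketch of the time-to-first-byte measurement itself: the elapsed time between sending a request and receiving the first response byte. It spins up a local test server so it is self-contained; in practice you would point it at your own domain. The 0.2-second target mirrors the one mentioned above.

```python
import http.server
import socket
import threading
import time

# A tiny local server to measure against; a real check would hit your
# production domain instead.
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello")

    def log_message(self, *args):  # silence request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
host, port = server.server_address

def measure_ttfb(host: str, port: int, path: str = "/") -> float:
    """Seconds from request sent until the first response byte arrives."""
    with socket.create_connection((host, port)) as sock:
        start = time.perf_counter()
        request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
        sock.sendall(request.encode())
        sock.recv(1)  # block until the first byte of the response
        return time.perf_counter() - start

ttfb = measure_ttfb(host, port)
verdict = "OK" if ttfb < 0.2 else "slow, LLMs may skip this source"
print(f"TTFB: {ttfb:.3f}s ({verdict})")
server.shutdown()
```

Note this measures only the first byte, not full page load, which is exactly why a JS-heavy page can score well on TTFB and still be invisible to a bot that never executes the scripts.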
00:21:13 –> 00:21:49
There’s a strong chance LLMs will skip that website entirely — another critical action point. On the bot accessibility: when we looked at this report, it showed GPT wasn’t in the training data. And we can see why — GPT Bot, OpenAI’s training crawler, is blocked. You can diagnose that directly from this report. GPT User, which handles live retrieval rather than training, does have access — so the site could still appear in ChatGPT answers. But it won’t be in the training data while GPT Bot is blocked.
00:21:49 –> 00:22:00
So it’s not to say that this potential site might not ever show up, but they certainly will not be present in the training data if the GPT bot, which is their training robot, doesn’t have access to the site.
00:22:00 –> 00:22:31
Exactly. That’s a good point. And that’s what I wanted to show. There are many more sections on AI visibility in this tool — I encourage you to test it on a few websites. All sections are expandable and have video explainers. Feel free to reach out if you have any questions.
00:22:31 –> 00:23:12
Perfect. Thanks, Filip. Okay, so this tool is just one of the many free resources that we offer in the SUSO Partner Club, which is a completely free resource for agencies and consultants. I like to call it a community, if you will. It gives you and your team access to things like free training and workshops. I jump on a lot of one-to-one workshops with agencies where I teach them the foundational SEO and AI search techniques, the basics covered. We also have a free help desk, which connects your team directly to our SEO and GEO specialists.
00:23:12 –> 00:23:27
So if you’re ever working on something like a migration and you have a quick question or maybe a client has a specific question about AI search that you don’t know the answer to, we have an email or an actual form on our site that you can use and get a response within twenty four hours.
00:23:28 –> 00:23:32
And then we also do free white labeled audits and custom reports.
00:23:33 –> 00:23:48
So these are pretty cool too. You would work directly with me. You would give me the client’s domain, any specific areas that they wanna improve their visibility around, whether that’s keywords or prompts or topics or products or services. And if there’s any competitors, that’s really helpful too.
00:23:48 –> 00:24:17
And then we’ll have our team go and do some manual analysis on the site. We’ll package that up into a fifteen page document. It’ll be white labeled in your branding. We’ll get that to you in about two to three business days. So a lot of free value up front for agency partners. It really helps you build the skills to support your clients with SEO and GEO without investing in it. To date, we’ve got over twenty five hundred members, which is really cool. You can see some of the great agencies that we work with there.
00:24:17 –> 00:24:37
We’d also love to work with you as well. So go ahead and join today for free if you haven’t yet. There’s a QR code on this slide, and the next slide has all the QR codes for the things we mentioned today: the migration checklist and the AI search visibility checker that Filip just walked through.
00:24:37 –> 00:25:07
And then we’ve got the Partner Club there as well. A nice, short and sweet one today. We’ve got some time for questions, and I know there’s a bit of discussion happening in the chat. I’ll just pull up my question document here that I’ve got on the side. So Jamie, this is our head of content. He said, not a leading question as head of content: what do you think matters more for AI search, content or technical setup? Let’s argue. Ooh, okay. Well, Filip, do you want to take it, or do you want me to?
00:25:07 –> 00:25:16
We can both give our perspective. There’s been a really interesting conversation in the comments.
00:25:17 –> 00:26:03
Jamie, content and technical are equally important, but if you have superb content and poor technical accessibility, that content won’t help. So I’d bet on technical. I agree with Filip. The technical side cannot be overstated. As we saw in that AI visibility report, if the technicals aren’t right, there’s no way for that content to ever be cited.
00:26:03 –> 00:26:35
So they are both very important; you can’t do one without the other. But I lean more towards technical, and I always scream that from the rooftops. And that’s why we do a lot of work with PR agencies, training them on that technical skill gap so that the content they write is able to be seen. I don’t know, Jamie, if you’ve got a comeback. We can stay friends. Okay, glad to hear. Bhushan says, that’s interesting, it seems real SEO efforts are working twice as well compared to the snake oil being sold on social media about SEO being dead.
00:26:36 –> 00:27:03
It’s funny — I always say this in my workshops. When I first started in SEO, about seven or eight years ago, I was monitoring Help a Reporter Out as a digital PR specialist, pitching to journalists to build backlinks. And I remember a headline: ‘Looking to speak to an SEO specialist — because SEO is dead because of voice search.’ That was eight years ago.
00:27:04 –> 00:27:35
And as far as I’m concerned, people have been saying SEO is dead since the dawn of SEO, and it still hasn’t gone anywhere. People are always going to need answers, and they’re always going to search for those answers. So however the channel changes, SEO is certainly not dead. It’s more prevalent than ever, in my opinion. And I think we’re at a really exciting time where SEO is being redefined and we get to reverse engineer a whole new thing. And we’re really in the infancy of AI search, which is super exciting.
00:27:36 –> 00:28:10
Matt says it sounds like content is king, but it’s nothing without the technical setup. I agree, Matt. Love that. Ettienne says technical. Okay, we’ve got lots of commentary. Technical is the skeleton, content is the meat. Content draws AI to the site, but it needs the technical backbone. I love that we’re all aligned here, and I’m sure a lot of developers would agree. Bhushan would lean content first: technical setup helps AI discover and understand pages, but it cannot make thin content valuable in AI search. The biggest wins seem to come from content that is clear, specific, trustworthy, and easy to extract, with strong technical SEO acting as a support system.
00:28:12 –> 00:28:50
Matt asks: is AI really that bad at understanding JavaScript? When it comes to SEO and AI search, the answer is the same — it depends how JS-heavy the website is. I’d evaluate it case by case, and not just per website — per LLM too. Gemini, for example, may handle JavaScript rendering well, while another LLM might struggle with it.
00:28:50 –> 00:29:22
We want to minimize that risk, and each website should be analyzed individually. JavaScript is normal and fine — it’s on pretty much every website. But don’t rely on very JS-heavy rendering as your foundation.
00:29:24 –> 00:29:38
Great answer. Bhushan asks: how do you make sure a website is included in training data? There’s no submit button for it — unlike sitemap XML.
00:29:39 –> 00:30:20
There’s no direct way to force it, but there are two things you can do. First, check your technical accessibility — run it through an AI checker. If everything looks good, there’s a strong chance you’re already in the training data. If not, there are a few offsite activities you can pursue, ranging in cost depending on your goals and market.
00:30:21 –> 00:30:55
That’s also something we can help with. Darren asks: are we too late if we’re not yet indexed in current-gen frontier models? Is it still early? Can ground be made up from a cold start? It’s definitely not too late — different LLMs are launching new models practically every month.
00:30:55 –> 00:31:36
There’s always an opportunity to influence training data — and this is genuinely good timing to act. I’d add, though, that brands with more online presence and content will generally have more awareness and will likely outcompete smaller brands, especially for recommendation-type and consideration-based prompts.
00:31:36 –> 00:31:57
That said, you can absolutely build a technically solid website with content tailored to your ICP. If you answer their questions better than anyone else, you have just as much chance of being cited.
00:31:57 –> 00:32:28
Dionne asks: the AI visibility platform looks great, but it seems US-focused. She used a .co.uk domain but the insights were all US, not UK. There are studies showing ChatGPT tends to favor US brands regardless of where you’re searching from. Filip, can you speak to how the checker handles geography?
00:32:29 –> 00:33:07
By default it does pull more from the US index — there’s a lot of data behind that. But I’d recommend tracking AI visibility by specific country. It’s possible in some tools already. Relying only on US data means you could miss important insights around citations and mentions in other markets. Definitely something worth setting up.
00:33:08 –> 00:33:48
I’d definitely set up a UK monitor as well. Is that possible within our tool — to filter by geography? Because if someone checks a UK domain and gets US results… Right now that’s not in the AI checker itself, but we have internal tools that cover it, and it’s included in our service. If you need something more specific, Dionne, get in touch — our help desk can support that, and as I mentioned, we do those free audits.
00:33:48 –> 00:33:55
So we don’t just use that tool, which is meant to really just be a self-serve snapshot. We can do some analysis for you if you’d like on a specific client.
00:33:57 –> 00:34:03
And then we’ve got one from Ettienne. Would a website built with Lovable rank well on Google and in LLMs?
00:34:06 –> 00:34:46
I played with Lovable a couple of months ago. They heavily relied on client-side rendering — not sure if that’s changed since. If it hasn’t, that’s a significant risk for AI visibility, and exactly the JS-heavy rendering issue we discussed. That said, there may be external libraries available to add server-side rendering, so it’s worth investigating before ruling it out.
00:34:47 –> 00:35:44
Definitely something to factor in when choosing a CMS or platform. It’s always a point of contention between web design agencies and technical SEO teams — designers want to build something beautiful and use the tools that let them do that. But when you focus too much on JavaScript and visual effects, it can really impede performance. If you have questions about a specific CMS, we’re happy to weigh in — we’ve had questions about Webflow and many others. More control over the code generally means better results. It’s always hard to find that middle ground.
00:35:45 –> 00:36:24
It’s not always about which CMS you use — it’s about how you architect it. Take WordPress, still the most popular CMS. You can build a great-looking, AI-ready site on WordPress. But you can also build a poorly structured WordPress site that’s completely invisible in AI search. It’s not just the CMS — it’s the architecture behind it.
00:36:27 –> 00:37:01
Totally. All right, that’s all the questions we had. Thank you so much for being so engaged. We’ll give everyone maybe thirty more seconds; just put in any last questions if you have them. There’s still seventeen of you lingering, so obviously there’s some good stuff here. Darren asks, can you share some examples you’ve found of SUSO performing well in GEO and AEO responses? Is that something we have prepared right now, Filip? Definitely, we have the setup already, as I mentioned, for the US and UK. But yeah, it’s something I would need to pull out from our internal tools.
00:37:03 –> 00:37:41
I think it’s also worth mentioning, because I don’t know that we really got into it, but the snapshot that you’re pulling from the AI checker is always going to be a one-time snapshot. And all of this is based on synthetic data. When I say synthetic, I mean that ChatGPT, Google’s AI Mode, none of these tools give us any insight into what prompts people are actually searching. So tools like ours, and tools like Profound and Scrunch and Peec and all of these other big ones, are all running on synthetic data, putting out hundreds of different prompts to test whether or not your client was mentioned or cited.
00:37:42 –> 00:38:15
Ours works the same way. It’s a snapshot — it doesn’t tell you definitively whether your client was cited every time or never. That’s a black box, and I know it’s frustrating. But it’s meant to be directional. Take each snapshot with a grain of salt and track it over time. The technical side, though, is always definitive — that’s what you can control, and that’s what gives bots access.
00:38:15 –> 00:38:54
Worth adding — Rand Fishkin from Datos and SparkToro did research on how consistently the same prompt returns the same response. The output varies enormously every single time. Measurement is very much a black box — even if we showed you SUSO performing well or badly, there are too many variables to draw firm conclusions. That’s exactly why daily tracking matters.
00:38:54 –> 00:38:59
Not weekly, not monthly. Daily tracking is key here, really.
00:39:01 –> 00:39:41
We’ll wrap up there — thank you so much for joining. If you have any other questions, my email is on screen. Feel free to join the Partner Club. We work with web design and dev agencies on migrations, SEO, AI search, GEO, LLMO — whatever you want to call it. See you at the next webinar, stay tuned, join the mailing list, and have a great day!