I LOVED speaking at brightonSEO and I bumped into many amazing people (I'll do an appreciation post and will tag everyone), but for this post I wanted to share resources.
QUERY COUNTING DECK - https://lnkd.in/ePWKAVb5
QUERY COUNTING WEBINAR - https://lnkd.in/eST3kq5y
Here's the long and short of why query counting works and how it can help you:
1. It gives you a VISUAL representation of a page's weighting by the volume of queries it is served for over time
2. It allows you to see what trajectory a page / group of pages / domain is heading in
3. It gives you a much better way of interpreting whether the things you are DOING are having the intended effect
4. It's easier to ascertain whether clicks are lost to content DEVALUATION or SLIPPAGE - devaluation being worse, slippage being easier to recover from in most instances
It's the same process I've used over and over again for our assets, clients etc. It's how I've ranked for SEO Consultant, SEO Audits, Play Blackjack Online, No Deposit Bonus, Online Pharmacy, Secured Loans and loads of other high-competition queries.
LEARN IT - it honestly will help you with your SEO endeavours.
Are you query counting already? If so, leave a comment and let me know your thoughts, or ask if you have any questions.
PS whoever the guy is with that awesome t-shirt, someone tag him so I can buy him a beer 😂 #seo
How to use query counting for SEO
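The mechanics behind query counting are simple enough to sketch in a few lines. This is a minimal, hypothetical illustration (not the author's actual workflow, which is in the deck above): take Search Console rows of (page, month, query) and count the distinct queries each page is served for per period, so you can chart the trend over time. The column layout and sample rows are assumptions for the sake of the example.

```python
# Hypothetical query-counting sketch: for each (page, period), count how many
# distinct queries the page was served for. Sample rows are made up.
from collections import defaultdict

def count_queries(rows):
    """rows: iterable of (page, month, query) tuples from a GSC export."""
    seen = defaultdict(set)                # (page, month) -> distinct queries
    for page, month, query in rows:
        seen[(page, month)].add(query)
    return {key: len(queries) for key, queries in seen.items()}

sample = [
    ("/guide", "2024-01", "seo audits"),
    ("/guide", "2024-01", "seo consultant"),
    ("/guide", "2024-02", "seo audits"),   # fewer distinct queries = possible slippage
]
print(count_queries(sample))
```

Plotting these counts per page over time gives the "visual representation of a page's weighting" described in point 1: a shrinking count flags devaluation or slippage before clicks collapse.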
“Pot, meet kettle.” After a week of meetings with business owners about the health of their own websites, I took a peek at mine - and despite it being new to the scene, it wasn’t quite the picture of perfection. It’s been a few years since I’ve been on the SEO tools, but as a new business owner, you’ve got to roll up your sleeves and do all the things. A look at Google Search Console and Ahrefs revealed a few hidden issues:
▶️ Multiple H1 tags raising red flags (H1 page titles were detected, but they were invisible to the naked eye because they were in white text).
▶️ Missing meta descriptions on a couple of pages.
▶️ A couple of titles longer than my coffee order.
▶️ Some pages weren’t being indexed.
▶️ And yes, the usual suspect: some broken links.
So I dedicated the day to working on these technical fixes and resubmitting my sitemap to Google. The results I'm hoping for:
▶️ Search engines crawling and interpreting my site efficiently, without being hindered by technical issues.
▶️ Pages optimally positioned to appear in relevant search results.
▶️ Improved click-through rates through more concise titles and meta descriptions.
Looks like this pot needed a little polish too. #FractionalCMO #TuesdayLogic
The 2 AM Search That Changed Everything
Someone types a query into Google. Deletes it. Types again. Deletes. Rephrases. What are they really looking for?
I tracked this for six months. The gap between what people type and what they actually need? Huge.
Here's the thing: "best running shoes" at midnight isn't the same search as "best running shoes" at noon. Different person. Different need. Different fear.
I watched someone search "red patches arm" seventeen times. Each query more desperate. We gave them medical definitions. They wanted to know if they'd be okay.
One Page Changed Everything
Client's conversion rate: stuck at 2%. Good traffic. Zero results. We rewrote one product description. Started with their pain: "Your back hurts after three hours at your desk." Sales jumped 34% in two weeks. Same product. Different understanding of the human behind the search.
What Most People Miss
Someone searches "change careers at 40." What they're really asking: Am I too old? Will I regret this? Can I afford this? Most content answers the search. The best content answers the person.
But there's more to this. Way more. I can't fit everything in one post. The patterns I've found. The mistakes that kill conversions. The exact moments where content either connects or fails. So I wrote it all out.
Full article on my profile: "When Search Queries Reveal What We're Actually Looking For"
What you'll find there:
- The 4 types of search intent most people miss completely
- Why standard "user intent" guides get this backwards
- Real examples of identical searches that need different content
- The framework I use to decode what people really mean
- Case studies with actual numbers
👉 Visit my profile → Articles section → Read the breakdown
This isn't theory. These are patterns from analyzing thousands of searches and the content that either worked or bombed. Understanding the person behind the search? That's where real optimization starts.
𝗜𝗺𝗽𝗿𝗲𝘀𝘀𝗶𝗼𝗻𝘀 𝗗𝗼𝘄𝗻, 𝗔𝘃𝗲𝗿𝗮𝗴𝗲 𝗣𝗼𝘀𝗶𝘁𝗶𝗼𝗻 𝗨𝗽? 𝗛𝗲𝗿𝗲’𝘀 𝗪𝗵𝗮𝘁’𝘀 𝗥𝗲𝗮𝗹𝗹𝘆 𝗛𝗮𝗽𝗽𝗲𝗻𝗶𝗻𝗴
Google has quietly made a change that's having a big impact: impressions are going down while average position is going up. Google removed the num=100 results parameter. That means far fewer queries beyond page 1 are counted as impressions, so many sites are seeing 𝗶𝗺𝗽𝗿𝗲𝘀𝘀𝗶𝗼𝗻𝘀 ↓ 𝘄𝗵𝗶𝗹𝗲 𝗮𝘃𝗲𝗿𝗮𝗴𝗲 𝗽𝗼𝘀𝗶𝘁𝗶𝗼𝗻 ↑ without real ranking gains. It’s a reporting/visibility shift more than a win or loss.
𝗪𝗵𝗮𝘁 𝗶𝘀 𝗻𝘂𝗺=𝟭𝟬𝟬? It was a URL parameter that told Google to show 100 results per page. Many tools and workflows used it when checking rankings. With it gone, Google serves fewer results per page by default, so page-2/3 appearances are less often counted as impressions in your reports. Your actual rankings may not have moved; the counting changed.
𝗪𝗵𝗮𝘁 𝗶𝘁 𝗿𝗲𝗮𝗹𝗹𝘆 𝗺𝗲𝗮𝗻𝘀
• Page-2/3 terms now surface less → fewer impressions.
• Your “avg position” can improve simply because page-2+ data thinned out.
• Community data points show the same pattern across accounts.
𝗦𝗼 𝘄𝗵𝗮𝘁 𝗱𝗼 𝘄𝗲 𝗱𝗼? As usual with SEO, first principles matter:
• 𝗣𝗿𝗼𝗺𝗼𝘁𝗲 𝗯𝗼𝗿𝗱𝗲𝗿𝗹𝗶𝗻𝗲 𝘁𝗲𝗿𝗺𝘀 (𝗽𝗼𝘀 𝟭𝟭–𝟮𝟬) 𝗶𝗻𝘁𝗼 𝗽𝗮𝗴𝗲 𝟭 with targeted MoFu/BoFu support, internal links, and clear intent match.
• 𝗧𝗶𝗴𝗵𝘁𝗲𝗻 𝗶𝗻𝗳𝗼𝗿𝗺𝗮𝘁𝗶𝗼𝗻 𝗴𝗮𝗶𝗻 (specifics, proofs, specs, local entities) so you earn clicks when you do show.
• 𝗗𝗲-𝗱𝘂𝗽𝗹𝗶𝗰𝗮𝘁𝗲 𝗰𝗮𝗻𝗻𝗶𝗯𝗮𝗹𝘀 so each page owns a distinct intent.
𝗢𝗻 𝗯𝗮𝗰𝗸𝗹𝗶𝗻𝗸𝘀 & 𝗔𝗜: Backlinks still matter, but you can’t lean on them alone. If LLMs and scrapers bias toward what’s 𝘃𝗶𝘀𝗶𝗯𝗹𝗲 𝗶𝗻 𝘁𝗵𝗲 𝘁𝗼𝗽 𝟭𝟬, then 𝘀𝘁𝗿𝘂𝗰𝘁𝘂𝗿𝗲𝗱, 𝗲𝗻𝘁𝗶𝘁𝘆-𝗿𝗶𝗰𝗵 𝗰𝗼𝗻𝘁𝗲𝗻𝘁 in that set wins the synthesis game. Several SEOs see the num=100 removal as a nudge toward real expertise and distribution, not just link tactics.
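The reporting effect is pure arithmetic. Here is a minimal sketch with made-up impression numbers: if appearances at positions 11+ largely stop being counted as impressions, the impression-weighted average position "improves" even though no ranking actually moved.

```python
# Illustrative-only numbers showing how dropping deep-position impressions
# shifts the reported average position without any ranking change.
def avg_position(impressions):
    """impressions: list of (position, impression_count) pairs."""
    total = sum(count for _, count in impressions)
    return sum(pos * count for pos, count in impressions) / total

before = [(3, 100), (15, 200), (25, 100)]          # deep results still counted
after = [(p, c) for p, c in before if p <= 10]     # num=100 gone: mostly top-10 counted

print(avg_position(before))   # weighted over all impressions
print(avg_position(after))    # same rankings, "better" reported average
```

With these assumed numbers, the reported average jumps from 14.5 to 3.0 while total impressions fall from 400 to 100, which is exactly the "impressions down, average position up" pattern described above.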
The "Near Me" Search is Dead. Long Live "Near Me" Searches.
Let's clear something up about local search in 2025. People aren't searching "near me" less. They're just doing it differently. Here's what's actually happening:
Voice search? "Hey Google, find me a plumber." (That's a near me search)
Zero-click results? Google shows the map pack before you even finish typing. (Still triggered by local intent)
AI overviews? They're pulling heavily from GBP data and local citations. (Guess what matters there?)
**The searches evolved. The fundamentals didn't.**
What I'm seeing work right now for local businesses:
→ Getting hyper-specific with service pages. Not just "plumber in Boston" but "emergency pipe repair in Back Bay"
→ Actually responding to reviews (yes, all of them - Google's watching)
→ Local link building that isn't spammy directories. Think: local news mentions, chamber of commerce, genuine community partnerships
→ Publishing content that answers the questions your customers are literally asking in consultations
The businesses crushing it locally in 2025? They stopped obsessing over algorithm updates and started obsessing over their actual customers' search behavior. Your competitor is optimizing for Google. You should be optimizing for the person standing on the corner with their phone out, desperately needing what you offer.
What local SEO tactics are you seeing work (or completely flop) lately? Let's compare notes in the comments. 👇
#LocalSEO #DigitalMarketing #SmallBusiness #SEO #GoogleBusinessProfile
Indeed’s SEO doesn’t “pray for crawl.” It predicts it.
Chris Reynolds shared that Indeed built an ML model to decide which pages deserve discovery and indexation. Not more XML. Smarter prioritization. Here’s the playbook you can steal:
1. Train on outcomes, not wishes. Use historical crawl → index → traffic data to learn what actually earns inclusion.
2. Score every URL continuously. Signals: internal link depth, freshness, demand, duplication risk, and business value.
3. Route by score, not politics. High-score pages get surfaced in nav and feeds.
4. De-emphasize or merge low-score pages. Sitemaps report, they don’t rescue.
5. Close the loop. Re-score after publishing.
6. If a page fails to index or stick, downgrade its pattern and fix the template, not the URL.
Result: fewer wasted fetches, faster paths to indexation, and a site IA that reflects what users (and Google) want - at scale.
Power takeaway: ML is the compass; internal linking is the road. Sitemaps are just the receipt.
If you manage >100k URLs, what one rule would you codify in a scoring model today?
#SEO #TechSEO #CrawlBudget
If this was useful, repost it so others can benefit too. Full episode sources in the comments.
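To make the playbook concrete, here is a deliberately simple scoring-and-routing sketch. The signal names, weights, and sample URLs are illustrative assumptions, not Indeed's actual model (which would be a trained ML model over historical crawl/index/traffic outcomes, not hand-set weights).

```python
# Hypothetical URL scoring in the spirit of steps 2-4 above. Weights are
# hand-picked for illustration; a real system would learn them from outcomes.
WEIGHTS = {
    "link_depth": -0.5,        # deeper pages score lower
    "freshness": 1.0,          # recently updated content scores higher
    "demand": 2.0,             # search demand for the page's topic
    "duplication_risk": -1.5,  # near-duplicates are penalized
    "business_value": 1.5,     # revenue/pipeline contribution
}

def score_url(signals):
    """Weighted sum over whatever signals are present (missing -> 0)."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

def route(urls):
    """Step 3: sort by score; the top slice gets surfaced in nav and feeds."""
    return sorted(urls, key=lambda item: score_url(item[1]), reverse=True)

pages = [
    ("/jobs/nurse-london", {"link_depth": 2, "freshness": 0.9, "demand": 0.8, "business_value": 0.9}),
    ("/jobs/expired-123", {"link_depth": 5, "freshness": 0.1, "demand": 0.05, "duplication_risk": 0.8}),
]
print(route(pages)[0][0])  # highest-priority URL first
```

The "close the loop" steps would then feed post-publish index/traffic outcomes back into the weights, so the model keeps learning which URL patterns earn inclusion.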
Should you obsess over getting 100/100 on Google’s PageSpeed Insights? I decided to find out 🧐
After testing 105 websites across some of the UK’s most competitive industries - finance, legal, insurance, weight loss, medical, online casino, and (of course) SEO - my results might surprise you.
🤔 The average performance score was just 52/100, even among the biggest players.
😱 Some top-ranking sites scored as low as 2/100.
😎 SEO and finance sites led the pack overall, while online casinos consistently scored the worst.
So what’s the takeaway? Yes, site speed and technical optimisation matter. But ranking #1 isn’t always about hitting perfect green scores. Content, authority, and backlinks (among other things) still carry huge weight in competitive industries.
The biggest lesson? Don’t lose sleep chasing a 100/100 score. A 70/100 will do just fine for now! Focus on fixing real on-page issues, building authority, and publishing quality content. That’s what actually gets results!
What if I told you that some domains potentially hold rank on BEHAVIOUR alone?
Do you know I was called a conspiracy theorist by LOTS of SEOs in the past for advocating that Google used user behaviour as a ranking factor. Google denied it initially, until they got exposed in the DOJ trial. Turns out I was right all along.
SOME DOMAINS CAN RANK ON BEHAVIOUR ALONE! There are of course nuances to this statement - a site may rank initially on general factors, then perform well and sustain rank based on behaviour. This is why some sites just rank even if their link profile is garbage, content is poor etc. HELPFUL CONTENT is heavily weighted.
As SEOs you need to realise that the NUMBER 1 factor in determining whether something is good is how people behave. So, when you are Google and you have access to the amount of user data that they have, you can do amazing things with it.
GOOGLE CHROME = Trojan horse all along, sending usage and telemetry data to Google
GOOGLE SEARCH = They see click behaviour before the website does; this data can be used in predictive analysis (CTR)
Google has access to an unprecedented amount of data. Google has DATA and MACHINE LEARNING at scale. This is why:
- Some crappy, terrible, non-conforming websites rank above you despite you doing everything better
- Some domains just seem to hold rank year after year, sailing through core updates
- Some sites just appear to come out of nowhere and hold rank without the backbone that other established sites may have
SEO is NOT JUST ABOUT TECH, CONTENT, LINKS. It's MASSIVELY impacted by user behaviour, and this is why so many site owners got their knickers in a twist when they got wiped out by the HCU. It wasn't that their content was bad; it's that Google's machine learning model (the HCU classifier) gave weighting to engagement.
As I've said before and I'll say it again: HELPFUL CONTENT isn't always GOOD CONTENT, and GOOD CONTENT isn't always HELPFUL CONTENT.
The sooner you realise this, the better: SEO is a headfuck because there's no "linear" baseline for anyone to work to.
- Every SERP is different
- Ranking factors vary by site, by position, by query
- Machine learning actively helps to shape search
- No 2 websites will get the same ranking benefit doing the same things
Good SEO is about being able to DISCOVER through trial and error what works and what does not, on a case-by-case basis.
See #1 in Google is vatcalculator(.)co.uk for "VAT calculator" - position 1 for 15+ years, NON-HTTPS, not mobile friendly, no SEO. Incredible ENGAGEMENT rate, low bounce, high exit. People use it and then end the session - that has helped it to maintain rank despite it being "non-conforming". This ranked ALL the way through the "death of EMDs".
There's what Google says and then there's the truth. #seo
Your traffic isn’t growing, but your revenue still can.
A few of our clients are seeing the same trend:
- Search volume is flat (or dropping because of AI overviews)
- Overall traffic has plateaued
- Rankings aren’t “up” in the way people expect
But here’s the twist: revenue is still increasing. Why? Because traffic doesn’t mean money. Search intent does.
Most founders obsess over page views. But page views don’t pay bills; bottom-funnel intent does.
Here’s what we’re seeing across accounts:
- Sites with 50,000+ pages
- But only 3–5 pages drive 90–95% of the revenue
- Those pages rank for keywords with commercial intent
- Even if overall traffic stays the same, the right traffic converts harder
And with AI eating up generic search queries, this matters even more. The game isn’t “get more visitors.” It’s:
- Rank for queries people buy from
- Optimise the pages that drive pipeline
- Stop chasing keywords that don’t convert
- Track revenue per page, not visits per page
So if your traffic graph is flat but revenue is growing? You’re not failing, you’re finally playing the right game.
Want help making your SEO revenue-first instead of traffic-obsessed? DM me.
♻️ Repost this, follow me for more.
Your business is about to disappear from the most important search results page Google has ever created.
For years, the local pack was the prize. Getting into that "Map 3-Pack" meant a flood of calls. But the new AI-powered Search Generative Experience (SGE) is changing everything. It's not just showing a list of three businesses anymore; it's conversing with the user, summarizing options, and pulling data from across the web to make a recommendation.
If your local SEO isn't built for this AI-driven world, you're becoming invisible. The businesses that will win are those whose online presence is so consistent and authoritative that Google's AI sees them as the definitive answer. It's no longer just about keywords in your GMB (now Google Business Profile); it's about building comprehensive reputation signals that AI can easily understand and trust.
To rank in the age of AI Search, you need to master these three pillars:
▶️ AI-Friendly GBP Optimization: Move beyond basic categories. Your "Products" and "Services" sections must be meticulously filled out with detailed descriptions. AI uses this structured data to match user queries with precision.
▶️ Local Citation Dominance: Inconsistencies in your NAP (Name, Address, Phone) across directories like Yelp, Yellow Pages, and local chambers of commerce confuse AI and kill your credibility. A pristine, consistent citation footprint is non-negotiable.
▶️ Review Intelligence: It's not just about the 5-star rating anymore. AI is scanning the content of your reviews for specific keywords like "affordable," "fast service," or "knowledgeable staff." These are the very phrases it will use to recommend you.
Google's AI is essentially a new, hyper-intelligent gatekeeper for your local customers. Are you giving it the right signals to let you in?
What's the one local search term you wish you ranked for? Drop it in the comments, and I'll tell you the first signal you're probably missing.
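NAP-consistency auditing is mechanical enough to sketch. This is a hypothetical example (the listing data below is made up): normalize each source's Name/Address/Phone, then flag the business if the normalized tuples don't all match. Real citation audits also handle abbreviations like "St" vs "Street", which this sketch deliberately skips.

```python
# Hedged sketch of a NAP consistency check across citation sources.
# Sample listings are fabricated; normalization here is intentionally basic.
import re

def normalize(nap):
    """Lowercase, collapse whitespace, and keep only phone digits."""
    name, address, phone = nap
    clean = lambda s: re.sub(r"\s+", " ", s.lower().strip())
    digits = re.sub(r"\D", "", phone)
    return (clean(name), clean(address), digits)

def inconsistent(listings):
    """True if the normalized NAP differs across any two sources."""
    return len({normalize(nap) for nap in listings.values()}) > 1

listings = {
    "gbp": ("Acme Plumbing", "12 High St, Boston", "(617) 555-0100"),
    "yelp": ("acme plumbing", "12 High St,  Boston", "617-555-0100"),
    "yellowpages": ("Acme Plumbing Co", "12 High St, Boston", "6175550100"),
}
print(inconsistent(listings))  # the "Co" suffix makes the name differ
```

Here the GBP and Yelp entries normalize to the same tuple (case, spacing, and phone formatting wash out), but the "Acme Plumbing Co" variant on the third directory is exactly the kind of mismatch the post warns about.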
🔍 Ever Wondered How a Modern Search Engine Really Works?
Behind every Google search there is a fascinating process that helps you get the most relevant results. Here’s the 5-step journey of every web page:
1️⃣ Discovery – The search engine finds new web pages through links and sitemaps.
2️⃣ Crawling – Bots (or spiders) scan these pages to gather information.
3️⃣ Rendering – The page’s code is visually processed to understand layout and content.
4️⃣ Indexing – Information is stored and categorized in the search engine’s massive database.
5️⃣ Ranking – Algorithms decide which pages best answer your query and display them on top.
💡 Each step ensures users get fast, accurate, and useful answers every single time they search.
REPOST ♻️ if you learned something new here.
#SEO #SearchEngine #DigitalMarketing #GoogleSearch #SEOInsights #CrawlingAndIndexing #SEOTips #RankingFactors #LearnSEO
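The discovery → crawl → index → rank steps above can be sketched as a toy pipeline over an in-memory "web" (rendering is omitted since there is no real HTML here). The page contents and links are invented for illustration; real search engines are vastly more complex.

```python
# Toy sketch of a search pipeline: discovery/crawling via link-following,
# indexing via an inverted index, and ranking by query-word overlap.
WEB = {
    "/home": {"links": ["/seo", "/about"], "text": "welcome to our site"},
    "/seo": {"links": ["/home"], "text": "seo tips for ranking in search"},
    "/about": {"links": [], "text": "about our team"},
}

def crawl(start):
    """Steps 1-2: discover and fetch pages breadth-first from a seed URL."""
    seen, queue = set(), [start]
    while queue:
        url = queue.pop(0)
        if url in seen or url not in WEB:
            continue
        seen.add(url)
        queue.extend(WEB[url]["links"])
    return seen

def index(urls):
    """Step 4: build an inverted index mapping each word to its pages."""
    inv = {}
    for url in urls:
        for word in WEB[url]["text"].split():
            inv.setdefault(word, set()).add(url)
    return inv

def rank(inv, query):
    """Step 5: score pages by how many query words they contain."""
    scores = {}
    for word in query.split():
        for url in inv.get(word, set()):
            scores[url] = scores.get(url, 0) + 1
    return sorted(scores, key=scores.get, reverse=True)

idx = index(crawl("/home"))
print(rank(idx, "seo ranking tips"))  # best-matching pages first
```

Even at this toy scale the division of labour is visible: crawling gathers raw pages, the inverted index makes lookup fast, and ranking is a separate scoring pass over only the matching pages.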
Query counting makes it so much easier to spot shifts before they become issues. Been using a similar approach, and it’s helped connect content changes to actual impact. Appreciate you sharing the deck and webinar! Daniel Foley Carter