I can improve your SEO.
I’ve been working in SEO since 2010.
I started as a freelance writer. I’ve been a cog in a pre-Panda SEO assembly line. I’ve been an analyst trying to pick up the pieces Panda and Penguin left behind.
I’ve managed a team, worked with editors and freelancers. I’ve set SEO strategy for multiple sites and had weekly meetings with CEOs.
SEO is a constantly changing field, and that’s part of what makes it so interesting. But because it’s so fluid, you can’t do it halfway and expect meaningful results — and you certainly can’t expect them quickly.
You have to look at the big picture, cover all the minutiae, and have patience.
Tricking Google for any length of time is impossible. But you can optimize your site so that no matter which way the algo wind blows, you’re ready to not just weather it, but thrive.
SEO in the age of AI
A few years ago, Google updated its algorithm to start inferring search intent. At the time, I was working in the legal space. In the past, if someone searched for, say, “drunk driving,” they’d get a list of articles about drunk driving. After the update, searching for “drunk driving” would get you a list of drunk driving lawyers in your area.
Google began connecting certain terms with certain actions. It de-emphasized what you could call passive searches. Google wanted you to do something.
Informational pages that didn’t have a direct course of action attached began losing rank regardless of how good they were.
Google’s Bard is the next step in this evolution.
Yes, Bard will cite its sources, but if you’re getting the answer you need, why would you click on any of the sources? If the answer you get leads to additional questions, you can just ask those, as opposed to clicking through to a page that may or may not have what you need.
The sites and pages that are going to survive AI are the ones that actually help you do something. Informational pages are going to suffer.
Sites that generate most of their revenue through pay-per-click are going to have to rethink their strategy. Top-of-the-funnel traffic is going to dry up.
Search is now transactional.
SEO without a strategy isn’t SEO.
SEOs talk about “big” SEO and “little” SEO. The former covers large projects that take a great deal of time and planning; the latter, the tedious, repetitious work that most basic SEO audits uncover. “Big” and “little” SEO are often talked about as if they can stand on their own, independent of each other. But that’s not the case.
The same thing happens with technical SEO and content. They are inseparable to the point where I’ve had people refer to things like title tags as technical SEO, basically because they show up in an audit.
Never stop reviewing.
Generating new pages is great, but ignoring the old ones is fatal.
I once worked on a site with upwards of 100K pages. Our domain authority was so strong that we could publish almost anything (within reason) and start getting traffic soon after.
Because of this, we were told to generate more pages, because, if X = $ then 10X = $$$$$$$$$$, right?
But that’s not how SEO works — which I tried to explain to no avail.
Sure enough, after a few months of increased production, the site got hammered.
I’d prepared for it, though, and had a plan in place. Because I mentioned that the plan would shrink the site’s footprint, it was unceremoniously dubbed “Bigfoot.”
“Bigfoot” is just an incredibly involved site audit with one significant difference: it considers context.
Context is king.
People regularly pay for audits of their sites to get secret SEO knowledge like “these title tags are too long.” Besides the fact that getting that info isn’t hard, it lacks context.
You could have 2K pages that need their title tags fixed but haven’t gotten any traffic in the last five years. I have no doubt that the title tags are bad. I also have no doubt that there is much more wrong with those pages. In many cases, they shouldn’t even exist.
So why blindly update title tags, meta descriptions, etc. on pages that won’t improve? It’s a waste of time and money.
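To make that concrete: before bulk-fixing title tags from an audit export, you could first filter the findings by whether the page has earned any traffic at all. A minimal sketch — the column names and the traffic cutoff are my own illustrative assumptions, not a real audit format:

```python
# Filter audit findings by context before acting on them.
# Field names and the traffic threshold are hypothetical.

audit_findings = [
    {"url": "/guide-a", "issue": "title too long", "traffic_5y": 4200},
    {"url": "/stale-1", "issue": "title too long", "traffic_5y": 0},
    {"url": "/stale-2", "issue": "title too long", "traffic_5y": 3},
]

# Assumed cutoff: pages below this need a bigger decision
# (rewrite? remove?) than a cosmetic title fix.
MIN_TRAFFIC = 10

worth_fixing = [f for f in audit_findings if f["traffic_5y"] >= MIN_TRAFFIC]
needs_review = [f for f in audit_findings if f["traffic_5y"] < MIN_TRAFFIC]

print("fix now:", [f["url"] for f in worth_fixing])
print("review/remove candidates:", [f["url"] for f in needs_review])
```

The point isn’t the threshold — it’s that the audit output gets triaged before anyone spends time on it.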
In “Bigfoot,” we looked at the pages within the context of the site as a whole. We grouped pages into four buckets:
- Update — pages that are worth* keeping and just need polish
- Rewrite — pages that have some value, but need a lot of work
- Combine — pages that overlap each other and can be merged into a “super” page
- Remove — pages that are only diluting the quality of the site
*worth = traffic, rank, revenue, back links
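The bucketing above can be sketched as a simple script run over a per-page metrics export. Everything here is illustrative — the field names, thresholds, and quality score are stand-ins for whatever your analytics and audit tools actually give you:

```python
# Hypothetical "Bigfoot"-style bucketing sketch.
# Metrics and thresholds are illustrative assumptions, not real criteria.

def bucket_page(page, overlap_group_size=1):
    """Assign a page to Update / Rewrite / Combine / Remove.

    `page` is a dict of metrics pulled from an analytics export
    (field names assumed). `overlap_group_size` > 1 means this page
    overlaps others covering the same topic.
    """
    # "worth" = traffic, rank, revenue, back links
    worth = (
        page["monthly_traffic"] > 100
        or page["avg_rank"] <= 10
        or page["revenue"] > 0
        or page["backlinks"] >= 5
    )
    if overlap_group_size > 1:
        return "Combine"   # overlaps other pages; merge into a "super" page
    if worth and page["quality_score"] >= 7:
        return "Update"    # worth keeping, just needs polish
    if worth:
        return "Rewrite"   # some value, but needs a lot of work
    return "Remove"        # only diluting the quality of the site


pages = [
    {"url": "/primary-colors", "monthly_traffic": 900, "avg_rank": 4,
     "revenue": 120.0, "backlinks": 12, "quality_score": 8},
    {"url": "/old-orphan-page", "monthly_traffic": 0, "avg_rank": 95,
     "revenue": 0.0, "backlinks": 0, "quality_score": 2},
]

for p in pages:
    print(p["url"], "->", bucket_page(p))
```

The real work is in the judgment calls behind each metric, not the script — but having the buckets written down as explicit rules is what keeps a 100K-page audit consistent.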
It took less than six months for us to get back all the traffic we’d lost.
And then it kept growing: slow, steady, organic growth, month over month.
Not long after that, we had a comparable situation arise with two other sites. Because of bandwidth limitations, we only applied “Bigfoot” to one of them. Six months later, the “Bigfoot” site was as healthy as ever, while the other site continued to die on the vine.
New keywords should come from old keywords.
You can’t grow a money tree in a desert.
Context also applies to growing your keyword universe.
I’ve seen so many people find the biggest pie in the sky terms and then generate pages to rank for them. That works out about as well as you would imagine.
Build out your keywords organically.
If you have a page about primary colors that does well, create a page on secondary colors and link to the new page. Create a page on making green out of blue and yellow and link to it. Then you’ve got orange and purple! And pink and black and white and grey!
Hey, you’ve got so much content that you could create a hub page for all these pages so they’re easy to find and not too far from the home page!
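That hub-and-spoke structure can be written down as a simple link map. A hypothetical sketch — the URLs and page names are made up to match the colors example:

```python
# Hypothetical hub-and-spoke internal link map for a topic cluster.
# URLs are illustrative, not from a real site.

hub = "/colors"  # hub page, kept close to the home page

# Each new page grows organically out of an existing one.
cluster = {
    "/colors/primary": ["/colors/secondary"],
    "/colors/secondary": ["/colors/mixing-green"],
    "/colors/mixing-green": [],
}

def build_links(hub, cluster):
    """Return (from, to) internal links: hub <-> every cluster page,
    plus the organic links between related pages."""
    links = []
    for page, related in cluster.items():
        links.append((hub, page))   # hub links out to every spoke
        links.append((page, hub))   # spokes link back up to the hub
        for target in related:
            links.append((page, target))
    return links

for src, dst in build_links(hub, cluster):
    print(f"{src} -> {dst}")
```

The design choice is the direction of growth: new keywords hang off pages that already perform, and the hub keeps every page a short click path from the home page.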
More pages do not mean more traffic.
This has been true for years. It’s even more important with the advent of AI generated content.
The more thin content you have, the less value your domain will have. It’s a lot like trying to find a treasure at a thrift store. If 90% of the items aren’t worth anything, how much time are you willing to commit to finding that other 10%? And if you found a treasure, would you no longer consider it a thrift store but a treasure store?
Of course not. You and all your surfer friends know it’s a thrift store. If it wants to be a treasure store it’s going to have to, you know, have more treasure than not.
Congrats, you’ve published 1K pages to your site in the last month with AI! Are all those pages important? Do they all add value? Are they substantive?
The value of a writer’s byline is going to plummet if that author “wrote” 1K pages in a month, because that raises red flags.
It’s tricky: Google has started coming down on sites that are content factories, but they still have a long way to go in identifying them. But they will. This is Google. The sites that are seeing short-term gains are going to find themselves with long-term pains down the road.
Back links require quality over quantity.
In theory, back links are the best signal for how useful a page is.
The people (other sites referencing your content) have spoken (via followed back links with anchor text). If a bunch of high-authority web sites are linking to your site in reference to something, then your site must be good. Sites that are killing it don’t link to sites that are bad. It sends the wrong message to their users.
As we all know, if there’s an SEO signal that can be exploited, we will find a way. Thus began the era of buying links.
Google caught on, as they always do, which is why it’s so surprising to me to see people out there still buying and selling links.
SEO in general is a slow process and generating valuable back links is perhaps the slowest part.
The best back links you can get happen organically, which means creating the best possible pages.
If a reporter for CNN needs a reference and they Google the subject, one of those results (most likely from the top 3) is going to get that link.
Paying bloggers to post on random sites that link back to you gets you watered down links that send the wrong signal to Google.