#1yrago My RSS feeds from a decade ago, a snapshot of gadget blogging when that was a thing
Rob Beschizza:
I chanced upon an ancient backup of my RSS feed subscriptions, a cold hard stone of data from my time at Wired in the mid-2000s. The last-modified date on the file is December 2007. I wiped my feeds upon coming to Boing Boing thenabouts: a fresh start and a new perspective.
What I found, over 212 mostly-defunct sites, is a time capsule of web culture from a bygone age—albeit one tailored to the professional purpose of cranking out blog posts about consumer electronics a decade ago. It's not a picture of a wonderful time before all the horrors of Facebook and Twitter set in. This place is not a place of honor. No highly-esteemed deed is commemorated here. But perhaps some of you might like a quick tour, all the same.
The "Main" folder, which contains 30 feeds, was the stuff I actually wanted (or needed) to read. This set would morph over time. I reckon it's easy to spot 2007's passing obsessions from the enduring interests.
↬ Arts and Letters Daily: a minimalist blog of links about smartypants subjects, a Drudge for those days when I sensed a third digit dimly glowing in my IQ. But for the death of founder Denis Dutton, it's exactly the same as it was in 2007! New items daily, but the RSS feed's dead.
↬ Boing Boing. Still around, I hear.
↬ Brass Goggles. A dead feed for a defunct steampunk blog (the last post was in 2013) though the forums seem well-stocked with new postings.
↬ The Consumerist. Dead feed, dead site. Founded in 2005 by Joel Johnson at Gawker, it was sold to Consumer Reports a few years later, lost its edge there, and was finally shuttered (or summarily executed) just a few weeks ago.
↬ Bibliodyssey. Quiescent. Updated until 2015 with wonderful public-domain book art scans and commentary. A twitter account and tumblr rolled on until just last year. There is a book to remember it by should the bits rot.
↬ jwz. Jamie Zawinski's startling and often hilariously bleak reflections on culture, the internet and working at Netscape during the dotcom boom. This was probably the first blog that led me to visit twice, to see if there was more. And there still is, almost daily.
↬ Proceedings of the Athanasius Kircher Society. Curios and weirdness emerging from the dust and foul fog of old books, forbidden history and the more speculative reaches of science. So dead the domain is squatted. Creator Josh Foer moved on to Atlas Obscura.
↬ The Tweney Review. Personal blog of my last supervisor at Wired, Dylan Tweney, now a communications executive. It's still going strong!
↬ Strange Maps. Dead feed, dead site, though it's still going as a category at Big Think. Similar projects proliferate now on social media; this was the wonderful original. There was a book.
↬ BLDGBLOG. Architecture blog, posting since 2004 with recent if rarer updates. A fine example of tasteful web brutalism, but I'm no longer a big fan of cement boxes and minimalism with a price tag.
↬ Dethroner. A men's self-care and fashion blog, founded by Joel Johnson, of the tweedy kind that became wildly and effortlessly successful not long after he gave up on it.
↬ MocoLoco. This long-running design blog morphed visually into a magazine in 2015. I have no idea why I liked it then, but indie photoblogs' golden age ended long ago and it's good to see some are thriving.
↬ SciFi Scanner. Long-dead AMC channel blog, very likely the work of one or two editors and likely lost to tidal corporate forces rather than any specific failure or event.
↬ Cult of Mac. Apple news site from another Wired News colleague of mine, Leander Kahney, and surely one of the longest-running at this point. Charlie Sorrel, who I hired at Wired to help me write the Gadget blog, still pens articles there.
↬ Ectoplasmosis. After Wired canned its bizarre, brilliant and unacceptably weird Table of Malcontents blog, its editor John Brownlee (who later joined Joel and me in editing Boing Boing Gadgets) and contributor Eliza Gauger founded Ectoplasmosis: the same thing but with no hysterical calls from Conde Nast wondering what the fuck is going on. It was glorious, too: a high-point of baroque indie blogging in the age before Facebook (and I made the original site design). Both editors later moved on to other projects (Magenta, Problem Glyphs); Gauger maintains the site's archives at tumblr. It was last updated in 2014.
↬ Penny Arcade. Then a webcomic; now a webcomic and a media and events empire.
↬ Paul Boutin. While working at Wired News, I'd heard a rumor that he was my supervisor. But I never spoke to him and only ever received a couple of odd emails, so I just got on with the job until Tweney was hired. His site and its feed are long-dead.
↬ Yanko Design. Classic blockquote chum for gadget bloggers.
↬ City Home News. An offbeat Pittsburgh news blog, still online but lying fallow since 2009.
↬ Watchismo. Once a key site for wristwatch fans, Watchismo was folded into watches.com a few years ago. A couple of things were posted to the feed in 2017, but its time has obviously passed.
↬ Gizmodo. Much has changed, but it's still one of the best tech blogs.
↬ Engadget. Much has changed, but it's still one of the best tech blogs.
↬ Boing Boing Gadgets. Site's dead, though the feed is technically live as it redirects to our "gadgets" tag. Thousands of URLs there succumbed to bit-rot at some point, but we have plans to merge its database into Boing Boing's and revive them.
↬ Gear Factor. This was the gadget review column at Wired Magazine, separate from the gadget blog I edited because of the longtime corporate divorce between Wired's print and online divisions. This separation had just been resolved at the time I began working there, and the two "sides" -- literally facing offices in the same building -- were slowly being integrated. The feed's dead, but with an obvious successor, Gear.
↬ The Secret Diary of Steve Jobs. Required reading at the time, and very much a thing of its time. Now vaguely repulsive.
↬ io9. This brilliant sci-fi and culture blog deserved more than to end up a tag at Gizmodo.
↬ Science Daily: bland but exhaustive torrent of research news, still cranking along.
The "Essentials" Folder was material I wanted to stay on top of, but with work clearly in mind: the background material for systematically belching out content at a particular point in 2007.
↬ Still alive are The Register, Slashdot, Ars Technica, UMPC Portal (the tiny laptop beat!), PC Watch, Techblog, TechCrunch, UberGizmo, Coolest Gadgets, EFF Breaking News, Retro Thing, CNET Reviews, New Scientist, CNET Crave, and MAKE Magazine.
↬ Dead or quiescent: GigaOm (at least for news), Digg/Apple, Akihabara News, Tokyomango, Inside Comcast, Linux Devices (Update: reincarnated at linuxgizmos.com), and Uneasy Silence.
Of the 23 feeds in the "press releases" folder, 17 are dead. Most of the RSS no-shows are for companies like AMD and Intel, however, who surely still offer feeds at new addresses. Feeds for Palm, Nokia and pre-Dell Alienware are genuine dodos. These were interesting enough companies, 10 years ago.
PR Newswire functions as a veneering service so anyone can pretend to have a big PR department, but it is (was?) also legitimately used by the big players as a platform so I monitored the feeds there. They're still populated, but duplicate one another, and it's all complete garbage now. (It was mostly garbage then.)
My "Gadgets and Tech" folder contained the army of late-2000s blogs capitalizing on the success of Gizmodo, Boing Boing, TechCrunch, et al. Back in the day, these were mostly one (or two) young white men furiously extruding commentary on (or snarky rewrites of) press releases, with lots of duplication and an inchoate but seriously-honored unspoken language of mutual respect and first-mover credit. Those sites that survived oftentimes moved to listicles and such: notionally superior and more original content and certainly more sharable on Facebook, but unreadably boring. However, a few old-timey gadget bloggers are still cranking 'em out in web 1.5 style. And a few were so specialized they actually had readers who loved them.
Still alive: DailyTech, technabob, CdrInfo.com, EverythingUSB, Extremetech, GearFuse, Gizmag, Gizmodiva, Hacked Gadgets, How to Spot A Psychopath/Dan's Data, MobileBurn, NewLaunches, OhGizmo!, ShinyShiny, Stuff.tv, TechDigest, TechDirt, Boy Genius Report, The Red Ferret Journal, Trusted Reviews, Xataka, DigiTimes, MedGadget, Geekologie, Tom's Hardware, Trendhunter, Japan Today, Digital Trends, All About Symbian (Yes, Symbian!), textually, cellular-news, TreeHugger, dezeen.
Dead: jkkmobile.com, Business Week Online, About PC (why), Afrigadget (unique blog about inventors in Africa, still active on Facebook), DefenseTech, FosFor (died 2013), Gearlog, Mobile-Review.com (but apparently reborn as a Russian language tech blog!), Robot's Dreams, The Gadgets Weblog, Wireless Watch Japan, Accelerating Future, Techopolis, Mobile Magazine, eHome Upgrade, camcorderinfo.com (Update: it became http://Reviewed.com), Digital Home Thoughts (farewell), WiFi Network News (farewell), Salon: Machinist, Near Future Lab, BotJunkie (twitter), and CNN Gizmos.
I followed 18 categories at Free Patents Online, and the site's still alive, though the RSS feeds haven't had any new items since 2016.
In the "news" folder, my picks were fairly standard stuff: BBC, CNET, digg/technology, PC World, Reuters, International Herald Tribune, and a bunch of Yahoo News feeds. The Digg feed's dead; they died and were reborn.
The "Wired" feed folder comprised all the Wired News blogs of the mid-2000s. All are dead. 27B Stroke 6, Autopia, Danger Room, Epicenter, Gadget Lab, Game|Life, Geekdad, Listening Post, Monkey Bites, Table of Malcontents, Underwire, Wired Science.
These were each basically one writer or two and were generally folded into the established magazine-side arrangements as the Age of Everyone Emulating Gawker came to an end. The feed for former EIC Chris Anderson's personal blog survives, but hasn't been updated since his era. Still going strong is Bruce Sterling's Beyond the Beyond, albeit rigged as a CMS tag rather than a bona fide site of its own.
Still alive from my 2007 "Science" folder are Bad Astronomy (Phil Plait), Bad Science (Ben Goldacre), Pharyngula (PZ Myers), New Urban Legends, NASA Breaking News, and The Panda's Thumb.
Finally, there's a dedicated "iPhone" folder. This was not just the hottest toy of 2007. It was all that was holy in consumer electronics for half a decade. Gadget blogging never really had a golden age, but the iPhone ended any pretense that there were numerous horses in a race of equal potential. Apple won.
Still alive are 9 to 5 Mac, MacRumors, MacSlash, AppleInsider and Daring Fireball. Dead are TUAW, iPhoneCentral, and the iPhone Dev Wiki.
Of all the sites listed here, I couldn't now be paid but to read a few. So long, 2007.
https://boingboing.net/2017/12/29/my-rss-feeds-from-a-decade-ago.html
The 9-Step SEO Strategy for 2019 [NEW]
Today I’m going to show you a VERY effective SEO strategy for 2019.
(Step-by-step)
In fact, I recently used these exact steps to rank #1 in Google for “Video SEO”.
And “keyword research tool”.
Let’s dive right in…
Step #1: Find an “Opportunity Keyword”
Let’s face it:
A #1 ranking isn’t what it used to be.
That’s because Google keeps adding stuff to the search results.
For example, look at the keyword “SEO Tools”:
You’ve got ads at the top of the page:
A Featured Snippet:
And a “People also Ask” box:
THEN you get to the #1 result:
That’s why you want to focus on Opportunity Keywords.
Opportunity Keywords are keywords with a high organic click-through-rate (CTR).
How about an example?
I recently created a post optimized around the term “SEO Audit”:
And “SEO Audit” is an Opportunity Keyword.
Sure, there are ads:
But that’s actually a good thing.
(More ads = higher commercial intent)
Other than ads, there isn’t a lot to distract people from the organic results:
You can also estimate organic CTR with Ahrefs.
For example, when I put “SEO Audit” into Ahrefs, it says that 61% of searchers click on a result.
Not bad.
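If you'd rather sanity-check that number yourself, organic CTR is just organic clicks divided by total searches. A minimal sketch in Python — the figures below are invented for illustration, not real Ahrefs data:

```python
def organic_ctr(monthly_searches: int, monthly_organic_clicks: int) -> float:
    """Share of searches that end in a click on an organic result.

    Both inputs are estimates (e.g. from a keyword tool's export),
    so treat the output as a rough signal, not ground truth.
    """
    if monthly_searches <= 0:
        raise ValueError("search volume must be positive")
    return monthly_organic_clicks / monthly_searches

# A keyword where most searchers click an organic result is a better
# "Opportunity Keyword" than one dominated by ads and snippets.
print(f"{organic_ctr(14000, 8540):.0%}")  # → 61%
```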
Which leads us to…
Step #2: Analyze Google’s First Page
OK, so you found an Opportunity Keyword.
Now it’s time to see what’s already working for that keyword.
To do that, just type your keyword into Google.
Scan the top 10 results:
And jot down any patterns that you notice.
For example, the first page for “SEO Tools” is PACKED with lists of tools:
So you’d want to jot down: “lots of list posts”.
Then, move onto step #3…
Step #3: Create Something Different… Or Better
When it comes to content, you’ve got two options:
Option #1: You can create something different.
Option #2: You can create something better.
Here’s how…
Different
Sometimes you want to create something bigger and better than what’s out there.
(aka The Skyscraper Technique)
But sometimes you’re better off with content that’s completely different.
Why?
Because it helps your content STAND OUT.
For example:
A few months ago I sat down to write a piece of content optimized around: “Mobile SEO”.
And I noticed Google’s first page was littered with list posts, like: “X Ways to Mobile Optimize Your Site.”
Now:
I could have created a BIGGER list post like: “150 Ways to Mobile Optimize Your Site”.
But that wouldn’t make any sense.
Instead, I created something totally different.
Specifically, I published an ultimate guide to mobile optimization.
And because my content stood out, it got a ton of shares:
Comments:
And most important of all, backlinks:
Better
This is a lot more straightforward.
All you need to do is find out what’s working…
…and publish something WAY better.
For example:
A while back I noticed that most content about “SEO tools” only listed 10-20 tools.
And I knew that publishing another list of 20 tools wouldn’t work.
So I decided to create a list of 188 SEO tools.
And it did GREAT.
In fact, it now ranks in the top 3 for the keyword “SEO Tools”:
Step #4: Add a Hook
Here’s the deal:
If you want to rank in 2019, you need backlinks.
Question is:
HOW?!
First, you need to figure out WHY people link to content in your industry.
(“The Hook”)
Then, include that “Hook” in your content.
For example:
Last year I noticed more and more bloggers writing about voice search.
I noticed something else too:
When people wrote about voice search, they linked to content that featured stats and data:
So I decided to do a voice search study that was PACKED with stats:
And it worked!
To date, this single post has racked up 848 backlinks:
And 90%+ of these backlinks cite a specific stat from my post:
That said:
Data is just one type of Hook that you can use to get links to your content.
Here are 3 other Hooks that are working great right now:
New Approaches and Strategies
Think about it:
What do bloggers and journalists LOVE writing about?
New stuff!
And if you create something new, you’ve got yourself a hook.
For example, a few years ago, I coined the phrase “Guestographics”.
This was a new strategy that no one knew about.
And because Guestographics were new (and had a unique name), 1,200 people have linked to my post so far:
Massive Guides
When you publish a massive guide, your guide itself is The Hook.
I’ll explain with an example…
A few years back I published Link Building: The Definitive Guide.
It was (and still is) the most complete guide to link building out there.
Here’s where things get interesting…
Every now and again a blogger will mention “link building” in a post.
But they don’t have room to cover the entire topic.
So they link to my guide as a way for their readers to learn more:
Very cool.
Case Study Results
Case studies are GREAT for getting links.
But to get links to your case study, you need to feature a specific result.
For example, a while back I published this case study:
This was a SUPER in-depth case study.
But I didn’t feature ONE result in the post.
Instead, I listed out 20+ results:
Which meant my case study didn’t have a single Hook for people to link to.
And very few people linked to it.
Flash forward to a few years later when I published this case study:
This time, I focused on ONE result (a 785% increase in my blog’s conversion rate):
And that single result was The Hook that led to hundreds of links:
Nice!
Step #5: Optimize For On-Page SEO
This step is all about keyword-optimizing your content for SEO.
And here are the 3 on-page SEO strategies that are working best for me right now:
Internal Linking
Yup, internal linking still works.
But you have to do it right.
Specifically, you want to link FROM high-authority pages TO pages that need authority.
For example, I published Google Search Console: The Definitive Guide earlier this year.
So I found a page on my site with a ton of authority…
…and linked from that page to my new guide.
Simple.
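The "link FROM high authority TO pages that need authority" rule is simple enough to express in a few lines. A hypothetical sketch — the URLs and authority scores here are invented, and in practice they'd come from whatever crawl or link tool you use:

```python
def best_internal_link_source(pages: dict[str, float], new_url: str) -> str:
    """Pick the highest-authority existing page to link FROM,
    so the new page that needs authority gets the strongest boost."""
    candidates = {url: score for url, score in pages.items() if url != new_url}
    return max(candidates, key=candidates.get)

site = {
    "/backlinks-guide": 78.0,
    "/keyword-research": 64.0,
    "/google-search-console": 12.0,  # the new guide that needs authority
}
print(best_internal_link_source(site, "/google-search-console"))
# → /backlinks-guide
```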
Short, Keyword-Rich URLs
Our analysis of 1 million Google search results found something that surprised a lot of people:
Short URLs crush long URLs.
That’s why I make my URLs either just my keyword…
… Or my target keyword plus one more word:
Either way works.
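That URL rule is easy to automate at publish time. A rough sketch — the slugging rules are my reading of the convention described above, not an official recipe:

```python
import re

def keyword_slug(target_keyword: str, extra_word: str = "") -> str:
    """Build a short, keyword-rich slug: the target keyword,
    optionally plus one more word, lowercased and hyphenated."""
    words = f"{target_keyword} {extra_word}".lower().split()
    cleaned = [re.sub(r"[^a-z0-9]", "", w) for w in words]
    return "-".join(w for w in cleaned if w)

print(keyword_slug("SEO Audit"))           # → seo-audit
print(keyword_slug("SEO Audit", "guide"))  # → seo-audit-guide
```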
Semantic SEO
Finally, I optimize my content for Semantic SEO.
In other words:
I find words that are related to my target keyword.
Then, I use those terms in my content.
Here are the deets:
First, pop your keyword into Google Images.
And Google will give you words and phrases they consider closely-related to that topic:
Second, type the same keyword into normal Google search. And scroll down to the “Searches related to…” section.
Finally, sprinkle some of those terms into your content:
And you’re set.
Step #6: Optimize For User Intent
In other words: The Skyscraper Technique 2.0.
I’ll show you how this works with a quick example.
A few years ago I wrote a post about getting more traffic to your site.
It did OK.
But it never cracked the top 5 for my target keyword (“increase website traffic”).
And when I analyzed Google’s first page, I realized why:
My page didn’t satisfy user intent.
I’ll explain…
Most of the content ranking for “increase website traffic” listed bite-sized traffic tips.
But my post gave them a high-level process.
So I rewrote my content to match this keyword’s User Intent.
Specifically, I turned my process into a list post:
And now that my content matches User Intent, it ranks in the top 3 for my target keyword:
Which led to a 70.43% boost in organic traffic compared to the old version of the post:
That said:
You can also publish User Intent optimized content right out of the gate.
In fact, that’s what I did with my recent post: The Ultimate SEO Audit.
I saw that most of the content ranking for “SEO Audit” listed out non-technical steps.
So I included simple strategies that anyone could use:
I even emphasized the fact that my audit was non-technical.
(This hooks people so they don’t bounce back to the search results)
And this User Intent optimization (and my site’s Domain Authority…more on that later) helped my post crack the first page of Google within a month.
Step #7: Make Your Content Look Awesome
Design is THE most underrated part of content marketing.
You can have the best content ever written.
But if it looks like this…
…it’s not gonna work.
That’s why I invest A LOT of time and money into content design.
For example, you’ve probably seen one of my definitive guides:
These guides are designed and coded 100% from scratch.
(Which makes them super expensive to make)
That said:
Great content design doesn’t have to break the bank.
In fact, here are 4 types of visual content that are super easy to pull off.
Graphs and Charts
These work so well that I try to include at least one chart in every post.
Why?
Because they make data EASY to understand.
For example, take this stat from my mobile SEO guide.
I don’t know about you, but I have a hard time picturing 27.8 billion ANYTHING.
So I had our designer create a nice chart.
As a bonus, people will sometimes use your chart in a blog post… and link back to you:
Screenshots and Pictures
You might have noticed that I use LOTS of screenshots in every post.
In fact, this single post has 78 screenshots:
To be clear:
I don’t use screenshots just for the sake of using screenshots.
I only use them if it helps someone implement a specific step.
For example, these screenshots make the 2 steps from this guide dead-simple to follow:
That said:
Screenshots only make sense when you describe something technical.
What if you’re in a non-technical niche… like fitness?
Well, pictures serve the same purpose.
For example, my friend Steve Kamb at Nerd Fitness uses pictures to show you how to do exercises the right way:
Blog Post Banners
Unlike graphs and screenshots, blog post banners serve no practical purpose.
They just look cool 🙂
Depending on the post, I either use a right-aligned 220×200 image…
…or a giant banner at the top of the post:
Graphics and Visualizations
Graphics and visualizations are kind of like charts.
But instead of visualizing data, they visualize concepts.
To be clear:
These DON’T have to be fancy.
For example, in this post I explain how all 4 versions of your site should redirect to the same URL:
This isn’t rocket science.
But it’s hard to picture this idea in your mind.
So our designer made a simple visual that makes this concept easy to understand.
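In production that redirect is a pair of 301 rules at the server, but the mapping itself can be sketched in a few lines of Python (the hostname is a placeholder):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical(url: str) -> str:
    """Collapse the four common homepage variants (http/https,
    www/non-www) to one canonical HTTPS, non-www URL."""
    parts = urlsplit(url)
    netloc = parts.netloc.lower().removeprefix("www.")
    return urlunsplit(("https", netloc, parts.path or "/", parts.query, ""))

variants = [
    "http://example.com",
    "http://www.example.com",
    "https://example.com",
    "https://www.example.com",
]
assert len({canonical(u) for u in variants}) == 1  # all four collapse to one
print(canonical("http://www.example.com"))  # → https://example.com/
```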
Step #8: Build Links to Your Page
Now it’s time to actively build links to your content.
Specifically, we’re going to tap into 3 link building strategies that are working GREAT right now.
Broken Link Building
Here’s where you find a broken link on someone’s site…
…and offer your content as a replacement.
For example, this is an outreach email that I sent to a blogger in the marketing niche:
(Note how specific I am. I don’t say “Please consider linking to me in a blog post”. I have a specific place on a specific page where my link makes sense)
And because I helped the person out BEFORE asking for anything, they were happy to add my link:
Competitor Analysis
This strategy is old school.
But it still works.
First, find a site that’s ranking for a keyword you want to rank for.
For example, I’m trying to rank for the keyword “SEO Audit”.
So I grab this result from the first page…
…and look at their backlinks.
I can see that this page has links from 160 domains:
So I should be able to get at least a handful of the same links they have.
To do that, I go one-by-one through their backlinks.
And find pages where my link would add value.
For example, this post mentions the Ahrefs content by name:
There’s no reason to link to my post there. So I moved onto the next opportunity on the list.
And I came across this post:
This time, the link to Ahrefs is part of a big list of resources.
A list that would be even BETTER and more complete with a link to my SEO audit post.
Evangelist Method
This strategy is less about links… and more about getting your content in front of the right people.
(Specifically: people that run blogs in your niche)
I’ll explain how this strategy works with an example…
A while back I wanted to promote a new Skyscraper Technique case study.
So I used BuzzSumo to see who recently shared content about The Skyscraper Technique.
And emailed everyone a variation of this template:
And when they replied “sure, I’ll check it out”, I sent them a link to the post:
(Note how I DON’T ask for a share. This is a Judo move that makes your outreach stand out)
Which led to dozens of shares to my brand-new post:
Step #9: Improve and Update Your Content
This is working amazingly well right now.
You might have read about the time that I used The Content Relaunch to boost my organic traffic by 260.7%:
And I’m happy to tell you that this approach still works.
For example, last year I relaunched this list of SEO techniques.
But I didn’t just re-post the same content and call it “new”.
Instead, I went through and removed old screenshots and images:
Added new strategies:
And deleted strategies that didn’t work anymore:
The result?
A 62.60% organic traffic boost to that page:
Bonus Step #1: Increase Your Domain Authority
This is the ultimate SEO superhack.
When you have a high Domain Authority, SEO gets A LOT easier.
For example, let’s look at the keyword “SEO audit”:
According to Ahrefs, you need backlinks from 108 websites to rank for this term:
But my content cracked the top 3 within weeks…
…with only 38 websites linking to me:
That’s the power of Domain Authority.
Here are 3 ways to increase your Domain Authority:
Content Partnerships
Partnerships can 2-5x the number of shares and links that you get from your content.
For example, my friend Larry Kim and I co-created this infographic:
And we both promoted it to our audiences on the same day:
Which got our infographic in front of thousands of people.
In fact, I still get links from this co-branded content… 2+ years later:
Publish Studies and Data
I touched on this in Step #4.
But it’s worth repeating.
In fact, if you look at my site, 3 of my top 5 most linked-to posts are studies or data-driven guides:
Guest Posts, Interviews, Speaking Gigs (and Yes) Roundup Posts
In other words:
Get your name out there… and the links will follow.
In fact, when I first started Backlinko, I guest posted like crazy:
I went on any podcast that would have me:
And I spent hours flying to countries like Romania and the Czech Republic to speak at conferences:
Even that wasn’t enough…
I was so determined to promote Backlinko that I added an “Interview Me” page on my site:
(That “Interview Me” page didn’t work. But at least I tried 🙂 )
Basically: I hustled to get my name out there.
It didn’t happen overnight.
But over time, all this work resulted in a ton of exposure… and links.
Bonus Step #2: Build a Community
A while back Google said that comments can help your rankings:
To be clear:
I’m not convinced that blog comments are a direct Google ranking factor.
But I am convinced that a community indirectly helps with SEO.
(For example, community members are more likely to share your stuff on social media)
With that, here are 2 quick tips for getting more comments on every post:
Be Picky
This is counterintuitive.
But stay with me…
Imagine you just read an AWESOME post.
And you want to leave a comment with your two cents.
But when you hit the comments section, you see this:
Are you still going to leave that comment? Probably not.
That’s why I’m SUPER picky about the comments I let through.
And this pickiness fosters great discussions, like this:
Reply To Comments
I reply to 90% of the comments that come in.
And considering we have 24,189 total comments on the Backlinko blog…
…that’s approximately 21,000 replies.
Wow. That’s a lot of replies.
And I have ZERO regrets about replying to so many comments.
Why?
These replies show people that I care.
Which turns random commenters into active members of the Backlinko Community.
Now I’d Like To Hear From You
There you have it:
My 9-step SEO strategy for 2019.
Now I’d like to hear from you…
Which strategy from today’s post are you ready to try first?
Are you going to update and relaunch older content?
Or maybe you want to try Broken Link Building.
Either way, let me know by leaving a comment below right now.
dillenwaeraa · 6 years
How to rank for head terms
Over the last few years, my mental model for what does and doesn’t rank has changed significantly, and this is especially true for head terms - competitive, high volume, “big money” keywords like “car insurance”, “laptops”, “flights”, and so on. This post is based on a bunch of real-world experience that confounded my old mental model, as well as some statistical research that I did for my presentation at SearchLove London in early October. I’ll explain my hypothesis in this post, but I’ll also explain how I think you should react to it as SEOs - in other words, how to rank for head terms.
My hypothesis in both cases is that head terms are no longer about ranking factors, and by ranking factors I mean static metrics you can source by crawling the web and weight to decide who ranks. Many before me have made the claim that user signals are increasingly influential for competitive keywords, but this is still an extension of the ranking factors model, whereby data goes in, and rankings come out. My research and experience are leading me increasingly towards a more dynamic and responsive model, in which Google systematically tests, reshuffles and refines rankings over short periods, even when sites themselves do not change.
Before we go any further, this isn’t an “SEO is dead”, “links are dead”, or “ranking factors are dead” post - rather, I think those “traditional” measures are the table stakes that qualify you for a different game.
Evidence 1: Links are less relevant in the top 5 positions
Back in early 2017, I was looking into the relationship between links and rankings, and I ran a mini ranking factor study which I published over on Moz. It wasn’t the question I was asking at the time, but one of the interesting outcomes of that study was that I found a far weaker correlation between DA and rankings than Moz had done in mid-2015.
The main difference between our studies, besides the time that had elapsed, was that Moz used the top 50 ranking positions to establish correlations, whereas I used the top 10, figuring that I wasn’t too interested in any magical ways of getting a site to jump from position 45 to position 40 - the click-through rate drop-off is quite steep enough just on the first page.
Statistically speaking, I’d maybe expect a weaker correlation when using fewer positions, but I wondered if perhaps there was more to it than that - maybe Moz had found a stronger relationship because ranking factors in general mattered more for lower rankings, where Google has less user data. Obviously, this wasn’t a fair comparison, though, so I decided to re-run my own study and compare correlations in positions 1-5 with correlations in positions 6-10. (You can read more about my methodology in the aforementioned post documenting that previous study.) I found even stronger versions of my results from 2 years ago, but this time I was looking for something else:
Domain Authority vs Rankings mean Spearman correlation by ranking position
The first thing to note here is that these are some extremely low correlation numbers - that’s to be expected when we’re dealing with only 5 points of data per keyword, and a system with so many other variables. In a regression analysis, the relationship between DA and rankings in positions 6-10 is still 98.5% statistically significant. However, for positions 1-5, it’s only around 41% statistically significant. In other words, links are fairly irrelevant for positions 1-5 in my data.
Now, this is only one ranking factor, and ~5,000 keywords, and ranking factor studies have their limitations.
Image: https://moz.com/blog/seo-ranking-factors-and-correlation
However, it’s still a compelling bit of evidence for my hypothesis. Links are the archetypal ranking factor, and Moz’s Domain Authority* is explicitly designed and optimised to use link-based data to predict rankings. This drop off in the top 5 fits with a mental model of Google continuously iterating and shuffling these results based on implied user feedback.
*I could have used Page Authority for this study, but didn’t, partly because I was concerned about URLs that Moz might not have discovered, and partly because I originally needed something that was a fair comparison with branded search volume, which is a site-level metric.
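The per-keyword method described above can be sketched in a few lines: for each SERP, correlate Domain Authority with ranking position, then average the Spearman coefficients. This is toy data and a stdlib-only Spearman for the no-ties case — the real study used roughly 5,000 keywords:

```python
def spearman(xs: list[float], ys: list[float]) -> float:
    """Spearman rank correlation, stdlib only (assumes no tied values)."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

def mean_serp_correlation(serps: list[list[float]]) -> float:
    """Each inner list is the DA of one SERP's results in ranking order.
    Correlate DA against positions 1..n per keyword, then average.
    Negative means higher DA tends to sit in better positions."""
    coeffs = [spearman(das, list(range(1, len(das) + 1))) for das in serps]
    return sum(coeffs) / len(coeffs)

# Two invented top-5 SERPs (DA listed by ranking position):
toy = [[55, 70, 40, 65, 30], [80, 60, 75, 50, 45]]
print(round(mean_serp_correlation(toy), 2))  # → -0.7
```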
Evidence 2: SERPs change when they become high volume
This is actually the example that first got me thinking about this issue - seasonal keywords. Seasonal keywords provide, in some ways, the control that we lack in typical ranking factor studies, because they’re keywords that become head terms for certain times of the year, while little else changes. Take this example:
This keyword gets the overwhelming majority of its volume in a single week every year. It goes from being a backwater search term where Google has little to go on besides “ranking factors” to a hotly contested and highly trafficked head term. So it’d be pretty interesting if the rankings changed in the same period, right? Here’s the picture 2 weeks before Mother’s Day this year:
I’ve included a bunch of factors we might consider when assessing these rankings - I’ve chosen Domain Authority as it’s the site-level link-based metric that best correlates with rankings, and branded search volume (“BSV”) as it’s a metric I’ve found to be a strong predictor of SEO “ranking power”, both in the study I mentioned previously and in my experience working with client sites. The “specialist” column is particularly interesting, as the specialised sites are obviously more focused, but typically also better optimised - M&S (marksandspencer.com, a big high-street department store in the UK) was very late to the HTTPS bandwagon, for example. However, it’s not my aim here to persuade you that these are good or correct rankings. For what it’s worth, the landing pages are fairly similar (with some exceptions I’ll get to), and I think these are the kinds of questions I’d be asking, as a search engine, if I lacked any user-signal-based data.
Here’s the picture that then unfolds:
Notice how everything goes to shit about seven days out? I don’t think it is at all a coincidence that that’s when the volume arrives. There are some pretty interesting stories if we dig into this, though. Check out the high-street brands:
Not bad eh? M&S, in particular, manages to get in above those two specialists that were jostling for 1st and 2nd previously.
These two specialist sites have a similarly interesting story:
These are probably two of the most “SEO'd” sites in this space. They might well have won a “ranking factors” competition. They have all the targeting sorted, decent technical and site speed, they use structured data for rich snippets, and so on. But, you’ve never heard of them, right?
But there are also two sites you’ve probably never heard of that did quite well:
Obviously, this is a complex picture, but I think it’s interesting that (at the time) the latter two sites had a far cleaner design than the former two. Check out Appleyard vs Serenata:
Just look at everything pulling your attention on Serenata, on the right.
Flying Flowers had another string to their bow, too - along with M&S, they were one of only two sites mentioning free delivery in their title.
But again, I’m not trying to convince you that the right websites won, or work out what Google is looking for here. The point is simpler than that: evidently, when this keyword became high volume and big money, the game changed completely. Again, this fits nicely with my hypothesis of Google using user signals to continuously shuffle its own results.
Evidence 3: Ranking changes often relate more to Google re-assessing intent than Google re-assessing ranking factors
My last piece of evidence is very recent - it relates to the so-called “Medic” update on August 1st. Distilled works with a site that was heavily affected by this update - they sell cosmetic treatments and products in the UK. That makes them a highly commercial site, and yet, here’s who won for their core keywords when Medic hit:
Site          Visibility   Type
WebMD         +6.5%        Medical encyclopedia
Bupa          +4.9%        Healthcare
NHS           +4.6%        Healthcare / Medical encyclopedia
Cosmopolitan  +4.6%        Magazine
Elle          +3.6%        Magazine
Healthline    +3.5%        Medical encyclopedia
Data courtesy of SEOmonitor.
So that’s two magazines, two medical encyclopedia-style sites, and two household name general medical info/treatment sites (as opposed to cosmetics). Zero direct competitors - and it’s not like there’s a lack of direct competitors, for what it’s worth.
And this isn’t an isolated trend - it wasn’t for this site, and it’s not for many others I’ve worked with in recent years. Transactional terms are, in large numbers, going informational.
The interesting thing about this update for this client is that although they’ve now regained their rankings, even at its worst, this never really hit their revenue figures. It’s almost like Google knew exactly what it was doing, and was testing whether people would prefer an informational result.
And again, this reinforces the picture I’ve been building over the last couple of years - this change is nothing to do with “ranking factors”. Ranking factors being re-weighted, which is what we normally think of with algorithm updates, would have only reshuffled the competitors, not boosted a load of sites with a completely different intent. Sure enough, most of the advice I see around Medic involves making your pages resemble informational pages.
Explanation: Why is this happening?
If I’ve not sold you yet on my world-view, perhaps this CNBC interview with Google will be the silver bullet.
This is a great article in many ways - its intentions are nothing to do with SEO, but rather politically motivated, after Trump called Google biased in September of this year. Nonetheless, it affords us a level of insight from the proverbial horse’s mouth that we’d never normally receive. My main takeaways are these:
In 2017, Google ran 31,584 experiments, resulting in 2,453 “search changes” - algorithm updates, to you and me. That’s roughly 7 per day.
When the interview was conducted, the team that CNBC talked to was working on an experiment involving increased use of images in search results. The metrics they were optimising for were:
The speed with which users interacted with the SERP
The rate at which they quickly bounced back to the search results (note: if you think about it, this is not equivalent to and probably not even correlated with bounce rate in Google Analytics).
It’s important to remember that Google search engineers are people doing jobs with targets and KPIs just like the rest of us. And their KPI is not to get the sites with the best-ranking factors to the top - ranking factors, whether they be links, page speed, title tags or whatever else are just a means to an end.
Under this model, with those explicit KPIs, we as SEOs equally ought to be thinking about “ranking factors” like price, aesthetics, and the presence or lack of pop-ups, banners, and interstitials.
Now, admittedly, this article does not explicitly confirm or even mention a dynamic model like the one I’ve discussed earlier in this article. But it does discuss a mindset at Google that very much leads in that direction - if Google knows it’s optimising for certain user signals, and it can also collect those signals in real-time, why not be responsive?
Implications: How to rank for head terms
As I said at the start of this article, I am not suggesting for a moment that the fundamentals of SEO we’ve been practising for the last however many years are suddenly obsolete. At Distilled, we’re still seeing clients earn results and growth from cleaning up their technical SEO, improving their information architecture, or link-focused creative campaigns - all of which are reliant on an “old school” understanding of how Google works. Frankly, the continued existence of SEO as an industry is in itself reasonable proof that these methods, on average, pay for themselves.
But the picture is certainly more nuanced at the top, and I think those Google KPIs are an invaluable sneak peek into what that picture might look like. As a reminder, I’m talking about:
The speed with which users interact with a SERP (quicker is better)
The rate at which they quickly bounce back to results (lower is better)
There are some obvious ways we can optimise for these as SEOs, some of which are well within our wheelhouse, and some of which we might typically ignore. For example:
Optimising for SERP interaction speed - getting that “no-brainer” click on your site:
Metadata - we’ve been using this to stand out in search results for years
E.g. “free delivery” in title
E.g. professionally written meta description copy
Brand awareness/perception - think about whether you’d be likely to click on the Guardian or Forbes with similar articles for the same query
Optimising for rate of return to SERPs:
Site speed - have you ever bailed on a slow site, especially on mobile?
First impression - the “this isn’t what I expected” or “I can’t be bothered” factor
Price
Pop-ups etc.
Aesthetics(!)
As I said, some of these can be daunting to approach as digital marketers, because they’re a little outside of our usual playbook. But actually, lots of stuff we do for other reasons ends up being very efficient for these metrics - for example, if you want to improve your site’s brand awareness, how about top of funnel SEO content, top of funnel social content, native advertising, display, or carefully tailored post-conversion email marketing? If you want to improve first impressions, how about starting with a Panda survey of you and your competitors?
Similarly, these KPIs can seem harder to measure than our traditional metrics, but this is another area where we’re better equipped than we sometimes think. We can track click-through rates in Google Search Console (although you’ll need to control for rankings & keyword make-up), we can track something resembling intent satisfaction via scroll tracking, and I’ve talked before about how to get started measuring brand awareness.
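As a rough illustration of what “controlling for rankings” might look like, the sketch below compares each query’s click-through rate against a pooled baseline CTR for its ranking position, so a decent CTR at position 4 isn’t judged against position 1. The rows are hypothetical, not a real Search Console export:

```python
# Sketch: position-adjusted CTR from a (hypothetical) Search Console export.
from collections import defaultdict

rows = [  # hypothetical export rows
    {"query": "mothers day flowers", "position": 1, "impressions": 5000, "clicks": 1500},
    {"query": "cheap flowers",       "position": 1, "impressions": 3000, "clicks": 700},
    {"query": "flower delivery",     "position": 4, "impressions": 4000, "clicks": 400},
    {"query": "roses online",        "position": 4, "impressions": 2000, "clicks": 260},
]

# Baseline CTR per (rounded) position, pooled across all queries.
imps = defaultdict(int)
clicks = defaultdict(int)
for r in rows:
    pos = round(r["position"])
    imps[pos] += r["impressions"]
    clicks[pos] += r["clicks"]
baseline = {pos: clicks[pos] / imps[pos] for pos in imps}

# Each query's CTR relative to the baseline for its position: a query
# beating its positional baseline is earning the "no-brainer" click.
for r in rows:
    ctr = r["clicks"] / r["impressions"]
    expected = baseline[round(r["position"])]
    print(f'{r["query"]:20s} CTR {ctr:.1%} vs {expected:.1%} expected at this position')
```

In practice you would also want to segment by keyword type (branded vs non-branded, informational vs transactional), since the CTR curve differs substantially between them.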
Some of this (perhaps frustratingly!) comes down to being “ready” to rank - if your product and customer experience is not up to scratch, no amount of SEO can save you from that in this new world, because Google is explicitly trying to give customers results that win on product and customer experience, not on SEO.
There’s also the intent piece - I think a lot of brands need to be readier than they are for some of their biggest head terms “going informational on them”. This means having great informational content in place and ready to go - and by that, I do not mean a quick blog post or a thinly veiled product page. Relatedly, I’d recommend this in-depth article about predicting and building for “latent intents” as a starting point.
Summary
I’ve tried in this article to summarise how I see the SEO game changing, and how I think we need to adapt. If you have two main takeaways, I’d like it to be those two KPIs - the speed with which users interact with a SERP, and the rate at which they quickly bounce back to results (lower is better) - and what they really mean for your marketing strategy.
What I don’t want you to take away is that I’m in any way undermining SEO fundamentals - links, on-page, or whatever else. That’s still how you qualify, how you get to a position where Google has any user signals from your site to start with. All that said, I know this is a controversial topic, and this post is heavily driven by my own experience, so I’d love to hear your thoughts below!
from Marketing https://www.distilled.net/resources/how-to-rank-for-head-terms/ via http://www.rssmix.com/
Site Visibility Type WebMB +6.5% Medical encyclopedia Bupa +4.9% Healthcare NHS +4.6% Healthcare / Medical encyclopedia Cosmopolitan +4.6% Magazine Elle +3.6% Magazine Healthline +3.5% Medical encyclopedia
Data courtesy of SEOmonitor.
So that’s two magazines, two medical encyclopedia-style sites, and two household name general medical info/treatment sites (as opposed to cosmetics). Zero direct competitors - and it’s not like there’s a lack of direct competitors, for what it’s worth.
And this isn’t an isolated trend - it wasn’t for this site, and it’s not for many others I’ve worked with in recent years. Transactional terms are, in large numbers, going informational.
The interesting thing about this update for this client, is that although they’ve now regained their rankings, even at its worst, this never really hit their revenue figures. It’s almost like Google knew exactly what it was doing, and was testing whether people would prefer an informational result.
And again, this reinforces the picture I’ve been building over the last couple of years - this change is nothing to do with “ranking factors”. Ranking factors being re-weighted, which is what we normally think of with algorithm updates, would have only reshuffled the competitors, not boosted a load of sites with a completely different intent. Sure enough, most of the advice I see around Medic involves making your pages resemble informational pages.
Explanation: Why is this happening?
If I’ve not sold you yet on my world-view, perhaps this CNBC interview with Google will be the silver bullet.
This is a great article in many ways - its intentions are nothing to do with SEO, but rather politically motivated, after Trump called Google biased in September of this year. Nonetheless, it affords us a level of insight form the proverbial horse’s mouth that we’d never normally receive. My main takeaways are these:
In 2017, Google ran 31,584 experiments, resulting in 2,453 “search changes” - algorithm updates, to you and me. That’s roughly 7 per day.
When the interview was conducted, the team that CNBC talked to was working on an experiment involving increased use of images in search results. The metrics they were optimising for were:
The speed with which users interacted with the SERP
The rate at which they quickly bounced back to the search results (note: if you think about it, this is not equivalent to and probably not even correlated with bounce rate in Google Analytics).
It’s important to remember that Google search engineers are people doing jobs with targets and KPIs just like the rest of us. And their KPI is not to get the sites with the best-ranking factors to the top - ranking factors, whether they be links, page speed, title tags or whatever else are just a means to an end.
Under this model, with those explicit KPIs, as an SEO we equally ought to be thinking about “ranking factors” like price, aesthetics, and the presence or lack of pop-ups, banners, and interstitials.
Now, admittedly, this article does not explicitly confirm or even mention a dynamic model like the one I’ve discussed earlier in this article. But it does discuss a mindset at Google that very much leads in that direction - if Google knows it’s optimising for certain user signals, and it can also collect those signals in real-time, why not be responsive?
Implications: How to rank for head terms
As I said at the start of this article, I am not suggesting for a moment that the fundamentals of SEO we’ve been practising for the last however many years are suddenly obsolete. At Distilled, we’re still seeing clients earn results and growth from cleaning up their technical SEO, improving their information architecture, or link-focused creative campaigns - all of which are reliant on an “old school” understanding of how Google works. Frankly, the continued existence of SEO as an industry is in itself reasonable proof that these methods, on average, pay for themselves.
But the picture is certainly more nuanced at the top, and I think those Google KPIs are an invaluable sneak peek into what that picture might look like. As a reminder, I’m talking about:
The speed with which users interact with a SERP (quicker is better)
The rate at which they quickly bounce back to results (lower is better)
There are some obvious ways we can optimise for these as SEOs, some of which are well within our wheelhouse, and some of which we might typically ignore. For example:
Optimising for SERP interaction speed - getting that “no-brainer” click on your site:
Metadata - we’ve been using this to stand out in search results for years
E.g. “free delivery” in title
E.g. professionally written meta description copy
Brand awareness/perception - think about whether you’d be likely to click on the Guardian or Forbes with similar articles for the same query
Optimising for rate of return to SERPs:
Sitespeed - have you ever bailed on a slow site, especially on mobile?
First impression - the “this isn’t what I expected” or “I can’t be bothered” factor
Price
Pop-ups etc.
Aesthetics(!)
As I said, some of these can be daunting to approach as digital marketers, because they’re a little outside of our usual playbook. But actually, lots of stuff we do for other reasons ends up being very efficient for these metrics - for example, if you want to improve your site’s brand awareness, how about top of funnel SEO content, top of funnel social content, native advertising, display, or carefully tailored post-conversion email marketing? If you want to improve first impressions, how about starting with a Panda survey of you and your competitors?
Similarly, these KPIs can seem harder to measure than our traditional metrics, but this is another area where we’re better equipped than we sometimes think. We can track click-through rates in Google Search Console (although you’ll need to control for rankings & keyword make-up), we can track something resembling intent satisfaction via scroll tracking, and I’ve talked before about how to get started measuring brand awareness.
Some of this (perhaps frustratingly!) comes down to being “ready” to rank - if your product and customer experience is not up to scratch, no amount of SEO can save you from that in this new world, because Google is explicitly trying to give customers results that win on product and customer experience, not on SEO.
There’s also the intent piece - I think a lot of brands need to be readier than they are for some of their biggest head terms “going informational on them”. This means having great informational content in place and ready to go - and by that, I do not mean a quick blog post or a thinly veiled product page. Relatedly, I’d recommend this in-depth article about predicting and building for “latent intents” as a starting point.
Summary
I’ve tried in this article to summarise how I see the SEO game-changing, and how I think we need to adapt. If you have two main takeaways, I’d like it to be those two KPIs - the speed with which users interact with a SERP, and the rate at which they quickly bounce back to results (lower is better) - and what they really mean for your marketing strategy.
What I don’t want you to take away is that I’m in any way undermining SEO fundamentals - links, on-page, or whatever else. That’s still how you qualify, how you get to a position where Google has any user signals from your site to start with. All that said, I know this is a controversial topic, and this post is heavily driven by my own experience, so I’d love to hear your thoughts below!
from Digital Marketing https://www.distilled.net/resources/how-to-rank-for-head-terms/ via http://www.rssmix.com/
0 notes
donnafmae · 6 years
How to rank for head terms
Over the last few years, my mental model for what does and doesn’t rank has changed significantly, and this is especially true for head terms - competitive, high volume, “big money” keywords like “car insurance”, “laptops”, “flights”, and so on. This post is based on a bunch of real-world experience that confounded my old mental model, as well as some statistical research that I did for my presentation at SearchLove London in early October. I’ll explain my hypothesis in this post, but I’ll also explain how I think you should react to it as SEOs - in other words, how to rank for head terms.
My hypothesis in both cases is that head terms are no longer about ranking factors, and by ranking factors I mean static metrics you can source by crawling the web and weight to decide who ranks. Many before me have made the claim that user signals are increasingly influential for competitive keywords, but this is still an extension of the ranking factors model, whereby data goes in, and rankings come out. My research and experience are leading me increasingly towards a more dynamic and responsive model, in which Google systematically tests, reshuffles and refines rankings over short periods, even when the sites themselves do not change.
Before we go any further, this isn’t an “SEO is dead”, “links are dead”, or “ranking factors are dead” post - rather, I think those “traditional” measures are the table stakes that qualify you for a different game.
Evidence 1: Links are less relevant in the top 5 positions
Back in early 2017, I was looking into the relationship between links and rankings, and I ran a mini ranking factor study which I published over on Moz. It wasn’t the question I was asking at the time, but one of the interesting outcomes of that study was that I found a far weaker correlation between DA and rankings than Moz had done in mid-2015.
The main difference between our studies, besides the time that had elapsed, was that Moz used the top 50 ranking positions to establish correlations, whereas I used the top 10, figuring that I wasn’t too interested in any magical ways of getting a site to jump from position 45 to position 40 - the click-through rate drop-off is quite steep enough just on the first page.
Statistically speaking, I’d maybe expect a weaker correlation when using fewer positions, but I wondered if perhaps there was more to it than that - maybe Moz had found a stronger relationship because ranking factors in general mattered more for lower rankings, where Google has less user data. Obviously, this wasn’t a fair comparison, though, so I decided to re-run my own study and compare correlations in positions 1-5 with correlations in positions 6-10. (You can read more about my methodology in the aforementioned post documenting that previous study.) I found even stronger versions of my results from 2 years ago, but this time I was looking for something else:
Domain Authority vs Rankings mean Spearman correlation by ranking position
The first thing to note here is that these are some extremely low correlation numbers - that’s to be expected when we’re dealing with only 5 points of data per keyword, and a system with so many other variables. In a regression analysis, the relationship between DA and rankings in positions 6-10 is still statistically significant at the 98.5% level. For positions 1-5, however, it’s only significant at around the 41% level. In other words, links are fairly irrelevant for positions 1-5 in my data.
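As a rough illustration of the methodology (a hedged sketch, not the original study code - the Domain Authority values below are invented), Spearman’s correlation over five ranking positions is just Pearson’s correlation computed on the ranks:

```python
# Illustrative sketch of the per-keyword correlation used in the study.
# All data below is invented; the real study used ~5,000 live SERPs.

def rankdata(xs):
    """Assign average 1-based ranks, handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # extend j over any group of tied values
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for the tied group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman correlation = Pearson correlation of the ranks."""
    rx, ry = rankdata(xs), rankdata(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# One hypothetical SERP: Domain Authority at positions 1-5 vs 6-10.
# Position 1 is "best", so a strong *negative* correlation with DA
# means higher-DA sites rank higher.
top5_da = [55, 70, 40, 62, 48]   # positions 1-5: weak relationship
low5_da = [58, 50, 44, 38, 30]   # positions 6-10: DA falls cleanly with rank

print(spearman(list(range(1, 6)), top5_da))
print(spearman(list(range(1, 6)), low5_da))
```

Averaging these per-keyword coefficients across thousands of SERPs, separately for positions 1-5 and 6-10, gives the kind of comparison plotted above.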
Now, this is only one ranking factor, and ~5,000 keywords, and ranking factor studies have their limitations.
Image: https://moz.com/blog/seo-ranking-factors-and-correlation
However, it’s still a compelling bit of evidence for my hypothesis. Links are the archetypal ranking factor, and Moz’s Domain Authority* is explicitly designed and optimised to use link-based data to predict rankings. This drop off in the top 5 fits with a mental model of Google continuously iterating and shuffling these results based on implied user feedback.
*I could have used Page Authority for this study, but didn’t, partly because I was concerned about URLs that Moz might not have discovered, and partly because I originally needed something that was a fair comparison with branded search volume, which is a site-level metric.
Evidence 2: SERPs change when they become high volume
This is actually the example that first got me thinking about this issue - seasonal keywords. Seasonal keywords provide, in some ways, the control that we lack in typical ranking factor studies, because they’re keywords that become head terms for certain times of the year, while little else changes. Take this example:
This keyword gets the overwhelming majority of its volume in a single week every year. It goes from being a backwater search term where Google has little to go on besides “ranking factors” to a hotly contested and highly trafficked head term. So it’d be pretty interesting if the rankings changed in the same period, right? Here’s the picture 2 weeks before Mother’s Day this year:
I’ve included a bunch of factors we might consider when assessing these rankings. I’ve chosen Domain Authority as it’s the site-level link-based metric that best correlates with rankings, and branded search volume (“BSV”) as it’s a metric I’ve found to be a strong predictor of SEO “ranking power”, both in the study I mentioned previously and in my experience working with client sites. The “specialist” column is particularly interesting: the specialised sites are obviously more focused, but typically also better optimised - M&S (marksandspencer.com, a big high-street department store in the UK) was very late to the HTTPS bandwagon, for example. It’s not my aim here to persuade you that these are good or correct rankings, but for what it’s worth, the landing pages are fairly similar (with some exceptions I’ll get to), and these are the kinds of questions I’d be asking, as a search engine, if I lacked any user-signal-based data.
Here’s the picture that then unfolds:
Notice how everything goes to shit about seven days out? I don’t think it is at all a coincidence that that’s when the volume arrives. There are some pretty interesting stories if we dig into this, though. Check out the high-street brands:
Not bad eh? M&S, in particular, manages to get in above those two specialists that were jostling for 1st and 2nd previously.
These two specialist sites have a similarly interesting story:
These are probably two of the most “SEO'd” sites in this space. They might well have won a “ranking factors” competition. They have all the targeting sorted, decent technical SEO and site speed, they use structured data for rich snippets, and so on. But you’ve never heard of them, right?
But there are also two sites you’ve probably never heard of that did quite well:
Obviously, this is a complex picture, but I think it’s interesting that (at the time) the latter two sites had a far cleaner design than the former two. Check out Appleyard vs Serenata:
Just look at everything pulling your attention on Serenata, on the right.
Flying Flowers had another string to their bow, too - along with M&S, they were one of only two sites mentioning free delivery in their title.
But again, I’m not trying to convince you that the right websites won, or work out what Google is looking for here. The point is simpler than that: evidently, when this keyword became high volume and big money, the game changed completely. Again, this fits nicely with my hypothesis of Google using user signals to continuously shuffle its own results.
Evidence 3: Ranking changes often relate more to Google re-assessing intent than Google re-assessing ranking factors
My last piece of evidence is very recent - it relates to the so-called “Medic” update on August 1st. Distilled works with a site that was heavily affected by this update - they sell cosmetic treatments and products in the UK. That makes them a highly commercial site, and yet, here’s who won for their core keywords when Medic hit:
Site          Visibility   Type
WebMD         +6.5%        Medical encyclopedia
Bupa          +4.9%        Healthcare
NHS           +4.6%        Healthcare / Medical encyclopedia
Cosmopolitan  +4.6%        Magazine
Elle          +3.6%        Magazine
Healthline    +3.5%        Medical encyclopedia
Data courtesy of SEOmonitor.
So that’s two magazines, two medical encyclopedia-style sites, and two household-name general medical info/treatment sites (as opposed to cosmetics). Zero direct competitors - and it’s not for lack of direct competitors in this space, for what it’s worth.
And this isn’t an isolated trend - it wasn’t for this site, and it’s not for many others I’ve worked with in recent years. Transactional terms are, in large numbers, going informational.
The interesting thing about this update for this client is that although they’ve now regained their rankings, even at its worst it never really hit their revenue figures. It’s almost like Google knew exactly what it was doing, and was testing whether people would prefer an informational result.
And again, this reinforces the picture I’ve been building over the last couple of years - this change is nothing to do with “ranking factors”. Ranking factors being re-weighted, which is what we normally think of with algorithm updates, would have only reshuffled the competitors, not boosted a load of sites with a completely different intent. Sure enough, most of the advice I see around Medic involves making your pages resemble informational pages.
Explanation: Why is this happening?
If I’ve not sold you yet on my world-view, perhaps this CNBC interview with Google will be the silver bullet.
This is a great article in many ways - its intent has nothing to do with SEO; it was politically motivated, following Trump calling Google biased in September of this year. Nonetheless, it affords us a level of insight from the proverbial horse’s mouth that we’d never normally receive. My main takeaways are these:
In 2017, Google ran 31,584 experiments, resulting in 2,453 “search changes” - algorithm updates, to you and me. That’s roughly 7 per day.
When the interview was conducted, the team that CNBC talked to was working on an experiment involving increased use of images in search results. The metrics they were optimising for were:
The speed with which users interacted with the SERP
The rate at which they quickly bounced back to the search results (note: if you think about it, this is not equivalent to and probably not even correlated with bounce rate in Google Analytics).
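To make that parenthetical concrete, here’s a toy sketch with invented session data (not a real analytics API) showing why the two metrics can disagree: a user who reads one page thoroughly and leaves is a GA bounce but a satisfied searcher, while a user who views a few pages and then pogo-sticks back to Google isn’t a GA bounce at all:

```python
# Toy sessions: (pageviews_in_session, seconds_before_returning_to_SERP).
# None for the return time means the user never went back to the results.
sessions = [
    (1, None),  # read one page for ages, left satisfied: GA bounce, no SERP return
    (1, 8),     # pogo-stick: GA bounce AND a quick return to the SERP
    (3, 12),    # browsed a bit, then bounced back fast: NOT a GA bounce
    (2, None),  # converted: neither metric fires
]

QUICK = 30  # assumed threshold (seconds) for a "quick" return; Google's is unknown

ga_bounce_rate = sum(1 for pv, _ in sessions if pv == 1) / len(sessions)
quick_return_rate = sum(
    1 for _, back in sessions if back is not None and back <= QUICK
) / len(sessions)

print(ga_bounce_rate)     # same headline number here...
print(quick_return_rate)  # ...but counting different sessions
```

In this invented sample the two rates happen to be equal, yet they flag different sessions - which is exactly why optimising GA bounce rate is not the same as optimising the rate of quick returns to the SERP.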
It’s important to remember that Google search engineers are people doing jobs with targets and KPIs just like the rest of us. And their KPI is not to get the sites with the best ranking factors to the top - ranking factors, whether they be links, page speed, title tags or whatever else, are just a means to an end.
Under this model, with those explicit KPIs, we as SEOs equally ought to be thinking about “ranking factors” like price, aesthetics, and the presence or absence of pop-ups, banners, and interstitials.
Now, admittedly, this article does not explicitly confirm or even mention a dynamic model like the one I’ve discussed earlier in this article. But it does discuss a mindset at Google that very much leads in that direction - if Google knows it’s optimising for certain user signals, and it can also collect those signals in real-time, why not be responsive?
Implications: How to rank for head terms
As I said at the start of this article, I am not suggesting for a moment that the fundamentals of SEO we’ve been practising for the last however many years are suddenly obsolete. At Distilled, we’re still seeing clients earn results and growth from cleaning up their technical SEO, improving their information architecture, or link-focused creative campaigns - all of which are reliant on an “old school” understanding of how Google works. Frankly, the continued existence of SEO as an industry is in itself reasonable proof that these methods, on average, pay for themselves.
But the picture is certainly more nuanced at the top, and I think those Google KPIs are an invaluable sneak peek into what that picture might look like. As a reminder, I’m talking about:
The speed with which users interact with a SERP (quicker is better)
The rate at which they quickly bounce back to results (lower is better)
There are some obvious ways we can optimise for these as SEOs, some of which are well within our wheelhouse, and some of which we might typically ignore. For example:
Optimising for SERP interaction speed - getting that “no-brainer” click on your site:
Metadata - we’ve been using this to stand out in search results for years
E.g. “free delivery” in title
E.g. professionally written meta description copy
Brand awareness/perception - think about whether you’d be likely to click on the Guardian or Forbes with similar articles for the same query
Optimising for rate of return to SERPs:
Site speed - have you ever bailed on a slow site, especially on mobile?
First impression - the “this isn’t what I expected” or “I can’t be bothered” factor
Price
Pop-ups etc.
Aesthetics(!)
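One cheap way to act on the metadata point above is to audit which competitors mention a unique selling point in their titles, as Flying Flowers and M&S did with free delivery. A minimal sketch - the domains, titles and USP list here are all invented; in practice you’d export real titles from a rank tracker:

```python
# Hypothetical SERP titles for a transactional query (invented examples).
titles = {
    "example-florist-a.co.uk": "Flowers Delivered | Free Delivery | Example Florist A",
    "example-florist-b.co.uk": "Send Flowers Online - Example Florist B",
    "example-store.co.uk": "Mother's Day Flowers - Free Next Day Delivery",
}

# Assumed USP phrases worth scanning for; tailor these to your vertical.
USPS = ["free delivery", "next day", "sale", "% off"]

for domain, title in titles.items():
    found = [usp for usp in USPS if usp in title.lower()]
    print(domain, "->", found or "no USP in title")
```

A gap in this audit (a USP you can truthfully claim that no competitor mentions) is a direct candidate for a title test.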
As I said, some of these can be daunting to approach as digital marketers, because they’re a little outside of our usual playbook. But actually, lots of stuff we do for other reasons ends up being very efficient for these metrics - for example, if you want to improve your site’s brand awareness, how about top of funnel SEO content, top of funnel social content, native advertising, display, or carefully tailored post-conversion email marketing? If you want to improve first impressions, how about starting with a Panda survey of you and your competitors?
Similarly, these KPIs can seem harder to measure than our traditional metrics, but this is another area where we’re better equipped than we sometimes think. We can track click-through rates in Google Search Console (although you’ll need to control for rankings & keyword make-up), we can track something resembling intent satisfaction via scroll tracking, and I’ve talked before about how to get started measuring brand awareness.
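On the control-for-rankings caveat: one hedged approach is to compare each query’s CTR against a benchmark for its average position, rather than against a single site-wide figure. A minimal sketch over an invented Search Console-style export (the benchmark curve is an assumption for illustration, not Google data):

```python
# Invented rows mimicking a Search Console performance export:
# (query, clicks, impressions, average position)
rows = [
    ("flowers", 120, 1000, 3.2),
    ("flowers delivered", 90, 400, 1.8),
    ("cheap flowers", 10, 500, 6.4),
]

# Assumed benchmark CTR by rounded position - in practice you'd derive
# this from an industry CTR curve or your own historical data.
BENCHMARK = {1: 0.30, 2: 0.16, 3: 0.10, 4: 0.07, 5: 0.05, 6: 0.04, 7: 0.03}

for query, clicks, impressions, pos in rows:
    ctr = clicks / impressions
    expected = BENCHMARK[round(pos)]
    verdict = "above" if ctr > expected else "below"
    print(f"{query}: CTR {ctr:.1%} is {verdict} the ~{expected:.1%} benchmark at position {pos}")
```

Queries sitting well below their positional benchmark are the ones where better titles and descriptions are most likely to pay off.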
Some of this (perhaps frustratingly!) comes down to being “ready” to rank - if your product and customer experience is not up to scratch, no amount of SEO can save you from that in this new world, because Google is explicitly trying to give customers results that win on product and customer experience, not on SEO.
There’s also the intent piece - I think a lot of brands need to be readier than they are for some of their biggest head terms “going informational on them”. This means having great informational content in place and ready to go - and by that, I do not mean a quick blog post or a thinly veiled product page. Relatedly, I’d recommend this in-depth article about predicting and building for “latent intents” as a starting point.
Summary
I’ve tried in this article to summarise how I see the SEO game changing, and how I think we need to adapt. If you have two main takeaways, I’d like them to be those two KPIs - the speed with which users interact with a SERP, and the rate at which they quickly bounce back to results (lower is better) - and what they really mean for your marketing strategy.
What I don’t want you to take away is that I’m in any way undermining SEO fundamentals - links, on-page, or whatever else. That’s still how you qualify, how you get to a position where Google has any user signals from your site to start with. All that said, I know this is a controversial topic, and this post is heavily driven by my own experience, so I’d love to hear your thoughts below!
from Marketing https://www.distilled.net/resources/how-to-rank-for-head-terms/ via http://www.rssmix.com/
Is SEO 2019 The Most Trending Thing Now?
Good SEO articles increase a website’s search traffic because articles are indexed by search engines. SEO, or search engine optimisation, is a method of increasing traffic to an online business. Cisco estimated that by 2019, video will make up 80% of all consumer internet traffic. In line with that, SEOs can improve landing pages with proper keywords, meta tags, and overall site structure. Including relevant keywords in the title, URL, and headers of a page, and making sure a site is crawlable, are steps site owners can take to improve their SEO. To satisfy intent and rank well in the long run, build your SEO strategy around topics, not keywords; if you do that, you’ll find you naturally optimise for important keywords anyway. Artificial intelligence (AI) and voice search are two technologies already affecting SEO that will likely have an even bigger effect in 2019 and the years to come; they also matter for video SEO and organic ranking on Google. Remember that keywords remain central to SEO writing. A thorough site audit reveals opportunities for improvement across social media, technology, local SEO, traffic, usability, search rankings, and mobile SEO, and an SEO expert or agency can assess how your site performs against your most important keywords. Since search engines run complex, ever-changing algorithms and every business’s marketing needs are unique, no one can responsibly promise specific rankings.
We think SEO in 2019 will see a shift towards user intent, problem solving, and hyper-locality, capitalising on the continuing rise of voice search. Demographic data will matter more in 2019 as far as keyword rankings are concerned. Links remain one of the most important SEO ranking factors. Readers increasingly recognise the lower quality of sites employing black-hat SEO at the expense of reader experience, which reduces those sites’ traffic and rankings over time. There is no “best practice” character count any SEO can guarantee will make a title display in full in Google: the snippet title shown varies by device. While getting as many pages as possible indexed was historically the priority, Google now weighs the quality and type of the pages it indexes. Jason Scott, Digital Marketing Specialist with Archway Cards Ltd, also believes voice will be the trend of 2019 rather than 2018. Despite the hype about whether SEO is dead, organic search remains one of the highest-ROI digital marketing channels. A reputable SEO company will build out your company profile and optimise your local business listings. Link building is the primary goal of off-page SEO, and is an enormous factor in how search engines rank your pages. These techniques apply mainly to HTML and CSS pages; Flash sites fare poorly with search engines and SEO.
In fact, a large part of SEO in 2018 is writing for humans AND search engines (also known as “SEO copywriting”). SEO is a significant part of any web marketing strategy. A rel=canonical link handles content syndication: it allows other blogs to republish your work (similar to franchising) without hurting your site’s rankings, getting your brand and content out to multiple outlets for greater reach without harming your own search results. We offer affordable SEO services to clients across the globe. If you want a strong social media strategy, you simply can’t ignore SEO. W3 Total Cache is another SEO-related WordPress plugin; it improves the performance of a WordPress blog or website, for example by integrating content delivery networks to reduce page load times.
Use Google Analytics to measure social SEO factors like the number of owned and earned results. A common SEO issue for e-commerce sites is that product review functionality tends to rely on AJAX, iframes, or subdomains, which makes it very hard, or even impossible, for search engines to match product reviews with product pages. Local SEO is an effective way to market your business online. For more on Squarespace’s SEO-friendly features and how to use them, see its guidance on increasing your site’s visibility to search engines. For e-commerce sites, SEO agencies can see which paths users take to complete a sale, down to which keyword they searched before buying. Keep improving your SEO to build your business, drive traffic, and rank for more terms. SEO refers to the process of gaining a strong position in search engine listings. Organic SEO is also less costly long-term once you establish search credibility, as long as you maintain it with consistent, high-quality content and social media use. In the sixth section of our SEO beginners’ guide, we discuss link building, one of the most important aspects of search engine optimisation. SEO can be defined as the process of enhancing a website efficiently and effectively to build a presence on major search engines. The traditional approach has centred on creating excellent content that is easily crawlable by search engine bots. To understand SEO, you’ll also need to know how Google search works.
2019 will still reward many of these newly adopted tactics, but search engine optimization experts suggest there will be a great deal more. Currently more than half of searches come from mobile devices, and that number will surely go up in 2019. In the remaining quarter of 2018, you should invest in mobile-first content that will rank you higher in mobile search in 2019. The vast majority of your SEO learning should come from online resources, but there are a few books that will help you conceptually understand the history of search, search engines, and how SEO has changed through the years. There are several methods webmasters use to attract potential customers to their sites, one of the most important and effective being search engine optimization (SEO). SEO matters for many companies because if people find you through a web search and see what they're looking for, you can receive plenty of new web visitors who help you earn more money. Good SEO follows the guidelines that Google determines are best practices for having your content ranked on top. SEO services do thorough keyword analysis for a specific website, then optimize its content on the basis of those keywords and the site's theme. Performing SEO on your own websites is a great way to practice and hone your SEO skills. We optimize your website for both internal and external factors that Google's engine trusts for top-ranking search results, with your SEO ranking guaranteed: no success, full refund. Looking deeper: an SEO cost often means one of two things: the investment in your organic search strategy, or how much you pay for paid search engine marketing (SEM) services like Google AdWords.
We are dealing with new methods designed to target old-style SEO tactics built on the truism that website REPUTATION plus lots of PAGES plus SEO equals lots of keywords equals lots of Google traffic. BrightonSEO is a one-day search marketing conference and series of training courses held, unsurprisingly, in Brighton. From narrowing down target markets to changing the way content is written, AI and voice search will have a continuing effect on SEO going forward. One aspect of SEO is link building, which we discuss below, and which can lead to thin content. Applications for the 2018-2019 cycle of the SEO Progress programme are now closed and will re-open in spring 2019. SEO can be difficult because search engines are constantly reevaluating and changing how they prioritize search results. Moreover, it helps SEO by gaining backlinks, likes, comments, or shares. SEO focuses on rankings in the organic (non-paid) search results. SEO therefore helps you get traffic from search engines. With Ahrefs, a great starting place for SEO keyword research is the Keywords Explorer tool. Topics: SEO, backlinks, content marketing, social media marketing and advertising, analytics, and more. 60+ sessions on hot topics, success stories, and strategies in SEO, SEA, PPC, social media, and online marketing, plus the SMX Future track. 2018, I believe, will largely be a year of catching up on current SEO themes, with the biggest "trends" occurring around voice queries in search results. Accordingly, SEOmonitor tracks all the related data that could influence SEO performance and displays it in the Keyword Events Timeline.
If you forget that quality content is a top priority, then you can forget about having an SEO strategy. The way we phrase questions to a device differs from person to person; therefore, optimizing your site to be 100% mobile-friendly and to handle mobile voice search is essential to SEO in 2019. I am just a newbie, and frankly I do get frustrated whenever an article doesn't rank at or near the top, but people like you and many others inspire me to never give up. There are many points I hadn't recognized until now, but thanks to you I keep learning important things about SEO.
Tumblr media
You will be introduced to the foundational components of how search engines such as Google work, how the SEO landscape has changed, and what you can expect in the future. The SISTRIX Toolbox consists of six modules: 1) SEO, 2) Universal, 3) Links, 4) Ads, 5) Social, and 6) Optimizer. Low-quality content can severely impact the success of SEO in 2018. When your SEO starts building on strong foundations, competitors may start maligning your SEO backlinks. With content marketing spend likely to reach $300 billion by 2019, this statistic is worrisome. Now that digital marketing companies know how to use AI for SEO, in coming years AI may dominate how they develop their SEO strategies. Ray Cheselka, AdWords and SEO Manager at webFEAT Complete, a design and SEO agency, states that in 2019 search intent will continue to become even more important. Expert writers of SEO articles take the time to study the industry or niche market. Two tools that help with local SEO are BrightLocal (for rankings) and MozLocal (for local search optimization). Regarding on-page SEO best practices, I usually link out to quality related pages on other websites where possible and where a reader would find it valuable. It includes the well-known "Periodic Table of SEO Success Factors," a nine-chapter guide to SEO basics, and links to several of the site's most useful blog posts. Something that was troubling about 2017, and as we head into 2018, is the new wave of companies merely bolting on SEO as a service without any genuine appreciation of structuring site architectures and content for both human beings and search engine understanding.
A huge part of SEO is creating content targeted to the keywords that search engine users are actually searching for. Link building is a big part of your work as an SEO because links play such a big part in the algorithms search engines use to determine the order of their results. Search engines, from the king, Google, to secondary ones like Bing, rank websites on the basis of their SEO. That's because creating content and ranking it in Google is what SEOs do! SEO providers look at their links, where those links are coming from, and what keywords they have chosen to optimize their sites for. Hobo UK SEO - A Beginner's Guide (2018) is a free PDF ebook you can download free from here (2 MB) that contains my notes about driving increased organic traffic to a site within Google's guidelines. My prediction is that the biggest SEO trend in 2019 is going to be Amazon search. TYPES OF SEO: there are two major types of search engine optimization, white hat (the 'good' kind) and black hat (the 'not so good' kind). Google is always tweaking its search algorithms, so there's no guarantee that SEO practices that worked in the past will keep working in the future. SEO and content marketing software for e-commerce businesses, agencies, and enterprises. Calib Backe, SEO Manager for Maple Holistics, writes that mobile and voice are going to continue their dominance as we rely on desktop less and less. If much of your competition has hired SEO firms to focus on 10-20 keywords in a moderately competitive industry, then you will have to spend a bit more. The Google Search Console may be the most essential SEO tool in the world. No BS.
When you're done with the tired clichés told over and over again at SEO conferences, you're ready to experience UnGagged, an unconventional SEO and digital marketing conference that delivers real-world results. SEO is an acronym for "search engine optimization." Search engine optimization is all about doing specific things to your website to drive more traffic to it so that you can increase online sales, and traffic. By 2019, the way we search might not change completely, but these new systems will definitely change the way we build links, engage users, and generate leads through content marketing. Site Champion® increases traffic by helping shoppers find your products in search engines via increased keyword rankings using SEO automation. While link quantity is still important, content creators and SEO professionals are realizing that link quality is now more important than link quantity; as such, creating shareable content is the first step to earning valuable links and improving your off-page SEO.
Tumblr media
As her business, web traffic, and customer base grow, Sue may require some outside help to keep her SEO on track so she can still sell the best shoes on the block. While links are still essential and it's incredibly difficult to rank well without links from other websites, content and on-page SEO have become increasingly important. For businesses looking to raise their search rankings, this means a comprehensive social media strategy may be in order, in addition to all of the usual SEO tactics. Because of algorithm changes and the trend toward more local searches, it is no longer a massive expense to implement good SEO. Whether you're killing it with SEO or struggling to break the first page, a good SEO audit can help give your rankings a shot in the arm. According to web marketing experts, the impact of AI and voice search on SEO can be expected from 2019 onward. That's it. If you'd like to speak to us about an SEO campaign, or multilingual SEO marketing and keyword analysis, get in touch, or just start a live chat if we're around. Moreover, Google will continue to elevate the importance of usability and technical SEO aspects such as site security, page speed, mobile friendliness, and navigability. As a consultant, he has helped many different businesses (including Lonely Planet, Zillow, Tower Records, and literally countless medium and small businesses) with SEO and online-marketing advice.
Tumblr media
SEO's basic importance comes from the fact that most users display strong search dominance; that is, search is the main way people go places on the Internet. Reading blogs about SEO can also be very useful in finding out about the essential companies on the market offering comprehensive and genuine SEO services to the corporate sector. The initial SEO work depends mostly on the number of keywords targeted and the size of your website, while the ongoing link campaign depends more on the competitiveness of the keywords chosen. Whether you're an SEO newbie or a seasoned practitioner, I encourage you to read this in full to understand how you can get your content to the top of search results. Big-brand campaigns are far, far different from small-business SEO campaigns that have no links to start with, to give just one example. Some SEO companies have a strategy requiring clients to pay the major search engines (including Google and Yahoo) for monthly website maintenance. Experts are reporting that 2019 will be the year of voice search, and that the voice search algorithm may change and even somewhat supersede text search. A CEO blog is just one component of social media distribution, a significant SEO strategy according to SEO Consult: you should be disseminating links to fresh content on your site across appropriate social networking platforms. But undoubtedly, the acronym most people ask about is SEO, or search engine optimization. Between this and 2017's Best SEO Campaign prize from the UK Search Awards, it's clear the world is starting to recognize Elephate as one of the most reliable SEO and content marketing agencies in Europe.
To understand how SEO works to improve search rankings, we'll have to break it down a little. Whichever way you choose to categorize keywords, one of the most important steps in SEO is keyword research. Some examples of white hat SEO include optimizing META tags, putting keywords in URLs, submitting sites to directories, making sitemaps, obtaining links on related sites, and generating keyword-optimized content. An SEO executive optimizes websites to make them show up higher on search engines like Google and gain more visitors. Nevertheless, the new trend in SEO favors long-tail keywords and more conversational searches. SEO is a great marketing tool for websites promoting their businesses on the web. Still, the keyword side of SEO is becoming more and more difficult, with Google AdWords concealing volume data. Using advanced Google semantic search algorithms, we bridge the gap between old-school SEO and the new content marketing. Tug was appointed specifically to roll out a multilingual search engine marketing campaign, given its experience in international SEO and PPC.
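Of the white-hat basics listed above, a sitemap is the easiest to automate. A minimal sketch of generating one with Python's standard library follows; the URLs are placeholders, and real sitemaps often add optional tags like `lastmod`:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap: a <urlset> of <url><loc> entries."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for u in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs; a real script would walk the site's actual pages.
xml = build_sitemap(["https://example.com/", "https://example.com/about"])
print(xml)
```

The resulting file is typically saved as `sitemap.xml` at the site root and submitted through the search engines' webmaster tools.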
Tumblr media
The Technical Audit Checklist Made For Human Beings: in this post by Distilled, you will find a link to a Google Sheet that provides a technical SEO audit checklist and links to resources on how to complete each checkbox. Then each time the phrase SEO appears on your website, it's automatically changed into the link you specified. Keyword Research for SEO: The Definitive Guide: this comprehensive guide by Brian Dean teaches a number of strategies for finding keywords and determining intent for your target market. On-page and off-page SEO work together to improve your search engine rankings in complementary fashion; however, SEOs generally advise getting your on-page SEO ducks in a row before focusing too much on off-page SEO. In 2019, web-based businesses will embrace more voice-to-text technology to boost engagement and search activity. There are many SEO sites suggesting that they can provide a service to make your website LSI-friendly, or fulfill 'LSI requirements.' The art of web SEO lies in understanding how people search for things and knowing what type of results Google wants to (or will) display to its users. Social media SEO encourages your present customers to come back while helping you develop authority with potential ones. He is an expert in SEO, content marketing, and Pinterest marketing. An SEO expert plays a huge part in helping companies build their businesses and attract new clients through website traffic. As it turns out, there's more to on-page SEO than optimizing for keywords. Search engine optimization (organic SEO) describes the methods used to obtain a high placement (or ranking) on a search engine results page in the unpaid, algorithm-driven results of a given search engine.
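The automatic keyword-to-link substitution mentioned above, as offered by various WordPress plugins, boils down to a word-boundary replace. A naive sketch follows, with the caveat that a production version must also skip text already inside tags or existing links; the glossary URL is purely hypothetical:

```python
import re

def autolink(text, keyword, url):
    """Replace standalone occurrences of `keyword` with an HTML link.

    Naive version: it does not skip occurrences already inside a tag
    or an existing <a> element, which a real plugin must handle.
    """
    pattern = r"\b%s\b" % re.escape(keyword)
    return re.sub(pattern, '<a href="%s">%s</a>' % (url, keyword), text)

# Hypothetical glossary URL, purely for illustration.
print(autolink("Learn SEO basics.", "SEO", "/glossary/seo"))
# -> Learn <a href="/glossary/seo">SEO</a> basics.
```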
Also, videos have a lot of untapped potential: great for SEO and good for user engagement. The best SEO experts of 2019 will tell you that for mobile devices, we're seeing more format changes focused on providing a better experience in search results. Social SEO is especially helpful for online reputation management. It isn't just the way Google ranks optimized content, but the way it ranks poorly constructed or taboo content, that will push your ranking to where it ought to be in 2019. Ray Cheselka, SEO & AdWords Manager at SEO and design agency webFEAT Complete, predicts that sites with over a two-second load time will be penalized, and that search intent is going to keep growing in importance. SEO includes ordering a site's architecture and links to make pages within the website easier to find and navigate. In contrast, articles that no one is buzzing about have minimal effect on social SEO. They are generally knowledgeable in the areas of both SEO and content marketing. When it comes to reviews, customers act as an army of link builders and keyword writers, so your SEO structure is shaped without you having to lift a hand.
0 notes
Text
By @Beschizza:  My RSS feeds from a decade ago, a snapshot of gadget blogging when that was a thing
Tumblr media
I chanced upon an ancient backup of my RSS feed subscriptions, a cold hard stone of data from my time at Wired in the mid-2000s. The last-modified date on the file is December 2007. I wiped my feeds upon coming to Boing Boing thenabouts: a fresh start and a new perspective.
What I found, over 212 mostly-defunct sites, is a time capsule of web culture from a bygone age—albeit one tailored to the professional purpose of cranking out blog posts about consumer electronics a decade ago. It's not a picture of a wonderful time before all the horrors of Facebook and Twitter set in. This place is not a place of honor. No highly-esteemed deed is commemorated here. But perhaps some of you might like a quick tour, all the same.
The "Main" folder, which contains 30 feeds, was the stuff I actually wanted (or needed) to read. This set would morph over time. I reckon it's easy to spot 2007's passing obsessions from the enduring interests.
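Feed readers of this era exported subscriptions as OPML, an XML outline in which folders are plain `<outline>` elements wrapping `type="rss"` entries. A sketch of tallying feeds per folder from such a backup; the sample OPML below is illustrative, not the actual 2007 file:

```python
from xml.etree import ElementTree as ET

# Illustrative sample, not the real backup: folders are bare <outline>
# elements, and each subscribed feed is an <outline type="rss">.
OPML = """<opml version="1.0"><body>
  <outline text="Main">
    <outline text="Boing Boing" type="rss" xmlUrl="https://boingboing.net/feed" />
    <outline text="jwz" type="rss" xmlUrl="https://www.jwz.org/blog/feed/" />
  </outline>
  <outline text="Essentials">
    <outline text="The Register" type="rss" xmlUrl="https://www.theregister.com/headlines.atom" />
  </outline>
</body></opml>"""

def feeds_per_folder(opml_text):
    """Count the type='rss' entries nested under each top-level folder."""
    body = ET.fromstring(opml_text).find("body")
    return {
        folder.get("text"): len(folder.findall('.//outline[@type="rss"]'))
        for folder in body
    }

print(feeds_per_folder(OPML))  # {'Main': 2, 'Essentials': 1}
```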
↬ Arts and Letters Daily: a minimalist blog of links about smartypants subjects, a Drudge for those days when I sensed a third digit dimly glowing in my IQ. But for the death of founder Denis Dutton, it's exactly the same as it was in 2007! New items daily, but the RSS feed's dead.
↬ Boing Boing. Still around, I hear.
↬ Brass Goggles. A dead feed for a defunct steampunk blog (the last post was in 2013) though the forums seem well-stocked with new postings.
↬ The Consumerist. Dead feed, dead site. Founded in 2005 by Joel Johnson at Gawker, it was sold to Consumer Reports a few years later, lost its edge there, and was finally shuttered (or summarily executed) just a few weeks ago.
↬ Bibliodyssey. Quiescent. Updated until 2015 with wonderful public-domain book art scans and commentary. A twitter account and tumblr rolled on until just last year. There is a book to remember it by should the bits rot.
↬ jwz. Jamie Zawinski's startling and often hilariously bleak reflections on culture, the internet and working at Netscape during the dotcom boom. This was probably the first blog that led me to visit twice, to see if there was more. And there still is, almost daily.
↬ Proceedings of the Athanasius Kircher Society. Curios and weirdness emerging from the dust and foul fog of old books, forbidden history and the more speculative reaches of science. So dead the domain is squatted. Creator Josh Foer moved on to Atlas Obscura.
↬ The Tweney Review. Personal blog of my last supervisor at Wired, Dylan Tweney, now a communications executive. It's still going strong!
↬ Strange Maps. Dead feed, dead site, though it's still going as a category at Big Think. Similar projects proliferate now on social media; this was the wonderful original. There was a book.
↬ BLDGBLOG. Architecture blog, posting since 2004 with recent if rarer updates. A fine example of tasteful web brutalism, but I'm no longer a big fan of cement boxes and minimalism with a price tag.
↬ Dethroner. A men's self-care and fashion blog, founded by Joel Johnson, of the tweedy kind that became wildly and effortlessly successful not long after he gave up on it.
↬ MocoLoco. This long-running design blog morphed visually into a magazine in 2015. I have no idea why I liked it then, but indie photoblogs' golden age ended long ago and it's good to see some are thriving.
↬ SciFi Scanner. Long-dead AMC channel blog, very likely the work of one or two editors and likely lost to tidal corporate forces rather than any specific failure or event.
↬ Cult of Mac. Apple news site from another Wired News colleague of mine, Leander Kahney, and surely one of the longest-running at this point. Charlie Sorrel, who I hired at Wired to help me write the Gadget blog, still pens articles there.
↬ Ectoplasmosis. After Wired canned its bizarre, brilliant and unacceptably weird Table of Malcontents blog, its editor John Brownlee (who later joined Joel and me in editing Boing Boing Gadgets) and contributor Eliza Gauger founded Ectoplasmosis: the same thing but with no hysterical calls from Conde Nast wondering what the fuck is going on. It was glorious, too: a high point of baroque indie blogging in the age before Facebook (and I made the original site design). Both editors later moved on to other projects (Magenta, Problem Glyphs); Gauger maintains the site's archives at tumblr. It was last updated in 2014.
↬ Penny Arcade. Then a webcomic; now a webcomic and a media and events empire.
↬ Paul Boutin. While working at Wired News, I'd heard a rumor that he was my supervisor. But I never spoke to him and only ever received a couple of odd emails, so I just got on with the job until Tweney was hired. His site and its feed are long-dead.
↬ Yanko Design. Classic blockquote chum for gadget bloggers.
↬ City Home News. An offbeat Pittsburgh news blog, still online but lying fallow since 2009.
↬ Watchismo. Once a key site for wristwatch fans, Watchismo was folded into watches.com a few years ago. A couple of things were posted to the feed in 2017, but its time has obviously passed.
↬ Gizmodo. Much has changed, but it's still one of the best tech blogs.
↬ Engadget. Much has changed, but it's still one of the best tech blogs.
↬ Boing Boing Gadgets. Site's dead, though the feed is technically live as it redirects to our "gadgets" tag. Thousands of URLs there succumbed to bit-rot at some point, but we have plans to merge its database into Boing Boing's and revive them.
↬ Gear Factor. This was the gadget review column at Wired Magazine, separate from the gadget blog I edited because of the longtime corporate divorce between Wired's print and online divisions. This separation had just been resolved at the time I began working there, and the two "sides" -- literally facing offices in the same building -- were slowly being integrated. The feed's dead, but with an obvious successor, Gear.
↬ The Secret Diary of Steve Jobs. Required reading at the time, and very much a thing of its time. Now vaguely repulsive.
↬ io9. This brilliant sci-fi and culture blog deserved more than to end up a tag at Gizmodo.
↬ Science Daily: bland but exhaustive torrent of research news, still cranking along.
The "Essentials" Folder was material I wanted to stay on top of, but with work clearly in mind: the background material for systematically belching out content at a particular point in 2007.
↬ Still alive are The Register, Slashdot, Ars Technica, UMPC Portal (the tiny laptop beat!), PC Watch, Techblog, TechCrunch, UberGizmo, Coolest Gadgets, EFF Breaking News, Retro Thing, CNET Reviews, New Scientist, CNET Crave, and MAKE Magazine.
↬ Dead or quiescent: GigaOm (at least for news), Digg/Apple, Akihabara News, Tokyomango, Inside Comcast, Linux Devices, and Uneasy Silence.
Of the 23 feeds in the "press releases" folder, 17 are dead. Most of the RSS no-shows are for companies like AMD and Intel, however, who surely still offer feeds at new addresses. Feeds for Palm, Nokia and pre-Dell Alienware are genuine dodos. These were interesting enough companies, 10 years ago.
PR Newswire functions as a veneering service so anyone can pretend to have a big PR department, but it is (was?) also legitimately used by the big players as a platform so I monitored the feeds there. They're still populated, but duplicate one another, and it's all complete garbage now. (It was mostly garbage then.)
My "Gadgets and Tech" folder contained the army of late-2000s blogs capitalizing on the success of Gizmodo, Boing Boing, TechCrunch, et al. Back in the day, these were mostly one (or two) young white men furiously extruding commentary on (or snarky rewrites of) press releases, with lots of duplication and an inchoate but seriously-honored unspoken language of mutual respect and first-mover credit. Those sites that survived oftentimes moved to listicles and such: notionally superior and more original content and certainly more sharable on Facebook, but unreadably boring. However, a few old-timey gadget bloggers are still cranking 'em out in web 1.5 style. And a few were so specialized they actually had readers who loved them.
Still alive: DailyTech, technabob, CdrInfo.com, EverythingUSB, Extremetech, GearFuse, Gizmag, Gizmodiva, Hacked Gadgets, How to Spot A Psychopath/Dan's Data, MobileBurn, NewLaunches, OhGizmo!, ShinyShiny, Stuff.tv, TechDigest, TechDirt, Boy Genius Report, The Red Ferret Journal, Trusted Reviews, Xataka, DigiTimes, MedGadget, Geekologie, Tom's Hardware, Trendhunter, Japan Today, Digital Trends, All About Symbian (Yes, Symbian!), textually, cellular-news, TreeHugger, dezeen.
Dead: jkkmobile.com, Business Week Online, About PC (why), Afrigadget (unique blog about inventors in Africa, still active on Facebook), DefenseTech, FosFor (died 2013), Gearlog, Mobile-Review.com (but apparently reborn as a Russian-language tech blog!), Robot's Dreams, The Gadgets Weblog, Wireless Watch Japan, Accelerating Future, Techopolis, Mobile Magazine, eHome Upgrade, camcorderinfo.com, Digital Home Thoughts (farewell), WiFi Network News (farewell), Salon: Machinist, Near Future Lab, BotJunkie (twitter), and CNN Gizmos.
I followed 18 categories at Free Patents Online, and the site's still alive, though the RSS feeds haven't had any new items since 2016.
In the "news" folder, my picks were fairly standard stuff: BBC, CNET, digg/technology, PC World, Reuters, International Herald Tribune, and a bunch of Yahoo News feeds. The Digg feed's dead; they died and were reborn.
The "Wired" feed folder comprised all the Wired News blogs of the mid-2000s. All are dead. 27B Stroke 6, Autopia, Danger Room, Epicenter, Gadget Lab, Game|Life, Geekdad, Listening Post, Monkey Bites, Table of Malcontents, Underwire, Wired Science.
These were each basically one writer or two and were generally folded into the established magazine-side arrangements as the Age of Everyone Emulating Gawker came to an end. The feed for former EIC Chris Anderson's personal blog survives, but hasn't been updated since his era. Still going strong is Bruce Sterling's Beyond the Beyond, albeit rigged as a CMS tag rather than a bona fide site of its own.
Still alive from my 2007 "Science" folder are Bad Astronomy (Phil Plait), Bad Science (Ben Goldacre), Pharyngula (PZ Myers), New Urban Legends, NASA Breaking News, The Panda's Thumb, and James Randi's blog.
Finally, there's a dedicated "iPhone" folder. This was not just the hottest toy of 2007. It was all that was holy in consumer electronics for half a decade. Gadget blogging never really had a golden age, but the iPhone ended any pretense that there were numerous horses in a race of equal potential. Apple won.
Still alive are 9 to 5 Mac, MacRumors, MacSlash, AppleInsider and Daring Fireball. Dead are TUAW, iPhoneCentral, and the iPhone Dev Wiki.
Of all the sites listed here, I couldn't now be paid but to read a few. So long, 2007.
https://boingboing.net/2017/12/29/my-rss-feeds-from-a-decade-ago.html
8 notes
netmaddy-blog · 7 years
Text
How to Exhume Your Buried Site and Get Recovery From Google
New Post has been published on https://netmaddy.com/how-to-exhume-your-buried-site-and-get-recovery-from-google/
Is There Really a Recovery From Google Slaps?
If you head out on Google and search for ‘recovery from Google’ or ‘recover from Google Penguin,’ or Panda or algorithm updates, you’re going to find a ton of people out there touting that they’ll get you back on track for X amount of dollars.
You’re told you have to contact site owners and have links removed coming into your site. You’re told you have to do this – do that – do something else, create new links this way, listen to “gurus” tell you this is the “new way” to build links post-Panda and post-Penguin.
And for a limited time, you can get our services for $997! What?
Seriously, why would you want to pay someone for something you can do – and should do – yourself?
My e-commerce site was not only slammed by Google's algorithm updates in September 2012, it was buried – cremated may sound even better. How, you might ask? To this day, I'm not really sure whether it was the Penguin or Panda update, but one of those bad-ass animals took me down – HARD.
My site went from about 150-175 unique visitors a day to TWO – overnight. Like many, many other site owners out there, I was devastated! In less than 24 hours, my site got buried back on page 15+ on Google.
And if that wasn’t enough, Google started de-indexing my pages – one by one until I was left with nothing but the handful of solid backlinks I’d gotten over the last few years of being in business.
I didn’t know what to do. I’d actually had the site and business up for sale for a couple of weeks before the take-down but after Google hammered me, I couldn’t have given the site away.
So I left it to sit for about another month or so while Google systematically de-indexed about 85% of the site’s pages. Slow but sure death and cremation of a site I’d worked hours and hours and hours to build and rebuild. Gone.
Now, I have to say, this wasn’t one of those “niche sites” just for making AdSense or Amazon cash. This was a full-fledged storefront for a product that I design and sell myself.
And I’d taken it from nothing to being on the front page of Google between #1 and #8 position for most of the primary keywords I wanted. I was just gaining an edge on some of my top competitors when – BAM – Google decided my site just wasn’t good enough.
I’d worked on this site for the better part of four years and had registered a brand new NON-exact match domain name about six months before the slam. So I was rather livid, to say the least. I will say that hearing some mega-major corporations being slammed as well did make me feel a little better. At least I knew I wasn’t alone, for whatever that was worth.
So what did I do get back into Google’s good graces? I didn’t file a reconsideration petition. I didn’t follow the “advice” from the “gurus” and buy a new domain name or contact the site owners from where my allegedly “spammy links” were coming from. In fact, when I did a backlink analysis, there really wasn’t much there to be concerned with. So that left on-page issues like overusing certain keywords and/or lacking significant content.
Some “gurus” said my site was worthless now so I may as well ‘put my big girl panties on’ and accept it. Quit. Do something else. You’re done. You did something wrong and now you can’t undo it.
I am so glad I didn’t listen to any of them!!
I knew I had a valuable website with something to say, a great product to offer and content that just needed some updating to be more valuable. So I decided to start on my path to recovery, exhume the ashes of my now more-than-dead site and get to work.
Step One – Remove the Site from Google Analytics. That’s right. They de-indexed the site, so why bother analyzing it anymore? They took it from me, I took it from them.
Step Two – Take It Down. Yup. I removed every last file from the server and replaced it all with only two files – the index and the custom 404. Regardless of where people landed, they got the same message – and this is crucial in retaining visitors even during downtime.
I did not use any “under construction” images or anything like that. I did a video – me on camera – explaining that I had taken the site down temporarily while doing a complete renovation, to please bookmark the site and stop back around December 5.
Why did I give them a date? Because I didn’t want to have to do another video, for one, and it gave me a deadline to work towards so I wouldn’t get sidetracked doing anything else. The same video announcement went on both the temporary home page and on the custom 404 page.
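On a typical Apache host, by the way, the server does most of this for you: with the files gone, every old URL falls through to the 404 handler, and one directive in `.htaccess` points that handler at the custom page. A minimal sketch (the filename here is my assumption, not the author’s actual setup):

```apacheconf
# Send any request for a removed page to the custom 404 page,
# which carries the same "back on December 5" video as the index.
ErrorDocument 404 /404.html
```

That way visitors arriving from old bookmarks and search results see the announcement instead of a bare server error.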
Step Three – Get Educated. This was something I learned while wandering around YouTube in a daze one day wondering what to do with my site. Sometimes a good redesign with some updated content can do the trick. But you have to know what is wrong and how to fix it. And that takes some research.
My site was built as a static HTML/CSS site, and now that I know more about WordPress, I know a hand-coded HTML site leaves a HUGE margin for error if you’re not careful. I now believe this was one of the main negatives for my site. The URL structure was not right for the site, and the way I was copy/pasting pages to use as templates for other pages encouraged the dreaded “duplicate content” within the site.
The duplicate content actually came in the form of repeated meta tag keywords and meta descriptions showing up on more than one page. Google does not like that – at all. And they’ll let you know by starting to de-index those alleged ‘spammy’ pages.
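That kind of duplication is easy to catch mechanically before Google does. As a sketch (my own, not a tool mentioned here), a short script can group static HTML pages by their meta description and flag any description that shows up on more than one page:

```python
import re
from collections import defaultdict
from pathlib import Path

def find_duplicate_descriptions(html_dir):
    """Map each meta description to the pages that use it; return duplicates."""
    seen = defaultdict(list)
    meta = re.compile(
        r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
        re.IGNORECASE)
    for page in sorted(Path(html_dir).glob("*.html")):
        match = meta.search(page.read_text(errors="ignore"))
        if match:
            seen[match.group(1).strip()].append(page.name)
    # Keep only descriptions shared by two or more pages
    return {desc: pages for desc, pages in seen.items() if len(pages) > 1}
```

Anything this returns is a candidate for a rewrite, and the same idea works for title tags and meta keywords.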
So I knew I had to rebuild the site using WordPress. And that meant learning a lot more about WordPress than just setting up a content blog. This was serious business here. My old HTML site had the lightbox effect for the product images and a custom order form. I had to replicate that in the new site.
Step Four – Go to YouTube. And watch a lot of instructional videos about WordPress. On YouTube I found out how to install the lightbox plug-in and an includes plug-in for WordPress, and how to structure the URLs for a solid site that Google would love.
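For anyone retracing the URL-structure part: in WordPress it lives under Settings → Permalinks, and the usual “clean URL” choice is the post-name structure, entered as a custom permalink tag:

```
/%postname%/
```

WordPress then writes the matching mod_rewrite rules into `.htaccess` itself, so every Page and Post gets a short, readable address.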
Step Five – Dig In and Start to Work! I’d already started to get familiar with the Atahualpa theme since using it to rebuild another site, so I figured I’d start with that theme and go from there. Again, because I’m a self-admitted YouTube junkie, I go there first when I need help with stuff. So I spent a lot of time with two browser windows open – one with a YouTube instructional video, the other with the WordPress dashboard open following along.
I’ve also learned a lot about Photoshop via YouTube, so that helped me design a new header image for the site.
Once the framework was done, i.e., color scheme, layout, CSS adds, etc., it was time to start copy/pasting over 200 pages of HTML into Pages and Posts. I love using shortcodes and Include files so I went back to YouTube and found the video I needed to learn how to do that in WordPress. A piece of cake.
Step Six – Time for the New Sitemap! After about 60 hours, yes, an entire week of “overtime,” the site was ready for its new sitemap and re-add to Google Analytics.
Step Seven – Time, Patience, Education and WORK Paid Off! I added the site back to Google Analytics and submitted a brand new 267-page sitemap through Google Webmaster Tools on December 1. On December 2 I checked, and they had already indexed 179 pages! And that was after they’d de-indexed all but 23 of 260 pages a couple of months earlier!
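For reference, the sitemap Google reads is plain XML in the sitemaps.org format, one `<url>` entry per page. A minimal sketch with placeholder URLs and dates (not the author’s real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2012-12-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products/widget/</loc>
  </url>
</urlset>
```

Most WordPress sitemap plug-ins generate and update a file like this automatically as you publish, so a 267-page sitemap doesn’t have to be written by hand.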
So what did I learn from this?
First, when it needs to be done right – Do It Yourself! There are just some things that can’t and shouldn’t be outsourced to anyone else. And when it comes to your business website, this is something that needs to be taken on by you alone.
Why? You know what you want, and it’s beneficial to actually get the education and learn new things along the way, because they’ll be easier to fix in the future if you have to edit something.
Also, as I moved along through the rebuilding, I recognized several areas where what I had created the first time probably was seen by the Google ‘bots as spamming and duplicate content. If I’d outsourced the job to someone else to “copy/paste” their way through rebuilding, they may not have recognized it.
The ‘Gurus’ Aren’t Always Right! I’d been told by so many people over the last couple of months that my site was dead in the water. Forget it. Start over with something new. Let the domain name expire because it’s worthless now.
And a lot of people probably would have. A lot of people have, especially if they were only small niche sites set up only for the purpose of making Google AdSense or Amazon Associate cash.
I also think there are a lot of people who may be afraid to start over because they may not have invested the ‘blood, sweat & tears’ into their businesses to begin with. They may feel they have nothing to lose so why bother, right?
I could have just ditched this business too, but I was like, I’m gonna beat you, Google! I am not going down without trying! LOL
And then there are the “stereotypical” online marketers who are pretty much just too lazy to put in the effort it will take to exhume their sites from the Google cemetery and they’re perfectly content to start over. Whatever works for them!
But I had a feeling that if you can show Google you’re 100% serious about your business by doing what it takes to survive out there, they’ll work with you. And – I never filed any of those stupid “Reconsideration” things either. Not necessary. I would have felt like I was begging for mercy or something.
I didn’t touch any of the already incoming links to the site either. I left those as they are, wherever they’re coming from. I don’t think I got hit over that, to begin with. I still think it was on-page “over-optimization” – in Google’s algorithms anyway – that nailed my site. And I think reconfiguring and rebuilding all of my ‘old’ content into some fresh stuff made the difference.
Going Forward from Recovery – Content, Content, Content. There are two people I have to give a shout-out to from this entire process: for their encouragement, for the way they keep their websites at the top, and for their 100% focus on building quality content.
Lisa Irby of 2createawebsite.com. Lisa absolutely refused to let me give up on my site. So I’ve been keeping her posted on my progress.
David Boozer of Online Mentor Now. I only just started following David recently after finding him on YouTube and immediately signed up for his email list and free mentoring.
Both David and Lisa will tell you what works – Content. Regularly building high-quality content is the only way to go today and going forward. They’ll tell you that the ‘gurus’ with their latest allegedly push-button, no-work schemes that promise you thousands of dollars overnight – those things don’t work. I know they don’t. I tried them and I got “punished” for using them – harshly.
So with the insight and education of these two awesome online marketers, I put everything else aside and dug my website out of the Google Cemetery. Will it rank high again on Google? I’m expecting it will, now that I have it all in place the way it should be.
So Can You Make a Recovery from Google Slaps? Yes, but you have to WANT it bad enough and then put in the work to make it happen. If you have a five- or ten-page niche site set up only for ads and affiliate links, maybe not.
But if you redesign it – in WordPress – stay away from the shortcuts, do your due diligence and add quality content, you should be able to pull yourself back up.