How many times have you searched for a keyword, come up with hundreds of responses and clicked on one from the first page (probably), only to discover that you were being taken to a site that doesn’t really have the information you’re looking for? Instead, it’s been architected to do well in search results, and it’s got plenty of ads served up that aren’t remotely interesting.
Danger, danger, Will Robinson! Don’t be fooled! The robots aren’t that smart!
The robots we’re worried about all the time just aren’t that smart. In fact, they aren’t even robots. We call them bots because it’s nicer than saying algorithms.
Here are 9 things that robots can’t really do:
- Robots don’t read your page. They don’t know how to read. They can make a log of keywords, they can count the number of words on the page, and they can identify whether keywords appear in various headings, in image alt tags, and in bold. But they can’t read. Don’t imagine they have any idea what you are really saying.
- Robots can’t tell good writing from bad. You can test your copy with the Flesch-Kincaid readability index (another algorithm), which measures its difficulty, but that has flaws too. If you want to beat that system, prepare to write for 6th graders. That’s not straightforward writing; it’s bad writing, with a limited vocabulary and simplistic sentence construction.
- Robots are victimized by people who know how to break the rules. But not in a way that the bots can spot. That’s why we see so many horrid little pages serving up snippets of information and mostly displaying ads. The volume of traffic to these pages is generated through misdirection and deception. Google doesn’t care, because those pages still manage to follow the rules. But it’s enough to make you cry.
- Robots don’t read headlines. At least not the way you think. They scan the words enclosed in H1 tags, note whether your keyword is in that snippet of copy, and note how early in the copy it appears. They like it when the keyword is early in the headline. So forget about creative positioning lines. Get used to self-explanatory encyclopedia headings. If you’re writing an encyclopedia entry, it works like a charm (think Wikipedia). If you’re writing something else, it doesn’t work well at all.
- Robots don’t appreciate engaging, conversational intro copy. Just when you’re trying to warm up your audience and make a point, you get penalized. Because the bot thinks that the first words following the headline should also feature the keywords. But are we really writing for the reader? Is it too late to introduce subtlety, to provide historical context, or to use a sentence structure more complicated than Go, Dog. Go!? (If dog is the keyword, it’s already in second place in that sentence.)
- Robots can’t see images. If you’re making your point with an image, the robot doesn’t appreciate it. Sure, you can have an alt tag that repeats your keyword again, but as the old saying goes: a picture is worth a thousand words. And if you write a 1,000-word alt tag, you can bet you’ll be penalized.
- Robots don’t appreciate complexity. One topic for every page. Websites that are devoted to a single topic rank higher than websites that feature several products. Take a marketing agency like HiveMind Studios: we would rank better for web design if 1) our name had the words “web design” in it, 2) our URL had the words “web design” in it, and 3) the home page said that we provide website design services. Once we deviate from that and say that we provide marketing services that include website design, SEO, email marketing, inbound marketing, social media marketing, content marketing, copywriting, graphic design, etc., we’re screwed. (Although, to be fair, Google is trying to get better in this area, allowing synonyms and related words.)
- Robots don’t understand business. If you have a local business, a robot values your address and “place,” assuming that you are a retail establishment and you want foot traffic. That’s great for the restaurants, dentists, doctors, plumbers, and lawyers. But it doesn’t help small businesses that serve a local area. We work with a janitorial services company that serves the Bay Area and has offices in Campbell, CA. They aren’t looking for people to come to their offices. But their ranking in Campbell is much higher than their ranking for the other 10 cities they serve that are within a 50-mile radius of their office.
- Robots don’t understand information overload. They encourage everyone to blog and become a content leader. This has companies across the country regurgitating the same content ideas over and over again. To take an example from my own niche: how many blog posts are out there talking about email subject lines, website redesigns, and social media cheat sheets? It goes on forever. Why? Because the robot can spot duplicate copy, but not regurgitated copy. And it currently favors blog post copy over website copy.
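If you want to see just how unsmart this tallying is, here’s a toy sketch of it. The field names and the syllable heuristic are my own illustration, not any real search engine’s scoring model; the only “real” piece is the published Flesch-Kincaid grade-level formula.

```python
import re

def flesch_kincaid_grade(text):
    """Approximate Flesch-Kincaid grade level.

    The formula (0.39 * words/sentence + 11.8 * syllables/word - 15.59)
    is the published one; the syllable count below is a naive
    vowel-group heuristic, so treat the result as a rough estimate.
    """
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)

    def syllables(word):
        groups = re.findall(r"[aeiouy]+", word.lower())
        count = len(groups)
        if word.lower().endswith("e") and count > 1:
            count -= 1  # drop a (probably) silent final 'e'
        return max(1, count)

    total_syllables = sum(syllables(w) for w in words)
    n = max(1, len(words))
    return 0.39 * (n / sentences) + 11.8 * (total_syllables / n) - 15.59

def bean_counter(page_text, h1, keyword):
    """Everything the 'robot' knows about your page: counts and
    positions, not meaning. Illustrative only."""
    kw = keyword.lower()
    words = re.findall(r"[a-z']+", page_text.lower())
    h1_words = re.findall(r"[a-z']+", h1.lower())
    return {
        "word_count": len(words),
        "keyword_count": words.count(kw),
        "keyword_in_h1": kw in h1_words,
        "keyword_position_in_h1": h1_words.index(kw) if kw in h1_words else None,
        "grade_level": round(flesch_kincaid_grade(page_text), 1),
    }
```

Run `bean_counter("Go dog go. The dog ran.", "Go Dog Go", "dog")` and you get back a handful of numbers and booleans. That dictionary is, roughly speaking, the robot’s entire “understanding” of the page.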
My end comment on this rant (and it definitely qualifies as a rant) is this: don’t go crazy trying to please the robots. Certainly, don’t try to outsmart or fool them. Just stop thinking of them as information processing systems and start thinking of them as elaborate bean counters, keeping track of minutiae and making decisions about the value of web pages. They are the best thing we have right now, but they’re simply not that smart.
Also published on Medium.