In SEO, exact match keywords are search terms that precisely match the phrase a page is optimised for: no pluralisation, no added punctuation. The term must appear exactly as written in the site content, in the URL and in the meta description.
Partial or broad match keywords are search terms that carry the same sense as an exact match but are worded differently. They may contain spelling mistakes. They may pluralise a singular. They may even combine the same words in a different order – or use words that are almost, but not quite, the same as the original, while meaning essentially the same thing.
For a long time, exact match keywords boosted your search rankings. That is to say, when someone searched for precisely the same term as the one you had written into your content and your code, the search engines gave your page extra weight in the returned results.
The latest development in SEO, recently announced by Google, is that exact and broad matches will no longer carry different weighting in search results. This is another step on Google’s path to becoming almost human.
The argument goes like this: people don’t spell properly, they don’t have perfect grammar, and two people will describe the same thing in two different – though clearly conceptually linked – ways. So why shouldn’t search engines recognise this by combining related searches into a single results list?
The SEO ramifications of this are hard to fathom. On the one hand, loosening the validity of search terms in this way means someone can optimise for “bright green running shorts”, still capture anyone who types that exact phrase, and also catch people who type “bright green running short” or even “birght green running short” (sic). On the other hand, if you have two pages optimised for different exact matches that weren’t previously in direct competition for the same search intent – now they are.
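Google’s actual matching algorithm is not public, but the idea of a misspelled or pluralised query still aligning with an optimised phrase can be sketched with ordinary fuzzy string matching. The following is a hypothetical illustration using Python’s standard-library difflib, not a representation of how Google scores anything:

```python
# Hypothetical sketch of broad matching: a similarity ratio lets a
# misspelled or singular query still line up with the optimised phrase.
from difflib import SequenceMatcher

def similarity(query: str, target: str) -> float:
    """Return a 0..1 ratio of how closely two phrases match."""
    return SequenceMatcher(None, query.lower(), target.lower()).ratio()

target = "bright green running shorts"
for query in ("bright green running shorts",  # exact match
              "bright green running short",   # pluralisation dropped
              "birght green running short"):  # misspelt
    print(f"{query!r}: {similarity(query, target):.2f}")
```

All three queries score close to 1.0 against the target, which is the intuition behind treating them as the same search intent; a real engine would of course weigh meaning, not just character overlap.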
So once again the goalposts have moved, and SEO companies are going to have to scramble to work out how to respond. This time, the focus is on the intent of the searcher. Google quite properly believes that what someone means, when he or she types in a search term, is more important than what he or she actually says – so SEO must now target meaning as well as actual, physical collections of words and punctuation marks.
Ultimately there may be a tension at work here. Giving weight to intent makes a search engine act more like a human being, but it could conceivably dilute the quality of Google’s returned results, or turn rankings into a lottery for page owners.
It all depends on how you look at it. If exact match and broad search terms are conflated in the way described above (bear in mind that no one outside Google knows the actual architecture of its algorithm, and can only guess at precisely what it is doing), then it’s entirely possible that a near-random set of websites will be returned every time you make a search – because there are so many out there fighting for variations on the same sorts of phrases.
If this happens, it raises a further question – what is Google actually for? The answer may, after all, be that it is there for commercial purposes. In which case, expect the algorithm to change.
About the author:
Shannen is a web journalist and online pundit. She contributes to several blogs and symposia.