
- …so clearly disenfranchises thousands of sites without warning, according to nothing other than Google’s subjective opinion as to what they find “spammy” (a thinly veiled euphemism devised to punish sites that, we have to consider, must include sites that simply do not fit into Google’s revenue model). But also:
- there should be no danger (or so I thought, see below) in simply telling us what exactly is low quality or high quality in their eyes. The irony is that most webmasters do want to have a high-quality site. And we have no choice but to rank highly in Google.”
- Good usage metrics showing User Satisfaction with your content / presentation
- Positive Social Shares / Mentions
- Positive “Reviews” on an Independent Google Verifiable Source
- Authoritative Outlinks in Your Content / Citing Your Sources
- A .com, .net, or .org Domain as a Quality/Trust Factor
- Address and/or Contact Details Clearly Listed on Each Page
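The last item on that list is one you can check mechanically across a site. A minimal sketch in Python, looking for either a link to a contact page or an HTML `<address>` block on every page (the URLs, page HTML, and the two patterns here are invented for illustration; a real audit would run over crawled files and use a sturdier set of patterns):

```python
# Minimal sketch: flag pages that neither link to a contact page nor
# carry an <address> block. Page HTML is inlined here for illustration.
import re

# Assumed patterns for "contact details are present" -- extend as needed.
CONTACT_PATTERNS = [
    r'href="[^"]*contact',   # a link whose URL mentions "contact"
    r'<address\b',           # an HTML <address> element
]

def pages_missing_contact(pages):
    """Return the URLs of pages with no recognisable contact details."""
    missing = []
    for url, html in pages.items():
        if not any(re.search(p, html, re.I) for p in CONTACT_PATTERNS):
            missing.append(url)
    return missing

# Invented example pages: two pass the check, one fails.
pages = {
    "/": '<html><body><a href="/contact">Contact us</a></body></html>',
    "/about": '<html><body><address>1 High St, Glasgow</address></body></html>',
    "/orphan": '<html><body><p>No way to reach us here.</p></body></html>',
}
print(pages_missing_contact(pages))   # ['/orphan']
```

This is only a screening aid, not a ranking guarantee; it catches pages where the signal is plainly absent rather than judging how prominently the details are displayed.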
- Bad usage metrics showing possible User Dissatisfaction with your content / presentation (including speed, UI, whitespace (or lack thereof), too many options, bad/thin/poorly written content, not answering the visitor’s problem or question quickly or well enough, etc.)
- EDIT: Duplicate or Aggregate, “Tag”, or “Category” Content
- Duplicate Titles and Meta Descriptions
- Aggressive “search phrase” keyword use onsite, INCLUDING: URL string, page content, AND HTML code like TITLE or ALT attributes
- EDIT: Keyword-Stuffed Internal Links on Blog Pages etc.
- Garbage text, single-sentence pages, spun text, poor construction/spelling/grammar, bad search results pages, on-page errors, etc.
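Several of the negative items above, duplicate titles, duplicate meta descriptions, and thin pages in particular, can be screened for before Google finds them. A minimal sketch, assuming a small dict of crawled pages (the URLs, inlined HTML, crude tag-stripping regexes, and the 50-word “thin” threshold are all invented for illustration):

```python
# Minimal sketch: flag duplicate <title> tags, duplicate meta descriptions,
# and thin pages across a small set of HTML documents. Page HTML is inlined
# here for illustration; a real audit would read crawled files instead.
import re
from collections import defaultdict

def extract(html):
    """Pull the title, meta description, and visible word count from raw HTML."""
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    desc = re.search(r'<meta\s+name="description"\s+content="(.*?)"', html, re.I)
    text = re.sub(r"<[^>]+>", " ", html)   # crude tag stripping
    return (title.group(1).strip() if title else "",
            desc.group(1).strip() if desc else "",
            len(text.split()))

def audit(pages, thin_threshold=50):
    """Group URLs by shared title and shared description; flag thin pages."""
    titles, descs, thin = defaultdict(list), defaultdict(list), []
    for url, html in pages.items():
        title, desc, words = extract(html)
        titles[title].append(url)
        descs[desc].append(url)
        if words < thin_threshold:
            thin.append(url)
    dup_titles = {t: u for t, u in titles.items() if len(u) > 1}
    dup_descs = {d: u for d, u in descs.items() if len(u) > 1}
    return dup_titles, dup_descs, thin

# Invented example: two pages sharing a title and description, both thin.
pages = {
    "/widgets/red": '<html><head><title>Widgets</title>'
                    '<meta name="description" content="Buy widgets."></head>'
                    '<body><p>Red widget.</p></body></html>',
    "/widgets/blue": '<html><head><title>Widgets</title>'
                     '<meta name="description" content="Buy widgets."></head>'
                     '<body><p>Blue widget.</p></body></html>',
}
dup_titles, dup_descs, thin = audit(pages)
print(dup_titles)   # {'Widgets': ['/widgets/red', '/widgets/blue']}
```

Fixing what such a check surfaces, unique titles and descriptions per page, and merging or expanding thin pages, addresses the duplicate-content and thin-content items directly.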