Sunday 2 July 2017

Option Trading For Dummies Free Download


SEO Tutorial for Beginners in 2016

What is SEO? Search Engine Optimisation in 2016 is a technical, analytical and creative process to improve the visibility of a website in search engines. Its primary function is to drive more visits to a site that convert into sales.

The free SEO tips you will read on this page will help you create a search-engine-friendly website yourself. I have more than 15 years of experience getting websites ranked in Google. If you need optimisation services, see my SEO services, or small business SEO services. This article is a beginner's guide to effective white hat SEO. I deliberately avoid techniques that might be 'grey hat', as what is grey today is often black hat tomorrow, as far as Google is concerned. No one-page guide can explore this complex topic in full. What you read here are the answers to the questions I had when I was starting out in this field.

Google insists that webmasters adhere to its 'rules' and aims to reward sites with high-quality content and remarkable 'white hat' web marketing techniques with high rankings. Conversely, it also needs to penalise websites that manage to rank in Google by breaking these rules. These rules are not 'laws', but 'guidelines' for ranking in Google, laid down by Google. You should note, however, that some methods of ranking in Google are, in fact, illegal. Hacking, for example, is illegal in the UK and the US. You can choose to follow and abide by these rules, bend them, or ignore them, all with different levels of success (and levels of retribution from Google's web spam team). White hats do it by the 'rules'; black hats ignore them. What you read in this article is perfectly within the laws and also within the guidelines, and will help you increase the traffic to your website through organic, or natural, search engine results pages (SERPs).
There are many definitions of SEO (spelled search engine optimisation in the UK, Australia and New Zealand, or search engine optimization in the United States and Canada), but organic SEO in 2016 is mostly about getting free traffic from Google, the most popular search engine in the world (and almost the only game in town in the UK).

The art of web SEO lies in understanding how people search for things and understanding what type of results Google wants to (or will) display to its users. It's about putting a lot of things together to look for opportunities. A good optimiser has an understanding of how search engines like Google generate their natural SERPs to satisfy users' navigational, informational and transactional keyword queries. A good search engine marketer has a good understanding of the short-term and long-term risks involved in optimising rankings in search engines, and an understanding of the type of content and sites Google (especially) WANTS to return in its natural SERPs.

The aim of any campaign is more visibility in search engines, and this would be a simple process if it were not for the many pitfalls. There are rules to be followed or ignored, risks to take, gains to make, and battles to be won or lost. A Mountain View spokesman once called the search engine 'kingmakers', and that is no lie. Ranking high in Google is VERY VALUABLE: it is effectively 'free advertising' on the best advertising space in the world. Traffic from Google's natural listings is still the most valuable organic traffic to a website in the world, and it can make or break an online business.

The state of play, in 2016, is that you can still generate highly targeted leads, for FREE, just by improving your website and optimising your content to be as relevant as possible for a buyer looking for your company, product or service. As you can imagine, there is a lot of competition now for that free traffic, even from Google itself in some niches. You shouldn't compete with Google; you should focus on competing with your competitors.

The process can be practised, successfully, in a bedroom or a workplace, but it has traditionally always involved mastering many skills as they arose, including diverse marketing technologies including but not limited to: website design, accessibility, usability, user experience, website development (PHP, HTML, CSS, etc.), server management, domain management, copywriting, spreadsheets, backlink analysis, keyword research, social media promotion, software development, analytics and data analysis, information architecture, search log analysis, and watching Google for hours on end.

It takes a lot, in 2016, to rank a page on merit in Google in competitive niches. The big stick Google is hitting every webmaster with (at the moment, and for the foreseeable future) is the 'QUALITY USER EXPERIENCE' stick. If you expect to rank in Google in 2016, you had better have a quality offering, not based entirely on manipulation or old-school tactics. Is a visit to your site a good user experience? If not, beware of the manual 'Quality Raters' and the Google Panda/Site Quality algorithms that are on the lookout for a poor user experience for their users.
Google raising the 'quality bar', year on year, ensures a higher level of quality in online marketing in general (above the very low quality we have seen over the last few years). Success online involves investment in higher quality on-page content, website architecture, usability, conversion optimisation and promotion. If you don't take that route, you will be chased down by Google's algorithms at some point in the coming year.

This 'what is SEO' guide (and this entire website) is not about churn-and-burn Google SEO (called webspam by Google), as that is too risky to deploy on a real business website in 2016. What is a successful strategy? Get relevant. Get trusted. Get popular. It is no longer just about manipulation in 2016. It's about adding quality and often useful content to your website that together meet a PURPOSE and deliver USER SATISFACTION. If you are serious about getting more free traffic from search engines, get ready to invest time and effort in your website and online marketing.

Google wants to rank QUALITY documents in its results, and to force those who wish to rank high to invest in higher-quality content or a great service that attracts editorial links from reputable websites. If you are prepared to add a lot of great content to your website and create buzz about your company, Google will rank you high. If you try to manipulate Google, it will penalise you for a period, and often until you fix the offending issue, which, we know, can take YEARS.

Backlinks in general, for example, are still weighted FAR too positively by Google, and they are manipulated to drive a site to the top positions, for a while. That is why blackhats do it, and they have the business model to do so. It is still the easiest way to rank a site, even today. But if you are a real business that intends to build a brand online, you cannot use black hat methods. Full stop. Fixing the problems will not necessarily bring organic traffic back to what it was before a penalty. Recovering from a Google penalty is a 'new growth' process as much as it is a 'clean-up' process.

Google rankings are in constant ever-flux. It is Google's job to MAKE MANIPULATING THE SERPs HARD. So the people behind the algorithms keep 'moving the goalposts', modifying the 'rules' and raising 'quality' standards for pages that compete for the top ten rankings. In 2016 we have seen a continuous flux in the SERPs, and that seems to suit Google and keep everybody guessing. Google is very secretive about its 'secret sauce' and offers sometimes helpful, sometimes vague advice (and some say misdirection) about how to get more of its valuable traffic. Google is on record as saying the engine is intent on 'frustrating' search engine optimisers attempting to improve the amount of high-quality traffic to a website, at least (but not limited to) those using low-quality strategies classed as web spam.

At its core, Google search engine optimisation is still about KEYWORDS and LINKS. It is about RELEVANCE, REPUTATION and TRUST. It is about QUALITY OF CONTENT and VISITOR SATISFACTION.
A good USER EXPERIENCE is a key to winning, and keeping, the highest rankings in many verticals.

Relevance, Authority and Trust: web page optimisation is about making a web page relevant and trusted enough to rank for a query. It is about ranking for valuable keywords for the long term, on merit. You can play by the 'white hat' rules laid down by Google, and aim to build this Authority and Trust naturally, over time, or you can choose to ignore the rules and go full-time 'black hat'. Most SEO tactics still work, for some time, on some level, depending on who is doing them and how the campaign is deployed. Whichever route you take, know that if Google catches you trying to modify your rank using overtly obvious and manipulative methods, it will class you a web spammer, and your site will be penalised (you will not rank high for relevant keywords). These penalties can last years if not addressed, as some penalties expire and some do not, and Google wants you to clean up any violations.

Google does not want you to try to modify where you rank, easily. Critics would say Google would prefer you paid them to do that using Google AdWords. The problem for Google is that ranking high in Google's organic listings is real social proof for a business, a way to avoid PPC costs, and still, simply, the BEST WAY to drive VALUABLE traffic to a site. It is FREE, too, once you have met the ever-increasing criteria it takes to rank at the top.

'User experience': Is user experience a ranking factor? User experience is mentioned 16 times in the main content of the Quality Raters Guidelines (official PDF), but we have been told by Google that it is not, per se, a classifiable 'ranking factor' on desktop search, at least. 'On mobile, sure, since UX is the base of the mobile-friendly update. On desktop currently no.' (Gary Illyes, Google, May 2015)

While UX, we are told, is not literally a 'ranking factor', it is useful to understand exactly what Google calls a 'poor user experience', because if any poor UX signals are identified on your website, that is not going to be a healthy thing for your rankings any time soon. Matt Cutts' consistent SEO advice was to focus on a satisfying user experience. For Google rankings, UX, at least from a quality rater's perspective, revolves around marking the page down for: misleading or potentially deceptive design; sneaky redirects (cloaked affiliate links); malicious downloads; spammed user-generated content (unmoderated comments and posts); low-quality MC (main content of the page); and low-quality SC (supplementary content).

What is SC (supplementary content)? When it comes to a web page and positive UX, Google talks a lot about the functionality and utility of helpful Supplementary Content, e.g. helpful navigation links for users (which are not, usually, MC or ads). Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose.
SC is created by webmasters and is an important part of the user experience. One common type of SC is navigation links that allow users to visit other parts of the website. Note that, in some cases, content behind tabs may be considered part of the SC of the page. To summarise, a lack of helpful SC may be a reason for a Low quality rating, depending on the purpose of the page and the type of website. Google has different standards for small websites that exist to serve their communities versus large websites with a large volume of pages and content. For some types of web pages, such as PDFs and JPEG files, it expects no SC at all.

It is worth remembering that good SC cannot save poor MC ('Main Content is any part of the page that directly helps the page achieve its purpose') from a low-quality rating. Good SC certainly seems a sensible option. It always has been. The key points about Supplementary Content: SC can be a large part of what makes a high-quality page very satisfying for its purpose. Helpful SC is content that is specifically targeted to the content and purpose of the page. Smaller websites, such as websites for local businesses and community organisations, or personal websites and blogs, may need less SC for their purpose. A page can still receive a High or even Highest rating with no SC at all.

Here are the specific quotes containing the term SC: 'Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose.' 'SC is created by webmasters and is an important part of the user experience. One common type of SC is navigation links which allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page.' 'SC which contributes to a satisfying user experience on the page and website' (a mark of a high-quality website; this statement was repeated five times). 'However, we do expect websites of large companies and organizations to put a great deal of effort into creating a good user experience on their website, including having helpful SC. For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating.' 'However, some pages are deliberately designed to shift the user's attention from the MC to the Ads, monetized links, or SC. In these cases, the MC becomes difficult to read or use, resulting in a poor user experience. These pages should be rated Low.' 'Misleading or potentially deceptive design makes it hard to tell that there's no answer, making this page a poor user experience.' 'Redirecting is the act of sending a user to a different URL than the one initially requested. There are many good reasons to redirect from one URL to another, for example, when a website moves to a new address. However, some redirects are designed to deceive search engines and users. These are a very poor user experience, and users may feel tricked or confused. We will call these "sneaky redirects". Sneaky redirects are deceptive and should be rated Lowest.'
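On the legitimate side of that distinction, a permanent redirect is the standard way to tell browsers and search engines a page has moved for good. Here is a minimal sketch in PHP; the destination URL is a placeholder, and a real site would often do this at the server-configuration level instead:

```php
<?php
// A minimal sketch of a legitimate permanent (301) redirect,
// e.g. when a page has moved to a new address for good.
// The destination URL below is a placeholder.
header('Location: https://www.example.com/new-address/', true, 301);
exit; // stop the script so nothing else is sent after the redirect
```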
However, you may find pages with a large amount of spammed forum discussions or spammed user comments: 'We'll consider a comment or forum discussion to be "spammed" if someone posts unrelated comments which are not intended to help other users, but rather to advertise a product or create a link to a website. Frequently these comments are posted by a "bot" rather than a real person. Spammed comments are easy to recognize. They may include Ads, download, or other links, or sometimes just short strings of text unrelated to the topic, such as "Good", "Hi", "I'm new here", "How are you today", etc. Webmasters should find and remove this content because it is a bad user experience.' 'The modifications made to the content make it very hard to read, and they are a poor user experience' (lowest quality MC: copied content with little or no time, effort, expertise, manual curation, or added value for users). 'Sometimes, the MC of a landing page is helpful for the query, but the page happens to display porn ads or porn links outside the MC, which can be very distracting and potentially provide a poor user experience. The query and the helpfulness of the MC have to be balanced with the user experience of the page.' 'Pages that provide a poor user experience, such as pages that try to download malicious software, should also receive low ratings, even if they have some images appropriate for the query.'

In short, nobody is going to advise you to create a poor UX, on purpose, in light of Google's algorithms and human quality raters, who are showing an obvious interest in this stuff. Google is rating mobile sites on what it classes as frustrating UX, although on certain levels what Google classes as 'UX' might be quite far removed from what a UX professional is familiar with, in the same way that Google's mobile rating tools differ from, for example, W3C mobile testing tools. Google is still, evidently, more interested in rating the main content of the web page in question, and the reputation of the domain the page is on, relative to your site and competing pages on other domains.

A satisfying UX can help your rankings, second-order factors taken into consideration. A poor UX can seriously impact your human-reviewed rating, at least. Google's punishing algorithms probably class pages as something akin to a poor UX if they meet certain detectable criteria, e.g. a lack of reputation, or old-school SEO stuff like keyword stuffing a site. If you are improving user experience by focusing primarily on the quality of the MC of your pages, and avoiding, even removing, old-school SEO techniques, those are certainly positive steps to getting more traffic from Google in 2016, and the type of content performance Google rewards is, in the end, largely at least about a satisfying user experience.

Balancing conversions with usability and user satisfaction. Take pop-up windows or pop-unders as an example: according to usability expert Jakob Nielsen, 95% of website visitors hated unexpected or unwanted pop-up windows, especially those that contain unsolicited advertising.
In fact, pop-ups have consistently been voted the number 1 most hated advertising technique since they first appeared many years ago. Accessibility students agree: opening a new browser window should be under the authority of the user; new windows should not clutter the user's screen; all links should open in the same window by default. (An exception, however, may be made for pages containing a list of links. It is convenient, in such cases, to open links in another window, so that the user can come back easily to the list of links. Even in such cases, it is advisable to give the user prior notice that links will open in a new window.) Tell visitors they are about to invoke a pop-up window, using the link's title attribute (see the sketch below). Pop-up windows do not work in all browsers. They are disorienting for users. Provide the user with an alternative.

It is inconvenient for usability aficionados to hear that pop-ups can be used successfully to vastly increase signup conversions. EXAMPLE: A TEST using a pop-up window. Pop-ups suck; everybody seems to agree. Here is a small test I carried out on a subset of pages, an experiment to see whether pop-ups work on this site to convert more visitors into subscribers. I tested it when I had not blogged for a few months and traffic was very stable. The results of testing pop-up windows: a fair increase in email subscribers across the board in this small experiment on this site. Using a pop-up seems to have an immediate impact. I have tested this over a few months, and the results of the small test above were repeated over and over. I tested different layouts and different calls to action without pop-ups, and they also work, to some degree, but they typically take a little longer to deploy than turning on a plugin.

I do not really like pop-ups, as they have been an impediment to web accessibility, but it is stupid to dismiss out of hand any technique that works. I have also never found a client who, if they had this kind of result, would choose accessibility over subscriptions. I do not use the pop-up on the days I post to the blog, as in other tests it really seemed to kill how many people share a post in social media circles. With Google now showing an interest in interstitials, I would be very nervous about employing a pop-up window that obscures the primary reason for visiting the page. If Google detects dissatisfaction, I think that would be very bad news for your rankings. I am, at the moment, using an exit-strategy pop-up window, as hopefully, by the time the user sees this device, they are satisfied with the content they came to read. I can recommend this as a way to increase your subscribers, at the moment, with a conversion rate similar to that of the pop-up, if NOT BETTER.

I think, as an optimiser, it is sensible to convert customers without using techniques that could negatively impact Google rankings. Do NOT let conversion get in the way of the PRIMARY reason a visitor is CURRENTLY on ANY PARTICULAR PAGE, or you risk Google detecting relative dissatisfaction with your site, and that will not help you as Google's RankBrain gets better at working out what 'quality' really means.
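To illustrate the accessibility advice above about announcing new windows, here is a minimal sketch in PHP that outputs such a link; the URL is a placeholder:

```php
<?php
// A minimal sketch of the accessibility advice above: if a link
// must open a new window, say so up front via the title attribute
// and in the link text itself. The URL below is a placeholder.
echo '<a href="https://www.example.com/resources/"'
   . ' target="_blank"'
   . ' title="Opens in a new window">'
   . 'Useful resources (opens in a new window)</a>';
```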
Google wants to rank high-quality websites. Google has a history of classifying your site as some type of entity, and whatever that classification is, you do not want a low-quality label on it, whether put there by algorithm or human. Manual raters may not directly affect your rankings, but any signal associated with Google marking your site as low-quality should probably be avoided. If you are making websites to rank in Google without unnatural practices, you are going to have to meet Google's expectations in the Quality Raters Guidelines (PDF).

Low-quality pages are unsatisfying or lacking in some element that prevents them from achieving their purpose well. In some cases, there is reason enough to immediately mark the page down, and Google directs quality raters to do so: an unsatisfying amount of MC is a sufficient reason to give a page a Low quality rating; low-quality MC is a sufficient reason to give a page a Low quality rating; a lack of appropriate E-A-T (expertise, authoritativeness, trustworthiness) is a sufficient reason to give a page a Low quality rating; negative reputation is a sufficient reason to give a page a Low quality rating.

What are low-quality pages? When it comes to defining what a low-quality page is, Google is evidently interested in the quality of the Main Content (MC) of a page: Google says the MC should be the primary reason 'a page exists'. The quality of the MC is low. There is an unsatisfying amount of MC for the purpose of the page. There is an unsatisfying amount of website information.

POOR MC AND POOR USER EXPERIENCE: 'This content has many problems: poor spelling and grammar, complete lack of editing, inaccurate information. The poor quality of the MC is a reason for the Lowest to Low rating. In addition, the popover ads (the words that are underlined in blue) can make the main content difficult to read, resulting in a poor user experience.' 'Pages that provide a poor user experience, such as pages that try to download malicious software, should also receive low ratings, even if they have some images appropriate for the query.'

DESIGN FOCUS NOT ON MC: 'If a page seems poorly designed, take a good look. Ask yourself if the page was deliberately designed to draw attention away from the MC. If so, the Low rating is appropriate.' 'The page design is lacking. For example, the page layout or use of space distracts from the MC, making it hard to use the MC.'

MC LACKING AUTHOR EXPERTISE: 'You should consider who is responsible for the content of the website or content of the page you are evaluating. Does the person or organization have sufficient expertise for the topic? If expertise, authoritativeness, or trustworthiness is lacking, use the Low rating.' 'There is no evidence that the author has medical expertise. Because this is a YMYL medical article, lacking expertise is a reason for a Low rating.' 'The author of the page or website does not have enough expertise for the topic of the page and/or the website is not trustworthy or authoritative for the topic. In other words, the page/website is lacking E-A-T.'

After page content, the following are given the most weight in determining whether you have a high-quality page.
POOR SUPPLEMENTARY CONTENT: 'Unhelpful or distracting SC that benefits the website rather than helping the user is a reason for a Low rating.' 'The SC is distracting or unhelpful for the purpose of the page.' 'The page is lacking helpful SC.' 'For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating.'

DISTRACTING ADVERTISEMENTS: 'For example, an ad for a model in a revealing bikini is probably acceptable on a site that sells swimsuits, however, an extremely distracting and graphic porn ad may warrant a Low rating.'

SITE NEGLECT: 'If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.' 'The website is lacking maintenance and updates.'

SERP SENTIMENT AND NEGATIVE REVIEWS: 'Credible negative (though not malicious or financially fraudulent) reputation is a reason for a Low rating, especially for a YMYL page.' 'The website has a negative reputation.'

When it comes to Google assigning your page the Lowest rating, you will probably have to go some way to hit this, but it gives you a direction you want to make sure you avoid at all costs. Google says throughout the document that there are certain pages that '...should always receive the Lowest rating', and these are presented below. Note: these statements are spread throughout the raters' document and are not listed there in the order I list them here. I do not think any context is lost by presenting them this way, and it makes the material more digestible. Anyone familiar with the Google Webmaster Guidelines will be familiar with most of the following:

'True lack of purpose pages or websites. Sometimes it is difficult to determine the real purpose of a page.' 'Pages on YMYL websites with completely inadequate or no website information.' 'Pages or websites that are created to make money with little to no attempt to help users.' 'Pages with extremely low or lowest quality MC. If a page is deliberately created with no MC, use the Lowest rating. Why would a page exist without MC? Pages with no MC are usually lack of purpose pages or deceptive pages.' 'Webpages that are deliberately created with a bare minimum of MC, or with MC which is completely unhelpful for the purpose of the page, should be considered to have no MC. Webpages deliberately created with no MC should be rated Lowest.' 'Important: The Lowest rating is appropriate if all or almost all of the MC on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.' 'Pages on YMYL (Your Money or Your Life) websites with completely inadequate or no website information.' 'Pages on abandoned, hacked, or defaced websites.'
'Pages or websites created with no expertise, or pages that are highly untrustworthy, unreliable, unauthoritative, inaccurate, or misleading.' 'Harmful or malicious pages or websites.' 'Websites that have extremely negative or malicious reputations.' Also use the Lowest rating for violations of the Google Webmaster Quality Guidelines. Finally, Lowest may be used both for pages with many low-quality characteristics and for pages whose lack of a single page quality characteristic makes you question the true purpose of the page. Important: negative reputation is a sufficient reason to give a page a Low quality rating; evidence of truly malicious or fraudulent behaviour warrants the Lowest rating.

'Deceptive pages or websites. Deceptive webpages appear to have a helpful purpose (the stated purpose), but are actually created for some other reason. Use the Lowest rating if a webpage is deliberately created to deceive and potentially harm users in order to benefit the website.' 'Some pages are designed to manipulate users into clicking on certain types of links through visual design elements, such as page layout, organization, link placement, font color, images, etc. We will consider these kinds of pages to have deceptive page design. Use the Lowest rating if the page is deliberately designed to manipulate users to click on Ads, monetized links, or suspect download links with little or no effort to provide helpful MC.'

'Sometimes, pages just don't feel trustworthy. Use the Lowest rating for any of the following: pages or websites that you strongly suspect are scams; pages that ask for personal information without a legitimate reason (for example, pages that ask for name, birthdate, address, bank account, government ID number, etc.); websites that phish for passwords to Facebook, Gmail, or other popular online services; pages with suspicious download links, which may be malware.' 'Use the Lowest rating for websites with extremely negative reputations.'

Websites 'lacking care and maintenance' are rated 'Low quality'. Sometimes a website may seem a little neglected: links may be broken, images may not load, and the content may feel stale or outdated. If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.

Broken or non-functional pages are classed as low quality. Google offers clear advice on creating useful 404 pages: tell visitors clearly that the page they are looking for cannot be found; use language that is friendly and inviting; make sure your 404 page uses the same look and feel (including navigation) as the rest of your site; consider adding links to your most popular articles or posts, as well as a link to your site's home page; think about providing a way for users to report a broken link; and make sure that your web server returns an actual 404 HTTP status code when a missing page is requested.
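That last point is easy to get wrong: many 'missing page' templates are served with a 200 status, so search engines treat them as real pages. A minimal sketch in PHP (the template path is a placeholder):

```php
<?php
// A minimal sketch: serve the friendly "not found" template with a
// real 404 HTTP status code, so search engines do not index it as
// an ordinary page. The include path below is a placeholder.
http_response_code(404);
include 'templates/not-found.php'; // same look and feel as the rest of the site
exit;
```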
Ratings for pages with error messages or no MC: Google does not want to index pages without a specific purpose or sufficient main content. A good 404 page and a proper setup prevent this from happening in the first place. 'Some pages load with content created by the webmaster, but have an error message or are missing MC. Pages may lack MC for various reasons. Sometimes, the page is broken and the content does not load properly, or at all. Sometimes, the content is no longer available and the page displays an error message with this information. Many websites have a few broken or non-functioning pages. This is normal, and those individual non-functioning or broken pages on an otherwise maintained site should be rated Low quality. This is true even if other pages on the website are overall High or Highest quality.'

Does Google programmatically look at 404 pages? We are told NO, in a recent hangout, but, per the Quality Raters Guidelines, users probably care a lot.

Do 404 errors in Search Console hurt my rankings? '404 errors on invalid URLs do not harm your site's indexing or ranking in any way.' (John Mueller) It appears this isn't a one-size-fits-all answer. If you properly deal with mishandled 404 errors that have some link equity, you reconnect equity that was once lost, and this 'backlink reclamation' evidently has value.

The issue here is that Google introduces a lot of noise into that Crawl Errors report, making it unwieldy and not very user-friendly. A lot of the broken links Google tells you about can often be totally irrelevant legacy issues. Google could make the report instantly more valuable by telling us which 404s are linked to from external websites only. Fortunately, you can find your own broken links on site using the myriad of SEO tools available. I also prefer to use Analytics to look for broken backlinks on a site with some history of migrations, for instance.

John has clarified some of this before, although he is talking specifically (I think) about errors found by Google in Search Console (formerly Google Webmaster Tools): 'In some cases, crawl errors may come from a legitimate structural issue within your website or CMS. How do you tell? Double-check the origin of the crawl error. If there's a broken link on your site, in your page's static HTML, then that's always worth fixing.' What about the funky URLs that are clearly broken? 'When our algorithms like your site, they may try to find more great content on it, for example by trying to discover new URLs in JavaScript. If we try those URLs and find a 404, that's great and expected. We just don't want to miss anything important.'

If you are making websites and want them to rank, the 2015 and 2014 Quality Raters Guidelines documents are a great guide for webmasters to avoid low-quality ratings and potentially avoid punishment algorithms.

Google is not going to rank low-quality pages when it has better options. If you have exact match instances of key phrases on low-quality pages, mostly these pages won't have all the compound ingredients it takes to rank high in Google in 2016. I was working on this long before I understood it well enough to write anything about it.
Here are a few examples of taking a standard page that did not rank for years and then turning it into a topic-oriented resource page designed around a user's intent: Google, in many instances, would rather send long-tail search traffic, like users using mobile VOICE SEARCH, for instance, to high-quality pages ABOUT a concept or topic that explains the relationships and connections between relevant sub-topics FIRST, rather than send that traffic to low-quality pages just because they have the exact phrase on the page.

Technical SEO

If you are doing a professional SEO audit for a real business, you are going to have to think like a Google Search Quality Rater AND a Google search engineer to provide real long-term value to a client. Google has a LONG list of technical requirements it advises you to meet, on top of all the things it tells you NOT to do to optimise your website. Meeting Google's technical guidelines is no magic bullet to success, but failing to meet them can impact your rankings in the long run, and the odd technical issue can actually severely impact your entire site if rolled out across multiple pages.

The benefit of adhering to technical guidelines is often a second-order benefit. You don't get penalised, or filtered, when others do. When others fall, you rise. Mostly, individual technical issues will not be the reason you have ranking problems, but they still need to be addressed for any second-order benefit they provide.

When making a site for Google in 2016, you really need to understand that Google has a long list of things it will mark sites down for, and that's usually old-school SEO tactics which are now classed as 'web spam'. Conversely, sites that are not marked down are not demoted, and so improve in rankings. Sites with higher rankings pick up more organic links, and this process can float a high-quality page quickly to the top of Google. So the sensible thing for any webmaster is to NOT give Google ANY reason to DEMOTE a site. Tick all the boxes Google tells you to tick. I have used this simple (but longer-term) strategy to rank on page 1 or thereabouts for 'SEO' in the UK over the last few years, and to drive 100 thousand relevant organic visitors to this site, every month, to only about 70 pages, without building any links over the last few years (and very much working on it part-time).

What Is Domain Authority?

Domain authority, or as Google calls it, 'online business authority', is an important ranking factor in Google. What is domain authority? Well, nobody outside of Google knows exactly how Google calculates popularity, reputation, intent or trust, but when I write about domain authority I am generally thinking of sites that are popular, reputable and trusted, all of which can be faked, of course. Most sites that have domain authority/online business authority have lots of links pointing to them, that's for sure, hence why link building has traditionally been so popular a tactic, and counting these links is generally how most third-party tools calculate a pseudo domain authority score, too.

Massive domain authority and ranking 'trust' was, in the past, awarded to very successful sites that had gained a lot of links from credible sources, and from other online business authorities too. 'Amazon has a lot of online business authority...' (Official Google Webmaster Blog). SEOs more usually talk about domain trust and domain authority based on the number, type and quality of incoming links to a site.
Examples of trusted, authority domains include Wikipedia, the W3C and Apple. How do you become an OBA? Through building a killer online or offline brand or service with, usually, a lot of useful content on your site. How do you take advantage of being an online business authority? Either you turn the site into an SEO black hole (only for the very biggest brands) or you pump out information, all the time, on any subject, because Google will rank it. EXCEPT, that is, if what you publish is deemed low quality and not suitable for your domain to have visibility on Google. I think this 'quality score' Google has developed could be Google's answer to this sort of historical domain authority abuse.

Can you (on a smaller scale, in certain niches) mimic an online business authority, by recognising what OBAs do for Google, and why Google ranks them high in search results? These sites provide THE service, THE content, THE experience. This takes a lot of work and a lot of time to create, or even mimic. In fact, as an SEO, I honestly think the content route is the only sustainable way for most businesses to try to achieve OBA status, at least in their niche or locale. I concede a little focused link building goes a long way to help, and you have certainly got to get out there and tell others about your site... 'Have other relevant sites link to yours.' (Google Webmaster Guidelines)

Brands are how you sort out the cesspool. 'Brands are the solution, not the problem,' Mr. Schmidt said. 'Brands are how you sort out the cesspool.' Google CEO Eric Schmidt said this. Reading between the lines, I've long thought this is good SEO advice. If you are a 'brand' in your space, or a well-cited site, Google wants to rank your stuff at the top because it trusts you won't spam it and fill results pages with crap and make Google look stupid. That's money just sitting on the table, the way Google currently awards massive domain authority and trust to particular sites it rates highly.

Tip: keep content within your topic, unless you are producing high-quality content, of course (e.g. the algorithms detect no unnatural practices). I am always thinking: 'How do I get links from big KNOWN sites to my site? Where is my next quality link coming from?' Getting links from 'brands' (or well-cited websites) in niches can mean 'quality links'. Easier said than done, for most, of course, but that is the point. The aim with your main site should always be to become an online brand.

Does Google prefer big brands in organic SERPs? Well, yes. It's hard to imagine that a system like Google's was not designed exactly over the last few years to deliver the listings it does today, and it IS filled with a lot of pages that rank high LARGELY because of the domain the content is on. Big brands have an inherent advantage in Google's ecosystem, and it's kind of a suck for small businesses. There are more small businesses than big brands for Google to get AdWords bucks out of, too. That being said, small businesses can still succeed if they focus on a strategy based on depth, rather than breadth, regarding how content is structured page to page on a website.

Is Domain Age An Important Google Ranking Factor?

Having a ten-year-old domain that Google knows nothing about is the same as having a brand new domain. A ten-year-old site that's continually cited by, year on year, the actions of other, more authoritative, and trusted sites? That's valuable.
But that's not the age of your website address ON ITS OWN in play as a ranking factor. A one-year-old domain cited by authority sites is just as valuable, if not more valuable, than a ten-year-old domain with no links and no search-performance history. Perhaps domain age may come into play when other factors are considered, but I think Google works very much like this on all levels, with all 'ranking factors' and all ranking 'conditions'. I don't think you can consider discovering 'ranking factors' without 'ranking conditions':

Domain age (NOT ON ITS OWN).
Length of site domain registration (I don't see much benefit ON ITS OWN, even knowing 'valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year'; paying for a domain in advance just tells others you don't want anyone else using this domain name, and it is not much of an indication that you're going to do something Google cares about).
Domain registration information hidden/anonymous (possibly, under human review, if OTHER CONDITIONS are met, like looking like a spam site).
Site top-level domain, geographical focus (e.g. .com versus .co.uk) (YES).
Site top-level domain (e.g. .com versus .info) (DEPENDS).
Subdomain or root domain (DEPENDS).
Domain past records, i.e. how often it changed IP (DEPENDS).
Domain past owners, i.e. how often the owner was changed (DEPENDS).
Keywords in the domain (DEFINITELY, ESPECIALLY AN EXACT KEYWORD MATCH, although Google has a lot of filters that mute the performance of an exact match domain in 2016).
Domain IP (DEPENDS; for most, no).
Domain IP neighbours (DEPENDS; for most, no).
Domain external mentions, non-linked (I have no idea in 2016).
Geo-targeting settings in Google Webmaster Tools (YES, of course).

Google Penalties For Unnatural Footprints

In 2016, you need to be aware that what works to improve your rank can also get you penalised (faster, and a lot more noticeably). In particular, the Google web spam team is currently waging a PR war on sites that rely on unnatural links and other 'manipulative' tactics (and handing out severe penalties if it detects them). And that's on top of the many algorithms already designed to look for other manipulative tactics (like keyword stuffing, or boilerplate spun text across pages). Google is making sure it takes longer to see results from black and white hat SEO, and is intent on ensuring a flux in its SERPs based largely on where the searcher is in the world at the time of the search, and where the business is located near to that searcher. There are some things you cannot directly influence legitimately to improve your rankings, but there is plenty you CAN do to drive more Google traffic to a web page.

Ranking Factors

Google has HUNDREDS of ranking factors, with signals that can change daily, weekly, monthly or yearly, to help it work out where your page ranks in comparison to other competing pages in SERPs. You will not ever find every ranking factor. Many ranking factors are on-page or on-site, and others are off-page or off-site. Some ranking factors are based on where you are, or what you have searched for before. I've been in online marketing for 15 years. In that time, a lot has changed. I've learned to focus on the aspects that offer the greatest return on investment of your labour.
Here are a few simple SEO tips to begin with: If you are just starting out, don't think you can fool Google about everything all the time. Google has VERY probably seen your tactics before. So it's best to keep your plan simple. GET RELEVANT. GET REPUTABLE. Aim for a healthy, satisfying visitor experience. If you are just starting out, you may as well learn how to do it within Google's Webmaster Guidelines first. Make a decision, early, whether you are going to follow Google's guidelines or not, and stick to it. Don't be caught in the middle with an important project. Do not always follow the herd.

If your aim is to deceive visitors from Google, in any way, Google is not your friend. Google is hardly your friend at any rate, but you don't want it as your enemy. Google will send you lots of free traffic, though, if you manage to get to the top of search results, so perhaps it is not all that bad.

A lot of optimisation techniques that are effective in boosting a site's rankings in Google are against Google's guidelines. For example, many links that may have once promoted you to the top of Google may, in fact, today be hurting your site and its ability to rank high in Google. Keyword stuffing might be holding your page back. You must be smart, and cautious, when it comes to building links to your site in a manner that Google hopefully won't have too much trouble with in the FUTURE, because they will punish you in the future.

Don't expect to rank number 1 in any competitive niche without a lot of investment and work. Don't expect results overnight. Expecting too much too fast might get you in trouble with the spam team.

You don't pay anything to get into Google, Yahoo or Bing natural, or free, listings. It's common for the major search engines to find your website pretty quickly by themselves within a few days. This is made so much easier if your CMS actually pings search engines when you update content (via XML sitemaps or RSS, for instance); a sketch of this follows below.

To be listed and rank high in Google and other search engines, you really should consider, and mostly abide by, search engine rules and official guidelines for inclusion. With experience and a lot of observation, you can learn which rules can be bent, and which tactics are short-term and should perhaps be avoided.

Google ranks websites (relevancy aside for a moment) by the number and quality of incoming links to a site from other websites (amongst hundreds of other metrics). Generally speaking, a link from a page to another page is viewed in Google's eyes as a vote for the page the link points to. The more votes a page gets, the more trusted a page can become, and the higher Google will rank it, in theory. Rankings are HUGELY affected by how much Google ultimately trusts the DOMAIN the page is on. BACKLINKS (links from other websites) trump every other signal.

I've always thought that if you are serious about ranking, do so with ORIGINAL COPY. It's clear that search engines reward good content they haven't found before. They index it blisteringly fast, for a start (within a second, if your website isn't penalised). So make sure each of your pages has enough text content that you have written specifically for that page, and you won't need to jump through hoops to get it ranking. If you have original, quality content on a site, you also have a chance of generating inbound quality links (IBL).
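On that pinging point: in the 2016 era, Google accepted a simple HTTP 'ping' announcing a fresh sitemap, which is roughly all most CMS plugins did behind the scenes. A minimal sketch in PHP, with a placeholder sitemap URL:

```php
<?php
// A minimal sketch of pinging Google with an updated XML sitemap,
// the kind of thing a CMS does automatically when content changes.
// The sitemap URL below is a placeholder for your own.
$sitemap  = urlencode('https://www.example.com/sitemap.xml');
$response = @file_get_contents('https://www.google.com/ping?sitemap=' . $sitemap);
echo ($response !== false) ? 'Ping sent' : 'Ping failed';
```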
If your content is found on other websites, you will find it hard to get links, and it probably will not rank very well, as Google favours diversity in its results. If you have original content of sufficient quality on your site, you can then let authority websites, those with online business authority, know about it, and they might link to you; this is called a quality backlink.

Search engines need to understand that 'a link is a link' that can be trusted. Links can be designed to be ignored by search engines with the rel="nofollow" attribute. Search engines can also find your site by other websites linking to it. You can also submit your site to search engines directly, but I haven't submitted any site to a search engine in the last ten years; you probably don't need to do that. If you have a new site, I would immediately register it with Google Webmaster Tools these days.

Google and Bing use crawlers (Googlebot and Bingbot) that spider the web looking for new links to find. These bots might find a link to your homepage somewhere on the web and then crawl and index the pages of your site, if all your pages are linked together. If your website has an XML sitemap, for instance, Google will use that to include that content in its index. An XML sitemap is INCLUSIVE, not EXCLUSIVE. Google will crawl and index every single page on your site, even pages outwith an XML sitemap; see the sitemap sketch below.

Many think Google will not allow new websites to rank well for competitive terms until the web address ages and acquires trust in Google; I think this depends on the quality of the incoming links. Sometimes your site will rank high for a while, then disappear for months. A 'honeymoon period' to give you a taste of Google traffic, no doubt.

Google WILL classify your site when it crawls and indexes it, and this classification can have a DRASTIC effect on your rankings. It is important for Google to work out WHAT YOUR ULTIMATE INTENT IS: do you want to be classified as an affiliate site made just for Google, a domain-holding page, or a small business website with a real purpose? Ensure you don't confuse Google: be explicit, with all the signals you can show on your website, that you are a real business and that your INTENT is genuine, and even more importantly today, FOCUSED ON SATISFYING A VISITOR. NOTE: if a page exists only to make money from Google's free traffic, Google calls this spam. I go into this more, later in this guide.

The transparency you provide on your website, in text and links, about who you are, what you do, and how you're rated on the web or as a business, is one way that Google could use (algorithmically and manually) to 'rate' your website. Note that Google has a HUGE army of quality raters, and at some point they will be on your site if you get a lot of traffic from Google.

To rank for specific keyword phrase searches, you usually need to have the keyword phrase, or highly relevant words, on your page (not necessarily all together, but it helps) or in links pointing to your page/site. Ultimately, what you need to do to compete is largely dependent on what the competition for the term you are targeting is doing. You'll need to at least mirror how hard they are competing, if a better opportunity is hard to spot.
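As promised above, here is a minimal sketch of that XML sitemap idea, generated with PHP; the page URLs are placeholders, and on a real site the CMS would usually write this file for you:

```php
<?php
// A minimal sketch of an XML sitemap endpoint in PHP. Sitemaps are
// inclusive, not exclusive: they point crawlers at pages, they do
// not hide anything. The URLs below are placeholders.
header('Content-Type: application/xml; charset=utf-8');
$pages = [
    'https://www.example.com/',
    'https://www.example.com/about/',
    'https://www.example.com/blog/',
];
echo '<?xml version="1.0" encoding="UTF-8"?>' . PHP_EOL;
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . PHP_EOL;
foreach ($pages as $page) {
    echo '  <url><loc>' . htmlspecialchars($page) . '</loc></url>' . PHP_EOL;
}
echo '</urlset>' . PHP_EOL;
```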
As a result of other quality sites linking to your site, the site now has a certain amount of real PageRank, which is shared with all the internal pages that make up your website, and which will in future help provide a signal of where those pages rank. Yes, you need to build links to your site to acquire more PageRank, or Google 'juice', or what we now call domain authority or trust. Google is a link-based search engine: it does not quite understand 'good' or 'quality' content, but it does understand popular content. It can also usually identify poor, or THIN, CONTENT, and it penalises your site for that, or, at least, it takes away the traffic you once had with an algorithm change. Google doesn't like calling the actions it takes a 'penalty'; it doesn't look good. It blames your ranking drops on its engineers getting better at identifying quality content or links, or the inverse: low-quality content and unnatural links. If it does take action against your site for paid links, it calls this a 'Manual Action', and you will get notified about it in Webmaster Tools if you sign up.

Link building is not JUST a numbers game, though. One link from a trusted authority site in Google could be all you need to rank high in your niche. Of course, the more trusted links you attract, the more Google will trust your site. It is evident you need MULTIPLE trusted links from MULTIPLE trusted websites to get the most from Google in 2016.

Try to get links within page text pointing to your site with relevant, or at least natural-looking, keywords in the text link, not, for instance, in blogrolls or site-wide links. Try to ensure the links are not obviously machine-generated, e.g. site-wide links on forums or directories. Get links from pages that, in turn, have a lot of links pointing to them, and you will soon see benefits.

Onsite, consider linking to your other pages by linking to them within the main content text. I usually only do this when it is relevant; often, I'll link to relevant pages when the keyword is in the title elements of both pages. I don't go in for auto-generating links at all. Google has penalised sites for using particular auto-link plugins, for instance, so I avoid them. Linking to a page with actual key phrases in the link helps a great deal in all search engines when you want to feature for specific key terms: for example, 'SEO Scotland', as opposed to 'hobo-web.co.uk' or 'click here' (see the sketch below). Saying that, in 2016, Google is punishing manipulative anchor text very aggressively, so be sensible, and stick to brand mentions and plain URL links that build authority with less risk. I rarely ever optimise for grammatically incorrect terms these days (especially with links).

I think the anchor text of links in internal navigation is still valuable, but keep it natural. Google needs links to find and help categorise your pages. Don't underestimate the value of a clever internal-link, keyword-rich architecture, and be sure to understand, for instance, how many words Google counts in a link, but don't overdo it. Too many links on a page could be seen as a poor user experience. Avoid lots of hidden links in your template navigation.

Search engines like Google 'spider' or 'crawl' your entire site by following all the links on your site to new pages, much as a human would click on the links to your pages. Google will crawl and index your pages, and within a few days, usually, begin to return your pages in SERPs.
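Here is a minimal sketch of that anchor text advice, as PHP-generated HTML; the URLs are placeholders:

```php
<?php
// A minimal sketch of the internal anchor text advice above:
// descriptive, keyword-relevant link text beats a generic label.
// The URLs below are placeholders.

// Descriptive anchor text that tells users (and crawlers) what the target page is about:
echo '<a href="https://www.example.com/seo-scotland/">SEO Scotland</a>';

// Generic anchor text to avoid:
echo '<a href="https://www.example.com/seo-scotland/">click here</a>';
```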
After a while, Google will know about your pages and keep the ones it deems useful: pages with original content, or pages with a lot of links to them. The rest will be de-indexed. Be careful – too many low-quality pages on your site will impact your overall site performance in Google. Google is on record talking about good and bad ratios of quality content to low-quality content. Ideally, you will have unique pages, with unique page titles and unique page meta descriptions. Google does not seem to use the meta description when ranking your page for specific keyword searches if it is not relevant, and, if you are not careful, you might end up just giving spammers free original text for their site, and not yours, once they scrape your descriptions and put the text in the main content of their own pages. I don't worry about meta keywords these days, as Google and Bing say they either ignore them or use them as spam signals. Google will take some time to analyse your entire site, examining text content and links. This process is taking longer and longer these days, but it is ultimately determined by your domain reputation and real PageRank. If you have a lot of duplicate, low-quality text that Googlebot has already found on other websites it knows about, Google will ignore your page. If your site or page has spammy signals, Google will penalise it, sooner or later. If you have lots of these pages on your site, Google will ignore most of your website. You don't need to keyword stuff your text to beat the competition. You optimise a page for more traffic by increasing the frequency of the desired key phrase, related key terms, co-occurring keywords and synonyms in links, page titles and text content. There is no ideal amount of text and no magic keyword density. Keyword stuffing is a tricky business, too, these days. I prefer to make sure I have as many UNIQUE relevant words on the page as possible, making up as many relevant long-tail queries as possible. If you link out to irrelevant sites, Google may ignore the page, too – but, again, it depends on the site in question. Who you link to, and HOW you link to them, REALLY DOES MATTER – I expect Google to use your linking practices as a potential means by which to classify your site. Affiliate sites, for example, don't do well in Google these days without some good-quality backlinks and higher-quality pages. Many search engine marketers think that who you link out to (and who links to you) helps determine a topical community of sites in any field, or a hub of authority. Quite simply, you want to be in that hub – at the centre, if possible (however unlikely) – but at least in it. I like to think of this as a good thing to remember for the future, as search engines get even better at determining the topical relevancy of pages, but I have never actually seen any granular ranking benefit (for the page in question) from linking out. I've got by by thinking that external links to other sites should probably be on single pages deeper in your site architecture, with those pages receiving all your Google juice once it has been soaked up by the higher pages in your site structure (the home page, your category pages). This tactic is old school, but I still follow it, and I don't think you need to worry about it too much in 2016. Original content is king and will attract natural link growth – in Google's opinion. Too many incoming links, too fast, might devalue your site – but, again, I usually err on the safe side: I have always aimed for massive diversity in my links, to make them look 'more natural'.
Honestly, I go for natural links in 2016, full stop, for this website. Google can devalue whole sites, individual pages, template-generated links and individual links if it deems them unnecessary and a 'poor user experience'. Google knows who links to you, the quality of those links, and whom you link to. These – and other factors – help ultimately determine where a page on your site ranks. To make it more confusing, the page that ranks on your site might not be the page you want to rank, or even the page that determines your rankings for the term. Once Google has worked out your domain authority, sometimes it seems that the most relevant page on your site that Google HAS NO ISSUE with will rank. Google decides which pages on your site are important or most relevant. You can help Google by linking to your important pages and ensuring at least one page is well optimised, amongst the rest of your pages, for your desired key phrase. Always remember Google does not want to rank 'thin' pages in results – any page you want to rank should have all the things Google is looking for. That's a lot these days! It is important you spread all that real 'PageRank' – or link equity – to your keyword-phrase-rich sales pages, while enough remains with the rest of the site's pages, so Google does not 'demote' pages into oblivion – or 'supplemental results', as we old-timers knew them back in the day. Again, this is slightly old school, but it gets me by, even today. Consider linking to important pages on your site from your home page and from other important pages on your site. Focus on RELEVANCE first. Then focus your marketing efforts on getting REPUTABLE. This is the key to ranking 'legitimately' in Google in 2016. Every few months, Google changes its algorithm to punish sloppy optimisation or industrial manipulation. Google Panda and Google Penguin are two such updates, but the important thing is to understand that Google changes its algorithms constantly to control its listings pages (over 600 changes a year, we are told). The art of rank modification is to rank without tripping these algorithms or getting flagged by a human reviewer – and that is tricky! Focus on improving website download speeds at all times. The web is changing very fast, and a fast website is a good user experience. Welcome to the tightrope that is modern web optimisation. Read on if you would like to learn how to SEO…

Keyword Research

The first step in any professional campaign is to do some keyword research and analysis. Somebody asked me about this simple white-hat tactic, and I think it is probably the simplest thing anyone can do that guarantees results. The chart above (from last year) illustrates a reasonably valuable four-word term that I noticed a page of mine didn't rank high in Google for, but which I thought it probably should, and could, rank for with this simple technique. I thought it simple enough as an example to illustrate an aspect of on-page SEO, or 'rank modification', that's white hat, 100% Google-friendly and never, ever going to cause you a problem with Google. This 'trick' works with any keyword phrase, on any site, with obviously differing results based on the availability of competing pages in SERPs and the availability of content on your site. The keyword phrase I am testing rankings for isn't ON the page, and I did NOT add the key phrase…
or in incoming links, or using any technical tricks like redirects or any hidden technique – but, as you can see from the chart, rankings seem to be going in the right direction. You can profit from this if you know a little about how Google works (or seems to work, from many observations over the years – excluding when Google throws you a bone on synonyms; you can't ever be 100% certain you know how Google works on any level, unless it's data showing you're wrong, of course). What did I do to rank number 1 from nowhere for that key phrase? I added one keyword to the page in plain text, because adding the actual 'keyword phrase' itself would have made my text read as a bit keyword-stuffed for other variations of the main term. It gets interesting if you do that to a lot of pages, and a lot of keyword phrases. The important thing is keyword research – and knowing which unique keywords to add. This example illustrates that a key to 'relevance' on a page, in a lot of instances, is a keyword. The precise keyword. Yes, plenty of other things can be happening at the same time. It's hard to identify EXACTLY why Google ranks pages all the time… but you can COUNT on other things happening, and just get on with what you can see works for you. In a time of light optimisation, it's useful to EARN a few terms you SHOULD rank for in simple ways that leave others wondering how you got them. Of course, you can still keyword stuff a page, or still spam your link profile – but it is 'light' optimisation I am genuinely interested in testing on this site – how to get more with less – and I think that's the key to not tripping Google's aggressive algorithms. There are many tools on the web to help with basic keyword research (including the Google Keyword Planner tool, and there are even more useful third-party SEO tools to help you do this). You can use many keyword research tools to quickly identify opportunities to get more traffic to a page.

Google Analytics Keyword 'Not Provided'

Google Analytics was the very best place to look at keyword opportunity for some (especially older) sites, but that all changed a few years back. Google stopped telling us which keywords are sending traffic to our sites from the search engine back in October 2011, as part of privacy concerns for its users: 'Google will now begin encrypting searches that people do by default, if they are logged into Google already through a secure connection. The change to SSL search also means that sites people visit after clicking on results at Google will no longer receive referrer data that reveals what those people searched for, except in the case of ads.' Google Analytics now instead displays the keyword as 'not provided'. 'In Google's new system, referrer data will be blocked. This means site owners will begin to lose valuable data that they depend on, to understand how their sites are found through Google. They'll still be able to tell that someone came from a Google search. They won't, however, know what that search was.' – SearchEngineLand (a great source for Google industry news). You can still get some of this data if you sign up for Google Webmaster Tools (and you can combine it in Google Analytics), but the data even there is limited and often not entirely accurate. The keyword data can be useful, though, and access to backlink data is essential these days.
If the website you are working on is an aged site, there's probably a wealth of keyword data in Google Analytics.

Do Keywords In Bold Or Italic Help?

Some webmasters claim putting your keywords in bold or in italics is a beneficial ranking factor in terms of search engine optimising a page. It is essentially impossible to test this, and I think these days Google could well be using this (and other easy-to-identify on-page optimisation efforts) to determine what to punish a site for, not to promote it in SERPs. Anything you can optimise on your page, Google can use against you to filter you out of results. I use bold or italics these days specifically for users. I only use emphasis if it is natural, or if it is really what I want to emphasise. Do not tell Google what to filter you for that easily. I think Google treats websites it trusts far differently to others in some respects; that is, more trusted sites might get treated differently than untrusted sites. Keep it simple, natural, useful and random.

How Many Words & Keywords Do I Use On A Page?

I get asked this all the time: how much text do you put on a page to rank for a certain keyword? The answer is that there is no optimal amount of text per page; how much text you'll 'need' will be based on your DOMAIN AUTHORITY, your TOPICAL RELEVANCE, how much COMPETITION there is for that term, and HOW COMPETITIVE that competition is. Instead of thinking about the quantity of the text, you should think more about the quality of the content on the page, and optimise it with searcher intent in mind. Well, that's how I do it. I don't find that you need a minimum amount of words or text to rank in Google. I have seen pages with 50 words outrank pages with 100, 250, 500 or 1,000 words. Then again, I have seen pages with no text rank on nothing but inbound links, or other strategies. In 2016, Google is a lot better at hiding away those pages, though. At the moment, I prefer long-form pages with a lot of text, although I still rely heavily on keyword analysis to make my pages. The benefits of longer pages are that they are great for long-tail key phrases, and creating deep, information-rich pages focuses the mind when it comes to producing authoritative, useful content. Every site is different. Some pages, for example, can get away with 50 words because of a good link profile and the domain they are hosted on. For me, the important thing is to make a page relevant to a user's search query. I don't care how many words I achieve this with, and often I need to experiment on a site I am unfamiliar with. After a while, you get an idea of how much text you need to use to get a page on a certain domain into Google. One thing to note: the more text you add to the page, as long as it is unique, keyword-rich and relevant, the more that page will be rewarded with visitors from Google. There is no optimal number of words on a page for placement in Google. Every website, and every page, is different, from what I can see. Don't worry too much about word count if your content is original and informative. Google will probably reward you on some level, at some point, if there is lots of unique text on all your pages.

What Is The Perfect Keyword Density?

The short answer: there isn't one. There is no one-size-fits-all keyword density, no optimal percentage guaranteed to rank any page at number 1. However, I do know you can keyword stuff a page and trip a spam filter.
Most web optimisation professionals agree there is no ideal percentage of keywords in text to get a page to number 1 in Google. Search engines are not that easy to fool, although the key to success in many fields is doing simple things well (or, at least, better than the competition). I write natural page copy where possible, always focused on the key terms. I never calculate density to identify the 'best' percentage – there are way too many other things to work on. I have looked into this: if it looks natural, it's OK with me. I aim to include related terms, long-tail variants and synonyms in primary content at least ONCE, as that is all some pages need. Optimal keyword density is a myth, although there are many who would argue otherwise.

'Things, Not Strings'

Google is better at working out what a page is about, and what it should be about to satisfy the intent of a searcher, and it isn't relying only on keyword phrases on a page to do that anymore. Google has a Knowledge Graph populated with NAMED ENTITIES and, in certain circumstances, Google relies on such information to create SERPs (search engine results pages). Google has plenty of options when rewriting the query in a contextual way, based on what you searched for previously, who you are, how you searched and where you are at the time of the search.

Can I Just Write Naturally and Rank High in Google?

Yes, you must write naturally (and succinctly) in 2016, but if you have no idea of the keywords you are targeting, and no expertise in the topic, you will be left behind those who do. You can just write naturally and still rank, albeit for fewer keywords than you would have if you optimised the page. There are too many competing pages targeting the top spots not to optimise your content. Naturally, how much text you need to write, how much you need to work into it, and where you ultimately rank, is going to depend on the domain reputation of the site you are publishing the article on.

Do You Need Lots of Text To Rank Pages In Google?

User search intent is a way marketers describe what a user wants to accomplish when they perform a Google search. SEOs have understood user search intent to fall broadly into the following categories (there is an excellent post on Moz about this):

Transactional – the user wants to do something: buy, sign up, or register, to complete a task they have in mind.
Informational – the user wishes to learn something.
Navigational – the user knows where they are going.

The Google human quality rater guidelines modify these into simpler constructs (broadly: Know, Do, Website and Visit-in-person queries). As long as you meet the user's primary intent, you can do this with as few words as it takes. You do NOT need lots of text to rank in Google.

Optimise For User Intent & Satisfaction

When it comes to writing SEO-friendly text for Google, we must optimise for user intent, not simply what a user typed into Google. Google will send people looking for information on a topic to the highest-quality, relevant pages it has in its database, often BEFORE it relies on how Google used to work, e.g. finding near or exact-match instances of a keyword phrase on any one page. Google is constantly evolving to better understand the context and intent of user behaviour, and it doesn't mind rewriting the query used to serve high-quality pages to users that comprehensively deliver on user satisfaction, e.g. pages that explore topics and concepts in a unique and satisfying way.
Of course, optimising for user intent, even in this fashion, is something a lot of marketers had been doing long before query rewriting and Google Hummingbird came along.

Optimising For 'The Long Click'

When it comes to rating user satisfaction, there are a few theories doing the rounds at the moment that I think are sensible. Google could be tracking user satisfaction by proxy: when a user uses Google to search for something, user behaviour from that point on can be a proxy for the relevance and relative quality of the actual SERP. What is a long click? A user clicks a result and spends time on it, sometimes terminating the search. What is a short click? A user clicks a result and bounces back to the SERP, pogo-sticking between other results until a long click is observed. Google has this information if it wants to use it as a proxy for query satisfaction. For more on this, I recommend the article on the time to long click.

Optimise Supplementary Content on the Page

Once you have the content, you need to think about supplementary content and secondary links that help users on their journey of discovery. That content CAN be links to your own content on other pages, but if you are really helping a user understand a topic, you should be LINKING OUT to other helpful resources, e.g. other websites. A website that does not link out to ANY other website could accurately be interpreted as, at the least, self-serving. I can't think of a website that is the true end-point of the web.

TASK – On informational pages, LINK OUT to related pages on other sites AND to other pages on your own website where RELEVANT.
TASK – For e-commerce pages, ADD RELATED PRODUCTS.
TASK – Create in-depth content pieces.
TASK – Keep content up to date, minimise ads, maximise conversion, and monitor for broken or redirected links.
TASK – Assign in-depth content to an author with some online authority, or someone with displayable expertise on the subject.
TASK – If running a blog, first, clean it up. To avoid creating pages that might be considered thin content in six months, consider planning a wider content strategy. If you publish 30 thinner pages about various aspects of a topic, you can then fold all this together into a single topic-centred page that helps a user understand something related to what you sell.

Page Title Element

The page title tag (or HTML title element) is arguably the most important on-page ranking factor (with regards to web page optimisation). Keywords in page titles can undeniably HELP your pages rank higher in Google results pages (SERPs). The page title is also often used by Google as the title of a search snippet link in search engine results pages. For me, a perfect title tag in Google is dependent on a number of factors, and I will lay down a couple below, but I have since expanded my page title advice on another page (link below). A page title that is highly relevant to the page it refers to will maximise its usability, search engine ranking performance and click-through satisfaction rate. It will probably be displayed in a web browser's window title bar, and in clickable search snippet links used by Google, Bing & other search engines. The title element is the 'crown' of a web page, with an important keyword phrase featuring AT LEAST once within it.
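A minimal sketch of the element in context – the page topic, keyword phrase and brand here are hypothetical, and, as discussed below, there is no single 'correct' format:

<head>
  <!-- Unique on every page; keyword phrase early, brand at the end -->
  <title>Search Engine Friendly URLs - A Beginner's Guide | Example Brand</title>
</head>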
Most modern search engines have traditionally placed a lot of importance on the words contained within this HTML element. A good page title is made up of valuable keyword phrases with clear user intent. The last time I looked, Google displayed as many characters as it can fit into 'a block element that's 512px wide and doesn't exceed 1 line of text'. So there is NO exact AMOUNT OF CHARACTERS any optimiser can lay down as best practice to GUARANTEE a title will display, in full, in Google, at least as the search snippet title. Ultimately, only the characters and words you use will determine if your entire page title will be seen in a Google search snippet. Google used to display up to 70 characters in a title, but that changed in 2011/2012. If you want to ENSURE your FULL title tag shows in the desktop UK version of Google SERPs, stick to a shorter title of about 55 characters – but that does not mean your title tag MUST end at 55 characters, and remember that your mobile visitors see a longer title (in the UK, in March 2015, at least). I have seen up to 69 characters displayed (back in 2012) – but, as I said, what you see displayed in SERPs depends on the characters you use. In 2016, I just expect what Google displays to change, so I don't obsess about what Google is doing in terms of display. Google is all about 'user experience' and 'visitor satisfaction' in 2016, so it's worth remembering that usability studies have shown that a good page title length is about seven or eight words and fewer than 64 total characters. Longer titles are less scannable in bookmark lists, and might not display correctly in many browsers (and, of course, will probably be truncated in SERPs). Google will INDEX perhaps thousands of characters in a title… but I don't think anyone knows exactly how many characters or words Google will count AS a TITLE when determining relevance for ranking purposes. It is a very hard thing to try to isolate accurately, with all the testing and obfuscation Google uses to hide its 'secret sauce'. I have had ranking success with longer titles – much longer titles. Google certainly reads ALL the words in your page title (unless you are spamming it silly, of course). You can probably include up to 12 words that will be counted as part of a page title, and consider using your important keywords in the first eight words. The rest of your page title will be counted as normal text on the page. NOTE: in 2016, the HTML title element you choose for your page may not be what Google chooses to include in your SERP snippet. The search snippet title and description are very much QUERY dependent these days. Google often chooses what it thinks is the most relevant title for your search snippet, and it can use information from your page, or from links to that page, to create a very different SERP snippet title. When optimising a title, you are looking to rank for as many terms as possible without keyword stuffing the title. Often, the best bet is to optimise for a particular phrase (or phrases) and take a more long-tail approach. Note that too many page titles and not enough actual page text per page could lead to Google Panda or other 'user experience' performance issues. A highly relevant, unique page title is no longer enough to float a page with thin content. Google cares WAY too much about the page text content these days to let a good title hold up a thin page on most sites.
Some page titles do better with a call to action – one which reflects exactly a searcher's intent (e.g. to learn something, buy something, or hire something). Remember, this is your hook in search engines, if Google chooses to use your page title in its search snippet – and there are a lot of competing pages out there in 2016. The perfect title tag on a page is unique to the other pages on the site. In light of Google Panda, an algorithm that looks for 'quality' in sites, you REALLY need to make your page titles UNIQUE and minimise any duplication, especially on larger sites. I like to make sure my keywords feature as early as possible in a title tag, but the important thing is to have important keywords and key phrases in your page title tag SOMEWHERE. For me, when improved search engine visibility is more important than branding, the company name goes at the end of the tag, and I use a variety of dividers to separate elements, as no one way performs best. If you have a recognisable brand, there is an argument for putting it at the front of titles – although Google will often change your title dynamically, sometimes putting your brand at the front of your snippet link title itself. Note that Google is pretty good these days at removing any special characters you have in your page title – and I would be wary of trying to make your title or meta description STAND OUT using special characters. That is not what Google wants, evidently, and it does give you a further chance to make your search snippet stand out with RICH SNIPPETS and SCHEMA mark-up. I like to think I write titles for search engines AND humans. Know that Google tweaks everything regularly – why not what the perfect title keys off? So MIX it up… Don't obsess. Natural is probably better, and will only get better as engines evolve. I optimise for key phrases, rather than just keywords. I prefer mixed-case page titles, as I find them more scannable than titles in ALL CAPS or all lowercase. Generally speaking, the more domain trust and authority your SITE has in Google, the easier it is for a new page to rank for something – so bear that in mind. There is only so much you can do with your page titles: your website's rankings in Google are a LOT more to do with OFFSITE factors than ONSITE ones, negative and positive. Click-through rate is something that is likely measured by Google when ranking pages (Bing says it uses it too, and it now powers Yahoo), so it is worth considering whether you are best optimising your page titles for click-through rate or for more search engine rankings. I would imagine keyword stuffing your page titles could be one area Google looks at (although I see little evidence of it). Remember… think 'keyword phrase', rather than 'keyword', 'keyword', 'keyword'… think long tail. Google will select the best title it wants for your search snippet, and it will take that information from multiple sources, NOT just your page title element. A short title is often appended with more information about the domain. Sometimes, if Google is confident in the BRAND name, it will replace your title with that (often adding it to the beginning of your title with a colon, or sometimes appending the end of your snippet title with the actual domain address the page belongs to).
A Note About Title Tags

When you write a page title, you have a chance, right at the beginning of the page, to tell Google (and other search engines) whether this is a spam site or a quality site – for instance, have you repeated the keyword four times, or only once? I think title tags, like everything else, should probably be as simple as possible, with the keyword once and perhaps a related term if possible. I always aim to keep my HTML page title elements simple and as unique as possible. I'm certainly cleaning up the way I write my titles all the time.

Meta Keywords Tag

A hallmark of shady natural search engine optimisation companies: the meta keywords tag. Companies that waste time and resources on these items waste clients' money – that's a fact. I have one piece of advice for the meta keywords tag, which, like the title tag, goes in the head section of your web page: forget about it. If you are relying on meta keyword optimisation to rank for terms, you're dead in the water. From what I see, Google and Bing ignore meta keywords – or, at least, place no weight in them to rank pages. Yahoo may read them, but really, a search engine optimiser has more important things to worry about than this nonsense. What about other search engines that use them? Hang on while I submit my site to those 75,000 engines first (sarcasm). Yes, ten years ago, early search engines liked looking at your meta keywords. I've seen OPs in forums ponder the best way to write these tags – with commas, with spaces, limited to so many characters. Forget about meta keyword tags: they are a pointless waste of time and bandwidth. We could probably save a rainforest with the bandwidth saved if everybody removed their meta keyword tags. So you have a new site, and you fill your home page meta tags with the 20 keywords you want to rank for – hey, that's what optimisation is all about, isn't it? You've just told Google, by the third line of text, what to filter you for. The meta name="keywords" tag was actually originally for words that weren't on the page but that would help classify the document. Sometimes competitors might use the information in your keywords tag to determine what you are trying to rank for, too. If everybody removed them and stopped abusing meta keywords, Google would probably start looking at them again – but that's the way of things in search engines. I ignore meta keywords and remove them from pages I work on.

Meta Description Tag

Like the title element, and unlike the meta keywords tag, this one is important, both from a human and a search engine perspective. Forget whether or not to put your keyword in it: make it relevant to a searcher and write it for humans, not search engines. If you want this 20-word snippet, which should accurately describe the page you have optimised for one or two keyword phrases, to show when people use Google to search, make sure the keyword is in there. I must say, I normally do include the keyword in the description, as this usually gets it into your SERP snippet. Google looks at the description, but there is debate over whether it uses the description tag to rank sites. I think it might, at some level, but, again, it is a very weak signal. I certainly don't know of an example that clearly shows a meta description helping a page rank. Sometimes I will ask a question with my titles and answer it in the description; sometimes I will just give a hint. That is a lot more difficult in 2016, as search snippets change depending on what Google wants to emphasise to its users.
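A hedged sketch of what I mean – a hypothetical page, with the description written for humans and the keyword phrase worked in once:

<head>
  <!-- Unique per page; roughly 20 words, written to earn the click -->
  <meta name="description" content="Confused by search engine friendly URLs? A plain-English guide to cleaning up your web addresses, with worked examples.">
</head>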
It's also very important to have unique meta descriptions on every page of your site. Sometimes I think that if your titles are spammy, your keywords are spammy and your meta description is spammy, Google might stop right there – they probably will want to save bandwidth at some point. Putting a keyword in the description won't take a crap site to number 1 or raise you 50 spots in a competitive niche – so why optimise for a search engine when you can optimise for a human? I think that is much more valuable, especially if you are in the mix already – that is, on page one for your keyword. So, the meta description tag is important in Google, Yahoo, Bing and every other engine listing – very important to get it right. Make it for humans. Oh, and by the way, Google seems to truncate anything over 156 characters in the meta description, although this may be limited by pixel width in 2016.

Programmatically Generate Meta Descriptions on Large Sites

Google says you can programmatically auto-generate unique meta descriptions based on the content of the page. Follow Google's example, and their advice on why to do this: 'No duplication, more information, and everything is clearly tagged and separated. No real additional work is required to generate something of this quality: the price and length are the only new data, and they are already displayed on the site.' I think it is very important to listen when Google tells you to do something in a very specific way, and Google does give clear advice in this area.

Robots Meta Tag

I could use a robots meta tag to tell Google to index a page but not follow any links on it – or, if for some reason I did not want a page to appear in Google search results, to keep it out of the index altogether. 'By default, Googlebot will index a page and follow links to it. So there's no need to tag pages with content values of INDEX or FOLLOW.' – GOOGLE. There are various instructions you can make use of in your robots meta tag, but remember: Google by default WILL index and follow links, so you have NO need to include that as a command – you can leave the robots meta tag out completely, and probably should if you don't have a clue. 'Googlebot understands any combination of lowercase and uppercase.' – GOOGLE. Valid values for the robots meta tag CONTENT attribute are INDEX, NOINDEX, FOLLOW and NOFOLLOW. For example:

<meta name="robots" content="noindex, follow">
<meta name="robots" content="index, nofollow">
<meta name="robots" content="noindex, nofollow">
<meta name="robots" content="noarchive">
<meta name="googlebot" content="nosnippet">

Google understands and interprets the following robots meta tag values:

NOINDEX – prevents the page from being included in the index.
NOFOLLOW – prevents Googlebot from following any links on the page. (Note that this is different from the link-level NOFOLLOW attribute, which prevents Googlebot from following an individual link.)
NOARCHIVE – prevents a cached copy of the page from being available in the search results.
NOSNIPPET – prevents a description from appearing below the page in the search results, as well as preventing caching of the page.
NOODP – blocks the Open Directory Project description of the page from being used in the description that appears below the page in the search results.
NONE – equivalent to NOINDEX, NOFOLLOW.

Robots META Tag Quick Reference

I've included the robots meta tag in my tutorial as this IS one of only a few meta tags / HTML head elements I focus on when it comes to managing Googlebot and Bingbot.
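As a worked example (hypothetical page), noindexing a low-value internal search results page, while still letting Googlebot follow its links, might look like this:

<head>
  <title>Search results for "blue widgets" - Example Brand</title>
  <!-- Keep this page out of Google's index, but let the crawler follow the links on it -->
  <meta name="robots" content="noindex, follow">
</head>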
At a page level, the robots meta tag is a powerful way to control whether your pages are returned in search results pages. These meta tags go in the HEAD section of an HTML page and represent the only tags for Google I care about. Just about everything else you can put in the HEAD of your HTML document is quite unnecessary and maybe even pointless (for Google optimisation, anyway).

Robots.txt File

If you want to control which pages get crawled and indexed by Google, see my article for beginners to the robots.txt file.

H1-H6: Page Headings

I can't find any definitive proof online that says you need to use heading tags (H1, H2, H3, H4, H5, H6) or that they improve rankings in Google, and I have seen pages do well in Google without them – but I do use them, especially the H1 tag on the page. For me, it's another piece of a 'perfect' page, in the traditional sense, and I try to build a site for Google and humans. I still generally only use one <h1> heading tag in my keyword-targeted pages – I believe this is the way the W3C intended it to be used in HTML4 – and I ensure it is at the top of the page, above relevant page text, and written with my main keywords or related keyword phrases incorporated. I have never experienced any problems using CSS to control the appearance of heading tags, making them larger or smaller. You can use multiple H1s in HTML5, but most sites I find I work on still use HTML4. I use as many H2-H6 tags as necessary, depending on the size of the page, but mainly H1, H2 & H3. You can see here how to use header tags properly (basically, just be consistent, whatever you do, to give your users the best user experience). How many words in the H1 tag? As many as I think is sensible – as short and snappy as possible, usually. I also discovered that Google will use your header tags as page titles, at some level, if your title element is malformed. As always, be sure to make your heading tags highly relevant to the content on that page, and not too spammy, either.

NOTE: ALT tags are counted by Google (and Bing), but I would be careful over-optimising them. I've seen a lot of websites penalised for over-optimising invisible elements on a page. Don't do it. ALT tags are very important, and I think a very rewarding area to get right. I always put the main keyword in an ALT once when addressing a page. Don't optimise your ALT tags (or, rather, attributes) JUST for Google. Use ALT attributes for descriptive text that helps visitors, and keep them unique where possible, as you do with your titles and meta descriptions. Don't obsess. Don't optimise your ALT tags just for Google – do it for humans, accessibility and usability. If you are interested, I conducted a simple test using ALT attributes to determine how many words I could use in IMAGE ALT text that Google would pick up. And remember: even if, like me most days, you can't be bothered with all the image ALT tags on your page, at least use a blank ALT (or NULL value) so people with screen readers can enjoy your page.

Update 17/11/08 – picked this up at SERoundtable about ALT tags. JohnMu from Google: the alt attribute should be used to describe the image, so if you have an image of a big blue pineapple chair, you should use the ALT text that best describes it, i.e. alt="big blue pineapple chair". The title attribute should be used when the image is a hyperlink to a specific page, and should contain information about what will happen when you click on the image.
For example, if the image will get larger, it should read something like title="View a larger version of the big blue pineapple chair image". Barry continues with a quote: 'As the Googlebot does not see the images directly, we generally concentrate on the information provided in the alt attribute. Feel free to supplement the alt attribute with title and other attributes if they provide value to your users. So, for example, if you have an image of a puppy (these seem popular at the moment) playing with a ball, you could use something like "My puppy Betsy playing with a bowling ball" as the alt attribute for the image. If you also have a link around the image, pointing to a large version of the same photo, you could use "View this image in high-resolution" as the title attribute for the link.'

Link Title Attributes, Acronym & ABBR Tags

Does Google count text in the acronym tag? From my tests, no. From observing how my test page ranks, Google is ignoring keywords in the acronym tag. My observations from a test page include:

Link title attribute – no benefit passed via the link to another page, it seems
ABBR (abbreviation tag) – no
Image file name – no
Wrapping words (or at least numbers) in SCRIPT – sometimes; Google is better at understanding what it can render in 2016

It's clear many invisible elements of a page are completely ignored by Google (which would interest us SEOs). Some invisible items are (still) apparently supported:

NOFRAMES – yes
NOSCRIPT – yes
ALT attribute – yes

Unless you really have cause to focus on any particular invisible element, I think the P tag is the most important tag to optimise in 2016.

Search Engine Friendly URLs (SEF)

Clean URLs (or search engine friendly URLs) are just that: clean, easy to read, simple. You do not need clean URLs in your site architecture for Google to spider a site successfully (confirmed by Google in 2008), although I do use clean URLs as a default these days, and have done so for years – it's often more usable. Is there a massive difference in Google when you use clean URLs? No – in my experience it's very much a second- or third-order effect, perhaps even less, if used on its own. However, there is a demonstrable benefit to having keywords in URLs. The thinking is that you might get a boost in Google SERPs if your URLs are clean, because you are using keywords in the actual page name instead of a parameter or session ID number (which Google often struggles with). I think Google might reward the page some sort of relevance because of the actual file or page name, and I optimise as if it does. It is virtually impossible to isolate any ranking factor with a degree of certainty. Where any benefit is slightly detectable is when people (say, in forums) link to your site with the URL as the link. Then it is fair to say you do get a boost, because keywords are in the actual anchor text link to your site – and I believe this is the case, but, again, that depends on the quality of the page linking to your site; that is, whether Google trusts it and whether it passes PageRank and anchor text benefit. And, of course, you'll need citable content on that site of yours. Sometimes I will remove the stop-words from a URL and leave the important keywords as the page name, because a lot of forums garble a URL to shorten it. Most forums will be nofollowed in 2016, to be fair, but some old habits die hard. Sometimes I prefer to see the exact phrase I am targeting as the name of the URL I am asking Google to rank.
I configure URLs the following way: hobo-web.co.uk/?p=292 is automatically changed by the CMS, using URL rewriting, to hobo-web.co.uk/websites-clean-search-engine-friendly-URLs, which I then break down to something like hobo-web.co.uk/search-engine-friendly-URLs. It should be remembered that, although Googlebot can crawl sites with dynamic URLs, many webmasters assume there is a greater risk it will give up if the URLs are deemed unimportant and contain multiple variables and session IDs (a theory). As standard, I use clean URLs where possible on new sites these days, and try to keep the URLs as simple as possible without obsessing about it. That's my aim at all times when I optimise a website to work better in Google: simplicity. Google does look at keywords in the URL, even at a granular level. Having a keyword in your URL might be the difference between your site ranking and not – potentially useful for taking advantage of long-tail search queries. For more, see 'Does Google Count A Keyword In The URI (Filename) When Ranking A Page?'

Absolute Or Relative URLs?

My advice would be to keep it consistent, whichever you decide to use. I prefer absolute URLs; that's just a preference. Google will crawl either, if the local setup is correctly developed. What is an absolute URL? Example: hobo-web.co.uk/search-engine-optimisation. What is a relative URL? Example: search-engine-optimisation.htm. Relative just means relative to the document the link is on. Move that page to another site and it won't work; with an absolute URL, it would.

Subdirectories or Files For URL Structure

Sometimes I use subfolders and sometimes I use files. I have not been able to decide if there is any real benefit (in terms of ranking boost) to using either. A lot of CMSs these days use subfolders in their file paths, so I am pretty confident Google can deal with either. I used to prefer files like .html when I was building a new site from scratch, as they were the end of the line for search engines, as I imagined it, and a subfolder (or directory) was a collection of pages. I used to think it could take more to get a subfolder trusted than, say, an individual file, and I guess this swayed me to use files on most websites I created (back in the day). Once subfolders are trusted, it's six or half a dozen: what the actual difference is in terms of ranking in Google is moot – usually, rankings in Google are more determined by how RELEVANT or REPUTABLE a page is to a query. In the past, subfolders could be treated differently than files (in my experience). Subfolders can be trusted less than other subfolders or pages on your site, or ignored entirely. Subfolders used to seem to me to take a little longer to get indexed by Google than, for instance, .html pages. People talk about trusted domains, but they don't mention (or don't think) that some parts of the domain can be trusted less. Google treats some subfolders differently. Well, it used to – and remembering how Google used to handle things has some benefits, even in 2016. Some say don't go beyond four levels of folders in your file path. I haven't experienced too many issues, but you never know. UPDATED: I think in 2016 it's even less of something to worry about. There are far more important elements to check.

Which Is Better For Google: PHP, HTML or ASP?

Google doesn't care. As long as it renders as a browser-compatible document, it appears Google can read it these days.
I prefer PHP these days, even with flat documents, as it is easier to add server-side code to a document if I want to add some sort of function to the site.

Does W3C Valid HTML / CSS Help You Rank?

Above: a Google video confirming this advice, which I first shared in 2008. Does Google rank a page higher because of valid code? The short answer is no, even though I tested it on a small scale with different results. Google doesn't care if your page is valid HTML and valid CSS. This is clear – check any top ten results in Google and you will probably see that most contain invalid HTML or CSS. I love creating accessible websites, but they are a bit of a pain to manage when you have multiple authors or developers on a site. If your site is so badly designed, with so much invalid code, that even Google and browsers cannot read it, then you have a problem. Where possible, if commissioning a new website, demand, at least, minimum web accessibility compliance on a site (there are three levels of priority to meet), and aim for valid HTML and CSS. Actually, this is the law in some countries, although you would not know it – and be prepared to put a bit of work in to keep your rating. Valid HTML and CSS are a pillar of best-practice website optimisation, not strictly a part of professional search engine optimisation. It is one form of optimisation Google will not penalise you for. Addition: I usually still aim to follow W3C recommendations that help deliver a better user experience, e.g. 'Hypertext links: use text that makes sense when read out of context.' – W3C Top Ten Accessibility Tips.

Point Internal Links To Relevant Pages

I silo any relevance or trust mainly via links in text content and secondary menu systems, and between pages that are relevant in context to one another. I don't worry about perfect siloing techniques anymore, and don't worry about whether or not I should link to one category from another, as I think the boost many proclaim is minimal on the size of sites I usually manage. I do not obsess about site architecture as much as I used to… but I always ensure the pages I want indexed are all available from a crawl from the home page, and I still emphasise important pages by linking to them where relevant. I always aim to get THE most important exact-match anchor text pointing to the page from internal links – but I avoid abusing internals, and avoid overtly manipulative internal links that are not grammatically correct, for instance. There's no set method I find works for every site, other than to link to related internal pages often, without overdoing it, and where appropriate.

What Are SERP Sitelinks?

When Google knows enough about the history or relationships of a website (or web page), it will sometimes display what are called sitelinks (or mega sitelinks) under the URL of the website in question. This results in an enhanced search snippet in SERPs. It is normally triggered when Google is confident this is the site you are looking for, based on the search terms you used. Sitelinks are usually reserved for navigational queries with a heavy brand bias – a brand name or a company name, for instance, or the website address. I've tracked the evolution of Google sitelinks in organic listings over the years, and they are seemingly picked based on a number of factors.

How To Get Google Sitelinks

Pages that feature in sitelinks are often popular pages on your site, in terms of internal or external links, or user experience, or even recent posts that may have been published on your blog.
Google seems to like to mix this up a lot, perhaps to offer some variety, and probably to obfuscate results to minimise or discourage manipulation. Sometimes it returns pages that leave me scratching my head as to why a particular page was selected. If you don't HAVE sitelinks, have a bit of patience and focus on other areas of your web marketing, like adding more content, getting some PR, or focusing social activity on the site. Google WILL give you sitelinks on some terms ONCE Google is confident your site is the destination users want. That could take a week or months, but the more popular the site is, the more likely Google will catch up fast. Sitelinks are not something that can be switched on or off, although you can control, to some degree, which pages are selected as sitelinks. You can do that in Google Webmaster Tools (AKA Search Console).

Link Out To Related Sites

Concerning on-page SEO best practices, I usually link out to other quality, relevant pages on other websites where possible and where a human would find it valuable. I don't link out to other sites from the homepage: I want the PageRank of the home page to be shared only with my internal pages. I don't link out to other sites from my category pages either, for the same reason. I link to other relevant sites (a deep link where possible) from individual pages, and I do it often, usually. I don't worry about link equity or PR leak, because I control it on a page-to-page level. This works for me: it allows me to share the link equity I have with other sites while ensuring it is not at the expense of pages on my domain. It may even help get me into a neighbourhood of relevant sites, especially when some of those start linking back to my site. Linking out to other sites, especially using a blog, also helps tell others who might be interested in your content that your page is here. Try it. I don't abuse anchor text, but I will be considerate, and usually try to link out to a site using keywords these bloggers or site owners would appreciate. The recently leaked Quality Raters' Guidelines document clearly tells web reviewers to identify how USEFUL or helpful your SUPPLEMENTARY NAVIGATION options are – whether you link to other internal pages or to pages on other sites.

Broken Links Are A Waste Of Link Power

The simplest piece of advice I ever read about creating and optimising a website was this, years ago, and it is still useful today: make sure all your pages link to at least one other page on your site. This advice is still sound, and the most important piece of advice out there, in my opinion. Check your pages for broken links. Seriously: broken links are a waste of link power and could hurt your site, drastically in some cases. Google is a link-based search engine – if your links are broken and your site is chock-full of 404s, you might not be at the races. Here's the second-best piece of advice, in my opinion, seeing as we are just about talking about website architecture: link to your important pages often, internally, with varying anchor text in the navigation and in page text content – especially if you do not have a lot of PageRank.

Does Only The First Link Count In Google?

Does the second anchor text link on a page count? One of the more interesting discussions in the webmaster community of late has been trying to determine which links Google counts as links on pages on your site. Some say the link Google finds higher in the code is the link Google will count, if there are two links on a page going to the same page.
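Concretely, the markup being debated looks like this (hypothetical URLs and anchor text) – two anchors on one page pointing at the same target, with different anchor text:

<!-- Link 1: appears first in the code, e.g. in the navigation -->
<a href="/blue-widgets/">Products</a>

<!-- Link 2: appears later, in the body copy, with descriptive anchor text -->
<p>See our range of <a href="/blue-widgets/">hand-made blue widgets</a>.</p>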
For example (and I am talking internally here): if you took a page and placed two links on it, both going to the same page (OK, hardly scientific, but you should get the idea) – will Google only count the first link? Or will it read the anchor text of both links, and give my page the benefit of the text in both links, especially if the anchor text is different in each? Will Google ignore the second link? What is interesting to me is that knowing this leaves you with a question: if your navigation array has your main pages linked in it, perhaps your links in content are being ignored, or at least not valued. I think links in body text are invaluable. Does that mean placing the navigation below the copy to get wide and varied internal anchor text to a page? As I said, I think this is one of the more interesting talks in the community at the moment, and perhaps Google works differently with internal links as opposed to external links to other websites. I think quite possibly this could change day to day, if Google pressed a button, but I optimise a site thinking that only the first link on a page will count, based on what I monitor – although I am testing this – and, actually, I usually only link once from page to page on client sites, unless it's useful for visitors.

Duplicate Content Penalty

Webmasters are often confused about getting penalised for duplicate content, which is a natural part of the web landscape, especially at a time when Google claims there is NO duplicate content penalty. The reality in 2016 is that if Google classifies your duplicate content as THIN content, then you DO have a very serious problem that violates Google's website performance recommendations, and this violation will need to be cleaned up. 'Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin.' It's very important to understand that if, in 2016, as a webmaster, you republish posts, press releases, news stories or product descriptions found on other sites, then your pages are very definitely going to struggle to gain traction in Google's SERPs (search engine results pages). Google doesn't like using the word 'penalty', but if your entire site is made entirely of republished content, Google does not want to rank it. If you have a multiple-site strategy selling the same products, you are probably going to cannibalise your traffic in the long run, rather than dominate a niche, as you used to be able to do. This is all down to how the search engine deals with duplicate content found on other sites, and the experience Google aims to deliver for its users – and its competitors. Mess up with duplicate content on a website, and it might look like a Google penalty, as the end result is the same: important pages that once ranked might not rank again, and new content might not get crawled as fast as a result. Your website might even get a manual action for thin content. Worst-case scenario: your website is hit by the GOOGLE PANDA algorithm. A good rule of thumb is: do NOT expect to rank high in Google with content found on other, more trusted sites, and don't expect to rank at all if all you are using is automatically generated pages with no value added. Tip: do NOT REPEAT text, even your own, across too many pages on your website.
Double or Indented Listings in Google

How do you get double or indented listings in Google SERPs? How do you get two listings from the same website in the top ten results in Google, instead of one (in normal view, with ten results)? Generally speaking, this means you have at least two pages with enough link equity to reach the top ten results – two pages very relevant to the search term. In 2016, however, it could be a sign of Google testing different sets of results by, for instance, merging two indexes in which a website ranks differently. You can achieve this with relevant pages, good internal structure and, of course, links from other websites. It's far easier to achieve in less competitive verticals, but, in the end, it does come down in many cases to domain authority and high relevance for a particular key phrase.

Redirect Non-WWW To WWW

Your site probably has canonicalisation issues (especially if you have an e-commerce website), and they might start at the domain level. Simply put, hobo-web.co.uk can be treated by Google as a different URL from www.hobo-web.co.uk, even though it's the same page – and it can get even more complicated. It's thought REAL PageRank can be diluted if Google gets confused about your URLs and, speaking simply, you don't want this PR diluted (in theory). That's why many, including myself, redirect non-www to www (or vice versa) if the site is on a Linux/Apache server, in the .htaccess file (see the sketch below). Basically, you are redirecting all the Google juice to one canonical version of a URL. In 2016, this is a MUST-HAVE best practice. It keeps things simple when optimising for Google. It should be noted it's incredibly important not to mix www and non-www URLs on the site when linking your internal pages. Note that in 2016 Google asks you which domain you prefer to set as your canonical domain in Google Webmaster Tools.

301 Redirects Are POWERFUL & WHITE HAT

Rather than tell Google, via a 404 or some other command, that a page isn't here anymore, consider permanently redirecting the page to a relatively similar page, to pool any link equity that page might have. My general rule of thumb is to make sure the information (and keywords) from the old page are contained in the new page – stay on the safe side. Most already know the power of a 301 redirect and how you can use it to power even totally unrelated pages to the top of Google for a time – sometimes a very long time. Google seems to think server-side redirects are OK, so I use them. You can change the focus of a redirect, but that's a bit black hat for me and can be abused – I don't talk about that sort of thing on this blog. But it's worth knowing you need to keep these redirects in place in your .htaccess file. Redirecting multiple old pages to one new page works for me, if the information that ranked the old page is there on the new page. NOTE: this tactic is being heavily spammed in 2016. Be careful with redirects. I think I have seen penalties transferred via 301s. I also WOULDN'T REDIRECT 301s blindly to your home page, and I'd be careful of redirecting lots of low-quality links to one URL. If you need a page to redirect old URLs to, consider your sitemap or contact page. Audit any page's backlinks BEFORE you redirect it to an important page. I'm seeing CANONICALS work just the same as 301s in 2016, though they seem to take a little longer to have an impact. Hint: a good tactic at the moment is to CONSOLIDATE old, thin, under-performing articles Google ignores into bigger, better-quality articles.
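In practice, both the domain-level fix and the page-level consolidation are typically a few lines in the .htaccess file on an Apache server. A hedged sketch, assuming a hypothetical example.com domain with mod_rewrite enabled (your hosting set-up may differ):

RewriteEngine On

# Force the www version of the domain with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Permanently redirect a retired thin page to the consolidated article
Redirect 301 /old-thin-page/ http://www.example.com/bigger-better-article/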
Canonical Link Element Is Your Best Friend

When it comes to Google SEO, the rel=canonical link element has become VERY IMPORTANT over the years, and NEVER MORE SO. This element is employed by Google, Bing and other search engines to help them specify the page you want to rank out of duplicate and near-duplicate pages found on your site, or on other pages on the web. In the video above, Matt Cutts from Google shares tips on the rel=canonical tag (more accurately, the canonical link element) that the three top search engines now support. Google, Yahoo! and Microsoft have all agreed to work together in a joint effort to help reduce duplicate content for larger, more complex sites, and the result is the Canonical Tag. The process is simple. You can put this link tag in the head section of the duplicate content URLs, if you think you need it. I add a self-referring canonical link element as standard these days to ANY web page. From the Google Webmaster Central blog:

Is rel=canonical a hint or a directive? It's a hint that we honor strongly. We'll take your preference into account, in conjunction with other signals, when calculating the most relevant page to display in search results.

Can I use a relative path to specify the canonical, such as <link rel="canonical" href="product.php?item=swedish-fish" />? Yes, relative paths are recognized as expected with the <link> tag. Also, if you include a <base> link in your document, relative paths will resolve according to the base URL.

Is it okay if the canonical is not an exact duplicate of the content? We allow slight differences, e.g. in the sort order of a table of products. We also recognize that we may crawl the canonical and the duplicate pages at different points in time, so we may occasionally see different versions of your content. All of that is okay with us.

What if the rel=canonical returns a 404? We'll continue to index your content and use a heuristic to find a canonical, but we recommend that you specify existent URLs as canonicals.

What if the rel=canonical hasn't yet been indexed? Like all public content on the web, we strive to discover and crawl a designated canonical URL quickly. As soon as we index it, we'll immediately reconsider the rel=canonical hint.

Can rel=canonical be a redirect? Yes, you can specify a URL that redirects as a canonical URL. Google will then process the redirect as usual and try to index it.

What if I have contradictory rel=canonical designations? Our algorithm is lenient: we can follow canonical chains, but we strongly recommend that you update links to point to a single canonical page to ensure optimal canonicalization results.

Can this link tag be used to suggest a canonical URL on a completely different domain? Update on 12/17/2009: The answer is yes! We now support a cross-domain rel=canonical link element.
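The tag itself is simple. A minimal sketch below shows the relative 'Swedish fish' example from Google's Q&A above, and the self-referring absolute pattern; the URLs are illustrative, and the two lines are alternatives, not meant to appear together:

```html
<!-- In the <head> of a duplicate page: a relative canonical, which
     resolves against the page URL (or a <base> element), as noted above. -->
<link rel="canonical" href="product.php?item=swedish-fish" />

<!-- Alternatively, a self-referring absolute canonical, added as standard: -->
<link rel="canonical" href="http://www.example.com/product.php?item=swedish-fish" />
```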
Do I Need A Google XML Sitemap For My Website?

What is an XML sitemap, and do I need one to SEO my site for Google? The XML Sitemap protocol has wide adoption, including support from Google, Yahoo! and Microsoft. No – you do NOT, technically, need an XML sitemap to optimise a site for Google if you have a sensible navigation system that Google can crawl and index easily. HOWEVER, in 2016, you should have a Content Management System that produces one as a best practice, and you should submit that sitemap to Google in Google Webmaster Tools. Again: best practice. Google has said very recently that XML and RSS are still a very useful discovery method for them to pick out recently updated content on your site. An XML sitemap is a file on your server with which you can help Google easily crawl & index all the pages on your site. This is evidently useful for very large sites that publish lots of new content or update content regularly. Your web pages will still get into search results without an XML sitemap, if Google can find them by crawling your website and if you:

- Make sure all your pages link to at least one other page in your site
- Link to your important pages often, with varying anchor text, in the navigation and in page text content (if you want best results)

Remember: Google needs links to find all the pages on your site, and links spread PageRank, which helps pages rank – so an XML sitemap is not quite a substitute for a great website architecture. Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a sitemap is an XML file that lists URLs for a site, along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is relative to other URLs in the site), so that search engines can more intelligently crawl the site. Most modern CMS auto-generate XML sitemaps, and Google does ask you to submit a sitemap in Webmaster Tools – and I do, these days. I prefer to manually define my important pages by links and depth of content, but an XML sitemap is a best practice in 2016 for most sites.
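In practice your CMS will generate this for you, but for reference, here is a minimal sketch of the XML format described above, with one hypothetical URL and the optional metadata fields:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; lastmod, changefreq and priority are optional. -->
  <url>
    <loc>http://www.example.com/services/</loc>
    <lastmod>2016-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```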
Rich Snippets

Rich Snippets and Schema mark-up can be intimidating if you are new to them, but important data about your business can actually be very simply added to your site by sensible optimisation of your website footer. This is easy to implement. An optimised website footer can comply with law, may help search engines understand your site better, and can help usability and improve conversions. Properly optimised, your website footer can also help you make your search snippet stand out in Google results pages. If you are a business in the UK, your website needs to meet the legal requirements necessary to comply with the UK Companies Act 2007. It's easy to just incorporate this required information into your footer.

Companies in the UK must include certain regulatory information on their websites and in their email footers … or they will breach the Companies Act and risk a fine. OUTLAW

Here's what you need to know regarding website and email footers to comply with the UK Companies Act (with our information in bold):

- The company name – MBSA Marketing LTD (trading as Hobo)
- Physical geographic address (a PO Box is unlikely to suffice as a geographic address, but a registered office address would; if the business is a company, the registered office address must be included) – MBSA Marketing LTD trading as Hobo, 68 Finnart Street, Greenock, PA16 8HJ, Scotland, UK
- The company's registration number should be given and, under the Companies Act, the place of registration should be stated – e.g. MBSA Marketing LTD is a company registered in Scotland with company number SC536213
- Email address of the company (it is not sufficient to include a 'contact us' form without also providing an email address and geographic address somewhere easily accessible on the site) – info@hobo-web.co.uk
- The name of the organisation with which the customer is contracting must be given. This might differ from the trading name. Any such difference should be explained – The domain hobo-web.co.uk and the Hobo logo and creative is owned by Shaun Anderson and licensed to MBSA Marketing LTD, of which Shaun is an employed co-founding director.
- If your business has a VAT number, it should be stated, even if the website is not being used for e-commerce transactions – VAT No. 249 1439 90
- Prices on the website must be clear and unambiguous. Also, state whether prices are inclusive of tax and delivery costs – All Hobo Web Co Uk prices stated in email or on the website EXCLUDE VAT

The above information does not need to feature on every page, but rather on a clearly accessible page. However, with Google Quality Raters rating web pages on quality based on Expertise, Authority and Trust (see my recent making high-quality websites post), ANY signal you can send to an algorithm or human reviewer's eyes that you are a legitimate business is probably a sensible move at this time (if you have nothing to hide, of course). Note: if the business is a member of a trade or professional association, membership details, including any registration number, should be provided. Consider also the Distance Selling Regulations, which contain other information requirements for online businesses that sell to consumers (B2C, as opposed to B2B, sales). For more detailed information, consult the UK Companies Act itself. Although we display most, if not all, of this information on email and website footers, I thought it would be handy to gather this information clearly on one page and explain why it's there – and wrap it all up in a (hopefully) informative post.

Dynamic PHP Copyright Notice in WordPress

Now that your site complies with the Act, you'll want to ensure your website never looks obviously out of date. While you are editing your footer, ensure your copyright notice is dynamic and will change year to year – automatically. It's simple to display a dynamic date in your footer in WordPress, for instance, so you never need to change your copyright notice on your blog when the year changes. A little bit of PHP will display the current year (see the sketch at the end of this section). Just add it to your theme's footer.php and you can forget about making sure you don't look stupid, or give the impression your site is out of date and unused, at the beginning of every year. A simple and elegant PHP copyright notice for WordPress blogs.

Adding Schema.org Mark-up to Your Footer

You can take the information you have from above and transform it with Schema.org mark-up to give even more accurate information to search engines. Tip: note the aggregate rating mark-up near the end of the sketch below, if you are wondering how to get yellow star ratings in Google results pages. I got yellow stars in Google within a few days of adding the code to my website template – directly linking my site to information Google already has about my business. Also, you can modify that link to plus.google.com to link directly to your REVIEWS page on Google+ to encourage people to review your business.
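The original code snippets from this section do not reproduce well here, so below is a minimal sketch of the two ideas combined, assuming a WordPress footer.php. The business details are the ones listed above; the rating figures are placeholders – only mark up reviews you genuinely have:

```php
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "MBSA Marketing LTD (trading as Hobo)",
  "email": "info@hobo-web.co.uk",
  "vatID": "249 1439 90",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "68 Finnart Street",
    "addressLocality": "Greenock",
    "postalCode": "PA16 8HJ",
    "addressCountry": "GB"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "25"
  }
}
</script>
<!-- The dynamic copyright notice: PHP fills in the current year on render. -->
<p>&copy; <?php echo date('Y'); ?> MBSA Marketing LTD. All rights reserved.</p>
```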
Now you can have a website footer that helps your business comply with UK law, is more usable, automatically updates the copyright notice year, and helps your website stick out in Google SERPs. PRO Tip: now you know the basics, consider implementing rich schema using a much cleaner method called JSON-LD (the format used in the sketch above).

Keep It Simple, Stupid: Don't Build Your Site With Flash or HTML Frames

Well, not entirely in Flash, and especially not if you know very little about the ever-improving accessibility of Flash. Flash is a proprietary plug-in created by Macromedia to infuse (albeit) fantastically rich media into your websites. The W3C advises you avoid the use of such proprietary technology to construct an entire site. Instead, build your site with CSS and HTML, ensuring everyone, including search engine robots, can sample your website content. Then, if required, you can embed media files such as Flash in the HTML of your website. Flash, in the hands of an inexperienced designer, can cause all types of problems at the moment, especially with:

- Accessibility
- Search engines
- Users not having the plug-in
- Large download times

Flash doesn't even work at all on some devices, like the Apple iPhone. Note that Google sometimes highlights if your site is not mobile-friendly on some devices. And on the subject of mobile-friendly websites, note that Google has alerted the webmaster community that mobile friendliness will be a search engine ranking factor in 2016:

Starting April 21 (2015), we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results. Consequently, users will find it easier to get relevant, high-quality search results that are optimized for their devices. GOOGLE

HTML5 is the preferred option over Flash these days, for most designers. A site built entirely in Flash could cause an unsatisfactory user experience and could affect your rankings, especially in mobile search results. For similar accessibility and user satisfaction reasons, I would also say don't build a site with website frames. As in any form of design, don't try and re-invent the wheel when simple solutions suffice. The KISS philosophy has been around since the dawn of design. KISS does not mean boring web pages. You can create stunning sites with smashing graphics, but you should build these sites using simple techniques – HTML & CSS, for instance. If you are new to web design, avoid things like Flash and JavaScript, especially for elements like scrolling news tickers, etc. These elements work fine for TV, but only cause problems for website visitors. Keep layouts and navigation arrays consistent and simple too. Don't spend time, effort and money (especially if you work in a professional environment) designing fancy navigation menus if, for example, your new website is an information site. The same goes for website optimisation: keep your documents well structured, keep your page Title Elements and text content relevant, use heading tags sensibly, and try and avoid leaving too much of a footprint, whatever you are up to.

How Fast Should Your Website Download?

Site speed, we are told by Google in the above video, is a ranking factor. But as with any factor Google confirms is a ranking signal, it's usually a small, nuanced one. A fast site is a good user experience (UX), and a satisfying UX leads to higher conversions.
How fast your website loads is a critical, but often completely ignored, element in any online business – and that includes search marketing and search engine optimisation. Very slow sites are a bad user experience, and Google is all about GOOD UX these days.

How Much is Website Speed a Google Ranking Factor?

'How much is a very slow site a negative ranking factor?' is a more useful interpretation of the claim that website speed is a Google ranking factor. First, because I have witnessed VERY slow websites, taking ten seconds and more to load, negatively impacted in Google; and second, from statements made by Googlers:

We do say we have a small factor in there for pages that are really slow to load where we take that into account. John Mueller, GOOGLE

Google might crawl your site slower if you have a slow site. And that's bad, especially if you are adding new content or making changes to it.

We're seeing an extremely high response-time for requests made to your site (at times, over 2 seconds to fetch a single URL). This has resulted in us severely limiting the number of URLs we'll crawl from your site. John Mueller, GOOGLE

John specifically said 2 seconds disrupts CRAWLING activity, not RANKING ability, but you get the picture.

How Fast Should Your Website Load in 2016?

Recent research is hard to find, but would indicate: as fast as possible.
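A quick, rough way to spot-check this yourself is to time a fetch from the command line. A sketch with curl follows (the URL is hypothetical); repeat it a few times, and from more than one location, before drawing conclusions:

```bash
# Print the total time taken to fetch a single URL; per the Google quote
# above, responses regularly over ~2 seconds can throttle crawling.
curl -o /dev/null -s -w "total: %{time_total}s\n" http://www.example.com/
```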
A Non-Technical Google SEO Strategy

Here are some final thoughts. Use common sense: Google is a search engine, and it is looking for pages to give searchers results; 90% of its users are looking for information. Google itself WANTS the organic results full of information. Almost all websites will link to relevant information content, so content-rich websites get a lot of links – especially quality links. Google ranks websites with a lot of links (especially quality links) at the top of its search results, so the obvious thing you need to do is ADD A LOT OF INFORMATIVE CONTENT TO YOUR WEBSITE. I think ranking in organic listings is a lot about trusted links making trusted pages rank, making trusted links making trusted pages rank, ad nauseam, for various keywords. Some pages can pass trust to another site; some pages cannot. Some links can; some cannot. Some links are trusted to pass ranking ability to another page; some are not. YOU NEED LINKS FROM TRUSTED PAGES IF YOU WANT TO RANK AND AVOID PENALTIES & FILTERS. Google engineers are building an AI, but it's all based on simple human desires to make something happen – or indeed to prevent something. You can work with Google engineers, or against them. They need to make money for Google, but unfortunately for them, they need to make the best search engine in the world for us humans as part of the deal. Build a site that takes advantage of this. What is a Google engineer trying to do with an algorithm? I always remember it was an idea first, before it was an algorithm. What was that idea? Think like a Google engineer and give Google what it wants. What is Google trying to give its users? Align with that. What does Google not want to give its users? Don't look anything like that. THINK LIKE A GOOGLE ENGINEER & BUILD A SITE THEY WANT TO GIVE TOP RANKINGS. Google is a links-based search engine. Google doesn't need content to rank pages, but it needs content to give to users. Google needs to find content, and it finds content by following links, just like you do when clicking on a link. So you need first to make sure you tell the world about your site, so other sites link to yours. Don't worry about reciprocating to more powerful sites, or even real sites – I think this adds to your domain authority, which is better to have than ranking for just a few narrow key terms.

Everything has limits. Google has limits. What are they? How would you go about observing them, or even testing them, breaking them, benefiting from them, or being penalised by them? It's not a lab setting – you can't test much, if anything, 100% accurately, but you can hypothesise based on a sensible approach, bearing in mind what a Google engineer would do, and what you would do if Google were yours. The best way for Google to keep rankings secret, ultimately, is to have a randomness to it – or, at least, a randomness on the surface, as it is presented to users of Google – while keeping some things stable; surely the easiest way for it to prevent a curious optimiser finding out how it works. Well, I think that. And I think this randomness manifests itself in many ways. What will work for some sites might not necessarily work for your sites – not the same way, anyway. Perhaps no two sites are the same (the conditions are different, for a start, for any two sites). Google may play dice with the Google multi-verse, so be aware of that. It uses multiple results and rotates them, and serves different results to different machines and browsers, even on the same computer. Google results are constantly shifting. Some pages rank at the top constantly because they are giving Google what it wants in some areas, or they might just have a greater number and diversity of more trusted links than yours do. Google has a long memory when it comes to links and pages and associations for your site – perhaps an infinite memory profile of your site. Perhaps it can forgive, but never forget. Perhaps it can forget too, just like us, and so previous penalties or bans can be lifted. I think (depending on the site, because Google can work out if you have a blog or an e-commerce site) Google probably also looks at different historic versions of particular pages, even on single sites.

WHAT RELATIONSHIP DO YOU WANT TO HAVE WITH GOOGLE?

Onsite, don't try to fool Google – we're not smart enough. Be squeaky clean on-site, and make Google think twice about bumping you for discrepancies in your link profile. Earn Google's trust. Most of our more lucrative accounts come from referrals from clients who trust us. Before clients told them of us, they didn't know about us. OK, they might have heard about us from people they, in turn, didn't trust that much. Upon the client's testimonial, the referral now trusts us a lot more. These referrals automatically trust us to some extent. That trust grows when we deliver. The referral now trusts us very much. But it's an uphill struggle from that point on to continue to deliver that trust and earn even more trust, because you don't want to dip in trust – it's nice to get more and more trusted. Google works the same way as this human emotion, and search engines have tried for years to deliver a trusted set of sites based on human desire and searcher intent.

MAKE FRIENDS WITH GOOGLE

Don't break Google's trust. If your friend betrays you, depending on what they've done, they've lost trust; sometimes that trust has been lost altogether. If you do something Google doesn't like, or manipulate it in a way it doesn't want, you will lose trust – and in some cases, lose all trust (in some areas).
For instance, your pages might be able to rank, but your links might not be trusted enough to vouch for another site. DON'T FALL OUT WITH GOOGLE OVER SOMETHING STUPID. YOU NEED TO MAKE MORE FRIENDS, AND ESPECIALLY THOSE WHO ARE FRIENDS WITH GOOGLE. When Google trusts you, it's because you've earned its trust to help it carry out what it needs to carry out in the quickest and most profitable way. You've helped Google achieve its goals. It trusts you, and it will reward you by listing your contribution in order of the sites it trusts the most. It will list the friends it trusts the most, who it knows to be educated in a particular area, at the top of these areas. IF GOOGLE TRUSTS YOU, IT WILL LET YOUR PAGES RANK AND, IN TURN, VOUCH FOR OTHER PAGES, or 'FRIENDS', GOOGLE MIGHT WANT INFORMATION ON. Google can be fooled and manipulated, just like you can, but it will kick you in the gonads if you break this trust – as I probably would. Treat Google as you would have it treat you. Be fast. REMEMBER, IT TAKES TIME TO BUILD TRUST… AND THAT IS PROBABLY ONE OF THE REASONS WHY GOOGLE is pushing the need to be 'trusted' as a ranking modifier. I, of course, might be reading far too much into Google, TRUST and the TIME Google wants us to wait for things to happen on their end… but consider trust to be a psychological emotion Google is trying to emulate, using algorithms based on human ideas. If you do all the above, you'll get more and more traffic from Google over time. If you want to rank for specific keywords in very competitive niches, you'll need to be a big brand, be picked out by big brands (and linked to), or buy links to fake that trust, or get spammy with it in an intelligent way so you won't get caught. Easier said than done. I suppose Google is open to the con, just as any human is, if it's based on human traits…

What Not To Do In Website Search Engine Optimisation

Google has a VERY basic organic search engine optimisation starter guide PDF for webmasters, which they use internally:

Although this guide won't tell you any secrets that'll automatically rank your site first for queries in Google (sorry), following the best practices outlined below will make it easier for search engines to both crawl and index your content. Google

It is still worth a read, even if it is VERY basic, best-practice search engine optimisation for your site. No search engine will EVER tell you what actual keywords to put on your site to improve your rankings or get more converting organic traffic – and in Google, that's the SINGLE MOST IMPORTANT thing you want to know! If you want a bigger PDF, try my free SEO ebook. It's been downloaded by tens of thousands of webmasters, and I update it every year or so.
Here's a list of what Google tells you to avoid in the document:

- choosing a title that has no relation to the content on the page
- using default or vague titles like 'Untitled' or 'New Page 1'
- using a single title tag across all of your site's pages, or a large group of pages
- using extremely lengthy titles that are unhelpful to users
- stuffing unneeded keywords in your title tags
- writing a meta description tag that has no relation to the content on the page
- using generic descriptions like 'This is a webpage' or 'Page about baseball cards'
- filling the description with only keywords
- copying and pasting the entire content of the document into the description meta tag
- using a single description meta tag across all of your site's pages, or a large group of pages
- using lengthy URLs with unnecessary parameters and session IDs
- choosing generic page names like 'page1.html'
- using excessive keywords like 'baseball-cards-baseball-cards-baseball-cards.htm'
- having deep nesting of subdirectories like '.../dir1/dir2/dir3/dir4/dir5/dir6/page.html'
- using directory names that have no relation to the content in them
- having pages from subdomains and the root directory (e.g. 'domain.com/page.htm' and 'sub.domain.com/page.htm') access the same content
- mixing www. and non-www. versions of URLs in your internal linking structure
- using odd capitalization of URLs (many users expect lower-case URLs and remember them better)
- creating complex webs of navigation links, e.g. linking every page on your site to every other page
- going overboard with slicing and dicing your content (so it takes twenty clicks to get to deep content)
- having a navigation based entirely on drop-down menus, images, or animations (many, but not all, search engines can discover such links on a site, but if a user can reach all pages on a site via normal text links, this will improve the accessibility of your site)
- letting your HTML sitemap page become out of date with broken links
- creating an HTML sitemap that simply lists pages without organising them, for example by subject (Edit Shaun: safe to say, especially for larger sites)
- allowing your 404 pages to be indexed in search engines (make sure that your webserver is configured to give a 404 HTTP status code when non-existent pages are requested)
- providing only a vague message like 'Not found', '404', or no 404 page at all
- using a design for your 404 pages that isn't consistent with the rest of your site
- writing sloppy text with many spelling and grammatical mistakes
- embedding text in images for textual content (users may want to copy and paste the text, and search engines can't read it)
- dumping large amounts of text on varying topics onto a page without paragraph, subheading, or layout separation
- rehashing (or even copying) existing content that will bring little extra value to users

Pretty straightforward stuff, but sometimes it's the simple stuff that often gets overlooked. Of course, you put the above together with Google's Guidelines for webmasters:

Search engine optimization is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your site's user experience and performance in organic search results.
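Picking out the 404 items from the list above: on Apache, one common way to serve a helpful, branded 404 page that still returns a real 404 status code is an ErrorDocument directive in .htaccess – a minimal sketch, with a hypothetical path:

```apache
# A local path (not a full URL) keeps the genuine 404 HTTP status code;
# a full URL here would trigger a redirect instead, creating 'soft 404s'.
ErrorDocument 404 /404.html
```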
Don't make these simple but dangerous mistakes:

- Avoid duplicating content on your site found on other sites. Yes, Google likes content, but it usually needs to be well linked to, unique and original to get you to the top.
- Don't hide text on your website. Google may eventually remove you from the SERPs.
- Don't buy 1000 links and think 'that will get me to the top'. Google likes natural link growth and often frowns on mass link buying.
- Don't get everybody to link to you using the same anchor text or link phrase. This could flag you as a 'rank modifier'. You don't want that.
- Don't chase Google PR by chasing 100s of links. Think quality of links, not quantity.
- Don't buy many keyword-rich domains, fill them with similar content and link them to your site. This is lazy and dangerous and could see you ignored, or worse, banned from Google. It might have worked yesterday, but it sure does not work today without some grief from Google.
- Do not constantly change your site's page names or site navigation without remembering to employ redirects. This just screws you up in any search engine.
- Do not build a site with a JavaScript navigation that Google, Yahoo and Bing cannot crawl.
- Do not link to everybody who asks you for reciprocal links. Only link out to quality sites you feel can be trusted.

Don't Flag Your Site With Poor Website Optimisation

A primary goal of any 'rank modification' is not to flag your site as 'suspicious' to Google's algorithms or their web spam team. I would recommend you forget about tricks like links in H1 tags etc., or linking to the same page three times with different anchor text on one page. Forget about 'which is best' when considering things you shouldn't be wasting your time with. Every element on a page is a benefit to you, until you spam it. Put a keyword in every tag and you will flag your site as 'trying too hard' if you haven't got the link trust to cut it – and Google's algorithms will go to work. Spamming Google is often counter-productive over the long term. Don't spam your anchor text link titles with the same keyword. Don't spam your ALT tags, or any other tags, either. Add your keywords intelligently. Try and make the site mostly for humans, not just search engines. On-page SEO is not as simple as a checklist any more of 'keyword here, keyword there'. Optimisers are up against lots of smart folk at the Googleplex – and they purposely make this practice difficult. For those who need a checklist, this is the sort of one that gets me results (a bare-bones sketch follows after this list):

- Do keyword research
- Identify valuable searcher intent opportunities
- Identify the audience & the reason for your page
- Write utilitarian copy – be useful. Use related terms in your content. Use plurals. Use words with searcher intent, like 'buy' and 'compare'. I like to get a keyword or related term in every paragraph.
- Use emphasis sparingly, to emphasise the important points in the page, whether they are your keywords or not
- Pick an intelligent page title with your keyword in it
- Write an intelligent meta description, repeating it on the page
- Add an image with user-centric ALT attribute text
- Link to related pages on your site within the text
- Link to related pages on other sites
- Your page should have a simple, Google-friendly URL
- Keep it simple
- Share it and pimp it

You can forget about just about everything else.
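As a bare-bones sketch of several checklist items in one place – an intelligent page title, a meta description, user-centric ALT text and in-text links – here is a hypothetical page skeleton (all names and URLs are invented):

```html
<head>
  <!-- An intelligent page title with the keyword in it, plus a meta
       description whose message is repeated (in spirit) on the page. -->
  <title>Compare Widget Prices | Example Co</title>
  <meta name="description" content="Compare widget prices and buy online from Example Co.">
</head>
<body>
  <h1>Compare Widget Prices</h1>
  <p>Compare or buy widgets with help from
     <a href="/widget-buying-guide/">our widget buying guide</a>.</p>
  <!-- User-centric ALT text: describe the image, don't stuff keywords. -->
  <img src="/images/blue-widget.jpg" alt="A boxed blue widget ready to ship">
  <p>Related reading on another site:
     <a href="http://www.example.org/widgets/">an independent widget resource</a>.</p>
</body>
```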
The Continual Evolution of SEO

The 'Keyword Not Provided' incident is another example of Google making ranking in organic listings HARDER – a change 'for users' that seems to have the most impact on marketers outside of Google's ecosystem – yes, search engine optimisers. Now, consultants need to be page-centric (abstract, I know), instead of just keyword-centric, when optimising a web page for Google. There are now plenty of third-party tools that help when researching keywords, but most of us miss the kind of keyword intelligence we used to have access to. Proper keyword research is important because getting a site to the top of Google eventually comes down to your text content on a page and the keywords in external & internal links. Altogether, Google uses these signals to determine where you rank, if you rank at all. There's no magic bullet to this. At any one time, your site is probably feeling the influence of some algorithmic filter (for example, Google Panda or Google Penguin) designed to keep spam sites under control and deliver relevant, high-quality results to human visitors. One filter may be kicking in, keeping a page down in the SERPs, while another filter is pushing another page up. You might have poor content but excellent incoming links, or vice versa. You might have very good content, but a very poor technical organisation of it. Try and identify the reasons Google doesn't rate a particular page higher than the competition – the answer is usually on the page, or in the backlinks pointing to the page. Do you have too few quality inbound links? Do you have too many low-quality backlinks? Does your page lack descriptive, keyword-rich text? Are you keyword stuffing your text? Do you link out to unrelated sites? Do you have too many advertisements above the fold? Do you have affiliate links on every page of your site, and text found on other websites? Do you have broken links and missing images on the page? Whatever the issues are, identify them and fix them. Get on the wrong side of Google and your site might well be selected for MANUAL review – so optimise your site as if, one day, you will get that website review from a Google Web Spam reviewer. The key to a successful campaign, I think, is persuading Google that your page is most relevant to any given search query. You do this with good, unique, keyword-rich text content and by getting quality links to that page. The latter is far easier to say these days than actually do! Next time you are developing a page, consider that what looks spammy to you is probably spammy to Google. Ask yourself which pages on your site are really necessary. Which links are necessary? Which pages on the site are emphasised in the site architecture? Which pages would you ignore? You can help a site along in any number of ways (including making sure your page titles and meta tags are unique), but be careful. Obvious evidence of rank modifying is dangerous. I prefer simple SEO techniques, and ones that can be measured in some way. I have never just wanted to rank for competitive terms; I have always wanted to understand at least some of the reasons why a page ranked for these key phrases. I try to create a good user experience for humans AND search engines. If you make high-quality text content relevant and suitable for both these audiences, you'll more than likely find success in organic listings, and you might not ever need to get into the technical side of things, like redirects and search-engine-friendly URLs. To beat the competition in an industry where it's difficult to attract quality links, you have to get more technical sometimes – and in some industries you've traditionally needed to be 100% black hat to even get into the top 100 results of competitive, transactional searches. There are no hard and fast rules to long-term ranking success, other than developing quality websites with quality content and quality links pointing to them.
The less domain authority you have, the more text you're going to need. The aim is to build a satisfying website and build real authority. You need to mix it up and learn from experience. Make mistakes and learn from them by observation. I've found getting penalised is a very good way to learn what not to do. Remember there are exceptions to nearly every rule, and in an ever-fluctuating landscape you probably have little chance of determining exactly why you rank in search engines these days. I've been doing it for over 15 years, and every day I'm trying to better understand Google, to learn more and learn from others' experiences. It's important not to obsess about granular ranking specifics that have little return on your investment, unless you really have the time to do so. THERE IS USUALLY SOMETHING MORE VALUABLE TO SPEND THAT TIME ON. That's usually either good backlinks or great content. The fundamentals of successful optimisation, while refined, have not changed much over the years – although Google does seem a LOT better than it was at rewarding pages with some reputation signals and satisfying content usability. Google isn't lying about rewarding legitimate effort, despite what some claim. If they were, I would be a black hat full time. So would everybody else trying to rank in Google. The majority of small to medium businesses do not need advanced strategies, because their direct competition has not employed these tactics either. I took a medium-sized business to the top of Google recently for very competitive terms, doing nothing but ensuring page titles were optimised, the home page text was re-written, and one or two links were earned from trusted sites. This site was a couple of years old, with a clean record in Google and a couple of organic links already from trusted sites. This domain had the authority and capability to rank for some valuable terms, and all we had to do was make a few changes on the site: improve the depth and focus of website content, monitor keyword performance and tweak page titles. There was a little duplicate content needing sorted out, and a bit of canonicalisation of thin content to resolve, but none of the measures I implemented would I call 'advanced'. A lot of businesses can get more converting visitors from Google simply by following basic principles and best practices:

- Always make sure that every page in the site links out to at least one other page in the site
- Link to your important pages often
- Link not only from your navigation, but from keyword-rich text links in text content – keep this natural and for visitors
- Try to keep each page element and content as unique as possible
- Build a site for visitors, to get visitors – and you just might convert some to actual sales too
- Create keyword-considered content on the site that people will link to
- Watch which sites you link to, and from what pages, but do link out
- Go and find some places on relatively trusted sites to try and get some anchor-text-rich inbound links
- Monitor trends, check stats
- Minimise duplicate or thin content
- Bend a rule or two without breaking them and you'll probably be OK

Once this is complete, it's time to … add more, and better, content to your site and tell more people about it, if you want more Google love. OK, so you might have to implement the odd 301, but again, it's hardly 'advanced'. I've seen simple SEO marketing techniques working for years.
You are better off doing simple stuff better and faster than worrying about some of the more 'advanced' techniques you read about on some blogs, I think – it's more productive, more cost-effective for businesses, and safer, for most.

Beware Pseudoscience

Pseudoscience is a claim, belief, or practice posing as science, but which does not constitute or adhere to an appropriate scientific methodology… Beware folk trying to bamboozle you with science. This isn't a science, when Google controls the 'laws' and changes them at will. You see, I have always thought that optimisation was about:

- Looking at Google rankings all night long
- Keyword research
- Observations about the ranking performance of your pages, and that of others (though not in a controlled environment)
- Putting relevant, co-occurring words you want to rank for on pages
- Putting words in links to pages you want to rank for
- Understanding that what you put in your title is what you are going to rank best for
- Getting links from other websites pointing to yours
- Getting real quality links that will last, from sites that are pretty trustworthy
- Publishing lots and lots of content
- Focusing on the long tail of search
- Understanding it will take time to beat all this competition

I always expected to get a site demoted by:

- Getting too many links with the same anchor text pointing to a page
- Keyword stuffing a page
- Trying to manipulate Google too much on a site
- Creating a 'frustrating user experience'
- Chasing the algorithm too much
- Getting links I shouldn't have
- Buying links

Not that any of the above is automatically penalised all the time. I was always of the mind that I don't need to understand the maths or science of Google that much to understand what Google engineers want. The biggest challenge these days is to get trusted sites to link to you, but the rewards are worth it. To do it, you probably should be investing in some marketable content, or compelling benefits for the linking party (that's not just paying for links – somebody else can pay more for those). Buying links to improve rankings WORKS, but it is probably THE most hated link-building technique as far as the Google web spam team is concerned. I was very curious about the science of optimisation; I studied what I could, but it left me a little unsatisfied. I learned that building links, creating lots of decent content and learning how to monetise that content better (while not breaking any major TOS of Google) would have been a more worthwhile use of my time. Getting better and faster at doing all that would be nice too. There are many problems with blogs, too, including mine. Misinformation is an obvious one. Rarely are your results conclusive or your observations 100% accurate – even if you think a theory holds water on some level. I try to update old posts with new information if I think the page is only valuable with accurate data. Just remember: most of what you read about how Google works from a third party is OPINION, and just like in every other sphere of knowledge, 'facts' can change with a greater understanding over time, or with a different perspective.

Chasing The Algorithm

There is no magic bullet, and there are no secret formulas to achieve fast number 1 rankings in Google in any competitive niche WITHOUT spamming Google. A legitimately earned high position in search engines takes a lot of hard work.
There are a few less-talked-about tricks and tactics that are deployed by some, better than others, to combat Google Panda, for instance, but there are no big secrets (no 'white hat' secrets, anyway). There is clever strategy, though, and creative solutions to be found to exploit opportunities uncovered by researching the niche. As soon as Google sees a strategy that gets results, it usually becomes 'outwith the guidelines' and something you can be penalised for – so beware jumping on the latest fad. The biggest advantage any one provider has over another is experience and resource. The knowledge of what doesn't work, and what will hurt your site, is often more valuable than knowing what will give you a short-lived boost. Getting to the top of Google is a relatively simple process – one that is constantly changing. Professional SEO is more a collection of skills, methods and techniques. It is more a way of doing things than a one-size-fits-all magic trick. After over a decade practising and deploying real campaigns, I'm still trying to get it down to its simplest, most cost-effective processes. I think it's about doing simple stuff right: good text, a simple navigation structure, quality links. To be relevant and reputable takes time, effort and luck, just like anything else in the real world, and that is the way Google wants it. If a company is promising you guaranteed rankings and has a magic bullet strategy, watch out. I'd check it didn't contravene Google's guidelines.

How Long Does It Take To See Results?

Some results can be gained within weeks, and you need to expect some strategies to take months to see the benefit. Google WANTS these efforts to take time. Critics of the search engine giant would point to Google wanting fast, effective rankings to be a feature of Google's own AdWords sponsored listings. Optimisation is not a quick process, and a successful campaign can be judged on months, if not years. Most successful, fast-ranking website optimisation techniques end up finding their way into the Google Webmaster Guidelines – so be wary. It takes time to build quality, and it's this quality that Google aims to reward in 2016. It takes time to generate the data needed to begin to formulate a campaign, and time to deploy that campaign. Progress also depends on many factors:

- How old is your site compared to the top 10 sites?
- How many back-links do you have compared to them?
- How does the quality of their back-links compare to yours?
- What is the history of people linking to you (what words have people been using to link to your site)?
- How good a resource is your site?
- Can your site attract natural back-links (e.g. you have good content or a great service), or are you 100% relying on your agency for back-links (which is very risky in 2016)?
- How much unique content do you have?
- Do you have to pay everyone to link to you (which is risky), or do you have a 'natural' reason people might link to you?

Google wants to return quality pages in its organic listings, and it takes time to build this quality and for that quality to be recognised. It takes time, too, to balance your content, generate quality backlinks and manage your disavowed links. Google knows how valuable organic traffic is, and they want webmasters investing a LOT of effort in ranking pages. Critics will point out that the higher the cost of expert SEO, the better-looking AdWords becomes – but AdWords will only get more expensive, too.
At some point, if you want to compete online, you're going to HAVE to build a quality website, with a unique offering, to satisfy returning visitors – the sooner you start, the sooner you'll start to see results. If you start NOW, and are determined to build an online brand – a website rich in content with a satisfying user experience – Google will reward you in organic listings.

Web optimisation is a marketing channel just like any other, and there are no guarantees of success in any of them, for what should be obvious reasons. There are no guarantees in Google AdWords either, except that costs to compete will go up, of course. That's why it is so attractive – but like all marketing, it is still a gamble. At the moment, I don't know you, your business, your website, your resources, your competition or your product. Even with all that knowledge, calculating ROI is extremely difficult, because ultimately Google decides who ranks where in its results – sometimes that's ranking better sites, and sometimes (often) it is ranking sites breaking the rules above yours. Nothing is absolute in search marketing. There are no guarantees, despite claims from some companies. What you make from this investment is dependent on many things, not least how suited your website is to convert visitors to sales. Every site is different. Big-brand campaigns are far, far different from small-business SEO campaigns that don't have any links to begin with, to give you but one example. It's certainly easier if the brand in question has a lot of domain authority just waiting to be unlocked – but of course, that's a generalisation, as big brands have big-brand competition too. It depends entirely on the quality of the site in question and the level and quality of the competition, but smaller businesses should probably look to own their niche, even if limited to their location, at first. Local SEO is always a good place to start for small businesses.

What Makes A Page Spam?

What makes a page spam:

- Hidden text or links – may be exposed by selecting all page text and scrolling to the bottom (all text is highlighted), disabling CSS/JavaScript, or viewing source code
- Sneaky redirects – redirecting through several URLs, rotating destination domains, cloaking with JavaScript redirects and 100% frames
- Keyword stuffing – no percentage or keyword density is given; this is up to the rater
- PPC ads that only serve to make money, not help users
- Copied/scraped content and PPC ads
- Feeds with PPC ads
- Doorway pages – multiple landing pages that all direct the user to the same destination
- Templates and other mass-produced, computer-generated pages, marked by copied content and/or slight keyword variations
- Copied message boards with no other page content
- Fake search pages with PPC ads
- Fake blogs with PPC ads, identified by copied/scraped or nonsensical spun content
- Thin affiliate sites that only exist to make money, identified by checkout on a different domain, image properties showing origination at another URL, lack of original content, and different WhoIs registrants of the two domains in question
- Pure PPC pages with little to no content
- Parked domains

There's more on this announcement at SEW.
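For the avoidance of doubt, these are the kinds of hidden-text patterns a rater can expose with the checks above (select-all, CSS off, view source) – shown here only so you can recognise and avoid them:

```html
<!-- Classic hidden-text spam signatures - do NOT do this. -->
<div style="display:none">cheap widgets best widgets buy widgets</div>
<p style="color:#ffffff; background-color:#ffffff">white text on a white background</p>
```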
If A Page Exists Only To Make Money, The Page Is Spam, to Google

If A Page Exists Only To Make Money, The Page Is Spam. GOOGLE

In BOTH leaked quality rater guidelines we've seen for Google quality raters, this statement is pretty standout – and should be a heads-up to any webmaster out there who thinks they are going to make a fast buck from Google organic listings these days. It should, at least, make you think about the types of pages you are going to spend your valuable time making. Without VALUE ADD for Google's users, don't expect to rank. If you are making a page today with the sole purpose of making money from it – and especially with free traffic from Google – you obviously didn't get the memo. Consider this, from a manual reviewer:

…when they DO get to the top, they have to be reviewed with a human eye in order to make sure the site has quality. potpiegirl

It's worth remembering:

- If A Page Exists Only To Make Money, The Page Is Spam
- If A Site Exists Only To Make Money, The Site Is Spam

This is how what you make will be judged, whether it is fair or not. Of course, in some cases, it levels the playing field. If you come at a website thinking it is going to be a load of work and passion, thinking:

- DIFFERENTIATE YOURSELF
- BE REMARKABLE
- BE ACCESSIBLE
- ADD UNIQUE CONTENT TO YOUR SITE
- GET CREDITED AS THE SOURCE OF UNIQUE CONTENT
- HELP USERS IN A WAY THAT IS NOT ALREADY DONE BY 100 OTHER SITES

… then you might find you've built a pretty good site and even, in time, a 'brand'. Google doesn't care about us SEOs or our websites, but it DOES care about HELPING USERS. So, if you are helping your visitors – and not by just getting them to another website – you are probably doing one thing right at least. With this in mind, I am already building affiliate sites differently.

Doorway Pages

Google has announced they intend to target doorway pages in the next big update. The definition of what a doorway page is, is sure to evolve over the coming years – and this will start again, soon. The last time Google announced they were going after doorway pages and doorway sites was back in 2015. Example: in the images below (from 2011), all pages on the site seemed to be hit with a -50 penalty for everything. First, Google rankings for main terms tanked… which led to a traffic apocalypse, of course… and they got a nice email from Google WMT:

Google Webmaster Tools notice of detected doorway pages on xxxxxxxx – Dear site owner or webmaster of xxxxxxxx, We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines. Specifically, your site may have what we consider to be doorway pages – groups of "cookie cutter" or low-quality pages. Such pages are often of low value to users and are often optimized for single words or phrases in order to channel users to a single location. We believe that doorway pages typically create a frustrating user experience, and we encourage you to correct or remove any pages that violate our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support. Sincerely, Google Search Quality Team

What Are Doorway Pages?

Doorway pages are typically large sets of poor-quality pages where each page is optimized for a specific keyword or phrase.
In many cases, doorway pages are written to rank for a particular phrase and then funnel users to a single destination. Doorway pages are web pages that are created for spamdexing – that is, for spamming the index of a search engine by inserting results for particular phrases, with the purpose of sending visitors to a different page. They are also known as bridge pages, portal pages, jump pages, gateway pages, entry pages and by other names. Doorway pages that redirect visitors without their knowledge use some form of cloaking.

Whether deployed across many domains or established within one domain, doorway pages tend to frustrate users, and are in violation of our Webmaster Guidelines. Google's aim is to give our users the most valuable and relevant search results. Therefore, we frown on practices that are designed to manipulate search engines and deceive users by directing them to sites other than the ones they selected, and that provide content solely for the benefit of search engines. Google may take action on doorway sites, and other sites making use of these deceptive practices, including removing these sites from the Google index. If your site has been removed from our search results, review our Webmaster Guidelines for more information. Once you've made your changes and are confident that your site no longer violates our guidelines, submit your site for reconsideration.

At the time (2011), I didn't immediately class the pages on the affected sites in question as doorway pages. It's evident Google's definition of a doorway changes over time. When I looked in the Google Webmaster Forums at the time, there were plenty of people asking questions about how to fix this – and, as usual, it seemed a bit of a grey area, with a lot of theories… and some of the help in the Google forum is, well, clearly questionable. A lot of people do not realise they are building what Google classes as doorway pages… and it's indicative that what you intend to do with the traffic Google sends you may, in itself, be a ranking factor not too often talked about. You probably DO NOT want to register at GWT if you have lots of doorway pages across multiple sites. Here is what Google has said lately about this algorithm update:

Doorways are sites or pages created to rank highly for specific search queries. They are bad for users because they can lead to multiple similar pages in user search results, where each result ends up taking the user to essentially the same destination. They can also lead users to intermediate pages that are not as useful as the final destination.

…with examples of doorway pages listed as follows:

- Having multiple domain names or pages targeted at specific regions or cities that funnel users to one page
- Pages generated to funnel visitors into the actual usable or relevant portion of your site(s)
- Substantially similar pages that are closer to search results than a clearly defined, browseable hierarchy

Google also said recently: here are questions to ask of pages that could be seen as doorway pages:

- Is the purpose to optimize for search engines and funnel visitors into the actual usable or relevant portion of your site, or are they an integral part of your site's user experience?
- Are the pages intended to rank on generic terms, yet the content presented on the page is very specific?
- Do the pages duplicate useful aggregations of items (locations, products, etc.) that already exist on the site, for the purpose of capturing more search traffic?
- Are these pages made solely for drawing affiliate traffic and sending users along, without creating unique value in content or functionality?
- Do these pages exist as an island? Are they difficult or impossible to navigate to from other parts of your site?
- Are links to such pages from other pages within the site, or network of sites, created just for search engines?

A Real Google Friendly Website

At one time, a 'Google-friendly' website meant a website built so Googlebot could scrape it correctly and rank it accordingly. When I think 'Google friendly' these days, I think of a website Google will rank top, if popular and accessible enough, and won't drop like a stone for no apparent reason one day – even though I followed the Google SEO starter guide to the letter – just because Google has found something it doesn't like, or has classified my site as undesirable one day. It is not JUST about original content anymore – it's about the function your site provides to Google's visitors, and it's about your commercial intent. I am building sites at the moment with the following in mind…

Don't be a website Google won't rank. What Google classifies your site as is perhaps the NUMBER 1 Google ranking factor not often talked about – whether Google determines this algorithmically or, eventually, manually. That is: whether it is a MERCHANT, an AFFILIATE, a RESOURCE or DOORWAY PAGE, SPAM, or VITAL to a particular search – what do you think Google thinks about your website? Is your website better than the ones in the top ten of Google now? Or just the same? Ask why Google should bother ranking your website if it is just the same, rather than why it would not because it is just the same… how can you make yours different? Better. Think that, one day, your website will have to pass a manual review by 'Google' – the better rankings you get, or the more traffic you get, the more likely you are to be reviewed. Know that Google, at least, classes even useful sites as spammy, according to leaked documents. If you want a site to rank high in Google, it better 'do' something other than only link to another site because of a paid commission. Know that, to succeed, your website needs to be USEFUL to a visitor that Google will send you – and a useful website is not just a website with the sole commercial intent of sending a visitor from Google to another site, or a 'thin affiliate' as Google CLASSIFIES it. Think about how Google can algorithmically and manually determine the commercial intent of your website – think about the signals that differentiate a real small-business website from a website created JUST to send visitors to another website with affiliate links on every page, for instance; adverts on your site, above the fold, etc., can be a clear indicator of a webmaster's particular commercial intent – hence why Google has a Top Heavy Algorithm. Google is NOT going to thank you for publishing lots of similar articles and near-duplicate content on your site, so EXPECT to have to create original content for every page you want to perform in Google, or at least, do not publish content found on other sites. Ensure Google knows your website is the origin of any content you produce, typically by simply pinging Google via XML or RSS (a one-line example follows below). I'd go as far as to say: think of using Google to confirm this too… this sort of thing will only get more important as the year rolls on.
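On the pinging point above: at the time of writing, Google supports a simple HTTP 'ping' endpoint for sitemaps, so submitting a fresh sitemap can be a one-liner – a sketch, with a hypothetical sitemap URL:

```bash
# Tell Google a fresh sitemap exists, so new content is discovered quickly.
curl "http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml"
```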
- Understand and accept why Google ranks your competition above you. They are either: 1. more relevant and more popular, 2. more relevant and more reputable, 3. manipulating backlinks better than you, or 4. spamming.
- Understand that everyone at the top of Google falls into those categories, and formulate your own strategy to compete – relying on Google to take action on your behalf is VERY probably not going to happen.
- Being 'relevant' comes down to keywords & key phrases – in domain names, URLs, Title Elements, the number of times they are repeated in text on the page, text in image alt tags, rich mark-up and, importantly, in keyword links to the page in question. If you are relying on manipulating hidden elements on a page to do well in Google, you'll probably trigger spam filters. If it is 'hidden' in on-page elements, beware of relying on it too much to improve your rankings.
- The basics of GOOD SEO haven't changed for years – though the effectiveness of particular elements has certainly narrowed or changed in type of usefulness – and you should still be focusing on building a simple site using VERY simple SEO best practices. Don't sweat the small stuff, while all the time paying attention to the important stuff: add plenty of unique PAGE TITLES and plenty of new ORIGINAL CONTENT.
- Understand how Google SEES your website. CRAWL it, like Google does, with (for example) Screaming Frog SEO Spider, and fix malformed links or things that result in server errors (5xx), broken links (4xx) and unnecessary redirects (3xx). Each page you want in Google should serve a 200 OK header message (see the second sketch after this list).
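First, to make the 'pinging Google' point above concrete, here is a minimal sketch using only the Python standard library. The sitemap URL is a placeholder, and the /ping endpoint shown is the one Google has documented for sitemap submission – verify it is still supported before relying on it.

    # Notify Google that a sitemap is available/updated, so it can
    # recrawl quickly and associate your site with the content's origin.
    import urllib.parse
    import urllib.request

    SITEMAP_URL = "http://www.example.com/sitemap.xml"  # placeholder

    ping = ("http://www.google.com/ping?sitemap="
            + urllib.parse.quote(SITEMAP_URL, safe=""))
    with urllib.request.urlopen(ping) as response:
        # A 200 here means the ping was received, not that the sitemap
        # is valid - check Google Webmaster Tools for crawl errors.
        print(response.status)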
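Second, a minimal sketch of that crawl-and-check pass, using the third-party requests and beautifulsoup4 packages. The URL list is a placeholder – in practice it would come from your XML sitemap or a crawler such as Screaming Frog.

    # Check each URL serves 200 OK (no redirects, no errors) and has a
    # unique, non-empty page title - a tiny slice of a real site audit.
    import requests
    from bs4 import BeautifulSoup

    URLS = [
        "http://www.example.com/",        # placeholder URLs
        "http://www.example.com/about/",
    ]

    seen_titles = {}
    for url in URLS:
        # allow_redirects=False surfaces 3xx hops you may want to remove.
        response = requests.get(url, allow_redirects=False, timeout=10)
        if response.status_code != 200:
            print(url, "returned", response.status_code)
            continue
        title_tag = BeautifulSoup(response.text, "html.parser").title
        title = title_tag.get_text(strip=True) if title_tag else ""
        if not title:
            print(url, "is missing a page title")
        elif title in seen_titles:
            print(url, "duplicates the title of", seen_titles[title])
        else:
            seen_titles[title] = url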
This is a complex topic, as I said at the beginning of this in-depth article. I hope you enjoyed this free DIY SEO guide for beginners.

DO keep up to date with the Google Webmaster Guidelines. You do not pay to get into search engines, and you don't necessarily need to even submit your site to them, but you do need to know their 'rules' – especially the rules laid down by Google. Note that these rules for inclusion can and do change. These rules are official advice from Google to webmasters, and Google is really cracking down on 'low-quality' techniques that influence their rankings in 2016.

Below is a list of the most important Google Webmaster Guidelines pages, with links to each one. The Google Webmaster Channel is also useful to subscribe to.

If you made it to here, you should read my Google Panda post, which will take your understanding of this process to a higher level.

Free SEO EBOOK (2016) PDF

Hobo UK SEO – A Beginner's Guide (2016) is a free pdf ebook you can DOWNLOAD COMPLETELY FREE from here (2mb) that contains my notes about driving increased organic traffic to a site within Google's guidelines. I am based in the UK and most of my time is spent looking at Google.co.uk, so this ebook (and blog posts) should be read with that in mind. Google is BIG – many different country-specific search engines, with wildly different results in some instances. I do all my testing on Google.co.uk.

It is a guide based on my 15 years' experience. I write and publish to my blog to keep track of thoughts and get feedback from industry and peers. As a result of this strategy, I get about 100K visitors a month from Google.

My ebook is meandering – I am not a professional author – but contained within it is largely the information I needed as I took a penalised site to record Google organic traffic levels in 2016. This is the 4th version of this document I've published in 7 years, and I hope this, and previous versions, have demonstrated my interest in this field in a way that others can learn from. There are no warranties – it is a free pdf. This SEO training guide is my opinions, observations and theories that I put into practice, not advice. I hope you find it useful, and I hope beginners can get something out of the free ebook or the links to other high-quality resources it references.

Follow Hoboweb

Ranking high in Google in 2016 is more about the delivery of a satisfying end product to users than it is about tweaking meta tags or keyword-stuffing text. You can hire me today to review your site. I can QUICKLY deliver clear direction on what you need to do to your website to get more traffic from Google.
