Categories
Organic Digital Marketing, SEO, Web Development

Optimize your website’s robot experience with metadata

There is a lot of focus at the moment around UX on the web, and rightly so.

Google’s guidelines are basically: create websites for your users and don’t try to ‘game’ the system. In other words, write good content and make your websites easy to use. This should always be the baseline. Don’t create a website that you think will rank well by stuffing your search terms into the title and chasing the ‘correct’ keyword density. Focus on the user.

Having said that, there is one visitor to your website that you may need to pay attention to: the Googlebot.

The Googlebot is the crawler that reads web pages and follows links to your other content. It determines what is shown from your site on the search engine results page.

The Googlebot reads the content on your website, but it also reads metadata that the typical user doesn’t see. For example, the page description is what the Googlebot pulls from the page’s metadata to display on the search engine results page (SERP).
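As a rough sketch, a page description lives in the head of the page as a meta tag, something like this (the site and wording here are invented for illustration):

```html
<head>
  <!-- The page title, which usually becomes the blue link on the SERP -->
  <title>Example Store | Lightweight Running Shoes</title>
  <!-- The description the Googlebot can pull into the SERP snippet -->
  <meta name="description" content="Shop lightweight running shoes with free delivery and free returns.">
</head>
```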

Nike’s meta description (something the user doesn’t normally see) and the resulting search engine result.

SERP answer boxes

Google has, in the last few years, started including answer boxes on its search engine results pages that give the user an answer to their query without the need to visit the actual site. While this is probably a better experience for the user, it’s not good if you are trying to earn advertising revenue on your site.

While there is not a lot you can do to change this, you can ensure that it is your website that is providing the answers.

Elsewhere on the SERP it is important that your website appears in the best way possible. There are limits on the length of the text you can use in your titles and page descriptions, so it is important to test them to make sure they display correctly.

If they are missing or not right, Google can make a best guess at what the content on the page is. That takes the control away from you and prevents you from having consistency across your site in how it appears in search engines.

Metadata

Your site is crawled by robots; they used to be called ‘spiders’ in the more romantic age of the web. These bots (e.g. the Googlebot) love data, but more importantly, they love data about data. Another name for this data is metadata.

Examples of metadata for this blog post could be the word count, the author or the publish date.

This data exists in many different forms. For example, the title of this post is wrapped in an h1 tag; that is implicit metadata telling the bots that the sentence inside the h1 tag is what this post is about.

The title of the page is another example of implicit metadata.
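To make that concrete, the implicit metadata on a post like this one might look something like the sketch below (the markup is simplified for illustration):

```html
<head>
  <!-- Implicit metadata: the page title shown in the browser tab and on the SERP -->
  <title>Optimize your website's robot experience with metadata</title>
</head>
<body>
  <!-- Implicit metadata: the h1 tells bots what the page is about -->
  <h1>Optimize your website's robot experience with metadata</h1>
  <!-- ...post content... -->
</body>
```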

You can also create explicit metadata. There are two main ways to do this: Schema.org markup embedded directly in your HTML, and JSON-LD (JSON for Linked Data).

Both of these technologies allow you to tell the bots exactly what the content is about. This is not only for use on the web with search engines, but a way of structuring data so that computers can understand it. Another example of how it is used could be a calendar app that has access to an email confirming a hotel booking. If there is structured data in the email, then the calendar could add your hotel stay automatically.
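As a minimal sketch, explicit metadata for a post like this one could be added as a JSON-LD script in the head of the page, using the schema.org BlogPosting type (the author name and dates below are invented for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Optimize your website's robot experience with metadata",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2018-06-01",
  "wordCount": 650
}
</script>
```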

This is a huge topic and there is a good article from Google showing the benefits.

Hreflang

This is another type of metadata that you can add to your page or sitemap. It helps to describe content in multi-language websites. Hreflang is especially useful when you have language-variant content; for example, US English and UK English.

It tells the bots which language and locale the content is meant for. SEO Moz has an up-to-date article on the topic.
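As a sketch, the hreflang annotations for a page with US and UK English variants could look like this in the head of each page (the URLs are invented for illustration):

```html
<!-- Each variant lists itself and its siblings -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/page/">
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/page/">
<!-- Fallback for visitors whose language or locale isn't matched -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page/">
```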

Be kind to your bots

Technical SEO is a huge area and so much is hidden in the secret world of search algorithms. Adding structured data to our websites gives them an extra layer: it turns content into a set of data that can be used by APIs and gives a richer experience to our users.

Categories
Productivity, Project Management, Web Development

Running a web team for internet marketing

Working on the web in a team can be difficult. Sometimes you are required to deliver on large projects, such as a new feature or offer; the rest of the time you are doing service delivery: updating a page, adding a new blog post and so on. How do you reconcile both? What is the best way to run a web team for internet marketing today?

Scrum

Scrum has been around since the mid-1990s and came to prominence with the Agile Manifesto in 2001. It is a project management technique built around different ceremonies (meetings). The best known of these is the daily standup, a fifteen-minute meeting early every day to surface and remove blockers. Another important piece of Scrum is the concept of a Sprint: a block of time, usually two weeks, set aside to work through the tasks in the project.

To get the most out of a Sprint, all dependencies should be met before starting, and the people working in the Sprint should be able to dedicate their time to it exclusively. This can be difficult in small teams, where specialities arise: for example, one of your developers could also be the technical SEO expert. If an issue arises on the website, that expert may have to be pulled out of the Sprint, negating the advantages of running Scrum.

Scrum is the best way of efficiently getting a project completed. However, there are not many teams that can lock themselves away for two-week stretches, especially if the web team reports into a faster-paced business unit, like marketing.

Service Queue

So what happens when the marketing team needs to launch their campaign now, not in two weeks?

This is where the service queue comes in. Tasks are added to the queue and someone on the team is set aside so there is always capacity to work on it. This means that anything really urgent can be worked on straight away.

Combining Scrum and the service queue allows you to give a predictable timeline for projects, while the service queue gives you the capability to look after your stakeholders with speed and agility.

Categories
JavaScript, Web Development

Component-based web systems

The Web as a platform is all the rage these days. It’s not hard to see why: with the ability to create web apps in a matter of hours with very little coding experience, the web platform will soon be the de facto way to build applications.

It even looks like the war between web-based apps and native apps on mobile has been won by the web platform in the guise of progressive web apps (PWAs).

The driving force behind all these apps is JavaScript and the many, many frameworks that are out there.

The reason for this is, of course, that after fending off Flash and Java applets, JavaScript is the only way to code applications in the browser.

Components are becoming popular because they allow companies to create blocks of UI that they can reuse across apps and sites. For example, EA Games shares a set of components across its many individual game sites.

React is the most popular framework out there at the moment and, at its core, it is a component system. React components are written with JSX, an HTML-like templating syntax for JavaScript.

React components can be shared between applications and sites. There are many sites out there that even offer UI libraries of React components.
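As a small sketch (the component and prop names are invented), a reusable React component written with JSX might look like this:

```jsx
import React from 'react';

// A small, reusable block of UI that can be shared between sites
function CallToAction({ label, href }) {
  return (
    <a className="cta-button" href={href}>
      {label}
    </a>
  );
}

export default CallToAction;

// Usage elsewhere: <CallToAction label="Shop now" href="/sale" />
```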

Polymer

Polymer is a project that is backed by Google. It aims to help you create components with a small library to ease the process. The Polymer project also includes a CLI (command line interface) to help you build and test your components.
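As a rough sketch in the Polymer 3 style (the element and property names are invented for illustration), a Polymer component looks something like this:

```js
import { PolymerElement, html } from '@polymer/polymer/polymer-element.js';

// A minimal Polymer element with a single bound property
class MyGreeting extends PolymerElement {
  static get template() {
    return html`<p>Hello, [[name]]!</p>`;
  }
  static get properties() {
    return { name: { type: String, value: 'world' } };
  }
}

customElements.define('my-greeting', MyGreeting);
// Usage in HTML: <my-greeting name="web"></my-greeting>
```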

Webcomponents.org is the home for a W3C project that aims to make web components a standard. It outlines four specifications that make up web components: Custom Elements, Shadow DOM, ES Modules and HTML Templates.


Custom Elements allow developers to create custom HTML tags. For example, you could create an element called <my-side-navigation> that renders a side navigation on your site, and then share it with your other sites.

The Shadow DOM allows you to encapsulate HTML and CSS so that they don’t affect the rest of the site.

ES Modules are essentially JavaScript modules. This specification allows you to create JavaScript modules and then import them wherever they are needed.

HTML Templates are where the component’s HTML is defined. This is the markup that is rendered when the tag is added to a page.
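Putting those pieces together, a bare-bones version of the <my-side-navigation> element mentioned above could be written as an ES module that defines a custom element, stamps an HTML template and hides its markup behind a shadow root (this is a sketch; the markup is invented for illustration):

```js
// my-side-navigation.js: an ES module defining a custom element
const template = document.createElement('template');
template.innerHTML = `
  <style>
    /* Scoped to the shadow root, so it won't affect the rest of the site */
    nav { width: 200px; }
  </style>
  <nav>
    <slot></slot>
  </nav>
`;

class MySideNavigation extends HTMLElement {
  constructor() {
    super();
    // Shadow DOM: encapsulate the template's HTML and CSS
    this.attachShadow({ mode: 'open' }).appendChild(
      template.content.cloneNode(true)
    );
  }
}

// Custom Elements: register the new tag
customElements.define('my-side-navigation', MySideNavigation);

// Usage: import this module, then add
// <my-side-navigation>...links...</my-side-navigation> to the page
```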

There are other libraries out there for creating web components, but these are the main ones.

Web components are a sign of the maturing of the web as a platform. In traditional application development, components have been around for a long time. Together with concepts like Atomic Design, there are now really powerful tools for developing reusable components and putting together really compelling experiences for your visitors.