Text and images, the main components of any webpage, are easier for humans to understand than for machines. A simple fact, but one with big consequences for the way websites and pages will need to be constructed, or reconstructed. Using the latest version of The HTML Editor, with its drag-n-drop workflow for adding semantic markup, many of our customers started updating their sites in 2014. For 2015, adding semantic data remains one of the top priorities.

## The future web is semantic

For years, search engines have been relying on data that is invisible to humans to extract additional meaning about the visible content. XML Sitemaps are a good example of this: not rendered on the webpage itself, and therefore not directly visible to the human eye, these XML files are read by search engines to retrieve more information about a domain's URL structure and to learn about each individual URL.

In a continued effort to improve the search experience, Bing, Google and Yahoo teamed up (similar to what they did when standardizing the XML Sitemap Protocol) and launched schema.org in 2011. The initiative defines a vocabulary, based on the microdata format, for describing information - for adding meaning - to website content and HTML markup. The additional information supplied through the schema.org vocabulary gives search engines (and other machines) a structure that makes the content easier to interpret unambiguously. Using the schema.org vocabulary to describe website content is also often referred to as (employing) structured data.

## Why it should be on your priority list for 2015

With a number of recent support announcements and tool releases, Google is increasingly emphasising structured data in general, and the schema.org vocabulary specifically, as an important method for enhancing the overall search experience. For website owners this creates an interesting opportunity to gain more visibility in search engine results. For example, when Google announced support for structured markup for organization logos, they stated:

> “Markup like this is a strong signal to our algorithms to show this image in preference over others”

The main search engines increasingly rely on structured data to understand the content they are spidering and indexing. This helps them to better interpret pages and page sections, as well as to improve the results that are displayed for specific search queries.

This shift is creating a serious opportunity to gain visibility in search engine results - a chance to improve search rank for certain pages and content sections. You see, a deeper semantic understanding of the important content on a site can serve as a strong relevancy signal. The site owner makes an explicit statement about what exactly the core sections of the site cover, while less important sections can be de-emphasized. Little or no subjective interpretation is needed by the search engine. In addition to improving relevancy, using structured data can add to the overall quality score of the domain and page. A minimal sketch of what such markup looks like follows below.
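To make this concrete, here is a minimal sketch of schema.org Organization markup in the microdata format, along the lines of the logo support Google announced. The domain, file paths, and organization name are illustrative placeholders, not details from this article:

```html
<!-- A minimal sketch of schema.org Organization markup using microdata.
     itemscope starts a new item, itemtype names its schema.org type,
     and itemprop assigns each nested element to a property of that item. -->
<div itemscope itemtype="http://schema.org/Organization">
  <a itemprop="url" href="http://www.example.com/">Example Inc.</a>
  <!-- itemprop="logo" tells search engines which image is the official logo -->
  <img itemprop="logo" src="http://www.example.com/logo.png" alt="Example Inc. logo">
</div>
```

Marking the image with `itemprop="logo"` inside an Organization item is the kind of explicit, machine-readable statement described above: the search engine no longer has to guess which image on the page represents the brand.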