The Digital Marketing Excellence Blog | Perficient Digital

The Digital Marketing Excellence Blog

In 2018, Perficient acquired Stone Temple Consulting, an award-winning, Boston-based digital marketing agency with $9 million in annual revenue. Read more about the acquisition here, and below you’ll find blog posts by our digital marketing experts from Stone Temple.

 

Recent Posts
  • Google made several announcements at Google I/O in early May of 2019 about search and SEO. Images were one of the topics mentioned during the announcement. In this episode of the popular Here’s Why digital marketing video series, Google’s Martin Splitt joins Eric Enge and explains how visual search is vital to SEO and why Google places emphasis on high-resolution images.  Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why Resources See all of our Here’s Why Videos | Subscribe to our YouTube Channel Transcript Eric: Hey, everybody. I’m Eric Enge. Today, I’m pleased to tell you that I’ve got Martin Splitt from Google joining me. He is a Webmaster Trends Analyst based out of Zurich. Thanks for joining us today, Martin. Martin: Thank you very much for having me, Eric, and hello everybody. Eric: Today we’re going to talk about the announcements at I/O that related to images and visual aspects of search. To start, maybe you could talk a little bit about the announcement made about high-resolution images. Martin: As you know, we now have the new Discover Feed. If you’re using an Android device, you have it on your home screen. If you’re using google.com on mobile, then you will see that there. The search app on Android has it as well, and it can drive a lot of traffic, but also it is very keen on having good visuals. The question then is, what kind of visuals should we be using? We would like to use high-quality images that are a little bigger than usual—that would be nice. But we want webmasters to have full control over what they’re giving us to show and discover, and also, we want to use these images on other surfaces. We have smart home devices with displays now: we have the Assistant, and we have many different surfaces that can show visual content and promote your site, basically. We announced the upcoming possibility to opt in to sharing high-resolution visual content with us: basically, images in high resolution that we use on different surfaces and different devices, and also in the Discover Feed and other features. We have no timeline yet for when that opt-in program will start. Eric: So how do you actually opt in or enable the feature? Martin: The way to opt in is probably going to be based around structured data. You would use the regular optimized fast images for your website, so that your web content loads fast on user devices, but you would include a little bit of structured JSON-LD that points us to a high-resolution image that we can use. And probably, there will be settings around that in Search Console, as well. Eric: Got it. You mentioned that this will be part of the Discover feature, for example. Do you see the high-resolution images ever coming into play via regular Google Search or image search? Martin: Probably, but at this point, I only know that we will do this in Assistant, especially the smart displays that we have with Assistant and in Discover, but there’s probably more to come. Eric: Got it. And how about 3D images? That’s the other thing that you talked about at I/O. Martin: Yes, absolutely. Right now, we are working with a few partners on bringing AR capabilities and 3D models to Google Search. There are plenty of interesting use cases.
If you have educational use cases, or if you have things like furniture or real estate, those might be really interesting to try out and have a more visual approach, as these are, by definition, very visual things. Lots of people are visual learners. You definitely want a spatial understanding of, let’s say, a piece of furniture. There are use cases where it makes sense, and we are piloting that. There are multiple teams working towards a common goal in this case. We have teams that work with WebVR and AR—actually, it’s called WebXR, which is like AR and VR in one specification—to bring these possibilities to the web. There are also teams working to make 3D models smaller—that would be the Draco mesh compression. We also work with other teams to make developers more comfortable and make it easier for developers to work with 3D content, as well. And we are participating in the standardization effort that is glTF, which is basically like a JPEG format but for 3D models. Eric: That’s awesome. I think the functionality and the things that you’re putting into visual search these days are really amazing. Martin: Thank you. Eric: So how important is visual search going to be going forward? Martin: I think visual search is one of the most underrated search experiences that we have right now. I mean, image search can be a fantastic funnel for additional traffic, especially if what you have is a very visual thing. If your product is very visual, if your niche is about visual things—let’s say food or tourism or specific kinds of marketing—then I can definitely see that visual will become more important for users as we also offer them more engaging and more visual ways to discover and interact with content. Eric: Great. Thanks so much for joining us today, Martin. Martin: It’s been a pleasure. Thank you very much for having me, Eric. Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why See all of our Here’s Why Videos | Subscribe to our YouTube Channel
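In the transcript above, Martin describes the opt-in as likely being structured-data based, but the program had no published specification or timeline at the time. As a rough, hypothetical sketch only: an ImageObject block in JSON-LD could point Google at a larger asset while the page itself keeps serving a smaller, fast-loading file. The property choices, file names, and URLs below are assumptions for illustration, not Google's announced format.

  <!-- Hypothetical sketch only: the opt-in described above had no published spec,
       so the exact properties and URLs here are illustrative assumptions. -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example article title",
    "image": {
      "@type": "ImageObject",
      "url": "https://www.example.com/images/hero-3200x1800.jpg",
      "width": 3200,
      "height": 1800
    }
  }
  </script>
  <!-- The page itself would keep serving a smaller, optimized version of the image: -->
  <img src="https://www.example.com/images/hero-1200x675.jpg"
       width="1200" height="675" alt="Descriptive alt text for the hero image">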

  • Eric Enge and Brian Weiss discuss SEO in Here’s Why, our educational digital marketing video series

    User experience is becoming a part of SEO, but why does Google want to use it as a signal in their algorithm? In this episode of the award-winning Here’s Why digital marketing video series, Brian Weiss explains how Google is using user experience to rank pages.  Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why Resources See all of our Here’s Why Videos | Subscribe to our YouTube Channel Transcript Brian: User experience is becoming part of SEO. Here’s why. Hey, Eric. Eric: Hey, Brian. Brian: I have a question for you. Eric: Okay, what’s that? Brian: Would you rather have a website that was magically guaranteed to rank number one for every query, or a website that was guaranteed to convert every visitor who came to the site? Eric: Okay. Now, you’re going to tell me you’re the magical genie who can grant me these wishes. Brian: That is what it says on my business card. Eric: It seems like you could do pretty well in either situation, but I bet you have a point of view on this, don’t you? Brian: I’m glad you asked. If I had a website that converted every visitor, I can get it to rank number one for everything. And as an SEO, I’m probably not supposed to say this, but if you can’t convert visitors to your site, you don’t have anything to optimize. Eric: But you seem to think that converting customers in itself will lead to better SEO results? Brian: No, not so directly as that. Eric: But if you’re satisfying 100% of visitors, you’re probably creating some excellent signals for relevance and overall user experience that maybe Google would be interested in. Brian: Right, and SEO gives us an interesting lens to look at user experience through. Eric: It is certainly what Google is trying to optimize for. They want to send users to what they think is the best experience for them. Brian: Yes, and if you think about how they’re doing that over time, using human ratings to feed their machine learning algorithms, then over time, they may come up with some very interesting signals that we wouldn’t necessarily think of as traditional SEO factors. Eric: That’s certainly possible. But as SEOs, how do you think we should best use that information? Brian: I think there are two parts to answering that question. First of all, we can look to the Google results themselves to get clues about what elements Google thinks are important for answering user needs for a particular query. Eric: That’s where something like our semantic content optimization tool, that analyzes content on the top-ranking pages compared to yours and tells you what your page might be missing, would actually come in really handy. Brian: Exactly, those are the pages that Google thinks are doing the best job of providing relevant answers to the largest percentage of user needs. Ideally, you’d go beyond what they’re doing, but you don’t want to just copy your competition. Eric: Got it. It’s hard to beat the competition if all you’re doing is copying them. Brian: Right, but it’s good to know what your starting point should be. Eric: Didn’t you say there was a second consideration related to how SEO should respond to Google optimizing the results for user experience? Brian: Yes, I did say that. Eric: Are you testing me here, Brian, or what? Brian: Exactly, Eric, user testing. Now, would you say that you just had a bad experience? Eric: Yes, I might say that. Brian: Okay, that’s excellent feedback. 
I won’t give any more passive aggressive answers. Eric: So, user testing. Brian: Yes, doing user testing for conversion rate, bounce rate and time on site—on the one hand, it’s just basic due diligence at this point, given the impact it can have on revenue. But I think it also can help to avoid some of the SEO pitfalls for those not-so-obvious indicators that Google may start using over time, through developing their own user testing and machine learning operation. If users love your site, you definitely have a better chance of Google seeing it as a user-friendly destination. Eric: Of course, that user engagement will create signals that Google does pick up on over the long run. Brian: For sure. Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why See all of our Here’s Why Videos | Subscribe to our YouTube Channel

  • A speaker on the main stage at 2019 SMX Advanced in Seattle, WA

    Along with my Perficient Digital colleagues, I had the privilege of attending SMX Advanced in Seattle from June 3-5, 2019. I’m still applying everything I learned from the engaging talks by great speakers from all sorts of backgrounds. Rather than offer a full blow-by-blow of every presentation, I’ll focus my takeaways on three key areas of search that reflected the strongest technical themes throughout the conference. These clusters seemed to be where the technical SEO eye of Sauron was focused for the week.
1. Structured Data and Semantic HTML
Providing good website structure isn’t a new idea. In fact, it’s probably one of the oldest themes in web development—but it has been approached with renewed interest at the last few conferences I’ve attended or watched presentations from (Next10x, Google I/O, and SMX). Structured data and semantic HTML are two distinct elements that have been brought up increasingly in 2019. Cata Milos of Bing talked about structured data and schema at length during the “What’s New With Schema and Structured Data” talk. Milos explained how Bing sees a web page, highlighting that Bing has evolved to become more visual, just as the web has. Bing has to “visually understand complex documents,” breaking them down into primary content (which is the content the user wants to see on the page), secondary content, and invisible content. This came up again in the “Periodic Table of SEO Success” panel (with Ginny Marvin, Detlef Johnson, Barry Schwartz, and Jessica Bowman), where we were advised to “start looking at using html5 semantic elements to define different parts of a web page.” Barry Schwartz noted that you should “make a website that Google is embarrassed not to rank.” Milos also noted in the structured data session that you should use semantic HTML elements, and they should cascade in the correct order. One note I found particularly enlightening is that heading tags (h1, h2, h3, etc.) in descending order should visually match what they look like on the page. Bing compares the HTML tag with what it visually looks like. So, the most important heading on your page should be an h1, and it should be the largest heading tag on the page (don’t use CSS to resize it!). Another takeaway was about how Bing decides which structured data gets surfaced as rich snippets: FAQ schema currently has low adoption, and Q&A markup is used by the top 1-2% of documents—which is more likely to be well used, and useful to surface? Similarly, Max Prin discussed the utility of structured data. He presented data showing that rich results generally result in a higher CTR—and that the rich result’s message can impact CTR heavily. What a lot of this comes down to is the idea of “technical quality”—we as SEOs talk a lot about quality as a content metric, but focusing on having a good technical experience is really important too. The message is to align the syntax of your code with the semantics of your page. When Frédéric Dubut, lead of the spam team at Bing, and Fili Wiese, a former Googler who worked on spam and manual actions, spoke together on stage about algorithms and penalties, their focus continued along the same lines: avoid serving manipulative or misleading information in search. Wiese and Dubut mentioned recent experiments by SEOs showing different favicons in mobile search results and said that penalizing that practice made sense. When SEOs use misleading favicons, it’s a problem similar to misleading structured data markup.
Quick Takeaways:
Use semantic HTML in concert with your content
Use structured data
Pay attention to technical quality in line with your content quality
2. Development
Another major theme of this conference was the importance of talking to developers and understanding development. From the opening talk by Jessica Bowman, Detlef Johnson, and Alexis Sanders, SEOs were encouraged to interact with and understand their developer partners. Detlef Johnson focused on this specifically during the keynote, encouraging SEOs to get into code and understand the new paradigm of the web. SEO also needs to involve privacy and security. Perficient’s own Eric Enge touched on the more technical side of this in his discussion of mobile-first indexing and the issues that can come up when optimizing for mobile-first SEO. Types of problems found in our crawls of mobile subdomains include missing product pages, broken sitelinks between sites, and ridiculous crawl depth levels. Enge also surfaced notes from I/O, where Martin Splitt heavily suggested using responsive designs rather than mdot URLs. The JavaScript panel had a deeper dive into the complexities of JavaScript SEO, starting with the mantra “JavaScript is not evil.” Hamlet Batista spoke at length on his experience with using JavaScript frameworks, and provided some great technical insights. He noted some pros and cons to using Angular for SEO. Among the pros: there’s no need for hash fragment URLs (history API paths by default), painless universal JavaScript, the same codebase on the client and server prevents compliance problems, and there’s basic support for titles and meta tags. He noted, on the other hand, that there is very basic support for SEO tagging, no built-in support for structured data, that absolute and relative URLs can introduce errors, and there’s no built-in support for link tags. Robin Rozhon went more into how to monitor JavaScript, and how to use reporting to improve JavaScript performance. Rozhon suggested switching the user agent to Google Smartphone, then checking render time in the initial state vs. extending the JS timeout vs. allowing the site to load with cookies and storage turned off. Rozhon reminded the audience to verify the content visually, to ensure that the user experience was aligned with the bot experience.
Quick takeaways:
Figure out ways to work with your developers
Try using JavaScript frameworks to understand the SEO wins and pitfalls
Monitor your pages for technical issues
Build responsive sites rather than separate desktop and mdot sites
3. Voice
Max Prin noted that Amazon Alexa uses structured data for local search, so that’s
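To make the heading-hierarchy and semantic HTML advice above concrete, here is a minimal, illustrative page skeleton: headings descend in order (and the h1 stays the largest, rather than being resized with CSS), and HTML5 landmark elements mark out primary and secondary content. The specific element choices are a sketch, not a prescription from Bing or Google.

  <!-- Illustrative sketch: semantic HTML5 landmarks with an ordered heading hierarchy -->
  <body>
    <header>
      <nav><!-- site navigation (secondary content) --></nav>
    </header>
    <main>
      <article>
        <h1>Primary topic of the page</h1>  <!-- the most important, largest heading; one per page -->
        <section>
          <h2>Major subtopic</h2>
          <h3>Detail under the subtopic</h3>
          <p>The body copy the user came to read (primary content).</p>
        </section>
      </article>
      <aside><!-- related links and promos (secondary content) --></aside>
    </main>
    <footer><!-- boilerplate links --></footer>
  </body>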

  • You’ve heard that content is king, but today, content is more important than ever. Here’s why. Content is king. It’s still king and it hasn’t really changed. And today, I’m going to show you three case studies that will show you that content is more king than it’s ever been.  Note: Our future videos will be published on the Perficient Digital channel, so please subscribe to the Perficient Digital channel. Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why Resources See all of our Here’s Why Videos | Subscribe to our YouTube Channel Transcript Content is king. It’s still king and it hasn’t really changed. And today, I’m going to show you three case studies that will show you that content is more king than it’s ever been. I’m going to start though by talking a little bit about Google’s algorithm updates over the past 14-16 months. I’m currently showing a chart for you that shows all the major updates that were called “core algorithm updates” by Google. It turns out that these updates all had a certain number of things in common. There seemed to be a pretty big focus on user intent and better understanding of user intent. They were looking to lower the rankings of poorer quality content and raise the rankings of higher quality content. But another element of it that I felt really emerged is a much bigger emphasis on the depth and breadth of your content. So, with that in mind, I want to jump into the case studies and show you some data. Here’s the first case study. This is in the addiction marketplace. The first chart shows the publishing volume of one particular vendor in that marketplace. You can see that there are wild fluctuations, but at times we’re talking about hundreds of actual new pieces of content being published every month, some months as high as 700. So, that’s the first data point. Second data point: Let’s look at the rate at which this site was adding links, which you see in this chart here. The link volume begins to grow rapidly around the same time the content volume started growing. And now for our third chart. This is the SEO visibility from Searchmetrics. You see that it begins to accelerate rapidly in May of 2017. So, it’s very interesting to see the correlation between the rapid content growth, the rapid link growth, and how it drove massive changes in traffic to this particular site. Now let’s look at case study two. This one’s in the career space. And again, I’m going to start with a chart on the publishing volume for this particular company. The volume was actually moderately heavy in 2017, running about 45ish pieces of content a month. That’s pretty significant—one and a half pieces a day on average. But in January of 2018, this scaled into many hundreds of pieces of content per month. So, now let’s look at the “rate of links added” chart for this particular company. Here you see that the links did not really scale until you got into around March and April of 2018, when there was a really sharp spike. What that sharp spike is actually showing us is that it was due to a redirect of another domain to this particular domain, and so a lot of links transferred almost instantaneously, if you will. Let’s look at the traffic chart for this particular company. The traffic actually scaled very rapidly after the links took off in May of 2018.
What I like about this case study is that it shows us that publishing content at a volume where the links aren’t really growing isn’t going to do much for you. You need to create lots of great content. It’s a key part of the picture, but if you don’t promote it effectively, you’re not going to get the right results. Let’s look at case study number three. This one is a consumer retail sales site. Let’s start with the publishing volume chart. This site has been adding content at a heavy volume for a very sustained period of time—it’s consistently in the thousands per month. Now let’s look at the rate of links added for this one. This doesn’t have as sharp a spike as the second example I showed, or even as dramatic growth as the first example. Yet you do see that links are being added steadily over time, built on top of a very strong base. Now let’s look at the traffic for this one. This is actually the SEO visibility chart again from Searchmetrics. In this particular case, the SEO visibility started at a very high level, but you get continuous steady growth over time, as supported by the strength of their publishing program and the rates at which they’re adding links. I have two more charts for you before we wrap up. This chart is data from a company called serpIQ that shows the correlation between ranking in Google and length of content. You’ll see from this chart there’s a clear bias for Google to rank longer form content. Now, before we go off and say that every page should have tons of content on it, it’s very dependent on the context. There are plenty of pages where you don’t need a long-form article. I’m not saying every piece of content or every page on your site needs to have a mass of text on it. That’s not the point. But from the point of view of informational content, it’s very clear that longer form is better. And then another chart. This one’s from HubSpot. This data shows that longer form content actually earns more links. Now you can see how I’m making the connection here and drawing all the pieces together. One last chart. This one’s a bonus chart from a

  • ADA compliance and web accessibility are more serious than you likely know. Consider this scenario: You or one of your clients suddenly receives a letter stating that the website you administer is not ADA compliant and you’re facing litigation. Facing litigation? Now what! The best course of action is to proactively review your website for ADA compliance and ensure that it is accessible to people with disabilities before you get into trouble. The level of compliance necessary is outlined in the Web Content Accessibility Guidelines (WCAG) 2.0 (available here). These guidelines are quite detailed, but because they are comprehensive, following them will help you fully comply with the law and insulate your company from litigation. A good place to start for website ADA compliance and accessibility is the following checklist:
Check the current state of your website accessibility with tools like WAVE (wave.webaim.org) and the Google Lighthouse tool (available in the Chrome browser)
Ensure that all images have descriptive alt text
Provide closed captioning on any videos your site may have
Provide text transcripts of any video or audio-only files
Give users the ability to pause, stop, or hide any automated content like email signups
Use simpler design, be sure the website isn’t overly complex, and provide options for adjusting the size and color of text and content
Be sure your website supports keyboard navigation (think navigation between elements with arrow and tab keys)
Provide support features so a person with a disability can contact the webmaster and receive a response
Be sure any forms on your website have instructions for their use and that each form element is labeled with clear and understandable text; also, use the id and label HTML elements on form items
Once the above checklist has been followed, it is advisable to have a legal professional review your website in light of the WCAG 2.0 guidelines.
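As a small, hedged illustration of a few items from the checklist above (descriptive alt text, explicitly labeled form fields, and keyboard-reachable controls), markup might look roughly like this; the file name, field names, and form endpoint are invented for the example.

  <!-- Illustrative only: descriptive alt text plus labeled, keyboard-friendly form controls -->
  <img src="/images/store-front.jpg"
       alt="Customer entering the Main Street store through the wheelchair-accessible entrance">

  <form action="/contact" method="post">
    <p>Use this form to reach our webmaster. All fields are required.</p>

    <label for="visitor-email">Your email address</label>
    <input type="email" id="visitor-email" name="email" required>

    <label for="visitor-message">How can we help?</label>
    <textarea id="visitor-message" name="message" required></textarea>

    <!-- A real <button> is focusable and operable with the keyboard by default -->
    <button type="submit">Send message</button>
  </form>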

  • Perficient Digital's Eric Enge and Google's Martin Splitt discussing digital marketing related topics

    Google made an announcement at Google I/O in early May of 2019 that Googlebot is now evergreen. What does this mean for the search community? In this episode of the popular Here’s Why digital marketing video series, Eric Enge, together with Google’s Martin Splitt, explains what the new evergreen Googlebot means for search, including the rendering of hash URLs, <div> tags, and infinite scroll.  Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why Resources See all of our Here’s Why Videos | Subscribe to our YouTube Channel Transcript Eric: Hey, everybody. My name is Eric Enge and today I’m excited to bring to you Martin Splitt, a Google Webmaster Trends Analyst based out of Zurich, I believe. Martin: Yes. Eric: Say hi, Martin. Martin: Hello, everyone. Very nice to be here. Thank you very much, Eric, for the opportunity to be a guest here as well. And yes, I am, in fact, based in Zurich. Eric: Awesome. Great. Today, we want to talk a little bit about what happened at Google I/O related to the announcement that Googlebot became evergreen, which means that it will, on an ongoing basis, be on the latest version of Chrome—in this case, Chrome 74 right now. So, what are some of the things that this means, and what are some of the things that still won’t be supported as a result of this move? Martin: What it means is that we now support many, many features. I think it’s 1,000 features or so that weren’t supported beforehand. I think most notably, ES 2015 or ES 6, and onwards. We have now upgraded to a modern version of JavaScript. A lot of language features are now supported by default; a bunch of new web APIs are supported, such as the Intersection Observer or the Web Components v1 APIs, which are the stable ones. That being said, there is a bunch of stuff that just doesn’t make sense for Googlebot and that we continue not to support. To give you examples, there is the service worker. We’re not supporting that because users clicking onto your page from the search result might never have been there beforehand. So, it doesn’t make sense for us to run the service worker, which is basically caching data for later visits. We do not support things that have permission requests such as webcam or the geolocation API or push notifications. If those block your content, Googlebot will decline these requests, and if that means that your content doesn’t show up, it means that Googlebot doesn’t see your content either. Those are the most important ones. Also, Googlebot is still stateless. That means we’re still not supporting cookies, session storage, local storage or IndexedDB across page loads. So, if you wanna store data in any of these mechanisms, that is possible, but it will be cleared out before the next URL or the next page comes on. Eric: Got it. There are some other common things that I’ve seen that people do that maybe you could comment on. I’ll give you three. One is having URLs with hash marks in them and rendering those as separate content. Another one is infinite scroll, and then a third one is links implemented as <div> tags. Martin: All of the examples you gave—we have very good reasons not to support them. The hash URLs—the issue there is that you’re using a hack. The URL protocol was not designed to be used that way.
The hash URL—the fragment, these bits with a hash in front of them—is supposed to point to a part of the page content, not to different content. Hash URLs will still not be supported. Putting links in things that are not links, like buttons or <div> tags or anything else, would still not be supported because we’re not clicking on things—that’s ridiculously expensive and also a very, very bad accessibility practice. You should definitely use proper links. What was the third one? Eric: Infinite scroll. Martin: Yes, infinite scroll is a different story. Googlebot still doesn’t scroll, but if you’re using techniques such as the Intersection Observer that we are pointing out in our documentation, I highly recommend using that and then you should be fine. You should still test it, and we need to update the testing tools at this point. We’re working on that sooner rather than later. But generally speaking, lazy loading and infinite scroll are working better than before. Eric: One of the things that I believe is still true is that the actual rendering of JavaScript-based content is deferred from the crawl process. So, that also has some impact on sites. Can you talk about that? Martin: Yes. Absolutely. As you know, we have been talking about this last year as well as this year. Again, we do have a render queue. It’s not always easy to figure out when rendering is the culprit or when crawling is the culprit because you don’t see the difference necessarily or that easily. But basically, we are working on removing this separation as well, but there’s nothing to announce at this point. If you have a site that has a high-frequency change of content—let’s say, a news site where news stories may change every couple of minutes—then you are probably well advised to consider something like server-side rendering or dynamic rendering to get this content seen a little faster. If you are a site like an auction portal, you might want to do the same thing. Basically, if you have lots of pages—and I’m talking about millions—whose content basically changes continuously, then you probably want to consider an alternative to client-side rendering. Eric: Right. One of the things that used to be recommended was this idea of dynamic rendering. If you have one of these
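A small sketch of the two patterns Martin endorses above, offered as an illustration under stated assumptions rather than Google's reference code: a real anchor tag with an href instead of a click-handler <div>, and an IntersectionObserver that loads more items only when a sentinel element scrolls into view. The endpoint URL, element IDs, and response shape are all made up for the example.

  <!-- Crawlable link: a real anchor with an href, not a <div> with a click handler -->
  <a href="/category/widgets?page=2">Next page of widgets</a>

  <!-- Infinite scroll via IntersectionObserver: items load when the sentinel becomes visible -->
  <ul id="results"></ul>
  <div id="load-more-sentinel"></div>
  <script>
    const sentinel = document.getElementById('load-more-sentinel');
    let nextPage = 2;
    const observer = new IntersectionObserver(async (entries) => {
      if (!entries[0].isIntersecting) return;
      // Hypothetical JSON endpoint returning { items: [...], hasMore: boolean }
      const response = await fetch(`/api/widgets?page=${nextPage}`);
      const data = await response.json();
      for (const item of data.items) {
        const li = document.createElement('li');
        li.textContent = item.name;
        document.getElementById('results').appendChild(li);
      }
      nextPage += 1;
      if (!data.hasMore) observer.disconnect(); // stop observing when there is nothing left
    });
    observer.observe(sentinel);
  </script>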

  • Image SEO and visual search have been around for a long time, but why are they becoming more important to marketers?   In this episode of the award-winning Here’s Why digital marketing video series, Jess explains changes Google has made to their search result pages to show more visual content and how that may impact rankings.  Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why Resources Google at 20: A Shift from Text to Images See all of our Here’s Why Videos | Subscribe to our YouTube Channel Transcript Eric: So, Jess, image SEO and visual search have been around for a long time. Why are they becoming more important now? What’s changed recently?  Jess: In a macro sense, the technology surrounding image hosting, image recognition, visual search, and that kind of thing has really improved. Image processing has become faster and you can get better quality images. And Google has noticed. In the “Next 20 Years of Google Search” post, Google signaled a switch from text to a more visual way of searching. You can see this with their commitment to a much more visual mobile SERP (Search Engine Result Page).  Eric: A lot of these changes have happened over the last year. What changes have you seen most recently?  Jess: Some major changes have been with Google Lens, SERP experiments and changes, the Google Discover feed, and Google Collections.  Eric: Tell us about Google Lens.  Jess: Lens is Google’s built-in image recognition and search product. It’s accessible through the Google app and it lets you search for objects, image first. Say I want a version of a shirt—I can just take a picture of it on my phone and search for it online.  Eric: And we’ve also seen it in Discover and Collections. Both are Google services. Discover shows a feed of topics related to the user’s interests, and Collections lets the user save search results to boards. It’s kind of like Pinterest in that way. Both display search results with large visuals, titles, and then short amounts of text. They’re usually extremely visual-first, especially compared with traditional SERPs. So how is this showing up in the SERPs?  Jess: We’ve seen massive fluctuations in visuals in the SERP results. Image thumbnails, increased importance of images on the page, all that kind of thing. But the million-dollar question is, “Does this impact rankings?”  Eric: Probably. Maybe. Well, we don’t know directly, and we don’t know how much, especially when compared with other ranking factors. But recently, I did have a chance to talk with Bing’s Fabrice Canel, who confirmed the concept that a page with a high-quality, relevant image on it could be seen as a higher-quality page as a result. And as for Google, we know they also care about a user’s experience. Having relevant, well-optimized images can create a much better experience than just a big block of text. We do know that speed is a ranking factor and is clearly very important to Google. Won’t images slow down your page? Maybe that would impact rankings.  Jess: You can use good compression and next-gen image formats like WebP and JPEG 2000. But you can also think about the speed of the information making its way to the user. In that way, images are speed.  Eric: Can you explain?  Jess: You can explain what the Mona Lisa is in 1,000 words, or you can just show what the Mona Lisa looks like.  Eric: If images are important, how can publishers best implement images on their pages?  
Jess: The usual rules for image optimization still apply. Make sure your images are a good size, use alt text correctly and accurately, and make sure that your images are of good quality. Beyond that, for speed, you can try implementing lazy loading while still making sure Googlebot can see your images. Try next-gen image formats and use unique images. And even run your images through the Google Image Recognition API to see if it sees what you want it to see.  Eric: Images can be useful in different ways for different niches. You have to think about how your images can be used for users to find you—and then how they can help your users once they have found you. eCommerce sites, for example, should make sure their products are discoverable using a reverse image search. Financial pages should use images and visual storytelling to help their users understand their text, as well.  Jess: Yes, exactly. You can use images to stand out in the SERPs, help your users take advantage of visuals, and take advantage of search features like Collections and Google Discover.  Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why See all of our Here’s Why Videos | Subscribe to our YouTube Channel
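As a hedged illustration of the on-page advice above (descriptive alt text, a next-gen format with a fallback, explicit dimensions, and lazy loading), a product image might be marked up roughly like this; the file names and dimensions are invented for the example.

  <!-- Illustrative sketch: WebP with a JPEG fallback, descriptive alt text,
       explicit dimensions to avoid layout shift, and native lazy loading -->
  <picture>
    <source srcset="/images/red-canvas-sneaker-800.webp" type="image/webp">
    <img src="/images/red-canvas-sneaker-800.jpg"
         alt="Red canvas low-top sneaker with white rubber sole, side view"
         width="800" height="600" loading="lazy">
  </picture>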

  • Google’s John Mueller confirmed that Google has not made use of rel=prev/next tags for some time. But should we still implement pagination? In this episode of the award-winning Here’s Why digital marketing video series, Eric Enge explains why pagination is still important and how you should implement it.  Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why Resources Pagination Canonicalization & SEO: Your Technical Guide See all of our Here’s Why Videos | Subscribe to our YouTube Channel Transcript So recently, Google’s John Mueller tweeted that Google has not made use of rel=prev/next tags for some time. My assessment is that the reason they did this is that the quality of the tagging web developers were using was probably poor on average. This is actually a parallel to what happened with rel=author tags back in 2014, when Google discontinued support for those. Back at that time, we actually did a study on how well those were implemented by people at the time. We’ll share that in the show notes below. That study showed that 71% of the sites with prominent readership made no attempt to implement authorship or implemented it incorrectly. Many of those who had implemented it didn’t understand exactly how to do it and they just got it wrong. That said, what should we do to support paginated page sequences now? If you have prev/next tags, you could still use them on your page if you want. Google won’t use them. Bing might use them—we don’t actually know for sure. But if you are going to keep them on your pages, make sure they are implemented correctly. You do have to take the time to learn how to follow the specs carefully and get it right. Putting aside the prev/next tags for a moment, let’s think about how you should implement pagination otherwise on your page. Our first preference is to implement that pagination in clean HTML tags that are visible in the source code for the pages on your site—something that is easy for the search engines to parse. The second choice would be to implement it in a way that isn’t visible in the source code, but where you can actually see it in the DOM, or Document Object Model. That means that your links are going to be anchor tags with a valid href attribute, not span or button elements with attached JavaScript click events. Paginated pages should also canonicalize to themselves—that’s a good reinforcing signal. These are the things that you need. The reason why this is still important is that pagination is something that still matters to users. If you’ve got 200 products in a particular category, you probably don’t want to show 200 products on one single page. Breaking that up into many pages is actually a very good way to make the content more parsable and readable and usable for users. This is really why pagination is still important. But make sure you get that pagination implemented the correct way as I’ve outlined in today’s video. Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why See all of our Here’s Why Videos | Subscribe to our YouTube Channel
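A minimal sketch of what the advice above could look like in markup, with invented URLs: plain anchor-tag pagination links that appear in the HTML, a self-referencing canonical on the paginated page, and optional rel=prev/next link tags kept only if they are implemented correctly (Google ignores them, other engines may not).

  <!-- Illustrative pagination markup for page 2 of a category; all URLs are hypothetical -->
  <head>
    <link rel="canonical" href="https://www.example.com/widgets?page=2">
    <!-- Optional: Google no longer uses these, but they do no harm when implemented per spec -->
    <link rel="prev" href="https://www.example.com/widgets?page=1">
    <link rel="next" href="https://www.example.com/widgets?page=3">
  </head>
  <body>
    <nav aria-label="Pagination">
      <a href="https://www.example.com/widgets?page=1">Previous</a>
      <a href="https://www.example.com/widgets?page=1">1</a>
      <span aria-current="page">2</span>
      <a href="https://www.example.com/widgets?page=3">3</a>
      <a href="https://www.example.com/widgets?page=3">Next</a>
    </nav>
  </body>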

  • On May 2, 2019, Perficient Digital hosted the third annual Next10x conference in Boston. The one-day agenda was packed with relevant, valuable digital marketing and SEO information and networking breaks. It included 12 industry speakers and had a strong focus on two areas: the future of digital marketing, and things that you can do right now to grow your business. Many of the industry’s top speakers came and shared their knowledge, expertise, and insights. Didn’t get a chance to join us this year? No worries – today’s post will provide you with a recap of the top takeaways from the day.
Sign me up to stay in the loop about Next10x 2020
Top 5 SEO Opportunities
Eric Enge, General Manager, Digital Marketing, Perficient Digital
SEO is about giving Google what they want — a great user experience based on intent. Our learnings from the algorithm updates show that consistent updates raise the rankings of sites that meet user intent. It’s in our interest as publishers to align with Google’s goals. As a result, here are the five biggest opportunities for SEO in 2019:
High-quality content – Google recognizes that user needs are complex and unique to each user, and its algorithm updates are focused on surfacing sites that offer a depth and breadth of content likely to satisfy those needs. In short, publishing high volumes of content (when compared to the competition) can cause your organic search traffic to soar.
Promote content effectively – You can have the world’s greatest website, but you won’t get much traffic if no one knows about it. Promote your site, drive high levels of visibility to what you’ve created, and get cited and referenced across the web. Links still matter a great deal, and they remain a big key to SEO success.
Speed matters – A one-second mobile delay can reduce conversions by up to 20%, and 53% of users abandon pages that take more than three seconds to load. Yet, the average page takes more than 12 seconds to load on mobile. Find ways to speed up your site and you’re likely to see great results. One approach to consider is to implement accelerated mobile pages (AMP), a progressive web app (PWA), or both (a PWAMP!).
Publish original, high-quality images – Searching with a camera is the next big phase of search. Original, large, clean, and optimized images directly related to the site will offer users a better experience with your content and open the door to new traffic opportunities, such as traffic from Google Discover.
Invest in voice – Users are becoming more and more comfortable speaking to their devices. Personal assistants will be the driving applications behind voice usage. As a publisher, the biggest opportunities are for those who create personal assistant apps, such as an Alexa Skill or an Actions on Google app. These will advance from the scripted conversations available today to fully cognitive conversations.
Make Your Mobile Site Fly with AMP
Ben Morss, Developer Advocate, Google
Speed is everything. It matters to users across the globe. It is even more critical in a world where most users have 3G connections or slower (40% of connections worldwide are 2G). Here in the U.S., delays in page load times significantly impact user engagement and conversions on your site. AMP is an open source project that provides an industry-standard approach to speeding up your pages. Based on a collection of web components built on HTML, AMP provides some JavaScript functionality like menus and image carousels.
AMP also includes these key aspects: it discourages or bans features that slow speed, provides a stable layout that eliminates distracting ads, and only loads content when it’s needed. Originally, site owners that adopted AMP created an HTML/JavaScript version of their site and then an AMP version that was used as an alternate mobile experience. Today, more and more implement AMP as the standard (and only) version of their mobile pages. In general, most sites can largely be re-created in AMP, which can support visually rich experiences. Some exceptions remain but are rare. Checkout pages are one of the few page types that still usually require too much JavaScript to translate to AMP. Ben shared a case study of an eCommerce site in India that saw a 60% improvement in speed and a 40% reduction in bounce rate. PWAs create an app-like experience on the web, and adoption of these is spreading. Microsoft is actively looking for PWAs to feature in their app store, and Chrome has started launching PWAs for PCs, with Macs hopefully soon to follow. Consider the key aspects of PWAs:
If your site is developed with a PWA, your normal web pages behave like a smartphone app when accessed via your phone, eliminating the need to develop a separate code experience for phones. This drives rapid adoption — since all users who access your site get the PWA, maintenance and development are simplified.
A core component of the PWA is the Service Worker, which actively preloads content prior to a user requesting it. As a result, the page they access next is often preloaded onto their phone even before they request it, resulting in great increases in speed.
The Future is Conversational and Visual
Duane Forrester, VP of Industry Insights, Yext
Trends are driven by platform change. It’s important to have these new devices and platforms in your life to understand how users are searching and what content they are consuming. Smart speakers and personal assistants are integrating with everyday life, including in houses, cars, and during a user’s day-to-day routine. Your brand has numerous audiences and consumer touchpoints. Search engines want consistent and reliable data across touchpoints to determine how to serve up a reasonable and expected answer. Seventy-three percent of high-intent traffic (someone intending to enter a business/make a purchase) happens off-site. Most customers never visit a homepage because search intent takes them to other channels and specific landing pages. It’s important to manage entities (companies, events, people) as users become more
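Since the recap above highlights the Service Worker as the PWA component that preloads and caches content, here is a minimal, assumption-heavy sketch of how a page might register one; the file name sw.js is invented, and the worker script itself (which would pre-cache URLs in its install event and answer fetch events from that cache) is only described in the comments, not shown.

  <!-- Hypothetical sketch: registering a service worker so repeat visits can be
       served from a local cache (sw.js is an assumed file name, not a real asset) -->
  <script>
    if ('serviceWorker' in navigator) {
      // The worker script (not shown) would pre-cache key URLs during its
      // install event and respond to fetch events from that cache.
      navigator.serviceWorker.register('/sw.js')
        .then(() => console.log('Service worker registered'))
        .catch((err) => console.error('Service worker registration failed', err));
    }
  </script>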

  • Consumers want accurate, reliable, easy-to-understand information. Can they trust your content? In this episode of the award-winning Here’s Why digital marketing video series, Eric Enge explains why It matters who creates your brand content.  Publishing Note: Starting with episode #215 scheduled to publish on May 20th, the series will feature Eric Enge and a variety of select industry guests. After episode #215, the publish schedule will be every other week. Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why Resources See all of our Here’s Why Videos | Subscribe to our YouTube Channel Transcript Mark: Eric, why do we always say that our content needs to be created by subject matter experts? Eric: That’s a great question, Mark. I think the big key is understanding what the content is being used for. At Perficient, our focus is usually on developing content for content marketing purposes, to build our reputation, increase our rankings in search, and increase our audience. With all those things in mind, you have to realize that the kind of thing we’re trying to do is really thought leadership oriented. You can’t just expect anybody to create that content for you. You need someone who actually knows the topic really well, or else our audience won’t accept it. Mark: You’re saying you should never use just copywriters? Eric: First of all, not exactly. I mean, there are plenty of good roles for copywriters. There’s maybe a lot of content on your site which is really simple, product descriptions or something like that, where you don’t need a true subject matter expert. I think the big key, in that case, is to give them the time to research the topic and be able to write intelligent stuff about whatever they’re addressing. But you can’t expect them to do thought leadership level content in whatever your marketplace is. You can’t just give someone 60 minutes of time, and suddenly, they’re a leading expert on the topic. It really doesn’t work that way, but there are still many ways to leverage the skills of copywriters. Mark: Okay. Can you give an example where SME, subject matter expert level writers are required? Eric: Sure. One is, if you’re trying to build a section in your site, like a content hub with thought leadership level advisory content. These really work best if you answer common user questions and address their needs related to whatever your market space is. This typically requires a pretty high level of expertise to execute really, really well, particularly if you want to create a resource that others might actually link to. So, this might be a wide array of great, helpful articles or a video series, like “Here’s Why”. Hmm, that sounds like a great idea! Or user surveys or other types of research. This level of content really requires a subject matter expert level of, well, expertise. Mark: Okay. How about another example? Eric: If you engage in some level of off-site content marketing–so for example, I publish regularly on Search Engine Land, a column. This provides great visibility for our brand, which is awesome, but Search Engine Land isn’t going to let me publish on their site unless I know something about the topic. So, this is a case where guest posting really makes sense. It’s good for visibility and really getting exposure to your target audience. You’ve got to use this tactic with care, though, because there can be too much of a good thing. 
So, focus your efforts on publishing in places that have sizable audiences and that are of direct interest for your business to be in front of.  Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why See all of our Here’s Why Videos | Subscribe to our YouTube Channel

 

Most Read Posts

Enjoy some of our posts that get the most attention and readers: