The Digital Marketing Excellence Blog | Perficient Digital


In 2018, Perficient acquired Stone Temple Consulting, an award-winning, Boston-based digital marketing agency with $9 million in annual revenue. Read more about the acquisition here, and below you’ll find blog posts by our digital marketing experts from Stone Temple.



Recent Posts
  • The world of SEO is full of misconceptions and myths. Anyone can claim to be a search expert, publish any theory, and call it the latest, greatest search (or SEO) technique. Chasing these wild ideas can be a major waste of time, or worse, harmful to your business. On August 8, 2019, Eric Enge (General Manager, Perficient Digital) and Rand Fishkin (Founder, SparkToro) presented a webinar, Eric Enge and Rand Fishkin Debunk the Biggest Myths in Search. They used data, along with their own experience, to debunk today’s top search myths. Here are just a few:
    - You can’t get value from no-click searches
    - Featured snippets cost you traffic
    - Using schema always increases traffic
    - Paid search is bigger than organic
Find out how Eric and Rand debunk these common misconceptions by checking out the webinar below or reading the transcript. Original live webinar aired on Thursday, August 8, 2019 at 1:00 PM ET.
Transcript
Eric: Hello, everybody. My name is Eric Enge. I’m a General Manager at Perficient Digital on the digital marketing team. With me is my longtime friend Rand Fishkin. I think you probably all know him, but he was the founder of Moz, is now the founder of SparkToro, and is the author of the book “Lost and Founder.” Forgive me for putting it this way, Rand, but he’s also a general industry legend. Thanks for joining us today.
Rand: It’s my pleasure, Eric. I’m very excited to bust some SEO myths with you.
Eric: Yeah, this should be fun. What we’re going to do is run through nine popular myths. We’re going to go pretty quickly because we have a lot of data to share in a short amount of time. Hopefully, you’ll get a ton of value out of this.
Rand: And we want some Q&A time at the end.
Eric: Yes, we do. And if you want to tweet stuff, we suggest you use the hashtag #SEOMythBusters.
Rand: #SEOMythbusters.
Eric: You’ve got it. I’ve always wanted to be a ghostbuster, mythbuster, or something like that. Are we ready to get rolling?
Rand: Let’s roll.
Eric: Excellent. 
So let’s start with our first myth: you can’t get value from no-click searches. Another related myth is that traffic that goes to Google properties is of no use to you. So, what’s the reality?
Rand: The reality is a bit complicated, and frustrating. Eric, let’s look at a search on the population of Boston or the weather. It’s absolutely the case that the sites answering those queries have lost a lot of traffic on these head-of-the-demand-curve terms. However, I think the reality is that when this happens, many of these searches and many of the searchers are not the most lucrative or engaged ones. They’re looking for quick, instant answers. Because of that, there’s not a tremendous amount of value that can be extracted from them. Google tends to take away a lot of the traffic that is of the least value to a website. And so I think some of these concerns have been overblown. I also wanted to specifically mention the speed test one. This is an excellent example, because Ookla, which runs a speed test site, has continued to benefit massively from its ranking (the bottom result there, the second one on mobile) despite the fact that Google has its own speed test. I think that’s because people trust the speed tests from Ookla more. On many search queries related to speed tests, Ookla comes up, but the Google OneBox answer doesn’t. To your point, if Google is sending traffic to its other properties, whether it’s Google Books or Shopping, Google Images or Maps or YouTube, you can still benefit from being in those rankings and controlling the information that’s seen through Google.
Eric: Yes. And YouTube in particular is a very large search engine in its own right. You can see that a significant number of YouTube results show up in organic search results. So it’s actually another way to get back into Google if you have a very effective video strategy within YouTube.
Rand: Absolutely right. 
Eric: Our next myth is that featured snippets cost you useful traffic. In the data from one site we looked at in detail, where they earned a featured snippet, it was pretty apparent on what day that happened. Traffic really took off for them in a very big way on this particular page. It’s just one page, but traffic more than doubled. We’ve seen this kind of data in many, many different scenarios. This is just one example, which isn’t to say that there aren’t situations where featured snippets do cost you traffic. I think there are; I just happen to think they occur more where it’s a benefit, similar to your Ookla example.
Rand: Yes. I think about it like there’s a flow chart in my head that asks, would I rather have the featured snippet than my competitor? If that’s the case, I want to pursue it even if it costs me traffic, because I want to own and control the information that people find when they come to Google. I worry that some SEOs are biased against it because they don’t want to cost themselves traffic, as opposed to thinking about the bigger picture, which is that this is a competitive landscape. You don’t have a ton of options here. Either you’re going to get it or someone else is, and I’d rather it be you. When I was at Moz, we had this with searches like meta description and title tag length. Moz would get the featured snippet, and we were shocked to see that for title tag length it actually increased our traffic, even though the number was right in there.
Eric: That’s interesting. And as you said, where it does cost you traffic, that was probably low-value traffic anyway.
Rand: Exactly.
Eric: All right, on to our next myth. This one is

  • Eric Enge and Barry Schwartz

    Google makes numerous search algorithm updates each year. Those updates may affect our sites’ rankings one way or the other. But do we really know how? In this episode of the award-winning Here’s Why digital marketing video series, Barry Schwartz, Search Engine Land’s News Editor, together with Eric Enge, explains the different kinds of Google updates and why it is really important that you understand Google’s algorithm updates. Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why
Resources
Barry Schwartz’s Search Engine Roundtable Blog
See all of our Here’s Why Videos | Subscribe to our YouTube Channel
Transcript
Eric: Hi everybody, Eric Enge here, and I’m really pleased to have Barry Schwartz with me today. We are going to talk about search algorithm updates. Say hi, Barry.
Barry: Hi, Barry.
Eric: Well done. Thank you for the ritual stupid joke. That’s a good way to start this off. But anyway, since we’re going to talk about algorithm updates, why don’t we talk for a moment about some of the moldy oldies: Penguin and Panda. How do core updates differ from those kinds of algorithms?
Barry: I think it’s easy when you look at the Penguin algorithm, which was more focused on links, whereas most people aren’t saying that these core algorithms are link-focused. Penguin really was about targeting people manipulating Google’s search algorithms: faking your link profile, faking that your PageRank should be higher than it should be, faking that your authority should be higher than it should be. Panda was more about the site’s content being something that you could trust: is it authoritative? It goes back to that list of 23 questions that we all posted back when people wanted answers about what they should do to fix their sites around the Panda update. 
Most people are comparing the Panda algorithm, which according to Google is now built into the core algorithm, to these core updates, where everybody’s asking, “Is this site trustworthy?” and “Is the site content something that people would stand behind?” I think when you look at Panda, it’s probably more related to these core updates. I think it’s just terminology Google is using in order to basically say, “Hey, we’re not really confirming updates, but yeah, we did an update to our core algorithm.” This way they don’t have to come up with new names every day, like Fred or Florida or whatever you want to call these updates.
Eric: Well, the Fred name was your fault, right?
Barry: Actually, it’s Gary’s fault. He named his fish Fred.
Eric: Yes, but you spurred him into doing that. Putting that aside for a moment, the Penguin algorithms are actually still around, right? They’re just incorporated in the core algorithm now.
Barry: Google doesn’t talk about it much. They said they incorporated Panda into their core updates, at least. I’m not sure if they incorporated the Penguin algorithm into their core updates. But Panda they said they did, right? Maybe Penguin also.
Eric: Yes, both.
Barry: Both are incorporated?
Eric: Yes.
Barry: Penguin is definitely real-time. Panda is less real-time, according to the last update, but we’re not really getting any information from Google about Panda or Penguin in the past couple of years. I think they just stopped updating them, or they’ve embedded them into the core algorithm and it’s Google’s way of not talking about them. They just say, “Yes, it’s a core update.” That’s it; we don’t know much about it now. 
Eric: Given that you’re sort of a journalist for the industry, I think a really good question to ask you is, what do you see SEOs typically doing in response to updates?
Barry: Most SEOs who are talking about these core updates are looking at the site as a whole and asking, “What could I do to this site to improve it from a user perspective?” Making the site’s user experience better, making the content better, making the site appear more trustworthy and authoritative, around the E-A-T topic. A lot of SEOs I’m talking to are also looking at whether the authors writing a blog post or a piece of evergreen content have the experience and history behind them to say, “Yeah, this is actually authoritative content written by an expert on the topic,” as opposed to just some low-level blogger regurgitating whatever John Mueller says on an SEO blog called Search Engine Roundtable.
Eric: Right. I think that’s good advice in general. And there is that low-level blog called Search Engine Roundtable, I understand, but actually I think the guy who writes it is pretty good, so maybe you ought to read it. This whole idea of the quality of the content you’re creating, and having the right expertise behind the writing, I think is a really good one. I think that’s a good area for people to focus on.
Barry: I just had a meeting with somebody at a high level, just talking about their website, and they asked, “Should we invest time and effort into our content?” I said, “Yes, of course.” You should even invest in building the personalities of your employees through the content of your website, because that’s not going to hurt you, and it might help you. 
If you believe everything people are saying around content and authority, that definitely will help with Google, but it also might help you get more publicity on news channels. You might be interviewed by CNN or Fox or, to take it to an extreme, you might get on CNBC about a specific topic, or be interviewed in some journal related to your topic. It just helps you overall by going ahead and investing

  • Business Meeting

    Recently, I reached out to a number of enterprise SEOs to poll them on the biggest issues they face. Participants included senior SEO people from Fortune 500 companies and other large enterprises. The question was free-form, so each person could submit whatever they saw as their biggest issue. I then mapped the responses into categories. Here are the results, in a nutshell: close variants of “Getting buy-in from the rest of the enterprise” were easily the most common problem. If you’ve had the opportunity to work with large enterprises, this won’t be a big surprise. It’s a common issue, driven by the fact that the great majority of people working in a large enterprise typically don’t have any education about, or exposure to, SEO. Basically, they may consider SEO to be some sort of mysterious black art. To make matters worse, the results of SEO campaigns normally take some time to emerge and can be difficult to measure directly. Executives may be accustomed to direct-response mediums such as pay-per-click, where measurement is far easier and the results come nearly immediately. This is unfortunate, as the latest Jumpshot data from Rand Fishkin at SparkToro indicates that users are still far more likely to click on an organic result than a paid one. The above chart shows the data for both the US and EU markets. When we aggregate the data across desktop and mobile, in the US a click from the SERPs is still 6.16 times more likely to be organic than paid. That by itself should create a lot of motivation to focus energy on SEO campaigns! If that’s not enough, show them a view that illustrates how your competitors are making hay in SEO while you’re not. That view could look something like this: Demonstrating that your competitors are making a lot of high-ROI dollars from organic search can often be an effective tactic. However, it may still not be enough. 
The uncertainty around what to do, how to do it, and when it will show results can hurt your efforts to educate senior managers, even if they are receptive to the idea. People who approve budgets typically want a strong sense of the ROI they should anticipate in return. This leaves us in a tough spot. What I usually recommend is to find one or two senior executives who are willing to listen. If you can get them on board with the understanding that SEO is a solid investment, you’ll have someone willing to advocate for it even when you’re not in the room. This can require time and effort, but the investment is well worth it. In many cases, the best way to start is to show examples of it working. Keep it simple. For example, show them a page that you intend to optimize, along with its current rankings and organic traffic. Pick something where the proposed changes are simple, such as updates to title tags and on-page content. Then do the optimizations and show them the results once they come in. You can also do this after the fact: choose pages that you’ve already optimized and show the benefits you obtained. A simple case study like this can go a long way toward helping you find a champion for your efforts. Regardless, finding a way to get buy-in for SEO is critical to your success. Without the funding to do your work, or the ability to get development to make the desired changes, you’re going to be hard-pressed to show any benefits.

  • Image Recognition Accuracy Study - Featured Image

    One of the hottest areas of machine learning is image recognition. There are many major players investing heavily in this technology, including Microsoft, IBM, Google, and Amazon. But which one does the best job? That is the question that we’ll seek to answer with this study. Image recognition engines covered in this study:
    - Amazon AWS Rekognition
    - Google Vision
    - IBM Watson
    - Microsoft Azure Computer Vision
In addition, we had three users go through and hand-tag each of the images we used in this study. The total number of images used was 2,000, broken into four categories:
    - Charts
    - Landscapes
    - People
    - Products
Our research team at Perficient Digital used two different measures to evaluate each engine:
    1. How accurate the tags were from each of the image recognition engines (for 500 images). We’ll call this the “Accuracy Evaluation.”
    2. Whether the tags from the image recognition engine are the best match for describing each image (for 2,000 images). This is referred to as the “Matching Human Descriptions Evaluation.”
If you want to see how we collected the data and did the analysis, please jump down to the bottom of this post to see a detailed description of the process used. Let’s dig in!
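Evaluations like these need some way to score an engine’s tags against the human-assigned tags. The study’s exact rubric isn’t described in this excerpt, so here is a minimal, hypothetical Python sketch of one reasonable approach: treat both tag lists as sets and compute precision (how many engine tags the humans agreed with) and recall (how many human tags the engine found).

```python
# Illustrative only: this is NOT the study's actual scoring method.
# It scores one image's machine tags against its human tags using
# simple set precision/recall after case-normalization.

def tag_scores(engine_tags, human_tags):
    """Return (precision, recall) of engine tags vs. human tags."""
    engine = {t.lower() for t in engine_tags}
    human = {t.lower() for t in human_tags}
    if not engine or not human:
        return 0.0, 0.0
    overlap = engine & human  # tags both the engine and humans produced
    return len(overlap) / len(engine), len(overlap) / len(human)

# Hypothetical tags for a single image:
precision, recall = tag_scores(
    ["Dog", "grass", "outdoor"],       # made-up engine output
    ["dog", "grass", "park", "ball"],  # made-up human tags
)
print(precision, recall)  # 2 of 3 engine tags match; 2 of 4 human tags found
```

Averaging these per-image scores across the 500 (or 2,000) images would give a single comparable number per engine, which is roughly what a leaderboard-style comparison needs.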

  • Eric Enge and Google's Martin Splitt Discuss SEO in Here's Why Video Series

    Google made an interesting announcement at Google I/O in early May 2019: they now prefer responsive sites over mobile subdomains and dynamic serving. In this episode of the popular Here’s Why digital marketing video series, Google’s Martin Splitt joins Eric Enge and explains why Google now prefers responsive sites. Splitt shares some key metrics to consider when developing mobile sites. Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why
Resources
See all of our Here’s Why Videos | Subscribe to our YouTube Channel
Transcript
Eric Enge: Hey, everybody. I’m Eric Enge, and today I’m pleased to let you know that Martin Splitt is joining me from Google, where he’s a Webmaster Trends Analyst based out of the Zurich office. Say hi, Martin.
Martin Splitt: Hello, everyone. Very good to be here. Thank you very much for having me, Eric.
Eric Enge: Absolutely. At Google I/O, you said something that was really interesting, that seemed different from what we’ve heard from Google before. Historically, there’s definitely been a sense that your recommendation is that people build responsive sites rather than mobile subdomains, and presumably also rather than dynamic serving. But at Google I/O, something you said really started to sound like you’re pushing the recommendation to go responsive a little more formally and forcefully. Did I hear that right, and why?
Martin Splitt: Yes, we did say something along those lines. I wouldn’t say that it necessarily changes what we have been recommending with responsive web design. But as the last year went by, we noticed that many, many people are having issues, facing challenges, or struggling with getting their m-dot URLs right. It’s another moving target. So many things can go wrong there. 
You can canonicalize it incorrectly, you can forget your structured data on one of the two sites, you can forget to link your m-dot site, you can get your hreflang wrong. There’s just so much that can go wrong, and then you have to deal with two different moving targets. It seems to me that switching to responsive sites is not just recommended, but probably also a lot easier. We just wanted to highlight that before you get yourself into hot water and struggle with the m-dot domain. If you have one that works, okay. But before you do that, if you want to become more mobile-friendly, then probably consider investing in responsive web design.
Eric Enge: If you have it right, then maybe there isn’t a great deal of urgency to switch. But my observation is that even if your mobile subdomain is working really well now, it’s just a little harder to maintain, too, right?
Martin Splitt: It is.
Eric Enge: So, would you actually recommend that people who have mobile subdomains, or even dynamic serving setups, invest in converting to responsive?
Martin Splitt: If you have a working setup and don’t have any pain points, definitely do not change your running system. But if you are struck by the additional maintenance overhead, or you see that it’s hit-or-miss for you, then I would say responsive is a good long-term investment.
Eric Enge: Got you. There’s just one other mobile topic I want to bring up, which wasn’t really covered at Google I/O: the whole issue of speed. I’m sure it was covered at Google I/O, but I’m not pulling something specific out of it. Talk a little bit about the importance of speed in the mobile environment.
Martin Splitt: Mobile usually uses networks that have higher latency, or are a lot more flaky than stable Wi-Fi or cable connections, so speed is a much, much more important factor. Oftentimes, we see that developers are not testing under real-world conditions. 
It is sometimes just really hard to test under real-world conditions on mobile phones. Mobile phones have a different CPU architecture (not all of them, but some of them), and we do see that mobile performance, especially graphics performance as well as CPU performance, is just not the same as on desktop. We think it is important to emphasize that you should make your website as fast as possible, especially because mobile would suffer if you don’t. When we say fast, the question is, what do we mean by fast? You don’t just want to make sure that your server delivers the bytes really quickly over the network, because the problem might start once the bytes have left your server: it may be the transmission path, or the device itself, that is slow to parse the data as it comes in. You probably also want to look into things like time to interactive. How long does the phone hang until I can actually start interacting with the page? When do I actually start to see the content? If you’re using client-side rendering, for example, the problem is that your content is blocked until the entire JavaScript has arrived and been parsed, which taxes the CPU. If it’s compressed, which it should be (that’s a very important factor), your compressed JavaScript needs to be uncompressed, which is usually pretty fast. But then it needs to be parsed: is this actually JavaScript? Is this valid JavaScript? And it needs to be executed, and then it starts generating the content, which again puts additional work on the CPU, and which will also make the phone respond to input much, much later. So that’s not a good experience. You should definitely look into making your site faster. That’s generally a really, really good investment.
Eric Enge: For years now, I’ve seen people obsessing about things like time to first byte. What you said was time to interactive. That seems a lot

  • Eric Enge and Google's Martin Splitt Discuss SEO in Here's Why Video Series

    Google made several announcements at Google I/O in early May 2019 about search and SEO. Images were one of the topics mentioned. In this episode of the popular Here’s Why digital marketing video series, Google’s Martin Splitt joins Eric Enge and explains how visual search is vital to SEO and why Google places emphasis on high-resolution images. Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why
Resources
See all of our Here’s Why Videos | Subscribe to our YouTube Channel
Transcript
Eric: Hey, everybody. I’m Eric Enge. Today, I’m pleased to tell you that I’ve got Martin Splitt from Google joining me. He is a Webmaster Trends Analyst based out of Zurich. Thanks for joining us today, Martin.
Martin: Thank you very much for having me, Eric, and hello, everybody.
Eric: Today we’re going to talk about the announcements at I/O that related to images and visual aspects of search. To start, maybe you could talk a little bit about the announcement made about high-resolution images.
Martin: As you know, we now have the new Discover feed. If you’re using an Android device, you have it on your home screen. If you’re using Google on mobile, you will see it there, and the search app on Android has it as well. It can drive a lot of traffic, but it is also very keen on having good visuals. The question then is, what kind of visuals should we be using? We would like to use high-quality images that are a little bigger than usual; that would be nice. But we want webmasters to have full control over what they’re giving us to show in Discover, and we also want to use these images on other surfaces. We have smart home devices with displays now: we have the Assistant, and we have many different surfaces that can show visual content and promote your site, basically. 
We announced the upcoming possibility to opt in to sharing high-resolution visual content with us: basically, images in high resolution that we can use on different surfaces and different devices, including the Discover feed and other features. We have no timeline yet for when that opt-in program will start.
Eric: So how do you actually opt in or enable the feature?
Martin: The way to opt in is probably going to be based around structured data. You would use your regular optimized, fast images on your website, so that your web content loads fast on user devices, but you would include a little bit of structured JSON-LD that points us to a high-resolution image that we can use. And probably there will be settings around that in Search Console as well.
Eric: Got it. You mentioned that this will be part of the Discover feature, for example. Do you see the high-resolution images ever coming into play via regular Google Search or image search?
Martin: Probably, but at this point, I only know that we will do this in the Assistant, especially the smart displays we have with the Assistant, and in Discover, but there’s probably more to come.
Eric: Got it. And how about 3D images? That’s the other thing you talked about at I/O.
Martin: Yes, absolutely. Right now, we are working with a few partners on bringing AR capabilities and 3D models to Google Search. There are plenty of interesting use cases. If you have educational use cases, or if you have things like furniture or real estate, those might be really interesting to try out with a more visual approach, as these are, by definition, very visual things. Lots of people are visual learners, and you definitely want a spatial understanding of, let’s say, a piece of furniture. There are use cases where it makes sense, and we are piloting that. There are multiple teams working toward a common goal in this case. 
We have teams that work with WebVR and AR (actually, it’s called WebXR, which is AR and VR in one specification) to bring these possibilities to the web. There are also teams working to make 3D models smaller; that would be Draco mesh compression. We also work with other teams to make developers more comfortable and to make it easier for them to work with 3D content. And we are participating in the standardization effort that is glTF, which is basically like the JPEG format, but for 3D models.
Eric: That’s awesome. I think the functionality and the things that you’re putting into visual search these days are really amazing.
Martin: Thank you.
Eric: So how important is visual search going to be going forward?
Martin: I think visual search is one of the most underrated search experiences we have right now. I mean, image search can be a fantastic funnel for additional traffic, especially if what you have is a very visual thing. If your product is very visual, if your niche is about visual things (let’s say food, or tourism, or specific kinds of marketing), then I can definitely see visual becoming more important for users as we offer them more engaging and more visual ways to discover and interact with content.
Eric: Great. Thanks so much for joining us today, Martin.
Martin: It’s been a pleasure. Thank you very much for having me, Eric.
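Google had not published the actual markup for the high-resolution opt-in at the time of this interview; Splitt only says it will “probably” be based around structured data. Purely as an illustration of the shape such markup could take, a JSON-LD block using the existing schema.org ImageObject vocabulary (with a made-up URL and dimensions) might look something like this:

```json
{
  "@context": "https://schema.org",
  "@type": "ImageObject",
  "contentUrl": "https://example.com/images/product-2400px.jpg",
  "width": "2400",
  "height": "1600"
}
```

Treat this as a placeholder for whatever format Google eventually documents, not as the announced mechanism.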

  • Eric Enge and Brian Weiss discuss SEO in Here's Why Video - educational video series on Digital Marketing

    User experience is becoming a part of SEO, but why does Google want to use it as a signal in its algorithm? In this episode of the award-winning Here’s Why digital marketing video series, Brian Weiss explains how Google is using user experience to rank pages. Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published. Subscribe to Here’s Why
Resources
See all of our Here’s Why Videos | Subscribe to our YouTube Channel
Transcript
Brian: User experience is becoming part of SEO. Here’s why. Hey, Eric.
Eric: Hey, Brian.
Brian: I have a question for you.
Eric: Okay, what’s that?
Brian: Would you rather have a website that was magically guaranteed to rank number one for every query, or a website that was guaranteed to convert every visitor who came to the site?
Eric: Okay. Now, you’re going to tell me you’re the magical genie who can grant me these wishes.
Brian: That is what it says on my business card.
Eric: It seems like you could do pretty well in either situation, but I bet you have a point of view on this, don’t you?
Brian: I’m glad you asked. If I had a website that converted every visitor, I could get it to rank number one for everything. And as an SEO, I’m probably not supposed to say this, but if you can’t convert visitors to your site, you don’t have anything to optimize.
Eric: But you seem to think that converting customers in itself will lead to better SEO results?
Brian: No, not so directly as that.
Eric: But if you’re satisfying 100% of visitors, you’re probably creating some excellent signals for relevance and overall user experience that maybe Google would be interested in.
Brian: Right, and SEO gives us an interesting lens to look at user experience through.
Eric: It is certainly what Google is trying to optimize for. They want to send users to what they think is the best experience for them. 
Brian: Yes, and if you think about how they’re doing that over time, using human ratings to feed their machine learning algorithms, then over time they may come up with some very interesting signals that we wouldn’t necessarily think of as traditional SEO factors.
Eric: That’s certainly possible. But as SEOs, how do you think we should best use that information?
Brian: I think there are two parts to answering that question. First of all, we can look at the Google results themselves to get clues about what elements Google thinks are important for answering user needs for a particular query.
Eric: That’s where something like our semantic content optimization tool, which analyzes the content on the top-ranking pages compared to yours and tells you what your page might be missing, would actually come in really handy.
Brian: Exactly. Those are the pages that Google thinks are doing the best job of providing relevant answers to the largest percentage of user needs. Ideally, you’d go beyond what they’re doing, but you don’t want to just copy your competition.
Eric: Got it. It’s hard to beat the competition if all you’re doing is copying them.
Brian: Right, but it’s good to know what your starting point should be.
Eric: Didn’t you say there was a second consideration related to how SEOs should respond to Google optimizing the results for user experience?
Brian: Yes, I did say that.
Eric: Are you testing me here, Brian, or what?
Brian: Exactly, Eric: user testing. Now, would you say that you just had a bad experience?
Eric: Yes, I might say that.
Brian: Okay, that’s excellent feedback. I won’t give any more passive-aggressive answers.
Eric: So, user testing.
Brian: Yes. Doing user testing for conversion rate, bounce rate, and time on site is, on the one hand, just basic due diligence at this point, given the impact it can have on revenue. 
But I think it can also help you avoid some of the SEO pitfalls around those not-so-obvious indicators that Google may start using over time as it develops its own user testing and machine learning operation. If users love your site, you definitely have a better chance of Google seeing it as a user-friendly destination.
Eric: Of course, and that user engagement will create signals that Google does pick up on over the long run.
Brian: For sure.

  • A speaker on the main stage at SMX Advanced 2019 in Seattle, WA

    Along with my Perficient Digital colleagues, I had the privilege of attending SMX Advanced in Seattle from June 3-5, 2019. I'm still applying everything I learned from the engaging talks by great speakers from all sorts of backgrounds. Rather than offer a full blow-by-blow of every presentation, I'll focus on my takeaways from the three areas of search that reflected the strongest technical themes of the conference. These clusters seemed to be where the technical SEO eye of Sauron was focused for the week.

1. Structured Data and Semantic HTML

Providing good website structure isn't a new idea. In fact, it's probably one of the oldest themes in web development, but it has been approached with renewed interest at the last few conferences I've attended or watched presentations from (Next10x, Google I/O, and SMX). Structured data and semantic HTML are two distinct elements that have come up increasingly in 2019.

Cata Milos of Bing talked about structured data and schema at length during the "What's New With Schema and Structured Data" talk. Milos explained how Bing sees a web page, highlighting that Bing has evolved to become more visual, just as the web has. Bing has to "visually understand complex documents," breaking them down into primary content (the content the user wants to see on the page), secondary content, and invisible content.

This came up again in the "Periodic Table of SEO Success" panel (with Ginny Marvin, Detlef Johnson, Barry Schwartz, and Jessica Bowman), where we were advised to "start looking at using html5 semantic elements to define different parts of a web page." Barry Schwartz noted that you should "make a website that Google is embarrassed not to rank." Milos also noted in the structured data session that you should use semantic HTML elements, and that they should cascade in the correct order.
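As a rough illustration of that advice, here's a minimal sketch (the page outline and article topic are hypothetical, not taken from any of the talks) of semantic HTML5 elements cascading in order, with a JSON-LD structured data block describing the primary content:

```html
<!-- Hypothetical page skeleton: semantic elements mark out the parts
     of the page, and headings descend in order (h1 before h2). -->
<body>
  <header>
    <nav><!-- site navigation: secondary content --></nav>
  </header>
  <main>
    <!-- primary content: what the user came to see -->
    <article>
      <h1>Choosing a Running Shoe</h1>
      <section>
        <h2>Fit and Sizing</h2>
        <p>...</p>
      </section>
    </article>
  </main>
  <aside><!-- related links: secondary content --></aside>
  <footer><!-- site footer --></footer>

  <!-- Structured data describing the article, in JSON-LD form -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Choosing a Running Shoe"
  }
  </script>
</body>
```

The point of the sketch is simply that the nesting of the markup should tell the same story about the page that a sighted visitor would see.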
One note I found particularly enlightening is that heading tags (h1, h2, h3, etc.) should descend in order, and that order should match how the headings visually appear on the page. Bing compares the HTML tag with what the heading visually looks like. So the most important heading on your page should be an h1, and it should be the largest heading on the page (don't use CSS to resize it!).

Another takeaway was about how Bing decides which structured data surfaces rich snippets: FAQ schema currently has low adoption, and Q&A data is used by only the top 1-2% of documents. Which is more likely to be well used, and useful to surface?

Similarly, Max Prin discussed the utility of structured data. He presented data showing that rich results generally lead to a higher CTR, and that the rich result's message can impact CTR heavily.

What a lot of this comes down to is the idea of "technical quality." We as SEOs talk a lot about quality as a content metric, but focusing on delivering a good technical experience is really important too. The message is to align the syntax of your code with the semantics of your page.

When Frédéric Dubut, lead of the spam team at Bing, and Fili Wiese, a former Googler who worked on spam and manual actions, spoke together on stage about algorithms and penalties, they continued this focus: avoid serving manipulative or misinformed information in search. Wiese and Dubut mentioned recent experiments from SEOs showing different favicons in mobile search results and said that penalizing that practice made sense. Serving an incorrect favicon is a problem similar to serving misinformative structured data markup.

Quick takeaways:
  • Use semantic HTML in concert with your content
  • Use structured data
  • Pay attention to technical quality in line with your content quality

2. Development

Another major theme of this conference was the importance of talking to developers and understanding development.
From the opening talk by Jessica Bowman, Detlef Johnson, and Alexis Sanders, SEOs were encouraged to interact with and understand their developer partners. Detlef Johnson focused on this specifically during the keynote, encouraging SEOs to get into code and understand the new paradigm of the web, and noting that SEO also needs to involve privacy and security.

Perficient's own Eric Enge touched on the more technical side of this in his discussion of mobile-first indexing and the issues that can come up when optimizing for mobile-first SEO. The types of problems found in our crawls of mobile subdomains include missing product pages, broken sitelinks between sites, and ridiculous crawl depth levels. Enge also surfaced notes from I/O, where Martin Splitt strongly suggested using responsive designs rather than mdot URLs.

The JavaScript panel took a deeper dive into the complexities of JavaScript SEO, starting with the mantra "JavaScript is not evil." Hamlet Batista spoke at length on his experience with JavaScript frameworks and provided some great technical insights, noting pros and cons to using Angular for SEO. Among the pros: there's no need for hash fragment URLs (history API paths by default), universal JavaScript is painless, the same codebase on the client and server prevents compliance problems, and there's basic support for titles and meta tags. On the other hand, the support for SEO tagging is very basic, there's no built-in support for structured data or link tags, and absolute and relative URLs can introduce errors.

Robin Rozhon went deeper into how to monitor JavaScript and how to use reporting to improve JavaScript performance. Rozhon suggested switching the user agent to Google Smartphone, then checking render time in the initial state vs. extending the JS timeout vs. allowing the site to load with cookies and storage turned off.
Rozhon reminded the audience to visually verify the content, to ensure that the user experience is aligned with the bot experience.

Quick takeaways:
  • Figure out ways to work with your developers
  • Try using JavaScript frameworks to understand the SEO wins and pitfalls
  • Monitor your pages for technical issues
  • Build responsive sites rather than separate desktop and mdot sites

3. Voice

Max Prin noted that Amazon Alexa uses structured data for local search, so that's

  • Perficient Digital's Eric Enge

    You've heard that content is king, but today, content is more important than ever. Here's why.

Content is king. It's still king and it hasn't really changed. And today, I'm going to show you three case studies that demonstrate that content is more king than it's ever been.

Note: Our future videos will be published on the Perficient Digital channel, so please subscribe there.

Transcript

Content is king. It's still king and it hasn't really changed. And today, I'm going to show you three case studies that demonstrate that content is more king than it's ever been. I'm going to start, though, by talking a little bit about Google's algorithm updates over the past 14-16 months. I'm currently showing a chart that shows all the major updates that were called "core algorithm updates" by Google. It turns out that these updates all had a certain number of things in common. There seemed to be a pretty big focus on user intent and better understanding of user intent. They were looking to lower the rankings of poorer-quality content and raise the rankings of higher-quality content. But another element that I felt really emerged is a much bigger emphasis on the depth and breadth of your content.

So, with that in mind, I want to jump into the case studies and show you some data. Here's the first case study. This is in the addiction marketplace. The first chart shows the publishing volume of one particular vendor in that marketplace. You can see that there are wild fluctuations, but at times we're talking about hundreds of new pieces of content being published every month, some months as high as 700. So, that's the first data point.
Second data point: Let's look at the rate at which this site was adding links, which you see in this chart. The link volume begins to grow rapidly around the same time the content volume started growing. And now for our third chart. This is the SEO visibility from Searchmetrics. You see that it begins to accelerate rapidly in May of 2017. So, it's very interesting to see the correlation between the rapid content growth, the rapid link growth, and how it drove massive changes in traffic to this particular site.

Now let's look at case study two. This one's in the career space. And again, I'm going to start with a chart on the publishing volume for this particular company. The volume was actually moderately heavy in 2017, running about 45ish pieces of content a month. That's pretty significant—one and a half pieces a day on average. But in January of 2018, this scaled into many hundreds of pieces of content per month. So, now let's look at the "rate of links added" chart for this particular company. Here you see that the links did not really scale until around March and April of 2018, when there's a really sharp spike. What that sharp spike is actually showing us is this: it turns out it was due to a redirect of another domain to this particular domain, so a lot of links transferred very instantaneously, if you will. Let's look at the traffic chart for this particular company. The traffic actually scaled very rapidly after the links took off in May of 2018. What I like about this case study is that it shows us that publishing content at volume while the links aren't really growing isn't going to do much for you. Creating lots of great content is a key part of the picture, but if you don't promote it effectively, you're not going to get the right results.

Let's look at case study number three. This one is a consumer retail sales site. Let's start with the publishing volume chart.
This site has been adding content at a heavy volume for a very sustained period of time—consistently in the thousands per month. Now let's look at the rate of links added for this site. This doesn't have as sharp a spike as the second example I showed, or even as dramatic growth as the first example. Yet you do see that links are being added steadily over time, built on top of a very strong base. Now let's look at the traffic for this one. This is again the SEO visibility chart from Searchmetrics. In this particular case, the SEO visibility started at a very high level, but you get continuous, steady growth over time, supported by the strength of their publishing program and the rate at which they're adding links.

I have two more charts for you before we wrap up. This chart is data from a company called serpIQ that shows the correlation between ranking in Google and length of content. You'll see from this chart that there's a clear bias for Google to rank longer-form content. Now, before we go off and say that every page should have tons of content on it, this is very dependent on the context. There are plenty of pages where you don't need a long-form article. I'm not saying every piece of content or every page on your site needs to have a mass of text on it. That's not the point. But from the point of view of informational content, it's very clear that longer form is better. And then another chart. This one's from HubSpot. This data shows that longer-form content actually earns more links. Now you can see how I'm making the connection here and drawing all the pieces together. One last chart. This one's a bonus chart from a

  • ADA compliance and web accessibility are more serious than you likely realize. Consider this scenario: you or one of your clients suddenly receives a letter stating that the website you administer is not ADA compliant and you're facing litigation. Facing litigation? Now what? The best course of action is to proactively review your website for ADA compliance and ensure that it is accessible to people with disabilities before you get into trouble. The level of compliance necessary is outlined in the Web Content Accessibility Guidelines (WCAG) 2.0 (available here). These guidelines are quite detailed, but because they are comprehensive, following them will help you fully comply with the law and insulate your company from litigation.

A good place to start for website ADA compliance and accessibility is the following checklist:
  • Check the current state of your website's accessibility with tools like WAVE and the Google Lighthouse tool (available in the Chrome browser)
  • Ensure that all images have descriptive alt text
  • Provide closed captioning on any videos your site may have
  • Provide text transcripts of any video or audio-only files
  • Give users the ability to pause, stop, or hide any automated content like email signups
  • Use a simpler design, be sure the website isn't overly complex, and provide options for adjusting the size and color of text and content
  • Be sure your website supports keyboard navigation (think navigating between elements with the arrow and tab keys)
  • Provide support features so a person with a disability can contact the webmaster and receive a response
  • Be sure any forms on your website have instructions for their use and that each form element is labeled with clear and understandable text; also, use the id and label HTML elements on form items

Once the above checklist has been followed, it is advisable to have a legal professional review your website in light of the WCAG 2.0 guidelines.
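To illustrate a few of the items above, here's a minimal sketch (the image, field names, and form endpoint are hypothetical) of accessible image and form markup:

```html
<!-- Descriptive alt text tells screen-reader users what the image shows -->
<img src="team-photo.jpg"
     alt="The support team at their desks in the Boston office">

<!-- Each form control gets an id, and a label whose for attribute points
     at that id, so assistive technology can announce the field's purpose -->
<form action="/contact" method="post">
  <p>Use this form to reach our webmaster. All fields are required.</p>

  <label for="email">Your email address</label>
  <input type="email" id="email" name="email" required>

  <label for="message">Your message</label>
  <textarea id="message" name="message" required></textarea>

  <!-- Native buttons are keyboard-focusable by default -->
  <button type="submit">Send</button>
</form>
```

Pairing every control with an explicit label, and keeping controls as native elements rather than styled divs, covers both the labeling and the keyboard navigation items on the checklist.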


Most Read Posts

Enjoy some of the posts that get the most attention from our readers: