The Latest in Technical SEO | Ep. 3
Keyword On The Street
Woman: Welcome to “Keyword on the Street” podcast, presenting the latest developments in the world of SEO and digital marketing. “Keyword on the Street” is brought to you by Lance Montana, a digital marketing agency based in Brisbane, Australia.
Grace: Hello and welcome to “Keyword on the Street.” I’m here today with Laurence and our tech expert, Pav, to talk about all things technical SEO. And in the wake of the massive Google update that happened about a month ago, technical SEO is as important as ever. So let’s jump in.
Laurence: Yeah, even more important than ever, I would argue. Great to be here, Grace. Hello. Hi, Pav.
Laurence: Yeah, so essentially, you know, we’ve decided to do this episode of “Keyword on the Street” about technical SEO, but bring it to the people and let you know what you can do from a technical standpoint to improve your website’s rank in Google. Because as the Google algorithm has gotten more and more intelligent and updates roll out more and more frequently, you know, the number and quality of the links you have is still very important, but it’s decreasing in importance compared to the quality of your website, the structure of your content, and how well you can serve that content to crawlers and to people. So this is a hugely important topic within SEO and, yeah, it’s great to have Pav here with us. He’s just flown up this morning from New South Wales, where he’s brought a little Champion’s Cup with him from the State of Origin. Actually, you don’t care about the State of Origin at all, do you, Pav?
Pav: No, I just go for the blues because you guys go for the reds.
Laurence: I don’t even know why I brought it up. It’s terrible. We should just forget about it. Yeah, cool. So, Grace, you are heavily involved in SEO at Lance Montana both in terms of strategy, content development, actual content editing and on-page optimization of websites, particularly WordPress, which is a specialty of ours. When somebody says technical SEO, what does that actually mean to you?
Grace: Well, technical SEO would be the portion of things that are definitely more technical – that a web developer might have an answer for you if you have a question. I guess the big technical SEO topics that we want to cover today are things like crawling, rendering, indexation, whether your site is mobile friendly, whether it’s got structured data on it, things like page speed are very important for technical SEO. And we’ve got a couple of other things there that we’ll definitely talk about in this episode.
Laurence: Cool. Awesome. Pav, does technical SEO even really matter?
Pav: Yes, 100% it does.
Laurence: Okay, easy question. True and, look, I touched on it earlier, but essentially it’s because Google is getting, you know, more intelligent with the algorithms that serve results in search, and to combat, you know, backlink spam and spammy, kind of, link building, Google’s putting more importance on the quality of websites. And it’s also, I guess, just a response to the normal evolution of digital. You know, websites are more technical these days. They’ve got a lot more going on than they used to. So let’s get stuck into a couple of, like, really important, kind of, foundation concepts for people to understand so that we can build upon that. So, Pav, what is an index? And can I just point out that this has, like, got a beautiful resonance, because we’re sitting here recording this podcast from the Brisbane State Library, which is a massive index in itself, looking out over the beautiful Brisbane River. But yeah, for the purposes of SEO, Pav, what’s an index?
Pav: To explain it simply, it’s basically like an index of a book. You go to the index to look for what you’re looking for and then that tells you exactly what’s there, just like what Google does. It goes through a website and indexes all the pages. So it knows what all the page URLs are and then from that index it generates the content for those pages, excerpts and, you know, depending on what schema you’re using you can actually build rich link descriptions like in the Google searches.
Laurence: Woah, woah, woah. I just asked what an index was. Let’s not jump into schema. You’re going to scare everybody off. Okay, cool. So, you know, Google is basically storing all of the web pages it uses to deliver search results in the index, the Google index, right?
Laurence: Yup, okay. Cool, all right. What about crawl?
Pav: I prefer to call them spiders.
Pav: Yeah, so basically what that is is they’re just little…just think of them as little bugs that crawl across the website and all of its pages and just absorb all the information and take it back to Google and say, “Here it is.”
Laurence: Okay. So, like, automatic software that crawls pages from the internet and then brings them back and puts them in the index?
Pav: That’s correct.
Laurence: Yes, okay. Awesome. Awesome. And some of the most important crawlers are the Googlebots. So they’re the Google-specific crawlers, right?
Pav: Yes, that’s right.
Laurence: But there’s other types of creepy-crawlies on the internet.
Pav: There are. There are plenty of different crawlers and you have to be careful in that way because there are a lot of spam crawlers, as well. They end up, you know, indexing your website and putting indexes on the internet which don’t really have a very good rating or, in fact, are actually linked to spam themselves. So that, in turn, can actually affect your website’s rankings.
Laurence: Yeah. We’ve kind of touched on this before, haven’t we Grace? With a previous “Keyword on the Street” episode where we talked about the, kind of, reputational side of link building and making sure that you’re associating yourself with the nicest parts of the neighborhood, you know, on the internet.
Grace: Yes, exactly.
Laurence: Okay. Awesome. All right. So what about getting stuck into some actual, technical SEO best practices? One of the first ones we’ve got here that is something that, you know, should be on everybody’s checklists when you’re launching a new website is to use Robots.txt to tell Google which pages of your website shouldn’t be crawled. This might be counter-intuitive for some people because they’re thinking, “Okay, should we get all of our pages into this Google index and make sure it’s all readable? Get as much content on there as possible?” So why would we do that, Pav?
Pav: Yeah, definitely. Right. Everyone’s in the mindset where they just want to put everything out there on the internet but then they always forget that they’ve got login pages, admin pages, pages which allow them to actually change the actual content and look of the website.
Laurence: Oh, okay. Yeah.
Pav: So Robots.txt is basically a file where you just put in the link of the actual page you don’t want Google to look at, but the problem with Robots.txt is that it’s just a guide. It’s not something that will stop an actual indexing bot to…
Laurence: Oh, that’s a really good point. Okay.
Pav: …actually stop it from indexing. And that’s what I mentioned previously: you’ve got other indexing platforms on the internet which send bots to your website and they start crawling it. They will usually ignore the Robots.txt file and end up indexing your login pages, for example, and it just opens your website up to more attacks. So Robots.txt is meant to be there as a guide, and any reputable platform which crawls your website will follow whatever is in the Robots.txt.
Grace: Right, okay.
Laurence: Right, okay. So, you know, some big crawlers like Baidu and, you know, like SEO software crawlers like Alexa and, of course, the Googlebots, you’d expect them to, kind of, follow your polite Robots.txt instructions.
Pav: That’s correct.
Laurence: But other, less reputable, you know, kind of crawlers with, you know, mal-intentions may not. So it’s entirely up to the discretion of the crawler itself.
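For anyone following along at home, here’s a minimal sketch of the kind of Robots.txt file Pav is describing. The paths are just hypothetical WordPress defaults, so adjust them to your own site:

```text
# robots.txt lives at the root of the domain, e.g. https://example.com/robots.txt
User-agent: *
# Keep admin and login pages out of the index
Disallow: /wp-admin/
Disallow: /wp-login.php
# WordPress front-ends often need this endpoint to stay crawlable
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```

As the guys point out, this is only advisory: polite crawlers honor it, malicious ones ignore it.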
Laurence: Yup. Okay. What about if I really, really, really didn’t want a page to be indexed by any crawler? How would I do that?
Pav: You’d usually put it behind…the best way to do that would be to put it behind a login wall so that it’s only accessible to people who have access to it. But then, you know, it really depends. If it’s really sensitive information, you might even want to lock it down to your own system or, you know, IP address or something along those lines, which just doesn’t allow anything on the internet to actually see it. But then again, you know, the hackers are always one step ahead of us.
Laurence: Yeah, and I guess we can use a meta noindex tag, as well, for specific content?
Pav: Yes, we can. However, that is, once again, just a guide.
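To make that concrete: the noindex directive is a single tag in the page’s head. (Note that a crawler has to be able to fetch the page to see the tag, which is why it’s still just a request, not a lock.)

```html
<!-- In the <head> of the specific page you want kept out of the index -->
<meta name="robots" content="noindex, nofollow">
```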
Laurence: Yeah. Yeah, yeah, yeah. Cool. Now, what about creating and editing Robots.txt files? You know, how do people do it? For instance, in a WordPress site, is it something that happens automatically?
Pav: It doesn’t happen automatically. There are many plugins you can use, especially for WordPress, which allow you to just create a Robots.txt file and you can just directly enter in there, you know, any pages you don’t want indexed.
Laurence: The number one plugin that we’ve used in the past and we’ve mentioned before and we’re huge fans of is Yoast for WordPress.
Laurence: It has a lot of functionality and that’s one of its features.
Pav: Yeah, and the other way to do it, obviously, is just to create a simple, plain text file. If you happen to have access to the file system for your website, you can just create a simple text file, name it Robots.txt, put in the URLs that you don’t want indexed, and upload it straightaway. So you don’t technically need a plugin, but it just makes it easier for anyone who doesn’t actually have the technical know-how to access the file system.
Laurence: Sure. One of the biggest competitors to Yoast is the All in One SEO Pack plugin. We’ve used that on the odd occasion, but we’re pretty much just down-the-line fervent, die-hard Yoast fans. That said, the All in One SEO Pack for WordPress is very good. If you’re using that, you can use it to create a Robots.txt file, as well.
Pav: I like All in One SEO.
Laurence: Yeah, you would.
Grace: Oh, do you? All in One?
Laurence: Trust the New South Welshman. Okay, awesome. Grace, what do we got next on the introduction to technical SEO best practices there?
Grace: All righty, so next HTTPS versus HTTP. What are your thoughts, Pav?
Pav: HTTPS all the way. It just makes the connection that little bit more private. With HTTP, if anyone’s listening in on your connection, say if you’re on, you know, a cafe Wi-Fi where it’s not secured, whoever is logged into the same network can actually look at your internet activity. Whereas with HTTPS, the S standing for, you know, secure, they can see data coming in and going out, but they can’t actually tell what’s going on. Like what the actual data is. So…
Grace: So it puts up another wall?
Pav: Yeah. So it’s definitely something that is more preferred. So Google is actually putting a fair bit of weight on that. Especially, you know, in the coming few months it’s going to I think really, really ramp up.
Grace: Yeah, as a ranking factor.
Pav: As a ranking factor.
Grace: Definitely, yeah. Yeah. So you’ll get left behind if you still have an HTTP site.
Laurence: Yeah, no doubt, and this is something that’s also been championed by the major browsers, yeah? So Mozilla Firefox and everybody’s favorite, Internet Explorer, and Chrome and Safari. You know, in many cases they display warnings these days to people visiting unsecured sites. So if you don’t want people consuming your website content with an ugly warning sign in front of it, then you need to make sure that you’ve got a security certificate installed against your domain. The good news is that these days that, you know, can be free to do. You used to have to pay cold, hard cash, you know, and there’s all kinds of different security certificates you can purchase, and that could cost up to the, kind of, $500 mark.
Pav: Some even more.
Laurence: Yeah, but these days, you know, a lot of web hosts will provide them for free as part of your hosting package. The takeaway here is that there’s no excuse now not to have HTTPS protocol implemented for your website. Okay? You’re going to get hurt by poor engagement. People are going to be worried about why it’s unsecured and it’s free to do, so just get on it. Do it.
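Once the certificate is installed, the usual companion step is redirecting all HTTP traffic to HTTPS. Here’s a rough sketch of what that might look like in an Apache .htaccess file, assuming your host has mod_rewrite enabled (setups vary, so check with your host):

```apache
# Redirect every HTTP request to its HTTPS equivalent with a permanent 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```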
Grace: Yeah. I mean, I guess sometimes having those little warnings come up on your website would be enough to turn some users away. So…
Pav: Oh, yeah.
Grace: And along that line, as well, another thing that can turn users away is when your site takes longer than a few seconds to load. So site speed. Let’s talk about that one.
Pav: Yeah. Site speed is a big one. A lot of the time it just comes from large images on the website and we just need to make sure that the website actually has, you know, images that have been scaled down for internet viewing.
Laurence: Absolutely. This is super important. Grace and I have talked about this in a previous episode. Image optimization for SEO.
Laurence: I think that was a Quick Byte episode, wasn’t it?
Laurence: Yes, we’re including it as part of technical SEO because site speed is actually a huge formula with a whole lot of different inputs, you know, determining how fast your website is. But, Pav, you hit the nail on the head. Like, in terms of technical SEO for non-technical people, image optimization, just reducing the file size of your images, is the quickest and easiest thing that you can do, and it’s probably going to have one of the biggest effects on your overall site speed, as well.
Laurence: So what are some of the tools that we talked about people can use to do that?
Grace: Yeah. Bulk Resize Photos is a great one. We’ll put that in the show notes. You can also use the Adobe programs like Photoshop and Lightroom to reduce image sizes in bulk. Otherwise, if you’ve got a WordPress site, you can reduce image sizes in WordPress, but you usually have to do it one at a time. So just a little bit slower.
Laurence: They don’t need to be a little bit slower in this day and age, do they?
Grace: No, exactly.
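If you’re comfortable with a little scripting, the bulk resize the team describes can also be done yourself. This is just a sketch using the Pillow imaging library; the folder name, size limit, and quality setting are placeholder values, not recommendations from the episode:

```python
from pathlib import Path
from PIL import Image  # Pillow: pip install Pillow

MAX_EDGE = 1600   # longest edge in pixels; plenty for most web layouts
QUALITY = 82      # JPEG quality; 80-85 is a common sweet spot

def shrink_for_web(src: Path, dest: Path) -> None:
    """Scale an image down so its longest edge is at most MAX_EDGE pixels."""
    with Image.open(src) as img:
        img.thumbnail((MAX_EDGE, MAX_EDGE))  # preserves aspect ratio; never upscales
        img.convert("RGB").save(dest, "JPEG", quality=QUALITY, optimize=True)

folder = Path("images")  # hypothetical folder of full-size originals
if folder.exists():
    out = folder / "web"
    out.mkdir(exist_ok=True)
    for jpg in folder.glob("*.jpg"):
        shrink_for_web(jpg, out / jpg.name)
```

The originals are left untouched and the web-ready copies land in a subfolder, so nothing is lost if you want to re-export at a different size later.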
Laurence: Okay, so that’s a great one. Pav, what do you use to, like, test an actual website’s speed?
Pav: Well, the first thing I do is just log on to the website and see how long it takes. If I get bored, then it’s too slow. So…
Laurence: And that’s presupposing that you’ve got an excellent internet connection.
Pav: Yeah, that is true.
Laurence: And we all live in Australia so that might not be the case here.
Grace: Not reliable.
Pav: True, true. The other thing I do is just go onto some website speed testers online. So, for example, there’s GTmetrix which, you know, tests your website. It gives you a full rundown, gives you a little graph.
Laurence: Pretty waterfall charts.
Pav: Yeah. Little graph which shows you what loads, when it loads, how long it took to load which, you know, can give you a bunch of information and you can adjust your website…
Laurence: For sure.
Pav: …which helps and there’s another tool, as well, which I actually use more often, which is Pingdom.
Laurence: Tools.pingdom.com. It’s my favorite.
Pav: Tools.pingdom.com. Yeah.
Grace: Yeah, that’s a good one, too.
Pav: And, yeah, that does basically the same thing as GTmetrix but it has an advantage for us Aussies here because they’ve got servers based in Melbourne.
Laurence: Yeah, you can switch the server between…actually, I’ll open up here. Melbourne, New York, San Jose, and Stockholm.
Pav: So, yeah, Pingdom’s the way to go if your website is mainly aimed at Australian consumers. Other than that, there are others which I don’t use as often but I should. For example, Google PageSpeed Insights.
Laurence: This is kind of a controversial one for me personally.
Pav: It is.
Pav: It is. I’m not a big fan of it.
Laurence: Which is unusual because we’re, kind of, like Google fanboys and girls and we have to be because we’re, you know, operating a business within the Google ecosystem essentially. You know? But I’m just surprised by how wrong the Google PageSpeed load test is.
Pav: Yeah, see…
Laurence: How often it is wrong.
Pav: Yeah, it can be arguable. Is it wrong or is it just picking up on something…
Laurence: No, no, no. Well, that’s the thing. It’s definitely wrong because, you know, I’ll test it in parallel with Pingdom and GTmetrix and, you know, a visual, manual test of actually, you know, loading up the website content in various browsers on my computer. And I can see that there’s an outlier in the test results. I’m doing them all at roughly the same time, you know, and it’s the Google page load speed test, which is usually the one that’s furthest away from the other three.
Pav: And it’s not just the actual page speeds that it has issues with. I don’t actually know whether it’s testing for something that the others aren’t testing for, but it’s not just the page speed. It’s the actual other things it picks up, as well.
Laurence: The recommendations often seem wrong… compared to Pingdom and GTmetrix.
Pav: Yeah, so it almost seems like that there is an extra, like, effect or criteria that it is using that the others are not aware of. Well, that’s what it looks like, anyway.
Laurence: But then it struggles to actually implement that extra layer correctly in a lot of cases.
Laurence: Yeah, cool. Okay. So far we’ve talked about using Robots.txt to tell Google which pages shouldn’t be crawled. We’ve talked about the importance of using a secure protocol, HTTPS. Go out and get yourself a free security certificate and install it against the domain for the website. And we’ve talked a little bit about site speed, although we should maybe…you know, we should talk probably a little bit more about site speed. So we’ve talked about how to test for it and, you know, one of the major things you can do, which is optimize your images. What other things can and should people be doing to improve their site speed? Keeping in mind that obviously some of these things are going to be for a web developer and, you know, are there other things that non-technical agents can do?
Pav: Yes, definitely. So the next biggest one or it’s actually arguable if it’s the biggest or not, would be caching. So server-side and browser side.
Laurence: Yeah, this is a huge area. Can you just explain in really brief layperson terms what caching is?
Pav: Caching is…how can I put this in simple terms?
Laurence: Because most people can’t even say, like, don’t even pronounce it correctly. That’s just my personal opinion.
Laurence: Yeah, exactly.
Pav: In terms of websites, it’s basically, say…okay, it’s going to be a bad analogy but it might work. Think of a jigsaw puzzle and you’ve got all the separate pieces that came together to make that image. So caching is basically taking that and getting rid of all the connections and just making it one large image so that you don’t have the individual pieces. You just have the end result.
Laurence: Yeah, okay. All right, that was a really interesting visual…
Grace: Yeah, that was great.
Laurence: …demonstration. I didn’t know where you were going with the jigsaw puzzle. Okay, I’ve just, like, Googled it so we can give, like, a really, kind of, stock standard definition, as well. But essentially it’s a different area of memory, right? So it’s, like, storing things in memory that’s more easily accessible. So front-loading it. So it’s available there but it’s not in real time. So people are accessing, you know, cached memory. So this might not be the most up-to-date version of your website but it’s one that is very easily accessible.
Pav: Yeah, so we’ll have to use “memory” loosely there, because it actually just sits in your drive space, your disk space. It’s just the end result of what the system had to go through to generate that page. So it’s usually just a plain HTML file which is not dynamic. So it’s a static page, and that’s what displays to you, so the server has to do the least amount of work to present information to you. So as I mentioned before, there’s two types of caching: the browser side and the server side. The browser side, what it does is when you actually go to a website, it downloads all of that content and stores it on your computer. So when you go back to the website, it will actually access the data that’s on your computer, rather than pulling it again from the server, which makes it a lot quicker because it can, you know…
Laurence: It’s also putting less load on the actual website…
Laurence: …and the website servers.
Pav: Yeah. Because of that, it becomes quicker and then each of those items, images, you know, the assets, the actual pages, they have a time limit on it set by the web developer and every…say they’ve got a time limit of 10 days. Every 10 days, if you access the website again it’ll download fresh content so that you’re always up-to-date, as well.
Laurence: So you’re…the web developer is telling the website to tell the browser to cache its content on a certain frequency of time.
Pav: Yes, that’s correct.
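The expiry time Pav mentions is set with HTTP caching headers. As a hedged sketch, here’s roughly what that could look like in an Apache .htaccess file using mod_expires; the 10-day lifetime just echoes Pav’s example, and your own values should suit how often your assets change:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Let browsers keep static assets for 10 days before re-downloading
  ExpiresByType image/jpeg "access plus 10 days"
  ExpiresByType image/png "access plus 10 days"
  ExpiresByType text/css "access plus 10 days"
  ExpiresByType application/javascript "access plus 10 days"
  # Keep the HTML pages themselves fresh
  ExpiresByType text/html "access plus 0 seconds"
</IfModule>
```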
Laurence: Yeah, yeah. But if somebody was…a content editor was to go into that website and upload a new page, would that go into the cache?
Pav: No, adding new pages is fine. Adding new content is fine. A lot of the caching happens for assets on the website. So images, the style sheets, the JavaScript, all the things that make up the UX, the user experience. Content, on the other hand, will rarely get cached on the browser side. Content gets cached on the server side. So things like WordPress come with a lot of caching plugins. For example, WP Super Cache. There’s also W3 Total Cache.
Laurence: Yeah, you’re right. Total Cache, yeah.
Pav: And what they do is go through your website and generate a snapshot of each of the pages and store them. So that way when someone requests the page, it just gives them that one static page that’s already been generated, rather than generating it on the spot. Which makes the way that it actually presents it to you much quicker because the server doesn’t actually have to do much work at all, and then when a content uploader goes in and changes one of the pages, updates a page or adds a new one, it updates that particular cached page in the system. So that way…
Laurence: So you can still use caching and not worry about the content that’s getting served to website visitors being dropped?
Pav: Yes, that’s correct. Yeah.
Laurence: Okay. In a nutshell, what should people do to utilize the power of caching to improve their site speed? Would you recommend that somebody…let’s talk about a couple of different case scenarios. So somebody who doesn’t have ongoing access to a web developer, they might be a smaller business, they’ve got a good website and, you know, perhaps it’s being maintained but they don’t have, you know, a web developer or an agency on call. Is there something that they can do themselves without contacting an agency for assistance?
Pav: If they’re running a CMS such as WordPress, there are plugins, as mentioned before, that you can use for server-side and browser-side caching. So if you want to do server-side caching, where you cache all of your content so that the server can give it to the user a lot quicker, you can use things like WP Super Cache. And it’s simple enough. It is a little bit technical, but when you go into it, most of the things are self-explanatory. It just asks you questions such as, you know, how long do you want each page to be cached for before it checks for a new version? Just things like that. There’s another option in there where it actually super-caches the pages. So it doesn’t just cache the pages that people have tried to access. It goes through the whole system and it caches every single page. So in that case, at the beginning, it does put a fair bit of load on the server trying to cache all the pages, but what that means is that the first time anyone actually accesses a page, it’s already there to present to them.
Grace: So once this is implemented, how much will it affect site speed?
Pav: Well, I have seen caching a website both…if you cache your website both browser and server side, I’ve seen speed increases of up to 80% to 90%.
Grace: Wow, okay. So it’s definitely worthwhile doing this.
Pav: Hundred percent.
Laurence: It’s pretty much an integral part of, you know, superior website management.
Laurence: Yeah. I don’t think there’s any high-performing websites out there that don’t utilize caching of some kind.
Grace: Okay, yeah.
Laurence: Okay. Cool. Image optimization and utilizing the vast powers of caching to improve your site speed. Anything else that you think the non-technical, you know, audience we’ve got out there should be doing to assist with their site speed?
Pav: A lot does also depend on the type of server your website is sitting on. So if you go with a cheaper, shared server it’s going to be a little bit slower. Whereas if you go with a more powerful, dedicated server it’s going to be super quick. So it also, you know, comes down to what you’re willing to spend on your hosting, as well. That can range massively in prices.
Pav: So it depends on how heavy content-wise your website is and that should determine what kind of server you put it on. So these three, I think, things would be probably the major, major factors in website speed.
Laurence: Yeah, cool. So super basic little one-pager website which has got some images and some text and no, you know…not calling any membership database functions or anything like that and doesn’t have a huge amount of website traffic can get away with the cheaper server host environment, right?
Pav: Yeah, yeah.
Laurence: But if you’re running any kind of significant website that’s running, you know, functions where it’s, you know, checking membership of certain systems and users having certain permissions and it might have a hidden [SP] video that’s stored locally on the database, then the power and the resources of the server become massively important in determining the site speed. Don’t they?
Laurence: Okay, awesome. Well, what’s next after site speed, Grace?
Grace: So that’s site speed. We’ve touched on security already, and we’ve got a point here: cross-browser compatibility. I’m not, like, super familiar with this, so I’d love to school up on what this means.
Laurence: Well, you’re in the right place, Grace.
Laurence: The right place at the right time.
Pav: This touches on my favorite browser, Internet Explorer 6.
Laurence: Everyone’s favorite.
Pav: Cross-browser compatibility. Now, this is where the web developers start ripping their hair out. So something that works on one browser does not necessarily work on another browser. So when you put that into perspective, there are hundreds and hundreds of browsers.
Laurence: Because there’s major browser platforms like Internet Explorer and Mozilla and Chrome, but then there’s heaps of versions of all of those major platforms, and then there’s quite a lot of other browsers that you’ve probably never heard of that get a significant number of users.
Pav: Yeah. On my system I’ve got Firefox installed, I’ve got Chrome, and I’ve also got Opera, which is another one that I’m actually testing, and also Safari. Some browsers are easier to test for. For example, Safari and Chrome, because they use the same engine. They use WebKit. So if something works in Chrome, it usually works in Safari.
Laurence: I’ve definitely come across cases where they don’t, for some reason.
Grace: Oh, yes.
Pav: But yeah, so it’s important to actually test and especially the first thing you want to do when you build a website is have a look at your user database or user base and see what percentage of the users are using what browser. Because you want to really build the website properly for those browsers and then you want to have graceful degradation for the rest of the browsers. So graceful degradation meaning if something doesn’t work, then how can we present it in a different way? Maybe take the UX away a little bit but still have it functional.
Laurence: Yeah, if some element of the website is not going to render properly in an older version of a specific browser, you don’t want the whole website experience to be totally thrown out the window. You want people to still be able to consume that content and not think that there’s something majorly wrong going on. That’s the idea of graceful degradation, yeah.
Pav: Yeah, so there’s two ideas. There’s graceful degradation and progressive enhancement. So graceful degradation works on the fact that you build it for the most popular browsers and then you degrade the user experience across the other browsers that can’t handle it. Progressive enhancement is you build it as simple as possible and then enhance the user experience for the most popular browsers. So there’s two ways you can go about it.
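A small, hypothetical illustration of the progressive enhancement idea in CSS: start with a layout every browser understands, then layer a nicer version on top for browsers that can handle it. The class names here are made up:

```css
/* Baseline: simple floated columns that render in virtually any browser */
.gallery { overflow: hidden; }
.gallery .item { float: left; width: 33%; }

/* Enhancement: browsers that understand CSS Grid get the better layout */
@supports (display: grid) {
  .gallery { display: grid; grid-template-columns: repeat(3, 1fr); gap: 16px; }
  .gallery .item { float: none; width: auto; }
}
```

Older browsers simply ignore the @supports block and keep the baseline layout, which is exactly the "still functional, slightly reduced UX" outcome Pav describes.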
Laurence: So that’s, like, a mobile-first development strategy, isn’t it?
Pav: Yeah, that’s another podcast I think.
Laurence: Yup. Look, I think the basic secret sauce to cross-browser compatibility is just to tell people that if they’re not using the most recent version of a browser, they can just go onto somebody else’s website. But that’s obviously impossible, and it’s crazy how many people are using, you know, five- or eight-year-old browsers. So it is an important part of web development, and rather than have every single version of every single browser installed on your operating system or…
Pav: And computer [SP] device.
Laurence: Yeah, and all the different devices. If you’re serious about making sure that your website works on different browsers, you should use a tool like BrowserStack, which is the one that we use. It’s probably the market leader, the biggest one out there, and there’s all kinds of different tiers. You know, you don’t have to spend too much money to get the entry-level functionality, which will allow you to view static, kind of, image snapshots of how your page content is going to look in all the different browsers. If you want, like, interactive ability through all the different browsers, you’re going to have to pay a little bit more. So it just depends on how big your website visitor base is as to how much money you want to spend on this. And you can use Google Analytics to just have a look at how many website visitors you’re getting and check out which browser and operating system combinations they’re using to access your site content.
Pav: Yeah, that’s the way to go.
Laurence: There you go. Grace, look, you learned something. Cross-browser compatibility.
Grace: Yeah, I’ve learned something. Yeah.
Laurence: Okay, so the last technical SEO tip we want to impart today is to use structured data to drive rich results in your SERPs. Okay? So breaking that down, rich results is all of the, kind of, interesting little bits and pieces that give you more information than just your standard organic listings in search engine results pages. So an example of that, Pav you were talking about earlier was for the Brisbane Powerhouse.
Pav: Yeah. So especially for things like events, what you can do is use a certain schema which actually tells Google or any other indexing platform what kind of thing it is. So, like, an event or a product or something like that. And what you can do is actually…it actually does get pretty technical. There’s, like, coding and markup you need to do, which actually tells the indexing platform what…what’s the word?
Grace: What the content is.
Laurence: What the type of content is. It’s a label, isn’t it?
Pav: Yeah, it’s a type of content and what its attributes are. So for an event…
Laurence: So it’s metadata.
Pav: Yeah, it is.
Laurence: It’s metadata. It’s telling Google this kind of content is this kind of content. Yeah.
Pav: And what you can essentially do with that kind of content is say, for example, you’ve got an event coming up. You can set a title, you can set the dates and times, you can set up, you know, if someone’s playing at this event, and what happens is when someone searches for this particular event in Google, for example…
Grace: Or even just, you know, events near me and things like that.
Pav: Yeah, yeah. Even that. What’ll happen is, instead of just pointing to your website, it will bring up a listing of this event that you guys are hosting, and it’ll actually tell you what the dates and times for it are.
Laurence: Yeah. Showing rich results…
Pav: Rich results, yeah.
Laurence: …in Google.
Pav: That’s exactly right. So…
Laurence: And people love to click on those rich results. It really stands out. You know, it’s kind of the Holy Grail, really, in the SERPs.
Grace: Yeah and it usually comes up at the top of the Google results, as well. So it’s a really good way of getting to the top of the ranking page. Yeah.
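To make the event example concrete: structured data is commonly added as a small JSON-LD script in the page, using the schema.org vocabulary. The event name, dates, and performer below are invented purely for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Comedy Night",
  "startDate": "2019-08-16T19:30+10:00",
  "endDate": "2019-08-16T22:00+10:00",
  "location": {
    "@type": "Place",
    "name": "Brisbane Powerhouse",
    "address": "New Farm QLD, Australia"
  },
  "performer": { "@type": "Person", "name": "Example Performer" }
}
</script>
```

Google reads this metadata and can then show the event dates, times, and venue directly in the search results as a rich result.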
Laurence: Now, as Pav mentioned, you know, doing it manually is super technical but you can use tools like the Data Highlighter and the Markup Helper, which we’ll put links to in the show notes, to add structured data to your content. And you can test it with the Google Structured Data Testing Tool.
Grace: Yeah, and if you’re on a WordPress site there’s a bunch of plugins, as well, that’ll just give you fields to add in this data really simply and easily. So we’ll put a couple of links to those plugins in the show notes, as well.
Laurence: Sounds great.
Grace: All righty, so that wraps up our podcast today on the technical aspects of SEO. If you liked today’s podcast, we’d love it if you could give us a rating on iTunes and hit us up with any topics that you’d like us to cover in future podcasts. Thanks, Laurence and Pav.
Pav: Thanks, Grace. Thank you.
Woman: Thank you for listening to the “Keyword on the Street” podcast. This has been a production of Lance Montana, a digital marketing agency based in Brisbane, Australia. For more great free resources, go to lancemontana.com.au.
In the third episode of Keyword on the Street, we’re joined by our Technical Director Pav Ratra to talk about the latest in technical SEO.
Topics covered in this podcast:
- Why technical SEO is so important.
- Everything you need to know about indexes, crawlers, robots.txt, structured data, cross-browser compatibility and caching.
- Why HTTPS is a MUST for your website.
- How much site speed will influence your search traffic and how to speed it up.