English Google Webmaster Central SMB site-clinic

All right, welcome everyone to today's Google Webmaster Central office-hours hangout. My name is John Mueller, I'm a webmaster trends analyst here at Google in Switzerland, and part of what we do is talk with webmasters and publishers like the ones here in the hangout and the ones that submitted tons of sites to check out. So what I have here is a very brief presentation about some of the issues that I found there. Some of you submitted lots of information in the site clinic, and I'd almost prefer to send you a response back directly, so one thing you could do is post in the help forum, either to get advice from other people, other webmasters who might be in similar situations, or send me the note that you submitted for the site clinic directly on Google+, and I can try to see if I can get you some information there too.

All right, let's get started. I looked at about 30 of the sites that were submitted. I selected them kind of randomly using Google Spreadsheets' random number generator and filtered mostly for brick-and-mortar businesses, which is kind of tricky because a lot of businesses are active mostly online but are actually physically located somewhere, so where do you categorize those? Of the ones that I reviewed, most were mobile-friendly, so that was a great start.

Let's take a quick look at some of the things I looked at for these sites. This is obviously a rough way to start, and after you look at some of these general categories you really need to dig into the specifics of the actual site: what makes this site unique, and what's different compared to other sites? One thing I looked at is crawling: is that roughly okay? Indexing: are we roughly indexing the content that appears to be on the site, are we crawling enough to keep up with the index? Manual actions: these are visible in Search Console. Does it kind of work? I checked all of these in a phone browser; I used the Chrome mobile emulator to look at these sites, and I think that makes sense because most of the time, or at least in a lot of cases, more of the users are actually on mobile phones themselves, so you need to check where the users are. I looked at the queries that these sites rank for, to get a rough understanding: are they mostly brand-oriented queries or mostly broad queries? Does the site have matching content for these queries, or is it ranking accidentally, or is it not ranking for queries where it should be? Then I looked at geotargeting, specifically hreflang, to see if that was roughly set up correctly; this is something that's easy to check in Search Console as well. And then I looked at something that's probably not so typically an SEO topic: is the site understandable if you open the page in a browser? Would a user understand what the site is about if they clicked on a search result? Do they feel at home and understand what they're looking at and where they need to go from there?

Starting with crawling, which is, I guess, the basic foundation that you need to show up in search: we limit crawling based on the server. We try to avoid causing problems with the server, or with other users on the server, so we watch out for things like response codes, server-side errors, network issues, and timeouts, and when we see a mass of these we'll assume that we're crawling too much and we'll slow down. This is a bit trickier now, because for each HTML page that your site has, we actually have to try to render the page to see what it actually looks like.
That means we have to pull in a lot more content from your server to look at each page individually. Very few sites had problems here, because I guess most servers are pretty awesome nowadays and can keep up with the load that Googlebot and lots of users put on them, so that was a good thing. But for sites where you do think this is a problem, the solution could be to attack it from multiple sides. On the one hand, you can just get a better server, get a CDN, host things so that they're faster. The other thing you can do is avoid Googlebot having to access all of these resources in the first place: you could put fewer embedded resources on a page, you could make sure that the embedded resources are more easily cacheable, so that they're smaller and have cache headers where appropriate, and you can make sure that your site has less duplicate content. In general, when we look at duplicate content, having something like two or three times the number of URLs as you have content generally isn't a problem. That's something a lot of sites have, like www and non-www versions; technically that's duplicate content, and we could crawl those URLs twice and find the same information. If you have ten times as many, that's obviously a bit of a bigger issue, but the real problems come up when we see something like a hundred times the duplicate content, where theoretically we'd have to crawl all of the URLs a hundred times to actually get everything. Some of the sites that had problems with regards to crawling are listed here on the bottom, so if these are your sites, you might want to look into ways to make it easier for Googlebot to actually crawl your content.

The other thing with regards to speed is that it's obviously not just Googlebot accessing the individual URLs; users want something reasonable too. In fairness, one of the sites that was submitted was really awesomely fast. The UI is a bit clunky, it's kind of hard to use sometimes, but it's amazingly fast: you can click around and it just loads almost instantly. Another site that was submitted was really, really slow, and there you'll probably see users jumping off and not digging into your site as much. The other thing to keep in mind is that now that we render pages as well, we have the same problem here: if a page renders really slowly, we might not be able to render it in one go, and we might have to take multiple tries to actually get the content into our index, whereas if a page renders really quickly and has few requests, that's something we can pick up and index a lot faster. webpagetest.org is a testing tool that you can use for this; it creates these nice waterfall diagrams and also videos of your page loading. That's something I definitely recommend looking into, to see what it pulls up with your site.
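On the caching advice above, here's a minimal sketch (not an official Google tool) for spot-checking whether the resources your pages embed are sent with cache headers. The URLs are hypothetical placeholders, and it assumes the third-party requests package is installed.

```python
# Minimal sketch: spot-check whether embedded resources send cache headers.
# The URLs below are hypothetical; substitute the CSS/JS/image URLs that
# your pages actually embed.
import requests

resource_urls = [
    "https://www.example.com/static/site.css",  # hypothetical
    "https://www.example.com/static/app.js",    # hypothetical
]

for url in resource_urls:
    response = requests.get(url, timeout=10)
    cache_control = response.headers.get("Cache-Control", "(none)")
    etag = response.headers.get("ETag", "(none)")
    size_kb = len(response.content) / 1024
    print(url)
    print(f"  status={response.status_code}  size={size_kb:.1f} KB")
    print(f"  Cache-Control: {cache_control}  ETag: {etag}")
```

Resources that come back with no Cache-Control header, or with very short lifetimes, are the ones worth revisiting first.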
Mobile-friendly is of course a big topic. Most of the sites were mobile-friendly; you can test your own site, or individual pages from it, using our mobile-friendly testing tool, which is linked here. The thing to keep in mind, though, is that it's not just a technical question: you really want to make sure the site is actually usable on a mobile device. Some of the things I found were, for example, interstitials on mobile, which are really painful: how do you click that tiny X to make the interstitial go away? And everything loads a lot slower because of this extra interstitial. Sometimes the user interface isn't really that easy to understand and figure out, and sometimes things like images are really hard to recognize as well, where the testing tool will say, well, this is an image, it's fine to have an image here, but as in this case, the image has almost microscopic text on it; it's almost like you'd want to read what's on the image, but you can't read it at all. So these are things to watch out for, where technically the site might be mobile-friendly, but from a practical point of view it's not really that friendly.

Another thing I noticed with at least one of the sites was that we were indexing some of the mobile URLs. Specifically, if you have separate mobile URLs, make sure that you have everything set up properly: that you have the rel=canonical from the mobile version back to the desktop version, and the link element from the desktop page to the mobile page, so that we can understand the connection between the desktop and the mobile pages. That's something you won't necessarily see in the testing tool, because the testing tool checks whether the page technically loads as a mobile-friendly page; it doesn't check whether, from an indexing point of view, the site is really recognizable as being mobile-friendly.
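Since separate mobile URLs came up, here's a minimal sketch of how you might check those two annotations yourself. The desktop and m-dot URLs are hypothetical, and it assumes the third-party requests and beautifulsoup4 packages.

```python
# Minimal sketch: check the bidirectional annotations between a desktop page
# and its separate mobile URL, as described above. Both URLs are hypothetical.
import requests
from bs4 import BeautifulSoup

desktop_url = "https://www.example.com/page"  # hypothetical
mobile_url = "https://m.example.com/page"     # hypothetical

def link_hrefs(url, rel):
    """Return the href values of <link rel=...> elements on the page."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return [link.get("href") for link in soup.find_all("link", rel=rel)]

# The desktop page should point at the mobile version with rel="alternate"...
print("desktop rel=alternate:", link_hrefs(desktop_url, "alternate"))
# ...and the mobile page should point back with rel="canonical".
print("mobile rel=canonical:", link_hrefs(mobile_url, "canonical"))
```

If the rel=canonical on the mobile page doesn't point at the desktop URL, or the desktop page has no alternate link to the mobile version, that's the kind of setup gap the testing tool won't flag.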
Manual actions and web spam: none of the sites I reviewed had manual actions, no web spam, which was really fantastic. Some sites had older quality issues or older link issues; for example, this site here still has a bunch of really spammy links that are really old and are pulling the site down, so that's something you might want to clean up to make sure that you're ready for the next web spam algorithm update.

Understandability, which I mentioned in the beginning, is something I think a lot of small businesses have trouble with, and it comes back to the general situation where a lot of small businesses don't really understand what their unique selling proposition is. What's the unique thing that they offer that makes their business special, the reason why people should come to their website? If you don't have this really visible on your web pages from the start, then people who come to your pages might not recognize that this is actually what they're looking for. These are some of the pages I ran across from the site clinic where I'd say, well, I might be able to figure out what some of these pages are about, and if I dug in and browsed around a bit I'd understand what it is they're really trying to offer, but at first glance it's really hard to tell. The sad thing, I think, is that some of these sites are otherwise really well made, really nice to look at, with a lot of content, but the average user who comes to the page doesn't really know what to do and can easily get lost. Here's a comparison with some of the other sites that were submitted: maybe they don't look as fancy, but it's really easy for people to understand what they can do here, what they can click on, what kind of services or products you're selling. This is reflected on our side in search as well: if you don't tell us what you're actually offering, what you're really fantastic at as a business, then it's really hard for us to figure out what we should rank your site for. So tell us what you'd like to rank for, what you'd like to offer people, what you want people to search for to find your site. It also makes it a lot easier for us to pick a good title and snippet, because if we have that text on the page, we can choose it. And of course, like I mentioned, it's easier for users to understand what they landed on, which means you're not just ranking for that first click; people are actually going into your website and looking around a bit more. The ranking you have isn't something that, how can I say it, holds people back from coming in and converting and actually buying something; rather, they're able to come in, understand what your site is about, use it properly, and move on to conversion. Because that's what you really want: you don't just want to show up in rankings, you want people to come to your business and actually buy something, right?

Another thing I noticed, which came up here and there with the sites that were submitted (these were two sample questions in the comments), is people who want to rank for their brand name, and people who say, "I can't rank unless people search for my brand name." That comes back to a general competition question with regards to the search terms you're trying to target: is that something your site is actually able to rank for? Does your site have the strength, with regards to content and other signals, to rank for those queries? If you're trying to target a market that's already really saturated with some really good players, it's hard to get in there unless you're getting in by being visible through your brand name, for example. If people can search for your brand name and find your site, that's a great way to start being active on the web and to get your existing customers to find you there, because you can say: just search for my name, you'll find my business, I'll be there. On the other hand, if your brand name is something like in this case here, where it's essentially a business type and a city name, "Pizza Zurich" for example, then it's going to be really hard to rank for that as a brand name, because it's actually a combination of generic terms, and that takes away the advantage of having a brand that people can search for. That's obviously really hard to change afterwards, but if you're in the situation where you don't rank for anything generic and people can't search for your brand name because it's also too generic, then maybe it makes sense to find a name that's unique and that makes it easier for people to find your website on the web at all.

Hreflang is a completely different topic; this is more of a technical issue, I'd say. Most of the sites that were submitted didn't use hreflang; they didn't have anything unique with regards to different languages or countries, which is perfectly fine, since you don't have to target all countries individually. Some of them did set up hreflang properly, which was fantastic to see, and some of them had problems with the way they set it up. The problems with hreflang you can almost always see in Search Console.
They'll be flagged there, and the issues that are flagged there are actual issues; it's not that we're looking at it from a theoretical point of view, but rather that we actually tried to crawl the page, looked at the hreflang, and weren't able to find the matching link, for example. The most common issue I see with hreflang is that it needs to be between the canonical URLs. So if you have one canonical page for, in this case, Ireland, and one canonical page for Great Britain, then the hreflang needs to be between those two pages, and we need to understand and accept the canonical that you have. In this example, the Irish page has a slightly different URL than the English page; we found the hreflang to the English page on the Irish page, but we found an hreflang back to a slightly different Irish page from the English page. So this is a situation where the canonical that was picked up for indexing wasn't the canonical that you wanted to have used for hreflang, and that's sometimes really tricky to figure out. What I'd recommend doing there, especially if you're using hreflang, is to make sure that you do everything you can to have the correct canonical selected. That means using rel=canonical where you can, using 301 redirects if you know one URL should be replaced by a different one, making sure all of the internal links point at your preferred canonical version, and having those URLs in your sitemap file as well, so that really all of the signals we get point at exactly the canonical URL that you want to have used for the hreflang pairs. One way you can check that, in most cases, is just to do an info: query for the URL, so just "info:" and the URL, and then you'll see the URL that we actually picked as the canonical for that specific URL.

All right, I think this is pretty much towards the end, so we'll move over to questions soon. One issue I noticed is that some sites could use hreflang but aren't. You can recognize this on your own website if you have multiple language or country versions and you're seeing users going to the wrong version of your site. So if you have a German and a Swiss version of your site and you see Swiss users going to the German version, or see that as impressions in Search Console, that's a pretty strong sign that people are probably going to the wrong version, and that you could improve that by setting up hreflang. This is particularly common in situations where the same language is used in different countries; like I said, German in Germany, Austria, and Switzerland is a local example, and I guess it's probably the same for Spanish in Spain and Spanish in South America. It's also very common for location-type queries, where people are searching for a location name that could appear in multiple variations on your website, and it's hard for us to recognize from the query alone whether the user is searching for an English or a German word; then we'll try to fall back to the user's location and pick the more preferred version there. One thing you can also do with hreflang is use it on just parts of your site; you don't have to put it across the whole site. Maybe just pick the important landing pages, or the pages where you see this being mixed up in search, so that we can focus on those pages to start off with. And here are two example sites where I thought maybe setting up hreflang for at least some of the pages would be helpful.
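As a rough self-check for the hreflang-and-canonical issue described above, a sketch along these lines might help. The two URLs and the en-IE/en-GB pairing are hypothetical, and it assumes the third-party requests and beautifulsoup4 packages.

```python
# Minimal sketch: check that hreflang annotations between two pages are
# reciprocal and point at the canonical URLs, along the lines described above.
import requests
from bs4 import BeautifulSoup

ie_url = "https://www.example.com/ie/"  # hypothetical Irish page
gb_url = "https://www.example.com/gb/"  # hypothetical British page

def annotations(url):
    """Return (canonical, {hreflang: href}) found in the page's <head>."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    canonical_link = soup.find("link", rel="canonical")
    canonical = canonical_link.get("href") if canonical_link else None
    alternates = {
        link.get("hreflang").lower(): link.get("href")
        for link in soup.find_all("link", rel="alternate", hreflang=True)
    }
    return canonical, alternates

ie_canonical, ie_alts = annotations(ie_url)
gb_canonical, gb_alts = annotations(gb_url)

# The hreflang pair only works if each side links to the other side's
# canonical URL, and vice versa.
print("IE canonical:", ie_canonical, "-> en-GB alternate:", ie_alts.get("en-gb"))
print("GB canonical:", gb_canonical, "-> en-IE alternate:", gb_alts.get("en-ie"))
if ie_alts.get("en-gb") != gb_canonical or gb_alts.get("en-ie") != ie_canonical:
    print("Warning: hreflang is not between the canonical URLs.")
```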
All right, that leads us to the end here. Like I mentioned in the beginning, if you have more questions, or if you submitted a big chunk of text in your site clinic request, I'd recommend first of all trying to post in the webmaster help forum, or one of the other webmaster forums out there, to get advice from other people. But you're also welcome to send me the link to your forum thread directly, for example on Google+, and I can take a look from there.

Suppose you have underscores in your URLs and you redirect those to dashes, the way it's supposed to be done. There was a Matt Cutts video where he was saying you can still leave them, there's no real need to 301 them. But suppose you went ahead with it anyway: can that hurt the site's rankings within, like, a couple of weeks? Suppose you did it, let's say, three weeks ago; you redirected everything properly, plus you made the proper changes for duplication and so on across the site. Can it hurt rankings pretty hard after a couple of weeks?

So I guess, first of all, we don't really care about underscores versus dashes, so that's something where I wouldn't bother setting up redirects for. If the site is set up one way and you think the other way would be nicer, obviously you could do that, but I don't think you'd see a significant positive effect in search from that alone.

No, the point wasn't a positive ranking effect; it's just, can it hurt rankings?

Sure, sure. The thing to keep in mind is that you're essentially doing a site structure change of the URLs on your site. We don't recognize that you're just swapping out characters; we see these as completely different URLs. And if you change your internal linking structure and the URLs that you're using on your website, then we have to re-crawl and re-index the whole website to understand it again, along with the context of the individual pages. It's the same if you go from, I don't know, .php to .html, or you remove the .php completely: we run across completely new URLs, and we don't intuitively understand the relationship between the old version and the new version. We just see that these are new URLs, there are redirects, the internal linking structure changed significantly; we have to re-evaluate the site, and that could definitely result in some fluctuations for a while where things need to settle down. In the long run I think you'll see everything come back to the same place again, but it can definitely take, I don't know, a couple of weeks maybe to settle down again.

Okay. And the last question is regarding schemas: how often are the updates? Suppose you change certain things for an ecommerce website; they have, you know, 100 reviews, and they had like 60 or whatever in the rating. Basically, let's say you make some changes to your schema. How long would it take for that to take effect?

Usually pretty much when we crawl. We have to recrawl and reindex those pages, but it's kind of like content on your pages: if you change the content, once we've been able to re-index that updated content, we should be able to take it into account immediately. There's no additional delay from there.

Okay. John, you mentioned the info command to help with figuring out the rel hreflang. Could you show an example of how that would work with a site like Amazon? I was curious.

Oh, you can just do "info:" and the URL, for example "info:google.com", and you'll see whatever URL we have as the canonical for google.com, which is http://www.google.com.

So if I put in amazon.de, which I don't think has things set up properly, would I quickly see that?

No, no. You'd basically just see whether that's the URL that we have chosen as a canonical. It doesn't check the hreflang or anything like that; it's just a kind of rough-and-dirty check to see if this is the canonical URL. There are some situations where it doesn't match exactly, but for the most part it gives you a good idea, for example whether we chose www or non-www, or the version with a trailing slash or no trailing slash.
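Going back to the underscores-versus-dashes question: if you do change your URL structure, it's worth spot-checking that each old URL returns a 301 straight to its new equivalent. A minimal sketch, with hypothetical URL pairs, assuming the third-party requests package:

```python
# Minimal sketch: verify that old URLs 301-redirect to their new equivalents
# after a URL structure change. The pairs below are hypothetical examples.
import requests

redirects = {
    "https://www.example.com/some_old_page": "https://www.example.com/some-old-page",
    "https://www.example.com/page.php": "https://www.example.com/page",
}

for old_url, expected in redirects.items():
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "(no redirect)")
    ok = response.status_code == 301 and location == expected
    print(f"{'OK ' if ok else 'FAIL'} {old_url} -> {response.status_code} {location}")
```

Chained or 302 redirects would show up here as FAIL lines, which is exactly the kind of thing that slows down re-indexing after a structure change.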
If sites are local, like in Zurich for instance, instead of the same site for different areas just catering to this crowd and that crowd, would you guys appreciate that more, if each site is unique to its own location?

Yeah, sure. I mean, the question for us is kind of: are these equivalent? For a user in this location, is this an equivalent version of the site? If you take, let's say, amazon.de and amazon.com, those home pages are essentially equivalent for the average user, and they can dig into different parts of the site from there, and obviously if someone is searching for something in German, they probably want the German version. I don't know if Amazon has everything set up properly, so I'm kind of cautious about using them as an example.

All right, let me just run through some of the questions that were submitted through the site clinic as well, and then we'll go to the Q&A of the questions that were submitted in the Hangout, which are even more questions. Let's see.

"I'm concerned that my being based in five locations is affecting my ability to be ranked as well as possible in any one of them." In general, that's not a problem you'd need to worry about. One thing I'd do there, though, is see if you can have the common, shared information in one really strong place and then just list the individual locations separately. So, for example, you'd have one page about your business, about the type of work, the type of services, the type of products that you have, and then the individual locations each have an individual landing page, but those are focused on that location; they're not focused on the generic information. That way you have really strong pages about the type of business that you do, and focused pages on the locations, but you don't duplicate the type of work that you do across all of the individual location pages.

"Subdomain or subdirectory?" It's kind of up to you. From my point of view, you could use either one. Sometimes there might be marketing reasons where you say, well, I want to do it like this, or maybe there are technical reasons where you say, I want to keep everything in the same CMS and then I have to put everything in subdirectories. It's more of something that I'd say is up to you. If I had to recommend subdomains or subdirectories, from my point of view I'd go for subdirectories, just because it's easier to maintain, it's easier to track, it's easier to expand, and you don't have to worry about setting up separate sites; but that's just from a practical point of view. And is there a ranking difference? Not really. The thing there is that we try to recognize whether these are essentially separate sites or the same site, and we try to recognize that on a subdomain as well as on a subdirectory level. So it's not the case that if you put it in a subdirectory, it'll automatically look like one site; our algorithms might look at that and say, oh, these are like separate hosted things, they need to be treated as separate sites. [Inaudible follow-up] I don't know, I'd have to look at the actual example; your audio is really bad.

"We're building a lot of content on the blog. Must we do link building?" No, you don't need to do link building. In general we expect sites to stand on their own and to grow on their own; it's not something where you artificially have to do link building to get traffic.

"We've been a target of negative SEO and would like to see if our efforts to remove and disavow these links have been seen." I looked at this specific case, and from what I saw, it looked like you were doing a good job, and I don't see the negative SEO causing any problems there.
Let's see. "We lost our previous domain name and started a new one. We kept the same structure and improved some content. The old domain is now parked, without any content. Does Google still see this as the same website?" No. If there are no redirects from the old version to the new one, and the old version is standing on its own now, then we will treat these as separate websites, so with the new website you have to get that set up again and get the ball rolling there again. This is a good reminder to make sure that you pay for your domain names and set up calendar reminders, so that you don't accidentally lose a domain name.

"I'm worried in terms of speed. I checked different tools and found my site is great, but is it good in Google's eyes as well? How important is speed?" From my point of view, speed is definitely important, and we did say that it's a ranking factor, but it's not something where you have to focus on the milliseconds and say, oh, I got 10 milliseconds more, therefore my ranking will be a spot higher; it doesn't work like that. We essentially look at sites to see: is this significantly slower than everything else, or is it in the normal range? And if it's in the normal range, then the speed tweaks that you do essentially have a stronger effect on what people do on your website, whether they browse around and end up buying something from your business or not, which is probably your ultimate goal, and less to do with the actual search ranking. If your site is a hundred milliseconds faster than your competitors', you're not going to be ranking above them just because of that speed advantage.

"Why don't my pages have any rich snippets in the search results? I set up the structured data." I didn't look at this specific site, but in general, for structured data and rich snippets we look at three things: on the one hand, is it technically implemented in a correct way; on the other hand, is it implemented in a correct way from a general policy point of view; and is the site such that we can trust it enough to show rich snippets. Those are the three things to look at. From a technical point of view, it's easy to test with the testing tool. The policy point of view is something the testing tool doesn't really pull up that much; if you put, for example, recipe markup on a page on an SEO blog, that's probably not what we're looking for, and that's something where you can get advice from other peers, to see whether this is really the type of markup that matches your pages or not. The general quality issue, I'd say, is almost the trickiest, but in many cases it's kind of obvious if the website is really low quality, and that's something you need to work on. Or, in general, if you say, well, the other two are definitely covered, I'm implementing the markup right on the right type of pages, then quality is something that could be a long-term goal where you keep working on making it better.

"We're concerned the site is not being indexed correctly due to query parameters. How do we treat query parameters?" In general, query parameters are fine; we accept URLs with parameters in them without a problem, and we should be able to crawl and index those pages directly. You can double-check that in Search Console with the Fetch as Google tool: enter your URL with the parameters that you have and see if Google can pull up the content directly. If Google can pull up the content there, if it can render those pages, then we should be able to index those pages as well.
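Query parameters are also the most common source of the URL multiplication mentioned in the crawling section earlier. As a rough way to gauge it, here's a sketch that normalizes a list of crawled URLs; the tracking-parameter list and sample URLs are made up for the example, and you would feed in URLs from your own server logs.

```python
# Minimal sketch: estimate how much URL-parameter "multiplication" you have,
# tying back to the duplicate-content discussion earlier. Hypothetical data.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def normalize(url):
    """Strip tracking parameters and sort the rest, keeping content params."""
    parts = urlsplit(url)
    params = [(k, v) for k, v in parse_qsl(parts.query)
              if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(sorted(params)), ""))

urls = [
    "https://www.example.com/shoes?color=blue&utm_source=mail",
    "https://www.example.com/shoes?utm_campaign=x&color=blue",
    "https://www.example.com/shoes?color=red",
]
unique = {normalize(u) for u in urls}
print(f"{len(urls)} crawled URLs -> {len(unique)} unique pages")
```

A ratio of two or three to one is the harmless range mentioned earlier; a hundred to one is where crawling starts to suffer.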
All right, time to move over to the next batch of questions that were submitted. Let me just see if I can get the order based on the rough votes; it doesn't seem to show in the right order, but fine, we'll just start here.

"We've acquired two of our competitors, who ranked on page one for the same keyword, but since the purchase the rankings have started to drop. Is there any algorithm which detects domain crowding?" Yes, we do try to recognize when sites are essentially the same, or focused on the same thing, or run by the same company, and we try to fold those into one site. That's something you might be seeing there; essentially, we recommend just having one really strong site rather than trying to create multiple sites that would rank separately.

"You've said in the past that cross-linking a big number of websites could look spammy. Would this be the same for international websites which use the hreflang tag, and where Google therefore needs to know that they're related? We have over 80 international sites." Hreflang is perfectly fine, because what happens with hreflang is that the ranking of these pages doesn't change; we just show the right version of the page. And if you have 80 versions of the same page, which are essentially equivalent and which we could swap out, then that could be providing a really great user experience for people in those countries where you do have those versions. So that's something I think could be really useful.

"Our main navigational links are generated by Ajax and don't appear in the source code, nor when using Fetch and Render. Would Google see them if we add them to the source code as regular links with display:none, while still serving them to users with Ajax?" So, we should be able to see links that are generated by Ajax, and if we're not able to see those links with Fetch and Render, I'd try to look into what's actually happening there that prevents Googlebot from seeing them. I'd almost recommend not focusing on content that's hidden, because that makes it really hard for us to figure out whether it's really relevant for the page, but rather making sure that those pages can be rendered properly, with the full content and with all of those links. So I'd take one of these sample pages, try to strip it down into multiple parts, and figure out which part of your configuration is blocking Googlebot from actually seeing those links.

"Do you treat affiliate links the same as links to external domains, the same as links to pages on the same domain, or is there a difference? I know I have to make a great quality site for users; I'm just asking about the links." Obviously, site quality is one of the most obvious issues we run across with regards to affiliates. In general, we treat those links the same. The one thing I'd watch out for with affiliate links is that you put a nofollow on them, so that we know there's a monetary relationship here and we can take them out of our link graph. In many cases we recognize that automatically and treat them as nofollow anyway, so essentially they'd be the same; but with regards to an affiliate site, if you have links to the affiliate source where people can actually buy the product, that's not something I'd worry about. I'd just put a nofollow on them, and then that would generally be fine.
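As a quick way to audit the nofollow advice above, something like this sketch could list affiliate links on a page that are missing rel="nofollow". The page URL and the affiliate host list are hypothetical, and it assumes the third-party requests and beautifulsoup4 packages.

```python
# Minimal sketch: flag affiliate links that are missing rel="nofollow".
# The page URL and the affiliate domain list are hypothetical.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlsplit

page_url = "https://www.example.com/review"          # hypothetical
affiliate_hosts = {"affiliate-network.example.net"}  # hypothetical

soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
for a in soup.find_all("a", href=True):
    host = urlsplit(a["href"]).netloc
    if host in affiliate_hosts:
        rel = a.get("rel") or []  # bs4 returns rel as a list of tokens
        if "nofollow" not in rel:
            print("missing nofollow:", a["href"])
```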
"Is there a maximum number of requests per page?" No, there isn't. It's fairly rare that I see Googlebot running into a situation where we can't render a page because of the number of requests on it, but you can test that with the Fetch and Render tool, and you can also look at things like the waterfall diagram on webpagetest.org and compare it to other sites you're seeing out there, to check that you're in a reasonable range. Sometimes it also depends on what kind of requests are in there, whether they can be cached or not, whether they're from other hosts that are actually really fast; all of this comes together.

"We're planning to launch a new shop but are unsure how many pages we should index. There are different sizes, colors, materials, etc. Is it intelligent to combine every value with every other and let it all get indexed, or is it too much? What would you recommend?" This is something where there's no absolute answer that covers all cases. What I'd recommend doing is taking a step back, looking at the pages that you have, and thinking: is this really something that can stand on its own? Would I make a page for a blue shoe, in this model, in this size; would I make a separate page for that, or would it make more sense to have one more general page? Maybe take a step back and say, all sizes, male and female, on one page; or maybe I split it by male and female; those kinds of things. So: is this something that would stand on its own or not? In some cases you drill down a little bit further because you know this is one very specific thing that people want exactly like that, and then maybe it makes sense to have a separate page; and sometimes you say, well, people generally just want this vaguer kind of thing, and I have all of these variations, different sizes and colors, on the same page, and that helps solve their problem as well.

"Are shops treated differently compared to other sites? Does Google recognize that a site is a shop and rank it higher for buying-related search queries? Is it a problem that shops mostly have less textual content than normal sites?" First off, I'd say shops are treated the same as generally any other kind of site, but we do try to recognize when someone is specifically looking for a product, and we'll try to bubble up the more product-oriented, more shop-oriented pages in a case like that. So that's not something where you need to put "this is a web shop" at the top; in general we can pick that up fairly well. With regards to shops generally having less textual content than other sites: that's perfectly fine. You don't need to write a novel on every page, and even for normal sites, sometimes just a small paragraph of information is enough to make a page relevant, so you don't need to focus on a minimum word count per page.
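To put a number on the sizes-colors-materials question above: combining every attribute value multiplies pages very quickly. A tiny illustration with made-up attribute counts:

```python
# Small illustration: combining every variant value multiplies URLs quickly.
# The attribute values are made up for the example.
from itertools import product

sizes = range(36, 47)                      # 11 sizes
colors = ["black", "white", "blue", "red", "green"]
materials = ["leather", "suede", "canvas"]

variants = list(product(sizes, colors, materials))
print(len(variants), "indexable URLs for a single shoe model")
# 11 * 5 * 3 = 165 pages per product, most of which can't stand on their own.
```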
"Three years ago we had a domain that got a manual penalty. We attempted many times to resolve it, disavow, etcetera, with little success. What should we do now that it's been a few years?" I'm not completely sure what you mean by "little success": whether the cleaning up itself wasn't that successful, or whether you didn't see that much of an improvement in the search results. If cleaning up wasn't that successful, maybe that's something you could tackle again, or someone else could tackle again. If you're just not seeing results in the search results, that's obviously a little bit trickier. With some things, if you're looking at something that's been in place for multiple years, you need to assume that maybe things have changed in search, in that where your site was ranking three or four years ago isn't where it would be ranking now, even with all of the cleanup taken into account. So one thing to look at is: is my site really still relevant for these queries, or has the web moved on and changed significantly since then? Obviously that means it's maybe a little bit harder, in that you have to take a step back and say, well, I have to rethink what my offering is, what I want to provide on the web, or how I provide it; that's something you sometimes have to take into account. And for other things, it's also possible that it just takes a lot longer for changes to take effect when you clean things up. Specifically, since you mentioned disavowing and deleting links: on the one hand, the manual actions might get resolved fairly quickly, depending on how much work was done there, but things like algorithmic changes sometimes take a little bit longer to take effect.

"Can you explain a discrepancy between Google Analytics and Search Console search query data, even though Google Analytics is pulling this data from Search Console?" I'm not sure exactly what you're suggesting there. One part, of course, is the general Google Analytics data, which is collected on the site directly with the Analytics JavaScript itself; that's based on what users are doing when they come to your website. That's something that might not be reflected in Search Console, because in Search Console we only look at the search side, the impressions, the number of times the site was shown, and the clicks through to the site, and that might not match exactly what Google Analytics and the JavaScript side are tracking, so you'll definitely always see a slight discrepancy there. The other part, I believe, is the SEO report in Analytics (I'm not sure of the exact name) where you see the Search Console data itself. One thing that's different in Analytics is that it explicitly bubbles up the queries that were filtered out in Search Console. We filter out queries that were maybe only done once, for privacy reasons, and in Search Console you'll see that in the numbers: the top impressions figure will show one number, and if you add up all of the individual rows you might come up with a different one. So maybe you'll see, I don't know, 90 if you add up the rows, and 100 shown on top as impressions; that difference comes from the queries that were filtered. In Analytics that's split out separately; I believe it's called "not set" or something like that, where you see one row with that number. So in this example, you'd see those 90 as the individual queries, plus one "not set" row with ten impressions for the filtered-out ones. That's one of the differences you might be seeing there.

You're not the first person to suggest that. I don't know what the specific plans are there, and I can't make any promises, but this is something that people have mentioned before as well.
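To make the filtered-queries arithmetic above concrete, here's a tiny worked example with made-up numbers matching the 90-versus-100 case:

```python
# Tiny illustration of the filtered-queries arithmetic with made-up numbers.
visible_rows = {"query a": 50, "query b": 30, "query c": 10}  # per-query rows
total_impressions = 100                     # the total shown at the top

row_sum = sum(visible_rows.values())        # 90 in this example
filtered = total_impressions - row_sum      # Analytics shows this as "(not set)"
print(f"rows sum to {row_sum}, total is {total_impressions}, "
      f"so {filtered} impressions came from filtered queries")
```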
Wow, still tons of questions left and so little time. "What's the reason sites always lose organic search visibility when moving domains, as opposed to replatforming without changing domains?" I don't know. In general, you should be able to do a site move without seeing a significant drop in search visibility, but if you do a site move, sometimes it just takes a little bit longer for everything to settle down in the new state.

"Can you clarify whether 301-redirecting a domain will pass a penalty? I asked about a weird domain from China redirecting to my site a year ago, and you said it would not pass the penalty, but last week on the webmaster hangout you implied that it would." I think there are two different situations here. On the one hand, some random website redirecting to your website is something we generally ignore. On the other hand, if you take your website and redirect it to a new website, where you're essentially moving from one domain to another, not just randomly redirecting to some existing website, then that's the case where we might pass on the penalty information, the manual actions, and the algorithmic issues as well, especially if they're based on the links; but of course also on the content: if you move all of your content and the manual action is based on the content, then it also makes sense to apply that to the new domain.

All right, let me just open it up for more questions from you all. What else is on your mind?

I've got a quick question, John. In terms of backlinks, there are a lot of companies that talk about regularly monitoring your backlinks and adding them to your disavow file. Is that something we should be doing on a regular basis, like every month, every quarter, something like that?

In general, you don't need to do that. Most of the random, weird backlinks that a site gets are things we take out of our algorithms anyway, so that's not something you need to do. On the other hand, if you look at your backlinks and you see something really crazy, and you're worried that it might have a negative effect on your site, then just put it in a disavow file so that you can move on. But for most sites, most businesses, I don't think you need to monitor your backlinks.

Okay, thanks, John.

Hi John, I've got a backlink-related question as well. In our industry there are a couple of websites coming up that are basically doing everything you actually don't want us to do, and they're being successful. My question is: are filters such as Penguin and so on still effective, and why are those kinds of websites not being detected?

So I guess the question is, in general, you see people doing web spam and getting away with it? From our point of view, we do still focus on these issues, we do still take manual action on them, we do still take algorithmic action on these things, but sometimes things can get through. So that's a case where maybe a web spam report will help us figure it out, or be able to take action on it manually.

Okay, thank you.

All right. One quick mention again for those who submitted in the site clinic and sent me a big chunk of text: if you want to post in the help forums and link to your thread there, or send that to me directly on Google+, feel free to do so. I think for some of you there might still be some questions open, and I'm happy to help out where I can.
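For reference on the disavow file that came up in the backlink questions above: it's a plain text file with one entry per line, either a full URL or a domain: entry, with # for comments. A made-up example (both entries are hypothetical):

```
# Example disavow file -- entries are hypothetical.
# Disavow everything from one spammy domain:
domain:spammy-links.example.net
# Disavow a single page:
https://some-forum.example.org/thread?id=123
```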
All right, I need to move on, but it's been great having you all here. Thank you all for all of the sites that were submitted. I think maybe we need to figure out a way to do this more systematically with site clinics, because there are just so many questions coming in, and so many sites that didn't quite match the pattern we were looking for whose owners also have questions, so we'll try to see if we can figure something out there as well. Thank you all for the questions here as well, and for joining us live, and maybe I'll see you guys again in one of the future hangouts.

Thank you, John.